Block floating point (BFP) is a hybrid of floating-point and fixed-point arithmetic in which a block of data is assigned a common exponent. Learn about the only FPGA with machine learning processors that deliver native BFP capabilities, offering higher performance and lower power consumption than traditional FPGA DSP blocks.
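To make the shared-exponent idea concrete, here is a minimal Python sketch of BFP quantization. It is illustrative only, not Achronix's hardware implementation; the mantissa width, rounding, and clipping choices are assumptions.

```python
import numpy as np

def bfp_quantize(block, mantissa_bits=8):
    """Quantize a block of floats to block floating point:
    every value shares one exponent and keeps a signed
    integer mantissa of `mantissa_bits` bits."""
    max_abs = np.max(np.abs(block))
    if max_abs == 0:
        return np.zeros_like(block, dtype=np.int32), 0
    # Shared exponent chosen so the largest magnitude fits the mantissa range
    shared_exp = int(np.floor(np.log2(max_abs))) + 1
    scale = 2.0 ** (mantissa_bits - 1 - shared_exp)
    mantissas = np.clip(np.round(block * scale),
                        -(2 ** (mantissa_bits - 1)),
                        2 ** (mantissa_bits - 1) - 1).astype(np.int32)
    return mantissas, shared_exp

def bfp_dequantize(mantissas, shared_exp, mantissa_bits=8):
    """Reconstruct approximate floating-point values from the block."""
    return mantissas.astype(np.float64) * 2.0 ** (shared_exp - (mantissa_bits - 1))

# Example: a block of four values sharing a single exponent
block = np.array([0.75, -1.5, 0.031, 2.2])
m, e = bfp_quantize(block)
print(m, e)                  # integer mantissas plus the shared exponent
print(bfp_dequantize(m, e))  # approximate reconstruction of the block
```

Because only the integer mantissas differ per element, multiply-accumulate operations on a BFP block reduce to fixed-point arithmetic, with the shared exponent applied once at the end; this is what lets BFP approach floating-point accuracy at close to fixed-point cost.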
Presented by:
Mike Fitton, PhD - Sr. Director of Strategy and Planning at Achronix
Mr. Fitton has 25+ years of experience in the signal processing domain, spanning system architecture, algorithm development, and semiconductors for wireless operators, network infrastructure, and most recently machine learning.