Effect of fixed-point arithmetic on deep belief networks (abstract only)

2013 
Deep Belief Networks (DBNs) are state-of-the-art learning algorithms built by stacking a class of neural networks, the Restricted Boltzmann Machine (RBM). DBNs are computationally intensive, posing the question of whether they can be accelerated on FPGAs. Fixed-point arithmetic can have a significant influence on the execution time and prediction accuracy of a DBN. Previous studies have focused only on customized RBM accelerators with a fixed data-width. Our experiments demonstrate that variable data-widths can achieve similar performance levels. We also observe that the most suitable data-width differs across DBN types and is neither unique nor fixed. From this we conclude that a DBN accelerator should support various data-widths rather than only a fixed one, as in previous work. The processing performance of DBN accelerators on FPGAs is almost always constrained not by the capacity of the processing units, but by on-chip RAM capacity and speed. We propose an efficient memory sub-system combining junction and padding methods to reduce bandwidth usage for DBN accelerators, showing that supporting various data-widths is not as difficult as it may seem: the hardware cost is small and does not affect the critical path. We also design a generation tool that lets users flexibly reconfigure the memory sub-system for arbitrary data-widths. The tool can further serve as an IP core generator on top of the FPGA memory controller, supporting parallel memory access at irregular data-widths for other applications.
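As a rough illustration only (not taken from the paper), the effect of a chosen fixed-point data-width on RBM parameters can be simulated in software by quantizing weights to a signed format with a configurable number of total and fractional bits; the helper and parameter names below are hypothetical:

    import numpy as np

    def quantize_fixed_point(x, total_bits, frac_bits):
        """Round x to a signed fixed-point grid with the given total and
        fractional bit widths, saturating at the representable range."""
        scale = 2 ** frac_bits
        max_val = (2 ** (total_bits - 1) - 1) / scale
        min_val = -(2 ** (total_bits - 1)) / scale
        return np.clip(np.round(x * scale) / scale, min_val, max_val)

    # Compare two candidate data-widths on randomly initialized RBM weights
    rng = np.random.default_rng(0)
    weights = rng.normal(0.0, 0.1, size=(784, 500))   # visible x hidden
    for total_bits, frac_bits in [(16, 12), (8, 6)]:
        err = np.abs(weights - quantize_fixed_point(weights, total_bits, frac_bits)).max()
        print(f"Q{total_bits - frac_bits}.{frac_bits}: max abs error = {err:.6f}")

Sweeping such width settings per network is one way to explore the accuracy/data-width trade-off that the abstract argues a flexible accelerator should support.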