Algorithmic complexity analysis on data transfer rate and data storage for multidimensional signal processing

2013 
Algorithmic complexity, in terms of data storage size and data transfer rate, increases dramatically in multidimensional signal processing, including visual computing that exploits temporal and spatial information to achieve better visual quality. This paper presents a systematic method, called algorithm/architecture co-exploration, a new design paradigm for complex multidimensional signal processing, to efficiently quantify algorithmic complexity, including data storage and data transfer rate, whose characteristics are independent of the platform. By exploring the design space based on dataflow with different execution orders and various data granularities, the trade-off between data storage size and data transfer rate is made in a systematic manner, so that the algorithm can be smoothly mapped onto an architecture. Case studies reveal that our framework can effectively characterize the complexity of algorithms, and that the extracted complexity can facilitate design space exploration at various data granularities.
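The storage/transfer trade-off described above can be illustrated with a minimal sketch. It models a vertical FIR filter over one video frame under two execution orders: one that buffers whole lines on chip so each input pixel is fetched from external memory once, and one that keeps only the current window on chip and re-fetches it per output. The filter choice, pixel-count cost model, and the function `complexity` are illustrative assumptions, not the paper's actual framework.

```python
def complexity(width, height, taps, line_buffered):
    """Estimate local storage and external transfers (both in pixels)
    for a vertical `taps`-tap FIR filter over a width x height frame.

    Illustrative cost model (assumption, not the paper's method):
    - line_buffered=True : keep `taps` full lines on chip; each input
      pixel crosses the external interface exactly once.
    - line_buffered=False: keep only the `taps`-pixel window on chip;
      every output pixel re-fetches its whole window externally.
    """
    if line_buffered:
        storage = taps * width        # on-chip line buffers
        transfers = width * height    # each pixel read once
    else:
        storage = taps                # just the sliding window
        transfers = taps * width * height  # window re-read per output
    return storage, transfers

# Compare the two execution orders for a 1080p frame, 3-tap filter.
for buffered in (False, True):
    s, t = complexity(1920, 1080, 3, line_buffered=buffered)
    print(f"line_buffered={buffered}: storage={s} px, transfers={t} px/frame")
```

Even this toy model shows the trade-off the abstract refers to: spending roughly 5,760 pixels of local storage cuts external transfers threefold, and sweeping such dataflow variants at different granularities is one way to explore the design space before committing to an architecture.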