Cheaper by the Dozen: Batched Algorithms*
2001
1 Introduction

While computing power and memory size have been steadily increasing as predicted by Moore's Law, they are still dwarfed by the size of the massive data sets arising in a number of applications. Problems from astrophysics, computational biology, telecommunications, and the Internet often come with accompanying data in the terabyte range. Analyzing such data with classical algorithms is often prohibitively expensive. Thus, new ideas are needed to design algorithms that can cope with these massive data sets.