CASCON workshop on developing big data applications and services

2018 
Research from Gartner (2015) indicated that, by 2017, 60% of Big Data projects would fail or would not provide the expected benefits [1]. However, in November 2017, Gartner analyst Nick Heudecker posted on his Twitter account that this estimate was too conservative: the Big Data project failure rate is now closer to 85%. The reasons are not related to technology alone [2]; they are a mix of environmental, technological, and managerial problems. Some of the reasons Big Data projects fail are, at the project level [3], [4]: a missing link to business objectives, a lack of Big Data skills, over-reliance on the data, failure to convince executives, and poor planning; and at the technical level [5]: rapid technology changes, difficulty in selecting Big Data technologies that address the system and project requirements, complex integration between new and old systems, computation of intensive analytics, and the need for high scalability, availability, and reliability, to name a few. Further, a previous study [6] has shown that there is approximately an 80:20 split in industry focus in favor of algorithms for analytics and infrastructure, thereby shortchanging the aspects of creating and evolving applications and services concerned with Big Data.