What is Extreme Big Data?

JST CREST: EBD: Extreme Big Data – Convergence of Big Data and HPC for Yottabyte Processing (since October 2013)

Although the data handled by today’s “Big Data” infrastructure is often not actually so “big” by HPC standards, it is expected to explode by several orders of magnitude in the future, in both capacity and complexity. This poses immense problems for existing IDC/Cloud “Big Data” infrastructures, which lack the necessary system bandwidth and processing capacity, and also for HPC/supercomputers, which lack real-time processing capabilities. Our work focuses on a next-generation “Extreme Big Data” infrastructure: we are developing a set of technologies, and the resulting system, to achieve the convergence of the two through co-design with representative future big data applications, aiming for up to a 100,000-fold improvement in data processing capability over the next 10 years.