Encash Big Data: Latest Big Data Trends & Statistics

One of the first questions that comes up when we encounter the term "Big Data" is: how "big" is it, actually? And that is exactly where Big Data gets exciting, because it keeps growing bigger and richer every day.

According to a recent IDC forecast, by 2025 the global datasphere will grow to 163 zettabytes (a zettabyte is a trillion gigabytes). That is ten times the 16.1 ZB of data generated in 2016. All this data will unlock unique user experiences and a new world of business opportunities.
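As a quick back-of-the-envelope check on these figures (the constants below are simply the numbers quoted above, not independent data):

```python
# Sanity check of the IDC figures quoted above.
ZB_IN_GB = 1e12  # one zettabyte is a trillion gigabytes

datasphere_2025_zb = 163.0   # IDC forecast for 2025
datasphere_2016_zb = 16.1    # data generated in 2016

growth_factor = datasphere_2025_zb / datasphere_2016_zb
print(f"2025 datasphere: {datasphere_2025_zb * ZB_IN_GB:.3g} GB")
print(f"Growth vs 2016: {growth_factor:.1f}x")  # roughly 10x
```

The ratio works out to just over 10, matching the "ten times" framing in the forecast.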

Here are some key trends from the study Data Age 2025 highlighting the changing and growing role of big data:

  • The evolution of data from business background to life-critical: IDC estimates that by 2025, nearly 20% of the data in the global datasphere will be critical to our daily lives and nearly 10% of that will be hypercritical.
  • Data from embedded systems and the Internet of Things (IoT): By 2025, an average connected person anywhere in the world will interact with connected devices nearly 4,800 times per day, roughly one interaction every 18 seconds.
  • Mobile and real-time data: By 2025, more than a quarter of data created in the global datasphere will be real time in nature, and real-time IoT data will make up more than 95% of this.
  • Data usage for Cognitive/artificial intelligence (AI) systems: IDC estimates that the amount of the global datasphere subject to data analysis will grow by a factor of 50 to 5.2ZB in 2025; the amount of analyzed data that is “touched” by cognitive systems will grow by a factor of 100 to 1.4ZB in 2025.
  • Big Data security: By 2025, almost 90% of all data created in the global datasphere will require some level of security, but less than half will be secured.
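The per-day interaction estimate in the list above translates directly into the quoted interval; a one-line check using only the numbers from the list:

```python
SECONDS_PER_DAY = 24 * 60 * 60          # 86,400 seconds in a day
interactions_per_day = 4_800            # IDC estimate for 2025

interval_s = SECONDS_PER_DAY / interactions_per_day
print(f"One interaction every {interval_s:.0f} seconds")  # 18 seconds
```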
More on what exactly Big Data is

According to Gartner, Big Data can be defined as high-volume, high-velocity and/or high-variety information assets that demand cost-effective, innovative forms of information processing that enable enhanced insight, decision making, and process automation. What is evident here (and in several other definitions of Big Data across sources) is that the term does not point to a single entity or technology, but rather to a broad concept encompassing the technologies and solutions geared towards tackling the deluge of data.

The benchmark of "big" data keeps evolving, but three specific dimensions (the 3Vs) have characterized this evolution for quite some time now:

  • Volume: This refers to the size of generated and stored data, and as discussed above, benchmarks continuously evolve here. While a typical PC might have had 10 GB of storage in 2000, Facebook now ingests 500 terabytes of new data every day, and a Boeing 737 will generate 240 terabytes of flight data during a single flight across the US. And with proliferating mobile and Internet of Things adoption, the volume of data businesses will be dealing with is only increasing at an unprecedented rate.
  • Variety: This refers to the type and nature of the data. Data today extends beyond traditional types like numbers, dates, and strings to include geospatial data, 3D data, audio and video, and unstructured text such as log files and social media. You may also include here data-set inconsistencies (Variability) and data quality (Veracity), which some organizations prefer to keep as separate dimensions.
  • Velocity: This refers to the speed at which the data is generated and processed. For example, clickstreams and ad impressions capture user behavior at millions of events per second. Similarly, high-frequency stock trading algorithms reflect market changes within microseconds. Infrastructure and sensors generate massive log data in real time. Online gaming systems support millions of concurrent users, each producing multiple inputs per second.
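These dimensions interact: a sustained event velocity quickly adds up to big-data volumes. A minimal sketch of that relationship (the event rate and record size below are illustrative assumptions, not figures from the text):

```python
# Illustrative only: how a sustained velocity translates into volume.
events_per_second = 1_000_000   # assumed clickstream rate
bytes_per_event = 1_000         # assumed average record size
seconds_per_day = 86_400

gb_per_day = events_per_second * bytes_per_event * seconds_per_day / 1e9
print(f"~{gb_per_day / 1_000:.1f} TB of raw events per day")  # ~86.4 TB
```

Even at these modest per-record sizes, a million-events-per-second stream produces tens of terabytes of raw data every day, which is why velocity and volume are usually tackled together.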

Here is an interesting infographic from Aureus Analytics which summarizes some key Big Data trends through 2015.