
Big Data Losing Out to Fast Data


The world wide web is a complex maze being packed with more and more data every minute. In fact, Wired.com reveals staggering figures on the data amassed every minute: 48 hours of video uploaded to YouTube, 204 million email messages sent, 600 new websites generated, 600,000 pieces of content shared on Facebook, and more than 100,000 tweets posted. And this is just the social layer of data the world produces every minute, without counting corporate databases, healthcare data, data generated and processed by financial institutions, government agencies, and the many other sources that generate data.

  

The obsession with data is slowly taking over the world, with every industry dependent on data for one purpose or another. The most important use data serves today is providing insight. For medical practitioners, that insight could be used to learn about the spread of diseases; for pharma companies, it could reveal the popularity or impact of a newly released drug; for marketers, whose work has turned data-centric, it helps identify the right audience and present that audience with the right products and services, and so forth. As the volume of data grows at a rapid pace, it is being identified and categorised into different types. The one type that has captured imaginations and is considered a boon to the digital world is Big Data.

 

TechTarget considers ‘Big Data’ an evolving term that describes a large volume of data, whether structured, semi-structured, or unstructured, that holds the potential to reveal information or insight. Big data can be pinned to three main characteristics:

 

Extreme Volume: Big data refers to humongous amounts of data, usually measured in petabytes, exabytes, zettabytes, and even yottabytes.

 

Wide Variety: On the variety of data, Forbes notes that data rarely presents itself in a form that is perfectly ordered and ready for processing. A common theme in big data systems is that the source data is diverse and doesn’t fall into neat relational structures: it could be text from social networks, image data, or a raw feed directly from a sensor source. None of these come ready for integration into an application.

 

Velocity: The rate at which data streams in or is processed can also mark it as big data. Sqlauthority.com explains that data growth and the social media explosion have changed how we look at data. There was a time when we believed yesterday’s data was recent; newspapers, as a matter of fact, still follow that logic. News channels and radio, however, have changed how fast we receive the news, and today people rely on social media to keep them updated with the latest happenings. On social media, a message only a few seconds old (a tweet, a status update) may no longer interest users, who discard old messages and pay attention to recent updates. Data movement is now almost real time, and the update window has shrunk to fractions of a second. This high-velocity data represents big data.

 

The world is already sitting on an extraordinarily large pile of data: IBM pointed out that 2.5 exabytes, that is, 2.5 billion gigabytes (GB), of data were generated every day in 2012, and a 2015 report projected that the world’s volume of data would grow 40 percent year on year, reaching 50 times its size by 2020.
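As a quick sanity check on IBM’s figure (assuming decimal SI units, where one exabyte is 10^18 bytes), the exabyte-to-gigabyte conversion works out as follows:

```python
# Decimal SI units assumed: 1 EB = 10**18 bytes, 1 GB = 10**9 bytes.
EXABYTE = 10**18
GIGABYTE = 10**9

daily_bytes = 2.5 * EXABYTE          # IBM's 2012 figure: 2.5 EB per day
daily_gb = daily_bytes / GIGABYTE    # expressed in gigabytes

print(daily_gb)  # 2.5 billion GB per day
```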

 

With such a huge amount of data available, businesses are keen to use it to their advantage and are exploring newer avenues of data gathering and processing to find a competitive edge. No wonder the world’s attention, in this fast-paced age, has shifted from big data to fast data.

 

So what exactly is Fast Data?

 

Infoworld.com defines fast data through the lens of speed: big data is often created by data generated at incredible rates, such as click-stream data, financial ticker data, log aggregation, or sensor data. Often these events occur thousands to tens of thousands of times per second, which is why this type of data is commonly referred to as a “fire hose.” When we talk about fire hoses in big data, we’re not measuring volume in the typical gigabytes, terabytes, and petabytes familiar to data warehouses; we’re measuring volume in terms of time: the number of megabytes per second, gigabytes per hour, or terabytes per day. We’re talking about velocity as well as volume, which gets at the core of the difference between big data and the data warehouse. Big data isn’t just big; it’s also fast.
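The “volume in terms of time” idea can be sketched with a toy event stream (the timestamps and payload sizes below are invented for illustration): instead of summing a dataset’s total size, a fire hose is characterised by how many bytes arrive per second.

```python
# Hypothetical event stream: (timestamp_in_seconds, payload_bytes) pairs,
# e.g. 10,000 sensor readings of 512 bytes arriving one per millisecond.
events = [(i * 0.001, 512) for i in range(10_000)]

# Measure volume in terms of time, not total size:
total_bytes = sum(size for _, size in events)
span_seconds = events[-1][0] - events[0][0]
throughput_mb_per_s = total_bytes / span_seconds / 1e6

print(f"{throughput_mb_per_s:.2f} MB/s")  # ~0.51 MB/s for this toy stream
```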

 

Fast data, then, is a real-time proposition: data is processed and analysed as it arrives, and its value is far greater than that of data collected and left sitting in storage to be analysed later. The uses of fast data are many:

 

  • For retailers it can help them personalise and present relevant offers
  • In case of financial institutions it can help in averting fraud before the company or its customers sustain a loss
  • Service providers can ensure compliance with operational service level agreements
  • Data aggregators or processors can respond to data events in real time
  • Manufacturers can improve operations, billing and service delivery
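To make the “respond in real time” idea concrete, here is a minimal sketch of the fraud-alert case (the transaction amounts, IDs, and threshold rule are all invented for illustration): each transaction is checked the moment it arrives against a rolling average of recent amounts, rather than in a later batch job.

```python
from collections import deque

def fraud_alerts(transactions, window=5, threshold=3.0):
    """Flag a transaction as it arrives if it is `threshold` times
    larger than the rolling average of the last `window` amounts.
    A toy stand-in for real-time fraud detection, not a real system."""
    recent = deque(maxlen=window)  # keeps only the last `window` amounts
    alerts = []
    for tx_id, amount in transactions:
        if recent and amount > threshold * (sum(recent) / len(recent)):
            alerts.append(tx_id)  # alert fires immediately, in-stream
        recent.append(amount)
    return alerts

stream = [("t1", 20), ("t2", 25), ("t3", 22), ("t4", 500), ("t5", 24)]
print(fraud_alerts(stream))  # → ['t4']
```

The point of the sketch is the shape of the loop: the decision is made per event, before the next one arrives, which is what lets a company avert the loss rather than discover it in an overnight report.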

 

A worthy example of fast data at work is Uber, which runs on real-time data: as soon as a commuter requests a ride, Uber captures the message, connects the commuter to the nearest available driver, calculates the cost of the trip, and gets back to the requesting customer, all within a few seconds. In a study by Capgemini titled ‘Big Fast Data: The Rise of Insight-Driven Business’, 77 percent of respondents agreed that decision makers increasingly require data in real time, while 54 percent agreed that leveraging fast data was more important than leveraging big data.
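The matching step of such a system can be sketched in a few lines (the coordinates, driver names, and distance rule below are invented for illustration; a real dispatch system is far more involved): compute the rider-to-driver distance and pick the minimum.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in km."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_driver(rider, drivers):
    """Pick the available driver closest to the rider's location."""
    return min(drivers, key=lambda d: haversine_km(*rider, d[1], d[2]))

# Hypothetical coordinates (not real Uber data or APIs):
rider = (40.7580, -73.9855)                # near Times Square
drivers = [("alice", 40.7484, -73.9857),   # ~1 km away
           ("bob", 40.7061, -74.0087)]     # ~6 km away

print(nearest_driver(rider, drivers)[0])  # → 'alice'
```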

 

Customer-facing digital businesses on a quest for data can look towards Customer Identity Management, which Gartner has called the new digital relationship imperative. CIM capabilities ensure that businesses capture first-party data, which is made available to the business in the shape of customer insights. These insights provide segmented, in-depth analysis of profile data and other captured data, giving a business valuable direction for its marketing efforts.

 
