Data: Think Big

What is Big Data?

There are many diverse definitions of Big Data. The BI consultancy Hurwitz and Associates defines Big Data as “the capability to manage a huge volume of disparate data, at the right speed, and within the right time frame to allow real time analysis and reaction”.

Others define Big Data by its characteristics. VC Choudary, Associate Professor of Information Systems at UCI, says “what differentiates Big Data from traditional data is the sheer volume of information, velocity at which it is created, and the variety of sources from which it is drawn”.

Some characteristics focus on the collection of data:

Volume: Terabytes to petabytes, exabytes to zettabytes of data

Velocity: Streaming data arriving in milliseconds to seconds; how fast data is produced, and how fast it must be processed to meet the need or demand

Variety: Structured and unstructured data, including text, multimedia, video, audio, sensor readings, meter data, HTML, and e-mail

Some characteristics focus on the usefulness of data:

Veracity: Uncertainty due to data inconsistency and incompleteness, ambiguity, latency, deception, and model approximations; in short, the accuracy, quality, and trustworthiness of the data

Variability: The differing ways in which the data may be interpreted; different questions require different interpretations

Value: Data for co-creation and deep learning
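
To make the volume/velocity/variety distinction concrete, here is a toy Python sketch (the record formats and field names are invented for illustration, not taken from any real pipeline) that consumes a mixed stream of structured and unstructured records and routes each by type, the kind of single-pass normalization step a Big Data pipeline performs before analysis:

```python
import json

def classify_record(raw):
    """Route one incoming record by its apparent structure (variety)."""
    raw = raw.strip()
    if raw.startswith("{"):
        try:
            return ("structured", json.loads(raw))  # e.g. JSON sensor data
        except json.JSONDecodeError:
            pass
    if "," in raw and all(part.strip() for part in raw.split(",")):
        return ("semi-structured", raw.split(","))  # e.g. CSV meter reading
    return ("unstructured", raw)                    # e.g. free-text log or e-mail

def process_stream(lines):
    """Tally record kinds in one pass, without buffering (velocity)."""
    counts = {"structured": 0, "semi-structured": 0, "unstructured": 0}
    for line in lines:
        kind, _ = classify_record(line)
        counts[kind] += 1
    return counts

stream = [
    '{"sensor": "temp-01", "value": 21.5}',
    'meter-7,2024-01-01T00:00:00,348.2',
    'user emailed support about a billing issue',
]
print(process_stream(stream))
# → {'structured': 1, 'semi-structured': 1, 'unstructured': 1}
```

At scale the same routing logic would run inside a distributed stream processor rather than a Python loop, but the shape of the problem is the same: many sources, many formats, one pass.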

Whatever the definition, you can’t deny that there are some big benefits to using Big Data:

  • Timely – Knowledge workers spend 60% of each workday attempting to find and manage data.
  • Accessible – Half of senior executives report that accessing the right data is difficult.
  • Holistic – Information is currently kept in silos within the organization. Marketing data, for example, might be found in web analytics, mobile analytics, social analytics, CRMs, A/B testing tools, email marketing systems, and more… each focused on its own silo.
  • Trustworthy – 29% of companies measure the monetary cost of poor data quality. Things as simple as monitoring multiple systems for customer contact information updates can save millions of dollars.
  • Relevant – 43% of companies are dissatisfied with their tools’ ability to filter out irrelevant data. Something as simple as filtering customers from your web analytics can provide a ton of insight into your acquisition efforts.
  • Secure – The average data security breach costs $214 per customer. The secure infrastructures being built by big data hosting and technology partners can save the average company 1.6% of annual revenues.
  • Authoritative – 80% of organizations struggle with multiple versions of the truth depending on the source of their data. By combining multiple, vetted sources, companies can produce a single, highly accurate source of intelligence.
  • Actionable – Outdated or bad data results in 46% of companies making bad decisions that can cost billions.
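
The “Relevant” point above is easy to act on. Here is a minimal Python sketch (the record layout and IP ranges are invented for illustration) of filtering your own internal traffic out of web-analytics records before analysis:

```python
from ipaddress import ip_address, ip_network

# Hypothetical internal ranges; substitute your office/VPN networks.
INTERNAL_NETWORKS = [ip_network("10.0.0.0/8"), ip_network("192.168.0.0/16")]

def is_internal(ip):
    """True if the address falls inside any internal network."""
    addr = ip_address(ip)
    return any(addr in net for net in INTERNAL_NETWORKS)

def filter_external(hits):
    """Keep only hits from real visitors, dropping internal/test traffic."""
    return [hit for hit in hits if not is_internal(hit["ip"])]

hits = [
    {"ip": "10.1.2.3",     "page": "/pricing"},   # employee on the VPN
    {"ip": "203.0.113.9",  "page": "/pricing"},   # genuine visitor
    {"ip": "192.168.5.20", "page": "/signup"},    # office workstation
]
print(filter_external(hits))
# → [{'ip': '203.0.113.9', 'page': '/pricing'}]
```

A few lines like these, applied upstream of your reporting, keep acquisition metrics from being inflated by your own team’s clicks.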