Big Data (BD) is not a “thing” (noun). Big Data is a set of actions (verb), orchestrated by people, using specialized tools (software, hardware, algorithms) to organize data streams for the purpose of gaining additional insight. The quantity of data, in and of itself, does not equate to Big Data. The processes used to correlate and draw inferences from (potentially) disparate data streams, ultimately leading to insight, are the soil in which BD thrives.

Nestlé Waters North America, as documented by Marketing Land, demonstrates the power of a creative Big Data strategy.

Antonio Sciuto (CMO @ Nestlé Waters North America): “We listen [to] our consumers on social platforms to understand: conversation topics, share of conversation by social platform, tone, consumer sentiment, roles and consumer engagement rules by media touch-point.

This understanding is enabling us to define the right opportunities to engage consumers with content and the right calls to action by online and offline touch-points. Our whole consumer journey is managed by leveraging… marketing cloud solutions, powered by Salesforce, that allow us to listen, analyze, and engage consumers by automating consumer interactions.

Our mission is to build communities around our brands and content based on the real needs of our audience, offering them a truly personalized omni-channel experience to deepen their engagement with our brands. Success to be measured in market share and loyalty to our brands.”

In 1998 a company was launched with the following mission:

“To organize the world’s information and make it universally accessible and useful.”

With the clarity provided by 17+ years of hindsight, Google’s sublime mission statement not only signals the modern era of Big Data, it also provides a succinct, human-readable, and aspirational “true north” when setting expectations for all Data Science outcomes, including BD.

The following, based on Google’s mission statement, attempts to illustrate a theory of Big Data relativity:

B = D × Q × E

B = Big Data Cost (a.k.a. cost to achieve a “useful” and/or “insightful” result)

D = Number of discrete data streams (a.k.a. “information”)

Q = Average quantity of information per data stream

E = Effort required to process

Working with the assumption that a “useful” and/or “insightful” result is the objective of any BD exercise, we see that although the quantity of inputs (data streams) impacts the cost, the more significant driver is the level of effort required to achieve a result. From a practical business perspective, the key to implementing a cost-effective Big Data strategy is not the quantity of data or the number of streams, but the human and machine effort required to achieve a “useful” or “insightful” result.
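The cost model can be sketched in a few lines of code. This is purely illustrative (the function name, parameter names, and numbers are invented for this sketch), reading the formula as a simple product of its three terms:

```python
def big_data_cost(streams, avg_quantity, effort):
    """Illustrative cost model: B = D * Q * E.

    streams      -- D, number of discrete data streams
    avg_quantity -- Q, average quantity of information per stream
    effort       -- E, human and machine effort required to process
    """
    return streams * avg_quantity * effort

# Doubling the number of streams doubles the cost:
baseline = big_data_cost(streams=4, avg_quantity=10, effort=2)  # 80
doubled = big_data_cost(streams=8, avg_quantity=10, effort=2)   # 160

# Halving the effort (e.g. via better tooling) cuts cost just as directly:
leaner = big_data_cost(streams=4, avg_quantity=10, effort=1)    # 40
```

The practical point survives the toy model: data volume and effort both scale cost, but effort is the lever an organization can actually pull down.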

If we accept the premise that “Big Data” is a verb, any organization looking to achieve cost-effective results from the analysis of their data must consider the following:

  1. Do you have the expertise (a.k.a. data scientists), from the beginning, to build an effective strategy?
  2. Do you have the software and hardware tools to accomplish the required level of data ingestion and analysis?
  3. Are your resources (human, software, hardware) capital expenses, or are they elastic (operational expenses)?

Understanding (and appreciating) the linguistic difference between Big Data (verb) and data (noun) will allow individuals and enterprises to collaborate more effectively on strategies that achieve outcomes of insight and value.

Is your business prepared for the new state of analytics? Download the free report to learn more.