Data Gravity — Why Should Marketers Care?

Published on February 14, 2018

Illustration of data gravity for applications and services

Dave McCrory first introduced the concept of Data Gravity back in 2010, referring to Salesforce’s Database.com launch. The big idea of data gravity in his own words is —

As data accumulates (builds mass) there is a greater likelihood that additional services and applications will be attracted to this data. This is the same effect gravity has on objects around a planet. As the mass or density increases, so does the strength of gravitational pull.

That sounds interesting in theory, but what does it mean in practice? As the volume of data generated grows exponentially, it becomes critical for applications to move toward where the data is stored in order to maximize throughput and reduce latency, much as the Von Neumann bottleneck makes a processor's effective speed depend on how quickly it can reach its memory. And just as gravitational pull increases with mass, applications that sit close to the data gain value as its volume increases.

Historically, enterprise data was stored locally, so Tableau was built as a desktop application. As enterprises move from on-prem to cloud storage, however, keeping the BI platform and the data store separate adds latency and slows down performance.

To optimize for performance, cloud providers like Amazon and Microsoft have begun to adapt, launching their own BI applications such as Amazon QuickSight and Microsoft Power BI. These solutions sit right on top of the data and perform far better than legacy solutions.

In a world where marketing leaders will spend more than $100 billion by 2019 on search, display, social, and email marketing — a figure that surpasses broadcast and cable television spend (Source) — marketing data is probably the most massive of all. It's surprising, then, that a majority of performance marketing leaders still rely on solutions like Tableau that were never built for marketing and sit several layers away from the data.

There are multiple problems with Tableau. First, though it has led the wave in data visualization, the latency that arises from cloud deployments can severely hinder its ability to handle large marketing datasets. As data volume increases and real-time performance insights require more and more pulls, the latency adds up (Source). Insights have a quick expiry, which leaves Tableau severely handicapped for marketers' needs.
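The way per-query latency compounds can be shown with back-of-envelope arithmetic. The figures below (query count and round-trip times) are invented for illustration, not measurements:

```python
# Hypothetical numbers: a dashboard refresh that issues many sequential
# pulls pays the network round trip on every one, so the distance between
# the BI tool and the data store dominates total wait time.

queries_per_refresh = 200
remote_rtt_s = 0.050      # BI tool deployed far from the data store (50 ms)
co_located_rtt_s = 0.002  # application sitting on top of the data (2 ms)

remote_wait = queries_per_refresh * remote_rtt_s       # ~10 seconds per refresh
co_located_wait = queries_per_refresh * co_located_rtt_s  # ~0.4 seconds per refresh

print(f"remote: {remote_wait:.1f}s, co-located: {co_located_wait:.1f}s")
```

The same workload goes from roughly ten seconds to under half a second simply by moving the application next to the data, which is the data-gravity argument in miniature.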

There's a second, deeper problem with Tableau. It boils down to the power of context. Suppose you are presented with an extension-less file, DSC0001. If you open it with a text editor, all you see is junk that makes no sense. But in the text dump you notice the words Canon PowerShot. You guess that this is an image file and add the extension .jpg, and now you can see the actual image! That's one layer of context: understanding that cameras generate .jpg files.
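This guessing game can actually be automated: file formats announce themselves with well-known "magic bytes" at the start of the file (JPEG files begin with FF D8 FF), and camera JPEGs embed readable maker strings in their metadata, which is why "Canon PowerShot" shows up in a text dump. A minimal sketch, with a fabricated byte string standing in for the real DSC0001:

```python
def guess_extension(raw: bytes) -> str:
    """Best-guess file extension from well-known magic bytes."""
    if raw.startswith(b"\xff\xd8\xff"):       # JPEG start-of-image marker
        return ".jpg"
    if raw.startswith(b"\x89PNG\r\n\x1a\n"):  # PNG signature
        return ".png"
    return "unknown"

# A fake stand-in for DSC0001: JPEG magic bytes followed by binary junk
# that happens to contain a readable camera maker string.
fake_dsc0001 = b"\xff\xd8\xff\xe1" + b"\x00" * 8 + b"Canon PowerShot" + b"\x00" * 8

print(guess_extension(fake_dsc0001))     # the bytes identify it as a .jpg
print(b"Canon PowerShot" in fake_dsc0001)
```

The point is that the raw bytes never changed; what changed is the context applied to them.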

Let us add another layer of context: you upload the file to Google Photos. It recognizes the file and automatically shows metadata like dimensions, resolution, f-stop, shutter speed, and ISO. Now you not only have the image, but also a host of information that helps you understand why the photo looks the way it does. Is it too blurry? One look at the shutter speed tells you why.

Illustration of how data contextualized into information can be combined with knowledge to take action

Data by itself is useless. Applications need depth of context to help marketers make sense of it. Adding context is what turns data into useful information, which, combined with prior knowledge, facilitates action. And this is exactly where Tableau, and even QuickSight and Power BI, fall flat. As horizontal BI tools, they can ingest any dataset. But only an application with marketing context understands that Quality Score cannot simply be summed up; it needs to be treated as a weighted average.
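The Quality Score point is concrete enough to sketch. The keyword names and numbers below are invented for illustration, and weighting by impressions is one reasonable choice of weight, not the only one:

```python
# Two hypothetical keywords from a paid-search account.
keywords = [
    {"name": "running shoes", "quality_score": 9, "impressions": 10_000},
    {"name": "buy sneakers",  "quality_score": 4, "impressions": 90_000},
]

# A context-free BI tool aggregating the column blindly might just sum it:
naive_sum = sum(k["quality_score"] for k in keywords)  # 13, a meaningless number

# A marketing-aware application weights each score by its share of traffic:
total_impressions = sum(k["impressions"] for k in keywords)
weighted_qs = sum(
    k["quality_score"] * k["impressions"] for k in keywords
) / total_impressions

print(naive_sum)               # 13
print(round(weighted_qs, 2))   # 4.5
```

Note how the weighted figure of 4.5 tells the true story: the low-scoring keyword dominates the traffic, so the account is in far worse shape than the high score on the small keyword suggests.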

Until now, marketers have been forced to adapt their workflows to BI tools that are slow, cross-functional, and controlled by IT, instead of using a fast, flexible marketing-centered application built to empower their existing workflows. Hence the case for an application that not only sits right where the data is, but also has all of the marketing context built in — one that works with marketing that is always-on, rather than with BI systems built primarily for historical analysis.

That's exactly what we at Granular are building. Some of the largest data-driven companies and agencies around the world have transformed the way they leverage their marketing data using Granular. To learn more, get in touch!