As a marketing professional for the better part of four decades, I can’t imagine any of us doing our jobs without soaking up as much data as possible to help with our decision-making.
And data accessibility is miles ahead of where it was when I first entered the marketing field. Back in the day, “finding data” meant a trip to the reference library to access government reports or other published research, assuming you were lucky enough to be located within a reasonable distance of one.
There was the phone for real-time information-gathering … and also the fax machine for quick receipt of “facts in brief” — not to mention the “wait-and-wish-for” mail and package delivery services.
If it was insight you needed from customers or prospects about a new industry or business venture, primary research was always an option — if you had the money and the time to allocate to the effort.
As for “first-party” data, that was available as well – but how often were we at the mercy of the bureaucratic machinations of in-house IT departments to get even basic data requests processed in a timely way?
All of which is to say that marketers have always used data, but the quantities were smaller and acquiring the data moved at a snail’s pace compared to today’s reality.
But now, having become quite spoiled by the availability of all sorts of information, might it be that we’re regressing a little?
In particular, third-party information purchased in bulk, often from data aggregators, seems to be where the backsliding is occurring.
Consider ad targeting and building audiences: We have access to valuable first-party data thanks to website analytics and studying the results of our own e-mail campaigns.
There’s no question that the first- and second-party data that marketers can access are highly valuable: the information creates campaign efficiencies and drives higher conversion rates. In theory, layering on accurate third-party data would take things even further.
There’s also third-party behavioral data from the three behemoths (Google, Facebook and Amazon) that can be used for MarComm targeting purposes. But of those three platforms, only one allows its third-party data to be made publicly available to end-users.
This poses challenges for the suppliers that aggregate and sell third-party data, as the quantity and quality of their information aren’t on the upswing at all.
Fundamentally, finding a good source for third-party data entails understanding what sources each data aggregator is using and the methodology it employs to collect the data. Factors of scale, quality, reputation and price also come into play.
But despite best efforts, when third-party data is tested in MarComm campaigns and lead-generation programs, the results are often pretty ugly: data loaded with inaccuracies that drags down efficiency metrics.
It doesn’t help that with the rise of Amazon as yet another “walled garden” of data, the “open web” represents an ever-smaller portion of total ad spend, and hence also a decreasing amount of the third-party data that’s available to end-users.
The increasingly suspect veracity of third-party data has had an interesting effect on data management platforms, which are now focusing more on the messages themselves than on the “personas” of the people receiving them or how they were identified and targeted.
Is it possible for third-party data to provide good information to AI systems — intelligence that can verify and augment the value of the first-party data? If leading ad platforms can use such third-party data to enhance the accuracy and value of what they sell to advertisers, there still may be valuable material to work with. As it stands, though, I’m not sure that’s the case.
What are your experiences? Please share your perspectives with other readers here.