“Follow the science” has become something of a COVID-era catchphrase. And while it’s a well-intentioned sentiment — we should absolutely listen to what science tells us — it’s also limited and simplistic.
The phrase is problematic because it reduces the whole scientific method to a three-word directive. Worse, it doesn’t actually tell you what to do with that science you’re now blindly following.
Believe it or not, the phrase “data-driven marketing” is just as reductive and shortsighted. Marketing cannot be completely driven by data in a spreadsheet. Why? Because, while data is undeniably powerful and necessary, it’s not definitive. It doesn’t magically reveal a clear-cut next step. Even with the very best data in hand, marketers still need to think critically. They need to use their own brainpower to decide what to do with that spreadsheet data.
To avoid these limiting data-only mindsets, science and marketing professionals alike should employ a guess, test, and revise process — basically, the scientific method. This allows you to leverage your best data and your own critical thinking skills to inform well-rounded, holistic, and strategic decisions and next steps. Next steps rooted in data and human expertise.
Why You Can’t Just “Follow the (Data) Science”
The superintendent of a school district can’t simply “follow the science” to decide if she should reopen schools during a pandemic.
Yes, the science does give her information and data about the safety of reopening schools. But it stops short of making the decision for her. Instead, she must tap into her own reasoning to apply the science, the data, to her specific situation.
In addition to this scientific data, she has to account for the opinions of the school board, teachers, staff, and parents as well as local infection rates, the efficacy of her school buildings’ HVAC system, vaccination availability, etc. Her school reopening decision is far from obvious, and there are so many factors she has to grapple with alongside the science.
What’s more, superintendents in different parts of the country might make different decisions with the same scientific data because of their own biases, their interpretations of that data, and their surrounding circumstances.
Clearly, it’s not as cut and dried as, “the data says X, so we’ll do Y.”
Where Marketing Data Falls Short — Understanding Human Behavior
Even with insightful data and robust martech tools, the marketer’s path to sales success is no clearer than the superintendent’s school reopening decision. Not to mention, marketing data can just as easily be interpreted and applied differently depending on the circumstances.
Take this example: A healthcare company needs to hire additional physicians. The CMO launches an extensive marketing campaign to target physicians fresh out of med school. The campaign includes direct mail, LinkedIn ads, conference booths — the whole nine yards. One particularly expensive aspect of the campaign is a card mailed to promising prospective hires. The card includes a link that these hires can visit to claim an elaborate care package.
As the CMO analyzes the campaign data, he realizes very few people clicked that link. He concludes, based on that one data point, that the care package wasn’t worthwhile. It didn’t directly result in any new hires.
The problem is that human behavior — physician behavior, in this instance — is much more nuanced than one data point can show.
What if one of the physicians who received the care package card wasn’t in the market for a new job at that time? What if, instead, he came across this company’s LinkedIn ad a year later, when he was looking for work, and remembered the logo from that card? That card, which he recalled fondly, nudged him to convert on the LinkedIn ad. Sure, he ultimately converted on LinkedIn, but the care package was an integral, inciting part of his conversion journey. Again, the data couldn’t have accounted for this.
All of this means that you can’t dismiss entire marketing tactics based solely on data. You have to think critically and holistically about the campaign, data points and otherwise, before you make judgments about what to do next.
The Scientific Method Leverages Powerful Data and Your Expertise
If you feel there’s value in your care package-type marketing tactic, don’t do away with it outright. Even if the data doesn’t make as compelling a case for this tactic as it might for something like Google Ads, you should employ a guess, test, and revise method before you write it off completely.
As mentioned, guess, test, and revise is the scientific method boiled down to its core. You have a hypothesis, you test it, and then you revise it based on your findings. This isn’t blindly “following the science.” It’s using the science as a stepping stone or a tool to optimize your own hypothesis.
Let’s return to our care package example one last time. Rather than toss the direct mail portion of the campaign out, the CMO could revise it. Optimize it. So the data says not enough people are converting on the link in the card. Should the card copy change? How about the link — can it be shortened? Or can design add a QR code for easier access to the care package webpage?
Try something new, test it, and revise again if necessary.
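For the analytically inclined, here’s a minimal sketch of what one turn of that loop might look like in code. All the numbers are hypothetical (made up for illustration), and the two-proportion z-test is just one common way to check whether a revised card actually out-performs the original rather than benefiting from random noise.

```python
# A minimal sketch of the "guess, test, revise" loop for the card example.
# All figures here are hypothetical; the point is the process, not the data.
from math import sqrt, erf

def two_proportion_z(clicks_a, mailed_a, clicks_b, mailed_b):
    """Two-proportion z-test: is variant B's click rate genuinely higher?"""
    p_a, p_b = clicks_a / mailed_a, clicks_b / mailed_b
    pooled = (clicks_a + clicks_b) / (mailed_a + mailed_b)  # rate under the null
    se = sqrt(pooled * (1 - pooled) * (1 / mailed_a + 1 / mailed_b))
    z = (p_b - p_a) / se
    p_value = 1 - 0.5 * (1 + erf(z / sqrt(2)))  # one-sided p-value
    return z, p_value

# Guess: a QR code will lift clicks. Test: mail both versions of the card.
old_clicks, old_mailed = 12, 400   # original card, typed-in link (hypothetical)
new_clicks, new_mailed = 31, 400   # revised card with a QR code (hypothetical)

z, p = two_proportion_z(old_clicks, old_mailed, new_clicks, new_mailed)
print(f"old rate {old_clicks / old_mailed:.1%}, new rate {new_clicks / new_mailed:.1%}")
print(f"z = {z:.2f}, one-sided p = {p:.4f}")
# Revise: if p is small, keep the QR code; if not, form a new guess and test again.
```

The statistics here aren’t the point — the loop is. The data tells you whether your revision moved the needle; your own judgment decides what to try next.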
We’re not arguing that you should stick to tired or fruitless marketing tactics if nothing you do makes them better. Just don’t let data alone decide that for you; data can’t do the work of humans.
Think critically about what your data might be saying to you, and leave room for other, non-data factors to influence your marketing plans too.
At the end of the day, data, though powerful, only gives us half or (at best) three-quarters of the picture. It’s your expert guidance that fills in the rest, acting on that data to provide value for your company.