Correlation, Causation, and Meta Ads Mistakes

When you change your targeting and results improve, or implement the GA4 integration and performance tanks, it's tempting to assume one caused the other. Jon explains why advertisers constantly mistake correlation for causation and what you should do instead of jumping to conclusions.
So today's episode is for the statistics nerds and philosophy nerds.
At some point, you may have heard someone say, "Correlation does not imply causation." A proud moment of fatherhood for me was when I was watching a football game with my sons. The announcers started talking about some nonsense regarding how one team needed to run the ball a certain number of times to win. For example, they said, "If they run the ball 25 times in a game, they win."
That's a bad logical argument. Teams that win often run the ball more because they have the lead, while a team that's behind is forced to pass. My oldest son looked at me and said, "Correlation does not imply causation." I had never been prouder.
This is a critical concept in advertising. What does it mean, how does it describe common mistakes, and what can we do to avoid them?
This is one of my favorite topics. I was even a philosophy major back in the day. We definitely covered this, though it’s all kind of a blur.
Correlation and causation both describe a statistical relationship between two variables, but they are not the same thing. Correlation means two variables move together: when one changes, the other tends to change as well. In advertising terms: you changed X, and results went up or down. Causation means that a change in one variable directly produces the change in the other. But just because you changed X and results changed doesn't mean that X caused the change.
Let’s go over a real-world example. There’s a correlation between hot chocolate sales and flu cases. When hot chocolate sales go up, flu cases go up. Does that mean hot chocolate causes the flu? Of course not. Flu cases rise in the winter. People also drink more hot chocolate in the winter. There’s correlation, but not causation.
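To make that concrete, here's a minimal simulation of the idea in Python. Everything in it is hypothetical and invented for illustration: a hidden "cold weather" driver pushes both series up in winter, neither series touches the other, and yet the measured correlation comes out close to perfect.

```python
# A minimal sketch of a confounded correlation. The variable names and
# numbers are hypothetical; only numpy is assumed.
import numpy as np

rng = np.random.default_rng(42)

# Season is the hidden confounder: 1.0 in the coldest months, 0.0 in the warmest.
months = np.arange(24)
cold = (np.cos(2 * np.pi * months / 12) + 1) / 2

# Both series are driven by cold weather plus independent noise;
# neither one causes the other.
hot_chocolate_sales = 1000 * cold + rng.normal(0, 50, size=24)
flu_cases = 500 * cold + rng.normal(0, 30, size=24)

r = np.corrcoef(hot_chocolate_sales, flu_cases)[0, 1]
print(f"Correlation: {r:.2f}")  # strongly positive, despite zero causal link
```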
We know that’s ridiculous, but we do this kind of thing in advertising all the time.
Here are some examples.
Let’s say you're not getting results when using Advantage+ Audience with detailed targeting as a suggestion. You switch to using “digital marketers” as your audience suggestion and see improved results. You assume that targeting digital marketers caused the improvement. There's a correlation, but it's unlikely to be the cause. Audience suggestions often have little impact, and the improvement was likely random variance.
Here’s another example. You pause your ads, and sales go up. Your client becomes convinced that ads were hurting revenue. But they also sent out a big email promotion and increased organic efforts at the same time. Those likely caused the spike in sales. The variables were correlated, but one didn’t cause the other.
Another example: an advertiser implements the GA4 integration, and results immediately tank. It’s tempting to assume causation. But there's no evidence that the GA4 integration has any impact on attribution or optimization. Plenty of other factors could be at play.
You could apply this to so many things: switching from narrow to broad targeting, combining ad sets, using emojis. If performance changes, it’s tempting to think the last thing you changed was the cause. That’s rarely the case.
Bottom of the Glass
When trying to determine whether a change caused your results to go up or down, consider the following:
First, limit the changes you make at one time. If you change your budget, campaign structure, and ads all at once, and results drop, you can’t be sure which change caused it. A/B tests isolate a single variable, but even those aren’t perfect. I’ve seen a 25% difference in performance when testing identical ad sets.
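To put a number on that, here's a quick A/A simulation. It's a sketch under made-up assumptions (a 2% true conversion rate and 5,000 impressions per ad set, both hypothetical): two truly identical ad sets, with nothing but chance separating them, still produce gaps of 25% or more a meaningful share of the time.

```python
# A hedged sketch of an A/A test: two "identical" ad sets with the same true
# conversion rate. The volume and conversion rate are hypothetical.
import numpy as np

rng = np.random.default_rng(7)

true_cvr = 0.02      # both ad sets convert at exactly 2%
impressions = 5000   # per ad set, per trial

diffs = []
for _ in range(10_000):
    a = rng.binomial(impressions, true_cvr)
    b = rng.binomial(impressions, true_cvr)
    diffs.append(abs(a - b) / max(a, b))  # relative gap between identical ad sets

diffs = np.array(diffs)
print(f"Median gap between identical ad sets: {np.median(diffs):.0%}")
print(f"Trials with a 25%+ gap: {(diffs >= 0.25).mean():.0%}")
```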
Second, understand that correlation rarely implies causation. Before crediting a change, the results should be meaningful given your volume, and ideally predictable and repeatable. You also need to rule out other factors like market conditions, algorithm shifts, seasonality, or organic traffic.
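As a rough illustration of "meaningful given your volume," here's a standard two-proportion z-test with hypothetical conversion counts. A 33% lift looks dramatic, but at this volume it's well within what noise alone can produce.

```python
# A back-of-the-envelope check on whether a difference is meaningful given
# volume, using a standard two-proportion z-test. The counts are hypothetical.
from math import sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z-score for the gap between two observed conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# 40 vs. 30 conversions on 2,000 impressions each looks like a 33% lift...
z = two_proportion_z(40, 2000, 30, 2000)
print(f"z = {z:.2f}")  # ~1.21, well short of the ~1.96 needed for 95% confidence
```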
Finally, be comfortable with uncertainty. Don’t be so quick to assume you’ve found the cause. Note the correlation, but look for strong evidence before assuming a cause-and-effect relationship.
You don’t need to ignore correlation entirely. But don’t build your strategy around a relationship that might not be real.