The launch of digital media back in the early 2000s saw great excitement from marketers across the globe when they realised they could finally measure actual ad interactions, not recall-based metrics, writes Alice Almeida, regional head of research, data, and insights at SQREEM Technologies.
Digital measurement businesses were suddenly booming off the back of a single metric: clicks, something traditional media could never offer or track.
Readership? TARPs? Sure, they were nice to have. But the real question became: “How many people clicked?” We all became click-obsessed.
The Publisher’s Dilemma
For publishers, it quickly became clear that this obsession was going to be a problem. Campaign performance, and by extension, publisher performance, was being judged purely by how many clicks were delivered, even though we had little control over some of the biggest factors influencing that number: poor creative, weak calls to action, irrelevant messaging.
Our job as publishers was to deliver the audience and the platform. The rest (creative strength, timing, cultural relevance) was out of our hands.
To steer attention away from click mania, we introduced Advertising Effectiveness (AdE) studies, which soon became the next big thing. These studies focused on measuring brand impact: shifts in awareness, consideration, recommendation, sentiment, and purchase intent.
AdE studies became so integral to campaigns that they also turned into a bargaining chip in client negotiations. And I use the word “negotiations” loosely. It usually went something like:
Agency: “Run an AdE study or you won’t get this revenue”
Publisher: “Guess we’re running the study.”
They became so popular that we had to cap how many we conducted each month; the user experience had become chaotic and frustrating. When someone chose to write “Fxxk you for ruining this website” in every open-text field, we knew something needed to change.
But as with all trends, overuse of AdE led to fatigue across the industry. Vendors began standardising templates, stripping away the ability to tailor studies to specific campaign objectives. I once had to convince someone that a standard Brand Awareness AdE template for Telstra was a waste of money. But we still did it, because #revenue.
The Return of the Click
Eventually, AdE fatigue set in, and CTR started creeping back into the spotlight. AdE was labelled “fluffy,” while clients began demanding “real” numbers again:
“How many people clicked through to my website?”
“How many units did we sell from that banner?”
CTR was easy to report and grounded in actual user behaviour. Digital ad creative and messaging matured and improved. As an industry, we learnt how to use CTR as a metric correctly. But the question still stands: is it really an effective measure of success?
It’s a piece of the puzzle, sure. But any CTR figure should come with a giant disclaimer, and here’s why.
1. The Fat Finger Effect
How many times have you tried to scroll on your phone, only to accidentally tap an ad? Or gone to close a pop-up and missed the tiny “X,” landing on the ad instead? For me, it’s a daily occurrence, and not because my eyesight’s fading as I inch closer to my mid-40s, but because many placements are designed to make you click by mistake.
While exact numbers are hard to pin down, studies across multiple ad networks estimate that 10 per cent to 50 per cent of clicks are accidental or unintentional, and that’s before factoring in fraudulent or invalid clicks. When you consider that the average CTR for display ads is only 0.05 per cent to 0.1 per cent, those accidental clicks represent a huge distortion.
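To make the scale of that distortion concrete, here is a rough back-of-envelope sketch. The figures are illustrative midpoints of the ranges cited above, not measured data:

```python
# Back-of-envelope sketch: how much of a reported CTR survives once an
# assumed share of accidental (or otherwise invalid) clicks is removed.
# All rates here are illustrative, drawn from the ranges discussed above.

def true_ctr(reported_ctr: float, accidental_rate: float) -> float:
    """Strip an assumed fraction of accidental clicks from a reported CTR."""
    return reported_ctr * (1 - accidental_rate)

reported = 0.001  # a 0.1% reported display CTR, the top of the typical range
for accidental in (0.10, 0.30, 0.50):
    adjusted = true_ctr(reported, accidental)
    print(f"{accidental:.0%} accidental clicks -> intentional CTR {adjusted:.3%}")
```

At the pessimistic end, half of an already tiny 0.1 per cent CTR evaporates, which is why a headline CTR figure needs that disclaimer.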
2. The Bot Problem
Beyond the “fat finger” issue, bots are another major contributor to inflated CTRs. Automated clicks can account for 10 per cent to 30 per cent of all ad interactions, making it almost impossible to trust CTR as a genuine reflection of audience engagement.
While the industry is making progress in detecting and filtering invalid traffic, we’re still a long way from being able to take CTR at face value.
3. When Creative Repels Clicks
Back in 2010, I worked on an Advertising Effectiveness study for a major financial institution’s rebrand campaign in Australia. They went all in, saturating every media channel with their bold new mascot. (Out of respect for the brand, which I genuinely like and am a customer of, I won’t name them.)
The results were disastrous. It was one of the first studies I’d seen that showed a 20-point decline across most brand metrics, and the CTR was practically non-existent. Because the campaign had a strong call-to-action and CTR was the key success measure, both the client and agency were far from pleased.
The publisher I worked for took the blame. We faced demands for “make goods,” “bonus spots,” and even threats to pull spend. To salvage the relationship, we offered an additional research piece. This time, I added a few creative-focused questions, because I had my suspicions (I strongly disliked the new mascot).
The findings confirmed it: Australians hated the creative. Not mildly disliked, hated. Many found it creepy and actively avoided it. In this case, it wasn’t the publisher or the placement that failed. It was the creative that repelled clicks entirely.
4. High CTR ≠ Campaign Success
Even when a campaign achieves a strong CTR, it tells only part of the story. What happened after the click? Did users find what they were promised? Did they sign up, purchase, register, or engage meaningfully?
Without context, CTR alone is meaningless. Tools like Google Analytics now let us track far richer signals: dwell time, bounce rate, pages per session, all of which paint a clearer picture of whether your campaign genuinely worked.
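As a simple illustration of why the post-click numbers matter more than the click itself, consider a hypothetical campaign (all figures invented for the example):

```python
# Hypothetical campaign numbers, purely for illustration:
# a million impressions, a respectable-looking click count,
# and the outcome that actually mattered (sign-ups).

def rate(events: int, base: int) -> float:
    """Simple ratio, guarded against a zero denominator."""
    return events / base if base else 0.0

impressions, clicks, signups = 1_000_000, 900, 18

ctr = rate(clicks, impressions)     # the number that gets reported
conversion = rate(signups, clicks)  # the number that tells you if it worked
print(f"CTR {ctr:.2%}, click-to-signup {conversion:.1%}")
```

A 0.09 per cent CTR could be hailed as above average for display, yet only 18 people did the thing the campaign existed for. The click count alone would never surface that.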
CTR is just one piece of a much larger story.
The Bottom Line
Clicks might be easy to count, but not everything that can be counted counts. CTR is useful as a directional indicator, but it’s a weak measure of true marketing performance overall.
In a world where audiences swipe, scroll, skip, and block faster than ever before, success isn’t about who clicked, it’s about who noticed, connected, and acted.
It’s time to move beyond the click.

