In this op-ed, Kevin Wong, Co-Founder and CTPO of Symphonics AI, argues that advertising is entering a fundamental reset, one that replaces surveillance-led identity tracking with signal-driven understanding. For decades, the industry has relied on knowing who a person is by following them across the web. But as Wong outlines, the next era won’t be built on identifiers at all. It will be built on signals – the patterns of behaviour, language and intent that reveal not just where people have been, but what actually drives them.
The advertising industry has been built on a simple premise: to reach the right person, you need to know who that person is.
Track them.
Identify them.
Follow them across the web.
For two decades this premise drove one of the most sophisticated technological ecosystems ever built – and it worked well enough, for long enough, that the industry stopped questioning whether the premise itself was sound.
I have spent nineteen years with my hands on this problem from every angle.
Auditing the programmatic supply chain and watching value disappear between systems that were never designed to talk to each other.
Sitting inside agency trading desks, watching carefully constructed briefs lose their strategic intent the moment they entered a platform and an algorithm took over.
Working inside a DSP, understanding exactly where execution infrastructure’s competence ends and something else needs to begin.
Building AI-native audience intelligence without a single identifier – and realising that the signals to understand people at genuine depth had always existed.
The computational tools to act on them simply had not.
The question that kept surfacing was not how to make the existing model work better. It was whether the model was asking the right question in the first place.
The model we built, and what it missed
The identity-based model was not a deliberate choice to track people rather than understand them. It was an engineering solution to a genuine limitation.
In the early years of digital advertising, the infrastructure to read complex behavioural signals at scale simply did not exist.
What did exist, however, was the ability to drop a cookie, record a page visit, and match that identifier elsewhere.
So, the industry built on what it had.
Cookies became the currency.
Identity graphs became the asset.
And a generation of ad tech was constructed on the assumption that knowing where someone had been was a reasonable proxy for knowing who they were.
It was never a great proxy. It was the best one available.
The problem was not that identity infrastructure existed. The problem was that it got repurposed. Attribution technology became audience technology.
A tool designed to measure what happened after the fact was stretched to answer a fundamentally different question: who should we be trying to reach in the first place?
A trail of where someone has been is not the same as an understanding of who they are.
Recognition tells you who came back.
Understanding tells you who to go find.
The industry needed both and only built one.
The technology to do it differently now exists
The signals that genuinely reveal who a person is have always existed – they were always more revealing than any identifier trail.
What was missing was the computational sophistication to act on them at scale.
Consider what characterises an audience: not the pages they visited, but the content they chose to engage with.
The language they use.
The topics that recur across their attention.
The psychographic patterns that connect their interests into a coherent picture of who they are and what motivates them.
Large language models can now read a brief and extract not just what was said but what was meant – the underlying human picture that a skilled planner would intuit from years of category experience, now derivable systematically and in real time.
That is not incremental improvement on the old model. It is a categorically different capability applied to a problem the old model was never designed to solve.
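To make that concrete, here is a deliberately simplified sketch in Python – an illustration of the idea, not a description of any particular product or model API. The call_llm stub, the field names and the example brief are all hypothetical; the point is only that the output of this step is a structured, inspectable picture of the audience rather than a trail of identifiers.

```python
import json


def call_llm(prompt: str) -> str:
    """Hypothetical stand-in for a large language model call.
    Returns a canned answer here so the sketch runs end to end."""
    return json.dumps({
        "motivations": ["value for money", "low-effort switching"],
        "recurring_topics": ["household budgeting", "energy tariffs"],
        "language_cues": ["plain numbers", "no jargon"],
    })


def profile_from_brief(brief: str) -> dict:
    """Ask the model for what was meant, not just what was said,
    and return it as structured data a planner can read and challenge."""
    prompt = (
        "Read this campaign brief and describe the audience behind it as JSON "
        "with keys motivations, recurring_topics and language_cues:\n" + brief
    )
    return json.loads(call_llm(prompt))


if __name__ == "__main__":
    profile = profile_from_brief(
        "Reach people who are actively reconsidering their energy supplier."
    )
    print(json.dumps(profile, indent=2))
```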
What we discovered when we started building on this foundation is that the value is not just in targeting accuracy. It is in explainability.
When the brief, the audience, and the inventory selection all derive from the same upstream intelligence, a planner can point to exactly why every decision was made.
Explainability is what turns a result into a repeatable process rather than a lucky outcome – and it is precisely what the industry has been missing since the algorithm took over the brief.
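Continuing the same hypothetical sketch, explainability falls out almost for free once every downstream choice is derived from that single audience profile: each decision can carry the piece of the profile, and therefore of the brief, that produced it. The names and fields below are illustrative assumptions, not a real planning system.

```python
from dataclasses import dataclass


@dataclass
class PlanDecision:
    """One planning choice together with the reasoning that produced it."""
    choice: str
    rationale: str


def explainable_plan(profile: dict) -> list:
    """Derive targeting choices from the brief-derived audience profile,
    attaching to each one the part of that profile that justifies it."""
    decisions = []
    for topic in profile.get("recurring_topics", []):
        decisions.append(PlanDecision(
            choice=f"prioritise inventory about '{topic}'",
            rationale=f"'{topic}' recurs across this audience's attention",
        ))
    for cue in profile.get("language_cues", []):
        decisions.append(PlanDecision(
            choice=f"favour creative built around '{cue}'",
            rationale=f"the profile says this audience responds to '{cue}'",
        ))
    return decisions


if __name__ == "__main__":
    demo_profile = {
        "recurring_topics": ["household budgeting"],
        "language_cues": ["plain numbers"],
    }
    for decision in explainable_plan(demo_profile):
        print(decision.choice, "->", decision.rationale)
```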
Where does this lead?
Follow this thinking to its natural conclusion and something becomes clear that the industry has not yet fully stated aloud.
If an audience can be understood deeply enough through immutable signals and behavioural patterns – not where they have been, but who they are – then the same intelligence that defines that audience for targeting also describes how they think.
What they respond to.
What language resonates.
What falls flat.
This means a genuine audience model is not just a better targeting input. It is a simulation capability. The same unified intelligence that tells you where to run can also tell you how the audience will respond before a single impression is served – not by surveying real people, but by applying the same signal intelligence to a question about response rather than reach.
The line between planning and research dissolves when both draw from the same upstream intelligence.
The platforms built their advantage on knowing where people had been.
The next era will be built on understanding who they are.
That shift, however, is not a threat to the industry.
It is the industry finally solving the problem it always meant to solve.

