Remember when “just Google it” meant typing a few keywords and clicking through blue links? That era is officially over, writes Jean-Yves “JY” Scauri, iProspect’s Sydney head of SEO.
Google’s I/O 2025 conference didn’t just introduce cool new features; it was the funeral for traditional search and the birth announcement of something entirely different.
For business leaders, this transformation demands not just attention, but strategic adaptation as the very foundation of digital discovery evolves.
What happens after you hit enter?
For 25 years, that little search box has been our gateway to the internet. While it isn’t going away overnight, what happens after you type has fundamentally changed.
With AI Mode rolling out across the US (no signup needed), Search still feels familiar on the surface, but now works like an AI-powered research assistant underneath.
As Elizabeth Reid, VP and head of Google Search, said at I/O 2025: “AI Mode is not just this AI-powered experience end to end, but it is also a glimpse of what’s to come in Search overall.”
The biggest technical leap? Query fan-out. Ask a question and Google breaks it into narrower sub-queries and runs multiple searches simultaneously on your behalf, diving deeper into the web than traditional search ever could.
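To make the mechanics concrete, here is a minimal, purely illustrative sketch of the fan-out pattern in Python. The sub-queries and the search_web() stub are invented for the example; this is not Google’s implementation, just the shape of the idea: split one question into several, run them in parallel, then pool the results.

```python
import asyncio

# Purely illustrative sketch of query fan-out.
# search_web() and the sub-query list are invented stand-ins, not Google's internals.

async def search_web(sub_query: str) -> list[str]:
    """Placeholder for a single web search returning result snippets."""
    await asyncio.sleep(0.1)  # simulate network latency
    return [f"result for '{sub_query}'"]

async def fan_out(query: str) -> list[str]:
    # Break the question into narrower sub-queries
    # (in practice, a language model decides this split).
    sub_queries = [
        f"{query} reviews",
        f"{query} pricing",
        f"{query} alternatives",
    ]
    # Issue all searches at once instead of one after another.
    batches = await asyncio.gather(*(search_web(q) for q in sub_queries))
    # Pool the per-sub-query results for the answer model to synthesise.
    return [snippet for batch in batches for snippet in batch]

if __name__ == "__main__":
    print(asyncio.run(fan_out("project management software")))
```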
That shift has real implications.
It’s changing not just how we search, but what we expect search to do. We’re moving from finding answers to having them delivered, ready to go.
From finding to solving
As query fan-out technology matures, Google is taking the next logical step: it’s no longer just helping you find information, it’s solving your problems.
Deep Search, another key innovation, takes query fan-out further. It can issue hundreds of searches, reason across multiple pieces of information, and create an expert-level, fully cited report in minutes.
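In the same hedged spirit, Deep Search can be pictured as that fan-out loop run over several rounds, finished with a synthesis step that cites its sources. The plan_queries(), search_web() and write_report() helpers below are hypothetical stand-ins, not Google APIs.

```python
# Illustrative only: Deep Search pictured as repeated fan-out plus a cited write-up.
# All helpers are hypothetical stand-ins for steps a language model would perform.

def plan_queries(question: str, round_num: int, per_round: int) -> list[str]:
    """Decide the next batch of narrower searches (an LLM would do this in practice)."""
    return [f"{question} angle {round_num}.{i}" for i in range(per_round)]

def search_web(sub_query: str) -> list[dict]:
    """Placeholder single search returning snippets tagged with a source URL."""
    return [{"snippet": f"finding for '{sub_query}'",
             "source": f"https://example.com/{abs(hash(sub_query)) % 1000}"}]

def write_report(question: str, sources: list[dict]) -> str:
    """Reason across everything gathered and cite each claim back to its source."""
    lines = [f"Report: {question}", ""]
    for i, s in enumerate(sources[:5], start=1):  # keep the demo output short
        lines.append(f"{i}. {s['snippet']} [{s['source']}]")
    return "\n".join(lines)

def deep_search(question: str, rounds: int = 3, per_round: int = 50) -> str:
    sources = []
    for r in range(rounds):
        # Each round fans out into many narrower searches, informed by earlier rounds.
        for q in plan_queries(question, r, per_round):
            sources.extend(search_web(q))
    return write_report(question, sources)

print(deep_search("best CRM for a 20-person agency"))
```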
What used to take hours now happens in the time it takes to drink a coffee.
Former Google CEO Eric Schmidt touched on this in his TED Talk last week: “You have an agent to do this, an agent to do this, an agent to do this. And you concatenate them together, and they speak language among each other.”
AI Mode pushes things further by borrowing your own context. Give it permission and it will draw on your Gmail bookings, past searches, and flight details. Ask “Things to do in Melbourne this weekend” and it already knows you land on Friday night, love a good live-music pub, and fancy a fried chicken burger near your hotel.
The agent isn’t just solving a generic query; it’s solving your weekend.
The change Schmidt describes marks a complete reinvention: our search box now anticipates and solves problems, not just returns links.
This creates both opportunities and challenges for businesses: how do you ensure your brand appears when AI is making the selection?
Search beyond the box
While the search box continues to evolve, the most exciting announcement at I/O 2025 was about search breaking free from the box entirely.
Project Astra is Google’s attempt at a universal AI engine that can “see” through your camera, “hear” your voice and pull real-time information on the fly. First previewed last year, it now sits inside Gemini and Search, letting you aim your phone at anything, ask a question and get an instant answer. No typing needed.
Google DeepMind CEO Demis Hassabis called these updates “critical steps” toward building a “universal AI assistant” that can better understand the user and take actions on their behalf. “This is our ultimate goal for the Gemini app: an AI that’s personal, proactive, and powerful,” Hassabis said.
Shopping is also getting a major upgrade. Instead of typing in a product, you can point your camera at it. Gemini can identify the item, compare prices, surface reviews and show relevant ads instantly.
With virtual try-ons, spatial search and AI agents that can complete the purchase, the line between discovery and checkout is starting to disappear.
The world is visual, spatial, and contextual – and now, so is search. When your camera becomes your search box, and any object becomes a query, the very concept of “searching” transforms into something completely new.
The road ahead
Google I/O 2025 wasn’t just a product showcase; it was a preview of a fundamentally different information landscape. The future of search doesn’t feel like searching at all. It’s an ambient, multimodal, personalised AI layer that reimagines our relationship with information.
Looking further ahead, these developments may be setting the stage for even more profound transformations. Hassabis has consistently predicted artificial general intelligence (AGI) arriving within 5-10 years. If that plays out, search won’t just evolve, it could disappear entirely as AI begins to anticipate, reason and act without a query ever being typed.
The potential evolution of search technology raises important questions about the boundaries between humans and machines that deserve serious consideration. For consumers, businesses, and society, the implications are profound and not yet fully understood.
One thing is certain: the blue link era is ending, and what replaces it will reshape not just how we find information, but how we experience the world itself.
Google I/O made it clear this transformation is already underway.
The question is whether we’re ready for a world where, as Elizabeth Reid put it, “searching becomes effortless” or perhaps, stops being searching at all.