Google wants Search to be more like an AI assistant in 2025

Google Search is in the midst of a “journey” around AI, Google CEO Sundar Pichai said during the company’s earnings call on Tuesday. The start of that journey was AI Overviews, a controversial and monumental shift in how Google delivers information to billions of Search users.

But that was just the beginning.

“As AI continues to expand the universe of queries that people can ask, 2025 is going to be one of the biggest years for search innovation yet,” said Pichai during his opening remarks on the call.

Throughout the call, Pichai laid out the next phase of Google’s plan to pack Search with AI features from the company’s research lab, DeepMind. The Search product is slowly becoming more like an AI assistant that browses the internet for you, looks at web pages, and returns an answer.

It’s a far cry from a simple search system that gives you ten blue links.

Google has been on this path for a few years now, ever since the search giant was caught flat-footed by the release of OpenAI’s ChatGPT in 2022. The shift has massive implications for websites that rely on Google’s traffic and businesses that buy ads on Google Search.

Not everyone is happy about it, but Google is pushing ahead.

When asked about the future of AI and Search, Pichai said that, “You can imagine the future with Project Astra,” a reference to DeepMind’s multimodal AI system, which can process live video from a camera or computer screen and answer user questions about what the AI sees in real time.

Google has big plans for Project Astra in other parts of its business too. The company says it wants the multimodal AI system to one day power a pair of augmented reality smart glasses, for which Google will build the operating system.

Pichai also mentioned Gemini Deep Research – an AI agent that takes several minutes to produce long research reports – as a feature that could fundamentally shift how people use Google Search. Deep Research automates the kind of digging people have traditionally done themselves with Google Search; now, it seems, Google wants to do that research for them.

“You are really dramatically expanding the types of use cases for which Search can work – things which don’t always get answered instantaneously, but can take some time to answer,” said Pichai. “Those are all areas of exploration, and you will see us putting new experiences in front of users through the course of 2025.”

Pichai said further that Google has a “clear sense” of the Search experiences it could create with another one of Google’s AI agents, Project Mariner. That system can operate the front end of websites on behalf of users, so people no longer need to visit those sites themselves.

Google’s CEO also said there’s an “opportunity” around letting users interact more with Google Search and ask follow-up questions. Pichai was light on details there, but it sounds like Google is considering ways to make its Search interface more like a chatbot.

“I think the [Search] product will evolve even more,” said Pichai. “As you make it more easy for people to interact and ask follow-up questions, etc., I think we have an opportunity to drive further growth.”

Today, ChatGPT has matured into one of the internet’s most used products, with hundreds of millions of weekly users. It presents an existential threat to Google Search’s long-term business. To address it, Google is not only building a competing AI chatbot in Gemini, but also injecting AI features directly into Search.

Of course, the first step on Google Search’s AI journey did not go very well. When Google rolled out AI Overviews to all of Google Search, the system served up inaccurate and bizarre hallucinations, including answers that told people to eat rocks and put glue on their pizza. Google admitted at the time that AI Overviews needed some work.

Despite that rocky rollout, it appears Google is just getting started putting AI into Search.

Maxwell Zeff is a senior reporter at TechCrunch specializing in AI and emerging technologies. Previously with Gizmodo, Bloomberg, and MSNBC, Zeff has covered the rise of AI and the Silicon Valley Bank crisis. He is based in San Francisco. When not reporting, he can be found hiking, biking, and exploring the Bay Area’s food scene.
