Google on Tuesday unleashed another wave of artificial intelligence technology to accelerate a year-long makeover of its search engine that is changing the way people get information and curtailing the flow of internet traffic to websites.
The next phase outlined at Google's annual developers conference includes releasing a new "AI mode" option in the United States. The feature makes interacting with Google's search engine more like having a conversation with an expert capable of answering questions on just about any topic imaginable.
AI mode is being offered to all comers in the U.S. just two and a half months after the company began testing it with a limited Labs division audience.
Google is also feeding its latest AI model, Gemini 2.5, into its search algorithms and will soon begin testing other AI features, such as the ability to automatically buy concert tickets and conduct searches through live video feeds.
In another example of Google's all-in approach to AI, the company revealed plans to leverage the technology to re-enter the smart glasses market with a new pair of Android XR-powered spectacles. The preview of the forthcoming device, which includes a hands-free camera and a voice-powered AI assistant, comes 13 years after the debut of "Google Glass," a product the company scrapped after a public backlash over privacy concerns.
Google didn't say when its Android XR glasses will be available or how much they will cost, but disclosed they will be designed in partnership with Gentle Monster and Warby Parker. The glasses will compete against a similar product already on the market from Facebook parent Meta Platforms and Ray-Ban.
AI's big role in Google search
The expansion builds upon a transformation that Google began a year ago with the introduction of conversational summaries called "AI overviews," which have been increasingly appearing at the top of its results page and eclipsing its traditional rankings of web links.