At its I/O 2025 developer conference, Google announced an expansion of its AI-powered “Live” mode—bringing real-time camera-based search to the Gemini app on iOS and integrating it into Google Search’s new AI Mode.

The feature, previously available on Android, allows users to interact with the world around them by pointing their camera and asking questions.

Originally introduced as part of Project Astra, Live mode enables Gemini to “see” through your camera and assist with tasks like identifying ingredients or recommending recipes based on what’s in your fridge. Users can tap a “Live” icon in Search or the Gemini app to start sharing their camera feed and get instant AI-powered insights about what they’re seeing.

The capability, branded Search Live in AI Mode, will launch in beta later this summer through Google’s Labs program. It complements other upcoming AI Mode features such as Deep Search, a research-oriented tool, and AI agents that can take actions on your behalf online.
