Google has revealed some intriguing updates to its Gemini AI platform. Many of these enhancements are tailored for devices like the new Samsung Galaxy S25, although some also work on the older Galaxy S24 and Pixel 9 phones. The updates coincide with the launch of the Galaxy S25 range, showcased at the recently held Galaxy Unpacked event.
Stand-out feature: Chaining actions
The most remarkable feature is Gemini’s new capability to chain actions together. This means you can now perform tasks like connecting to Google Maps to find nearby restaurants, then drafting a text in Google Messages to send to those you’d like to invite to lunch, all through Gemini commands.
The chaining ability is rolling out to all devices running Gemini, "depending on extensions." In other words, an app can only be linked into a chain if it has a Gemini extension, and extensions for non-Google apps must be built by a third party. Naturally, all major Google apps already have Gemini extensions, and extensions are also available for the Samsung Reminder, Samsung Calendar, Samsung Notes, and Samsung Clock apps.
Read more | Exploring Google’s new reasoning AI: What is Gemini 2.0 Flash Thinking?
Gemini Live goes multimodal
Google’s Gemini Live, which allows for natural, human-like conversations with the AI, is also receiving significant multimodal upgrades. You will now be able to upload images, files, and YouTube videos during a conversation. For instance, you could ask Gemini Live, “Hey, take a look at this picture of my school project and tell me how I could make this better,” then upload the picture to receive feedback.
However, the Gemini Live multimodal improvements are not universally available: they require a Galaxy S24, Galaxy S25, or Pixel 9.
Project Astra: the next frontier
Lastly, Google has announced that capabilities from Project Astra will arrive in the coming months, initially on Galaxy S25 and Pixel devices. Project Astra is Google’s prototype AI assistant that lets you engage with your surroundings: using your phone’s camera, you can ask questions about what you’re seeing and where you are. For example, you can point your phone at an object and ask what Gemini knows about it, or ask when the next stop is coming on your bus route.
While Project Astra works on mobile phones, it really shines when paired with Google’s prototype hands-free AI glasses, which let you ask Gemini questions about your surroundings without interacting with a screen at all.