Google unveiled a new artificial intelligence (AI) feature called ‘Expressive Captions’ on Thursday, rolling it out as an upgrade to Live Captions on Android. With it, users will see live captions for videos played anywhere on the device in a new format that better conveys the context behind sounds; excitement, shouting, and loudness, for example, are indicated with text in all caps. Currently, Expressive Captions is available in English on Android 14 and Android 15 devices in the US.

Google’s ‘Expressive Captions’ Feature Relies on AI

Sharing details of the new AI feature being added to Android’s Live Captions, the search giant said that while captions were first popularised in the 1970s as an accessibility tool for the deaf and hard-of-hearing community, their presentation has not changed in the last 50 years.

Many people today use captions while streaming content in loud public spaces, to better understand what is being said, or while consuming content in a foreign language. Noting the popularity of captions among Android users, Google said it is now using AI to expand the information that captions convey.

With Expressive Captions, the live subtitles will be able to communicate tone, volume, and environmental cues, as well as human noises. “These small things make a huge difference in conveying what goes beyond words, especially for live and social content that doesn’t have preloaded or high-quality captions,” Google said.

One way Expressive Captions changes captions is by rendering text in all capital letters to indicate the intensity of speech, whether excitement, loudness, or anger. The captions will also identify sounds such as sighing, grunting, and gasping, helping users better understand the nuances of speech. Further, the feature will label ambient sounds in the foreground and background, such as applause and cheers.
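As a rough illustration of the behaviour Google describes, and not the actual Expressive Captions implementation or any published Android API, a hypothetical caption renderer might uppercase speech when an assumed intensity score crosses a threshold and append bracketed labels for detected non-speech sounds:

```kotlin
// Hypothetical sketch only: the data class, fields, and threshold below are
// illustrative assumptions, not Google's implementation or an Android API.

data class CaptionSegment(
    val text: String,                            // transcribed speech for this segment
    val intensity: Float,                        // assumed score: 0.0 (quiet) to 1.0 (shouting)
    val soundLabels: List<String> = emptyList()  // detected sounds, e.g. "gasps", "applause"
)

fun renderCaption(segment: CaptionSegment, intensityThreshold: Float = 0.8f): String {
    // Intense speech (excitement, shouting, anger) is rendered in all caps.
    val speech = if (segment.intensity >= intensityThreshold)
        segment.text.uppercase()
    else
        segment.text

    // Non-speech and ambient sounds appear as bracketed labels alongside the text.
    val sounds = segment.soundLabels.joinToString(" ") { "[$it]" }

    return listOf(speech, sounds).filter { it.isNotBlank() }.joinToString(" ")
}

fun main() {
    val segment = CaptionSegment("goal", intensity = 0.95f, soundLabels = listOf("crowd cheering"))
    println(renderCaption(segment))  // Prints: GOAL [crowd cheering]
}
```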

Google says Expressive Captions is part of Live Captions; because the feature is built into the operating system, it works across the Android device no matter which app or interface the user is on. As a result, users can see real-time AI captions while watching live streams, social media posts, and memories in Google Photos, as well as videos shared on messaging platforms.

Notably, the AI processing for Expressive Captions is done on-device, meaning users will see the captions even when the device is not connected to the Internet or is in airplane mode.


