The gadgets that were once limited to works of science fiction are inching closer to reality.
At a recent iPhone unveiling event, Apple showed a demo of its new Live Translation feature, built into the AirPods Pro 3. The video showed an English-speaking tourist buying a flower in an unnamed Spanish-speaking country. As the florist speaks in Spanish, the tourist hears a clear, natural-sounding English translation ("Today all the red carnations are 50% off") with near-instantaneous delivery.
The video was marketing material for Apple's latest AirPods Pro 3, and Live Translation is the headline selling point. The earbuds, priced at US$250 and available starting this week, support translations from French, German, Portuguese, and Spanish into English. The feature is also coming to the AirPods 4 and AirPods Pro 2 via a software update. The system allows two-way, real-time translation between users wearing AirPods, with the conversation translated in both directions simultaneously inside each user's headphones.
Live Translation is one of several AI-driven advancements being rolled out by major tech companies, including Google parent Alphabet and Meta, which owns Facebook and Instagram.
The feature requires Apple Intelligence, the company’s new AI suite, which runs on newer iPhone models. Analysts believe this could be a major driver for device upgrades. “If we can actually use the AirPods for live translations, that's a feature that would actually get people to upgrade,” said DA Davidson analyst Gil Luria.
As AI becomes good enough to translate languages as quickly as people speak, live translation is emerging as a key battleground in the technology industry. Apple is not alone: Google and Meta are also investing in similar capabilities, with Google's Pixel Buds and Meta's AI-powered devices testing real-time translation features, often integrated with cloud-based AI models.
In recent years, Google and Meta have released hardware products featuring real-time translation capabilities.
Google's Pixel 10 phone includes a feature called Voice Translate, which can translate phone calls from one language to another while mimicking the speaker's voice inflections. Voice Translate works on real-time phone call conversations in languages such as Spanish, Japanese, and Hindi, and will roll out to users' phones through a software update.
During Google's live demo in August, Voice Translate translated a sentence from entertainer Jimmy Fallon into Spanish, and it actually sounded like the comedian. Apple's feature does not try to imitate the user's voice.
In May, Meta announced that its Ray-Ban Meta glasses will be able to translate what a person is saying in another language through the device's speakers, while the other person in the conversation can read translated responses transcribed on the user's phone. Meta held its annual Connect keynote on Wednesday, where CEO Mark Zuckerberg revealed several new AI-powered smart glasses and updates to its metaverse technology.
In June, OpenAI showcased an intelligent voice assistant mode for ChatGPT with fluid translation built in as one of many features. Apple's Siri is integrated with ChatGPT, but not in voice mode. Working with Apple's former design guru Jony Ive, OpenAI plans to release new hardware products in the coming years.
The rise of live translation could entirely reshape industries. A Microsoft Research study published in August found that translators and interpreters are the jobs most threatened by AI, with 98% of translators' work activities overlapping with what AI can do.