This technology is one of the most powerful examples of how artificial intelligence can improve people's lives in a seamless and practical way.
This week, I met an old friend who told me about his summer vacation. He and his girlfriend went to Arizona to visit relatives. His niece dragged him to see Lilo & Stitch. He's working hard at a startup. He said all of this in Spanish, a language I've never studied, yet I understood every single word.
I could understand him because I was wearing Apple's new AirPods Pro 3, which go on sale Friday. The $250 earbuds use artificial intelligence for real-time translation, their most significant new feature.
(The earbuds' noise cancellation has been slightly improved, but otherwise they are not much different from the previous generation.) When my friend spoke, Apple's virtual assistant, Siri, acted as an interpreter, instantly translating his Spanish into English and speaking it into my ears in a robotic voice.
Afterward, I checked the transcript of the conversation on my iPhone to confirm the accuracy of the translation. Apart from a few pronoun errors (Siri referred to my friend's girlfriend as "he" instead of "she"), the translation was excellent.
I was deeply impressed. This is the most powerful example I have seen of AI technology benefiting ordinary people in a seamless and practical way. Children from immigrant families will find it easier to communicate with relatives who are more comfortable speaking their native language. Travelers abroad will be better able to understand taxi drivers, hotel staff and airline employees.
It also comes in handy in my daily life, such as when contractors or pest-control workers who don't speak English need to explain to me what they've found in my basement.
Frankly, I was surprised too. Apple's foray into generative artificial intelligence, the technology driving tools like OpenAI's ChatGPT and Google's Gemini, has been fraught with difficulty. The company even failed to deliver some of the AI features it promised for last year's iPhone 16 because the technology underperformed. And Apple's existing AI tools, such as photo editing and article summarization, have been disappointing compared with similar tools from Google.
The AirPods' powerful translation capability demonstrates that, despite early setbacks, Apple remains a contender in the AI race. Digital translators are not new, but by building one into AirPods, a product that sits right in the ear, Apple could greatly expand how often people use the technology.
For over a decade, consumers have awkwardly used apps like Google Translate and Microsoft Translator on their phones. These apps require pointing the phone's microphone at the person speaking the foreign language and then waiting for the translation to appear on the screen or play through the phone's small speaker. The translations are often inaccurate.
In contrast, AirPods users can activate the digital translator with a gesture. About a second after someone speaks, the translation plays through the earbuds in the user's preferred language.
Why has the translation improved?