Apple is working on language-generating AI

These days, almost every conversation in tech seems to center on AI and chatbots. OpenAI, backed by Microsoft, released a new language model called GPT-4. Google said it is integrating AI into Workspace tools like Gmail and Docs. And Microsoft's Bing has drawn attention with its chatbot-enabled search.

People (myself included) have long complained that Siri doesn't understand queries. Siri, like other assistants such as Alexa and Google Assistant, struggles with the different accents and pronunciations of people in different parts of the world, even when they speak the same language.

ChatGPT's newfound fame and text-based interface make it easy for people to interact with various AI models. But for now, the only way to chat with Siri, Apple's artificial intelligence assistant, in text form is to enable a setting buried in Accessibility settings.

Image Credit: Wang Gang/Getty Images

In an interview with the NYT, John Burke, a former Apple engineer who worked on Siri, said Apple's assistant has been slow to evolve because of "clunky code" that makes updating even basic functionality difficult. He also said that Siri relies on a huge database of words and phrases, so whenever engineers needed to add features, the entire database had to be rebuilt — a process that reportedly took up to six weeks.

The NYT report did not clarify whether Apple is building its own language model or whether it wants to adopt an existing model. But like Google and Microsoft, Apple doesn’t want to limit itself to offering chatbots powered by Siri.

Apple has generally kept quiet about its AI efforts. But in January, the company launched a program offering authors an AI-powered narration service that converts books into audiobooks — a sign that the iPhone maker is already thinking about generative AI use cases.
