Apple is exploring the use of artificial intelligence technology from Anthropic or OpenAI to power a new version of its Siri voice assistant, rather than relying on its own in-house models, Bloomberg News reported on Monday.
The iPhone maker has reportedly begun talks with both companies about integrating their large language models (LLMs) into Siri, even asking them to train versions of their LLMs that could run on Apple's cloud infrastructure for testing purposes, according to Bloomberg's sources.
Apple's exploration of third-party models is reportedly still at a preliminary stage, and no final decision has been made on their adoption, the American outlet notes.
Anthropic, which is backed by Amazon, declined to comment, while Apple and OpenAI did not respond to Reuters' requests for comment.
Last March, Apple announced that AI-related improvements to its Siri voice assistant would be delayed until 2026, without detailing the reasons for the setback.
Faced with these difficulties, the Cupertino company has reshuffled its leadership to relaunch its AI efforts after months of delays. Mike Rockwell took over the Siri project after CEO Tim Cook lost confidence in the ability of John Giannandrea, Apple's AI chief, to deliver on product development, Bloomberg reported in March.
At its annual Worldwide Developers Conference earlier this month, Apple focused on incremental advances aimed at improving users' daily lives, such as live translation of phone calls, rather than the sweeping AI ambitions pursued by its competitors.
Craig Federighi, Apple's software chief, said the foundational AI model the company uses for certain features would be opened to third-party developers. He also said Apple would offer both its own code-completion tools and OpenAI's in its main developer software.