Voltaire Staff

Apple in talks with news outlets to train AI model

Apple has been in discussions with prominent news and publishing groups in the past few weeks seeking to use their content for the advancement of generative AI systems.

However, there has been no formal announcement on the development from Apple yet.

In July this year, Bloomberg reported that Apple had begun testing a proprietary AI chatbot, internally dubbed 'Apple GPT', while concurrently developing a large language model framework named 'Ajax'.

The New York Times, in an article on December 22, revealed that Apple is in discussions with Condé Nast, NBC News, and IAC, among other news publishers, to secure licences for their news archives.

It said Apple is discussing multi-year agreements with these firms, each valued at a minimum of $50 million. The deals are intended to enhance Apple's AI capabilities by supplying factual, current information grounded in real-world language usage.

In October, the News/Media Alliance, a trade group representing over 2,200 publishers, accused generative AI tools such as ChatGPT of using copyrighted news material without authorisation to train their chatbots.

Axel Springer and OpenAI recently announced a collaboration under which the publisher will provide news content to ChatGPT users.

The data obtained from news archives is expected to contribute to the training of Apple's AI by providing factual accuracy, diverse writing styles, insights into ongoing events and trends, and opportunities for human-like analysis. Recognising the value of news content from top publishers, AI and tech companies are increasingly seeking to use it to train their large language models.

Bloomberg has also reported on Apple's research into on-device AI technology, including the creation of animated avatars and the efficient running of large language models on iPhones and iPads.

A research paper titled 'LLM in a Flash: Efficient Large Language Model Inference with Limited Memory', published on December 12, could transform the iPhone user experience. If implemented, the technique would let users run sophisticated AI systems directly on their iPhones and iPads.
