By Voltaire Staff

Meta investing in AI model to keep you even more hooked on Reels



Meta, the parent company of Facebook, has revealed it is investing in an artificial intelligence model, with plans to develop a system tailored to drive video recommendations across its various platforms.


Tom Alison, the head of Facebook, who oversees development and strategy across News Feed, Stories, Groups, Video, Marketplace, Gaming, News, Dating, Ads and more, said that as part of Meta's technology roadmap extending to 2026, the company aims to create an AI recommendation model capable of supporting both short-form videos akin to TikTok's clips and Reels, and longer, traditional videos.


Speaking at Morgan Stanley's tech conference in San Francisco, Alison explained that Meta has traditionally employed distinct recommendation models for each of its products, including Reels, Groups, and the core Facebook Feed.


As part of its extensive investment in artificial intelligence, the company is directing billions of dollars towards Nvidia's graphics processing units (GPUs), which have emerged as the chips of choice for AI researchers.


These GPUs are essential for training large language models, such as the one powering OpenAI's ChatGPT chatbot, and other generative AI models, aligning with Meta's ambitious AI initiatives, CNBC reported.


Alison said the first phase of Meta's tech plan is to move its current recommendation systems from traditional computer chips to GPUs, a switch aimed at improving the overall performance of its products.


Meta is currently in the third phase of its system re-architecture, focusing on validating the technology and implementing it across various products.


According to Alison, Meta executives were impressed by how efficiently large language models handle vast amounts of data and general-purpose tasks such as chatting.


Meta envisions a massive recommendation model that could be applied across their range of products. By last year, Meta had developed a new model architecture and tested it on Reels.


The new model architecture boosted Reels watch time on Facebook. Alison said the company found the new system "learning from the data much more efficiently than the previous generation."


He added, "We've really focused on kind of investing more in making sure that we can scale these models up with the right kind of hardware.


"Instead of just powering Reels, we’re working on a project to power our entire video ecosystem with this single model, and then can we add our Feed recommendation product to also be served by this model."


Alison said that if the company gets it right, the recommendations will be more engaging and more relevant.


The executive said Meta has amassed a large number of GPUs, which will also support the firm's generative AI projects, such as the development of digital assistants.


Meta also plans to integrate advanced chatting tools into its core Feed. For instance, users could click a button and ask, "Hey Meta AI, tell me more about what I'm seeing with Taylor Swift."

 


