Meta, formerly known as Facebook, has unveiled its custom computer chips designed for AI and video processing. The chips, including the Meta Scalable Video Processor and Meta Training and Inference Accelerator, will power metaverse tasks, VR/AR, and generative AI. Meta emphasizes open-source collaboration and its commitment to advancing AI technology.
Meta, the company formerly known as Facebook, has made its internal silicon projects public for the first time. These custom computer chips were developed to strengthen Meta’s artificial intelligence (AI) and video processing capabilities. Among them is the Meta Scalable Video Processor (MSVP), designed to process and transmit video more efficiently while reducing energy use. Another chip, the Meta Training and Inference Accelerator (MTIA), accelerates AI-specific workloads such as model inference. Together, these chips will play a crucial role in powering metaverse-related tasks, virtual reality, augmented reality, and generative AI.
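For readers less familiar with the training/inference split that the MTIA name refers to, the minimal PyTorch sketch below illustrates the two phases on a toy model. It is purely illustrative and does not reflect how Meta runs these workloads on its own silicon.

```python
import torch
import torch.nn as nn

# A toy model standing in for a production workload; MTIA specifics are not public here.
model = nn.Sequential(nn.Linear(256, 128), nn.ReLU(), nn.Linear(128, 10))

# Training: a forward pass, gradient computation, and a weight update.
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
x, y = torch.randn(32, 256), torch.randint(0, 10, (32,))
loss = nn.functional.cross_entropy(model(x), y)
loss.backward()
optimizer.step()

# Inference: the forward pass only, with gradient tracking disabled.
model.eval()
with torch.no_grad():
    predictions = model(x).argmax(dim=1)
```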
Meta’s investments in AI and related data center hardware have drawn significant attention from investors, particularly as the company embarks on a “year of efficiency” marked by layoffs and cost-cutting. Despite the high cost of designing and building custom chips, Meta believes the improved performance justifies the investment. The company has also been revamping its data center designs to prioritize energy-efficient techniques such as liquid cooling to manage excess heat.
The unveiling of these chips sheds light on Meta’s ongoing internal initiatives. Alexis Bjorlin, Vice President of Infrastructure at Meta, said the disclosure offers a glimpse into the company’s internal developments; because Meta’s business is social networking rather than selling cloud computing services, it has not previously felt the need to publicize its data center chip projects.
Aparna Ramani, Vice President of Engineering at Meta, said the new hardware was designed to work effectively with Meta’s home-grown PyTorch software, which has become one of the most widely used tools among third-party developers building AI applications. The chips will eventually support metaverse-related tasks, virtual reality, augmented reality, and generative AI, which involves using AI algorithms to create compelling text, images, and videos.
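The article does not describe how MTIA is actually exposed to PyTorch, but accelerators generally surface through PyTorch’s device abstraction. The sketch below shows that common pattern, using "cuda" purely as a stand-in device; any MTIA-specific backend name is an assumption not covered here.

```python
import torch
import torch.nn as nn

# Pick whichever accelerator backend is available; "cuda" stands in here for
# any custom device a vendor exposes through PyTorch's device abstraction.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = nn.Linear(512, 64).to(device)     # move the weights onto the accelerator
batch = torch.randn(16, 512, device=device)

with torch.no_grad():
    out = model(batch)                     # the forward pass runs on the chosen device
print(out.shape, out.device)
```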
Furthermore, Meta has developed a generative AI-powered coding assistant for its developers, similar to Microsoft’s GitHub Copilot. The company has also underscored its commitment to open-source collaboration and to advancing AI research, disclosing details about its LLaMA language model, including its largest version, LLaMA 65B, which contains 65 billion parameters and was trained on 1.4 trillion tokens.
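As a quick back-of-the-envelope check on the figures quoted above, the snippet below computes the data-to-parameter ratio for LLaMA 65B; at roughly 21.5 training tokens per parameter it sits slightly above the ~20 tokens per parameter often cited from compute-optimal scaling studies.

```python
# Rough data-to-parameter ratio for the LLaMA 65B figures quoted above.
params = 65e9      # 65 billion parameters
tokens = 1.4e12    # 1.4 trillion training tokens

ratio = tokens / params
print(f"~{ratio:.1f} training tokens per parameter")  # ~21.5
```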
While Meta’s custom chips and language models are paving the way for innovation, the company faced a setback when its LLaMA language model was leaked to the public after being shared with researchers. Nevertheless, Meta remains dedicated to open science and cross-collaboration as part of its overarching philosophy.