Meta Launches New AI Chip to Reduce Dependence on NVIDIA

On Wednesday (10th), Meta Platforms (META-US), the parent company of Facebook, unveiled its newly developed chip, the "Meta Training and Inference Accelerator" (MTIA), designed to power artificial intelligence (AI) services. The chip will not only drive content ranking and recommendation on Facebook and Instagram but also reduce the company's reliance on NVIDIA (NVDA-US) and other external semiconductor suppliers. Meta announced its first-generation MTIA product last year.

Notably, the latest MTIA iteration will use TSMC's (2330-TW) 5-nanometer process technology and delivers three times the performance of its predecessor. The chip has already been deployed in Meta's data centers to serve AI applications, and projects are underway to expand MTIA's scope, including support for generative AI workloads.

Meta's pivot toward AI services reflects the growing demand for computational power. Last year, the social media giant launched its own AI model to compete with OpenAI's ChatGPT, adding new generative AI features to its suite of social apps, such as custom stickers and chatbot personas with celebrity likenesses.

In October of last year, Meta announced plans to invest up to $35 billion in AI-supporting infrastructure, including data centers and hardware. CEO Mark Zuckerberg stated at the time, "By 2024, AI will be our biggest area of investment."

A significant portion of this expenditure is likely still directed toward NVIDIA, which produces the H100 GPUs that power AI models. Earlier this year, Zuckerberg said the company would purchase 350,000 H100 chips, each costing tens of thousands of dollars.

However, more tech giants are starting to develop chips in-house. Meta joins competitors such as Amazon's (AMZN-US) AWS, Microsoft (MSFT-US), and Alphabet's (GOOGL-US) Google in an attempt to break free of this costly dependence. Yet this is no quick fix: so far, these efforts have not dented the industry's insatiable demand for NVIDIA's AI accelerators.

The AI boom has made NVIDIA the world's third most valuable tech company, trailing only Microsoft and Apple (AAPL-US). The company's sales to data center operators totaled $47.5 billion in fiscal year 2024, up from just $15 billion the previous year, and analysts predict this figure will more than double by fiscal year 2025.