Meta is investing nearly $15 billion in Scale AI, a startup that labels data for training AI models, taking a 49% stake in the company while also bringing over its CEO, Alexandr Wang, to help lead Meta's new "superintelligence" lab.
The deal is reminiscent of Meta's previous big, risky bets, such as its $19 billion acquisition of WhatsApp and its $1 billion purchase of Instagram. When those deals closed, many suggested Meta was severely overpaying for the platforms. Today's discourse is no exception: there has been no shortage of investors and founders scratching their heads over Meta's latest alliance this weekend.
Ultimately, WhatsApp and Instagram became integral parts of CEO Mark Zuckerberg's empire. The question is whether the Scale AI deal will work out the same way in Meta's favor, proving Zuckerberg's strategy visionary once again, or whether the company is grasping at straws in a misguided effort to keep up with rivals like OpenAI, Google, and Anthropic.
In this case, Meta isn't betting on an up-and-coming social media app but on the data used to train top AI models. Over the past few years, leading AI labs such as OpenAI have relied on Scale AI to generate and label the data used to train their models. In recent months, Scale AI and its data-annotation competitors have begun hiring highly qualified people, including PhD scientists and senior software engineers, to generate high-quality data for frontier AI labs.
A close relationship with a data provider like Scale could help Meta. According to people familiar with the matter, Meta's leaders have complained about a lack of data innovation holding back the company's leading AI teams.
Earlier this year, Meta's generative AI unit launched Llama 4, a family of AI models designed to rival the capabilities of models from Chinese AI lab DeepSeek. It hasn't helped that Meta has also been fighting a turnover problem: according to data compiled by SignalFire, Meta lost 4.3% of its top talent to other AI labs in 2024.
Meta isn't betting solely on Scale AI to rekindle its AI efforts; it's also betting on Wang, who will lead the aforementioned new superintelligence team. The 28-year-old CEO has proven himself a formidable startup founder. He's ambitious, an excellent salesman, and known to be extremely well connected around Silicon Valley. Over the past few months, Wang has met with world leaders to discuss AI's impact on society.
However, Wang has never led an AI lab of this kind, nor does he have the AI research background of many other AI lab leaders, such as Safe Superintelligence's Ilya Sutskever or Mistral's Arthur Mensch. That's probably why Meta is reportedly trying to hire renowned talent, such as DeepMind's Jack Rae, to round out the new AI research group.
Scale AI's fate after the deal is a bit murky. The role of human-generated data in AI model training is changing: some AI labs have brought data-collection efforts in-house, while others have increased their reliance on synthetic (i.e., AI-generated) data. In April, The Information reported that Scale AI had missed some of its revenue targets.
According to Robert Nishihara, co-founder of Anyscale, several frontier AI labs are looking for new ways to leverage and optimize their data.
“Data is a moving target,” Nishihara told TechCrunch in an interview. “It’s not a finite, catch-up effort; you have to keep innovating.”
The tie-up between Meta and Scale could scare off other AI labs that have traditionally worked with Scale AI. If so, the deal could benefit Scale AI's competitors, such as Turing and Surge AI, and even non-traditional data providers such as the recently launched LM Arena.
Turing CEO Jonathan Siddharth told TechCrunch via email that the company has seen growing interest from customers in light of rumors about Meta's deal with Scale AI.
“I think there are clients who prefer to work with more neutral partners,” he said.
Only time will tell how Meta's investment will feed into its AI efforts, but the company has significant ground to make up. Meanwhile, the competition hasn't slowed down. OpenAI is preparing to release its next flagship model, GPT-5, as well as its first openly available model in years, one that would compete with Meta's current and future Llama releases.