AI Models

Current AI scaling laws are showing diminishing returns, forcing AI labs to change course

AI labs traveling the road to super-intelligent systems are realizing they might have to take a detour. “AI scaling laws,” the methods and expectations that labs have used to increase the capabilities of their models for the last five years, are now showing signs of diminishing returns, according to several AI investors, founders, and CEOs […]


Zuckerberg says Meta will need 10x more computing power to train Llama 4 than Llama 3

Meta, which develops Llama, one of the largest open-source foundation large language models, believes it will need significantly more computing power to train its models in the future. On Meta’s second-quarter earnings call on Tuesday, Mark Zuckerberg said that training Llama 4 will require 10x more compute than was needed to train Llama 3 […]
