Llama

Zuckerberg says Meta will need 10x more computing power to train Llama 4 than Llama 3

Meta, developer of Llama, one of the largest open-source foundation large language model families, believes it will need significantly more computing power to train its models in the future. Mark Zuckerberg said on Meta’s second-quarter earnings call on Tuesday that to train Llama 4, the company will need 10x more compute than was needed to train […]


Mark Zuckerberg imagines content creators making AI clones of themselves

Content creators are busy people. Most spend more than 20 hours a week creating new content for their respective corners of the web, which doesn’t leave much time for audience engagement. But Mark Zuckerberg, Meta’s CEO, thinks AI could solve this problem. In an interview with internet personality Rowan Cheung, Zuckerberg laid out his […]


Meta releases its biggest ‘open’ AI model yet

Meta’s latest open-source AI model is its biggest yet. Today, Meta said it is releasing Llama 3.1 405B, a model containing 405 billion parameters. Parameters roughly correspond to a model’s problem-solving ability, and models with more parameters generally perform better than those with fewer. At 405 billion parameters, Llama 3.1 405B isn’t the absolute […]
