compute

Clarifai’s new reasoning engine makes AI models faster and less expensive

On Thursday, the AI platform Clarifai announced a new reasoning engine that it claims will make running AI models twice as fast and 40% less expensive. Designed to be adaptable to a variety of models and cloud hosts, the system employs a range of optimizations to get more inference power out of the same hardware. […]

Oracle is reportedly looking to raise $15B in corporate bond sale

Oracle is reportedly looking to raise funds just weeks after the company inked a historic AI infrastructure deal with OpenAI. Cloud infrastructure giant Oracle is looking to raise $15 billion through corporate bond sales, according to reporting from Bloomberg, citing sources. The sale could include up to seven parts, one of which could be an uncommon 40-year bond, Bloomberg reported. TechCrunch reached out

Microsoft expects some AI capacity constraints this quarter

An executive cautioned during Microsoft’s earnings call on Wednesday that customers might face AI service disruptions as demand outstrips the company’s ability to bring data centers online. EVP and CFO Amy Hood said on the fiscal 2025 third-quarter call that the company may face AI capacity constraints as early as June. “We

Hugging Face makes it easier for devs to run AI models on third-party clouds

AI dev platform Hugging Face has partnered with third-party cloud vendors including SambaNova to launch Inference Providers, a feature designed to make it easier for devs on Hugging Face to run AI models using the infrastructure of their choice. Other partners involved with the new effort include Fal, Replicate, and Together AI. Hugging Face says
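For readers who want a more concrete picture than the announcement gives, here is a minimal sketch of how Inference Providers is surfaced in the huggingface_hub Python client. It assumes a recent version of the library, a Hugging Face access token, and uses an illustrative provider name, model ID, and prompt rather than anything confirmed by the article.

```python
# Minimal sketch: routing a chat completion through a partner provider via
# Hugging Face's Inference Providers. Assumes `pip install huggingface_hub`;
# the provider, token, model ID, and prompt below are illustrative placeholders.
from huggingface_hub import InferenceClient

client = InferenceClient(
    provider="sambanova",   # e.g. "together", "replicate", "fal-ai" for other partners
    api_key="hf_xxx",       # Hugging Face access token (placeholder)
)

# Send a chat request; the provider runs the model on its own infrastructure.
response = client.chat_completion(
    model="meta-llama/Llama-3.1-8B-Instruct",  # example model ID
    messages=[{"role": "user", "content": "What does Inference Providers do?"}],
    max_tokens=128,
)

print(response.choices[0].message.content)
```

Under this setup, switching to a different partner's infrastructure would, in principle, mean changing only the provider argument, which is the portability the feature is pitching to devs.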

OpenAI CEO Sam Altman says lack of compute capacity is delaying the company’s products

In a Reddit AMA, OpenAI CEO Sam Altman admitted that a lack of compute capacity is one major factor preventing the company from shipping products as often as it’d like. “All of these models have gotten quite complex,” he wrote in response to a question about why OpenAI’s next AI models were taking so long.