AI

OpenAI created a team to control ‘superintelligent’ AI, then let it wither, source says

OpenAI’s Superalignment team, responsible for developing ways to govern and steer “superintelligent” AI systems, was promised 20% of the company’s compute resources, according to a person from that team. But requests for a fraction of that compute were often denied, blocking the team from doing its work. That issue, among others, pushed several team members …


EU warns Microsoft it could be fined billions over missing GenAI risk info

The European Union has warned Microsoft that it could be fined up to 1% of its global annual turnover under the bloc’s online governance regime, the Digital Services Act (DSA), after the company failed to respond to a legally binding request for information (RFI) focused on its generative AI tools. Back in March, the …


Sony Music warns tech companies over ‘unauthorized’ use of its content to train AI

Sony Music Group has sent letters to more than 700 tech companies and music streaming services to warn them not to use its music to train AI without explicit permission. The letter, which was obtained by TechCrunch, says Sony Music has “reason to believe” that the recipients of the letter “may already have made …


Google’s call-scanning AI could dial up censorship by default, privacy experts warn

A feature Google demoed at its I/O confab yesterday, which uses its generative AI technology to scan voice calls in real time for conversational patterns associated with financial scams, has sent a collective shiver down the spines of privacy and security experts, who warn that the feature represents the thin end of the wedge. They warn …
