OpenAI created a team to control ‘superintelligent’ AI — then let it wither, source says



OpenAI’s Superalignment team, responsible for developing ways to govern and steer “superintelligent” AI systems, was promised 20% of the company’s compute resources, according to a person from that team. But requests for even a fraction of that compute were often denied, blocking the team from doing its work.

That issue, among others, pushed several team members to resign this week, including co-lead Jan Leike, a former DeepMind researcher who while at OpenAI was involved with the development of ChatGPT, GPT-4 and ChatGPT’s predecessor, InstructGPT.

Leike went public with some reasons for his resignation on Friday morning. “I have been disagreeing with OpenAI leadership about the company’s core priorities for quite some time, until we finally reached a breaking point,” Leike wrote in a series of posts on X. “I believe much more of our bandwidth should be spent getting ready for the next generations of models, on security, monitoring, preparedness, safety, adversarial robustness, (super)alignment, confidentiality, societal impact, and related topics. These problems are quite hard to get right, and I am concerned we aren’t on a trajectory to get there.”

OpenAI did not immediately return a request for comment about the resources promised and allocated to that team.

OpenAI formed the Superalignment team last July, and it was led by Leike and OpenAI co-founder Ilya Sutskever, who also resigned from the company this week. It had the ambitious goal of solving the core technical challenges of controlling superintelligent AI in the next four years. Joined by scientists and engineers from OpenAI’s previous alignment division as well as researchers from other orgs across the company, the team was to contribute research informing the safety of both in-house and non-OpenAI models, and, through initiatives including a research grant program, solicit from and share work with the broader AI industry.

The Superalignment team did manage to publish a body of safety research and funnel millions of dollars in grants to outside researchers. But, as product launches began to take up an increasing amount of OpenAI leadership’s bandwidth, the Superalignment team found itself having to fight for more upfront investments — investments it believed were critical to the company’s stated mission of developing superintelligent AI for the benefit of all humanity.

“Building smarter-than-human machines is an inherently dangerous endeavor,” Leike continued. “But over the past years, safety culture and processes have taken a backseat to shiny products.”

Sutskever’s battle with OpenAI CEO Sam Altman served as a major added distraction.

Sutskever, along with OpenAI’s old board of directors, moved to abruptly fire Altman late last year over concerns that Altman hadn’t been “consistently candid” with the board’s members. Under pressure from OpenAI’s investors, including Microsoft, and many of the company’s own employees, Altman was eventually reinstated, much of the board resigned and Sutskever reportedly never returned to work.

According to the source, Sutskever was instrumental to the Superalignment team — not only contributing research but serving as a bridge to other divisions within OpenAI. He would also serve as an ambassador of sorts, impressing the importance of the team’s work on key OpenAI decision makers.

After Leike’s departure, Altman wrote on X that he agreed there is “a lot more to do,” and that the company is “committed to doing it.” He hinted at a longer explanation, which co-founder Greg Brockman supplied Saturday morning:

Though there is little concrete in Brockman’s response as far as policies or commitments, he said that “we need to have a very tight feedback loop, rigorous testing, careful consideration at every step, world-class security, and harmony of safety and capabilities.”

Following the departures of Leike and Sutskever, John Schulman, another OpenAI co-founder, has moved to head up the type of work the Superalignment team was doing, but there will no longer be a dedicated team — instead, it will be a loosely associated group of researchers embedded in divisions throughout the company. An OpenAI spokesperson described it as “integrating [the team] more deeply.”

The fear is that, as a result, OpenAI’s AI development won’t be as safety-focused as it could’ve been.

