OpenAI vows to provide the US government early access to its next AI model


OpenAI will give the US AI Safety Institute early access to its next model as part of its safety efforts, Sam Altman announced in a tweet. The company says it is working with the institute's consortium to “advance the science of AI assessments.” The National Institute of Standards and Technology (NIST) formally established the AI Safety Institute earlier this year, though Vice President Kamala Harris announced it at the 2023 UK AI Safety Summit. According to NIST’s description, the consortium’s goal is to “develop science-based and empirically supported guidelines and standards for AI measurement and policy, laying the foundation for AI safety around the world.”

The company, along with DeepMind, similarly agreed to share its AI models with the UK government last year. As TechCrunch notes, there are growing concerns that OpenAI is making safety a lower priority as it seeks to develop more powerful AI models. There was speculation that the board decided to fire Sam Altman from the company (he was quickly reinstated) due to safety and security concerns, but the company told employees in an internal memo at the time that the reason was a “miscommunication.”

In May of this year, OpenAI confirmed that it had disbanded its Superalignment team, which was created to keep humanity safe as the company advances its work on generative AI. Before that, OpenAI co-founder and chief scientist Ilya Sutskever, one of the team’s leaders, left the company. Jan Leike, the team’s other leader, also resigned. In a series of tweets, he said he had long disagreed with OpenAI’s leadership about the company’s core priorities and that “safety culture and processes have taken a backseat to shiny products.” OpenAI created a new safety group in late May, but it is led by board members, including Altman, raising concerns about self-policing.




