OpenAI banned military use. The Pentagon tested its models through Microsoft anyway


Sam Altman, CEO of OpenAI, spent this week defending his company's decision to sign a deal with the US Army. OpenAI employees criticized the move, which came after Anthropic's nearly $200 million contract with the Pentagon collapsed, and asked Altman to reveal more information about the agreement. On social media, Altman admitted the deal looked “dirty.”

Although this incident has become a major news story, it may be just the latest and most public example of OpenAI maintaining vague policies about how the US military accesses its AI.

In 2023, OpenAI’s usage policy explicitly prohibited the military from accessing its AI models. But some OpenAI employees discovered that the Pentagon had already begun experimenting with Azure OpenAI, Microsoft’s service offering OpenAI’s models, two sources familiar with the matter said. At the time, Microsoft had been contracting with the Department of Defense for decades. It was also the largest investor in OpenAI, and had a broad license to commercialize the startup’s technology.

That same year, OpenAI employees saw Pentagon officials walking through the company’s offices in San Francisco, the sources said. They spoke on condition of anonymity because they were not authorized to comment on the company’s private affairs.

Some OpenAI employees were wary about being associated with the Pentagon, while others were simply confused about what OpenAI’s usage policies meant. Did the policy apply to Microsoft? Sources told WIRED that this was not clear to most employees at the time, but OpenAI and Microsoft spokespeople say that Azure OpenAI products are not and have not been subject to OpenAI’s policies.

“Microsoft has a product called Azure OpenAI Service that became available to the US government in 2023 and is subject to the Microsoft Terms of Service,” Microsoft spokesperson Frank Shaw said in a statement to WIRED. Microsoft declined to comment specifically on when it made Azure OpenAI available to the Pentagon, but noted that the service had not been approved for “Top Secret” government workloads until 2025.

“AI already plays an important role in national security, and we believe it is important to have a seat at the table to help ensure it is deployed safely and responsibly,” Liz Bourgeois, an OpenAI spokeswoman, said in a statement. “We have been transparent with our employees as we approach this work, providing regular updates and dedicated channels where teams can ask questions and interact directly with our national security team.”

The Department of Defense did not respond to WIRED’s request for comment.

In January 2024, OpenAI updated its policies to remove the blanket ban on military use. Sources say several OpenAI employees found out about the policy update through an article in The Intercept. Company leaders later addressed the change in an all-hands meeting, explaining how the company would tread carefully in this area moving forward.

In December 2024, OpenAI announced a partnership with Anduril to develop and deploy AI systems for “national security missions.” Before the announcement, OpenAI told employees that the partnership was narrow in scope and would only handle unclassified workloads, the same sources said. This stood in contrast to the deal Anthropic had signed with Palantir, which would see Anthropic’s AI used for classified military work.

Palantir reached out to OpenAI in the fall of 2024 to discuss OpenAI participating in its FedStart program, an OpenAI spokesperson confirmed to WIRED. OpenAI ultimately declined, telling employees it would be too risky, two sources familiar with the matter told WIRED. However, OpenAI is now working with Palantir in other ways.

Around the time the Anduril deal was announced, a few dozen OpenAI employees joined a public Slack channel to discuss their concerns about the company’s military partnerships, sources said and a company spokesperson confirmed. Some argued that the company’s models were too unreliable to be trusted with a user’s credit card information, let alone with helping Americans on the battlefield.
