What do AI models of war look like in reality?


Anthropic may have concerns about giving the US military unfettered access to its models, but some startups are building advanced AI specifically for military applications.

One of them is Smack. The company, which announced a $32 million funding round this week, is developing models that it says will soon surpass Claude’s capabilities when it comes to planning and executing military operations. Unlike Anthropic, the startup seems less concerned with banning certain types of military use.

“When you serve in the military, you swear that you will serve honorably, lawfully, and according to the rules of war,” says CEO Andy Markov. “For me, the people who deploy technology and make sure it is used ethically should wear the uniform.”

Markov is not your average AI executive. He is a former commander of the US Marine Corps Special Operations Command and has helped conduct high-risk Special Forces operations in Iraq and Afghanistan. He co-founded Smack with Clint Alanis, another former Marine, and Dan Gould, a computer scientist who previously served as vice president of technology at Tinder.

Smack’s models learn to identify optimal mission plans through trial and error, similar to how DeepMind trained its AlphaGo program. In Smack’s case, training involves running the model through different wargaming scenarios and hiring expert analysts to provide a signal that tells the model whether a chosen strategy will pay off. Markov says the startup may not have the budget of a traditional AI lab, but it is spending millions to train its first AI models.
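The trial-and-error approach described above is, in essence, reinforcement learning: the model tries a plan, receives a reward signal from expert evaluators, and updates its estimates of which plans work. Here is a minimal sketch of that loop. The scenario names, actions, and the `analyst_feedback` reward function are all hypothetical stand-ins, not Smack’s actual training setup.

```python
import random

# Toy scenarios and candidate plans (hypothetical, for illustration only).
SCENARIOS = ["river_crossing", "urban_raid", "air_defense"]
ACTIONS = ["flank_left", "flank_right", "frontal", "hold"]

def analyst_feedback(scenario: str, action: str) -> float:
    """Stand-in for the expert signal: 1.0 if the plan 'pays off', else 0.0."""
    preferred = {"river_crossing": "flank_left",
                 "urban_raid": "hold",
                 "air_defense": "flank_right"}
    return 1.0 if action == preferred[scenario] else 0.0

def train(episodes: int = 5000, epsilon: float = 0.1, alpha: float = 0.5):
    # Q-table: estimated value of each action in each scenario.
    q = {(s, a): 0.0 for s in SCENARIOS for a in ACTIONS}
    rng = random.Random(0)
    for _ in range(episodes):
        s = rng.choice(SCENARIOS)
        # Epsilon-greedy: mostly exploit the best-known plan, sometimes explore.
        if rng.random() < epsilon:
            a = rng.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda x: q[(s, x)])
        r = analyst_feedback(s, a)
        q[(s, a)] += alpha * (r - q[(s, a)])  # move estimate toward the reward
    return q

q = train()
best_plan = {s: max(ACTIONS, key=lambda a: q[(s, a)]) for s in SCENARIOS}
print(best_plan)
```

After enough episodes, the learned policy simply selects the plan the analysts rewarded for each scenario. A production system would replace the lookup table with a neural network and the toy reward with human evaluations, but the feedback loop is the same.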

Battle lines

The military use of artificial intelligence has become a hot topic in Silicon Valley after Defense Department officials clashed with Anthropic executives over the terms of a contract worth approximately $200 million.

One of the issues that led to the collapse, and prompted Defense Secretary Pete Hegseth’s department to label Anthropic a supply chain risk, was Anthropic’s desire to limit the use of its models in autonomous weapons.

Markov says this hype obscures the fact that today’s large language models are not optimized for military use. General-purpose models such as Claude, he says, are good at summarizing reports, but they are not trained on military data and lack a human-level understanding of the physical world, making them unsuitable for controlling physical devices. “I can tell you they are absolutely incapable of identifying a target,” Markov says.

“Nobody to my knowledge at the War Department is talking about fully automating the kill chain,” he claims, referring to the steps involved in making decisions about the use of lethal force.

Task scope

The United States and other militaries already use autonomous weapons in certain situations, including missile defense systems that need to respond at breakneck speeds.

“The United States and more than 30 other countries are deploying weapons systems with varying degrees of autonomy, including some that I would define as fully autonomous,” says Rebecca Crootof, an expert on the legal issues surrounding autonomous weapons at the University of Richmond School of Law.

In the future, specialized models like the ones Smack is working on could be used for mission planning purposes as well, according to Markov. The company’s models aim to help leaders automate much of the hard work involved in drawing up mission plans. Markov says military mission planning is still typically done manually using whiteboards and notebooks.

If the United States goes to war with a “near-peer” like Russia or China, Markov says, automated decision-making could provide the United States with much-needed “decision dominance.”

But it is still an open question whether AI can be used reliably in such circumstances. One recent experiment, conducted by a researcher at King’s College London, alarmingly showed that large language models tended to escalate toward nuclear conflict in war games.
