Coby Adcock’s Scout AI raises $100 million to train war models. We visited its boot camp.

At a U.S. military base in central California, four-seat all-terrain vehicles roam the hills. This is a training exercise, but not for the vehicles’ occupants: it is part of an effort to train AI models to operate in conflict zones.

The autonomous military ATV is operated by Scout AI, a startup founded in 2024 by Coby Adcock and Collin Otis that describes itself as a “defense frontier laboratory.” The company said Wednesday it has raised a $100 million Series A round led by Align Ventures and Draper Associates, following a $15 million seed round in January 2025.

Scout invited TechCrunch for an exclusive tour of its training operations at a military base that TechCrunch agreed not to name.

The company is building an AI model called “Fury” to operate and command military assets, initially for logistics support and soon for autonomous weapons. CTO Collin Otis likens this work, which builds on an existing LLM, to training a soldier.

“They start at age 18, sometimes even after college, so you want to start with a base level of intelligence,” Otis told TechCrunch. The question, he said, is: “How do we start with something that already has that investment, and then teach it to be an incredible military AGI rather than just a broadly intelligent AGI?”

Scout has secured military technology development contracts totaling $11 million from DARPA, the Army Applications Laboratory, and other Department of Defense customers. It is one of 20 autonomy companies whose technology the U.S. Army’s 1st Cavalry Division is using in regular training at Fort Hood, Texas; the division expects those products to prove themselves when it next deploys in 2027.

For Scout’s internal testing, the rubber meets the dirt on the base’s hilly terrain. There, the company’s operations team, led by former military personnel, runs the vehicle through simulated missions.


Self-driving cars are appearing in more cities around the world, but they operate in a structured environment with rules. Operating autonomously on unmarked trails or off-road is another challenge entirely. Otis, a former executive at self-driving truck company Kodiak, said he started Scout when he realized the systems he had helped build were not intelligent enough to operate in unpredictable war zones.

An autonomous ground vehicle controlled by Scout AI’s Fury model. Image Credits: Scout AI

A new approach to autonomy

Scout is betting on a newer autonomy technology: the vision-language-action (VLA) model, an LLM-based approach to controlling robots. First introduced by Google DeepMind in 2023, the technique seeded robotics startups such as Physical Intelligence and Figure AI, the humanoid robot company led by Adcock’s brother Brett.

Adcock serves on Figure’s board of directors. He said the experience convinced him of the opportunity to provide broader intelligence to the military’s growing fleet of autonomous vehicles. His brother introduced him to Otis, who was advising Figure, and they began applying the latest advances in AI to military solutions.

“I could hand you a drone controller right now, put a headset on you, and you could learn how to fly a drone in a matter of minutes,” Otis said. “You’re really just learning how to plug your prior knowledge into these two little joysticks. It’s not a huge leap. That’s how we think about VLAs and why they’re such an unlock.”

I got a chance to drive one of Scout’s ATVs around the dusty trails myself, and the terrain was tough: steep hills, loose sand in the turns, disappearing tracks, confusing intersections. I’m not an experienced ATV driver, but I made decent progress on my first attempt (if I do say so myself). That is the kind of general intelligence the company wants in its models. Fury, which started out driving civilian ATVs, has been training on these trails for just six weeks.

I also rode in the self-driving ATV, and I could feel the difference: it accelerated harder than a human driver mindful of passenger comfort would. The operations team pointed out how, like a student driver, the vehicle hugs the right side of wider trails and stays in the center of narrow ones. It also slows down abruptly when confused, pausing to consider its next move. That happened several times during our 6.5 km loop back to base.

VLAs are new enough that no company has yet fielded them in an operational environment, but “the technology is good enough for us to conduct experiments in the field with soldiers to figure out what would be most effective for the U.S. military,” said Stuart Young, a former DARPA program manager who worked on ground vehicle autonomy. And like other autonomy companies, Scout rounds out its full autonomy stack with deterministic systems and other types of AI to complete the agent’s capabilities.

Young left DARPA this month to join Field AI after managing a program called RACER, which seeded this space by asking companies to build high-speed autonomous off-road vehicles, much as the agency’s Grand Challenge once catalyzed self-driving cars. Field AI and Overland AI, two competitors in this space, spun out of the program, with Scout joining later.

According to Scout executives and military engineers, the first application of ground autonomy will be automated resupply. Hauling water or ammunition to a remote observation post, or running convoys of six to 10 autonomous vehicles behind crewed trucks, can free up valuable manpower for more important tasks. Brian Mathwich, an active-duty infantry officer serving as a military fellow at Scout, recalled a recent training exercise in Alaska in which he led a resupply convoy in complete darkness, exactly the kind of mission he hopes autonomous vehicles can take over.

Scout AI’s orchestrator UI. Image Credits: Scout AI

Adding intelligence to the Army motor pool

Scout sees itself primarily as a software company that builds intelligence layers for military machines. The goal is not to build autonomous vehicles itself, but to build the autonomy that runs on top of them.

Adcock expects the startup’s first widely adopted product to be “Ox,” the company’s command-and-control software bundled with a package of computing hardware (GPU, communications, cameras). It is meant to let an individual soldier coordinate multiple drones and autonomous ground vehicles with prompts such as “Go to this waypoint and watch for enemies.”

Operating that software, however, requires training on real vehicles, hence the company’s training ground at the military base. There, drivers run the ATVs in eight-hour shifts while a reinforcement learning system records where the model needs work and uses that data to improve it. The base’s commander has asked for the company’s ATVs to be assigned to security patrols.

One hypothesis Scout is testing is that VLAs can yield fully capable driving agents from a relatively limited dataset combined with training data from simulation. The vehicle may already look comfortable on the trail, for example, but it is not yet fully ready to operate off-road.

Scout is also experimenting with using a vision-language model (VLM), a multimodal LLM variant, to provide intelligence for reconnaissance and weapons drones.

Scout is working on a system in which a group of munition drones flies with a larger “quarterback” platform that carries more computing resources and issues their commands. In one mission, the drones scan a geographic area for hidden enemy tanks and attack them without human intervention. Otis argues that the alternative in this scenario is indirect artillery fire, which is inaccurate compared with drone strikes.

Autonomous weapons are a flashpoint in defense technology politics, but experts note the concept is hardly new: heat-seeking missiles and mines have been in use for decades. Jay Adams, a retired U.S. Army captain who leads Scout’s operations team, told TechCrunch that the questions technical experts ask are about how the weapon is controlled.

He points out that the company’s military drones can be programmed to attack threats only in specific geographic areas, or only with human confirmation. An autonomous weapons platform, he added, won’t fire because it’s scared, the way an 18-year-old soldier might.

VLAs also promise better targeting. Scout says its models are pretrained on military-specific datasets to prepare for what happens if, say, a vehicle runs into an enemy tank during a resupply mission. Lt. Col. Nick Rinaldi, who oversees Scout’s work at the Army Applications Laboratory, says automated targeting is difficult and unlikely to be used outside confined environments in the near term, but the potential of VLAs to reason about threats makes them a promising technology to investigate.

Adams said drones that can identify their own targets are key to future warfare. Russia’s invasion of Ukraine has driven interest in drone warfare, but having a person pilot each individual UAV is not believed to scale well enough for the U.S. to counter the large numbers of low-cost unmanned systems that could threaten U.S. forces.

A mission to counter tech’s reluctance

Image Credits: Scout AI

Like many defense startups, Scout wears its mission on its sleeve, and its executives freely criticize companies reluctant to hand their technology to the government. Google, for example, reportedly pulled out of a Pentagon contest to develop a control system for swarms of autonomous drones, a capability Scout is also developing.

“AI people don’t want to work with the military,” Otis told TechCrunch, referencing Anthropic’s fight with the Department of Defense over terms of service. “None of them are open to running agents on one-way attack drones or running agents on missile systems.”

Nonetheless, Scout is building its agents on top of existing LLMs, though it has not revealed which ones. Otis says it has contracted with “a very well-known hyperscaler” to provide pretrained intelligence for Scout’s base model. He declined to comment on whether Scout uses open models like those offered by Chinese companies; many companies that depend on AI inference rely on such models because they cost less to run than models from frontier labs like Anthropic or OpenAI.

Scout expects to get past this issue by building its own model from scratch in the coming years, and its founders say much of the new capital will go toward training and compute costs. Otis even speculates that Scout could one day beat the existing leaders to AGI, since Scout’s models will be constantly interacting with the real world.

“There is a claim in the AGI community that intelligence comes from reading the internet, but most intelligence comes from interacting with the world,” Otis said.

So does that mean Adcock is competing against his brother’s army of humanoid robots at Figure? No, says Otis. But “we can scale much faster because our customer owns the assets,” he said, referring to the Department of Defense.
