A.I. Brings the Robot Wingman to Aerial Combat
What's being done, and is it enough, soon enough?
“It’s a very strange feeling,” Maj. Ross Elder said. “I’m flying off the wing of something that’s making its own decisions. And it’s not a human brain.” Credit...Edmund D. Fountain for The New York Times
Back in 1947, Chuck Yeager, then a young test pilot from Myra, W. Va., became the first human to fly faster than the speed of sound. Seventy-six years later, another test pilot from West Virginia has become one of the first Air Force pilots to fly alongside an autonomous, A.I.-empowered combat drone.

Tall and lanky, with a slight Appalachian accent, Major Elder last month flew his F-15E Strike Eagle within 1,000 feet of the experimental XQ-58A Valkyrie — watching closely, like a parent running alongside a child learning to ride a bike, as the drone flew on its own, reaching certain assigned speeds and altitudes.

The basic functional tests of the drone were just the lead-up to the real show, in which the Valkyrie gets beyond using advanced autopilot tools and begins testing the war-fighting capabilities of its artificial intelligence. In a test slated for later this year, the combat drone will be asked to chase and then kill a simulated enemy target over the Gulf of Mexico, coming up with its own strategy for the mission.

During the current phase, the goal is to test the Valkyrie’s flight capacity and the A.I. software, so the aircraft is not carrying any weapons. The planned dogfight will be with a “constructed” enemy, although the A.I. agent onboard the Valkyrie will believe it is real.

“It wants to kill and survive,” Major Elder said of the training the drone has been given.
An unusual team of Air Force officers and civilians has been assembled at Eglin, which is one of the largest Air Force bases in the world. They include Capt. Rachel Price of Glendale, Ariz., who is wrapping up a Ph.D. at the Massachusetts Institute of Technology on computer deep learning, and Maj. Trent McMullen of Marietta, Ga., who has a master’s degree in machine learning from Stanford University.

Part of the team’s job is to watch for any discrepancy between simulations run by computer before the flight and the actions by the drone when it is actually in the air — a “sim to real” problem, they call it — or, even more worrisome, any sign of “emergent behavior,” in which the robot drone acts in a potentially harmful way.

During test flights, Major Elder or the team manager in the Eglin Air Force Base control tower can power down the A.I. platform while keeping the basic autopilot on the Valkyrie running. So can Capt. Abraham Eaton of Gorham, Maine, who serves as a flight test engineer on the project and is charged with helping evaluate the drone’s performance.

“How do you grade an artificial intelligence agent?” he asked rhetorically. “Do you grade it on a human scale? Probably not, right?”

In early tests, the autonomous drones have already shown that they will act in unusual ways, with the Valkyrie in one case going into a series of rolls. At first, Major Elder thought something was off, but it turned out that the software had determined that its infrared sensors could get a clearer picture if it did continuous flips. The maneuver would have been like a stomach-turning roller coaster ride for a human pilot, but the team later concluded the drone had achieved a better outcome for the mission.

Air Force pilots have experience with learning to trust computer automation — like the collision avoidance systems that take over if a fighter jet is headed into the ground or set to collide with another aircraft, two of the leading causes of death among pilots.
The pilots were initially reluctant to go into the air with the system engaged, since it allowed computers to take control of the planes, several pilots said in interviews. As evidence grew that the system saved lives, it was broadly embraced. But learning to trust robot combat drones will be an even bigger hurdle, senior Air Force officials acknowledged.

Air Force officials used the word “trust” dozens of times in a series of interviews about the challenges they face in building acceptance among pilots. They have already started flying the prototype robot drones with test pilots nearby, so they can get this process started.

Real adversaries will likely try to fool the artificial intelligence, for example by creating a virtual camouflage for enemy planes or targets to make the robot believe it is seeing something else.

The initial version of the A.I. software is more “deterministic,” meaning it largely follows scripts it has been trained on, based on computer simulations the Air Force has run millions of times as it builds the system. Eventually, the A.I. software will have to be able to perceive the world around it — and learn to recognize these kinds of tricks and overcome them, skills that will require massive data collection to train the algorithms. The software will also have to be heavily protected against hacking by an enemy.

The hardest part of this task, Major Elder and other pilots said, is the vital trust building that is such a central element of the bond between a pilot and wingman: their lives depend on each other, and on how each of them reacts. It is a concern back at the Pentagon too.

“I need to know that those C.C.A.s are going to do what I expect them to do, because if they don’t, it could end badly for me,” General White said, referring to collaborative combat aircraft, the military’s term for the robot wingmen.

The Air Force has also begun a second test program, called Project Venom, which will put pilots in six F-16 fighter jets equipped with artificial intelligence software that will handle key mission decisions.
The goal, Pentagon officials said, is an Air Force that is more unpredictable and lethal, creating greater deterrence against any moves by China, and a less deadly fight, at least for the United States Air Force.

Officials estimate that it could take five to 10 years to develop a functioning A.I.-based system for air combat. Air Force commanders are pushing to accelerate the effort — but recognize that speed cannot be the only objective.

“We’re not going to be there right away, but we’re going to get there,” General Jobe said. “It’s advanced and getting better every day as you continue to train these algorithms.”
Eric Lipton is a Washington-based investigative reporter. A three-time winner of the Pulitzer Prize, he previously worked at The Washington Post and The Hartford Courant. More about Eric Lipton