{"id":84834,"date":"2023-08-28T13:22:27","date_gmt":"2023-08-28T13:22:27","guid":{"rendered":"https:\/\/celebritytidings.com\/?p=84834"},"modified":"2023-08-28T13:22:27","modified_gmt":"2023-08-28T13:22:27","slug":"ai-brings-the-robot-wingman-to-aerial-combat-the-denver-post","status":"publish","type":"post","link":"https:\/\/celebritytidings.com\/world-news\/ai-brings-the-robot-wingman-to-aerial-combat-the-denver-post\/","title":{"rendered":"AI brings the robot wingman to aerial combat – The Denver Post"},"content":{"rendered":"
By Eric Lipton, The New York Times

EGLIN AIR FORCE BASE, Fla. — It is powered into flight by a rocket engine. It can fly a distance equal to the width of China. It has a stealthy design and is capable of carrying missiles that can hit enemy targets far beyond its visual range.

But what really distinguishes the Air Force’s pilotless XQ-58A Valkyrie experimental aircraft is that it is run by artificial intelligence, putting it at the forefront of efforts by the U.S. military to harness the capacities of an emerging technology whose vast potential benefits are tempered by deep concerns about how much autonomy to grant to a lethal weapon.

Essentially a next-generation drone, the Valkyrie is a prototype for what the Air Force hopes can become a potent supplement to its fleet of traditional fighter jets, giving human pilots a swarm of highly capable robot wingmen to deploy in battle. Its mission is to marry artificial intelligence and its sensors to identify and evaluate enemy threats and then, after getting human sign-off, to move in for the kill.

On a recent day at Eglin Air Force Base on Florida’s Gulf Coast, Maj. Ross Elder, 34, a test pilot from West Virginia, was preparing for an exercise in which he would fly his F-15 fighter alongside the Valkyrie.

“It’s a very strange feeling,” Elder said, as other members of the Air Force team prepared to test the engine on the Valkyrie. “I’m flying off the wing of something that’s making its own decisions. And it’s not a human brain.”

The Valkyrie program provides a glimpse into how the U.S. weapons business, military culture, combat tactics and competition with rival nations are being reshaped in possibly far-reaching ways by rapid advances in technology.

The emergence of AI is helping to spawn a new generation of Pentagon contractors who are seeking to undercut, or at least disrupt, the long-standing primacy of the handful of giant firms that supply the armed forces with planes, missiles, tanks and ships.

The possibility of building fleets of smart but relatively inexpensive weapons that could be deployed in large numbers is allowing Pentagon officials to think in new ways about taking on enemy forces.

It is also forcing them to confront questions about what role humans should play in conflicts waged with software that is written to kill, a question that is especially fraught for the United States given its record of errant strikes by conventional drones that have inflicted civilian casualties.

And gaining and maintaining an edge in AI is one element of an increasingly open race with China for technological superiority in national security.

Military planners are worried that the current mix of Air Force planes and weapons systems — despite the trillions of dollars invested in them — can no longer be counted on to dominate if a full-scale conflict with China were to break out, particularly if it involved a Chinese invasion of Taiwan.

That is because China is lining its coasts, and the artificial islands it has constructed in the South China Sea, with more than 1,000 anti-ship and anti-aircraft missiles that severely curtail the United States’ ability to respond to any possible invasion of Taiwan without massive losses in the air and at sea.

After decades of building fewer and fewer increasingly expensive combat aircraft — the F-35 fighter jet costs $80 million per unit — the Air Force now has the smallest and oldest fleet in its history.
That is where the new generation of AI drones, known as collaborative combat aircraft, will come in. The Air Force is planning to build 1,000 to 2,000 of them for as little as $3 million apiece — a fraction of the cost of an advanced fighter — which is why some in the Air Force call the program “affordable mass.”

There will be a range of specialized types of these robot aircraft. Some will focus on surveillance or resupply missions, others will fly in attack swarms and still others will serve as a “loyal wingman” to a human pilot.

The drones, for example, could fly in front of piloted combat aircraft, doing early, high-risk surveillance. They could also play a major role in disabling enemy air defenses, taking risks to knock out land-based missile targets that would be considered too dangerous for a human-piloted plane.

The AI — a more specialized version of the type of programming now best known for powering chatbots — would assemble and evaluate information from its sensors as it approaches enemy forces, identifying threats and high-value targets and asking the human pilot for authorization before launching any attack with its bombs or missiles.

The cheapest ones will be considered expendable, meaning they will likely fly only one mission. The more sophisticated of these robot aircraft might cost as much as $25 million, according to an estimate by the House of Representatives — still far less than a piloted fighter jet.

“Is it a perfect answer? It is never a perfect answer when you look into the future,” said Maj. Gen. R. Scott Jobe, who until this summer was in charge of setting requirements for the air combat program, as the Air Force works to incorporate AI into its fighter jets and drones.

“But you can present potential adversaries with dilemmas — and one of those dilemmas is mass,” Jobe said in an interview at the Pentagon, referring to the deployment of large numbers of drones against enemy forces. “You can bring mass to the battle space with potentially fewer people.”

The effort represents the beginning of a seismic shift in the way the Air Force buys some of its most important tools. After decades in which the Pentagon has focused on buying hardware built by traditional contractors like Lockheed Martin and Boeing, the emphasis is shifting to software that can enhance the capabilities of weapons systems, creating an opening for newer technology firms to grab pieces of the Pentagon’s vast procurement budget.

“Machines are actually drawing on the data and then creating their own outcomes,” said Brig. Gen. Dale White, the Pentagon official who has been in charge of the new acquisition program.
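To make the division of labor described above concrete, here is a minimal, purely illustrative sketch in Python of a human-in-the-loop authorization gate: the software ranks threats on its own, but a weapon is released only after an explicit human approval. All names here (ThreatTrack, request_authorization, engage) are invented for illustration and do not reflect any actual Air Force software.

```python
from dataclasses import dataclass
from enum import Enum, auto


class Decision(Enum):
    APPROVED = auto()
    DENIED = auto()


@dataclass
class ThreatTrack:
    track_id: str
    threat_score: float  # fused estimate from onboard sensors, 0.0 to 1.0


def evaluate_and_engage(tracks, request_authorization, engage, threshold=0.9):
    """Rank sensor tracks; release a weapon only with human approval."""
    for track in sorted(tracks, key=lambda t: t.threat_score, reverse=True):
        if track.threat_score < threshold:
            continue  # below the reporting threshold: keep observing
        # The software may identify and rank targets on its own, but any
        # use of force is gated on an explicit human decision.
        if request_authorization(track) is Decision.APPROVED:
            engage(track)
```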
The Air Force realizes it must also confront deep concerns about the military use of AI, whether fear that the technology might turn against its human creators (like Skynet in the “Terminator” film series) or more immediate misgivings about allowing algorithms to guide the use of lethal force.

“You’re stepping over a moral line by outsourcing killing to machines — by allowing computer sensors rather than humans to take human life,” said Mary Wareham, the advocacy director of the arms division of Human Rights Watch, which is pushing for international limits on so-called lethal autonomous weapons.

A recently revised Pentagon policy on the use of AI in weapons systems allows for the autonomous use of lethal force — but any particular plan to build or deploy such a weapon must first be reviewed and approved by a special military panel.

Asked whether Air Force drones might eventually be able to conduct lethal strikes without explicit human sign-off on each attack, a Pentagon spokesperson said in a statement to The New York Times that the question was too hypothetical to answer.

Any autonomous Air Force drone, the statement said, would have to be “designed to allow commanders and operators to exercise appropriate levels of human judgment over the use of force.”

Air Force officials said they fully understand that machines are not intelligent in the same way humans are. AI technology can also make mistakes — as has happened repeatedly in recent years with driverless cars — and machines have no built-in moral compass. The officials said they were considering those factors while building the system.

“It is an awesome responsibility,” said Col. Tucker Hamilton, the Air Force chief of AI Test and Operations, who also helps oversee the flight-test crews at Eglin Air Force Base, noting that “dystopian storytelling and pop culture has created a kind of frenzy” around AI.

“We just need to get there methodically, deliberately, ethically — in baby steps,” he said.

The long, wood-paneled corridor in the Pentagon where the Air Force’s top brass have their offices is lined with portraits of a century’s worth of leaders, mixed with images of the flying machines that have given the United States global dominance in the air since World War II.

A common theme emerges from the images: the iconic role of the pilot.

Humans will continue to play a central role in the new vision for the Air Force, top Pentagon officials said, but they will increasingly be teamed with software engineers and machine-learning experts, who will constantly refine the algorithms governing the operation of the robot wingmen that will fly alongside them.

Almost every aspect of Air Force operations will have to be revised to embrace this shift. It’s a task that through this summer had largely been entrusted to White and Jobe, whose partnership Air Force officers nicknamed the Dale and Frag Show (Jobe’s call sign as a pilot is Frag).

The Pentagon, through research divisions like the Defense Advanced Research Projects Agency and the Air Force Research Laboratory, has already spent several years building prototypes like the Valkyrie and the software that runs it.
But the experiment is now graduating to a so-called program of record, meaning that if Congress approves, substantial taxpayer dollars will be allocated to buying the vehicles: a total of $5.8 billion over the next five years, according to the Air Force plan.

Unlike the F-35 fighter jet, which Lockheed Martin and its subcontractors deliver as a package, the new program will split the aircraft and the software into separate purchases.

Kratos, the builder of the Valkyrie, is preparing to bid on any future contract, as are other major companies such as General Atomics, which for years has built attack drones used in Iraq and Afghanistan, and Boeing, which has its own experimental autonomous fighter-jet prototype, the MQ-28 Ghost Bat.

A separate set of software-first companies — tech startups such as Shield AI and Anduril that are funded by hundreds of millions of dollars in venture capital — are vying for the right to sell the Pentagon the AI algorithms that will handle mission decisions.

The list of hurdles that must be cleared is long.

The Pentagon has a miserable record on building advanced software and starting its own AI programs. Over the years, it has cycled through various acronym-laden program offices that were created and then shut down with little to show.

There is constant turnover among leaders at the Pentagon, complicating efforts to keep moving ahead on schedule. Jobe has already been assigned to a new role, and White soon will be.

The Pentagon will also need to disrupt the iron-fisted control that the major defense contractors have over the flow of military spending. As the structure of the Valkyrie program suggests, the military wants to do more to harness the expertise of a new generation of software companies to deliver key parts of the package, introducing more competition, entrepreneurial speed and creativity into what has long been a risk-averse and slow-moving system.

The most important job, at least until recently, rested with Jobe, who first made a name for himself in the Air Force two decades ago when he helped devise a bombing strategy to knock out deeply buried bunkers in Iraq that held critical military communication switches.

He was asked to make key decisions setting the framework for how the AI-powered robot airplanes will be built.
During a Pentagon interview, and at other recent events, Jobe and White both said one clear imperative is that humans will remain the ultimate decision-makers — not the robot drones, known as CCAs, the acronym for collaborative combat aircraft.

“I’m not going to have this robot go out and just start shooting at things,” Jobe said during a briefing with Pentagon reporters late last year.

He added that a human would always decide when and how to have an AI-enabled aircraft engage with an enemy, and that developers are building a firewall around certain AI functions to limit what the devices will be able to do on their own.

“Think of it as just an extension to your weapons bay if you’re in an F-22, F-35 or whatnot,” he said.

Back in 1947, Chuck Yeager, then a young test pilot from Myra, West Virginia, became the first human to fly faster than the speed of sound.

Seventy-six years later, another test pilot from West Virginia has become one of the first Air Force pilots to fly alongside an autonomous, AI-empowered combat drone.

Tall and lanky, with a slight Appalachian accent, Elder last month flew his F-15 Strike Eagle within 1,000 feet of the experimental XQ-58A Valkyrie — watching closely, like a parent running alongside a child learning how to ride a bike, as the drone flew on its own, reaching assigned speeds and altitudes.

The basic functional tests of the drone were just the lead-up to the real show, in which the Valkyrie moves beyond advanced autopilot tools and begins testing the war-fighting capabilities of its AI. In a test slated for later this year, the combat drone will be asked to chase and then kill a simulated enemy target over the Gulf of Mexico, coming up with its own strategy for the mission.

During the current phase, the goal is to test the Valkyrie’s flight capacity and the AI software, so the aircraft is not carrying any weapons. The planned dogfight will be with a “constructed” enemy, although the AI agent onboard the Valkyrie will believe it is real.

Elder had no way to communicate directly with the autonomous drone at this early stage of development, so he had to watch very carefully as it set off on its mission.

“It wants to kill and survive,” Elder said of the training the drone has been given.

An unusual team of Air Force officers and civilians has been assembled at Eglin, one of the largest Air Force bases in the world. It includes Capt. Rachel Price of Glendale, Arizona, who is wrapping up a doctorate at the Massachusetts Institute of Technology on deep learning, as well as Maj. Trent McMullen of Marietta, Georgia, who has a master’s degree in machine learning from Stanford University.

One of the things Elder watches for is any discrepancy between the simulations run by computer before the flight and the actions of the drone when it is actually in the air — a “sim-to-real” problem, they call it — or, even more worrisome, any sign of “emergent behavior,” in which the robot drone acts in a potentially harmful way.

During test flights, Elder or the team manager in the Eglin Air Force Base control tower can power down the AI platform while keeping the basic autopilot on the Valkyrie running. So can Capt. Abraham Eaton of Gorham, Maine, who serves as a flight test engineer on the project and is charged with helping evaluate the drone’s performance.
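That override is easy to picture in code. The sketch below is purely illustrative, assuming hypothetical FlightController, propose_command and hold_course names; it shows the pattern the test team describes, in which the experimental AI can be shut off at any moment while a conventional autopilot keeps flying the aircraft.

```python
class FlightController:
    """Toy model of the AI kill switch; all names here are invented."""

    def __init__(self, autopilot, ai_agent):
        self.autopilot = autopilot  # conventional, certified autopilot
        self.ai_agent = ai_agent    # experimental AI platform
        self.ai_enabled = True

    def kill_ai(self):
        # Callable by the chase pilot, the team manager in the control
        # tower, or the flight test engineer.
        self.ai_enabled = False

    def next_command(self, sensor_state):
        # On every control cycle, fall back to the basic autopilot the
        # moment the AI platform has been powered down.
        if self.ai_enabled:
            return self.ai_agent.propose_command(sensor_state)
        return self.autopilot.hold_course(sensor_state)
```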
“How do you grade an artificial intelligence agent?” Eaton asked rhetorically. “Do you grade it on a human scale? Probably not, right?”

Real adversaries will likely try to fool the AI, for example by creating a virtual camouflage for enemy planes or targets to make the robot believe it is seeing something else.

The initial version of the AI software is more “deterministic,” meaning it largely follows the scripts it has been trained with, based on computer simulations the Air Force has run millions of times as it builds the system. Eventually, the AI software will have to be able to perceive the world around it — and to learn to recognize these kinds of tricks and overcome them, skills that will require massive data collection to train the algorithms. The software will also have to be heavily protected against hacking by an enemy.

The hardest part of this task, Elder and other pilots said, is the vital trust-building that is such a central element of the bond between a pilot and a wingman — their lives depend on each other, and on how each of them reacts. It is a concern back at the Pentagon, too.

“I need to know that those CCAs are going to do what I expect them to do, because if they don’t, it could end badly for me,” White said.

In early tests, the autonomous drones have already shown that they will act in unusual ways, with the Valkyrie in one case going into a series of rolls. At first, Elder thought something was off, but it turned out that the software had determined its infrared sensors could get a clearer picture if it did continuous flips. The maneuver would have been like a stomach-turning roller-coaster ride for a human pilot, but the team later concluded that the drone had achieved a better outcome for the mission.

Air Force pilots have experience with learning to trust computer automation — like the collision-avoidance systems that take over if a fighter jet is headed into the ground or is set to collide with another aircraft, two of the leading causes of death among pilots.

Pilots were initially reluctant to go into the air with the system engaged, as it would allow computers to take control of their planes, several pilots said in interviews. As evidence grew that the system saved lives, it was broadly embraced. But learning to trust robot combat drones will be an even bigger hurdle, senior Air Force officials acknowledged.

Air Force officials used the word “trust” dozens of times in a series of interviews about the challenges they face in building acceptance among pilots. They have already started flying the prototype robot drones with test pilots nearby so that this process can get started.

The Air Force has also begun a second test program, called Project Venom, that will put pilots in six F-16 fighter jets equipped with AI software that will handle key mission decisions.

The goal, Pentagon officials said, is an Air Force that is more unpredictable and lethal, creating greater deterrence against any moves by China, and a less deadly fight, at least for the U.S. Air Force.

Officials estimate that it could take five to 10 years to develop a functioning AI-based system for air combat.
Air Force commanders are pushing to accelerate the effort — but they recognize that speed cannot be the only objective.

“We’re not going to be there right away, but we’re going to get there,” Jobe said. “It’s advanced and getting better every day as you continue to train these algorithms.”

This article originally appeared in The New York Times.