The Kabul times, Afghanistan Trustable News Agency.

US tests AI-controlled F-16 fighter jet; what it means for war

With the midday sun blazing, an experimental orange and white F-16 fighter jet launched with the familiar roar that is a hallmark of US airpower. But the aerial combat that followed was unlike any other: this F-16 was controlled by artificial intelligence, not a human pilot. And riding in the front seat was Air Force Secretary Frank Kendall.

AI marks one of the biggest advances in military aviation since the introduction of stealth in the early 1990s, and the Air Force has aggressively leaned in. Even though the technology is not fully developed, the service is planning for an AI-enabled fleet of more than 1,000 unmanned warplanes, the first of them operating by 2028.

It was fitting that the dogfight took place at Edwards Air Force Base, a vast desert facility where Chuck Yeager broke the sound barrier and the military incubated its most secret aerospace advances. Inside classified simulators and buildings with layers of shielding against surveillance, a new generation of test pilots is training AI agents to fly in war. Kendall traveled here to see AI fly in real time and to make a public statement of confidence in its future role in air combat.

“It’s a security risk not to have it. At this point, we have to have it,” Kendall said in an interview with The Associated Press after he landed. The AP, along with NBC, was granted permission to witness the secret flight on the condition that it not be reported until the flight was complete, because of operational security concerns.

The AI-controlled F-16, called Vista, flew Kendall through lightning-fast maneuvers at more than 550 miles per hour that put pressure on his body at five times the force of gravity. It went nearly nose to nose with a second, human-piloted F-16 as the two aircraft raced within 1,000 feet of each other, twisting and looping to try to force their opponent into a vulnerable position. At the end of the hour-long flight, Kendall climbed out of the cockpit grinning.
He said he had seen enough during his flight that he would trust this still-learning AI to decide whether or not to launch weapons in war.

There is strong opposition to that idea. Arms control experts and humanitarian groups are deeply concerned that AI might one day be able to drop bombs that kill people autonomously, without further human consultation, and they are seeking greater restrictions on its use. “There are widespread and serious concerns about ceding life-and-death decisions to sensors and software,” the International Committee of the Red Cross has warned. Autonomous weapons “are an immediate cause of concern and demand an urgent, international political response.” (Al-Arabiya)

