By Kris Osborn – President & Editor-In-Chief, Warrior Maven
The (now former) Commanding General of the Air Force Research Lab is already thinking about the next generation of AI as something that could progressively move beyond pure database comparisons, perform more real-time learning and help analyze the more subjective phenomena unique to human cognition.
Artificial Intelligence – AI
“AI today is very database intensive. But what can and should it be in the future? How can we graduate it from just being a database to something that can leverage concepts and relationships, or emotions and predictive analyses? And so there’s so much more that we as humans can do that AI cannot? How do we get to that?” Maj. Gen. Heather Pringle, then-Commanding General of the Air Force Research Lab, told Warrior in an interview.
Pringle’s thinking appears closely aligned with cutting-edge AI research, which is indeed exploring ways to catalog, discern, interpret, organize and ultimately analyze things typically more difficult for mathematically driven machines to understand.
For example, as Pringle asked, what about emotion? Or certain subjective, more nuanced and varied concepts? Is there a way these kinds of cognitive phenomena could be accurately tracked by computers? It would certainly seem difficult, as an interwoven blend of emotional, philosophical and even psychological variables can all inform human behavior and human perceptions.
Nonetheless, Pringle does appear to be referring to areas of great promise, such as the emerging ability of AI to determine context and, for instance, distinguish a football “ball” from a ballroom “ball” by analyzing the surrounding words. This kind of machine learning represents the cutting edge, or new boundary, for AI, which Pringle explained is not without challenges. There is some early military and industry progress underway along these lines, as previous behavior patterns, philosophical concepts and speech patterns, for instance, can be cataloged and potentially analyzed to predict solutions.
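To make the “ball” example concrete, a minimal sketch of context-based disambiguation might look like the following. This is a deliberately simplified, Lesk-style word-overlap approach written in Python for illustration only; the sense signatures, vocabulary and function names are assumptions for the example and are not drawn from any Air Force program or fielded system.

```python
# Illustrative sketch: pick the sense of "ball" whose signature words overlap
# most with the surrounding sentence. Hypothetical data, not a real system.
SENSE_SIGNATURES = {
    "ball_sport": {"kick", "throw", "goal", "team", "field", "score"},
    "ball_dance": {"dance", "gown", "orchestra", "waltz", "invitation", "evening"},
}

def disambiguate(sentence: str) -> str:
    """Return the sense whose signature shares the most words with the context."""
    context = set(sentence.lower().split())
    best_sense, best_overlap = None, -1
    for sense, signature in SENSE_SIGNATURES.items():
        overlap = len(context & signature)
        if overlap > best_overlap:
            best_sense, best_overlap = sense, overlap
    return best_sense

print(disambiguate("The team hoped to score a goal and kick the ball"))      # ball_sport
print(disambiguate("She wore a gown to the evening ball and danced a waltz"))  # ball_dance
```

Modern systems replace the hand-built signature sets with learned word embeddings, but the underlying idea, letting surrounding words decide the meaning, is the same.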
However, as Pringle explains, there are still many yet-to-be-understood complexities and variables, and many things specific to human consciousness and decision-making seem well beyond the reach of what AI-enabled systems can do.
“The AI that we see today, like the navigation systems that automatically give you a pathway to get from point A to point B? Well, we placed a lot of trust in those systems, but the consequences are pretty low. And so it was pretty easy to develop a human machine trusting relationship. But when we’re talking about warfare and warfighters, we want to build in that trust along the way,” Pringle explained.
This challenge is often referred to as “zero trust,” meaning that advanced AI-empowered algorithms need to improve reliability by better integrating an ability to assimilate and analyze new data or information that is not part of their database. It is often said that AI is only as effective as its database, as it must bounce new or incoming information off of a vast or seemingly limitless store of existing data. So what happens when an AI-capable computer comes across something it has not seen? That is a fundamental predicament in certain respects, as there are numerous abilities and faculties entirely unique to human cognition that cannot be replicated by machines… at least not yet.
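One common engineering answer to the “something it has not seen” problem is novelty detection: flag inputs that sit far from everything in the reference database and hand them to a human rather than acting on them. The sketch below shows the idea in Python using a simple nearest-neighbor distance check; the threshold, feature dimensions and names are illustrative assumptions, not any specific military or AFRL system.

```python
# Illustrative sketch: defer to a human when an input is far from the database.
import numpy as np

reference_db = np.random.rand(1000, 8)   # stand-in for a curated feature database
NOVELTY_THRESHOLD = 0.6                  # assumed cutoff; would be tuned on held-out data

def classify_or_defer(sample: np.ndarray) -> str:
    """Return 'machine' if the sample resembles known data, else 'defer_to_human'."""
    distances = np.linalg.norm(reference_db - sample, axis=1)
    return "machine" if distances.min() < NOVELTY_THRESHOLD else "defer_to_human"

print(classify_or_defer(np.random.rand(8)))   # resembles the database: likely 'machine'
print(classify_or_defer(np.full(8, 10.0)))    # far outside the database: 'defer_to_human'
```

The design choice here mirrors Pringle’s point about trust: the machine only acts on inputs it can relate to its data, and escalates the rest.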
Part of the solution, Pringle explained, lies in improving the human-machine interface, meaning each can inform the other in a way that optimizes data analysis and decision-making. Pringle described this as a “symbiotic relationship.”
“Right now, a lot of times when we see AI, we don’t fully understand why it’s taking the actions that it is. It’s leveraging so much data and coming up with novel solutions that we can’t understand. So it’s going to cause the trust relationship to be a little bit lower… Then, at a point in the future, when we’re able to make that more transparent, or have the AI or autonomous vehicle communicate better with the human or to even respond to the human, we even have a line of research where we’re looking at how can we adapt a machine to respond to what the human is learning, knowing, understanding, communicating,” Pringle explained.
At the same time, Pringle was also clear that alongside longer-term efforts to uncover breakthrough methods of using computers to discern and analyze certain more ineffable human characteristics, there are many extremely promising near-term applications of AI already producing breakthroughs of great consequence to current platforms, weapons and networks.
For instance, AI and autonomy are already helping fixed-wing aircraft such as the F-35 share data in real time with nearby drones, a step toward ultimately enabling a 5th-generation stealth fighter to operate numerous drones from the cockpit. This reduces latency and massively multiplies tactical options for pilots, who could use drones to test enemy defenses, blanket an area with surveillance or even fire weapons when directed by a human.
Early iterations of this have already been demonstrated through the Air Force’s Valkyrie program, in which an unmanned system flew alongside an F-35 and an F-22 while sharing information in real time. By extension, the Valkyrie drone has even launched mini-drones itself. The Kratos-built Valkyrie released an ALTIUS-600 mini-drone in what the Air Force describes as the first-ever opening of its internal weapons bay in flight. This demonstrates a number of interesting and significant tactical possibilities, as a drone-launched drone could operate as a mini scout-surveillance node over extremely hostile or high-threat areas amid heavy enemy fire. Not only would its small size give it a better chance of avoiding being shot down, but a small drone of this kind could itself function as a weapon. The Valkyrie is configured to drop bombs and fire weapons as part of a manned-unmanned teaming operational scope.
The concept, of course, is ultimately to ensure human command and control in a supervisory capacity, especially when it comes to the use of lethal force, yet breakthroughs in AI and autonomy can enable machines and unmanned systems to perform a growing number of time-sensitive warfare functions without needing human intervention. Pringle explained that researchers and weapons developers are still exploring the complexities of these sorts of questions and technologies.
“What is the role of the human? How are they managing these systems? How can we ease their cognitive load? How can we be most efficient with the number of vehicles? What is the right ratio of the vehicles? There are a lot of really great S&T questions to answer,” Pringle said.
“There’s a lot of challenges to address when you’re looking at increasing the number of systems and the number of platforms, due to the integration and the data links between them,” she added.
In yet another instance, emerging programs such as Golden Horde are already demonstrating an ability for weapons to autonomously share data while en route to a target, greatly expanding the tactical attack envelope and introducing the ability for weapons to change course in flight.
The largest or most impactful near-term application of AI, arguably, is its continued contribution to “data processing” and identifying moments of relevance at the point of collection, streamlining the networking of organized, relevant information across the battlefield in near real time. This can enable multi-domain connectivity, connecting fighter jets with command and control centers, bombers, drones and even ground forces and Navy ships. The speed at which new information can be gathered, bounced off a database, analyzed and effectively transmitted continues to massively increase the speed of attack, reduce “sensor-to-shooter” time and enable attacking forces to operate inside of, or ahead of, an enemy’s decision-making process.
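As a rough illustration of what “identifying moments of relevance at the point of collection” can mean in practice, the Python sketch below scores each incoming sensor frame and forwards only the relevant ones across the network, shortening the sensor-to-shooter chain. The field names, scoring rule and threshold are hypothetical assumptions chosen for the example, not a description of any fielded program.

```python
# Illustrative sketch: filter sensor data at the edge, transmit only relevant frames.
from dataclasses import dataclass

@dataclass
class SensorFrame:
    sensor_id: str
    target_confidence: float   # e.g., output of an onboard recognition model
    timestamp: float

RELEVANCE_THRESHOLD = 0.85     # assumed cutoff for a "moment of relevance"

def filter_at_edge(frames: list) -> list:
    """Keep only the frames worth transmitting across the battlefield network."""
    return [f for f in frames if f.target_confidence >= RELEVANCE_THRESHOLD]

frames = [
    SensorFrame("f35_eo_1", 0.42, 1001.0),
    SensorFrame("f35_eo_1", 0.93, 1002.0),   # the one 'moment of relevance'
    SensorFrame("drone_ir_2", 0.10, 1003.0),
]
print(filter_at_edge(frames))  # only the high-confidence frame is forwarded
```

The point of the design is bandwidth and time: instead of pushing every frame to a command center for analysis, only the organized, relevant slice of the data moves across the network.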
Kris Osborn is the defense editor for the National Interest and President of Warrior Maven – the Center for Military Modernization. Osborn previously served at the Pentagon as a Highly Qualified Expert with the Office of the Assistant Secretary of the Army—Acquisition, Logistics & Technology. Osborn has also worked as an anchor and on-air military specialist at national TV networks. He has appeared as a guest military expert on Fox News, MSNBC, The Military Channel, and The History Channel. He also has a Master’s Degree in Comparative Literature from Columbia University.