This week, Cape Henry Associates (CHA) presented our paper, “Digital Twins to Computer Vision: A Rapid Path to Augmented Reality Object Detection on the Battlefield,” at the 2021 Interservice/Industry Training, Simulation, and Education Conference (I/ITSEC) in Orlando, Fla. Chuck Wythe opened the session, and Nosika Fisher of CHA’s R&D department presented the paper.

Readers who have followed our blog series about our R&D team this year may have recognized that our paper centers on a topic we teased a few months ago, when we discussed the team identifying deer in the lab. At the time, they were working to speed up machine learning model development by using only synthetic images as training data and by automating the image capture and labeling processes. We wanted to test how effective those models could be when applied in artificial intelligence (AI)-driven object detection applications.

The main challenge in applying AI models to battlefield object identification is the amount of training data (e.g., labeled images) required to identify objects with a high level of confidence. In our example, we showed that thousands of images of similar but distinct objects were needed, in this case images of M1A1 Abrams tanks versus Panther tanks. While object identification at high levels of confidence is achievable, the amount of human effort required is substantial when using traditional photographic images and manual labeling techniques.
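To make that scale concrete, here is a minimal, hypothetical sketch of where those thousands of labeled images go: an off-the-shelf detector is trained from a dataset manifest that points at image/label pairs. The framework (Ultralytics YOLO), file names, class names, and training settings below are our own illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch only: trains an off-the-shelf detector on a folder of
# labeled images. Requires the third-party "ultralytics" package; the dataset
# manifest, class names, and settings are illustrative assumptions.
from ultralytics import YOLO

# dataset.yaml (hypothetical) points at train/val image folders and lists the
# two classes from the example, e.g. names: ["M1A1_Abrams", "Panther"].
model = YOLO("yolov8n.pt")                               # small pretrained backbone
model.train(data="dataset.yaml", epochs=100, imgsz=640)  # thousands of labeled images go here
metrics = model.val()                                    # accuracy (mAP) on the held-out split
```

Every image referenced by that manifest needs a matching label file, which is exactly where the manual effort piles up when working from traditional photographs.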

Using 3D models and CHA’s FogBoxer plug-in for Unity, the team produced, in only a few hours, results comparable to those of traditional methods that would require hundreds of man-hours, demonstrating that object detector applications for new classes of objects can be produced rapidly without sacrificing efficacy.
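For readers curious how automated labeling can work in principle, the sketch below shows the core idea in Python (FogBoxer itself is a Unity plug-in, so this illustrates the concept rather than its actual code): because the engine already knows where each 3D model sits relative to the virtual camera, the 2D bounding-box label for every rendered frame can be computed instead of drawn by hand. The camera parameters, object dimensions, and YOLO-style output format are all assumptions made for the example.

```python
# Illustrative sketch (not FogBoxer code): compute a 2D bounding-box label
# automatically from an object's known 3D placement and the virtual camera.
import numpy as np

IMG_W, IMG_H = 1920, 1080   # render resolution (assumed)
FOCAL = 1200.0              # focal length in pixels (assumed)

def project(points_cam: np.ndarray) -> np.ndarray:
    """Pinhole projection of 3D points (camera frame, metres) to pixel coordinates."""
    x, y, z = points_cam[:, 0], points_cam[:, 1], points_cam[:, 2]
    u = FOCAL * x / z + IMG_W / 2
    v = FOCAL * y / z + IMG_H / 2
    return np.stack([u, v], axis=1)

def yolo_label(corners_cam: np.ndarray, class_id: int) -> str:
    """Convert the projected corners of an object's 3D bounding box into a
    YOLO-format label line: class x_center y_center width height (normalised)."""
    px = project(corners_cam)
    u_min, v_min = px.min(axis=0)
    u_max, v_max = px.max(axis=0)
    xc = (u_min + u_max) / 2 / IMG_W
    yc = (v_min + v_max) / 2 / IMG_H
    w = (u_max - u_min) / IMG_W
    h = (v_max - v_min) / IMG_H
    return f"{class_id} {xc:.6f} {yc:.6f} {w:.6f} {h:.6f}"

if __name__ == "__main__":
    # Hypothetical tank-sized box placed 40 m in front of the virtual camera.
    # In a real pipeline these corners would come from the game engine's scene
    # graph for every rendered frame.
    half = np.array([4.0, 1.2, 1.85])         # half-extents in metres (assumed)
    centre = np.array([0.0, 0.5, 40.0])       # position in the camera frame
    signs = np.array([[sx, sy, sz] for sx in (-1, 1)
                                   for sy in (-1, 1)
                                   for sz in (-1, 1)])
    corners = centre + signs * half           # the 8 corners of the box
    print(yolo_label(corners, class_id=0))    # one label line for this frame
```

In a pipeline like this, the engine writes one such label line per rendered frame alongside the image itself, which is what removes the manual labeling bottleneck described above.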

The CHA team ended the presentation with a discussion of the future practicality of combining game-engine-driven AI object identification capabilities with digital twins for the rapid, automated generation of AI-powered applications that improve battlefield situational awareness. The presentation was well received, and the Q&A session that followed sparked a thought-provoking discussion.