Experiencing the battlefield before boots hit ground

Posted: August 24, 2023

Simulations are essential to the training and readiness of our armed forces. Traditionally they have taken the form of field exercises and practice missions, but the military increasingly uses computer-generated synthetic environments to train soldiers and plan dangerous missions.

With funding from the Office of Naval Research (ONR), engineers at The Ohio State University are developing next-generation 3D geoinformation techniques to make synthetic battlefield environments as accurate and as realistic as possible. Associate Professor Rongjun Qin is the principal investigator.

[Photo] Rongjun Qin at the ONR Science Fair in Quantico, Virginia (2022)

Synthetic environments (SEs) aim to replicate real-world scenarios closely enough that simulated activities can accurately predict potential engagements in the field, so that decisions made in an SE can be carried over to the real world. Qin’s team will develop a set of fundamental techniques to improve the error characterization, semantic resolution, and realism of geo-specific 3D photogrammetric models, an approach he calls “uncertainty-aware photogrammetric reconstruction.” The overarching goal is to enhance the fidelity and trustworthiness of 3D data for virtual combat training and rehearsal, as well as for many other applications such as pedestrian, traffic, flood and wind simulations.

According to Qin, users of 3D data are often unaware of how accurately (or inaccurately) the data represent the physical world. 3D data are intricate, and the level of error varies from vertex to vertex. Qin’s approach will harness statistical modeling, physics-based 3D photogrammetry, and AI to characterize the errors of 3D assets in fine detail, so that SE users know the trustworthiness and precision of every part of the physical world the data represent.
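To make the idea of per-vertex error concrete, here is a minimal illustrative sketch (not Qin’s actual method) of how uncertainty might be attached to reconstructed geometry: assuming each vertex carries a 3×3 positional covariance matrix, a scalar trust score can be computed per vertex and used to flag low-confidence regions of a model. All function names and the threshold are hypothetical.

```python
import numpy as np

def vertex_uncertainty(covariances):
    """covariances: (N, 3, 3) array of per-vertex positional covariance
    matrices. Returns an (N,) array of RMS positional sigmas, i.e.
    sqrt(trace(cov) / 3), in the same length units as the model."""
    traces = np.trace(covariances, axis1=1, axis2=2)
    return np.sqrt(traces / 3.0)

def flag_untrusted(covariances, threshold):
    """Boolean mask of vertices whose RMS sigma exceeds `threshold`."""
    return vertex_uncertainty(covariances) > threshold

# Example: three vertices with increasing positional error (metres).
covs = np.array([
    0.01 * np.eye(3),   # RMS sigma = 0.1 m
    0.04 * np.eye(3),   # RMS sigma = 0.2 m
    1.00 * np.eye(3),   # RMS sigma = 1.0 m
])
mask = flag_untrusted(covs, threshold=0.5)  # flags only the third vertex
```

A simulation front end could then render flagged vertices differently (for example, dimmed or color-coded), so a trainee sees at a glance which parts of the scene are trustworthy.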

“This will help users of SE to make the most accurate decisions in simulation and transfer that experience to the real world,” Qin explained.

This is Qin’s third research project funded by ONR. The other two focused on reducing redundancies in heterogeneous data to create seamless 3D models, and on exploiting crowd-sourced photographs and satellite images to achieve low-cost, city-scale 3D reconstruction, respectively.

Qin holds a joint appointment in the Department of Civil, Environmental and Geodetic Engineering and the Department of Electrical and Computer Engineering. His cross-disciplinary background underpins his expertise in using images and videos to perform accurate localization and detection of objects.

Category: Research