MVRsimulation’s Virtual Reality Scene Generator (VRSG) supports simulation-based fixed-wing and unmanned aircraft flight training at the U.S. Air Force Academy (USAFA) Multi-Domain Laboratory (MDL) in Colorado, which opened in 2021.
The details: The lab has two flight bays, each containing a suite of 12 networked ZedaSoft Zuse Simulation Stations, and a remotely piloted aircraft (RPA) control room housing three ZedaSoft Mockingbird® Drone operator suites, configured to provide overwatch for the flight simulators.
VRSG provides visuals for both simulator systems, with 100 VRSG licenses in use at the Academy. For the Zuse stations, this includes real-time out-the-window visuals for each system’s three-monitor configuration, as well as radar views and 3D content. For Mockingbird, VRSG provides sensor views for the unmanned aircraft, including real-time generation of pilot and sensor-operator EO/IR and synthetic aperture radar (SAR) views.
The ZedaSoft systems act as role-player agents assigned from the top-level Air Tasking Order and instantiated through ZedaSoft’s Man-in-the-Loop Transfer of Control (MiLToC) software using DIS protocols. When a platform is assigned, all of its attributes are transferred at once, immediately immersing the trainee in the virtual battlespace.
Each flight simulator’s cockpit presentation is driven by the transferred platform data, including navigation, sensors, weapons, and fuel. The RPA simulators use the standardized ZedaSoft Mockingbird Drone software interface, which is likewise customized by the transferred platform data. Trainees can assume a variety of roles in the theater of operations, building knowledge of each platform’s contribution to winning the battle.
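The attribute hand-off described above can be sketched as packing a platform’s state into a single binary record and unpacking it on the receiving simulator. The field names and record layout below are hypothetical, chosen only for illustration; real DIS traffic uses the PDU formats defined in IEEE 1278.1, and MiLToC’s internals are not documented here.

```python
# Minimal sketch of a one-shot platform-state transfer, assuming a
# hypothetical simplified record (real DIS uses IEEE 1278.1 PDU layouts).
import struct

# Simplified record: entity ID, lat/lon (deg), altitude (m), fuel fraction,
# weapons count -- big-endian, matching DIS network byte order.
RECORD_FMT = ">H3dfB"

def pack_state(entity_id, lat, lon, alt_m, fuel_frac, weapons):
    """Serialize the platform attributes into one binary blob."""
    return struct.pack(RECORD_FMT, entity_id, lat, lon, alt_m, fuel_frac, weapons)

def unpack_state(blob):
    """Deserialize the blob on the simulator taking control."""
    entity_id, lat, lon, alt_m, fuel_frac, weapons = struct.unpack(RECORD_FMT, blob)
    return {"entity_id": entity_id, "lat": lat, "lon": lon,
            "alt_m": alt_m, "fuel_frac": fuel_frac, "weapons": weapons}

# A simulator assuming control would receive this blob over the network
# and apply every attribute at once (illustrative values).
state = unpack_state(pack_state(42, 38.9968, -104.8619, 2200.0, 0.85, 4))
```

Transferring the whole record in one message is what lets the receiving cockpit come up fully configured, rather than synchronizing attributes piecemeal.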
The virtual world: VRSG creates the virtual world in which simulated training scenarios take place. MVRsimulation’s round-earth 3D terrain covers most of the world, built entirely with MVRsimulation’s Terrain Tools for Esri® ArcGIS, and is available as standard to all VRSG license holders.
When used in applications such as fixed-wing cockpit simulation, VRSG terrain datasets serve as a baseline to which higher-fidelity information can be added to refine the database in a given area of interest. For example, VRSG’s dataset of Buckley Air Force Base in Colorado is built from 0.15 m-per-pixel imagery blended into the 1 m-per-pixel CONUS imagery surrounding the area. The terrain is enriched with cultural features, geospecific buildings, and over 4.5 million trees. This level of detail allows pilots to train in a virtual world that closely replicates the real one.
Virtual overwatch: VRSG offers extensive capability for RPA training and is widely used in this capacity, including as the primary visual system for U.S. Air Force Predator/Reaper simulators as part of the MJAT system. VRSG can be configured to use its internal camera payload model, with the simulated RPA’s telemetry provided by a DIS entity, or used in fully integrated applications such as MUSE/AFSERS.
A key feature of VRSG for simulated video feeds from unmanned intelligence-gathering platforms is its ability to stream real-time HD-quality simulated video with KLV metadata using H.264 encoding, producing a stream that is indistinguishable in composition from an actual UAS video feed.
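The KLV metadata carried alongside such a feed is a key-length-value packet keyed by a 16-byte Universal Label, as defined for the MISB ST 0601 UAS Datalink Local Set. The sketch below builds a packet with just two items (timestamp and sensor latitude) plus a checksum; it follows published ST 0601 conventions but is illustrative, not a conformant encoder (in particular, the checksum here is a simplified byte sum, whereas the standard specifies a 16-bit word sum), and says nothing about VRSG’s actual implementation.

```python
# Illustrative KLV packet builder in the style of MISB ST 0601.
import struct

# 16-byte Universal Label key for the UAS Datalink Local Set (MISB ST 0601).
UAS_LS_KEY = bytes.fromhex("060E2B34020B01010E01030101000000")

def ber_length(n: int) -> bytes:
    """Encode a BER length field (short form for n < 128, long form otherwise)."""
    if n < 128:
        return bytes([n])
    body = n.to_bytes((n.bit_length() + 7) // 8, "big")
    return bytes([0x80 | len(body)]) + body

def item(tag: int, value: bytes) -> bytes:
    """One local-set item: tag, BER length, value."""
    return bytes([tag]) + ber_length(len(value)) + value

def build_packet(timestamp_us: int, sensor_lat_deg: float) -> bytes:
    # Tag 2: Unix timestamp in microseconds, uint64.
    payload = item(2, struct.pack(">Q", timestamp_us))
    # Tag 13: sensor latitude, int32 mapped over +/-90 degrees.
    lat_int = round(sensor_lat_deg / 90.0 * 0x7FFFFFFF)
    payload += item(13, struct.pack(">i", lat_int))
    # Tag 1: checksum placeholder, filled in after the packet is assembled.
    payload += item(1, b"\x00\x00")
    packet = UAS_LS_KEY + ber_length(len(payload)) + payload
    # Simplified byte-sum checksum (ST 0601 itself uses a 16-bit word sum).
    checksum = sum(packet[:-2]) & 0xFFFF
    return packet[:-2] + struct.pack(">H", checksum)

pkt = build_packet(1_640_000_000_000_000, 38.9968)  # hypothetical values
```

Packets like this ride in the transport stream next to the encoded video frames, which is why a simulated feed can be consumed by the same exploitation tools as a live one.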
The result: VRSG brings flight and RPA simulators advanced capabilities for flexible, COTS-based training operations. As seen at the MDL, this pairing with ZedaSoft, along with technologies supplied by other industry partners at the lab, enables users to train in a mixed-reality environment that replicates the real world in all the ways that matter for laboratory and early-stage training.