In the VAM technology space, the solutions being created are predominantly focused on the commercial and civilian markets. Large telecoms operators are building edge computing into 5G-capable cell towers, along with the ability to store simultaneous localisation and mapping (SLAM) maps locally, while smartphone manufacturers are creating very small form factor screens with high resolution and brightness.
All of this is being done to create the building blocks of a wider augmented world that, arguably, we may all share very soon. Naturally, the military is also seeking to leverage VAM, and countless companies in the defence industry are working to develop their own solutions.
The use of the modalities within VAM depends on the level of virtualisation that is required for a given training task. The value of VR, for example, is its ability to simulate human presence in a completely virtual space.
‘The VR experience is meant to immerse the user such that one develops muscle memory and spatial awareness, and that enables a broader spectrum of ability to train like you fight and fight like you train,’ said Stephen Mitchell, director of the advanced readiness solutions unit in Northrop Grumman’s Defense Systems sector.
AR, meanwhile, is typically delivered by a headset using either optical or camera-based (digital) pass-through modalities; elements can also be projected onto surfaces such as a windshield. This information typically takes the form of symbology or virtual graphics added to the field of view (FOV).
MR is, in essence, a blend of AR and VR: it allows virtual objects to be visualised and interacted with, and virtual, sensor and instructional data to be delivered, within the real, physical world.
As an example, Collins Aerospace’s Coalescence MR system enables a trainee to interact with real objects and people while still immersed in a synthetic environment.
‘Zero delta latency is what we’re after, which means we want to match the real-world latency of the way things operate and work,’ said Nick Scarnato, director, strategic marketing, integrated solutions for Collins Aerospace. ‘We’ve spent a significant amount of time and effort developing a custom technology that’s hardware-accelerated to minimise latency significantly, since that is one of the key aspects of making digital pass-through work.’
Saab can also be seen as an early adopter of MR for military learning. One of its first applications was for counter-IED training, which included both classroom-based equipment work and real-time analysis of the performance of teams in the field.
‘Based on this work, we developed the Mixed Reality Sandbox, which is a battlefield visualisation system that allows commanders to view and interact with command and control systems in a holographic 3D environment,’ said David Ledger, lead for AR work at Saab Australia. ‘The Sandbox can of course be used for both real-time C2 as well as scenario gaming and training.’
Barriers to entry
MR is excellent for T&S as well as visualisation of real objects, but many contend that the principal barrier to entry is the cost of headsets such as Microsoft’s HoloLens or Magic Leap. Ledger added: ‘I believe MR will remain a more specialised solution while headset costs remain high but expect an explosion of VR/AR solutions now that Google has created a search engine for AR/VR material.’
There are a number of other challenges in this domain, including visual acuity, latency, clarity, human compensation, computation and connectivity. All technologies require the appropriate levels of display resolution, FOV and optical features such as depth, exposure or others depending on the use case.
For MR, visual acuity also demands the continuous and precise alignment of virtual objects with the real world. And as VAM technology proliferates, so does the use of wireless-driven systems, which bring security concerns about how data is transmitted, even via simple controllers and trackers.
According to Mitchell, computation and connectivity are the engineering problems that drive how and where virtual data is computed. ‘Where connectivity bandwidth is limited, computation must be done on board each user. This vastly affects weight, processing power and battery life for each participant,’ he said.
‘Where connectivity bandwidth is plentiful – a full 5G environment, for example – computation could theoretically be centralised to a server farm and broadcast to each head-worn “thin client”. Advancements are being made in all of these areas, but they remain the main challenges to be overcome.’
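Mitchell's bandwidth point can be put in rough numbers. As an illustrative sketch (the figures below are assumptions for the sake of arithmetic, not drawn from any programme), the question is whether an uncompressed rendered frame can cross the link within a 90Hz frame budget:

```python
# Back-of-envelope check: can a remote server stream rendered frames to a
# head-worn "thin client" inside a 90 Hz frame budget, or must rendering
# stay on board? All display and link figures here are hypothetical.

def transmit_ms(width, height, bytes_per_pixel, bandwidth_gbps):
    """Time to send one uncompressed frame over the given link, in ms."""
    bits = width * height * bytes_per_pixel * 8
    return bits / (bandwidth_gbps * 1e9) * 1000

FRAME_BUDGET_MS = 1000 / 90  # ~11.1 ms per frame at 90 Hz

# A hypothetical 2160x2160-per-eye display, 24-bit colour, two eyes,
# over a 1 Gbit/s link:
both_eyes = 2 * transmit_ms(2160, 2160, 3, bandwidth_gbps=1.0)
print(f"budget {FRAME_BUDGET_MS:.1f} ms, transmit {both_eyes:.1f} ms")
```

On these assumed figures the uncompressed frames alone exceed the budget roughly twentyfold, which is why limited bandwidth pushes computation on board each user; a multi-gigabit 5G link, plus compression, changes that arithmetic.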
Offering sage perspective on the growing call to use VAM technology was Dr Jim Frey, a training psychologist with Plexsys Interface Products: ‘I think one part of this technology wave that has crested is common sense. There are often rooms of acquisition folks calling for more technology when sometimes a ten-cent paper target could suffice. Most call for more tech as opposed to calling for better training. Generally speaking, there is a mantra which says “this is how all the kids learn these days”. But is it? Are high schools or college campuses filled with people with VR goggles? Where exactly is the “all kids” statement coming from?’
Darren Shavers, director of business development and FMS at Meggitt Training Systems, noted that there are certain training events that are extremely expensive to conduct live, and therefore VAM becomes a more affordable solution.
‘As the small-arms training programme of record [provider] for most NATO countries, plus a number of other transactional authorities, we see that VR has matured and can meet most of our customers’ requirements currently, but some of our bigger customers are advocating for mixed reality and thus it may become the premier way to train with sufficient development funding and time,’ he explained.
Significant improvements in head-mounted displays (HMDs) have opened the door to new training solutions, which could include portable and deployable systems for mission rehearsal. This is enabled through a combination of technologies that has reduced the space and weight of the display system, as well as through improved resolution.
Phil Perey, head of technology for defence and security at CAE, observed: ‘We’ve seen over the last year a maturity in headsets that nears 20/20 acuity at least in a portion of the display, and that to me is a significant step in terms of enhancing training capabilities in areas like flight instruction. This means we’re now able to see the words on instrument panels, instead of just the buttons.’
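Perey's '20/20 acuity in a portion of the display' maps onto simple arithmetic: 20/20 vision resolves about one arcminute, or roughly 60 pixels per degree, which wide-FOV panels cannot yet sustain across the whole view. A minimal sketch with hypothetical panel figures (not Varjo's published specifications):

```python
# Angular-resolution arithmetic: 20/20 acuity resolves ~1 arcminute,
# i.e. roughly 60 pixels per degree (ppd). Panel figures are illustrative.

def pixels_per_degree(horizontal_pixels, horizontal_fov_deg):
    """Average pixel density across a display's horizontal field of view."""
    return horizontal_pixels / horizontal_fov_deg

RETINAL_PPD = 60  # approximate 20/20 threshold

wide  = pixels_per_degree(2880, 90)  # a hypothetical wide-FOV VR panel
focus = pixels_per_degree(1920, 32)  # a hypothetical high-density focus area

print(f"wide-FOV panel: {wide:.0f} ppd, focus area: {focus:.0f} ppd")
```

The wide panel in this sketch delivers only about half the 20/20 threshold, while a dense panel confined to a narrow focus area reaches it, which is why acuity arrives 'at least in a portion of the display' first.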
CAE’s investment in this domain has allowed it to develop the Sprint VR Trainer, which was inspired in part by USAF use of VR in the Pilot Training Next (PTN) programme. Sprint VR uses the Varjo VR-2 headset and, along with CAE’s TRAX Academy curriculum, is meant to build on the capabilities found in programmes like PTN.
Varjo is a growing presence in providing HMDs. Its current VR display is the VR-2, available in two models, with the professional version supporting hand tracking via UltraLeap technology. The USAF’s Air Education and Training Command and USN’s Chief of Naval Air Training are currently upgrading existing VR-based systems, including PTN, with VR-2 headsets.
Varjo’s XR-1 is targeted at MR applications, using a video see-through capability powered by two 12MP colour cameras running at 90Hz that can be synchronised with synthetically generated content. The XR-1 is designed to support the emerging MR requirements of projects like the Pilot Training Transformation programme, where trainees will interact with both real and virtual content.
I/ITSEC 2019 featured Varjo’s XR-1 in a demo of an AH-64 Apache cockpit-based training simulation made in partnership with Bohemia Interactive Simulations. ‘You can see reality and the virtual simulation environment blending seamlessly together, while the pilot is able to see his/her hands on top of the physical cockpit. This allows unprecedented levels of sophisticated defence training,’ claimed Otakar Nieder, senior director of development at Bohemia Interactive Simulations.
Ledger spoke about Saab’s focus on AR headsets: ‘The latest generation of the HoloLens includes some new features and we are exploring ways to utilise these for military maintenance and training. We are currently prototyping a system which allows the HoloLens to automatically recognise a known object and bring up maintenance procedures to walk the user through each step.’
The company is also developing interfaces for HoloLens into its Tactical Engagement Simulation System product to provide real-time virtual input.
‘While the HoloLens is a wonderful tool, it is not a consumer device due to its cost, so we have also focused on AR solutions through mobile devices,’ added Ledger. ‘Currently, this has been limited to marketing military products through mobile devices, such as the Saab Solution AR app available now on iOS and Android; however, we are in the process of porting some of the HoloLens apps into iPads to provide a more cost-efficient solution for some AR applications.’
Many companies, however, like Lockheed Martin, take an open approach to VAM headsets. ‘We are trying to be very hardware-agnostic, so we take advantage of anything that’s available from a commercial perspective. We’ve used the Varjo XR-1 for some of our initiatives; and we’ve also integrated with the latest AR solutions that Microsoft has with its HoloLens,’ said Atul Patel, director of advanced technologies and innovation for Lockheed Martin Training and Logistics Solutions. ‘What we’re trying to do is stay in sync with these headset developers so we can introduce newer technologies as they become available.
‘We’ve recently introduced augmented reality training along with mixed reality at the RAF’s Synthetic Training System facility supporting the Chinook Mk 6 programme. We’re also seeing great application for maintenance training, so we’re starting to experiment with MR with our Maintain3d solution; a part of that is a very close tie-in with our Digital Transformation Initiative, which ensures we have a digital thread from the development environment all the way out to the user, and that allows for a digital twin of that platform as you conduct maintenance on it.’
Dan Groppa, principal technical project manager, simulation solutions and services for Collins Aerospace, shared similar views: ‘We don’t have one particular preferred manufacturer of AR/VR/MR headsets; we look at all of them and we do a custom fit for an integrated solution for our customers. Our intent from a technical standpoint is to make our technology HMD-agnostic and even IG-agnostic in some cases because we want to quickly adapt as new products come on the market.’
The US Army’s signature AR effort lies in the Integrated Visual Augmentation System (IVAS), based on the HoloLens. It consists of a see-through head-up display, which will be integrated into combat operations and enable soldiers to train in synthetic environments with the same equipment they use to fight. The system also includes thermal and low-light sensors, rapid target acquisition and aided target identification, along with AI capabilities.
Considered a ‘leap-ahead technology’, IVAS has made headlines since the concept was introduced when the army partnered with Microsoft in November 2018. Team IVAS consists of subject matter experts from the Soldier Lethality Cross-Functional Team (SL CFT), Program Executive Office Soldier, army labs, Microsoft and Army Forces Command units who support the soldier touch points (STPs) for which SL CFT has become known. Due to the COVID-19 pandemic, the much-anticipated STP 3 has been postponed from summer to autumn, which has led to speculation that fielding of IVAS to troops would ultimately be delayed.
According to BG David Hodne, director of the SL CFT, the development of IVAS has never been linear. ‘It’s precisely because we take a nimble approach to our processes that we can shift Soldier Touch Point Three and still deliver all the same capabilities in the fall without impacting STP 4 and the date we will equip that first unit,’ he said. ‘The team has always managed development across all the capability sets to capitalise on opportunities to accelerate technology or bring capabilities forward to get soldier feedback early on. This is an agile acquisition process that matches the agility of industry.’
STP 3 is now scheduled to start in mid- to late October and will put to the test the first ruggedised military form factor of IVAS. STP 4, in the early part of 2021, will put IVAS to the test at company level in a variety of combat scenarios to challenge system performance and network integration across multiple echelons.
Another company prominently active in the AR domain is Red 6. The California-based business is working to integrate virtual targets into the integrated live, virtual and constructive (I-LVC) environment for airborne live training.
Initially given a kick-start by the USAF’s AFWERX initiative, Red 6 has since won a number of small business innovation research contracts and is currently working with the Air Force Research Laboratory (AFRL) to take the project further.
Glenn Snyder, chief product officer at Red 6, told Shephard: ‘One of the biggest challenges that we have been working on is trying to get the brightness up so pilots can look towards the sun and still see virtual aircraft. You get a lot of benefits in using camera pass-through AR headsets, but for someone like a fighter pilot who has already been trained to have near-zero latency in their reaction times, adding a camera feed will tend to give negative training.
‘Over the past year, we have been working on finding screens that give us the best trade-off of low power consumption, low ambient heat that they give off, high brightness and high display resolution. Honestly, it’s been very difficult. We started by assuming we could use an off-the-shelf unit, but that evolved to manufacturing our own because there wasn’t anything out there that has low enough latency on the pass-through side to give us the brightness, or was bright enough on the image-based side to be able to see.’
The need to add the virtual to the I-LVC mix has been clear for a number of years. Although some argue that fast-jet combat engagements all take place beyond visual range, this is not strictly true, and the resulting training gap is crying out to be filled.
‘There is a lack of training and, more importantly, a lack of relevance in training,’ said Dan Robinson, Red 6’s co-founder and CEO. ‘In simple terms, every time we would go up to train, we would need someone to train against, and that represents a multibillion-dollar a year problem for the US Air Force alone. The same goes for the US Navy and marine corps and all allied nations, largely because there’s not enough Red Air aircraft to train against and because there is a chronic shortage of fighter pilots. That problem has only got worse over the years.’
In its pitch to the AFRL, the company was able to discuss a number of I-LVC issues that were well known to the USAF.
‘LVC is designed to connect real pilots in real airplanes with pilots in simulators on the ground and with AI-generated assets,’ explained Robinson. ‘The problem with LVC is that it’s a 50% solution because it’s a beyond-visual-range solution only. It absolutely represents the future in how we should train synthetically, but as soon as you transition to within visual range – ten nautical miles and in – the whole training system collapses because as soon as the pilot looks away from his scope and looks outside, there’s nothing there because there’s no way of putting those simulated airplanes into the real world, and that gets back to the fact that AR doesn’t work well outside or in dynamic environments. That is until now because that’s the piece of the jigsaw that we’ve solved.’
That effort led to an airborne demonstration in November 2019, which included a number of deliverables from the USAF Test Pilot School. Using a crawl, walk, run approach, Red 6 first demonstrated that it could create a 500ft (152m) framed cube which could be visualised using an AR headset.
‘The cube was scaled relative to the aircraft, so as we flew closer, the cube would get bigger, and because it’s a framed cube, we could fly through it as well, and that’s important because it demonstrates the immersive aspect of it,’ explained Robinson.
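The scaling Robinson describes is straight perspective geometry: the cube's apparent angular size grows as range closes, and that angle is all the headset needs to render. A minimal sketch (illustrative geometry only, not Red 6's implementation):

```python
import math

# Apparent angular size of a fixed-size object as range closes. For the
# demonstration's 500 ft cube, the rendered angle follows directly from
# the aircraft's distance to it.

def angular_size_deg(object_size_ft, range_ft):
    """Apparent angular size of an object at a given range, in degrees."""
    return math.degrees(2 * math.atan(object_size_ft / (2 * range_ft)))

CUBE_FT = 500
for rng in (5000, 1000, 500):
    print(f"{rng:>5} ft out: {angular_size_deg(CUBE_FT, rng):.1f} deg across")
```

At a mile out the cube subtends only a few degrees; at 500ft it fills over 50 degrees of the pilot's view, which is the 'gets bigger' effect that makes flying through the frame feel immersive.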
‘It was game-changing because it showed that we could create objects in the sky, and those could be threat-representative aircraft,’ said Robinson. ‘For the airborne test, we took US Air Force participants up to the virtual cube, and then we also flew them to a virtual KC-46 where we flew in formation, viewed the refuelling boom and positioned for pre-contact. It was so realistic that I could literally fly our airplane towards the boom such that it looked like it was entering the cockpit.
‘Afterwards, we did some barrel rolls around the KC-46. We then closed in on a Su-57, which was in an oval turn, so that allowed us to observe that [jet] and conduct circle entries to train for lead, lag, aspect, closure – all the things fighter pilots do. That’s how good it is, and I can tell you the air force was very impressed.’
A key development in the future of VAM/XR is a common development standard which is being led by the Khronos Group industry consortium. ‘OpenXR seeks to simplify AR/VR software development, enabling applications to reach a wider array of hardware platforms without having to port or rewrite their code and subsequently allowing platform vendors supporting OpenXR access to more applications. With the release of the OpenXR 1.0 specification, AR/VR developers can now create true cross-platform XR experiences,’ said Brent Insko, lead XR architect at Intel and the OpenXR Working Group’s chair.
Many believe the future lies with the integration of the entire extended-reality spectrum into a seamless human experience where the physical and virtual can co-exist and be interacted with in real time, all the time. This will require the integration of additional sensors to support natural interaction with the environment.
According to Shavers, the end game solution for the VAM domain addresses the following scenario: ‘We are about to go to war. In real time, we will need to conduct training between all our forces and the other NATO countries that will fight with us. This initial training or wargaming will need to happen in an MR world, where users in real time can touch and generally interface with all participants no matter where they are in the world. Think about Call of Duty as a real-world mission with people everywhere playing together.’
In addition, data extracted from these training serials can be analysed in real time or near real time to reinforce best practices and support after-action reviews (AARs).