Blending virtual and real worlds

Augmented reality, holographic displays, synthetic training – the concepts for future fighter aircraft cockpits are pushing the boundaries of technology. Chris Hepburn, senior engineer at BAE Systems, reveals some of the projects being explored which could transform how a pilot operates.

[Image: An idea of how the future cockpit could look in the Tempest programme, with holographic displays and interactive modelling]

You’re 18, you’ve just left school and you’re joining the Royal Air Force. Imagine being issued with an artificial intelligence headset as you start your training as a pilot. You can watch flight school videos, link up to a control column and throttle in a classroom and actually start learning how to fly – all in a virtual world. When you progress to the training aircraft for live flying, you are learning in new physical spaces, but you’re wearing that same headset with the same information being presented. Moving on to either a Typhoon or the new Tempest front-line fighter, you’re still wearing the same headset. It provides a continuous stream of training throughout the entire lifecycle, with no negative learning and no changing of devices or relearning ways of working – one seamless training pipeline.

Essentially, this is about using technology to learn and operate in an intuitive and immersive manner. We are interested in exploring how a future cockpit could look. That applies to future programmes such as Tempest, as well as to current aircraft such as Hawk and Typhoon, which will have a fundamental role in developing technology for Tempest and which are strategically critical to what we want to do in future.

This approach and direction isn’t driven by a current need; it’s driven by a future requirement. Right now, a pilot uses the man-machine interfaces in an environment where they must actively fly the aircraft. In future, through artificial intelligence and autonomy, the aircraft will largely be able to fly itself. There is a feeling that the pilot will become more of a systems manager, much as in a car: as cars become autonomous, the driver becomes more of a passenger. So what can you physically do as that passenger? As a forward operational commander in the aircraft, that person will need to be able to control the environment more fluidly. Augmented reality massively enhances the ability to view and understand the battlespace – observing and interacting with a hologram rather than looking at a flat screen and pushing buttons.

[Image: Hand and head tracking will be a key element of a future cockpit]

A stepped approach

At its core, the next generation of cockpit leverages the adaptability of the current Striker II helmet-mounted display (HMD). The full-colour HMD is used to project augmented and virtual reality interactive cockpit displays and controls directly in front of the pilot’s eyes – and these replace the current physical ones.

The technology allows pilots to customise the display and the way they interact with it based on their own personal preferences and mission objectives. The technology is designed to improve situational awareness, speed of decision-making and the ability to affordably and rapidly upgrade the cockpit in line with aircraft enhancements.

We are using Striker II as a stepping stone to a cockpit that is fully augmented, where the only interface with the aircraft would be via the stick and throttle, potentially with the pilot only touching a synthetic environment that isn’t there in the real world. With augmented reality you can still look out of the cockpit, but imagine being able to see all the missile engagement zones, being able to keep your head up constantly.

Virtual reality is used for design and development because it can put engineers in a representative environment. Augmented reality is great, but we can’t easily go flying at 50,000ft to see what it looks like, so virtual reality allows us to design for the augmented side – to be there and test these ideas out.

The work includes voice inputs, use of gestures, eye-tracking and touch to allow the pilot to interact with these virtual cockpit controls – for example, selecting something on the helmet display and then using a gesture to reposition it.
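To make that interaction model concrete, here is a minimal Python sketch of how gaze selection and gesture repositioning might fit together. The panel names, dwell threshold and tracking inputs are illustrative assumptions, not details of the actual Striker II system.

```python
from dataclasses import dataclass

# Hypothetical illustration: gaze-to-select, gesture-to-move for a virtual
# cockpit panel. All names and thresholds here are invented for the sketch.

@dataclass
class VirtualPanel:
    name: str
    position: tuple[float, float, float]  # metres, in a cockpit-fixed frame
    selected: bool = False

class MultimodalController:
    def __init__(self, panels: list[VirtualPanel]):
        self.panels = panels

    def on_gaze(self, target_name: str, dwell_ms: float) -> None:
        """Select a panel once the pilot's gaze dwells on it long enough."""
        if dwell_ms < 400:  # illustrative dwell threshold
            return
        for panel in self.panels:
            panel.selected = (panel.name == target_name)

    def on_pinch_drag(self, delta: tuple[float, float, float]) -> None:
        """Reposition whichever panel is selected by the pinch-drag vector."""
        for panel in self.panels:
            if panel.selected:
                x, y, z = panel.position
                dx, dy, dz = delta
                panel.position = (x + dx, y + dy, z + dz)

panels = [VirtualPanel("fuel", (0.4, -0.2, 0.8)),
          VirtualPanel("radar", (-0.4, -0.2, 0.8))]
ctrl = MultimodalController(panels)
ctrl.on_gaze("radar", dwell_ms=520)   # eye-tracking selects the radar panel
ctrl.on_pinch_drag((0.1, 0.05, 0.0))  # a gesture drags it up and to the right
```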

[Image: There will be no need for physical displays; it will all be configured through augmented reality]

The aim is to develop a capability to meet the changing and increasingly complex demands that future developments in combat aircraft will bring. The concept will be usable in multiple aircraft, reducing development time, cost and training requirements – effectively doing things quicker, cheaper and smarter.

It would also bring several potential benefits to the pilot, including the ability to change the number and size of displays to the pilot’s precise preferences, without being restricted by physical space, and to layer information onto the display, such as ground terrain overlays, pop-up tracking windows, restricted airspace and 3D situational displays.
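Once the physical space constraint disappears, a cockpit layout becomes pure data per pilot. The sketch below illustrates that idea with a configurable layout and toggleable overlays; all panel names, angular sizes and overlay types are assumptions made for the example.

```python
from dataclasses import dataclass

# Hypothetical sketch of a pilot-configurable virtual display layout.
# Nothing here reflects the real system's data model.

@dataclass
class DisplayPanel:
    name: str
    width_deg: float     # angular size in the helmet's field of view
    height_deg: float
    azimuth_deg: float   # where the panel sits relative to boresight
    elevation_deg: float

@dataclass
class CockpitLayout:
    pilot: str
    panels: list[DisplayPanel]
    overlays: set[str]   # e.g. terrain, restricted airspace, tracking windows

    def toggle_overlay(self, overlay: str) -> None:
        """Overlays are additive layers, so they can be flipped freely."""
        if overlay in self.overlays:
            self.overlays.remove(overlay)
        else:
            self.overlays.add(overlay)

# With no physical constraint, reconfiguring the cockpit is just editing data:
layout = CockpitLayout(
    pilot="P1",
    panels=[
        DisplayPanel("engines", 10, 8, azimuth_deg=-25, elevation_deg=-10),
        DisplayPanel("tactical", 30, 20, azimuth_deg=0, elevation_deg=-5),
    ],
    overlays={"ground_terrain"},
)
layout.toggle_overlay("restricted_airspace")  # pop an airspace overlay on
```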

Autonomous capability can also be used for physiological monitoring of the operator, understanding the pilot’s stress levels. If the system felt the pilot was under too much duress or struggling, it would look at taking over some of the workload to let them concentrate on the task at hand. We are looking at things such as pupil dilation and heart rate: if a missile is coming at the aircraft, the pilot is likely to be concentrating on that factor – the countermeasures – so the autonomy might take over something else.
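A toy sketch of how such monitoring might drive task delegation is below. The signals, weightings, thresholds and task names are invented for illustration and are not BAE Systems’ design.

```python
# Illustrative only: a crude workload monitor that offloads lower-priority
# tasks to the autonomy when physiological cues suggest saturation.

def workload_score(pupil_dilation_mm: float, heart_rate_bpm: float) -> float:
    """Combine two cues into a 0-1 score (toy weighting for illustration)."""
    pupil = min(max((pupil_dilation_mm - 3.0) / 3.0, 0.0), 1.0)
    heart = min(max((heart_rate_bpm - 60.0) / 80.0, 0.0), 1.0)
    return 0.5 * pupil + 0.5 * heart

def delegate_tasks(score: float, tasks: list[str]) -> list[str]:
    """Return the tasks the pilot keeps; the rest go to the autonomy."""
    if score > 0.8:
        return tasks[:1]   # pilot keeps only the top-priority task
    if score > 0.6:
        return tasks[:2]
    return tasks           # pilot retains everything

# Missile inbound: pilot fixates on countermeasures, autonomy takes the rest.
tasks = ["countermeasures", "navigation", "fuel management"]
score = workload_score(pupil_dilation_mm=6.5, heart_rate_bpm=150)
print(delegate_tasks(score, tasks))  # -> ['countermeasures']
```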

We are seeing this kind of technology in vehicles and gaming – holographic displays will change the way we interface with our environment. It will be a new way to interact with things in day-to-day life, making everything more adaptable.

That is the key word here. If, say, the pilot needs to take control because the systems fail, then the traditional flight controls – the get-you-home devices – would pop out. But if everything’s running fine, you don’t need to see those things at all; they’d be superfluous. You want a blank canvas, to be able to say: “I’m only interested in the operational space right now.”
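That reversionary behaviour can be pictured as a simple mode switch, sketched below. The states and display names are hypothetical, chosen only to illustrate the blank-canvas-until-failure idea.

```python
from enum import Enum, auto

# Minimal sketch of the "get-you-home" reversion: the virtual cockpit stays
# clear until a failure is reported, then traditional displays pop back in.

class CockpitMode(Enum):
    OPERATIONAL = auto()   # blank canvas, mission content only
    REVERSIONARY = auto()  # traditional get-you-home displays shown

class AdaptiveCockpit:
    def __init__(self) -> None:
        self.mode = CockpitMode.OPERATIONAL
        self.visible = ["tactical_picture"]

    def on_health_report(self, systems_ok: bool) -> None:
        """Swap the display set the moment a failure is reported."""
        if systems_ok:
            self.mode = CockpitMode.OPERATIONAL
            self.visible = ["tactical_picture"]
        else:
            self.mode = CockpitMode.REVERSIONARY
            self.visible = ["attitude", "airspeed", "altitude", "heading"]

cockpit = AdaptiveCockpit()
cockpit.on_health_report(systems_ok=False)
print(cockpit.visible)  # the traditional instruments pop back out
```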

[Image: Chris Hepburn with a virtual reality headset]
All photos courtesy of BAE Systems

I see this as a phased approach. In a Typhoon you have all the traditional controls and displays, but when you bring the helmet visor down you unlock all that extra information. As we gain confidence in what it can and can’t do, we will better understand what can be replaced or is no longer needed, and we can build assurance.

The best comparison I can make is with the Tesla car. Tesla has the technology for a fully autonomous car – you could sit in it and it would drive you from point A to point B – but you aren’t allowed to use it that way. The steering wheel knows if you’re gripping it; if you let go, the car pulls over and puts the hazard lights on. That’s about building confidence in the system and about meeting strict rules and regulations. We do the same, in a rigorous, layered process.

Design and development

We are building that knowledge and confidence with two fighter aircraft development simulators, one of which includes an augmented reality headset. Every few minutes an engineer is in there checking or trying something. Our fail-fast mantra means that if something doesn’t work, we move on quickly, learning from the experience.

It has all manner of implications, from the fighter cockpit to the way we train – we could potentially get away from the need for huge dome simulators and into using virtual reality headsets in far smaller spaces, bringing the associated costs down to thousands rather than millions.

There’s a big cost-reduction element to this, and it’s why it has been so successful so quickly. Lifecycles are massively reduced in terms of design and development, requiring smaller test facilities and teams. If you look back 20 years, prestige was based on how big you were; now it’s all about how small and agile you are while producing the same quality products.

The Tempest programme calls for an aircraft to be in the air by 2035, and it will undoubtedly have an augmented cockpit. Already in Typhoon, Striker II brings a level of that technology, and over the coming years I think we’ll see Typhoon bringing in most of that new capability as we move towards Tempest, proving the technology delivers and continuing to evolve what Typhoon can do.