Integrating unmanned aircraft into civil airspace is a big challenge. As Mark Broadbent discovers, research into this area continues in the UK with a Jetstream J31
When in the late 2000s the UK aviation industry started researching how unmanned air vehicles (UAVs) could be successfully integrated into civil airspace, the issues of transparency and equivalence were identified as key areas to address.
Transparency refers to ensuring a UAV performs in non-segregated airspace in the same way as a manned aircraft, and equivalence refers to ensuring a UAV is at least as safe as a conventional aircraft in that environment.
The issue was how to achieve these things. Maureen McCue, Head of Research and Technology for Military Air and Information at BAE Systems, said: “That’s what ASTRAEA set out to solve.”
The Autonomous Systems Technology Related Airborne Evaluation and Assessment (ASTRAEA) project involved a consortium of companies researching technologies to allow the routine operation of UAVs in the UK’s commercial airspace. The organisations involved included Airbus, AOS, BAE Systems, Cobham, QinetiQ, Rolls-Royce and Thales.
ASTRAEA’s starting point was that unmanned aircraft fundamentally changed the key regulations affecting commercial aviation, because they remove the pilot from the aircraft and move human supervision to the ground. A first phase researched the requirements that would be needed to ensure safe and reliable autonomous operations, while phase two researched how new technologies – sense-and-avoid, autonomy, communications, operations and human/system interaction – would achieve that all-important transparency and equivalence.
McCue said: “There was increased regulatory engagement, all the way up to working some draft regulations, and working hand-in-hand with the regulator developing and testing those systems.”
BAE Systems provided its stalwart Jetstream J31 Flying Test Bed (G-BWWW) as a platform to test the new technologies. The culmination of the work came in April 2013 when G-BWWW flew from BAE Systems Warton to Inverness Airport, completing the first flight in civil UK airspace by an unmanned aircraft – technically, a ‘surrogate’ aircraft, as there were pilots aboard – in non-segregated airspace.
McCue said: “The aim was that the research and the regulations would converge in ASTRAEA 3, and then ASTRAEA 3 would give a body of evidence to give confidence that transparency and equivalence could be achieved and regulated.”
That was not to be. The UK Government, which since 2006 had provided funding for ASTRAEA through increments from the Department for Business, Innovation and Skills, in September 2015 rejected a further request from the ASTRAEA consortium for a £26 million grant.
Despite the funding cut halting the ASTRAEA programme after nearly ten years’ work, BAE Systems nevertheless decided to build on ASTRAEA by continuing research and development on technologies designed to prove the safe operation of unmanned aircraft in UK airspace – specifically, new sense-and-avoid, weather avoidance and satellite-based communications systems.
The latest phase of work, funded by BAE Systems’ own internal funding to the tune of £400,000, initially involved ground and rig testing, then in late 2016 the research moved into the air with a flight test campaign using the Jetstream. McCue said BAE Systems’ decision to continue with the research came from a desire, “to start to gather some of that body of evidence to give us confidence that the regulations that are emerging are sufficient to prove transparency and equivalence in operation”.
The sense-and-avoid and weather detection technologies are both based on camera systems that record live video imagery. There are seven optical cameras and an infrared camera mounted around the Jetstream.
Algorithms interpret the imagery captured by the cameras and determine whether there’s an incoming hazard, such as an intruding aircraft or a cloud that contains icing conditions. McCue explains: “The algorithms are looking for objects that are approaching, the size of the object [and] the rate of the object. The algorithms are designed to filter out the things that naturally appear in the environment.
“It’s non-cooperative sense-and-avoid. In a cooperative system you would have a transponder and that would transmit and you’d know there was another aircraft there, [but] smaller aircraft don’t necessarily have a transponder, or there may be a system failure or malicious intent, [so] the [sense-and-avoid] system is designed to identify and deal with it.
“On weather avoidance, [the algorithms are] looking at the type, range and the pattern of the cloud, and to determine whether the cloud is likely to present a hazard, such as icing.”
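The detection logic McCue describes – flagging objects that are approaching while filtering out things that naturally appear in the environment – can be sketched roughly in code. This is an illustrative toy, not BAE Systems’ implementation: the `Track` structure, threshold values and the constant-bearing test are all assumptions made for the sketch.

```python
from dataclasses import dataclass

@dataclass
class Track:
    """A candidate object tracked across successive camera frames."""
    bearing_deg: float       # direction to the object from the aircraft
    angular_size_deg: float  # apparent size in the camera frame

def is_threat(prev: Track, curr: Track,
              growth_thresh: float = 1.05,
              bearing_thresh_deg: float = 1.0) -> bool:
    """Illustrative closing-object test: an intruder on a collision
    course keeps a near-constant bearing while its apparent size grows
    in the frame. Thresholds here are invented for the sketch."""
    growing = curr.angular_size_deg > prev.angular_size_deg * growth_thresh
    constant_bearing = abs(curr.bearing_deg - prev.bearing_deg) < bearing_thresh_deg
    return growing and constant_bearing

# An aircraft converging on us: steady bearing, growing in the frame.
print(is_threat(Track(42.0, 0.20), Track(42.1, 0.30)))  # True
# A bird crossing the field of view: bearing changes fast, filtered out.
print(is_threat(Track(42.0, 0.20), Track(50.0, 0.30)))  # False
```

A real system would of course track many candidates over many frames and estimate range and closure rate, but the essence – size growth plus collision geometry – is as McCue describes.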
The infrared camera is being evaluated to test its utility to support emergency landing situations. The system compares the picture captured by the infrared camera with an onboard database to find the safest place to land the aircraft. “What the infrared camera does in that situation is to do a confidence check to ensure there are no people or animals on the ground,” McCue said.
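The principle of that infrared confidence check – people and animals show up as warm pixels against a cooler field – can be sketched simply. The threshold and pixel-count limit below are assumptions for illustration only, not values from the trials.

```python
import numpy as np

def landing_site_clear(ir_frame: np.ndarray,
                       warm_thresh: float = 30.0,
                       max_warm_pixels: int = 0) -> bool:
    """Illustrative confidence check on an infrared frame: reject the
    candidate landing site if any pixel exceeds the (assumed)
    temperature threshold, suggesting a person or animal is present."""
    warm = np.count_nonzero(ir_frame > warm_thresh)
    return warm <= max_warm_pixels

field = np.full((4, 4), 12.0)       # a uniformly cool field, ~12 °C
print(landing_site_clear(field))    # True
field[2, 2] = 36.5                  # a warm body appears in the frame
print(landing_site_clear(field))    # False
```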
The purpose of these camera systems and algorithms is to enable a UAV to recognise hazards automatically and enable it to plot a course autonomously that allows evasive action, just as a human pilot would in a manned aircraft. In other words, replicating the routine of human pilots and achieving those goals of transparency and equivalence to manned operations.
McCue said: “The human trusting the algorithm to make the decision is why the tests we’re doing at the moment – where we’re gathering the evidence, increasing the competence and gradually building from a narrow operation to a wider operation – are really important. That’s part of building the trust in the decision-making, that the human decides what [decisions] the algorithm can make on our behalf.”
While radar systems could also be used to detect weather and/or intruders, McCue said: “The problem with radar is it’s expensive. From a military perspective, it occupies prime sensor real estate. The aim of this system is to give that open access, so for smaller aircraft that couldn’t necessarily afford or fit a system there’s now a system that’s capable. From a military perspective, you’ve got the added benefit of not having to put a radar in prime sensor real estate.”
Another aspect of the latest trials using Jetstream G-BWWW concerns voice communications. McCue explains: “One of the things we did in [ASTRAEA] test flights was to fly through controlled airspace where NATS [the UK’s air navigation services provider] gave command to the aircraft. Obviously, to keep things transparent and equivalent NATS want to keep things exactly as they would with a manned aircraft, and get the response back in the same way and the same time.”
To achieve this, the ASTRAEA programme developed a digital voice to enable a UAV to communicate with NATS and then trialled those technologies in flight using the Jetstream. The voice communications were transmitted between NATS, the ground pilot and the Jetstream using satellite comms links. McCue said: “One of the observations from those flight trials was that the system worked fine, but there was an increased latency. The solution we’ve looked at in this phase of trials is whether we can use secure internet protocol to do that first stage of communications, so at the moment you’ll go from a ground control station to a local earth satellite, up to the satellite. Can we use secure internet protocol to go directly to the satellite? That’s the concept we’re testing at the moment.”
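The routing change McCue describes amounts to removing one relay hop from the voice path. As a rough illustration – with entirely invented per-hop figures, since the trials’ measured latencies are not public – the saving can be sketched as a simple sum:

```python
# Illustrative latency comparison for the two routing concepts.
# All per-hop figures are assumptions for the sketch, not measured values.

LEGACY_HOPS = {
    "ground control station -> local earth station": 40,   # ms, assumed
    "earth station -> satellite": 250,
    "satellite -> aircraft": 250,
}

SECURE_IP_HOPS = {
    "ground control station -> satellite (secure IP)": 260,  # ms, assumed
    "satellite -> aircraft": 250,
}

def round_trip_ms(hops: dict) -> int:
    """Total round-trip delay: a voice exchange with air traffic
    control pays the one-way path in both directions."""
    return 2 * sum(hops.values())

print(round_trip_ms(LEGACY_HOPS))     # 1080
print(round_trip_ms(SECURE_IP_HOPS))  # 1020
```

The point of the concept under test is simply that each relay removed from the chain takes its delay out of every controller–aircraft exchange.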
BAE Systems evaluated all these systems in a series of 17 test flights, which started late in 2016 and continued into the early part of this year. Each flight lasted around 90 minutes, with the aircraft flying through a corridor of non-congested airspace on a route from Warton to Inverness (around 300 miles/482km) and normally flying at 15,000ft (4,572m).
Speaking of Jetstream G-BWWW itself, Martin Rowe-Willcocks, Head of Sales for Future Programmes and Services for BAE Systems Military Air and Information, told AIR International: “It’s a pretty versatile airframe. It’s got a set of benches for computers and workstations and a couple of places where the guys can actually operate the vehicle.”
A pilot and co-pilot were in control for take-off and landing, but once airborne and in controlled airspace the Jetstream flew itself. Two engineers were aboard the Jetstream in the cabin workstations. Together with NATS controllers, they continually assessed the performance of the systems on the testbed. On the ground, a flight test observer and UAV commander monitored the flights via satellite communications.
Rowe-Willcocks said: “At the front of the aeroplane it’s still a normal cockpit. It’s still got a pilot and a co-pilot and they’re there as a check pilot when you actually flick the aircraft into its surrogate UAV mode.”
With the latest flight trials now concluded, BAE Systems engineers will evaluate the findings to determine the scope of further research. AIR International asked McCue about the initial feedback from the latest test flights. She said “some degree of competence and greater evidence” has been gathered, but further work is required.
She continued: “Initial evidence suggests we’ll need to do some further work around the subtleties of cloud detection, for example its ability to deal with different light levels. It has that ability built in, but some minor modifications are needed there.” The test flights also suggest improvements are needed to make digital voice commands “more slick and effective”.
With further work needed into all these technologies, it appears likely research using the Jetstream will continue in the future. Rowe-Willcocks said using G-BWWW is “certainly a concept we like and it’s a concept the Civil Aviation Authority likes because it allows us to explore and experiment but in a safe way”.
The continuance of research using the Jetstream is also boosted by the bigger developmental picture at BAE Systems. The company is looking to mature technologies that could go not just into unmanned platforms, but also manned aircraft. Rowe-Willcocks said: “We’ve got a broad research programme, and this is just one example of what’s going on. We have a basket of technologies that we’re trying to develop across airframes, mission systems, the control environment, the human-machine interface and the level of automation that you’re comfortable to put inside these systems. That forms our overarching research programme. This is just one of the technology threads we’ve got running at the moment.”
Replicating the Human Brain
The underlying focus is about developing intelligence. Rowe-Willcocks said: “When we talk about this particular trial activity we do tend to focus on the sensor head side of it.
“The really clever part of this demo is the signal and data processing that sits behind it. Effectively what you’re trying to do is not just replace the detection side of the human eye, but also the intelligence of the human brain – having seen something, deciding what to do about it.
“In our language, it’s about the level of intelligence automation within the system, so you inform the people on the ground who are in command of the vehicle with the best information you possibly can so that, collectively, the human and the machine respond in a timeframe.
“The unmanned aircraft has to respond as if there was a pilot in the cockpit, looking out the window, spotting something and making a decision against a set of rules and flying norms to either ignore it or avoid it. That’s the process we’re trying to build up and it’s as important as the man on the ground in control of this platform.
“When you move that forward into other environments, intelligent decision-making is a technology translatable into other spaces as well.”