Managing resources

CREW RESOURCE MANAGEMENT

Dr Simon Bennett reviews the origins and purpose of crew resource management

img_76-1_91.jpg
Jose Bejarano/AirTeamImages

Crew resource management (CRM) is one of the reasons why flying is so safe. It specifies the skills, attitudes and behaviours required of pilots, cabin crew, dispatchers, engineers, handlers and controllers to ensure trouble-free flight operations.

These include: leadership; professionalism; awareness; curiosity; inquiry; flexibility; adaptability; collegiality; respectfulness; empathy; orderliness; alacrity; modesty; and a capacity for reflection. These skills are required of all, from novice dispatchers and cabin crew to time-served training captains and hard-bitten fleet managers. In a 1989 report for the Federal Aviation Administration (FAA) titled Aeronautical Decision-Making – Cockpit Resource Management, Dr Richard Jensen of the Ohio State University’s Aviation Psychology Laboratory argued that the origins of CRM lay in ten major disasters, specifically:

Eastern Air Lines, L-1011 Tristar, Miami, 1972: Preoccupied with a malfunctioning nose-gear indicator light, the crew of three failed to notice that the autopilot had disconnected and that the aircraft was descending into the Everglades. Death toll: 101

TWA, Boeing 727, Washington Dulles, 1974: Ambiguous exchanges between the crew and controllers led to the aircraft being flown into Mount Weather. In its 1975 accident report, the National Transportation Safety Board (NTSB) noted “The failure of the FAA to take timely action to resolve the confusion and misinterpretation of air traffic terminology, although the Agency had been aware of the problem for several years”. Death toll: 92

Pan American and KLM, both Boeing 747s, Tenerife, 1977: Ambiguous, non-standard exchanges between the KLM crew and the control tower, in the context of operational and commercial pressures, led to the KLM 747 commencing its take-off roll without clearance. Death toll: 583

United Airlines, DC-8 Jet Trader, Salt Lake City, 1977: Ambiguous exchanges between the crew and an air traffic controller, in the context of distracting on-board electrical failures (several landing-gear lights were inoperative), led to the aircraft being flown into a mountain in the Wasatch Range. Death toll: 3

United Airlines, DC-8, Portland, 1978: The flightcrew’s failure to monitor fuel-state during a landing-gear emergency led to fuel exhaustion. The National Transportation Safety Board’s accident report noted: “The Safety Board believes that this accident exemplifies a recurring problem – a breakdown in cockpit management and teamwork during a situation involving malfunctions of aircraft systems in flight. To combat this problem, responsibilities must be divided among members of the flight-crew while a malfunction is being resolved” (page 26 of the report). Recommendation X-79-17 requested that the FAA issue an operations bulletin directing its flight operations inspectors to: “[U]rge … operators to ensure that … flight-crews are indoctrinated in principles of flight-deck resource management, with particular emphasis on the merits of participative management for captains and assertiveness training for other cockpit crewmembers” (page 30 of the report). Death toll: 10

img_78-1_96.jpg
Alexander Mishin/AirTeamImages

National Airlines, Boeing 727, Escambia Bay, 1978: Lax flight-deck discipline, overconfidence, complacency, poor crew co-ordination and poor intra-crew communication, in the context of an air traffic control-induced rushed approach and elevated work-rate, resulted in the aircraft being flown into the bay. Death toll: 3

Western Airlines, DC-10, Mexico City, 1979: A misjudged go-around, in the context of non-standard air traffic control advisories concerning a sidestep manoeuvre and inadequate published documentation for that manoeuvre, resulted in the aircraft crashing on the airfield. Death toll: 72 (aircraft); 1 (ground)

Dan-Air, Boeing 727, Tenerife, 1980: Pilot error, in the context of poor-quality air-traffic control (including the issuing of an instruction whose meaning was open to interpretation) and poor-quality approach planning by the competent authority, led to the aircraft being flown into a mountain. Death toll: 146

Saudia, L-1011 Tristar, Riyadh, 1980: A cargo-hold fire required an emergency landing. Despite the presence of smoke and fumes in the cabin, the crew failed to initiate an evacuation. The aircraft was taxied to the end of the runway and parked. A flash fire destroyed the aircraft. Elapsed time between touchdown and flashover was around 31 minutes. Poor teamwork may have played a part in the mishandling of the emergency. In a 2004 article in the Canadian Journal of Anaesthesia, aviation psychologist Professor Robert Helmreich and psychologist Dr Jan Davies observed: “One component of national culture is ‘power distance’ or PD. In high-PD cultures, juniors do not question superiors. Leaders may be autocratic”. Death toll: 301

Air Florida, Boeing 737, Washington National, 1982: Poor flight-deck CRM, in the context of false indications from the engine instrumentation and controller pressure to get flights back on schedule, led to the aircraft rotating with ice on the wings, climbing a few hundred feet, then falling into the frozen Potomac River. The NTSB’s accident report criticised the captain’s response to the first officer’s comments about the aircraft’s sluggish acceleration. Death toll: 74 (aircraft); 4 (on the Potomac’s 14th Street Bridge).

Versioning the birth of CRM

According to the Wikipedia entry for the 1978 United Airlines Portland accident, NTSB recommendation X-79-17 “… was the genesis for major changes in the way airline crewmembers were trained. This new type of training addressed behavioural management challenges such as poor crew coordination, loss of situational awareness and judgement errors frequently observed in aviation accidents. It is credited with launching … Crew Resource Management”. The Portland accident is described as “one of the most important in aviation history”. According to Wikipedia, “United Airlines instituted the industry’s first crew resource management (CRM) for pilots in 1981”. In a paper published in the Journal of Aviation Technology and Engineering, Frank Wagener and David Ison, referring to the Portland accident, claim that “CRM was born from this catastrophe”.

Not everyone agrees that the 1978 United Airlines Portland accident was the catalyst for CRM. In the book Crew Resource Management (Academic Press, 2010), H. Clayton Foushee, Director of the Office of Audit and Evaluation at the Federal Aviation Administration, and the late Professor Robert Helmreich, faculty member, University of Texas Human Factors Research project and recipient of the Flight Safety Foundation’s Lifetime Achievement Award, assert that CRM emerged well before the issuance of NTSB recommendation X-79-17: “Formal training in human factors aspects of crew operations was beginning to take root by the 1970s. For example, the late Frank Hawkins … had initiated a human factors training program at KLM, Royal Dutch Airlines, based on Edwards’ … SHEL model and trans-cockpit authority gradient”. In his 1995 book Pilot Judgement and Crew Resource Management, Dr Richard Jensen writes: “The [March 27, 1977] Pan American/KLM accident was the prime motivator behind KLM’s development of the first (and still, the industry’s most extensive) CRM course in 1979”.

It is undoubtedly the case that the Dutch flag-carrier KLM played a key role in the development of CRM training for flight-crew. Prior to the 1977 Tenerife disaster, KLM had worked with John Costley of the United Kingdom’s Air Transport and Travel Industrial Training Board to develop an interpersonal skills training course for check-in staff and pursers. Following Tenerife, KLM began thinking about human-factors training for pilots. John Costley made a record of KLM’s overtures in his work diary: “[July 5, 1977] [C]ockpit personnel seem to be coming into the picture, although it seems to be primarily on captaincy”. In August 1977, KLM’s Flight Operations and Training Department began working on a human factors training programme for captains. John Costley noted: “[August 2, 1977] Flight Ops meeting at top level in KLM to talk about captaincy training. Project … to start with wide investigation into what captain does from a management point of view. High prestige project with Jan de Jong involved throughout”.

Having joined KLM in 1968 as a marketing specialist, Jan de Jong was tasked to, in his own words, “improve the personnel services within KLM through training and consultancy”. To that end he worked with John Costley, now managing director of Interaction Trainers Ltd. (ITL), a consultancy Costley launched in 1977, KLM’s Alwin Maan Voogd Bergwerf, the KLM pilot group and the Vereniging Nederlandse Verkeersvliegers (Dutch Pilot Association) to develop KLM’s innovative Captaincy Development Programme (CDP), a human-factors programme aimed at improving flight-deck leadership and management skills.

According to Alwin Maan Voogd Bergwerf, the skills, attitudes and behaviours covered in the Captaincy Development Programme included communication skills, decision-making skills, delegating skills and reporting skills. Pedagogy included small-group exercises in communication skills. The exercises were video-taped, then analysed by participants. The first courses were delivered over a week. In time, the catchment was expanded to include First Officers and Flight Engineers, the name changing from Captaincy Development Programme to Crew Management Training (CMT). As Stephen Walsh of Interaction Trainers notes, KLM’s flight engineers lobbied hard to be included in the new training programme: “The flight engineers at KLM kicked up a massive fuss. They said, ‘Hang on a second. It was our engineer whose questions were rebuffed and dismissed by van Zanten, the [Tenerife] captain, so if the captains are having it, we are having it’. Apparently, it was that much of a fuss they were threatening to go on strike … Once the engineers kicked up this fuss, KLM turned around and said ‘OK. Everybody is having it’”.

Clients for KLM’s innovative human-factors training included Kuwait Airways, South African Airways, Martinair, Transavia, Ansett, Qantas, SAS and the Dutch government (which sent its flight operations inspectors to be trained in human-factors). The airline sent John Costley of ITL to NASA’s watershed 1979 human-factors workshop, Resource Management on the Flight-Deck, held in San Francisco. The Proceedings of the workshop (NASA CP-2120) noted that: “The fostering of increased awareness and use of available resources by aircrews under high workload conditions is becoming a matter of greater concern to the airlines. New research and training programs to enhance aircrew capabilities are being developed, largely independently, by the various organisations involved”. The Proceedings noted “a need to develop more effective resource-management training for flight-deck crewmembers”.

img_79-1_70.jpg

The teaching of CRM

In their Journal of Aviation Technology and Engineering paper, Frank Wagener and David Ison observed that “training in CRM requires communicating basic knowledge of human factors concepts that relate to aviation, and providing the tools necessary to apply these concepts operationally”.

Crew resource management training does two things. First, it describes the attitudes and behaviours appropriate for those involved in operating aircraft. Secondly, through the medium of initial and recurrent training, it teaches the skills required to operate aircraft safely and efficiently. Specifically, it teaches personnel about leadership, planning, organising, teamwork and delegation – the non-technical (no-tech) or ‘soft’ skills essential for trouble-free aviation. CRM helps operational personnel – pilots, cabin crew, dispatchers, engineers, handlers, fuellers and controllers – organise for success. In his 2005 book Building Safe Systems in Aviation – A CRM Developer’s Handbook, Norman MacLeod noted how the industry was able to embed CRM into its operational DNA “by placing soft skills on an equal footing with the hard stick-and-rudder skills”.

CRM draws on the work of pioneering British social scientists Eric Trist and Fred Emery, and Ken Bamforth, an ex-miner. Specifically, it draws on Trist and Bamforth’s conception of the workplace as a sociotechnical system (STS) and of employees as social beings whose performance can be improved through active engagement in the organisation of work. In their 1940s study of Britain’s coal mines, Trist and Bamforth looked at the impact of mechanisation on morale, productivity and industrial relations. They noted how miners’ loss of control of the work process (mechanisation eliminated the autonomous, multi-role work-team) lowered morale, reduced job satisfaction and led to in-fighting and worker resistance.

To the extent that it discourages a controlling style of management and promotes a facilitative or enabling style of management, CRM also draws on high-reliability organisation (HRO) theory and on the theoretical underpinnings of the quality-of-working-life (QWL) movement. Advocates of QWL argue that encouraging employees to contribute ideas and participate in problem-solving improves productivity and product quality. HRO theory argues that operational performance and reliability can be improved through consultation, delegation and respect for expertise regardless of origin. A founding principle of HRO (and therefore of CRM) is that, potentially, insight, ideas and skills exist at every level, from management to shop-floor.

Finally, CRM draws on psychologist Elwyn Edwards’s systems-theory-informed conception of aviation, specifically his SHEL model (elaborated in 1975 by Frank Hawkins), which he launched at a British Airline Pilots Association (BALPA) technical symposium in 1972. Edwards’s premise was that anyone involved in providing air service requires both craft skills (hard skills) and social skills (soft skills). Thus pilots require both psychomotor skills (the capacity to fly an aircraft safely and efficiently under all conditions) and human-factors skills, such as a facility for collaborative labour (teamwork) and a capacity for, and predisposition towards, empathy. In Edwards’s conception, the aviation system – the means of providing safe and efficient air service – consisted of four elements:

Software: rules, regulations, laws, notices to airmen (NOTAMS), standard operating procedures (SOPs), norms, values, etc.

Hardware: the physical infrastructure (air traffic control, airports, etc.) together with aircraft, ground-starter units, vehicles, tools, etc.

Environment: the workspace (flight-deck, cabin, ramp, crew-room, etc.) together with the legal, social, economic and political context within which the industry operates

Liveware: the human element (pilots, cabin crew, dispatchers, engineers, fuellers, handlers, controllers, check-in staff, office staff, airline managers, aviation medical examiners and flight operations inspectors).

Crucially, Edwards argued that safety was an emergent property of the relationships between the four elements that make up the aviation system (SHEL), and that inter-element friction could impact safety and efficiency. As William Rankin and Bill Johnson explain in the 2018 publication Handbook of Human Factors in Air Transportation Systems: “A mismatch at any of these interfaces can be a source of human error or system vulnerability”.

Edwards’s SHEL model suggested that safety was an emergent property of a wide variety of working relationships, including: captain – first officer; flight-deck – air traffic control; flight-deck – engineering; flight-deck – cabin; flight crew – aviation medical examiner; flight crew – trades unions; flight crew – management; airline – regulator; regulator – government.

Today, the SHEL model provides the airline industry with two things. First, a way of conceiving the human-factors aspects of air service provision. Secondly, a practical, supportive framework for the day-to-day management of the aviation system and for the development of aviation policy. It features in regulators’, airlines’ and consultants’ literature and training courses. For example, here is the Single European Sky ATM Research unit’s (SESAR’s) description of SHEL: “[A] conceptual model of human factors that clarifies the scope of aviation human factors and assists in understanding the human factor relationships between aviation system resources/environment (the flying subsystem) and the human component in the aviation system (the human subsystem). The … model adopts a systems perspective that suggests the human is rarely, if ever, the sole cause of an accident. The systems perspective considers a variety of contextual and task-related factors that interact with the human operator within the aviation system to affect operator performance … The human element or worker of interest is at the centre or hub of the … model”.

img_80-1_88.jpg
©Roman Becker/AirTeamImages
img_81-1_70.jpg
©Mehrad Watson/AirTeamImages

CRM’s ups and downs

Despite the consensus view that – following the spectacular disasters of the 1970s – there was an urgent need to improve flight-deck team working, the industry’s human-factors innovators did not have a clear run. There was resistance. As Suzanne Gordon, Patrick Mendenhall and Bonnie Blair O’Connor explain in their 2013 book Beyond the Checklist: “CRM did not succeed because in the 1980s pilots … threw up their hands and said: ‘We give up’. A great many pilots, in fact, dismissed CRM as ‘charm school’ or even a ‘Communist plot’ to erode their authority …. [O]ne of the primary fears that pilots had was that CRM would devalue their expertise and cripple their ability to command. This kind of concern is common wherever there are steep hierarchies and status differentials and where those at the top of the hierarchy generally shoulder the greatest legal liability”. Frank Wagener and David Ison concur: “CRM was not universally accepted by the pilot community. It was sometimes decried as charm school, psychobabble, and attempted brainwashing by management”.

Despite the resistance, CRM gained acceptability and scored some notable successes. The most spectacular of these was the landing of a crippled DC-10 at Sioux City airport, Iowa, on 19 July, 1989, by Captain Al Haynes and his crew. An uncontained engine failure at 37,000 feet had disabled the aircraft’s flight controls. By varying the thrust to the number one engine and number three engine the crew was able to gain a semblance of control over the aircraft. Despite a crash-landing during which the aircraft broke apart, 185 out of 296 persons on board survived. Haynes credited the successful outcome to CRM. In his own words: “The preparation that paid off for the crew was something that United started in 1980 called Cockpit Resource Management …. Up until 1980, we kind of worked on the concept that the captain was THE authority on the aircraft …. And we lost a few aeroplanes because of that …. We had 103 years of flying experience there in the [DC-10’s] cockpit, trying to get that airplane on the ground, not one minute of which we had actually practiced, any one of us. So why would I know more about getting that airplane on the ground … than the other three? So if … we had not let everybody put their input in, it’s a cinch we wouldn’t have made it”. By canvassing crewmembers’ opinion, trying things out and delegating tasks, Haynes was able to ‘work the problem’. In Beyond the Checklist, Gordon, Mendenhall and O’Connor describe how, despite the psychological pressure and physical stresses created by the emergency, the DC-10’s crew became “hyper-organised”: “[The crew’s] conversations contained a series of intense interactions – averaging thirty-one communications per minute …. During these conversations, junior crewmembers freely suggested alternatives and the captain responded by welcoming that input. Bursts of social conversation provided emotional relief and support, enabling the crew to cope with what must have been extreme stress and to save the lives of 185 … people”.

Endorsements such as that provided by Al Haynes have helped embed CRM in airline practice. CRM training is in its sixth iteration. In addition to focusing on the person-machine interface it addresses team formation and maintenance, workload management, delegation, communication skills and maintenance of situation-awareness (SA). It is supported by other human-factors initiatives such as the line operations safety audit (LOSA), a proactive safety-management tool developed by the University of Texas Human Factors Research Project that records threats, errors and corrective or mitigative actions in-flight in real time. CRM and LOSA are complementary. Actioned concurrently they create safety synergisms. The International Civil Aviation Organisation vigorously promotes CRM and LOSA. Perhaps the greatest testament to CRM is its adoption by other high-risk industries and domains, such as maritime transport, rail transport and healthcare.

Conclusion and acknowledgements

Crew resource management is a safety tool with demonstrable benefits: one hundred and eighty-five of the people on Al Haynes’s crippled DC-10 owe their lives to it. In concert with other human-factors safety tools such as LOSA and line-oriented flight training (LOFT), it contributes to aviation’s improving safety record. It is also helping to save lives and reduce injuries in other fields, including healthcare. Contrary to received wisdom, CRM was pioneered in Europe, although the United States did much to develop, popularise and propagate it. It is the author’s opinion that, for clarity, crew resource management should be known as crew and resource management – CARM. It is important to foreground management. Safe and efficient flight requires the skilled, in-situ management of crewmembers and hardware. Finally, the author would like to thank Mr Philip Costley, son of CRM pioneer John Costley, for making his father’s diaries and early training material available, and Mr Stephen Walsh of Interaction Trainers for talking about the evolution of CRM.