Flight 8303: What happens when all of your risk management layers work the same?

Passengers and cabin crew alike were shocked; no one expected to see sparks ripping from the back of both engines of flight 8303. The one-and-a-half-hour domestic flight from Lahore had been normal and weather conditions were fine the whole way. Scorch marks scarred the runway of Karachi’s Jinnah International Airport. Captain Sajjad Gul, the pilot of the stricken aircraft, calmly came on the intercom and announced to the cabin that they would be executing a go-around to complete the landing on a second attempt.

From perfect takeoff to tragic landing

On 22nd May 2020, after months of lockdown due to the Coronavirus pandemic, at a time when hardly any flights were operating, a crash of this magnitude seemed out of the ordinary. The aircraft, an Airbus A320-214, had been through the required checks before departure and was flight-ready. The flight did not complete the go-around and crashed into the tightly packed streets of Model Colony, a residential area, just 24 seconds from the threshold of runway 25L. The plane hit the ground at low speed and a high angle of attack as it stalled out of the sky, producing a large fireball. There were 2 survivors amongst the 99 souls on board and, incredibly, only 1 subsequent death on the ground.

Captain Gul and First Officer Usman Azam were both highly experienced. They would have known the golden safety rule for landings: if your approach is not stabilised (meaning that every aspect of the approach is according to plan and schedule), you do a go-around. This simple rule prevents mistakes being made under time pressure. After all, safety has to be the first priority of any pilot, knowing that gravity is a persistently unforgiving force.

Landing a plane is all about getting the numbers right: altitude, airspeed, rate of descent, flap settings, headings, wind speed, glideslope angles. Large airports make this fairly easy for pilots by providing the Instrument Landing System (ILS), which effectively creates a virtual tunnel through the air, formed by radio waves, for the plane to lock on to and follow. It guides planes smoothly down to the runway threshold, so the pilots just have to focus on speed and landing gear (we accept we are oversimplifying things a little here!).
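To put rough numbers on that virtual tunnel: a standard ILS glideslope sits at about 3 degrees, which fixes the altitude a plane should have at any distance from the threshold. A small illustrative sketch (the 3-degree slope is the common standard, not a figure from the accident report):

```python
import math

def glidepath_altitude_ft(distance_nm: float, glideslope_deg: float = 3.0) -> float:
    """Altitude (feet) above the runway threshold for a plane riding the
    glideslope at a given distance out, in nautical miles."""
    FEET_PER_NM = 6076.12
    return distance_nm * FEET_PER_NM * math.tan(math.radians(glideslope_deg))

# A 3-degree slope works out to roughly 318 ft of height per nautical mile,
# hence the pilots' rule of thumb of "300 ft per mile".
print(round(glidepath_altitude_ft(10)))  # ≈ 3,184 ft at 10 NM out
```

This is why being at 10,000 feet a few minutes from touchdown, as we will see, put Flight 8303 hopelessly above the path the ILS expected.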

The cost of distraction

Subsequent analysis of the Cockpit Voice Recorder (CVR) shows that both pilots had been engaged in a long discussion about the effects of Coronavirus on their families. They were not concentrating on the job, (over)confident that they were flying within their limits. Instead of the Captain and First Officer managing risk by validating each other’s effectiveness in executing their duties, they had lapsed into the same loss of situational awareness. Absorbed by their Coronavirus discussion, they were unaware of the risks that were starting to accumulate.

Four minutes from planned touchdown, their plane was fast and high. Instead of being at the ideal height of 4,000 feet, they were at 10,000, way above the ILS glidepath. Instead of a gentle approach speed of 200 knots, they were at 260. Air Traffic Control contacted them and suggested diverting them into an orbit so they could stabilise their approach. The response from 8303 – “We’ll handle it”.
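The stabilised-approach rule mentioned earlier is, in effect, a simple gate: if any monitored parameter is out of tolerance, the only correct action is a go-around. A minimal sketch of that logic, using the figures from the narrative above (the tolerances are illustrative, not taken from any airline’s operating manual):

```python
def approach_is_stabilised(actual: dict, target: dict, tolerance: dict) -> bool:
    """True only if every monitored parameter is within tolerance of its
    target; a single deviation mandates a go-around."""
    return all(abs(actual[k] - target[k]) <= tolerance[k] for k in target)

# Flight 8303, four minutes from planned touchdown:
actual    = {"altitude_ft": 10_000, "airspeed_kt": 260}
target    = {"altitude_ft": 4_000,  "airspeed_kt": 200}
tolerance = {"altitude_ft": 300,    "airspeed_kt": 10}

print(approach_is_stabilised(actual, target, tolerance))  # False -> go around
```

The point of framing it this way is that the rule leaves no room for judgement in the moment: one parameter out of bounds is enough, precisely so that pilots under pressure don’t have to argue about it.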

The CVR reveals the flight crew went straight back to their conversation, but they did put the plane into a sharp descent and made another, really serious mistake: they commanded the landing gear down while travelling faster than 250 knots. The Airbus has a safety interlock that will not deploy the wheels above that speed, but it seems the pilots never noticed that the gear had stayed up. Responsibility for the landing then passed from Air Traffic Control to the Airport Tower, who manage the runways. They offered a go-around twice more, as they could see the incoming flight was well outside the normal parameters for landing.
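The interlock described above can be pictured as a simple speed gate. The sketch below is purely illustrative (the real A320 logic, handled by its landing gear control units, is far more involved), but it shows why a command given at 260 knots would quietly be refused:

```python
def gear_extends(airspeed_kt: float, max_extension_speed_kt: float = 250.0) -> bool:
    """Illustrative interlock: refuse to extend the landing gear above the
    maximum extension speed. (Sketch only; not the actual Airbus algorithm.)"""
    if airspeed_kt > max_extension_speed_kt:
        return False  # gear stays up; the crew must notice the unsafe-gear warning
    return True       # gear extends normally

print(gear_extends(260))  # False: at 260 knots the command is ignored
print(gear_extends(240))  # True: below the limit, the gear comes down
```

Note the risk-management lesson in miniature: a safety system that silently blocks an unsafe action still depends on the humans noticing that their command had no effect.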

Flight 8303 FDR parameters

A go-around could be considered a bit embarrassing for a pilot, an admission that you made a mistake. It’s at this point that we have to question the prevailing culture in the airline. What did Gul and Azam fear most: the possibility of death, or admitting to their peers and supervisors that they had had a lapse of concentration? A strong risk culture would recognise the professionalism of respecting safety. Tragically, Gul preferred his reputation, and Azam failed to challenge him, as was his duty.

“Winging it” is not always the answer

Within a minute of each other, the crew selected the landing gear down again and then, inexplicably, just 6 miles from the runway, used the same handle to raise the gear back up. We can only speculate that at this moment they were confused about the sequence of actions they had performed and were not functioning as a team, following the required Crew Resource Management rules of execution and cross-checking against a checklist of activities. They were “winging it” – completely ignoring procedures, supremely confident that they were in control.

About 30 seconds from the runway, they finally intersected the ILS glidepath but, thanks to the rapid descent, were travelling at 210 knots rather than 140 – 50% too fast! Warnings about the landing gear were going off in the cockpit, but no one remarked on them. The tower saw that the plane’s landing gear was up but did not warn them. Why?! As the engine cowls inevitably scraped the runway, the Flight Data Recorder (FDR) shows that the pilot engaged reverse thrust to slow the plane down, before suddenly increasing power to go around. Clearly they had no idea that the landing gear was in the wrong position, despite all the warning systems telling them. They had become “risk blind”.

The Airport Tower saw the engines grinding into the runway but never passed that information to the flight crew, who were now trying to gain altitude, unaware that their risk profile had dramatically changed. They believed themselves to be within normal boundaries, but no one alerted them. Even such an extraordinary circumstance did not trigger a reaction from the Tower. Were they risk blind too?

With extensive damage to the engines, the airplane was unable to reach the 3,000 feet of altitude it was instructed to climb to. It made it above 2,000 feet, but by then both engines had stopped producing power and the plane began to drop. The pilots could have taken actions to improve their gliding ability, to get over the housing beneath them, but there is no evidence that they ever had a full and accurate mental model of what was really happening. The final impact, in sight of the runway, became inevitable. At a certain point, risks can no longer be prevented.

The crew did not follow standard callouts and did not observe CRM aspects during most parts of flight.

Preliminary investigation report on the accident of the PIA flight PK8303

Mistakes and planes don’t mix well

Human error has been documented as the primary contributor to more than 80% of commercial airplane hull-loss accidents, as reported by Boeing. However, the term has been overused by the aviation industry to shift blame onto people who can’t fight back. It is a simplified excuse that often fails to examine properly what systems and operating realities created the circumstances in which the mistakes became possible. (As an aside, we strongly recommend the book “The Field Guide to Human Error” by Sidney Dekker).

In the case of flight 8303, human error is not enough to describe Captain Gul and First Officer Azam’s performance. They had a reckless level of over-confidence that cannot be explained as a one-off. According to the preliminary accident report findings, “The crew did not follow standard callouts and did not observe CRM aspects during most parts of flight.” An occasional deviation from procedure during a flight is rarely noteworthy; what can be seen with flight 8303 is that there was an almost complete disregard for professional airmanship.

A missed opportunity to speak out?

It takes time, and a lack of standards enforcement, oversight or regulatory sanction, to create an environment where not one but both flight crew were persistently and casually negligent. This was normal behaviour for them, not a momentary distraction. First Officers are pilots in their own right, and on any flight, responsibilities are divided between control of the aircraft and all other tasks, such as communication with Air Traffic Control and completing logs. Even when the First Officer is not flying the plane, they act as the “Pilot Monitoring”, with the duty to ensure everything is being done properly.

If there’s a culture of subordination, where the pilot signals they are above challenge, then the co-pilot may have felt unable to speak out, rendering his risk monitoring role ineffective.

There’s plenty of evidence to suggest that the operating and organisational culture in Pakistan’s aviation industry is not effectively regulated, and steps like the implementation and regulation of Crew Resource Management would improve performance standards and allow the development of better communication, leadership and decision-making. These issues of cockpit authority are nicely captured in this infographic, which shows what a healthy risk culture looks like in the cockpit.

Fake Pilots are Real News

How could aircrew become so unfit for duty? At least part of this question was answered by Pakistan’s Transport Minister, Ghulam Sarwar Khan, 5 weeks after the crash and the day after the preliminary crash report was released. He told a press conference that more than 30% (262) of the country’s pilots held licences obtained under false pretences. It had become common practice to pay other people to take some or all of the 8 exams required for a flying licence, so many accredited pilots had never proven their ability to fly.

Of those pilots, 25% had paid for others to sit more than half their exams, and a staggering 13% had outsourced all 8 of their tests to other people. Pakistan International Airlines (PIA), which operated Flight 8303, had been sent the names of 141 pilots whose qualifications were suspect. 5 officials at Pakistan’s Civil Aviation Authority (CAA) had been suspended over the fake licence scam, with a criminal inquiry under way.

The Minister said that appointments and promotions within PIA and the CAA had been politicised and were made “outside-of-merit”. Other aviation safety regulators around the world have responded: the US Federal Aviation Administration has banned PIA from flying in its airspace, whilst the European EASA has additionally banned all pilots with Pakistani licences for the next 6 months.

The contagion of corruption

It is difficult to manage a corruption risk that extends from the overseeing regulator, to the employer, to the cockpit. The tell-tale signs were visible, but no-one seemed to take responsibility for owning the risk of air safety.

In a comment piece for the Pakistan Daily Times newspaper, columnist Shahid Khan had this to say:

In Pakistan, hardly anyone appears for the driving test at the motor license issuing office, and these licenses are normally granted for a small fee in addition to the paperwork. Our national culture has a high tolerance for shortcuts, bypassing rules, and neglecting procedures. Many of us associate pride with such practices. The same happened in the case of the flying licenses.

Corruption has a way of distorting risk data by establishing an opaque operating reality that differs from the outward rules. As people, we learn to live by what is seen, not by what is said to be done, which in turn spreads a culture of irresponsibility (we become accountable only for what we say, not what we do). Evidence for this claim was clearly visible in the cockpit of Flight 8303. PIA might not feel responsible for fake pilot licences, but it failed to assert an appropriate safety culture and, consequently, the “International” aspect of its business has largely been taken away.

Whilst some people will have felt uncomfortable with the lack of integrity in this licensing system, it seems that whistle-blower mechanisms did not provide effective risk reduction. It is easy for us to assume that we have such procedures in place, but it is another thing to test and monitor their effectiveness. Clear rules of governance, and repeated communication of their sincere intent, are required for people to believe that the observable risk culture reflects the one actually operating below the surface.

Maintenance after a pandemic

The Covid-19 pandemic has taken people and planes out of the skies, so instead of aircraft being involved in a continuous cycle of operation, fault logging and maintenance, many have been idled in remote storage. The Airbus A320 of Flight 8303 was declared “100% fit for flying” after being grounded for 46 days, but many planes will be heading skywards again with much less attention having been paid to their airworthiness. Small issues that go unnoticed could become consequential, and pilots will have lost some of their feel for their planes or become a bit ‘rusty’ on procedures through lack of practice.

Many of us in our organisations also face new risks that come from having stopped processes, eliminated capacity or simply lost the rehearsal of key skills. A thorough inventory of “re-start risks” needs to be done, recognising that people and equipment are more likely to fail after periods of inactivity.

Auditors – We Need You

Bringing in (external) auditors is an important step in testing whether or not a risk culture is fit for purpose. This is particularly true in safety-first environments but is applicable elsewhere too. Wherever the risk management we say we are doing has given way to an underlying approach that is ad hoc, complacent and allergic to oversight, concerns must be recorded and reported.

Examples of unreasonable overconfidence, disregard for established risk-avoidance procedures without consequences, and an absence of top-down messaging setting out expectations of professionalism are all early-warning indicators of a problematic culture. Challenging these is healthy for high-quality, sustainable business. It’s not only aviation customers that expect it.

People on Flight 8303 died needlessly, but none of us are bound to repeat those same mistakes in our own working environments.

Take a look at our bespoke e-learning services and any other ways we can help you and your business communicate about risks for greater advantage.
