In 2013, the Federal Aviation Administration (FAA) issued a safety alert encouraging airlines to promote manual flight operations when appropriate. The alert was motivated by an analysis of flight operations data (including normal flight operations, incidents and accidents) that indicated an increase in manual handling errors. The FAA alert specifies that “airline operational policies should ensure that all pilots have the appropriate opportunities to exercise the … knowledge and skills in flight operations.”

On April 17, 2018, Southwest Flight 1380 experienced an engine failure about 20 minutes after takeoff from New York’s LaGuardia Airport, when one of the left engine’s fan blades broke off and the engine exploded. With 144 passengers and five crew members on board, the flight became a scene of chaos and impending calamity. Captain Tammie Jo Shults, a veteran Navy pilot, flew on with one engine. Calm and collected throughout the ordeal, she was well trained to handle stress in the cockpit and managed, with her crew, to land safely in Philadelphia. Tragically, one passenger did not survive.

Modern aircraft have reliable autopilot systems, and automation provides enormous data-processing capability. But the convenience of automation software in the cockpit masks the complexity of the interactions among software components. That complexity can compound the confusion pilots experience when they face unexpected, unfamiliar events, sometimes resulting in loss of control of their aircraft. Stressful situations cause us to rely less on our intellectual knowledge and more on our automatic responses. A hidden cost of automation in the cockpit may well be a slow erosion of pilots’ cognitive abilities as they continue to fly on autopilot without problems. Artificial intelligence has not yet given us sentient automated systems that can anticipate every potential interaction and pathway leading to disaster. No algorithm or software code could have done what Captain Shults did to save Flight 1380.

In the knowledge and big-data economy, many of us rely on the equivalent of flight computer software. We use software to execute processes and harvest data. But software is only as good as its user’s ability to operate it properly and recognize signs of anomalous behavior. We pile up projections and, at times, infer illusory insights, leading to faulty decisions that can destroy value.

Like airlines, we must invest in maintaining and improving the knowledge underlying the tools we use, and sharpen the skills required to perform mission-critical functions when technology suddenly withdraws its services. To prepare ourselves to “fly manual,” we need to do at least two things:

Conquer Complexity Where We Can.

When we look at most products, processes, systems and structures, complexity appears to be a defining characteristic. Unfortunately, human beings can cope with only so much complexity, and complexity both breeds and obscures fragility. Features such as branching and feedback loops can make a failure anywhere a cause of failure everywhere.

We need to stick with simple designs and resist the rush to justify complexity in the name of more functionality and higher efficiency. Decoupling our interconnected processes and units can significantly reduce the likelihood of widespread system failures.

Go Back to Nature and Relearn the Basics.

Nature forces us to make hard choices at times, and the laws of physics will always be there to check our work. Investing in people to learn how to use technology is well and good, but apps and gadgets come and go and sometimes outlive their usefulness. It is important that we give people opportunities for regular, hands-on engagement with the products and systems they oversee and the customers they serve. Learning and development programs must also build the critical thinking and reasoning skills needed to separate the wheat from the chaff.

To survive and thrive in today’s technologically advanced world, we must be prepared to “fly manual” when the situation demands it, to cruise above the miasma of complexity and forestall fateful flaws.