Policies, Processes and Human Mistakes

In early October the US Air Force, operating in Afghanistan, bombed a hospital in Kunduz that was run by Doctors Without Borders (MSF), killing 31 patients and medical personnel.  Since then there have been investigations by both the Associated Press and by the US forces in Afghanistan.  A report was just issued by the US forces, and US Army General John Campbell described the tragedy as “an example of human and process error.”

Briefly, Afghan forces called for help in an attack on an enemy position, which they described.  The US aircrew responded by flying to the location and attacking a target that they identified, visually, from the description the Afghan forces had provided.  As he initiated the attack, the pilot relayed the target's coordinates to the operations center at Bagram Air Force Base.  Unbeknownst to the pilot, the coordinates described an MSF hospital.  US policy demands that such sites not be attacked, but the operators at Bagram were following the action and didn't focus on the fact that the coordinates described a location that was in their database and listed as an MSF hospital — a location that was not to be attacked.  The plane fired 211 shells at the MSF compound over a 25-minute period before a phone call to the US operations center alerted it to the fact that a hospital was being attacked, and word was sent to the pilots to desist.  (The action ceased 5 minutes after word of the problem was received at the operations center.)

Several individuals have been suspended, and new training has been ordered for many of the others involved in this action.

Let's consider this tragedy from several points of view.  There were policies, and they were undoubtedly instantiated in specific “business rules.” Some rules, for example, required that attack coordinates be matched against a database of “non-target” locations before an attack could be launched.  Some rules may have been embedded in computers, while others were in manuals and training materials.  Undoubtedly the pilots were required to call in the coordinates of the target and receive an “approval” before they commenced an attack.  These policies and procedures should have prevented the tragic error that occurred.
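To make the idea of such a “business rule” concrete, here is a minimal sketch, in Python, of the kind of check the approval step implies: match incoming target coordinates against a database of protected locations before any attack is cleared.  Everything here is hypothetical; the NO_STRIKE_SITES table, the coordinates, and the check_target function are invented for illustration, and a real system would rely on classified databases and far more careful geometry.

```python
import math

# Hypothetical "no-strike" database. Each protected site has a center
# coordinate (in degrees) and a protective radius in meters. The entry
# below is illustrative only, not a real location.
NO_STRIKE_SITES = {
    "MSF hospital (illustrative)": {"lat": 36.7280, "lon": 68.8725, "radius_m": 300},
}

def distance_m(lat1, lon1, lat2, lon2):
    """Approximate great-circle distance in meters (haversine formula)."""
    earth_radius_m = 6_371_000
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    d_phi = math.radians(lat2 - lat1)
    d_lam = math.radians(lon2 - lon1)
    a = (math.sin(d_phi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(d_lam / 2) ** 2)
    return 2 * earth_radius_m * math.asin(math.sqrt(a))

def check_target(lat, lon):
    """The business rule: approve a target only if it lies outside every
    protected site's radius. Returns (approved, reason)."""
    for name, site in NO_STRIKE_SITES.items():
        d = distance_m(lat, lon, site["lat"], site["lon"])
        if d <= site["radius_m"]:
            return False, f"DENIED: {int(d)} m from protected site '{name}'"
    return True, "APPROVED: no protected site within range"

if __name__ == "__main__":
    # Coordinates just inside the illustrative protected radius:
    approved, reason = check_target(36.7281, 68.8726)
    print(reason)  # the rule fires and the attack is not approved
```

The point of the sketch is how mechanical the check is: a lookup and a distance calculation that a computer can run in milliseconds.  The weak link, as the rest of this article argues, was not the rule but its execution.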

Unfortunately, in actual practice, well-defined rules are often ignored.  I am always impressed, when there is a labor dispute in the US between commercial pilots and airline management, that airport traffic slows to a crawl when the pilots implement the FAA's rules on times between take-offs and landings.  Once there is a settlement, the airport traffic begins to flow again, as the pilots return to their everyday practice of ignoring the FAA rules and following the orders from the tower — taking off more frequently than the rules allow.

How does a process practitioner come to grips with this?  There are rules, but the “rules” are understood to be impractical by the human participants who “modify” them in practice to make the process work efficiently.  I'm sure something like this went on in Afghanistan.  Pilots are under terrific pressure to provide rapid help to ground forces.  Good pilots have undoubtedly learned to act on their own initiative, before “approval” can be obtained.

As an aside, this certainly emphasizes why you can't separate business rules and processes, broadly conceived, when you try to analyze business problems or develop effective solutions.

Similarly, you can't separate processes and people.  People implement processes.  Processes are how work gets done: real processes describe what people do.  And people largely do things because of the contingencies that follow their actions.  Air Force pilots get strongly reinforced for saving ground forces from enemy attacks.  Pilots learn tricks and shortcuts to make themselves more effective in accomplishing what everyone knows needs to be done.  Officers may recite rules on occasion, but they cheer and reward pilots who get the job done.

So, faced with a battle, and relying on a verbal description of the target provided by the Afghan soldiers on the ground, the US pilots spotted a compound that fit the description and attacked without waiting for confirmation from the operations center.  If the soldiers had been truly threatened by enemy forces in the MSF compound, it would have been the right move, since the operations center was not rushing to check the coordinates and generate an approval.  Instead, apparently, its staff were caught up in the radio traffic about the attack and cheering the pilot on.

If only, one thinks, someone at the operations center had been on the ball, checked the coordinates, and then called off the attack.  The men who were supposed to check are at fault.  And the manager who was responsible for seeing that those men did their job was at fault for not ensuring that they did.

There are now consequences.  People have been suspended, and a couple of careers have probably been ended.  It would have been better if other consequences had been present earlier.  An officer yelling, within seconds of the initiation of the attack, “Have we confirmed the coordinates yet?!” would have made a big difference.

Training is to be redone.  It's probably a waste of time.  If pressed, I bet every soldier in the operations center could tell you that you don't initiate an attack until the coordinates have been cleared.  These men and women don't require additional training designed to teach them what they already know.  Training works by changing knowledge and developing skills, which, in this case, don't need to be changed.  They need to be implemented!  And implementation, once one knows what to do, is largely up to managers and to the consequences they impose on employees.

The key to changing this in the future is, perhaps, to change the pilots' assumptions about what is really acceptable.  It definitely involves changing the consequences placed on operations personnel as regards checking coordinates as soon as they are obtained.  Rules are important — and they were in place.  A good process description is important and it was in place.  Training people to know how to execute the rules and the process is important — and it had probably been done.  Execution made the difference, and execution has to do with managers and the contingencies that managers impose.  In this case the managers and the contingencies were obviously inadequate.  That's what needs to change to prevent this from occurring again.

Process, broadly speaking, includes all of these considerations.  When we look at a failure like the MSF attack, we must consider the policies and rules, the flow of activities, the databases and the flow of information, the people and their training, and the management and the contingencies they impose.  All these things are part of the process involved in organizing and controlling bombing raids.  But knowing which specific element failed in a given case is what makes the difference between eliminating a problem and just instituting busy work.
