Practical human factors: recognising the human condition

THE HUMAN CONDITION

Whilst the aphorism ‘To err is human’ may well be a truth of the human condition, it is important to recognise that human errors vary in type and likelihood. Since both of these variables are, in principle, predictable, the capacity for human error can also be characterised and managed. For example, the probability of human error is closely related to the complexity of a task, the time available, the usability of equipment, the quality of training and procedures, and the prevailing environmental conditions.
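
As an illustration of how such factors can be treated quantitatively, the sketch below applies a HEART-style multiplicative adjustment, in which a nominal human error probability for a generic task is scaled by error-producing conditions such as time pressure and poor procedures. The nominal probability, the conditions and the weighting values shown are illustrative assumptions for this sketch, not figures drawn from this article or its references.

  # Illustrative HEART-style human error probability (HEP) calculation.
  # The nominal HEP and error-producing condition (EPC) values below are
  # placeholder assumptions for illustration, not calibrated data.

  def assessed_hep(nominal_hep, conditions):
      """Scale a nominal HEP by error-producing conditions (EPCs).

      Each condition is a (max_effect, proportion) pair: max_effect is the
      full multiplier for that EPC, and proportion (0 to 1) is the analyst's
      judgement of how strongly it applies to the task in question.
      """
      hep = nominal_hep
      for max_effect, proportion in conditions:
          hep *= (max_effect - 1.0) * proportion + 1.0
      return min(hep, 1.0)  # a probability cannot exceed 1

  # Example: a routine, proceduralised task (assumed nominal HEP of 0.003)
  # performed under time pressure with a poorly written procedure.
  conditions = [
      (11.0, 0.4),  # shortage of time, judged 40% applicable
      (3.0, 0.6),   # poor or ambiguous procedures, judged 60% applicable
  ]

  print(f"Assessed HEP: {assessed_hep(0.003, conditions):.4f}")

With these assumed values the nominal probability of 0.003 rises to about 0.033, showing how the conditions surrounding a task, rather than the operator alone, drive the likelihood of error.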

As the architecture of safety-related systems changes, so do the demands placed on the human operators who support system safety.

For example, an operator’s role may be primarily passive, monitoring changes in system state and confirming the automatic actuation of systems.

Or perhaps the operator is a ‘man-in-the-loop’ controller, performing actions to control a plant or process, or to initiate protective systems.

In most cases the role of the operator will be to support a number of safety related systems, each with differing demands. Clearly then, system safety can be heavily dependent on human factors such as the quality of the plant interfaces used by the operators, and the clarity and user-friendliness of operating and maintenance procedures.

ACCIDENTS AND OPERATORS

Investigations of accidents across disparate industry sectors, such as Three Mile Island, Chernobyl, Ladbroke Grove, Milford Haven [see Box 1] and, more recently, the Buncefield Oil Storage Depot explosion, add weight to the view that the root cause rarely lies solely with the front-line operator.

In a UK HSE study [Ref 2], 34 accidents and incidents due to control system failures were investigated to identify the primary cause and attribute it to a safety lifecycle phase. Only 15% related to the operations and maintenance phase, while more than 60% of failures were classed as having been built into the safety-related systems before they were taken into service. Hence it seems that designers, safety assessors and managers are only human too!

HUMAN FACTORS

The discipline of human factors (also called ergonomics) concerns itself with answering the following questions:

  • What are people being asked to do (the job and its characteristics)?
  • Who is doing it (the individual and their competence)?
  • Where are they working (the organisation and its attributes)?

These issues are more wide-ranging than those relating specifically to an operator’s role as monitor or controller in the architecture of safety systems, and cover the whole lifecycle. For example, competence clearly applies to those involved in specification, design, manufacture, commissioning, training, operations and maintenance.

THE MAN IN THE MACHINE

In working to prevent human error, it is essential to keep in mind how important people are to safety. They are proactive and can solve problems; they can make systems and facilities more robust and are irreplaceable in many situations. Unlike automatic safety systems, people can learn. With a human as part of the system, the system has the potential to improve.

Active participation of operators in the design, assessment, management and ongoing improvement of safety-related systems should be an essential ingredient in creating safer working conditions.

Human errors are inevitable – but perhaps Prof. James Reason put it best when he said:

“We cannot change the human condition, but we can change the conditions under which people work.”

References

1. Health & Safety Executive (1997), The explosion and fires at the Texaco Refinery, Milford Haven, 24th July 1994.
2. Health & Safety Executive (2003), Out of Control: Why control systems go wrong and how to prevent failure (2nd edition).

This article first appeared in RISKworld Issue 15.