Machine Safety – Why we need to eliminate Blame Culture

Dec 06, 2022

Aug 3, 2017

“Well, if they’re stupid enough to do that then they deserve to get hurt!”

If you’ve ever been involved in machine safety or machine risk assessments then I’m sure you’ve heard someone – it may even have been you – say this or something similar to this. It’s common.

The intention of this article is to address the reasons why this is common, and to attempt to change that culture of thinking. I’ve recently been listening to the audio version of Black Box Thinking by Matthew Syed, and it’s got me thinking. Later in this article, I’m going to throw down the gauntlet with a statement that you might not like very much but I hope gets you thinking too.

Most readers will be aware, either through first-hand knowledge or via professional networks, of accidents in the workplace which have resulted in significant injury or even death. In many cases, the plain and simple truth is that the person put themselves into a situation where, to a detached and objective observer, harm was foreseeable in the cold light of day. You’ll have heard someone say glibly in response, “You just can’t legislate for stupid”, and people will nod sagely in agreement. Why bother spending time and money – and taking a machine out of production – when the cause is so clearly operator error? We’ll address it with refresher training, or perhaps the threat of discipline.

At this point in the piece I’d like to introduce you to Alphonse Chapanis. You’ve almost certainly never heard of him but he is, in fact, a pioneer of aviation safety. He wasn’t an engineer or, to my knowledge, a pilot, so what was his role? Chapanis had a PhD in Psychology.

During World War Two, the American military noticed that there had been a cluster of similar yet unexplained accidents with the Boeing B-17 which had all occurred during landing or taxiing. Engineers, pilots and military supervisors had been unable to find a cause for these crashes but the pattern was undeniable.

In short, Chapanis discovered that in the B-17, the controls for the landing gear and the flaps were not only located next to each other, but were of identical design. In a normal landing, in good weather and visibility, this wasn’t a problem: the pilots were able to discern the correct controls and operate them accordingly. However, when other situational factors complicated the landing, Chapanis suggested, pilots were pulling the wrong control – for example retracting the landing gear when they intended to extend the flaps. The solution Chapanis proffered was to make the physical design of the levers different. The landing gear lever was given a small wheel on the end, and the flap lever was shaped like a flap. This meant the controls felt different in the pilot’s hand even without looking at them.

Once these changes were implemented, the accidents stopped overnight. The same shape-coded levers exist in aircraft flying today, and the study of Human Factors has since become a discipline in its own right.

I’d like you now to imagine how a typical in-company accident investigation would go given similar parameters. The controls were adequately labelled. The Standard Operating Procedures were clear. The operator had been trained adequately and their records were up to date. Put simply, the direct cause of the accident was that the operator used the wrong control.

What would likely happen? I’d say from experience that in the majority of cases, the operator – assuming that they survived the accident – would have taken direct blame. They may or may not have received compensation for their injuries but in some cases they would have been subject to disciplinary procedures too. People have been sacked in cases where fortunately there was no injury.

The design of the machine will almost certainly remain unchanged. “You can’t fix stupid, right?”

Famously, the aviation industry learns from its mistakes and strives continuously to make air travel safer. This is how it should be, and the public would be rightly outraged if it were otherwise.

So why the difference in attitude when it comes to the safety of machinery? I said I was going to challenge you with a bold statement, and here it comes:

The reason for the difference in attitude is that you don’t identify with the operator who just stuck their hand into a moving machine, but you DO identify with the passengers in the back of the plane when the pilot pulls the wrong lever.

Now imagine that your son or daughter comes home from work and tells you that there’s a machine at work which could cause life-changing injuries if they accidentally pressed the wrong button. I bet you’d want that machine changed, wouldn’t you?

Spiers Engineering Safety can assist you in that endeavour.

Moving beyond the limitations of “blame” thinking

These sorts of incidents have clearly been contemplated by the committee of people who contributed to the creation and evolution of the Machinery Directive, currently 2006/42/EC.

Annex 1 of the Directive contains the Essential Health and Safety Requirements that all completed machinery must meet. The first entry sets out the manufacturer’s obligation to carry out a risk assessment, which includes:

By the iterative process of risk assessment and risk reduction referred to above, the manufacturer or his authorised representative shall:

  • Determine the limits of the machinery, which include the intended use and any reasonably foreseeable misuse thereof. [emphasis added] Source: 2006/42/EC Annex 1 (General Principles)

This statement is quite clear. The intention of the authors here is that risks are identified – including reasonably foreseeable misuse – and control measures considered. Annex 1 is also unambiguous about the existence of a hierarchy in terms of control measures. This is:

  • Eliminate or reduce risks as far as possible (inherently safe machinery design and construction)
  • Take the necessary protective measures in relation to risks that cannot be eliminated
  • Inform users of the residual risks due to any shortcomings of the protective measures adopted, indicate whether any particular training is required, and specify any need to provide personal protective equipment.

Source: 2006/42/EC Annex 1, clause 1.1.2(b)

Taking these two sections together, the designer of the machine must assess the risks, contemplate risk reduction methods in the order given, and then reassess to determine if the level of risk has been sufficiently reduced. If not, then further risk reduction methods must be used until the remaining risk level is considered to be low enough. There are slightly differing interpretations of when a risk is low enough. The Directive itself does not define this but the Guide to the Machinery Directive from the European Commission has this to say:
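The iterative loop described above – assess, apply the next measure in the hierarchy, reassess – can be sketched in a few lines of Python. Note that the numeric risk scores, the halving effect of each measure, and the tolerable-risk threshold are invented purely for illustration; the Directive itself defines neither a scoring scheme nor a numerical threshold for “low enough”.

```python
# Illustrative sketch of the iterative risk assessment / risk reduction
# loop described in 2006/42/EC Annex 1. The scoring model and threshold
# below are assumptions for illustration, not part of the Directive.

# Hierarchy of control measures, in the order clause 1.1.2(b) requires
HIERARCHY = [
    "inherently safe design",   # eliminate/reduce risks as far as possible
    "protective measures",      # guards, interlocks, light curtains, etc.
    "information for use",      # residual-risk warnings, training, PPE
]

def assess(hazard, measures_applied):
    """Hypothetical risk score: assume each measure applied halves the risk."""
    return hazard["initial_risk"] / (2 ** len(measures_applied))

def reduce_risk(hazard, tolerable=1.0):
    """Apply measures in hierarchy order until residual risk is tolerable."""
    applied = []
    for measure in HIERARCHY:
        if assess(hazard, applied) <= tolerable:
            break  # risk already low enough; stop adding measures
        applied.append(measure)
    return applied, assess(hazard, applied)

# A hazard whose reasonably foreseeable misuse gives a high initial risk
applied, residual = reduce_risk({"initial_risk": 6.0})
```

The point the sketch makes is structural: design measures are exhausted before protective measures, and warnings or training come last, never first – the opposite of the “retrain the operator” reflex described earlier.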

“…each risk reduction measure envisaged to deal with a particular hazard must be evaluated to see if it is adequate and does not generate new hazards.”


As if that wasn’t enough, the Directive goes even further:

When designing and constructing machinery, and when drafting the instructions, the manufacturer or his authorised representative must envisage not only the intended use of the machinery but also any reasonably foreseeable misuse thereof.

The machinery must be designed and constructed in such a way as to prevent abnormal use if such use would engender a risk. Where appropriate, the instructions must draw the user’s attention to ways – which experience has shown might occur – in which the machinery should not be used. [emphasis added]

Source: 2006/42/EC Annex 1, clause 1.1.2(c)

The Directive – and any number of the standards harmonised to it – contain many more references to reasonably foreseeable misuse, but for the purposes of brevity they are not mentioned here.

But this only applies to machine manufacturers, surely?

It’s true that the Machinery Directive only applies to machines put into service for the first time within the EEA. Whether you bought the machine from a manufacturer or made it yourself is immaterial. Users of work equipment in the UK are governed by the Provision and Use of Work Equipment Regulations (PUWER). If you are a user of machinery then your obligation to carry out a risk assessment comes from Regulation 3 of the Management of Health and Safety at Work Regulations.

Regulation 11 concerns Dangerous Parts of Machinery. Measures taken under Regulation 11 might include fixed guards, light curtains or interlocked movable guards. In terms of misuse, one of the requirements of Regulation 11 is that whatever measures are taken to prevent access to dangerous parts, they must not be “easily bypassed or disabled”. The Approved Code of Practice (ACOP) for PUWER defines a dangerous part as follows:

“… if a piece of work equipment could cause injury, while being used in a foreseeable way, it can be considered a dangerous part.”

Regulation 18 (Control Systems) is also relevant here. The PUWER ACOP refers the user to the harmonised standard EN ISO 13849-1 for the robust design of control systems. This standard also deals with misuse, defining it thus:

reasonably foreseeable misuse

use of a machine in a way not intended by the designer, but which may result from readily predictable human behaviour

Source: BS EN ISO 13849-1:2015 Clause 3.1.19

The concept of foreseeable misuse, as we have already discussed, could involve bypassing or defeating safeguards, or it could simply be operator interaction with the machine in a way not anticipated or prevented by the manufacturer. This might result in a dangerous part not being adequately guarded or even not being guarded at all, as was the case in the following recent incident.

A bread manufacturer was fined £1.9m (plus costs) for breaching Regulation 11 of PUWER after a worker trapped their arm while cleaning a part of the line. The worker suffered friction burns which required skin grafts. Amongst the key findings:

HSE inspectors found the machine could have been fitted with localised guarding to prevent access between the conveyors.


The obligations on machine manufacturers and end users could not be clearer. There is a legal duty to consider the “stupid user” via reasonably foreseeable misuse. The author would also argue that even if a particular kind of misuse was not anticipated at the machine’s conception, if experience shows that it may recur, then the machine should be modified to prevent such misuse.

Spiers Engineering Safety has many years of collective experience of the sort of misuse that occurs in the workplace over various different sectors. We can assist you with your design risk assessment process and suggest appropriate risk reduction measures.

Your goal should be that your machines are safe enough that you would be happy for your family to work on them.
