Simply make no mistakes

Guest contribution by Patrick Schoenfeld | 20.06.2022

Much has been said about error culture in recent years. There was, for example, the paean to a “good error culture”, along with the bitter dispute about what that even means. Or the discussion about good and bad mistakes, and the distinction between avoidable mistakes and unavoidable errors.

Unfortunately, many of these discussions seemed rather academic. In the end, everyone agreed that avoiding mistakes would be best for all involved. After all, the most successful companies are those that deliver the best results and have the most satisfied customers – mistakes only get in the way.

Nevertheless, mistakes do still happen, and it would be good to at least make the best of that. At some point, someone will make the wrong call, make a mistake, or, despite the best of intentions, do the wrong thing. We can stand on our heads, jump in circles or drum on the floor: that won’t change anything.

But we are not at the mercy of our mistakes, because we have a number of ways to deal with them better. Let’s take a look at these in detail:

Lowering the fall height

An important insight in dealing with mistakes is that it is much more their effects than the mistakes themselves that cause us headaches: whether a mistake leads to a customer turning their back on us, to lost sales or a damaged reputation or, in the worst case, to people coming to harm. So if we cannot always prevent mistakes from happening, we could and should at least reduce their impact.

A good example of this is the fuse in a circuit, which serves a simple purpose: if we overload the circuit, the fuse prevents our house from burning down. The residual-current device (RCD), on the other hand, ensures that the bathtub does not become a death trap if we accidentally drop a hairdryer or radio into it. And our cars have assistance systems that intervene if we have missed our chance to brake.

What these mechanisms have in common is that they acknowledge a certain inevitability of error – an irreducible margin for mistakes – and ensure that a single mistake does not become a catastrophe.
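
Software has direct counterparts to the fuse. A well-known one is the circuit breaker pattern: stop calling a failing dependency for a while instead of letting one fault cascade through the whole system. The following is a minimal, library-free sketch of the idea; the class name and thresholds are invented for illustration:

```python
import time

class CircuitBreaker:
    """Toy circuit breaker: after repeated failures, calls fail fast
    for a cool-down period instead of hammering a broken dependency."""

    def __init__(self, max_failures: int = 3, cooldown_seconds: float = 30.0):
        self.max_failures = max_failures
        self.cooldown_seconds = cooldown_seconds
        self.failures = 0
        self.opened_at = None  # time the breaker last "tripped"

    def call(self, func, *args, **kwargs):
        if self.opened_at is not None:
            if time.monotonic() - self.opened_at < self.cooldown_seconds:
                # One fault must not cascade: refuse immediately.
                raise RuntimeError("circuit open - failing fast")
            self.opened_at = None  # cool-down over, allow a retry
            self.failures = 0
        try:
            result = func(*args, **kwargs)
        except Exception:
            self.failures += 1
            if self.failures >= self.max_failures:
                self.opened_at = time.monotonic()  # trip the breaker
            raise
        self.failures = 0  # success resets the failure count
        return result
```

A caller would wrap a flaky service call, e.g. breaker.call(fetch_prices): after three consecutive failures the breaker refuses further calls for thirty seconds, so a single broken dependency cannot drag the rest of the system down with it.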

The underlying principle is simple but effective: instead of hoping that the “error gods” will be kind to us or that everyone will do everything right, we try to reduce or eliminate pitfalls. This can mean preventing the worst-case outcome or making activities safer overall.

In terms of prevention, this is one of the sharpest swords at our disposal. Beyond averting disaster, there are usually plenty of ways to make an activity safer: from checklists and peer reviews to signal lights and tool support, all the way to automation that relieves people of particularly error-prone sub-processes.

In general, it helps to relieve people of tasks that are repetitive, demand a high level of sustained attention, or whose success depends heavily on flawlessly recalling things from memory.
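
A minimal sketch of what such relief can look like in software, assuming a Python code base: instead of relying on someone to remember every pre-release check, a small script runs the checks and fails loudly. The file names and checks here are invented for illustration:

```python
#!/usr/bin/env python3
"""Hypothetical pre-release checklist, automated.

Instead of relying on someone to remember each step, the script
fails loudly if a check does not pass. The file names and checks
are invented for illustration."""
import json
import pathlib
import sys

def version_is_in_changelog() -> bool:
    # The changelog must mention the version from package.json.
    version = json.loads(pathlib.Path("package.json").read_text())["version"]
    return version in pathlib.Path("CHANGELOG.md").read_text()

def debug_flag_is_off() -> bool:
    # A debug switch left on in the shipped config is a classic slip.
    config = json.loads(pathlib.Path("config.json").read_text())
    return config.get("debug", False) is False

def main() -> int:
    checks = {
        "version mentioned in changelog": version_is_in_changelog,
        "debug flag disabled": debug_flag_is_off,
    }
    failed = [name for name, check in checks.items() if not check()]
    for name in failed:
        print(f"FAILED: {name}", file=sys.stderr)
    return 1 if failed else 0

if __name__ == "__main__":
    sys.exit(main())
```

Wired into a build pipeline, a script like this turns “please remember to…” into something a tired human can no longer forget.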

However, it is not always a matter of standard processes. Sometimes you have to dare to do something new in order to be successful as a company in the long term – for example, developing a new product without knowing in advance whether it will succeed. That is where the unavoidable errors mentioned at the beginning come into play. I may simply have been wrong in my assessment of whether an envisaged product can succeed on the market. This form of failure is easier to bear because one could not have known better. In the end, however, it is no less annoying, and no less a threat to success, than a mistake made against one’s better judgement.

The good thing is that the approach of lowering the fall height helps here too. If, for example, I decide to develop a product iteratively – keeping the functional scope small and gathering feedback from relevant target groups regularly and at short intervals – then I am doing exactly that: I reduce the probability that a misjudgement will hit my company with the greatest possible impact.

In any case, it is better than having to shut the company down after a year because you consistently followed a plan, only to realise that nobody wants the result.

Detecting deviations as early and reliably as possible

Another way to reduce the fall height is to detect errors and deviations as early and reliably as possible. The earlier deviations from the expected are detected, the earlier we can react to them and prevent the worst. Most of the time, this also reduces the effort required to fix the problem.

In software development, for example, we have mechanisms – compilers, type checkers, linters, automated tests – that can alert us to (some) errors during development itself. This way we can fix them before a colleague spends time on the faulty code. Or let’s take another look at the car: if we are tired and would be better off stopping at the next rest area, fatigue detection can do us a good service.
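
As a small illustration of such an early-warning mechanism: a unit test pins down expected behaviour, so a regression shows up on the developer’s machine or in the build pipeline rather than on a colleague’s desk or at the customer. The function and its rules are invented for this sketch:

```python
import unittest

def apply_discount(price: float, percent: float) -> float:
    """Return the price after deducting `percent` percent."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class ApplyDiscountTest(unittest.TestCase):
    def test_regular_discount(self):
        self.assertEqual(apply_discount(200.0, 25), 150.0)

    def test_zero_discount_changes_nothing(self):
        self.assertEqual(apply_discount(99.99, 0), 99.99)

    def test_invalid_percent_is_rejected(self):
        # Fails loudly if someone later removes the guard clause.
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)

if __name__ == "__main__":
    unittest.main()
```

If a later change quietly breaks the rounding or removes the guard, the test flags it within seconds – long before review, release or customer contact.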

However, the example of fatigue detection also shows the limits of technical means, because its detection rate is rather low. Ultimately, the early detection of errors also requires people, with their intuition, wealth of ideas and ability to recognise patterns – an ability that, unlike technical aids, does not depend on measured values and prior programming.

To make optimal use of this ability, what is needed above all are the right conditions.

The aforementioned technical tools are a start, but – and this brings us back to error culture – it is also about how “bad news” is handled in the company. Simply put, it is about the climate: does the bearer of bad news have to fear “consequences”, or is there an environment in which bad news is actually put to use?

This is important because research in occupational safety shows that people are quite good at identifying sources of error and countermeasures. At the same time, there are incentives for individuals to keep their concerns, ideas or suggestions for improvement to themselves.

As Amy Edmondson writes in her book “The Fearless Organization”¹, we may stay quiet in situations where we would be better off speaking up. This has to do with our desire to maintain a certain image of ourselves: we want to be perceived as clever, helpful and competent rather than as disruptive, incompetent or ignorant. The possibility of being judged by others in one way or another is called interpersonal risk, and by weighing up benefits and risks (often unconsciously) we try to keep it low. For people to share concerns, ideas or suggestions for improvement, the perceived benefit therefore has to outweigh that risk.

On the one hand, people need the certainty that pointing out deviations or possible problems, and contributing ideas, is welcome. On the other hand, they need the certainty that doing so actually achieves something. If someone risks being ignored, belittled or treated worse in the future, that is bad. It is even worse if the feedback then simply goes unused.

Gaining insights from mistakes and deriving actions

But in the end, despite all these efforts, things sometimes still go badly wrong. Then it is a matter of making the best of the situation.

First and foremost, of course, this means containing the immediate problem and minimising the damage. But once the danger has been averted and the damage limited, the next step is to learn for the future – in the best case in a way that helps to avoid similar problems, to further reduce the fall height, and to recognise errors even better or even earlier. In IT, for example, there is the post-mortem process²: everything that led to the undesired event is compiled, and measures for improvement are derived from it.
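
What exactly goes into such a post-mortem varies from company to company. As a rough sketch with entirely invented details, a blameless write-up might capture something like this:

```
Post-mortem: checkout outage on 01.06. (invented example)

Impact:      Checkout unavailable for 43 minutes, orders lost.
Timeline:    14:02 deploy / 14:09 first alert / 14:45 rollback done.
Causes:      A config change removed a timeout; the load test
             did not cover that path.
What helped: Alerting fired within minutes; the rollback had
             been rehearsed.
Actions:     1. Deploy pipeline checks for missing timeouts
                (owner, due date).
             2. Load test extended to the checkout path
                (owner, due date).
```

The crucial point is that the write-up names causes and gaps rather than culprits, and that each action has an owner and a due date – otherwise the learning evaporates.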

Here, too, how bad news is handled plays a role. Can a person engage fully and openly, even though he or she may have made a mistake? Can he or she feel safe to share a view of what happened that differs from the opinion of others? And are the analysis and its results given appropriate weight – ideally translating into action?

Final thoughts

Of course it should be in our interest to avoid mistakes. But “simply making no mistakes” is not an effective strategy for this.

We can and should take much more care that a small mistake does not turn into a big debacle – regardless of whether we simply erred or made a mistake against our better judgement. To this end, we look for measures that make the work safer and for ways to detect mistakes as early as possible. And if the damage is done after all, we try to make the best of it and derive improvements for the future.

Because if work resembles a minefield, we should not be surprised when things go bang at every second step.


Notes:

¹ Amy Edmondson: The Fearless Organization. Wiley, 2018.
² Post-mortem process (German-language source)

Patrick Schoenfeld

Patrick Schoenfeld has been working in IT for almost 20 years and asks himself: how can we make the chaos we commonly call work better for everyone? In his blog chaosverbesserer.de he therefore explores, among other things, what makes people work together “smoothly” and what does not. Is a foosball table enough, or do parts of work need to be rethought? Apart from that, he likes coffee, good food, travelling and long bike rides.