Have you ever heard of someone trying to “game the system”?
This refers to situations where the criteria for success are clearly defined, and an individual, group or company does just enough to meet those specific criteria, even if more important overall measures of success suffer as a result.
This is also one reason why company performance can decline over time, as more bureaucratic processes are introduced to manage employee performance. It can likewise lead to innovation efforts that are focused on solving one issue inadvertently causing many more issues as a result.
Goodhart’s Law is named after economist Charles Goodhart, who observed in a 1975 essay about monetary policy that “Any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes.”
Or, as anthropologist Marilyn Strathern paraphrased it in 1997, in what became the more commonly known version of the Law:
When a measure becomes a target, it ceases to be a good measure.
Essentially, once targets are set, they can no longer be used as a reliable way to measure progress or success, since people will adjust their performance and processes simply to meet those specific targets.
Often, this focus on just meeting the specific targets takes attention and effort away from the more important overall goals aligned with achieving the company’s strategy.
Some famous examples of this happening include:
- While India was a British colony, the colonial rulers wanted to reduce the number of snakes, so they offered to pay for dead snakes brought to them. Goodhart’s Law: enterprising locals began to breed and farm snakes to kill for the reward, which was much easier than killing wild snakes, and the total snake population actually increased.
- Soviet planners wanted to increase production of iron nails, so they told factories to produce as many nails as possible. Goodhart’s Law: factories redesigned their processes to churn out millions of nails so small they were almost useless.
- Wells Fargo employees were under pressure to cross-sell as many new accounts as possible. Goodhart’s Law: to meet their quotas, employees simply created new accounts without the customers’ knowledge.
- Scientists are rewarded for publishing research that is cited often by other papers. Goodhart’s Law: some scientists publish large numbers of papers that only cite their own previous work, inflating the citation counts of that earlier work.
- Sales managers’ performance is tracked based on the total number of sales they make. Goodhart’s Law: some of these managers sell products at a loss, so they record a sale but the company actually loses money.
- Hospitals are rated on several metrics, including “quality of care” and patient survival rates. Goodhart’s Law: some hospitals will refuse very sick patients because, if those patients were to die, it would bring down the hospital’s existing average score.
One area where Goodhart’s Law gets even more dangerous is in target-based software development, especially Artificial Intelligence and Machine Learning.
If humans set a specific target for an Artificial Intelligence system, it will try everything it can to optimise its performance towards that target. However, since it does not understand context, it may find a way to reach the target that goes against the overall intentions of the programmers.
As a harmless example, an early game-playing Artificial Intelligence learned that, in order to not die while playing Tetris, at a certain point the most effective move was simply to pause the game. Forever. That way it could not die.
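The mechanism behind these failures can be sketched in a few lines of code. Below is a toy simulation (all the numbers and function names are hypothetical, invented for illustration) echoing the Soviet nail-factory story: an optimiser rewarded purely on a proxy metric, “nails produced”, maximises that number while the real goal, usable nails, collapses.

```python
# Toy Goodhart's Law sketch (hypothetical numbers): an optimiser that only
# sees a proxy metric will game it at the expense of the true objective.

MATERIAL = 100.0  # total steel available, in arbitrary units


def nails_produced(size):
    """Proxy metric: how many nails of a given size the material yields."""
    return int(MATERIAL / size)


def usable_nails(size, min_size=1.0):
    """True objective: only nails of at least min_size are actually useful."""
    return nails_produced(size) if size >= min_size else 0


# A naive "planner" searches candidate nail sizes for the best proxy score.
candidate_sizes = [0.5, 1.0, 2.0, 5.0]
best = max(candidate_sizes, key=nails_produced)

print(best)                  # the optimiser picks the tiniest size: 0.5
print(nails_produced(best))  # proxy metric looks impressive: 200 nails
print(usable_nails(best))    # but the real value delivered is 0
```

Nothing in the search is malicious: the planner does exactly what it was rewarded for. The failure comes entirely from the gap between the proxy metric and the true objective, which is precisely the gap Goodhart’s Law describes.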
However, other examples are not so harmless. In 2015, Google used machine learning to let Google Photos automatically tag the faces of friends in uploaded photos. The system was told to use its algorithms to match faces against its databases. However, due to flaws in the data used to train it, and vague matching criteria, it began labelling photos of black people as “gorillas”.
The algorithm had in fact succeeded in finding what it considered the closest match, according to how it was programmed. That was not its fault. The fault lay in the poor data and oversight of the people developing, testing and implementing the system.
The extreme example of Goodhart’s Law often seen in Science Fiction is Artificial Intelligence which is programmed to help astronauts succeed at a mission, but when the astronauts begin doing things which were unexpected, the AI decides the best way to complete the mission is to kill the humans.
So how can we overcome Goodhart’s Law?
Well, firstly, we need to become very clear on what we are actually trying to achieve.
What is the strategy? Is it clearly communicated and understood?
Then, before targets for teams and individuals are set, ask whether there are situations where people could achieve those targets in ways that do not align with the overall strategy.
Adjust the types of targets as necessary.
This way, effective targets should prevail in the end.