Authority Bias & Why There’s No Such Thing As A Stupid Question
Do you generally do what you’re told when someone you see as having more authority than yourself, such as a police officer or a superior at work, tells you to do something? What about when that something conflicts with your personal conscience?
How about when an expert tells you something that conflicts with your views – would you assume you are wrong because you are not an expert? Would you bow to their authority on the matter?
Not so long ago I wrote an article about outside thinking – why asking the experts is not always the best idea (here: Outside Thinking: Why The Experts Don’t Always Know Best).
I’d like to explore that idea one stage further, along with how we look at authority figures in general by taking a look at Authority Bias.
It’s something we are all prone to suffer from.
What Is Authority Bias?
Authority bias is the tendency to attribute greater accuracy to the opinion of an authority figure (unrelated to its content) and be more influenced by that opinion (Wikipedia definition).
All around us, experts, politicians, teachers, CEOs, consultants, gurus in various sectors, economists, doctors and scientists bombard us with statements and seek to underline their status and celebrity, and we tend to buy it simply because we see them as authority figures. Add to this media outlets, social media, news programmes, invited guests, commentators and experts on talk shows. Because of where they are and how we see them, we tend to believe they have authority, and therefore tend to believe them.
Over the years we have even come to symbolise authority with badges and labels, awards and accolades. Through simple celebrity a person can increase their authority status, but consider also how we use gradings in sports to mark people’s experience and progress (how revered is the black belt in karate?), rank badges in the military, ranked titles in the police force and, in fact, in nearly all corporate institutions. Doctors and scientists wear white coats, kings wear crowns, captains wear armbands, and so on. In most cases these labels and badges are deserved. Their purpose is precisely to distinguish the experts and authority figures from the rest, and they are true indicators of authority.
But what about when they’re not?
We have a collective respect for authority and are quick to believe those we see as authority figures over everyone else – often to exaggerated degrees, and often with little merit beyond the fact that we perceive these people as authority figures.
This perception may come from something as simple as being told they are an authority figure, or seeing one of these symbols or labels – just seeing a badge or hearing someone’s title is enough for us to afford the person authority, along with the respect and deference that go with it (without looking at any facts, their track record, their actual credibility or any other justification for that authority). That’s authority bias.
Milgram’s Electric Shock Experiment
In 1961, psychologist Stanley Milgram demonstrated the effects of authority bias in a series of social psychology experiments by getting participants to perform acts conflicting with their personal conscience – namely, electrocuting someone.
Participants clearly suffered a great deal of cognitive dissonance (more on that below), but, as Milgram’s experiments showed, they also suffered from authority bias – so they continued.
Milgram instructed his subjects to administer ever-increasing electric shocks to a person sitting on the other side of a pane of glass. They were told to begin at 15 volts and increase the shocks incrementally until they reached the maximum – a potentially lethal 450 volts!
The electric shocks in the experiment were fake and Milgram used an actor to play the role of the victim (the ‘learner’) but those charged with administering the shocks didn’t know that.
The results of the experiment were nothing short of amazing. As the victim wailed and writhed in pain and the subjects started to question and protest, Milgram would say “keep going, the experiment depends on it”, and the majority of the subjects would continue to administer the ever-increasing shocks.
The experiment found, unexpectedly, that a very high proportion of people would fully obey the instructions, albeit reluctantly – more than half of the participants went all the way up to the maximum voltage, out of sheer obedience to authority – clearly demonstrating authority bias.
Experts’ Track Records
I said above that we often perceive people as authority figures or experts without much scrutiny.
Often experts’ track records speak for themselves, but often they don’t. The point here is that it does no harm to check, especially when the consequences of taking the expert’s advice are significant.
A few quick examples:
- Economists love to forecast; they do it all the time. Yet of the million or so trained economists around the world, almost none of these so-called experts predicted the 2008 financial crisis (the ‘credit crunch’).
- The medical industry is a whole sector where authority isn’t questioned, even in obvious cases. Doctors simply aren’t questioned, and mistakes are made. The typical response, rather than admitting error, is to cite ‘complications’ – have you ever heard that? The word iatrogenics refers to illness caused by medical examination or treatment – harmful side effects of care itself. Until around 1900 it was statistically better for patients to avoid doctors’ visits altogether: more often than not the ‘treatment’ worsened the illness, thanks to practices such as bloodletting and poor hygiene. Cognitive dissonance comes back again here; there are many documented examples of nurses and doctors’ assistants not questioning the authority figure even when they can clearly see things going wrong. The system is set up to respect the authority of the more senior staff to a grossly exaggerated extent – i.e. systematic authority bias. Matthew Syed writes extensively on this in his excellent book ‘Black Box Thinking’.
Again, this is not to say that you should start ignoring experts. But, being aware of authority bias, allow yourself to simply check that any particular authority figure is as much of an authority as your first impression suggests.
Types of Authority
Of course authority can come in many different forms, whether real or perceived and whether merited or not. You could for example be influenced by someone purely due to their position (absolute authority) or because they are a proclaimed expert in something (expert authority). If you want to learn more about the different types of authority, check out this article: Why You Need To Lead…
What Happens When We Override Our Beliefs (Cognitive Dissonance)
Silencing that doubting voice inside you in favour of trusting an expert or authority leads to cognitive dissonance – the mental discomfort or psychological stress that results from a conflict between a course of action and your own beliefs, ideals and values (in this case, accepting the authority’s beliefs over your own). If you want to know more about cognitive dissonance, check out the following articles:
- Cognitive Dissonance: The Lies We Tell Ourselves and How They Can Help Us Change
- Cognitive Dissonance II: How To Influence Others
No Such Thing As A Stupid Question
Out of respect for someone’s authority or perceived authority, or because of groupthink (following others in a group, assuming they know better or more than you do, and therefore not challenging the status quo), we often fail to put our hands up when we have concerns, for fear of looking stupid or out of touch.
In my view, honesty and integrity are the best way to go. Be comfortable in your own skin and don’t be afraid to put your hand up. That means asking questions, and challenging your own beliefs and tendency toward authority bias (questioning the experts or checking their credentials).
It also includes saying you don’t know. Because guess what? Just as you may be prone to authority bias, so is everybody else – somebody could be assuming you have more expertise than you actually have, and that can also get you into hot water!
(and there are many ways to do this even in the most difficult situations as I’ve written about here: If You Don’t Know The Answer Be The First To Say It)
So authority bias can work both ways. You can be the person who affords too much authority to others, but you can also be the person who is afforded too much authority (due to other people’s authority bias).
History has shown us some pretty horrific examples of where authority bias has led to catastrophic outcomes. I’m sure you can think of some pretty obvious examples for yourself.
Conclusion – Don’t Be Afraid To Question
There’s nothing wrong with trusting the experts, but there’s also nothing wrong with checking that the expert actually deserves your trust.
If someone, for whatever reason, puts you in a position of authority due to their own authority bias, be the first to correct them by saying you don’t have all the answers they perhaps think you do.
Knowing that authority bias is something we’re all prone to suffer from, don’t be afraid to question. Whenever you are about to make a decision, particularly an important decision, whenever you are about to follow a particular path or choose a course of action, think about what authority figures may have had an influence on that decision or that course. Think to what degree authority bias has come into play and if appropriate, challenge the authority figure in question or at least check that the authority you have afforded them is justified.
Comments
Hi,
We can all be an expert!
At least, that is what the Zulu Principle suggests.
Jim Slater used the Zulu Principle for investing in the stock market, but the example behind it comes from anthropology.
Jim Slater’s Zulu Principle best illustrates how the power of focus can lead to mastery and/or expertise in any given niche, topic, domain or field. This is the title of his financial investment book based on an insight he had as a result of his wife reading a four-page Reader’s Digest article on Zulus.
He stated that if she had gone to the local library and borrowed all the available books on Zulus she could find, she would have become one of the leading experts on the subject in their city. If she had then travelled to South Africa, lived in a Zulu kraal for several months and studied all the literature on Zulus at a South African university, she would have gone on to be one of the top experts in the UK, and possibly in the world.
Slater believed that the more you focus on an area, the easier it is to become an expert in that area. This expertise gives you a competitive advantage over others.
Read the blurb: http://www.horebinternational.com/the-zulu-principle/
Thanks for the comment & I agree 100%.
We can all be experts.
That’s not really the point of this article though. Authority bias is a (proven) tendency most of us have to afford [exaggerated levels of] authority, often when it is not deserved, and to fail to question authority when we sometimes should, e.g. in Milgram’s experiment.
But I do agree we can become experts in anything – these days more than ever. If you are prepared to learn something and immerse yourself in that learning, the resources are easier to find than ever; just look at YouTube.
The Zulu Principle could have been a good article for ‘Z’, as I was struggling for a moment, but I’m quite happy I went with Zen – ‘Zen in the Art of Archery’ was a really good read!
P.S. I’d already heard of the Zulu Principle and read the blurb you linked to, but on second thoughts it’s not something I would have really wanted to write about or align myself with too closely.
It’s difficult to disagree with, as focus is massive. But investing so much time and energy over a sustained period in one thing is worthwhile only if that’s what floats your boat. For me, life is too short to be so concentrated on just one thing, and I believe we have enough capacity (in fact, tremendous capacity) to learn a lot about many different subjects.