Drs. Dunning and Kruger and 300 Million More Health Experts
People who know little about a subject often think they know more than they do.
Misinformation can start when minimally informed people mistake associations for evidence of causation.
People exhibiting the Dunning-Kruger Effect often reject information that conflicts with their beliefs.
Cognitive dissonance can help unwind belief in misinformation.
In a class I recently taught about U.S. Health Care Policy & Politics, I asked the students how many health policy experts there are in the United States. Amidst a field of befuddled faces, one student said "300 million." That is basically the right answer because virtually everyone who has been ill or seen a clinician for a checkup or physical believes they understand the U.S. health care system’s problems and know how to fix them.[i]
This illustrates what is sometimes called the Dunning-Kruger Effect (DKE)[ii], in which, for example, someone with some knowledge of a subject believes they know more than people who’ve studied (or worked in) a field for many decades.
Drs. Dunning and Kruger
We are now in a situation where health misinformation is being promoted by people with significant public platforms for spreading fallacies, which can be dangerous to the health of individuals, communities, and countries. For example, during and since the COVID-19 pandemic, public figures have repeatedly asserted substantial knowledge and understanding of vaccines while criticizing many actual experts. This lowered vaccine confidence and reinforced many people’s belief in misinformation. Interestingly, in the fall of 2025, many people who had (incorrectly) come to believe that vaccines were a primary cause of autism were confounded (if not distraught) when the U.S. government announced that acetaminophen was a cause of autism without also mentioning vaccines.[iii]
People exhibiting the Dunning-Kruger Effect also often do not understand (or appreciate, or recognize) the difference between associations (or correlations) and causation. An extreme example of that fallacy would be claiming that using sunscreen causes shark attacks. (Both increase during warmer weather, but that does not mean sunscreen attracts sharks.) In contrast, an actual cause-and-effect relationship is that decreased rates of children receiving MMR vaccinations caused an increased number of measles infections.
Identifying False Experts
As individuals and society strive to reduce the ill effects of health misinformation, one key for assessing someone’s actual expertise is to determine if they have a background that would actually give them deep knowledge of the topic. A second key – or clue – that may identify someone who is a "Dr. DKE" is if they also denigrate and dismiss others who do have deep subject-matter backgrounds, for example by calling them “so-called experts.”
Many misinformation spreaders who believe they are experts are highly selective in choosing the information they share about a complex situation. Importantly, they reject information contrary to what they already believe, a tendency known as confirmation bias.
Thus, two other clues for assessing the validity of health information from people who profess to be experts are: Do they describe something as evidence of causation when it is only an association or a correlation? And do they dismiss information that conflicts with their stated “expert” insights?[iv]
What to Do About Drs. Dunning and Kruger?
The first step in defusing false information[v] is to recognize it – as discussed above. The next step is to create some cognitive dissonance so that a misinformed person questions their belief in the falsehood. Clinicians are likely too busy to have extended conversations with misinformed patients, but one shortcut they can try with someone resistant to getting a recommended vaccine is to ask, “Who would you get vaccinated for?” This direct question might create cognitive dissonance since it challenges patients to ask themselves who they love enough to question their belief in vaccine misinformation. (A similar question could be asked of a patient who is refusing to take a medicine for a chronic illness like diabetes or high blood pressure. The difference is that with a vaccine the patient might be indirectly protecting a loved one from getting sick, whereas with those conditions they would primarily be trying to avoid their own death or disability, which could burden their child, spouse, or parent.)
As for reducing the influence of misinformation in the public sphere, whether on social media or elsewhere, that requires different approaches. One option is to vote with your media choices, since many outlets and platforms make their money based on their numbers of viewers, readers, and listeners, or on clicks, likes, and shares. Another avenue is to become more involved with your neighbors and community, since communities with stronger interpersonal ties seem to be more resistant to misinformation.[vi]
People who believe they are experts in a field even though they only know a small amount about the topic are exhibiting the Dunning-Kruger Effect. Those people may resist considering any information that conflicts with their belief in their own expertise. Such people can spread false information about important health matters, such as vaccines, which can lead to health problems for individuals, communities, and countries.
There are four clues to identifying such false experts:
They don’t have the background or experience to be an expert.
They denigrate people who do have extensive experience as “so-called experts."
Their explanations of complex matters rely on cherry-picking data points to "identify" associations or correlations, which they may incorrectly describe as evidence of causation.
They dismiss any information that is contrary to their false beliefs, sometimes with dismissive statements such as “you’re just trying to confuse things.”
Reducing the influence of misinformation requires generating cognitive dissonance. Doing this with individuals can be complicated, as I’ve written elsewhere, and in my book Reversing Misinformation, but it can be done. Resistance to misinformation can also be increased with stronger community connections so that falsehoods that come from the outside (e.g., from social media) are less likely to find a susceptible host.
[i] In 2005 I started writing a book with the working title of “Fixing the U.S. Health Care System.” For about 12 years, every time I told someone about this project, they immediately started telling me what the problems were and how they should be fixed. Those statements ranged from single payer (obvious and simple), to a tax on millionaires (all about the money and affordability), to more primary care (personal and practical), to lowering drug costs (a perennial notion), to taking profits out of health care (presumably not referring to all the non-profit and government parts of health care). And that was only part of the list. In fairness, I should note that a friend of a friend did start to tell me their “solution,” but then stopped and said, “Why am I telling you? You’re writing the book. You should tell me what you think!” To which I responded with a long laugh and told them that they were the only person who’d ever said that. (Thanks, Ray!)
[ii] The Dunning-Kruger Effect (DKE) was described in a 1999 paper by Kruger and Dunning. (pubmed.ncbi.nlm.nih.gov/10626367/) It is a form of cognitive bias and can appear in many, many areas besides health care, including investing and gambling (are they the same?), driving ability (particularly among new drivers), and even cooking. However, what is popularly described as the DKE is often an exaggeration of what Dunning and Kruger actually observed in their research. (skepchick.org/2020/10/the-dunning-kruger-effect-misunderstood-misrepresented-overused-and-non-existent/)
[iii] hhs.gov/press-room/hhs-trump-kennedy-autism-initiatives-leucovorin-tylenol-research-2025.html & https://www.whitehouse.gov/articles/2025/09/fact-evidence-suggests-link-between-acetaminophen-autism/
[iv] Actual experts typically welcome new information that can improve their understanding of something that they’ve studied for many years since it can lead to new discoveries – or at least refinement in their perspectives. That is how science works.
[v] There are some very smart people whom I respect who believe that the word misinformation should no longer be used. They prefer words like rumor, gossip, or false statements. I think that can make sense for the casual spreading of false information. But for public figures – or media with significant numbers of readers, viewers, or listeners – I still prefer the term misinformation, since it seems their goal is to misinform whether or not they realize what they are communicating is incorrect.
[vi] For example: en.chessbase.com/post/how-a-neighbourhood-chess-game-can-create-an-unexpected-sense-of-community