What is your first-hand experience with the "Dunning–Kruger effect"?
In the field of psychology, the Dunning–Kruger effect is a cognitive bias in which people of low ability have illusory superiority and mistakenly assess their cognitive ability as greater than it is. The cognitive bias of illusory superiority comes from the inability of low-ability people to recognize their lack of ability. Without the self-awareness of metacognition, low-ability people cannot objectively evaluate their competence or incompetence. (Wikipedia)
Some of my fellow programmers seem to think the world turns around their knowledge, as if there were no valid reasoning whatsoever beyond math and computer science. They seem to think logic (a tool with multiple uses that has existed since at least 380 BC) is merely an attribute of computer science. It's not uncommon for them to think they can understand the intricacies of every phenomenon under the sun.
I have to control myself to avoid countering each of their flawed arguments. To my own detriment, I'm not always able to do so. I feel surrounded by arrogance and cognitive bias, and have to silence my better judgment in order to avoid constant conflict.
To be clear, I'm not looking for advice, as I already know the "solution", which is no solution. You can't use reason to fight something that is not motivated by reason. I'm posting to know your stories and maybe find some solace in the knowledge that I'm not alone.
Have you ever had to deal directly with people who grossly inflate their own competence, possibly stretching it to an unrelated field? If so, what's your story?
There's a corollary finding in that Dunning-Kruger research that is almost as important as the one about the incompetent person's lack of self-awareness: the most competent people are also unaware of how uncommon their competency level is. The top quarter of competence and the bottom two quarters all rate themselves as the second highest quarter in startling numbers. Not to be too cheeky, but I'm seeing a little of that happening in these comments too.
The reason that finding is super important (and all the more so for being so often overlooked when people talk about Dunning-Kruger) is that this problem of perspective hits at both ends of the competency scale, but only one end is expected to do anything about it. What harm comes from a super competent person thinking that more people are at their level than truly are? Could it be that certain design decisions, for example, get made assuming higher average competency levels than appropriate?
There is a classic theory (the hierarchy of competence) that the top quartile of competence find it difficult to communicate with the bottom quartile: the fundamental principles are unconscious to the top quartile, so they cannot communicate effectively with someone who is unconsciously incompetent.
I don't remember if I read it or if someone told me, but this is something I've definitely observed in myself and in others: when some people drastically increase their knowledge of something they get even more humble, because when you learn a lot about a field you also realize it's much bigger and more complex than you previously thought. In this sense, "mastering" something can be an exercise in humility.
I'm not sure what I described is the same thing you're referring to, but it reminded me of it.
It's kind of similar. What we're talking about when we talk about the Dunning-Kruger research is people's ability to think about their own meta-placement in competency. Put another way: Are you able to accurately evaluate how competent you are compared to the entire population employing that skill? The research suggests that both the most competent and the least competent are bad at self-evaluation.
Now that might be in part driven by humility about how much else in the field this super competent person doesn't know. You'd be talking about a potential cause of this effect, basically.
I think they call that the Kruger-Dumbing effect.
Well, honestly, one of the things I see happening with the hypercompetent peeps I know is that they undersell themselves and get passed over for things, or give up on them. So I'd imagine that's the biggest issue here: if overconfident peeps and underconfident peeps both think of themselves as being about the same, then it'll be hard to figure out which is which when seeing them in action, doubly so if it's a topic that's tough for most people to judge the quality of work in.
If that's true, then we can see that even if we were to basically eliminate overconfidence, this would still be an issue, since second-tier and first-tier peeps would still rate themselves the same. But that said, it's also a tricky problem to manage, because blocks about how you must not actually be that good can be just as strong as blocks about how you must be good.
To throw some paraphrased anecdotal evidence from two friends of mine into the ring:
Bf: "You're amazing at computer science."
Gf: "No I'm not, and anyway you're way better than me."
Bf: "If I'm as good as you say then I must be able to accurately judge your skills, right?"
Gf: "..."
Bf: "Or do you think that's the one thing I'm not good at?"
Gf: "Yeah."
and yes she is amazing at compsci and needs to have more confidence in herself >:o
Your comment is really interesting and mentioned some things I didn't know.
It also reminded me of one of my favorite quotes, attributed to Bertrand Russell:
“The fundamental cause of the trouble in the modern world today is that the stupid are cocksure while the intelligent are full of doubt.”
I always remember that and now I don't trust my judgement anymore :P
I see it all the time in various techy sites, especially when people who actively avoid Javascript talk about Javascript.
The "you don't need an SPA" article got to the top of HN yesterday, and it's pretty flagrant that many commenters haven't actually seriously worked with SPAs, or the JS ecosystem at all. There was a .NET developer complaining that node_modules was 60MB, for example, and another freelancer who claimed an SPA instead of a server-rendered app would "10x the development costs".
I disagree that there's no solution; the solution is education. Education in the field itself, as well as teaching people to take it down a fucking notch. It is frustrating, but I think the main difficulty is figuring out where the knowledge pitfalls are so you can tell what the person might have gotten stuck on. It also doesn't have to be confrontational.
Though I think my favourite recent example of Dunning-Kruger by far is this one by Paul Vixie. Putting aside the technical arguments for/against the Chromecast's behaviour there, I just find it really funny.
Thanks for answering.
I agree that education is a possible long-term solution for this problem, but my particular problem is that I am in no position to educate the grown men I collaborate with :P
I'm not sure I understand how the example you linked is Dunning-Kruger. It just seems like a privacy conscious individual being angry about a forced choice by Google?
I'm kind of sick of this from the other side. I understand people's frustrations with some bloated JS web apps, but now people are complaining about anything at all that uses JS. I built an SPA web app, and a few people complained that it didn't load without JS. The thing is, this is an actual application in the browser; it's not possible to do what I've done without JS, or at least not without seriously crippling the functionality.
I just don't understand why you would want a worse version that is server-side rendered. The app runs fast even on really old hardware, and there are no tracking scripts and no bloat. Before SPAs this kind of thing would have been a desktop app, but thanks to SPAs you can actually build a desktop app using the same API that powers the website, rather than parsing an HTML page or dealing with a second-class API that can't do the same things the web page can.
SPAs are a tool; they can be used very effectively, and it's just a pain to see all the complaints about the effective uses rather than about the real problems like slow pages and tracking.
The highlight of the comment. So well put.
Both Dunning and Kruger are my life-long friends. I met them in my mother's womb. One of my children is named after them.
Jokes aside... I'm a thinker type. I think a lot, about everything, trying to build an internal model of the world as complete as possible. I want to know how everything works, and if I can't get my hands on it, I try to extrapolate the data from what I already know.
It may come as no surprise to you that I'm not competent at everything. As such, there's a vast field of things I don't know. I also can't extrapolate everything, simply because, if there is such a thing, I have no access to the fundamental principles of the Universe from which I could take steps and see where they lead me. Therefore, a lot of the time, I'm wrong and I don't even know it. Hey, the extrapolation makes sense to me! It must be correct, right? Yeah, chief, not quite.
This is something that led me to assume a whole lot more about myself than was true, especially when I was a teen. (Naturally.) If you think I'm insufferable now, you can imagine how that worked out for me when I was younger. My mental capacity and the sheer volume of data I'd accumulated through curiosity still made me an interesting person, but I imagine I made a mess of myself a whole lot more often than I did something good.
It's something I'm careful about when I'm going off in my thinker mode. I remind myself that I don't have the basis for some of the things I'm assuming. The mind is a powerful mechanism, but it also operates with laziness as one of its priorities; "more work for less energy" and that sorta thing. It's important to keep the biases in check. I think I've gotten somewhat good at doing so for myself.
I get this completely. I very carefully try not to "know" anything about anything - I'll rarely make a statement without doing independent homework first, and my curiosity beast is a cosseted, fat, and happy pet.
My example of the Dunning-Kruger effect is a network director who believes that "network is network", and that absolutely no other inputs are required. Bandwidth, applications, reliable platforms, management tools and techniques, carrier service quality, end costs, support, change and outage information, and all the other little details which might make for a successful implementation are inconsequential to him.
I don't work directly for this person, but they're fucking up my projects, those of others, and national service quality on a regular basis.
Need I mention that they're excellent at politics? :-/
I'd like to make one thing clear: I know I know some things. I don't deny myself that some of the mental models I have are reliable representations of the world. What I'm wary of is when they aren't but I would act as if they are.
(Just recently, I was telling @Deimos about potential for Tildes' further optimization. I mentioned spritesheets and HTTP/2. I knew about the latter's ability to collate requests, which is supposed to lower overall website load time, but I'd never checked whether it applied to spritesheets vs. separate icons, and I couldn't make the calculation in my head. I did a quick search, and a reply at Stack Overflow proved me wrong. I noted that and linked to the reply.)
I'm also wary of the reliability of my sources. Perhaps I remembered the data correctly, but the source was wrong; therefore, I'm wrong. I find it important to state my sources – or that I have none, as is often the case 'cause I compose half-remembered facts from multiple different sources – for accountability and responsibility.
"Network director", huh? Meritocracy is a dream I hope we wake up to one day.
My clearest experience with Dunning-Kruger comes from my mother's attitude. My father has some experience in building and housekeeping, the latter particularly with electrical devices. Whenever he tried to figure out what was wrong with a particular device, my mother would chime in with her own ideas as if they were worth the same. (Most often, they aren't.) I would rather not be like that.
I'm trying to keep in mind that my mental models of the world need constant updating. Certainty is reserved for the omniscient deities humanity keeps inventing.
It seems to me that when we speak of the "Dunning-Kruger Effect", we're talking about people who are biased towards a belief in their own efficacy, regardless of their knowledge.
I've spent way too much time chasing down the rabbit hole of certainty - verifying sources, reflecting on my experience, gathering information, lab-testing, asking questions, and doing as much as possible to eliminate error.
There's a point at which this becomes counterproductive. It's taken me considerable work to balance the worry about being wrong with the worry about not getting done.
Having some grasp of risk helps measure the effort. As you've described, posting on a forum might not seem like much of a risk, but the chances of leading others into error, or of making a great big ass of oneself, are certainly factors in how much work you do.
We're always at risk of error, but somehow we expect our leaders to project a confidence which hides whether or not they're actually competent.
In the example I complained of, the director is much more concerned with a show of getting things done than with actual effectiveness. I expect that a variety of mal- and misfeasance will eventually catch up with them, iron self-assurance or not. It's hard to evade responsibility for recurring company-wide service outages indefinitely. Though they may manage it through outsourcing and blaming the vendors - who knows.
[I suspect that one of the other deficits of the Dunning-Kruger Effect-afflicted is recognizing when others are better-informed, and realizing when to take their advice.]
I think being curious and exercising the ability to extrapolate from limited data and experience can be a very good thing. Provided that you understand the provisional character of your knowledge, it helps you create better hypotheses and also gives some initial direction on where and how to better educate yourself on the subject. The issue is when people give too much credit to their earlier assumptions and become resistant to change on matters they know little about. On the other hand, there are situations where your expertise actually requires that you hold your ground, but some people are just unable to draw this distinction.
Absolutely!
My problem is that I can easily assume facts that aren't true-to-life because they easily fit into the picture, without examining whether they are, in fact, true. It's the "laziness" part I mentioned, where things fall into place all too easily. Can't quite give you a more concrete example, because nothing specific comes to mind.
What do you mean by that?
I mean that we must understand that whatever knowledge we possess is most frequently temporary, and may be easily disproved by someone with more extensive and/or accurate information.
I think it's a defence mechanism, especially in the case you describe. Insecurity about being outside their field makes them act the same way they would within their area of expertise, either because it's the only way they know how, or to cover up that insecurity by constantly reminding others of the field they are competent in. Honestly, pay it no attention; it even comes off as a bit sad when you think about it.
Thanks. I agree with you. Unfortunately, even though self-control is not hard for me at all, every once in a while I make the mistake of speaking when I should be silent. Oh, well :P
I wonder if psychedelics can help here... Doing studies on the magnitude of the DK effect pre- and post-psychedelic experience would be interesting.
My reasoning here is that psychedelics have the ability to dissolve the cognitive self-protection mechanisms that manifest in rationalizing "I am correct". We all have narratives about who we are; in general, we are good (and competent) people. Otherwise you have a crisis. Even individuals who are considered evil by the majority usually feel justified in one way or another; this is the downside of the self-protection mechanism. (On a tangent, highly competent people usually have the skills to rationalize just about any behavior in any way, which is ironic with respect to the DK effect.)
I can't find the exact research papers that touch on this, but more broadly speaking, this is supported by the inhibition of the default-mode network (DMN) of the brain by 5HT2A-targeting molecules like the tryptamine class. The DMN is where our narrative exists - the "I", "me", and "my" - and its inhibition results in the phenomenon of ego death, that is, the stepping outside of one's own perspective.
Meditation could also be a good start to such research...
Ram Dass (Richard Alpert) wrote in his book Be Here Now that his life was distinctly broken down into three stages: social sciences > psychedelics > yogi.
Strangely enough, I can totally relate to that...
It's way deeper than people realize.
The narrative you describe is like each one of us is a drop of water that fell out of the ocean. We think we are water but fail to realize the ocean is where we came from.
Our life's practice is to realize the illusion (maya) that separates us from the whole and seek the path to reunite with the ocean itself.
Hard work...
Oh man I love that book, such a creative work!
I haven't really seen it too much in programming. New devs have it for a little while when they first set up a WordPress blog or slap some assets together into Unity and think they're pro game devs, but after about a year they start to work out what their actual skill level is. One thing I have seen is that junior devs don't understand how hard a task is. Sometimes when I'm explaining the task I'm working on or stuck on, they'll just say "Can't you just do x", and it's quite hard to show why "just doing x" is a complex, multi-step process.
If it's not my money/job that depends on it: let them ramble on and let it be their problem.
If it is: no idea tbh