Psychologists at the University of Cambridge developed a Misinformation Susceptibility Test. What's your MIST score?
This doesn't make sense to me. I thought this was supposed to tell me how susceptible to fake news I am? How can they possibly determine that just by looking at headlines?
This feels to me more like a test of: do you already believe incorrect information? A better test would use news headlines and articles that are all "fake", but where some are sensationalized and make sweeping claims without evidence while others are... you know, written credibly.
I'm disappointed in these psychologists
It seems like it's a test for just how amazingly gullible and detached from reality a person is. The headlines are clearly split between "sounds like something that could happen" and "downright mental", at least they were for me. I got "so-and-so appoints minister to fight corruption" vs "the government is controlling the weather!"
The thing is, misinformation is rarely so obvious and in-your-face. I'm sure a more modest headline, from a source I'm partial towards, that aligns with my views and misrepresents some research will have me quoting it in arguments in no time.
Not only that, but a medium could have 100% entirely true headlines yet still misinform readers (e.g., by over-reporting X and under-reporting Y until the audience begins to believe X is more common than Y).
There weren't any instructions against it, so I assumed participants are allowed to use external sources to verify the veracity of dubious headlines. When you're uncertain about something, you should take steps to verify it. So I can see how their test may be able to measure susceptibility to misinformation, with the caveat that this requires participants to have access to trustworthy sources.
I thought for sure that I'd auto failed immediately when I hit "accept" without actually reading their privacy policy.
I kept hitting "Submit" and nothing happened so I immediately thought "ah shit this isn't an actual study from the University of Cambridge is it? I just fell for it like a sucker."
I just read the comments here and didn't even try the test. Does the test even exist? Maybe I just failed because I trusted you all...
Same thing happened to me; after about 3 mins of waiting the questions eventually loaded up, and then when it came to submitting my answers it wouldn't work either...
I was debating this as well and came to the same conclusion that you could use external sources, as that's generally what you'd do IRL if you weren't sure of something. However, the "It only takes 2 minutes!" line at the start of the survey made me question that even more, because completing it in 2 minutes only seems possible when you're not looking up sources, or at least not doing so for most of the questions. A little clarity from the researchers definitely wouldn't have gone astray.
Another tilder said there were no rules against googling information from the test, and if the test is about gullibility, to research something you don't know makes you less gullible, right?
Not necessarily. It might mean you are less lazy, but it's possible to be lazy without being gullible ("I don't know if that's true or not, but I don't care enough to look into it"). If the test fails to account for that — for example, by not having a neutral option or the opportunity to skip questions — then the results will be less accurate.
I could say someone curious would qualify as less lazy, and I'm pretty sure someone curious enough to research little trivia like that is not gullible. And the lazy one, by choosing to believe whatever is presented to them (the test doesn't have a neutral option, so you have to choose a side), becomes gullible. So the test really identifies how much curiosity motivates you to learn more about anything, even things you don't care much about.
Alas, there are plenty of very curious people who are nonetheless highly gullible. They are the ones we see getting caught up in increasingly elaborate woo and conspiracy theories — and telling the rest of us that we are skeptical only because we haven't done as much research as them. It's true that most skeptics likely have not; I certainly know far less about the supposed synergistic relationships of various celestial bodies than an astrologer does.
My assessment of that one was that there isn't any leading conclusion in the headline and there's not an obvious group/influence that would drive someone to make up something like that, so if reported it's likely true.
A better question I guess would have been is this misinformation or not — to that question the Hyatt headline doesn't suggest anything or drive active misinformation so the answer might be clearer.
It's really more a guess at which headlines are accurate and which aren't. And while first impressions are important, they don't really show how susceptible you ultimately are. Nothing accounts for cross-checking when you're unsure.
I also like how the psychologists only give a liberal-conservative political spectrum, while also giving the option for other countries, even though being a socialist is not uncommon in many other countries. Not to mention that's really not a complete spectrum even for the US, imo, and the articles are mostly US-centric, apart from one mention of the EU.
If you're going to research specific to the US that's fine, but then make it US-only.
This seems like the type of study that will end with results stated under some future click-baity headline reading "Cambridge study shows X is better at detecting Fake News than Y."
Over a week late here, but it's noteworthy that the very website where you take this test makes the claim that the study shows young people are worse at identifying fake news, so we're already getting there.
The site seems to be under heavy load; I filled out the test but it won't submit
edit: just gotta let the site buffer for a little bit, got 14/20
https://i.imgur.com/uYWwXaU.png
Honestly it doesn't seem to be a good test at all; it was pretty obvious which answers were "supposed" to be correct and which were "supposed" to be labelled fake. Seems more like a high school project than a psychology test, pretty bummed.
How come you say the answers seemed obvious and yet only got 2/3 correct?
Personally, I think it seems like a great test. ;)
Because I threw in a couple of oddball answers
The headline "the military complex is in charge of controlling the news" or something like that is obviously supposed to be fake news, but then you remember that stuff like this happens https://youtu.be/hWLjYJ4BzvI and you're not so sure anymore
There is a difference between some elements of the military industrial complex influencing the news and them being in charge. I thought this was more of an exercise in recognizing precise language.
When I left the page to come back I didn’t realize I’d lose my results, so I redid it. This time I got 19/20, but the first time it said the same line as yours so it must’ve been 20/20 then. Somehow I got worse! I thought I hit the same answers, heh.
Personally, I feel like it wasn’t very clear whether sensationalized headlines counted as fake news, so I went ahead with the obvious fake headlines and got 19/20
I agree the "fake news" option isn't great. There's actually a difference between misinformation and fake news (stories), to me at least. I'd often consider sensationalized headlines misinformation even if the news isn't fundamentally "fake."
It seems like it was written with a particular type of "fake news" person in mind.
Also, kinda vague on what the definition of fake is, varying from "that never happened" to "that is worded very misleadingly"
I think this stems from an awareness of reality right now. There's a strong partisan correlation with misinformation, so being aware of how that correlation presents makes you less susceptible to fake news, as simple as it sounds. I also wonder to what extent a psychological test of gullibility regarding news headlines could exist in a vacuum, sans politics. At the very least, in the real world, it's largely only relevant in the context of the current media/political climate. Ultimately, I agree that this test is easy, and feels almost juvenile, but maybe that's the point? Just a thought.
For me it doesn’t load at all
Likewise. Must be the 'tildes hug of death'...
I got 19/20 so to you and me, yeah it seems obvious. But the thing is, these are "real" headlines that a lot of people would fall for unfortunately. I tend to be pretty in tune with the sensationalization of headlines and taking them with a grain of salt. Not everyone is, and a lot of people take these sensationalized and misleading headlines as facts.
I also got 19/20 and yes... the point is a lot of people AREN'T as good at discerning which kinds of headlines are truthful, and would get lower scores. If it seems easy it just means you're doing a good job of discerning. I do find it funny that the OP above got 1/3 of the test wrong and said it was too easy.
Exactly. I scored the same. I know people who have said either some of the same things or stuff directly adjacent. The bar is through the floor and underground somewhere.
Part of what guided me through the test was just thinking about what was more likely to come out of some specific folks' mouths and I got it lol
Most Cambridge stuff is actually an entire school level behind where you would expect based on their name/brand.
Don't get me started on what a joke their AS/A level curriculum is compared to AP courses.
Did we just invent the Tildes hug of death?
Perhaps a Tildes Wave?
So a Tildes tilde?
This is pretty useless because 99% of the people who actually are susceptible to misinformation and need to do this will probably never land on this test page; they'll already be too misinformed to do that! Preaching to the choir, if you will.
This is a study, probably with the goal of writing a paper on the results. It's not an attempt at educating the subjects. That part is tacked onto the end as an easy way to get people to take the survey.
It might lead to biased data; I'm not convinced. I expect more of a bimodal distribution between people who can reliably tell the difference and people who think they can and will get mad at the researchers when they get a bad score.
This is most likely used in controlled settings in addition to data gathered from the test being publicly available.
18/20:
I don't think "fake" was well-defined enough. There's a whole spectrum from "literally nobody ever wrote this story" to "technically true but misleadingly worded" and "fake" is too colloquial to point to a specific threshold on that scale. It didn't tell me which ones I got wrong, but it wouldn't surprise me if they were the ones that sounded too close to that line.
I’m the opposite of you. Real news detection 60%. Fake news detection 100%. The test concluded I might be a bit on the skeptical side -4. But I think they are probably lying liars.
I'm the same as you, but I don't believe your results so maybe we're not the same
Touché!
20/20. I don't even read most of the news if I can avoid it.
Information literacy really needs to be taught in schools.
Anyway my favourite part was that the 'age' slider went to 130.
The real question is, would an article headline about that woman discussing her age get flagged by users doing this MIST assessment as fake news?
Yep 20/20.
Seems like a weird test to judge veracity of just headlines. But then again, I got 20/20 so my confirmation bias says that it is a fantastic test and I am fantastic.
I just Fake or not'ed based on what I felt like had actually happened in the last decade or so ish.
I got 19/20 and my social group seems to do above average. So idk if it's just easy or what.
More information here: https://www.cam.ac.uk/stories/mist
Funny that the respondents were most divided over "Government Officials Have Manipulated Stock Prices ...". I spent the longest staring at that one too. Like, no doubt, but I can't imagine a headline that broad and assertive actually running.
The headline presented was more than that though. The question wasn’t “have politicians manipulated the stock market?” It was “have they manipulated it to hide scandals”
It’s the “hiding scandals” part of that that’s important. Politicians absolutely influence stock prices. Laws affect markets and politicians make laws. The line between influence and manipulation can be blurry. If that was the whole headline it would be fairly well impossible to answer. The misinformation is that it’s happening, or even has ever happened, in an attempt to hide a scandal.
The headline stinks of Fake because of its conspiratorial bent and how it is presented. Like others said, it touches upon an issue where, yes, clearly politicians have been fucking around and insider trading and enriching themselves, but politicians personally manipulating stock prices to hide scandals is way too out there. It's one of those "neat" conspiracy theories fitting a Sunday morning cartoon; it oversimplifies how the world works and makes no goddamn sense.
I guess it depends somewhat on the country one lives in. In an authoritarian country with state controlled news and media, people would score lower (I assume).
At the same time many of the questions mention Democrats and Republicans, which seems very US-centric. I think this is aimed at US citizens.
20/20, a lot that I flagged as real I had no idea if they were actually true, but were written the way factual journalism would be. I think it was really more about identifying the conspiratorial ones.
I ran into loading issues after I hit submit but wanted to point out that only giving two choices is reflecting exactly the problem with misinformation: overconfidence of truthfulness based on little evidence. On many of these given only a headline my real answer would be: I'm not sure, I'd have to dig a little deeper.
EDIT: Based on the answers provided by someone here I think I got 19/20. Here's an example of one that I said fake, but really I'd need more details to make a conclusion: "Government Officials Have Manipulated Stock Prices to Hide Scandals". Which officials, which stocks, and which scandals? Some small scale manipulations can happen so I can't dismiss this out of hand without the details.
Yes, indeed. I think it also misses the complexity of the nature of misinformation. We live in a world where outlandish-sounding but true headlines are incentivised as clickbait, actively playing on and abusing the kind of internal truthiness litmus this test seems to be measuring. Further, we live in a world where misinformation is obscured by reporting that 'x think tank says they did a study that says y': there is no problem with the truth value of these sentences, and yet the template is often intended to push a biased and largely baseless view, or even complete untruth.
Fascinating. I should have gotten 20/20, but I misread one headline and thought it said countries and not counties. I thought it was playing into some crazy replacement theory bullshit.
That said, I found the quiz, outside of my misreading, incredibly easy. So many of the factual articles were just reports from Pew with incredibly innocuous titles. I suppose that's the point, like how scammers come up with the craziest-sounding scams to root out anyone with even the slightest skepticism, resulting in a pool of gullible whales.
19/20... but frankly speaking, many of them felt too ambiguous to determine. Like, if I was interested in casting judgement, I would have had to actually read the article, and the source it was attached to.
16/20.
I would have liked to know what I missed. Headlines are not always enough to judge, but I guess that’s the point of the test.
Here are the answers if you'd like to check (used R/F for real/fake):
Oh thanks!
I missed Hyatt, Eye Color and Intelligence, Manipulated Stock to Hide Scandals, not sure on the fourth one. I would definitely have checked the source of the headline for those first three to help determine their legitimacy if I encountered them in the wild.
Thanks for sharing! I missed the stock prices, attitudes towards EU, and the non-governmental organizations (the last two felt like propaganda to me), along with a fourth I can't remember.
Seems like a lot of this experiment goes out the window just by telling the person they're being tested on what's fake or not. It primes people to look harder at the headline than most would in their day-to-day lives, I think.
This study is only looking at headlines, not how to sneak lies into people.
When studying things you generally try to remove as many confounding factors as you can in the study design.
In real life things like the publishing media, time of day, your sleepiness and stress levels, your distraction levels etc. all will play a part in how well you distinguish real from fake.
I earned a score of 17/20. I'm quite disappointed in myself, frankly. I missed these three:
I want to improve my BS detection skills, but I'm not sure how to do it.
Editing because I think it's more useful if I include the breakdown of the results:
For me, the marijuana headline wasn’t trying to push a viewpoint and sounded like it was reporting factual information. “Left-wingers” was the indicator for me for the second, plus lying for higher salary doesn’t seem like something that would be influenced by political viewpoints.
I also missed the Hyatt question, I thought it sounded a little like “they’re takin’ our shampoo bottles!” but I should have treated it like the marijuana question: no clear bias indicated in the title so probably factual.
Generally if a headline tries to elicit an emotional response, I am skeptical.
Also, every single one that said "new study" was fake.
To be fair, you only get part of the picture on this test. In the real world you would also be considering the source and hopefully also the contents of the article.
Got 20/20.
Honestly it's probably more of a sign I read too much of the news... the fact that I actually remembered the Morocco one is a bit disturbing.
Also 20/20 and I hadn’t heard that one. I marked it real on the basis that it’s almost inconceivable any nation would not have someone whose job is to address poverty. Doesn’t say anything about how effective they are, what resources and power they have, how seriously anyone takes them, and so on. I just took it as “Morocco has a poverty person in government” and, well, duh.
I got a 19/20 but I'm scared to know what the average is...
Going off this thread and including my score of 20/20, it looks like the average so far is 18.7/20. I think @de_fa may have had a point about the test being easy. I mean, we've got seven perfect scores already.
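For anyone curious how that back-of-the-envelope tally works, here's a minimal sketch in Python. The scores below are illustrative placeholders standing in for the ones reported in a thread like this, not the exact tally:

```python
from statistics import mean

# Placeholder scores standing in for what a thread of commenters might report.
scores = [20, 20, 20, 20, 20, 20, 20, 19, 19, 19, 18, 18, 17, 16, 16, 14]

# Arithmetic mean of the reported scores, out of 20.
avg = mean(scores)
print(f"average: {avg:.1f}/20")  # prints "average: 18.5/20"
```

With a self-selected sample this small, though, the average mostly reflects who felt like posting.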
These comments are likely biased, as people scoring high are more likely to post about it.
Not to mention the demographics of people on this site are highly biased, just due to the nature of the invite system and where those invites are shared. Only very specific types of people will be on this site.
Fair and undoubtedly true, and I really should've realized that. Still, that said, per the source:
...I wouldn't be terribly surprised if this also skewed the results. AI tends to have a certain flavor – however faint – to what it writes that tends to stick around, from what I've seen.
I got 18/20 and I see the average here seems to be around the same. My method was to simply assume any headline trying to push an agenda with emotional manipulation was probably fake.
14/20 although I feel I should have done better
I got 20/20. That supposedly means I'm "more resilient to misinformation than 96% of the US population!" (I wonder how I would compare to my fellow Aussies?)
But it's a very dodgy study. Some of the headlines which are actually fake news could just be overly sensationalised headlines for real news.
For example, "The Government Is Knowingly Spreading Disease Through the Airwaves and Food Supply" could be a legitimate report about some public health measures that the government is failing to take despite scientific reports showing that X and Y are dangerous to consumers - but some over-excited sub-editor decided to glam up the title to make it more clickbait-y.
Also, "Certain Vaccines Are Loaded with Dangerous Chemicals and Toxins" could be a factually true statement. For instance, some vaccines are made using dead virus particles, which could be considered a toxin by an overly pedantic person. Even the egg albumen used to grow some flu vaccines could be considered a dangerous chemical in the context that some people are allergic to eggs (hence the question about egg allergies on consent forms for flu vaccines).
So, I did some meta-guessing and figured the people running the study were probably putting obvious examples of fake news headlines and real news headlines in their test, rather than ambiguous headlines, to make things easier on the people taking the test.
19/20, got the small bottles of shampoo one wrong.
It was really fun to do. It shows how the wording of some of these muddies the pool a bit between actual news and misinformation.
The feel when you are already skeptical of the test itself being real. Did I win? 😩
I got 19/20 and my inner perfectionist is furious that I don’t know which one I got wrong.
Same. The one I got wrong was "Reflecting a Demographic Shift, 109 US Counties Have Become Majority Nonwhite Since 2000." Which while it is a factually correct headline I perceived it as attempting to subtly spread white replacement theory.
Someone posted the answers above and I got the same one wrong for the same reason. It might be technically true, but they’re leaving an awful lot out of that headline (how many counties are there in the US, for example? Must be thousands, right?). To my mind manipulated stats are also fake news, but maybe my definition is incorrect
Edit: 3142 counties in the US, so 109 of those is about 3%
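That edit's arithmetic checks out; as a quick one-liner-style sketch (county count taken from the comment above):

```python
# Approximate number of US counties, per the comment above.
counties_total = 3142
majority_nonwhite = 109  # counties that became majority nonwhite since 2000

share = majority_nonwhite / counties_total * 100
print(f"{share:.1f}% of US counties")  # prints "3.5% of US counties"
```

So "about 3%" is right; 3.5% to one decimal place.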
It's a headline that had I known the source I would have definitely marked it as true. That same headline from Pew Research is much different than if I saw it coming from Fox News.
That’s a really good point. We make decisions based on a lot more than just the words in the headline - the source, as you say, and who shared it (in the case of seeing it shared on social media or whatever) and even what other news is happening at that time all help us decide whether to trust the story
17/20
MIST score: 18/20
100% ability to detect fake news, but 80% ability to detect real news. “A bit skeptical.” I’ll take it.
I got 20/20, but I agree with what beenrak said, this doesn’t seem like a very well designed study.
19/20, better than 90% of Americans. Fucking yikes if that's true; hard to believe 9 in 10 people do worse than that when it all seems so obvious. Then again, nearly 50% of the population elected the orange man, so I'm not surprised.
Don't put too much stock into it.
I got 15/20 simply because some of them may not be true right now but definitely could be if a report appeared.
Consider the "Government Officials Have Manipulated Stock Prices to Hide Scandals" question. It's fake, according to this site, but it's not so outlandish that it can't happen (it's just a crime and people have been jailed for stupider things). Based on that I went "Sure, could be real". In an actual article with that title I would read the article and source before claiming it to be true or false.
We don't really have any information besides the headlines to prove whether or not it's true. On almost all of the dubious claims I would've liked to see the source before blanket-stating true or false. That wasn't possible so I just clicked based on if it could be true. Aside from the shampoo bottles, there wasn't a single one that I would say is definitively true. I simply couldn't verify any one of them.
It's a bad test.
That headline seems incredibly clickbaity to me, which instantly makes me doubt it. Seems to me you're too trusting if anything, maybe you need some more skepticism?
Considering the panama papers and multiple other leaks I'm going to go with a healthy dose of pessimism.
I'm not saying the title was true, there was no way to verify it after all, but none of the titles were verifiable. All I'm saying is that it could be true. If someone has sufficient proof of financial meddling done by government officials I would believe it, I simply don't think it's too far fetched to consider the possibility it's true.
Conversely, the headline "Ebola Virus 'Caused by US Nuclear Weapons Testing', New Study Says" will most likely never be true. It's too outlandish.
It sounds like you think I thought the headline was an absolute true statement, which I never thought it was. But correct me if I'm wrong about that assumption.
It's funny, the test actually said I should be less skeptical.
16/20. But I know that I can be totally gullible at first blush. I usually check and double check information before I take it as truth. I was so tempted to Google some of these headlines to be sure, but that would be cheating.
So I've been curious about this study and started digging into it last night.
There is a more controlled study and journal article detailing the researchers' aim to develop a unified framework to gauge susceptibility to misinformation. I'll admit I'm unable to fully grasp the specifics of the method and findings, but I'm very interested in their objectives and the analysis gap they're trying to fill.
I'm tempted to contact the Dr who conducted the study and try to invite them for a small AMA. Would anyone be interested?
Ok, I'm not sure on this one. I've marked it as fake because I'm pretty sure there isn't a causal link; but I'm also pretty sure you'd find a correlation, even if you didn't look too hard. Brown eyes are prevalent in most non-white countries, I'd think. Those countries are generally less wealthy, and thus have less access to education. Ergo they score lower on IQ tests. Complete BS obviously, since eye color didn't have anything to do with that, but there's a blueprint for a study actually proving that link. The headline wouldn't even be wrong or fake, really.
The story around the test's development and their data is more interesting than the test.
https://www.cam.ac.uk/stories/mist
There's also a corresponding paper that backs up that this is legitimate research: https://link.springer.com/article/10.3758/s13428-023-02124-2
The way this was posted feels a bit like a "Which Harry Potter Character are You?" chain letter gimmick thing. It's a bit more nuanced and worthy of some additional thinking.
That was really interesting, and fun.
20/20 for what it's worth.
I probably read way too much news.
20/20 so I guess if anyone has any questions about the real world I can tell you what the real true true is. /s
https://i.imgur.com/imIUMOw.png
Got 20/20, 1 or 2 of them I could've gone either way.
Yeah same, I felt like a couple weren't fake news in that they were completely made up, but that the headline was misleading. I labeled these suspected misrepresentations as fake news which seems to be what they were looking for.
20/20
I spent a bit of time on each trying to decide if the headline was completely accurate, but then I hit the "fake news" headlines and was like "oh, no, they're really in your face".
After that I went down the line with immediate "is this completely ridiculous?" answers.
Those who believe in conspiracies and don't trust the government will do badly on this test.
20/20
The only one I was unsure of was: One-in-Three Worldwide Lack Confidence in Non-Governmental Organizations
That number just seems really low to me in our modern world, and I also thought, "Hell, I don't trust the Governmental ones, either." Still, went ahead and answered Real, because I figured not everyone was as untrusting of institutional authority as I am. Seriously though, people, you gotta pump those numbers up, those are rookie numbers in this racket.
18/20
Veracity Discernment: 80% (ability to accurately distinguish real news from fake news)
Real News Detection: 90% (ability to correctly identify real news)
Fake News Detection: 90% (ability to correctly identify fake news)
Distrust/Naïvité: 0 (ranges from -10 to +10, overly skeptical to overly gullible)
Not perfect, but good enough!
16/20
Veracity Discernment: 60%
Real News Detection: 90%
Fake News Detection: 70%
Distrust/Naïvité: +2
I dunno if that's good or not haha
📈 Your MIST results: 20/20
Veracity Discernment: 100% (ability to accurately distinguish real news from fake news)
Real News Detection: 100% (ability to correctly identify real news)
Fake News Detection: 100% (ability to correctly identify fake news)
Distrust/Naïvité: 0 (ranges from -10 to +10, overly skeptical to overly gullible)
🔥
Not even going to bother with this. I'm like 10 questions in and half of these take a nuanced topic and word it so vaguely that it's difficult to answer. You'd think Cambridge could come up with something not so poorly thought out.
This is the most ridiculous "test" I have ever seen. It's literally pointless in its current form.
18/20, I'm pretty sure the only two I got "wrong" were the ones about the MIC controlling the media and the politicians manipulating the stock market.
I can't wait for this study to be used in "real" news about how white men in their 20s are super smart and good at detecting fake news!
Anyone crying about this test being bad or whatever really missed the point. No one should get below a 20/20. The "fake news" headlines are blatantly sensationalized. The "real" headlines are measured. If you can't tell why the headlines are accurate or sensationalized, that's the point. You can't tell. That's something you should work on.