On the flipside, an easy-to-use and easy-to-search internet has also contributed to the greatest democratization of knowledge in human history.
Making the web harder to search isn't going to magically teach people critical thinking skills. It will, however, put up a lot of artificial barriers that we've just barely torn down in the past couple of decades and good riddance to them.
I don't know the precise solution to a complex problem like fake news and conspiracy theories, but it's not forcing people to spend ten times as much time to find information. People are lazy, and information is ubiquitous; it's quality information that's scarce. The sort of people who get into antivax conspiracy theories are not going to think, "oh, Google is down, I guess I will go to the university library and ask a researcher to help me locate peer-reviewed articles on vaccines." They'll simply grab whatever information is available that confirms their existing worldview. On the other hand, people who are willing to make an effort to verify information will at best be inconvenienced and at worst will simply settle for "good enough" rather than "good", because even a critical thinker only has a limited amount of time and effort available for any given research task.
On the flipside, an easy-to-use and easy-to-search internet has also contributed to the greatest democratization of knowledge in human history.
Making the web harder to search isn't going to magically teach people critical thinking skills. It will, however, put up a lot of artificial barriers that we've just barely torn down in the past couple of decades and good riddance to them.
Agreed.
I've said it here before: it's not the ease with which information can be gathered that's the problem, it's the ease with which it can be shared. While everyone is entitled to their opinion, they are not entitled to their own facts, and more importantly they are not entitled to a platform with unlimited reach. The barrier to entry should not be in searching; it should be in making your voice heard.
It's also worth noting, and well known in tech circles but perhaps not among the populace at large, that Google tailors results to your search history. This excerpt:
I had no idea what he meant by this, so I typed the following into a search box: “WHO PCR COVID test accuracy” ... and got back a tsunami of conspiracy and antivax propaganda.
...is a perfect example of that. If I google the phrase I have exactly zero conspiracy/antivax results. If you read conspiracy theories, even if you don't buy into them, you get conspiracy theories back.
I think everyone being entitled to their own opinion is part of the problem. I don't care for opinions, I care for well-reasoned arguments that draw on a wider body of knowledge (i.e. that cite other people, articles, studies, etc.). Paul Graham's essay Keep Your Identity Small resonates with me and I try to not have too many opinions, and when I do, I try not to get emotionally invested in them, just like I don't try to define my identity by the set of algorithms I have written to solve a Rubik's cube. You have a better/different algorithm? That's cool, I will use that and learn from it. I don't wanna be glued to my idea-babies.
I agree that technology alone is not a solution here. I think part of the problem is poor research skills. Some of the deeper conspiracy theory / anti-vax / QAnon / etc. bullshit I've seen maintains itself because people think they are 'doing the research' by finding multiple sources and building theories; they just don't understand that you need to build multiple hypotheses, wield Occam's razor, and look for faulty reasoning (cf. Sagan's Baloney Detection Kit). I am reminded of this great article, A Game Designer’s Analysis Of QAnon:
I stared in horror because it all fit so well. It was better and more obvious than the clue I had hidden. I could see it. It was all random chance but I could see the connections that had been made were all completely logical. I had a crude backup plan and I used it quickly before these well-meaning players started tearing apart the basement wall with crowbars looking for clues that did not exist.
These were normal people and their assumptions were normal and logical and completely wrong.
If I google the phrase I have exactly zero conspiracy/antivax results. If you read conspiracy theories, even if you don't buy into them, you get conspiracy theories back.
Which is why I frequently open a private/incognito window to search for a lot of things, from the innocuous to the conspiratorial. I will also frequently use incognito to open articles that people send me. Partly because I'm feeling lazy and don't want to strip off the inevitable tracking URL params, but most importantly so that they don't contaminate my search history. DDG is great, but I still use Google quite a bit. I have no interest in getting search suggestions related to things I've looked at only a few times.
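For what it's worth, stripping those params is easy to script. A minimal sketch in Python; the set of tracking parameters below is an illustrative guess at common ones, not an exhaustive list:

```python
# Minimal sketch: drop common tracking params (utm_*, fbclid, gclid)
# from a URL before opening or sharing it.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

TRACKERS = {"fbclid", "gclid", "igshid", "mc_eid", "mkt_tok"}

def strip_tracking(url: str) -> str:
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k not in TRACKERS and not k.startswith("utm_")]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(strip_tracking("https://example.com/story?id=42&utm_source=feed&fbclid=xyz"))
# -> https://example.com/story?id=42
```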
On the flipside, an easy-to-use and easy-to-search internet has also contributed to the greatest democratization of knowledge in human history.
I'm seeing a lot of my enthusiasm for this kind of thinking crumble.
There's always been wacky information on the internet (one of my first web memories is the "time cube"). But that used to be niche. You looked at that stuff and thought, "yeah, the internet enables all kinds of information, so 0.5% will be stupid". But the people running the misinformation got ambitious. It's probably 10% or 20% of the web now, and it has learned how to disguise itself as reasonable enough to draw people in quicker. It shapes elections and health measures. The internet has become a real and active danger to society. In terms of impact, I believe this has almost become the primary force on the internet.
I don't quite get the article's headline, either. I don't think it has to do with making things harder to use. But it might work if we again make it harder to publish. Like, making a geocities site actually takes effort. Sending people the link actually takes effort. You have to learn basic HTML, you have to "sell" your site before people click it, etc. It's become too easy for a tweet to go viral or for a youtube video to reach millions of people because it blows up in some abstract recommendation algorithm. I wouldn't mind a web in which it becomes way, way harder to reach the google front page or to land in youtube recommendations.

I agree with everything you said and wanted to just put in here "holy cow fuuuuuuuuuuuuccccccckkkkkk recommendation algorithms"
Public libraries have filled the role of democratizing knowledge for over 100 years, and they even have built-in fact checking and curation via educated librarians. People making the argument that an easy internet is necessary for free knowledge are either ill-informed or have ulterior motives. If anything, a difficult internet will strengthen our libraries, which are struggling in the pandemic.
Making the web harder to search isn't going to magically teach people critical thinking skills.
With regard to this, I wonder how much of what we see today is a result of most of America not being exposed to search engines and the Internet from an early age. Even my parents have suffered from bad cases of confirmation bias around things like 5G and vaccines (pre-pandemic). I see much less of this behavior in my peers (although it still exists, particularly with politics).
In my parents I'm seeing something even worse. Pre-pandemic they were pro-green energy, accepted that climate change is real, and thought vaccines are good. And just yesterday they were unironically asking me "WHAT HAPPENED TO GLOBAL WARMING, GUESS IT'S FIXED?!" because it's cold. I had to spend several hours with them, showing them articles and scientific sources explaining averages and the extreme weather associated with climate change. Last week I had to print out and give a bunch of vaccine safety research to my mom when she was considering skipping her covid vaccine despite being high risk. It's freaky that the values they raised me on have somehow been erased from their brains and rewired into something else.
I feel like this is barely starting to get at the problem. I've been talking about this a lot since Trump was elected to office, but the recent siege on the Capitol has only shown how deep the threads run. QAnon may have found some people through poorly served search results, but many of these people were slowly converted through social media platforms and by their inability to tell fact from fiction.
But the problem isn't just in a human's ability to tell fact from fiction - the problem of misinformation is much more sinister. Techniques are only getting more sophisticated. AI can generate news or support the delivery of news. If a human programs an AI to create news that is designed to radicalize people, we have an entirely different problem.
This problem only gets worse when we start thinking about delivery mechanisms and the ability to generate content. We've already seen fake news outlets being created - radio stations which don't exist, websites meant to mimic the appearance of a local newspaper. Take an AI-generated and slanted news source, have it generate a webpage or attribute its output to a news institution that doesn't exist, and you'll fool even more people.
Then you have secondary news sources - eyewitness accounts, twitter posts, videos uploaded to instagram. How many of these can be faked? If I purchase a botnet on the internet, I can have another computer anonymously generate this content to be cited by the fake news article and posted on the internet to deceive others. A botnet can retweet or like to add legitimacy.
Compound that with the recent emergence of deepfaked videos and other technology for making fake things seem real. I hope at this point you can see how this can radically spiral out of control, to the extent that people will start to curate where they source their news - but even that is still susceptible to influence.
I'm not here to be alarmist or to suggest we can't figure a way out of this deep dark hole we are digging, but I think the idea that simply 'making the internet harder' is going to solve the problem is a very surface-level thought. We need to devote a lot more money to research. We need to start holding companies more accountable for what they give a platform to. We need to start teaching everyone some basic skills on how to understand when they can trust a piece of information and when they should question it. We need so much more than I can even begin to comprehend and we needed it yesterday.
I’ve seen it with my own eyes. In 4 short years my parents went from being lightly political independents that leaned conservative into full-blown QAnon believers. They aren’t stupid people; they’re an educated, upper-middle-class suburban couple that went to church every other Sunday and on special holidays, and while we didn’t often agree on politics, my mom used to be pretty apolitical and my dad was only barely a conservative. Now the crazy things they say and believe have consumed their lives and we can’t have any normal conversations anymore. You try to steer them away and you get slapped with “Your generation is completely brainwashed by the deep state/ big tech pedophiles!”

It’s absurd, sad, and incredibly frustrating.
We need to start teaching everyone some basic skills on how to understand when they can trust a piece of information and when they should question it. We need so much more than I can even begin to comprehend and we needed it yesterday.
Critical thinking desperately needs to be taught in primary school.

Many things are taught in primary school. Most of it is forgotten.

But critical thinking isn’t taught in most American elementary schools, nor even in many high schools.
Ray Bradbury has a book that fits this to a scary extent. It’s The Martian Chronicles.
Spoilers: TW suicide
The quick summary of the parts that pertain to this is that there is life on Mars. It is an advanced society and the beings there communicate telepathically. Because of this, they can share their imaginations with each other and create hallucinations that are indistinguishable from reality. The only way to tell the two apart is that a hallucination disappears when the Martian creating it dies. So when a human spaceship lands and the astronauts get off and say they are from Earth, the Martians assume the astronauts are just crazy Martians hallucinating a rocket ship. They go to a Martian psychiatrist, and the psychiatrist shoots them, assuming they are crazy and that once they are dead the hallucinations will disappear. When the men are dead and the rocket still exists, instead of realizing he was wrong and they were actually explorers from another planet, he kills himself because he thinks he has caught whatever disease made the “Martians” hallucinate the rocket.
Hey, I don't disagree, but what is the relevance of your comment to the parent comment or the OP? I think you could in theory make a connection, but you're not making one. It also shifts the conversation away from the technical means being discussed to fix social issues manifesting on the internet. Talking about why something can't be fixed is important, but grandstanding with very broad ideas of systemic change doesn't contribute to progress or discourse.
false context: When genuine content is shared with false contextual information.
misleading content: Misleading use of information to frame an issue or individual.
false connection: When headlines, visuals, or captions don't support the content.
I’d say that these three cover > 90% of what can be considered “mis/malinformation”, and it’s perpetrated by nearly every media source. Not always maliciously, of course.

Who should we be making the internet harder to use for? Companies and institutions, or individuals?
Y’know… I was thinking of this the other day while taking a stroll. Started coming up with a model of trust and…uh…“truthiness.” Suffice it to say, it got kinda complex and circular. Might try to sketch it out for shits. Though, it kinda ended up lookin like what we have now. I.e., a degree from an accredited university signifies a likelihood of knowledge on the subject in which the degree was earned. A higher level of degree and specialization is worth more trust in that particular specialization (though it shouldn’t necessarily indicate more trust in general). A consensus of people with said degrees forming an opinion weights the “truthiness” of that opinion. Etc. Of course, there are also “anti-trust” factors… like monetary interests and stuff.
Anyway, that was just a portion of what I was ponderin.
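To make that weighting concrete, here is a toy sketch in Python. Every field, number, and discount factor in it is a made-up assumption for illustration, not a real scoring model:

```python
# Toy model of "truthiness" weighting: on-topic credentials add trust,
# higher degrees add more, and "anti-trust" factors like monetary
# interest discount it. All weights are arbitrary placeholders.
from dataclasses import dataclass

@dataclass
class Endorser:
    degree_level: int        # 1 = bachelor's, 2 = master's, 3 = doctorate
    on_topic: bool           # is the degree in the claim's field?
    monetary_interest: bool  # an "anti-trust" factor

def truthiness(endorsers: list[Endorser]) -> float:
    score = 0.0
    for e in endorsers:
        weight = float(e.degree_level) if e.on_topic else 0.5
        if e.monetary_interest:
            weight *= 0.25   # heavily discount conflicted endorsements
        score += weight      # consensus: each endorser adds weight
    return score

# Three on-topic doctorates outweigh one conflicted, off-topic voice.
print(truthiness([Endorser(3, True, False)] * 3))  # 9.0
print(truthiness([Endorser(3, False, True)]))      # 0.125
```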