I'm a bit conflicted on the thrust of this article, which is basically an exhortation to scientists to avoid airing doubts and dirty laundry out in public for fear of it being weaponized by climate denialists or anti-vaxxers or whatever.
I'm conflicted because I think she's probably right. But I vividly remember how, throughout the 90s and early 2000s, the economics and political science establishments functionally adopted exactly this posture about "globalization," privatization, and liberalization of trade. We recognize today that this was kind of a disaster: the downsides of those policy choices were severely downplayed. Of course, that gag order was the result of some pretty obvious pecuniary interests, but it was often accompanied by what I perceived as a kind of fear that any critique would open the door to the bad old days of import substitution and failed state-centered industrial policy.
I'm gonna get nervous about any call to keep your doubts to yourself lest the plebs hear. There's something really worrying about this kind of 'closure' of the public sphere; it doesn't say anything good about our ability to make collective decisions as a society.
I struggle with this idea constantly because nearly all scientific journalism I read involves at least some level of misinterpretation. Oftentimes it's not purposefully misconstrued, but people like to jump to conclusions when they aren't intimately familiar with the framework needed to constructively approach scientific studies, or don't know how to fully interpret a paper's findings. We've all encountered this in some fashion: we seem to be curing cancer every week, and every food is now secretly a vector for cancer and poor health, yet paradoxically also the superfood you've never heard of.
However, I feel this is part of a larger struggle around information and trust (or, more accurately, sorting information from disinformation and deciding when to trust a given source) that our society is wrestling with on many levels. Seeing how many people out there will maliciously spread misinformation and disinformation for political gain, to proselytize their own worldview, or to wield power over others, I think we need to focus our efforts on the misinterpretation itself, teaching people how to sort, rank, and weigh sources of information, rather than attempting to change how we present the data at its source.

The intention of the scientists who write these papers is to share information in a scientific context, and understanding just how much weight we should ascribe to their findings is an important part of explaining a paper: the people reading it did not go through the research themselves and, absent that context, do not fully understand all the caveats and considerations. While I believe researchers should do their best to keep others from misinterpreting their data, ultimately, when someone takes something they don't understand and spreads it, the fault lies with that person, not the researchers.
As a society we're starting to grapple with how to train people to recognize when things shouldn't be shared on the internet or when a source is questionable. Obviously we haven't figured out how to train everyone: we have a huge reckoning right now with trying to undo the brainwashing QAnon followers underwent, educating people about climate change, preventing grandma from sharing that spicy article about <insert relevant political or scientific issue here>, reaching out to people headed down the path of inceldom or other problematic fringe groups, and so on. There is a lot of active research on how to tackle these problems, and on the new ones arising from the emergence and dominance of AI and ML in absorbing, spreading, and modeling information.

In my eyes things are likely to get worse before they get better, but censoring the science that gets misinterpreted seems shortsighted and likely to backfire, given how wildly information has been wielded in recent years. After all, what's to stop bad actors from publishing an article claiming that scientists have doubts or dirty laundry, whether it's true or not? The correct path, to me, looks like some form of public education: training people to recognize who to listen to and when, or to interpret information for themselves, even though we've pretty much always been very bad at doing this. If we don't find an appropriate framework for people to absorb data and information, we're going to live in a very chaotic world, and we aren't going to solve any of the problems the author or you bring up.
However, I also recognize that I'm not an expert in this field, and there's likely a lot of merit behind what the author is claiming that I don't yet understand. Perhaps some kind of middle-ground approach, where these conversations happen behind doors that aren't easy to access before the results reach the public, would let the messaging keep its coherence and help prevent misinterpretation at the source without suppressing what is often much-needed discourse.
I think this article boils down to asking scientists to be careful what they share and how they communicate. I like the third recommendation the best:
[...] scientists can consistently highlight correct information and avoid serving as inadvertent amplifiers of flawed information; they can encourage journalists to do the same. Avoid links to news articles or commentaries that highlight poor studies or otherwise use science irresponsibly.
This seems fine as far as it goes. I will add that sharing an article in order to condemn it feeds the outrage machine and results in less emphasis on good information.
I'm not wild about the headline. I think it's good practice to be a little skeptical about what you read. It seems like the worst excesses of anti-vax nonsense and the like come from excessive self-confidence? It's not just doubting others, it's thinking you know the answer and stubbornly sticking with it.
We actually do want to teach people to be skeptical about what they read, to understand that there are gaps and uncertainty, to judge how serious that uncertainty is and how to handle it well. The issue is not doubt itself, it's filling the gaps with ideology because you can't handle doubt.