Distinguishing between factual and opinion statements in the US news
Link information
- Authors
- Amy Mitchell, Jeffrey Gottfried, Michael Barthel, Nami Sumida
- Published
- Jun 18 2018
- Word count
- 2073 words
In case anyone is wondering, here are the questions:
Factual statements
Opinion statements
Borderline statements
The thing is, I can identify them as factual or opinion based on the wording alone, but I wasn't 100% certain that each of their factual statements was actually true. I haven't seen the study or the source it comes from, and I wonder if some are opinion stated as fact. I wonder how many of the people who responded to this survey chose "opinion" when a statement was a fact they were unsure of.
EDIT: They also seem to have a bit of political bias in which statements they chose as facts. It seems like mostly stuff you'd hear a Democrat talk about, rather than a Republican.
The study says in the readout that polled participants were required to answer regardless of the actual truth of each statement. The researchers are merely trying to gauge whether people can suss out what is a factual claim versus what is an opinion.
Here is the associated quote:
In other words, they are explicitly trying to sidestep your concern about bias in the facts or opinions stated. What's more, some of the claims are designed to be more friendly to different audiences (and Republicans and Democrats each do seem to respond differently to those statements).
It's a fascinating study, really.
I get that they are, but when you get a large sample size like this together, plenty of them are going to ignore, skim, or otherwise not treat that section as they should.
My worry is that people will read too much into the results; if this same study were conducted again with less politically charged questions, I think the outcomes might be quite different.
If you click far enough into this study, the makers have a methodology page where they discuss how this survey was conducted, as any poll worth anything will provide. There you can answer a lot of the questions you might have over what structure the poll might have had and what the survey designers did to account for the concerns you have (like the potential to skim or quickly associate 'source I dis/trust' with fact or opinion).
In this case, 8,066 members of a polling group that was randomly selected (but needed internet access) were invited to participate, of whom 5,035 adults finished the survey. Their results were then weighted to mirror 2016 demographic data provided by the Census Bureau. So that's the sample size we're working with.
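To make the weighting step concrete, here's a minimal sketch of post-stratification, the general idea behind "weighting results to mirror Census demographics": each respondent gets a weight equal to their demographic group's population share divided by its share of the sample. All category names and numbers below are made up for illustration; Pew's actual procedure (raking over several variables at once) is more involved.

```python
from collections import Counter

def poststratify(respondents, population_shares):
    """Weight each respondent by (population share / sample share)
    of their demographic cell, so weighted totals match benchmarks."""
    n = len(respondents)
    sample_counts = Counter(r["cell"] for r in respondents)
    cell_weights = {
        cell: population_shares[cell] / (count / n)
        for cell, count in sample_counts.items()
    }
    return [cell_weights[r["cell"]] for r in respondents]

# Toy sample that over-represents college graduates (60%) relative
# to hypothetical Census benchmarks (35%).
sample = [{"cell": "college"}] * 60 + [{"cell": "no_college"}] * 40
census = {"college": 0.35, "no_college": 0.65}

w = poststratify(sample, census)
# College respondents are weighted down (0.35/0.6), non-college up
# (0.65/0.4), so the weighted demographic mix matches the benchmarks.
```

The point is just that the 5,035 completed surveys aren't tallied raw; each answer counts a bit more or less depending on how over- or under-represented that respondent's demographic group is in the panel.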
On this page you can also find their justification for intentionally using facts that had "ideological appeal," as they put it. You can also see various other tactics they used to account for political lean, such as attributing a statement to one of three news outlets (Fox News, the New York Times, or USA Today) based on their right-leaning, left-leaning, and mixed politically-leaning audiences respectively.
This is a huge issue. Knowledge, for the most part, is ultimately based on trust. You have to trust that the people who do studies and research are reporting accurate information. You have to trust that the people who discuss things like the law are telling the truth. Going even further, you have to trust that your senses are providing you with accurate information. Proving the truth of any of these things is impossible.
I don't find this variant of a post-modernist critique of truth particularly interesting or useful. Theoretically, sure, we could all be in some simulation and wouldn't know the difference. Practically, it has no impact on whether we can measure how much of the US budget goes to certain programs. It would also have no impact on what documentation we could produce to prove or disprove where Obama was born.
I'm particularly disinclined to feel any sympathy toward this view because of Robert Fogelin's idea of "Deep Disagreements," which basically is a point at which participants' rules of a discussion become so separate that there can be no shared understanding for what compelling factual evidence even is. I find encouraging the environments that could create "Deep Disagreements" deeply poisonous to healthy debate, and will actively go out of my way to find shared rules for discussion wherever possible.
One principle I try to convince people to keep is the simple idea that whatever conviction we each hold, we might be wrong, so there is no sense in being 100% certain of a fact anyway, because maybe that fact is wrong.
Edit: One aspect of what you said that does appeal to me is that we have to trust in the institutions/people who did the work to arrive at some statistic. That doesn't necessarily require us to accept something as abstract as being unable to trust our senses, but institutional trust is something where I think we should do a better job of calling out both the successes and the failures. Like this recent report on the FBI investigation into Hillary largely showed that Comey made a lot of unusual choices, but that the totality of the investigation wasn't politically biased. That's good news for the institution that should probably be trumpeted. Good job, gang. Thanks for not being the deep state we feared you to be, but also maybe don't do sketchy shit with your text messages to your close colleagues, because it'll get you removed from investigations, and rightfully so.
I don't think it's useful on its own, but I do think it's important to recognize the importance of belief when it comes to establishing a consensus of truth. It's not enough to throw facts and figures at someone in an attempt to change their mind; unless you can undermine their belief in their reality, you're not going to get anywhere. And perhaps this is pessimistic, but I think it's also important to realize that sometimes you are not going to be able to change someone's view, and that wasting energy on trying is just going to end up harming your cause.
I like this.
I think there is some value in acknowledging that "certainty," as Robert Burton would put it, is effectively a feeling similar to any normal emotion we might feel. What I don't think is particularly useful is rejecting that we could ever arrive at shared truths because we can't prove them at the most fundamental layers a philosopher might enjoy delving down to. Undermining the rules for shared consensus-making, particularly the idea that evidence can be gathered to prove or disprove a claim, makes it impossible for anyone to discuss "fact," and that makes it virtually impossible to ever convince anyone of anything when we dive that deep.
Because persuasion stops being a possibility if you go too far with that post-modernist critique, and because rhetoric and the art of communication and understanding is much more important to me than being certain, that is a theoretical road I do not tread.
Instead what I prefer to do is focus on people's value priorities in discussions. I am not so much concerned with beliefs as an unchangeable starting point for discussion, but I do think it's hugely important to figure out how people have learned to prioritize the "good concepts" (i.e., values) in specific situations.
I'm concerned that you see persuasion as necessarily requiring undermining someone's belief. A lot of times, it just comes down to adding something to their perspective that they hadn't had there before, rather than forcing them to outright reject their currently held beliefs (which is almost never going to happen all at once anyway, so what are you doing with that aim?). People typically change bit by bit, not wholesale, and certainly not because some energetic rhetorician hit them with the good word. Personal belief changes of any gravity are introspective affairs that usually happen outside the reach of rhetoricians.
This is actually an opinion, given how "significant" isn't specific. What constitutes "significant" might be different for different people.
Anyway, if Reddit has taught me anything, it's that the difference between facts and opinions goes far over the heads of a scary number of people.
One key piece missing from the Methodology is the rationale by which respondents were classified as "High political awareness, Very digitally savvy, A lot of trust in national news organizations, and Very interested in news."
That is a fact.
I for one wouldn't trust this as a scientific survey.
That is my opinion.
That information is included under "Terminology" in section 2, "The ability to classify statements as factual or opinion varies widely based on political awareness, digital savviness and trust in news media."
I'd still be less inclined to say this seems like circular reasoning if Pew presented an actual questionnaire disclosing how they determined these qualities.
I'm a bit confused. They did disclose the questions they used to determine things like political awareness or tech savviness, and they described how they used the answers to put people in categories such as high political awareness.* For example:
Are you looking for something different?
*Most of the questions were described in the section I linked, and the rest can be found in Appendix C, the Topline Questionnaire.
Thanks. That was exactly what I was looking for.