We have to realize that the internet is not a mirror of society, it’s a simulation that may or may not mirror society. It’s becoming increasingly clear how simple it is to manipulate this simulation.
It’s really disheartening for me. Seeing any user interaction online and knowing that it could all be phony ruins a lot of what I love about the internet.
The suspension of disbelief makes things way more interesting.
Who cares if a story is made up if all I'm ever doing with that story is reading it for fun?
(obviously there are times when you'd want things to be true because the stakes are much higher than exaggerated anecdotes for comedic effect)
I see where you're coming from. The way I see it, though, I wouldn't wanna make a distinction between fun and high-stakes. If I think a population of people believes X, that might very well subtly influence my own views on whatever that is. Ultimately that probably makes up the bulk (citation needed :P) of how our opinions are influenced online, rather than the high-stakes headlines.
That's the dangerous part. Not everyone is suspending their disbelief, or is even aware that the simulation isn't real.
Yes, and no.
Everyone both knows and doesn't know that the Facebook and Instagram images of people they know, "influencers", and celebrities are curated to show only a small sliver of what's real.
It's not a simulation, but it isn't "real", either.
I'd argue the same thing goes just as much, if not more, for text online.
Fascinating write-up about the tampering with information online via vote and comment manipulation.
I can't say how disappointed I am in the major tech companies for letting it get to this point. With that being said, this is one of the most critical issues of our day.
Who monitors speech on the internet? How do you discern the intent of speech from ignorance, and does it matter?
Had a conversation w/ some folks leading this charge @ some large tech companies. I get really fearful that they're going down the rabbit hole of censoring based on the effect speech has, not the intent of the author. If I intend to lie/defraud/etc., that's a big difference from saying something I believe to be true.
The challenge for these tech companies is that they MUST create echo chambers to keep folks coming back so they can drive revenue. It's not that ignorant people have a platform, it's that these advertising tech companies have created platforms that allow for the mass exploitation of anchoring, framing, and overconfidence bias in our population.
(Not well formed, I apologize, was typing in a rush)
The business model of trying to keep people on your site has backfired massively. When platforms begin competing for your attention they become echo chambers. These are dangerous in their own right, but they become downright weaponized when malicious actors get involved. It turns everyone's personal news diet into everyone's personal propaganda feed. The whole Cambridge Analytica scandal was a real eye-opener for me and I've been worried about this trend ever since. We need to find a way of encouraging diversity of thought online and preventing these types of attacks.
I think it's less about keeping people on the site, and more about maximizing ARPU (average revenue per user). Those are two very different things.
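To make that distinction concrete, here's a minimal sketch with entirely hypothetical numbers: ARPU is just total revenue divided by user count, and the user who spends the most time on a site isn't necessarily the one who drives the most revenue, so the two objectives can point at different users.

```python
# Toy illustration (hypothetical users and numbers): ranking the same users
# by time-on-site versus by revenue shows the two objectives can disagree.

users = [
    # (name, minutes_on_site_per_day, revenue_per_month_usd)
    ("heavy_scroller", 180, 0.50),   # lots of attention, little ad value
    ("casual_buyer",    15, 4.00),   # barely visits, but clicks high-value ads
    ("average_user",    45, 1.20),
]

total_revenue = sum(revenue for _, _, revenue in users)
arpu = total_revenue / len(users)  # ARPU = total revenue / number of users

most_engaged = max(users, key=lambda u: u[1])
most_valuable = max(users, key=lambda u: u[2])

print(f"ARPU: ${arpu:.2f}/month")
print(f"Most engaged user:  {most_engaged[0]}")
print(f"Most valuable user: {most_valuable[0]}")
# A platform optimizing raw attention would cater to heavy_scroller,
# while one optimizing ARPU would cater to casual_buyer.
```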
That's what I was trying to get at. The attention engineers at major tech companies have put a lot of work into ruining their platforms over the years and we are just now waking up to that fact.
I would argue that there is a huuuge difference between kicking out a few extremists who are inciting real-world violence based on falsehoods and creating an echo chamber.
I don't buy any of this slippery slope nonsense.