Large fine here, large fine there: when the fines aren't big enough, they are simply a cost of doing business.
And the fines are never more than the company has profited from dishonest or illegal behaviour.
Companies that break the rules or act in anti-consumer ways should face genuine consequences; for example, Google should be broken up for monopolising.
EU fines have been rather significant and are enough to cow the likes of Google.
When the EU fines, you comply.
I agree there need to be lasting consequences for the likes of Google, but these fines aren't just the cost of doing business.
I always feel an uncomfortable cognitive dissonance when trying to think through situations like this.
On the one hand, I feel like if a person or company wants to build a product with addictive qualities (in the context of stuff like infinite scrolling, gambling mechanics, recommendation algorithms), they should be allowed to, without the government interfering and trying to be the "Mom" of their user base. How do you even define what should or shouldn't be allowed when you're talking about ubiquitous functionality like scrolling content? Or sending notifications? Or recommending stuff? Where and how do you draw the line between something like TikTok and something like Apple TV or Google web search? It feels very arbitrary, and prone to abuse by whoever has the power to declare something addictive.
But that's also predicated on the idea that users are aware enough to know these things are addictive, so they can regulate their use to a level that's not unhealthy for them, and that the actual Moms (and Dads) of children will educate their kids and ensure they're using them safely (or not at all). Which I know full well is a ridiculous assumption. We don't live in an ideal fantasy world where every adult and child is rational and has the self-awareness or capacity to know when something they're doing is unhealthy, and/or to stop doing it. Studies show that access to social media at a young age has detrimental effects on those kids (and presumably on society as a whole, since those kids are who will make up society as they get older).
It just feels like we're faced with a choice between two evils: governments with nebulous authority to punish or shut things down at the whims of whoever is in power, hoping that they are incorruptible, actually have our best interests in mind, and are capable of determining the best way to act on that (another ridiculous assumption); or living with the fact that some segment of the population is going to struggle with these addictions, depression, and whatever other negative side effects come with overuse of things like TikTok and other poisonous social media apps.
Those two evils seem unbalanced. Allow an evil thing to persist, or worry that the systems necessary to resist evil may one day be turned to evil ends? Only one of those is a certain surrender to a present evil.
My problem is that it doesn't feel quite so black and white to me: I have a hard time seeing TikTok as unquestionably evil, and I have a hard time not seeing governments with overreaching authority as a really bad idea (and capable of far more harm).
We're not equipped to perceive the scale of the betrayal enacted by "social" networks upon our very humanity. No one wants to admit that so many of us can be reprogrammed like that, not even by...
We're not equipped to perceive the scale of the betrayal enacted by "social" networks upon our very humanity. No one wants to admit that so many of us can be reprogrammed like that, not even by humans but by automated rule-based decision making. It feels impossible. It feels like a weird tinfoil hat conspiracy theory.
But when examining the problem of rising authoritarianism throughout the world, I've been unable to find an explanation that doesn't trace the festering anger, the distorted perspective, and the erosion of empathy among even formerly kind, moral people back to social networks. Message by message, post by post, the algorithms wear us down over days, months, years. It's not even necessarily the content of what you're seeing, but the decision of what you are shown, contrasted with what you aren't. We become permanent members of a global rage mob, easy prey for politicians whose messages resonate with those feelings.
I also dislike authoritarian governments, which is why, if I were an absolute monarch, I would shut down every social network and throw every single one of their owners in prison for life, then abdicate. (And that's a kinder punishment than they deserve.)
If you know a better way for us to collectively impose consequences on bad actors who do harm on a societal scale, I'm all ears, but governments seem like a pretty good solution to me.