32 votes

EU says TikTok faces large fine over "addictive design"

19 comments

  1. [3]
    lynxy

    Large fine here, large fine there- when the fines aren't big enough, they are simply a cost of doing business.

    And the fines are never more than the company has profited from dishonest / illegal behaviour.

    Companies that break the rules or act in anti-consumer ways should experience genuine consequences; for example, Google should be broken up for monopolising.

    20 votes
    1. CptBluebear

EU fines have been rather significant and are enough to cow the likes of Google.

      When the EU fines, you comply.

      I agree there need to be lasting consequences for the likes of Google but these fines aren't just the cost of doing business.

      21 votes
    2. preposterous

      Exactly. Also, it took 10 years or more for anything whatsoever to happen. 10 years to rake money in and assert a dominant position on the market.

      Nothing to see here, business is still great and acting against society and people is still profitable.

      2 votes
  2. [15]
    Rudism

    I always feel an uncomfortable cognitive dissonance when trying to think through situations like this.

    On the one hand, I feel like if a person or company wants to build a product with addictive qualities (in the context of stuff like infinite scrolling, gambling mechanisms, recommendation algorithms) then they should be allowed to without government interfering and trying to be the "Mom" of their user base. Like how do you even define what should or shouldn't be allowed if you're talking about ubiquitous functionality like scrolling content? Or sending notifications? Or recommending stuff? Where and how do you draw the line between something like TikTok and something like Apple TV or Google web search? It feels very arbitrary, and prone to being abused by those with the power to say something is or isn't addictive.

But that's also predicated on the idea that users are aware enough to know that these things are addictive so they can regulate their use to a level that's not unhealthy for them, and for the actual Moms (and Dads) of children to educate their kids and ensure they're using them safely (or not at all). Which I know full well is a ridiculous assumption. We don't live in an ideal fantasy world where every adult and child is rational and has the self-awareness or capacity to know when something they're doing is unhealthy and/or to stop doing it. Studies show that access to social media at a young age does have detrimental effects on those kids (and presumably on society as a whole, since those kids are who will make up society as they get older).

It just feels like we're faced with choosing between two evils--governments with the nebulous authority to punish or shut things down at the whims of whoever is in power, hoping that they are incorruptible, actually have our best interests in mind, and are capable of determining the best way to act on that (another ridiculous assumption); or living with the fact that some segment of the population is going to struggle with these addictions, depression, and whatever other negative side effects come with overuse of things like TikTok and other poisonous social media apps.

    12 votes
    1. [6]
      MimicSquid

      Those two evils seem unbalanced. Allow an evil thing to persist, or worry that the systems necessary to resist evil may one day be turned to evil ends? Only one of those is a certain surrender to a present evil.

      17 votes
      1. [5]
        Rudism

        My problem is that it doesn't feel quite so black and white to me--I have a hard time seeing TikTok as unquestionably evil and I have a hard time not seeing governments with overreaching authority as a really bad idea (and capable of far more harm).

        14 votes
        1. Protected

          We're not equipped to perceive the scale of the betrayal enacted by "social" networks upon our very humanity. No one wants to admit that so many of us can be reprogrammed like that, not even by humans but by automated rule-based decision making. It feels impossible. It feels like a weird tinfoil hat conspiracy theory.

          But when examining the problem of rising authoritarianism throughout the world, I've been unable to find an explanation that doesn't ascribe the festering anger, the distorted perspective, the erosion of empathy among even formerly kindly, moral people to social networks. Message by message, post by post, algorithms wear us out over days, months, years. It's not even necessarily the content of what you're seeing, but the decision of what you are shown, contrasted with what you aren't. We become permanent elements of a global rage mob, easy prey for politicians with messages that resonate with those feelings.

          I also dislike authoritarian governments, which is why if I was an absolute monarch I would shut down every social network and throw every single one of their owners in prison for life, then abdicate. (And that's a kinder punishment than they deserve.)

          13 votes
        2. [2]
          Lia

          The annual budgets of the largest tech companies can be ten times the budget of many national governments. From that standpoint alone, companies are capable of more harm. As well, most western governments change every few years as a protective mechanism against the evils that concentrated power tends to come with. In some cases this works better and in other cases not so well (as we can all see wrt the orange maggot's recurring presidency), but companies don't have a similar mechanism at all.

          Obviously most companies are not "unquestionably evil", but when it comes to products and services that systemically generate more profit by being bad for people, those will often be as bad as they are allowed to become. Products and services with addictive components are one example of this. Even when it's not a case of a deliberate aim to destroy the world, the outcome of being mindlessly profit-driven can still be really bad for society.

          9 votes
          1. Rudism

> The annual budgets of the largest tech companies can be ten times the budget of many national governments. From that standpoint alone, companies are capable of more harm.

            I'm not so sure. If this were true, wouldn't we see a much different power dynamic going on in the US than what we're seeing?

            Trump's net worth, across all his companies and investments, prior to becoming president in his first term was maybe $1.5 billion (I think it's hard to estimate because he's not exactly known to be honest about how much he's worth, but that's a number that I see thrown around by more than one source).

Today, in Trump's second term, we see Tim Cook (CEO of a company with hundreds of billions in assets and hundreds of billions in revenue each year) bending the knee and gifting golden trophies in the Oval Office; we see Jeff Bezos (personal net worth well over $200 billion) gutting and neutering a 150-year-old respected newspaper because Trump doesn't like honest journalism and Bezos needs to stay on his good side; we see Elon Musk (on track to become the world's first trillionaire with a T) grovelling at Trump's feet apologizing for mean tweets...

            I agree that companies are capable of harm/evil, but knowing that, how much more harm is the man whom even they fear capable of?

        3. raze2012

> I have a hard time seeing TikTok as unquestionably evil

That's why truth is the best disinfectant. I don't care about shutting down TikTok or YouTube per se. Simply make sure we know how the sausage is made. If it's bad then... well, it's bad. See Facebook in the 2010s. Pretty open and shut to realize that stuff like shadow profiles and rampant selling of data is not something we want to encourage. But we had to find out about it first.

If it's not bad, then it'll open up competition to minimize the odds of monopolistic convergence (which IS bad). Maybe such algorithms could even incentivize people to become more educated as well. I'm not so optimistic, but it's a lens to consider.

          8 votes
    2. [3]
      Fiachra

> governments with the nebulous authority to punish or shut things down at the whims of whoever is in power

I think this is a pretty extreme characterisation of something as mundane as government regulation. Laws have to precisely define what they're outlawing. There are legal standards for how courts establish whether a law has been broken or not. There are trials where evidence has to be presented. There's nothing nebulous about that, and the only "whim" at play is the decision of whether a case is strong enough to prosecute.

      Your point could be easily rewritten to argue against food safety standards on the grounds that it gives government the nebulous authority to shut down any restaurant at will by labelling food "unsafe".

      9 votes
      1. [2]
        Rudism

None of the legal standards, courts, laws, and trials are guaranteed to be based on the best interest of the people. I mean, a decade ago I probably would have been saying the same thing you are in response to what I'm saying now. But watching the government and rule of law in the country where I live--which at one point not too long ago I would have argued was immune to the kind of authoritarian coup it's falling to now--crumble to shambles, the importance of limiting the scope of things that a government has the authority to meddle with is becoming a lot clearer to me.

        Where on that spectrum does protecting citizens from unsafe food or drugs stand? And where does protecting people from addictive smartphone apps stand? For me, the former falls closer to "let them meddle," and the latter maybe less so.

        3 votes
        1. Fiachra

          Sorry to tell you, but if government doesn't decide it then a corporation will. I can't guarantee the law will be executed based on the best interests of the people, but I can guarantee that a corporate decision will not be.

TikTok is left to self-regulate and they made digital heroin for children. You freely admit the voluntary free-market solution is absurd and will never work. So what do I, the individual, do? I frankly have no choice but to fight to keep my democracy functional, transparent, and fair.

My choices are corporate dictatorship or eternal vigilance to preserve representative democracy. There is no scenario where I cut out government and everything is just freedom and liberty: the law of the jungle is that the strong eat the weak. That means a multinational eating me and my community and my family. No thank you.

          14 votes
    3. cheep_cheep

Governments, for better or worse, are tasked with orchestrating large-scale rules and policies to support and manage their constituents. When governments participate in markets, say, for various kinds of tech products, they have the right to embrace, permit, dissuade, or ban products within their jurisdictions. If you look around the world, you have some governments that vehemently fight against some products and dictate rules about how and when they can be used (most notably China, when thinking about tech), and you also have some governments that are more lax or permissive (hello, USA). I feel like this is true with all manner of goods, not just tech, so I don't totally understand what the potential issue is here specifically.

      If a government is completely banning or throttling platforms that permit communication (Iran is pretty notable for this), it can lead to civil unrest and a broad perception that the government is authoritarian, so it's not as though there are zero consequences for crushing public discourse. But I think the complete lack of regulation of tech and social media has led to some pretty vicious misinformation and lies, especially in the States (often fomented by authoritarian regimes, ironically). So I do think there is a gradient between a "completely open and unregulated market" and a "completely closed and state-regulated market", and I think the EU is a leader in refusing to settle for minimal regulation of tech.

Canada talked about banning TikTok before it was cool, especially based on the concern of China spying on Canadians. It doesn't have to mean that the Canadian federal government has nefarious goals against Canadian free speech (although I'm sure you can find such notions on TikTok).

      4 votes
    4. [4]
      Cycloneblaze
      If you know a better way for us to collectively impose consequences on bad actors who do harm on a societal scale, I'm all ears, but governments seem like a pretty good solution to me.

      2 votes
      1. [3]
        Rudism

        In this kind of situation, the better way would be for everyone to recognize that TikTok (or Meta, or X, or whatever) is harmful to society and the consequence should be that TikTok becomes unprofitable because everyone stops using and letting their kids use the app. Nobody is being forced to use it, TikTok is providing a service and people are willingly and happily gobbling it up of their own free will. Why does the government get to step in between and declare their authority over that consensual relationship?

        But like I said I realize it's absurd to expect that to happen, hence my conflicted feelings towards it all.

        5 votes
        1. Fiachra

> Why does the government get to step in between and declare their authority over that consensual relationship?

          1. many of the users in question are children, so "consensual relationship" is probably not legally correct there

          2. sale of pure heroin is a consensual relationship too, yet most people agree with the government intervening in some form because of the incredible harm addiction causes

          3. democratic governments are generally considered to derive their authority from the popular mandate of the voting public. If the majority of people want tiktok regulated, by god the government has the authority to regulate it, subject to constitutional limitations of course

          9 votes
        2. Cycloneblaze

> In this kind of situation, the better way would be for everyone to recognize that TikTok (or Meta, or X, or whatever) is harmful to society and the consequence should be that TikTok becomes unprofitable because everyone stops using and letting their kids use the app.

          When exactly is this going to happen - when does it ever happen? It's completely kneecapping ourselves to insist that the only way we have to bring these bad actors to heel is to hope that everyone, individually, will coordinate against them to bankrupt them. Moreover, it's fundamentally reactive - if this happens it's only going to happen after the harm has already been done. We can do better than that.

> Why does the government get to step in between and declare their authority over that consensual relationship?

          I don't view the government as a third party separate from me or TikTok. I view government as what happens when we decide collectively how we want our society to be structured. I think it is a tool that allows us to do things that would be completely impossible individually or in ad-hoc gatherings, things like imposing consequences on extremely powerful and wealthy corporations, or laying out sets of rules to prevent harm to all of us on a societal scale. (And I don't think one's individual relationship with Bytedance the corporation is anything sacred.)

          Now I know, if we're talking about the US, that it's hard not to adopt such libertarian tendencies when you see how the very significant power of the administration is being weaponised (although my first worry would definitely not be about them regulating social media apps...) But I think that's kind of throwing the baby out with the bathwater - it's clear that the US government has been captured by a bunch of selfish and disorganised fascists, and they're destroying as much state capacity (see mass layoffs across all government agencies, see the abdication of the CDC's mandate to prevent disease) as they are abusing. They are doing that because "the government" is largely made up of reasonable people doing their best to implement sensible policies that benefit society, and that really gets in the way if you want to conduct fascist oppression with impunity. I don't think the problem was that the state capacity was there in the first place. I think the US would be worse off without it. Again, they know this, which is why they are trying to break it.

          6 votes
  3. Paul26

At some point everyone realized smoking cigarettes is bad, and we no longer see ads for them, they are behind a closed cabinet in stores, and the packages have horrible pictures on them. It's about time social media apps followed suit. Treat all the harmful things the same! You can't say alcohol is good and serve it at every restaurant, then in the same sentence say cannabis is the devil and put people in jail for smoking a joint. Why is social media allowed to function as it does while BitTorrent is "stealing"? Oh, because one profits the big corporation while the other steals a fraction of profit from them. Like Protected said in the comment above, throw those bastards in jail. If I steal a song using Napster and get sued, how come they can steal everyone's personal data, spy on all of us 24/7, and get away with it? Double standards and good lawyers, I guess.

    7 votes