15 votes

Why technology favors tyranny

12 comments

  1. deeplyembedded

    if we want to prevent the concentration of all wealth and power in the hands of a small elite, we must regulate the ownership of data. In ancient times, land was the most important asset, so politics was a struggle to control land. In the modern era, machines and factories became more important than land, so political struggles focused on controlling these vital means of production. In the 21st century, data will eclipse both land and machinery as the most important asset, so politics will be a struggle to control data’s flow.

    Harari makes his argument in four parts:

    1. The Growing Fear of Irrelevance

    In the 20th century, the masses revolted against exploitation and sought to translate their vital role in the economy into political power. Now the masses fear irrelevance, and they are frantic to use their remaining political power before it is too late.

    2. A New Useless Class?

    By 2050, a useless class might emerge, the result not only of a shortage of jobs or a lack of relevant education but also of insufficient mental stamina to continue learning new skills.

    3. The Rise of Digital Dictatorships

    AI makes it possible to process enormous amounts of information centrally. In fact, it might make centralized systems far more efficient than diffuse systems, because machine learning works better when the machine has more information to analyze...The main handicap of authoritarian regimes in the 20th century—the desire to concentrate all information and power in one place—may become their decisive advantage in the 21st century.

    4. The Transfer of Authority to Machines

    Even if some societies remain ostensibly democratic, the increasing efficiency of algorithms will still shift more and more authority from individual humans to networked machines. We might willingly give up more and more authority over our lives because we will learn from experience to trust the algorithms more than our own feelings, eventually losing our ability to make many decisions for ourselves.

    12 votes
  2. [6]
    Sodliddesu

    Since this article was from 2018, I found this tidbit interesting:

    He looked at the propaganda posters—which typically depicted coal miners and steelworkers in heroic poses—and saw himself there: “I am in that poster! I am the hero of the future!”

    And then we had COVID with all the 'Healthcare heroes' dreck from the hospital admin staff. I wonder whether, if rewritten today, the WFH push, COVID (lockdowns and 'protests'), and the rise of LLMs (with all the good and bad) would change the outlook in any way.

    10 votes
    1. [5]
      deeplyembedded

      I'm not sure that the years between 2018 and now have done anything other than reinforce his arguments. Under Covid, it feels like the working class learned two key things: commuting and working in an office contribute to their unhappiness, and companies will not hesitate to cut them immediately in the name of profits. One lesson society didn't seem to fully agree on is that a lot of jobs deemed "menial", like healthcare work, are actually crucial to a functioning society, and the people doing them are not paid nearly enough.

      It's been encouraging to see the general movement in favor of working from home and better working conditions, but I also worry that what businesses have taken away from the Covid years is that their employees don't actually need to put in 40 hours in order to get things done; workers will complain and cause disruption in order to improve conditions; and AI is closer to replacing workers than it seemed in 2018. They seem to be taking steps accordingly.

      14 votes
      1. [4]
        Gummy

        The part of this movement towards AI replacing workers that I really don't get is: who's going to buy their goods/services when 95% of the country lives in poverty?

        I'm not in a bad situation by any means, but with the rising cost of food and rent it's very rare for me to spend money on anything that isn't essential. I don't understand how people are continuing to get by. Even making well above minimum wage, it feels impossible to get anywhere right now.

        8 votes
        1. [2]
          deeplyembedded
          (edited)

          I agree, and I think this is one of the significant reasons our current approach to capitalism won't work long term. Businesses have simultaneous incentives to short-change workers and to raise prices, and there are no safeguards looking into the future to see how they are collectively ruining their customer base. I don't even need to trot out a human rights perspective to see how this isn't going to fucking work long term.

          I'm not sure why it is so hard to see this simple truth -- if greed is the engine of our economy, we'd better have some strong regulation to keep it in check. It's just like harnessing nuclear energy for a power plant.

          7 votes
          1. Curiouser

            I agree completely.

            I think the only way we get out of this is to force companies to start paying people for the rights to their data, which could act like a capitalist-approved type of UBI for those displaced from work by tech.

            Personally, I'd rather pay taxes into safety nets, UBI & retraining, but we can't have nice things in the US; it's unpatriotic or something.

            Side note: my phone is old & can't open the article, but I intend to read it when I'm next at my desk.

            2 votes
        2. PopeRigby

          I'm guessing their reasoning is "By the time that becomes a problem, I'll be retired. The new people can deal with that." There seems to be a lot of shortsightedness in business decisions. They're worried about what will make them money now.

          2 votes
  3. [4]
    Pioneer

    Yuval Noah Harari

    We desperately need to get away from this guy. He's a man who knows an awful lot at a high level across so many topics. He bumbled his way into my library and I enjoyed his work, until a friend of mine started picking apart his arguments using their subject-matter expertise. I did the same when he started talking about data and technology during Covid. He knows enough at a high level to make some assumptions, but then usually gets totally lost when he comes up against people who actually know what they're talking about.

    Liberalism reconciled the proletariat with the bourgeoisie, the faithful with atheists, natives with immigrants, and Europeans with Asians by promising everybody a larger slice of the pie.

    Did it?

    I'm fairly sure that if we look at South America, Africa, the Middle East... they've been under Liberalism/Capitalism's bootheel for a lot longer. It didn't unify; it just gave us a collective group to oppress through our ideology.

    9 votes
    1. deeplyembedded

      I don't read his description of liberalism as an endorsement -- I think he is describing how liberal democracy is perceived, and the resulting stability that goes along with this general acceptance. Much of what he is highlighting is a change in how people feel about these political systems now, and how this unrest will increase with advancements in technology.

      On a side note, I find it a little frustrating how every author who writes popular science ends up being maligned by experts. There is a significant lag between what science knows and what the general population understands and accepts. Authors like Harari seek to bridge that gap, and sacrifice a level of detail and nuance in doing so. Meanwhile, most current academic knowledge is locked behind journal paywalls, is siloed within specific disciplines, and is so encumbered by field jargon that it reaches no one.

      In a world where we seek to have informed citizens making reasoned decisions, we need popularizers of science. They will never be as accurate as the lead researchers in a given field, but that seems like an unrealistic expectation.

      7 votes
    2. [2]
      raccoona_nongrata

      This is what I noticed reading the article as well. He doesn't spend much time really discussing the outsized accelerating effect of capitalism on all these undemocratic future symptoms he outlines.

      He sort of touches on it, but if human value is framed in terms of economic output, and we're entering an era where human labor is more or less obsolete, then it warrants re-examining our fundamental assumptions about what the human role in society, and the goal of society itself, ought to be, and how power is distributed to maintain that structure.

      Our concepts about capitalism and even socialism or communism are not sufficient to address this. It will have to be something new.

      4 votes
      1. deeplyembedded

        This is why, in my initial post, I highlighted the quote about how the power struggle was first about land, then about the means of production, and is now about data. There are not a lot of comments so far about what Harari is actually arguing, which is that technology is about to push us into another new era, and our politics and economic systems are not ready for it.

        3 votes
  4. panikode

    Horseshit. Technology created by authoritarian businesses favors tyranny, but technology can also be liberatory. E2E-encrypted tools like Signal certainly don’t favor tyranny.

    For the classic paper in this space, read through Langdon Winner’s “Do Artifacts Have Politics?”

    NOTE: I’m not arguing that no technology is tyrannical, but that the title is greatly overblown. Yeah, current AI training processes tend towards authoritarian, centralized structures, but it doesn’t have to be this way!

    5 votes