37 votes

Sam Altman says Meta offered OpenAI staff $100 million bonuses, as Mark Zuckerberg ramps up AI poaching efforts

46 comments

  1. [27]
    OBLIVIATER
    Link

    God I wish I worked in a field that was so incredibly, ridiculously, hilariously overvalued so I could get offered 100 million dollar bonuses (or just like, any raises in the past 5 years?)

    Guess suicide prevention just isn't worth that much to society in comparison to generating fake images of Jesus giving bread to poor people on Facebook...

    40 votes
    1. [12]
      Raspcoffee
      Link Parent

      All the while its long-term profitability is questionable, too... Don't get me started on the ethics of it all.

      It may be a bit weird to say this to you as a software engineer in this context, but yeah - social workers are incredibly undervalued in our society (and emotional intelligence in general). I helped out a lot in online mental health communities in the past, and doing it for a living must be hard at times. Especially when the pay is so often shit in your field. You deserve far more.

      18 votes
      1. [6]
        tauon
        Link Parent

        Profitability being “questionable” is an extremely generous wording, especially in the case of “Open”AI, and I say that as someone who isn’t inherently against the technology, lol.

        10 votes
        1. [4]
          skybrian
          Link Parent

          From that article:

          For OpenAI to hit $11.6 billion of revenue by the end of 2025, it will have to more than triple its revenue.

          Looks like they will actually do it:

          OpenAI hits $10 billion in annual recurring revenue fueled by ChatGPT growth

          I wonder what the cost of serving that traffic was, though?

          6 votes
          1. [2]
            papasquat
            Link Parent

            Well, that's the problem. Currently, it doesn't matter how much revenue they have. They could have $500 billion in revenue, and they still wouldn't be profitable, because they'd still be spending $1T a year to make it. People are willing to use a service that lets them generate funny pictures when it's free. When it costs $50 a month or more, why would most people bother?

            It's like that old joke: "Yeah, we lose money on every sale, but we make up for it on volume!"

            11 votes
            1. skybrian
              Link Parent

              They’re spending plenty of money on R&D, but that doesn’t necessarily mean they lose money on operations with their current prices. So, more sales might not make things worse.

              Since they’re a private company, we don’t get to see their income statements.

              2 votes
          2. tauon
            Link Parent

            Very interesting! I wonder how that (and their liquidity) will change when/if the current “AI” bubble pops.

            Like, at the end of the day, all that revenue growth seems to be doing for them is making them lose more money.

            1 vote
        2. Raspcoffee
          Link Parent

          Oh, I'm well aware. Though I do suspect the financial bubble will be kept up for a while because people are invested in it being the 'next big thing', sometimes purely due to ideology. Personally, I'm pretty curious what'll happen to all the infrastructure afterwards, once its (on-paper) profitability gets heavily reduced or disappears entirely.

          2 votes
      2. [5]
        papasquat
        Link Parent

        I wish we lived in a system that paid people what they actually deserve rather than what the market would bear. Those two things are basically never correlated.

        5 votes
        1. [4]
          stu2b50
          Link Parent

          I think they’re correlated more often than not. I think people overvalue the difficulty of the work, rather than the result of the work, when determining how much someone “deserves”.

          4 votes
          1. [3]
            papasquat
            Link Parent

            The result of the work is compensated in terms of how impactful it is, but not how good it is, which is what people usually mean when they say someone deserves something.

            A teacher would deserve more than a mob boss, and a humanitarian aid worker would deserve more than a private equity broker, for instance.

            People don't get paid for how much good they do in the world, they get paid for how much influence they wield, and unfortunately most very influential people didn't get that way by doing morally good things.

            4 votes
            1. [2]
              stu2b50
              Link Parent

              Good is subjective. I don’t think it’s practical to make “goodness” part of the determination when it’s inherently variable.

              6 votes
              1. papasquat
                Link Parent

                Oh it's definitely not practical. It's just a wish, not a policy proposal.

                4 votes
    2. [13]
      EgoEimi
      Link Parent

      How is it overvalued? The market is pricing in the very long-term value creation of AI. If anything, it's undervalued.

      Human economic output in the past 100 years has hockey sticked straight up thanks to several technological revolutions: the industrial revolution, the green revolution, and the information revolution.

      AI is clearly the next revolution to keep that hockey stick going. The market wants an AI that won't hallucinate; can reason conceptually, visually, spatially, chronologically, etc.; and can interact with the world and do human work, to supplant human workers. Given how things worked out in the web tech boom, many people suspect that it'll be a few-winners-take-all-or-most game. The winners are going to help enable quadrillions of dollars of economic growth over the next century.

      Is 100 million dollars too much? No one is totally sure. Maybe too much, maybe too little. Intellectual labor in the AI space is not easily substitutable. The right mind is worth billions, perhaps trillions if they make the big breakthrough. The wrong one is worth nothing.

      7 votes
      1. [12]
        OBLIVIATER
        (edited )
        Link Parent

        100 million dollars is more money than everyone in the history of my family in America has ever made combined. To give that much wealth to one person for joining your company... Is that not the definition of "overvalued"?

        I can't imagine any one person is so incredibly pivotal to any effort that they deserve that much wealth dropped in their lap.

        Human economic output in the past 100 years has hockey sticked straight up thanks to several technological revolutions: the industrial revolution, the green revolution, and the information revolution.

        Damn, that's crazy. Is that why I can't afford to buy a house, and most Americans can't afford their life-saving medicine? I wonder where all that economic output ended up going... oh yeah, to people like Zuckerberg, who is using it to pay AI engineers 100 million dollars. Sure do love how much economic growth has helped the common man! Let's keep making that number go up forever!!!

        19 votes
        1. [7]
          Minori
          Link Parent

          The average person is massively wealthier than 100 years ago. Human welfare has dramatically improved, and the global poverty rate continues to drop.

          Inequality is bad, but we should be realistic about the state of things. Social media encourages us to catastrophize everything, and it's not healthy or productive.

          10 votes
          1. [6]
            boxer_dogs_dance
            Link Parent

            At the same time we should be aware of trends going forward. There is no guarantee that the prosperity we have now will continue.

            When I took a medieval history class, I was taught that economic growth was hindered by the lords of small territories, each of whom would extract fees for the use of rivers and roads and tariffs on goods, making trade more costly.

            Analogous choices can be made today by corporations with monopolies or control of technological choke points.

            14 votes
            1. [5]
              OBLIVIATER
              (edited )
              Link Parent

              Techno-feudalism seems more and more likely each day. I would definitely argue average quality of life has gone way down in the last 30 years as well (not including things like less pollution, smoking/drinking reduction, etc., as those issues were caused by rich people in the first place).

              6 votes
              1. [4]
                Minori
                Link Parent

                (not including things like less pollution, smoking/drinking reduction, etc., as those issues were caused by rich people in the first place)

                How do you figure? While the wealthy have significantly higher emissions, how are smoking or drinking caused by rich people? Are you referring to corporate advertising?

                Drinking problems and the tragedy of the commons are ancient and predate anything resembling a modern economy.

                1. OBLIVIATER
                  Link Parent

                  It's a well-documented fact that the smoking industry spent billions of dollars on propaganda and even bribed doctors to tell people smoking wasn't harmful (and was even good for you). They also shoved as many addictive chemicals as possible into cigarettes, which they KNEW caused cancer, in order to get people dependent on them as quickly as possible.

                  As for drinking, the alcohol industry wouldn't spend literally billions of dollars a YEAR on marketing if it didn't result in countless more billions in revenue. Alcohol brands spend twice as much on television advertising as the average brand and nearly four times as much on out-of-home advertising.

                  6 votes
                2. [2]
                  MimicSquid
                  Link Parent

                  Drinking problems, yes, but the Tragedy of the Commons isn't settled fact. There are extensive criticisms of the claims laid out by the author.

                  4 votes
                  1. Minori
                    Link Parent

                    The tragedy of the commons doesn't apply in all circumstances, but there are tens of thousands of examples throughout history. I think the Marxists and collectivists who argue it's not real are practically arguing against a strawman. It's not a universal rule of human nature, just a risk that has to be accounted for.

        2. [4]
          papasquat
          Link Parent

          The people running these AI companies have completely drunk the Kool-Aid. They're all in on the promised dream of AI: that it will, in a very short time, result in a self-improving superintelligence which, with the right finesse of alignment, will solve all of humanity's pressing issues while rewarding the people who created it beyond all imagination. Even if I personally believe that idea is far-fetched, I'm hesitant to trust my own judgement over that of the people running these companies.

          If you're operating under that set of assumptions, it makes sense to pay the few people in the world that have the experience and skill set to truly move the needle of AI research forward whatever you can afford to keep them on your team. In their view, not getting these people doesn't just mean failure as a company, it means the potential extinction of the human race.

          In their mind, they're fighting the opening salvos of a sci-fi war that hasn't happened yet.

          Whether they're forward-thinking visionaries or just crazy people sitting on piles of way too much money depends entirely on whether they're right.

          7 votes
          1. [3]
            Requirement
            Link Parent

            while rewarding the people who created it beyond all imagination.

            Maybe my problem with it is that the rewards are happening now, as if the ends have already occurred. The people running the companies are gaining tremendous wealth while producing... functionally nothing yet. It feels like when I reward myself with pizza because I thought about going for a run, except that I don't meaningfully push climate change forward when I do that.
            I think the way you describe it is pretty accurate, though: those who are that all-in on AI are in a death cult.

            2 votes
            1. [2]
              papasquat
              Link Parent

              The reward I'm talking about isn't the money they're currently making. They're making a lot of money right now, but Elon Musk, Bill Gates, Warren Buffett and so on are still making more.

              The ambition is that they become the most powerful humans that have ever lived. As in, superintelligent, functionally immortal guides of the human race. Alexander the Great, Genghis Khan, Napoleon, all wrapped up together and dialed up to infinity. If they're the ones to herald, develop, and somehow control AI superintelligences, they believe they'll become untouchable superhumans leading humanity to a golden age (the second part, at least, is what they say).

              5 votes
              1. boxer_dogs_dance
                Link Parent

                Guides? Overlords. H. G. Wells got it right.

                1 vote
    3. slade
      Link Parent

      Wealth hoarders have it to waste. Why wouldn't they casually throw around amounts like this? Ethics?

      1 vote
  2. [11]
    teaearlgraycold
    Link

    “I think that there’s a lot of people, and Meta will be a new one, that are saying ‘we’re just going to try to copy OpenAI,’” [Sam Altman] added. “That basically never works. You’re always going to where your competitor was, and you don’t build up a culture of learning what it’s like to innovate.”

    This smells of desperation.

    I have to assume this money is as much for having the employees not work for OpenAI as it is to have them at Meta. These people could probably get millions from Meta if they agreed to retire.

    22 votes
    1. [6]
      Greg
      Link Parent

      I did see someone say this’d be a great way for Altman to cause some chaos in the ranks at Meta, whether or not Meta had actually made any offers, because suddenly all the high-level hires are side-eyeing their compensation packages and wondering if their colleagues are getting tens of millions more than they are.

      But then again, I did watch Meta invent diffusion transformers a couple of years ago, go suspiciously quiet on the subject, and then get blindsided along with everyone else when the same guy was poached by OpenAI and created Sora. So it does look like OpenAI are offering something compelling to the right people, and I can’t imagine that’s a great spot for Meta to be in…

      24 votes
      1. [5]
        SloMoMonday
        Link Parent

        This sort of strikes me as a very juvenile move from Altman. Because if I were an investor, my questions are: "Which employees are worth 100mil?", "What are the strategic risks or opportunities that make that retention cost worth it?", "What's the plan if these employees are hit by a bus tomorrow?", "What are the premiums on insurance for an employee of this value?", "Are there legitimate security risks for this person or their family that need to be accounted for?", "What else could that money be used on?", "What can Meta do with this person that we haven't already done?", "Have we fulfilled all the tax obligations before publicizing payments in that amount?", "Where is that liquidity coming from?", "What about employees that are not getting those bonuses - how is morale, and can we afford to cut them loose or lose them to competition?", "What contracts are in place to stop people taking the money and running?", "Do we have a skills transfer and documentation process in place to reduce this level of dependency?"

        And of course there's the bigger question of whether that money is even real, or whether Altman is playing semantic games with stock options, deferred payments and empty promises. If anyone is poaching staff, it's just a matter of picking up the phone and asking. OpenAI is the risky bet in this fight. If AI flops tomorrow, Facebook, Google, MS and Amazon still have everything else they started with. Altman has plenty of retail users and some big API deals, but in the grander scheme, those are kid numbers next to the competition. And not good-enough numbers based on what they are valued at.

        I know big tech throws around stupid numbers, but $100mil means something. It's probably more wealth than most families could ever make over several generations. You don't just instant-EFT that to someone based on a whim after a handshake. Throwing around a number like that is boasting, posturing and/or ignorance. And people like Altman have really lost the benefit of the doubt and, I think, should be treated with a healthy level of suspicion. They claim that their tech will fix everything and that abandoning our IP rights, the environment, privacy, security and countless working people will be worth the benefits. But these systems are forced into everyone's faces and shoehorned into every possible thing. How long before all our problems are solved? And more importantly, how does OpenAI intend to make money in the coming years? Because every other company was able to whip up an LLM division in months. Those kids in China made competitive models in a cave with a box of scraps. The retail and commercial market is so saturated that I'm sure we have AI-powered Jello at this point. So how is OpenAI going to squeeze more blood from this stone?

        21 votes
        1. [4]
          skybrian
          Link Parent

          Yes, it all seems ludicrous, but there are even bigger deals than that.

          Facebook-parent Meta hires 28-year-old Scale AI founder Alexandr Wang as Superintelligence Chief

          Facebook-parent Meta is investing $14.3 billion in data-labeling startup Scale AI, taking a 49% stake in a deal. As part of the agreement, Scale’s 28-year-old co-founder and CEO, Alexandr Wang, will join Meta to lead its new superintelligence unit, a major shift in the tech giant’s AI strategy. Meta confirmed the partnership on Thursday, saying: “We will deepen the work we do together producing data for AI models, and Alexandr Wang will join Meta to work on our superintelligence efforts.” As part of the deal, ScaleAI will be valued at $29 billion. Citing unnamed sources, Reuters reports that the main reason for Meta’s sizable investment was to secure Wang’s leadership. Unlike leaders at other AI labs who come from research backgrounds, Wang is seen as a business-oriented founder, similar to OpenAI’s Sam Altman.

          With a $14.3 billion investment, it is Meta’s second-largest deal after its $19 billion acquisition of WhatsApp. The company does not plan to take a board seat in Scale, the Reuters report said.

          11 votes
          1. [2]
            SloMoMonday
            (edited )
            Link Parent

            I understand that big deals are being made, and Facebook has its own problems after the Metaverse gambit. And maybe they are going to talent with deals that could add up to 100mil in the best of outcomes.

            But Altman is talking with such confidence about actions taken by competitors and his employees, trying to project a position of strength. "Look at us. We're so advanced that Facebook is paying top dollar for our staff. And our guys won't take the money because they believe in our company too much. That is how cool we are, so give us your money so we can hold on to this top talent." It's all posturing. And so is what Facebook is doing with Scale.

            Scale AI has a few dozen small projects under their belt. It's more impressive than 90% of startups and speaks to very clever leadership, networking and unique service offerings. But all of their work put together can't be more than a hundred million in dev and consulting services. Unless I'm missing something big or they are cooking up a revolutionary system in the back, all I can see in their references is limited data entry/sorting improvements (which can be done through regular digital migration and paperless initiatives), edge analytics in the cloud for autonomous machines (a backwards way to manage the large data sets generated by autonomous equipment, which ignores the actual ways data modelling is used for proactive monitoring), a few chatbots and other vague and generic AI features. They have a single hammer and everything is a nail. For 15 billion I'd expect companies capable of enterprise transformation projects, sporting a full suite of integrated industry solution services with a long list of companies on recurring licenses, and SMEs with decades of collective experience, including experience with every edge case.

            Is the company valued at that level because it's demonstrated the capacity to perform at that level, or because we feel like it should be? The story of a college dropout starting his company, delivering the hottest new tech to big-name clients, and then getting that major cash injection that will take them to the next level. The vibes are good.

            But this is the Metaverse guy. The same person who sank several GDPs' worth of money and still failed to make something work. AI is also being pushed by the companies that made Windows 8. And Google Glass. And the Fire Phone. And Hyperloop. And a blockchain currency. And YouTube Red. And Cybertrucks. And the Vision Pro. And Google Plus. And who put the charger under a wireless mouse. And who renamed companies that had global brand recognition. And Full Self-Driving. And whose mismanagement created the misinformation hellscape we live in. They all enjoyed years of consequence-free stupidity because they got by on vibes, monopolies, PR and untaxed fortunes.

            14 votes
            1. Raspcoffee
              Link Parent

              They have a single hammer and everything is a nail. For 15 billion I'd expect companies capable of enterprise transformation projects, sporting a full suite of integrated industry solution services with a long list of companies on recurring licenses, and SMEs with decades of collective experience, including experience with every edge case.

              You've really nailed (heh) it here. If I think about potential R&D investments that could pay off well, semiconductors and other materials research could probably still give us so much more, which makes it wild to consider these numbers being thrown at what is, essentially, a sophisticated text guesser.

              Like, yeah, I suppose some unexpected uses will also pop up, but after a certain point, no matter how good the new technology is, you will oversaturate the investment capital.

              I know I write that down calmly, but I'm actually also getting rather sick of all the time, effort and resources being wasted.

              7 votes
          2. CptBluebear
            Link Parent

            That's simultaneously an acqui-hire and removing the competition from that data pool.

            In the grand scheme of AI it doesn't look too crazy. Which is crazy by itself.

            3 votes
    2. stu2b50
      Link Parent

      Why? For one, you have to take everything Altman says with a grain of salt. He's talking about a direct competitor - it is in his interest to make the attempts at poaching his staff seem as extraordinary as possible.

      Secondly, poaching from competitors is perfectly normal. It's also just called a competitive job market.

      17 votes
    3. [3]
      derekiscool
      Link Parent

      I read on Blind that this money was conditional on successfully developing AGI. If (a big if) true, it's pretty obvious why nobody took it.

      5 votes
      1. [2]
        teaearlgraycold
        Link Parent

        If so, that's a nice admission that even the people at the very top of OpenAI don't think AGI is anywhere close on the horizon. If we're about to have AGI in a few years, like SV would like us to believe, then why not take the $100MM check? It should be easy money.

        3 votes
        1. skybrian
          Link Parent

          You not only have to predict when it happens, you have to pick the company that invents it, or you lose the bet.

          4 votes
  3. skybrian
    Link

    https://archive.is/3w5g8

    Bosworth said in an interview with CNBC's "Closing Bell: Overtime" on Friday that Altman "neglected to mention that he's countering those offers."

    13 votes
  4. [7]
    LetsBeChooms
    Link

    With this amount of money being thrown around, you know there is nothing ethical about these situations. The people, the resources, the results -- this is a high-powered rocket pointed at the sun.

    7 votes
    1. [6]
      Minori
      Link Parent

      Why do large amounts of money mean something is unethical? Is it weird to pay massive sums to the biggest movers and shakers?

      If you had the chance to hire the scientist that would solve obesity, how much would you pay her? Some big problems and opportunities are worth a lot of money.

      6 votes
      1. [2]
        LetsBeChooms
        Link Parent

        Apologies for the delay -- the tiny little "1 comment" thing is waaaaaay off to the top right of my screen and I didn't notice the bugger.

        My original comment was shot from the hip, but I want to emphasize "this amount of money". 100 million dollars as a bonus is an obscene amount of money. Obscene. The fact that these people have that amount to begin with is obscene. People with huge amounts of money don't get there by making ethical choices -- people have to be exploited for money/power to accrue in such ridiculous quantities.

        Can you think of a single situation that involved disgusting amounts of money where people didn't get screwed over?

        And this isn't curing obesity. This is a tech race to max profit.

        They could use a huge chunk of money to do these things ethically, like not pirating data sources, or not tricking users into giving up their data for training purposes. Or not working alongside authoritarian governments to get ahead. No, instead they are playing fast and loose with no concern for those exploited or damaged along the way.

        5 votes
        1. Minori
          Link Parent

          As some other comments mentioned, these bonuses are supposedly contingent on creating true AI like we see in Sci-Fi. If someone managed to create that kind of true AI, the amount of money doesn't seem too crazy.

          In practice, it's a moon shot the company doesn't expect to pay out. It's an unrealistic golden carrot that nobody is likely to get, but if they do get that bag, it'll be due to an absolutely extraordinary scientific advance.

          2 votes
      2. [3]
        h6nry
        Link Parent
        Off topic

        Answering your rhetorical question: if I were a capitalist, I'd hire her instantly and make sure her knowledge stayed hidden for decades. Healing obesity is just not as profitable as treating all the symptoms.

        1 vote
        1. Tannhauser
          Link Parent
          Off topic

          But 100% of the healing market is surely more profitable than some fraction of the treatment market. A real-life case is when Gilead pushed a Hep C cure to the clinic and made great profits off of it prior to AbbVie and Merck getting their drugs approved.

          6 votes
        2. Minori
          (edited )
          Link Parent

          You're thinking as if there's some singular capitalist cabal that controls everything and shares all the profits. In reality, no one company owns and profits from everything.

          If you're running a healthcare startup, you make exactly €0 from obesity-related illnesses and deaths. Suddenly having a cure for a disease that afflicts two-fifths of the global population (3.2 billion people) is a massive opportunity for any organization.

          Most countries leap at any chance to treat obesity because it's such a massive public health burden. About 10% of deaths are from consequences of obesity, so there are major humanitarian motivations too.

          4 votes