28 votes

Stability AI reportedly ran out of cash to pay its bills for rented cloud GPUs

13 comments

  1. [13]
    Eji1700
    Link
    Yeah. We're at the honeymoon / get-'em-hooked stage of the new toy. I suspect basically all AI tools are going to go through the roof in pricing should they see even modest adoption in the next two years.

    12 votes
    1. [11]
      skybrian
      Link Parent
      There seem to be two price points: free (for so-so models) and $20/month for the best ones.

      Stability was weird because they gave their models away, which is why their fans liked them, but it's not much of a business model. I guess we can thank their investors for paying for them?

      It's starting to remind me of the dot-com era.

      15 votes
      1. [10]
        sparksbet
        Link Parent
        There's another price point if you're a business and need assurances that they won't train on your data. But that's not a price point available to the general public, ofc.

        3 votes
        1. [9]
          unkz
          Link Parent
          That pricing is generally available, it's no secret.

          https://openai.com/pricing

          API clients' data isn't used for training. It's actually cheaper than the $20 ChatGPT option: $20 would buy you about 600k tokens, and using that much ChatGPT would be actually disturbing. That's roughly equivalent to consuming all of War and Peace, but chatting with a computer instead.
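
          As a sanity check on that comparison, here's a quick sketch. The three-quarters-of-a-word-per-token rule of thumb and the word count for War and Peace are assumptions, not figures from the thread:

```python
# Rough sanity check on the "600k tokens ~ War and Peace" comparison.
# Assumption: one token is about three quarters of an English word.
TOKENS = 600_000
WAR_AND_PEACE_WORDS = 587_000  # approximate, English translation

words = TOKENS * 3 / 4
print(f"~{words:,.0f} words, about {words / WAR_AND_PEACE_WORDS:.0%} of War and Peace")
```

          So 600k tokens is on the order of one War and Peace, give or take.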

          6 votes
          1. [5]
            sparksbet
            Link Parent
            I work as a data scientist for a company that has contracts with a couple of major generative AI developers (including OpenAI). The type of deal where there's a button titled "contact sales", rather than a subscription you sign up for, is what I was referring to. (This API offering does seem more or less like what we use, but I'm not the one who handled the money or contract side for us, so I can't weigh in on whether it's precisely the same in terms of either privacy or price.)

            3 votes
            1. [4]
              unkz
              Link Parent
              I don’t really know why there is a link to contact sales; you can just use the link at the bottom to “get started” and make an account. You get instant API access and a free $5 credit, no questions asked.

              1 vote
              1. [3]
                sparksbet
                Link Parent
                Generally speaking, "contact sales" options exist for when you want to set up a contract for a whole department or company rather than for an individual. Whether the terms differ or not, it's obviously useful to have an actual contract. This is not particularly strange.

                The comment I replied to said there were two price points. I pointed out a third, and then you linked to a page that verified that a third price point does indeed exist through their API.

                4 votes
                1. [2]
                  unkz
                  Link Parent
                  > But that's not a price point available to the general public, ofc.

                  My point is it is indeed available to the general public.

                  3 votes
                  1. sparksbet
                    Link Parent
                    Yes, well, as I've already said, I'm not sure whether the price negotiated by a company is the same as what's listed on their website, since I'm not the one in charge of that (and if I were, I probably wouldn't be allowed to talk about it on social media lol). I do know we didn't just purchase a publicly available API product; there were talks with the companies we ended up contracting with (OpenAI being only one of them) to ensure they met our requirements and to negotiate details.

                    3 votes
          2. skybrian
            Link Parent
            Long conversations cost a lot more than short ones, since you’re re-sending the chat transcript in each API call. It’s O(n^2), until you reach the length of the context window, anyway. Last year I wrote a VS Code plugin [1] that lets you chat in a Jupyter notebook, and it started getting expensive. That was before GPT 4 Turbo was released, though.

            [1] Bot Typist
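
            A toy illustration of that quadratic growth, assuming fixed-size messages (the numbers here are made up; the shape is the point):

```python
# Each API call re-sends the whole transcript, so the total tokens billed
# over a conversation grow quadratically with the number of turns.
def total_tokens(turns, tokens_per_message=100):
    transcript = 0  # tokens accumulated in the conversation so far
    billed = 0
    for _ in range(turns):
        prompt = transcript + tokens_per_message  # history plus the new user message
        reply = tokens_per_message
        billed += prompt + reply
        transcript = prompt + reply
    return billed

for n in (5, 10, 20):
    print(n, total_tokens(n))  # 3000, 11000, 42000: doubling turns roughly quadruples cost
```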

            2 votes
          3. [2]
            creesch
            Link Parent
            I am not sure your math is entirely right. With chatGPT the token usage increases incrementally the longer a single conversation is.

            To demonstrate, I took a look at a recent conversation I had with chatGPT. This conversation was about a snippet of code I had a short back and forth about. The rough breakdown is something like:

            1. My initial input consisted of 186 words. Roughly 350 tokens.
            2. The initial response of chatGPT was 321 words. Roughly 520 tokens.
            3. I then asked a short follow up question of 10 words. Roughly 15 tokens.
            4. ChatGPT's reply contained 373 words. Roughly 600 tokens.
            5. I had one final question, also 10 words. Roughly 15 tokens.
            6. The final chatGPT reply was 375 words long. Roughly 600 tokens.

            So ignoring the system prompt you'd say that it is a total of 2100 tokens. Which would mean that with 600k tokens (ignoring that response tokens are more expensive) you could have roughly 285 conversations like this.

            Except that doing the same through the API means that every time you want to follow up on a conversation, you need to send the entire conversation with it. So then the token usage becomes something like this:

            1. Initial input: 350
            2. Initial reply: 520
            3. Follow-up input: 350+520+15 = 885
            4. Second reply: 600
            5. Final follow up: 350+520+15+600+15=1500
            6. Final reply: 600

            Which makes it a total token count of 4455, and now gives you a budget of roughly 134 conversations in one month.
            But I have also had back-and-forths with chatGPT that were considerably longer. Assuming similar token counts, one more question from me would have brought the total to 7170 (roughly 83 monthly conversations), and so on.

            To summarize: if you only have short conversations with barely any follow-ups, you are entirely right. If you regularly have longer back-and-forths, then chatGPT is actually a pretty decent deal. With really heavy use you do get rate limited when using GPT-4, but ChatGPT then lets you fall back to GPT-3.5; with the $20 API budget, you'd simply have run out.
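
            The running totals above can be recomputed mechanically. A small sketch, using the rounded token counts from the list:

```python
# Alternating user/assistant token counts from the conversation above.
messages = [350, 520, 15, 600, 15, 600]

chat_total = sum(messages)  # naive total, counting each message once

# Through the API, each request re-sends the full history as input.
api_total = 0
history = 0
for i in range(0, len(messages), 2):
    user, assistant = messages[i], messages[i + 1]
    api_total += history + user + assistant  # prompt (history + question) + reply
    history += user + assistant

print(chat_total, api_total)  # 2100 vs. 4455
```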

            1. unkz
              Link Parent
              Well, it was a ballpark estimate. In my estimate, I was using the more expensive output token cost for all tokens. However, as you pointed out, input tokens are cheaper than output tokens, and in long conversations a significant majority of those tokens are input tokens. $20 buys 2 million input tokens compared to around 600k output tokens, which is the number I was using as a basis. Also, as you point out, ChatGPT can fall back to 3.5-Turbo, where the API would provide 40 million input tokens or 13 million output tokens for $20.
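
              That comparison can be sketched from the per-million-token rates it implies. These rates are reconstructed assumptions based on OpenAI's pricing at the time; check the pricing page for current numbers:

```python
# Assumed $-per-1M-token rates, reconstructed from the figures in the thread.
RATES = {
    "gpt-4-turbo": {"input": 10.00, "output": 30.00},
    "gpt-3.5-turbo": {"input": 0.50, "output": 1.50},
}

def tokens_per_budget(model, kind, dollars=20.0):
    """How many tokens of the given kind the budget buys at the assumed rate."""
    return round(dollars / RATES[model][kind] * 1_000_000)

print(tokens_per_budget("gpt-4-turbo", "input"))     # ~2 million
print(tokens_per_budget("gpt-4-turbo", "output"))    # ~667k, the "around 600k"
print(tokens_per_budget("gpt-3.5-turbo", "input"))   # ~40 million
print(tokens_per_budget("gpt-3.5-turbo", "output"))  # ~13 million
```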

    2. creesch
      Link Parent
      Yeah, we most certainly are in the hype stage, where everyone is throwing as much money at it as possible. It pretty much feels like a lot of the technology bubbles of the past few decades. In a few years we will see which of all the thrown mud actually stuck, and what ended up being sustainable as far as cost goes.

      5 votes