71 votes

I hope you don't use generative AI - an essay about my experience offering an open-source tool

95 comments

  1. [63]
    Gourd
    (edited )
    Link

I understand the temptation to hate everything AI generated, but like it or not, this is the future of development. It's already happening at scale, and it's not going to stop. It's just another layer of abstraction in the software engineering process. Just like how we aren't writing assembly anymore. It's totally reasonable to say that an experienced engineer can produce a virtually identical product with AI that they could have written manually, but much faster. This is not the case with art.

    Artists have an understandable bone to pick with genAI, and I think most people would agree that AI is not capable of producing art. AI slop is a major fucking annoyance and it's doing real harm to real people. But things are different in the software industry. It's either adapt or get left behind.

    And of course there are major negative implications. It's much easier for unskilled devs to ship broken or dangerous code. Or crash all of AWS for 12 hours like back in December*. Who knows how this will impact the job market or affect a junior dev's ability to land a job. Whether it's good or bad, it's here to stay.

    Editing to say that I love this site! I want to thank everyone for the civil and high-quality conversation. Especially involving such a controversial topic that we all clearly have strong opinions about.

    34 votes
    1. [23]
      sparksbet
      Link Parent

      I think most people would agree that AI is not capable of producing art

      There isn't really a coherent definition of art that excludes art made by a human using generative AI that doesn't exclude other things that are widely agreed to be art (like, for instance, collage). A lot of AI art is bad art, but a lot of human art is also bad art. A definition of art that excludes AI art is arbitrary and doesn't even serve to help artists, as the negative effects of generative AI art on artists are almost entirely due to the economics and capitalistic side of making art for a living, not some abstract idea of whether an image "counts" as art.

      I think the economic problems with modern AI hurt developers less than they do artists, but in both cases I don't think there's anything inherently wrong with using generative AI to help you create, as long as you're thoughtful about what you put into the world and aren't sloppy or reckless about the quality/safety/accuracy of it. People can do the opposite of this with or without AI, but AI makes it easier to skip this step for many people.

      22 votes
      1. [19]
        papasquat
        Link Parent

        There isn't really a coherent definition of art that excludes art made by a human using generative AI that doesn't exclude other things that are widely agreed to be art (like, for instance, collage).

        Sure there is. Art is an intentional expression of human experience or emotion.

        AI can't produce art not because it's not technically capable of producing similar imagery, but in the same way that a malfunctioning printer spitting out pages of nonsense isn't art.

        It's not intentional, because AI doesn't have intentions, and it's not an expression of human experience or emotion, because AI isn't human, it doesn't have experiences, and it doesn't have emotion.

        The more human you put into the loop of AI generated imagery though, the more art it becomes.

        17 votes
        1. [4]
          Greg
          Link Parent

          The tricky part - and I think this ties heavily back to what the article is saying as well - is the way a thoughtful take ends up morphing first into a damagingly oversimplified statement, and then from there into a rallying cry.

          It's totally fair to say that art as a form of communication and expression starts being lost if the tool takes away too many decisions from the creator. It's kinda sorta reasonable to distil that down to "AI is not capable of producing art" if you're using it as a shorthand for "a simplistic prompt alone is not enough intentionality to really express oneself".

          But as soon as you use that soundbite without the full context, people take it as a starting point and interpret it from the bottom up as "creative work is not art if it uses AI", as if any use of a given tool inherently taints or strips away the creativity in and of itself. And then from there you get the judgmental "I hope you don't use generative AI", as a shorthand for the shorthand to say "your work is inferior if you chose to use this tool, regardless of circumstance, nuance, or actual outcome".

          And the frustrating part is that I'm saying all this as someone who doesn't use coding LLMs (they aren't well suited to my workflow and the outcomes I need), and does get pissed off when Spotify tries to slip generative music into my feed so they can save on licensing fees. I get why there's a negative reaction to these things. I just abhor the lack of nuance and the inverted reasoning that ends up being shouted the loudest once these things break containment into the wider conversation.

          19 votes
          1. [3]
            papasquat
            Link Parent

            Yes, and I think a lot of people's blind hatred of AI has totally obscured the actual reasons why they've developed that hatred in the first place.

            Most of the people in the OP seem to be artists who have the opinion that AI is bad because it's replacing real human art. I agree that that is bad.

            The issue is that the OP did not intend to make art, and is not making his tools available as pieces of art. He's offering them because they're useful. They serve a purely utilitarian function.

            It would be one thing if he posted the source code saying "I set out to make this code elegant and beautiful and to make a statement about x". Some people do, but most people do not produce code with that intention. They produce it to solve a problem; i.e., they're not setting out to make art. They're setting out to make a tool.

            A tool has an entirely different value proposition than art. Art is intended to evoke an emotion, to communicate something, or to share a piece of the author. Tools are intended to do a job as efficiently and as well as possible.

            So then, tools shouldn't be judged on the basis of art, just as art shouldn't be judged on the basis of tools.

            If you're making a tool, I have no issue with using as much AI as possible as long as you're being responsible about reviewing that code, since AI still has a lot of practical problems regarding context and security.

            If you're making art, I will respect it less and be less interested in it the more of the technical aspects and decision making you've outsourced to AI. For instance, I have no desire to go to a museum filled with diffusion model generated art, no matter how carefully it's prompted and selected.

            I mean, hell, I respect the heck out of digital art, I know it's very time consuming and requires a great deal of skill to create. I still have no desire to even see that in an art museum.

            I, and I think most people if they really think about it, judge those two things by completely different standards.

            3 votes
            1. [2]
              Greg
              (edited )
              Link Parent

              Yeah, I agree that making tools and making art are very distinct goals, to be judged differently - but I do think there's a lot of overlap in the attitudes about both, partially because of the loss of nuance in a lot of the discussion around the tech in general.

              If you're making art, I will respect it less and be less interested in it the more of the technical aspects and decision making you've outsourced to AI

              I think that's reasonably fair, but I do also think it's important not to fall into the trap of assuming "AI" means "typed a simple prompt into an off the shelf, hosted commercial model". The problem here is that 99% of the output by volume is that, but again it means anything with even a hint of the term "AI" attached gets the backlash as if it were only that even if it's in the 1% where the ML models are a tool in the workflow like any other.

              Putting it another way, I'd hope you don't lose respect for photography as an art form just because it outsources many of the technical decisions and skills that painting requires. To run with the analogy, I'd bet that a lot of photos taken today aren't really intended or thought of as art by the people creating them either - they're selfies to show you're having fun, or snapshots to remember a moment, or whatever, and they absolutely have value in that sense, but they were never intended to be an artistic expression in and of themselves - but I don't think that detracts from the artistry in photography as a medium when it is used expressively.

              I probably wouldn't want to bother with a gallery exhibition of carefully prompted diffusion model art either, although I can think of a few ways it could be creatively interesting even then, but I'd absolutely be interested in an exhibition that explores diffusion generation itself as an art form - the nuances of how the models themselves evolve and change in response to input, in response to training, and in response to interacting with their parameters directly and mathematically. That's just an off the cuff idea, it might not have great artistic merit or be interesting to everyone, but what I'm trying to focus on is that when everyone says "AI" and means "text prompted output from pre-trained commercial models" it's like saying "food" and meaning "McDonald's".

              [Edit: A concrete example of the kind of art I'm talking about when I say the technology can be used in genuinely creative ways. And another piece by the same artist just because I really like it!]

              I mean, hell, I respect the heck out of digital art, I know it's very time consuming and requires a great deal of skill to create. I still have no desire to even see that in an art museum.

              That's a surprising generalisation to me - obviously you like what you like, but I've seen some phenomenal digital art over the years! I've also seen a lot of very analogue art that absolutely used (non-AI) digital technologies in the workflow, that I think we'd be much worse off for the lack of if there were an "anti digital" backlash to art that used a computer anywhere along the chain, in the same way that we see "anti AI" backlash to a game that had a few years-old generative textures left in.


              his tools

              Side note, OP uses they/them pronouns in their about page

              7 votes
              1. delphi
                Link Parent

                (I have nothing to add, you're spot on, just wanted to confirm that I do in fact use they/them pronouns exclusively)

                5 votes
        2. [8]
          rogue_cricket
          Link Parent

          I think I mostly agree with this, honestly, although some might say that a prompt is some expression of humanity.

          A lot of the time my beef with someone saying "I used AI to produce this art" is actually the I, though.

          Like - let's say I find a website that allows me to create custom engraved wood signs. I write "LIVE LAUGH LOVE" in the request area, I pick a font, and then I send off my request to some workshop to get it laser cut into a board of wood. Many humans and tools alike are involved in the process of creating the product: there's me, who initiated the request. There's the creator of the font, of course. There's the maker of the application that converts the vector font into instructions for a CNC machine, there's the operator running the machine, and there's the CNC machine itself.

          Is my LIVE LAUGH LOVE sign art? I am leaning on the side of "no" (though I could be convinced), but what I'm more sure of is that in this process I would not be its artist even though I initiated the process of its production. Certainly the programmer that wrote the application that converted the font is not the artist of the result and certainly the CNC operator isn't either and the CNC machine isn't. The designer of the font is an artist, but their involvement in the process of the final product is removed from it by time, and all everyone else in this stack did was leverage their existing work.

          So, that's how I see generated AI content intended to be aesthetically pleasing or entertaining. It's a laser-cut LIVE LAUGH LOVE sign. It fills visual space. Maybe there's some art somewhere in the process, but the artists are the people who created the original material. There is no artistic contribution from the tool itself, from the programmer of the tool, or from the person who pressed the "MAKE IMAGE" button.

          7 votes
          1. [7]
            papasquat
            Link Parent

            I think the prompt is an expression of humanity. When someone intentionally prompts something and picks a result to post, it's slightly art (the whole thing is a spectrum in my opinion).

            When an AI agent just automatically prompts itself and posts slop based on what is trending, it's not even a little bit art.

            Similarly, I think your live laugh love sign is slightly art as well. There were lots of decisions involved, from deciding to do the sign in the first place, to that specific phrase, to the font choice, and so on. If a human wasn't in the loop though, and some automated workflow decided to make it because those signs were selling well, it's decidedly not art.

            Basically for me, the more choices made by a human = the more "art" it is.

            5 votes
            1. [6]
              avirse
              Link Parent

              The issue I have with that take is that if you replaced the LLM with a human artist and gave them the same prompts and revisions, the person giving the prompts would be said to be commissioning an artist, not creating art themselves. I don't see how their relationship to the end product is meaningfully changed by the fact that it's not a human they are commissioning.

              3 votes
              1. [5]
                papasquat
                Link Parent

                I don't think there are many artists that would argue that their patron doesn't have a hand in creating their art. The Creation of Adam wouldn't be what it is without the Sistine Chapel and the Catholic Church.

                We consider Michelangelo to be the artist because he's the one who made the vast majority of the decisions about the art. He decided the subject, the composition, the pigments, where each brush stroke should go, and so forth. Each of those decisions was informed by his life experience and emotions at the time.

                An LLM doesn't have emotions or experiences. The only human in the loop that does is the one prompting it. Thus, they're the "artist". (Albeit to a very small degree)

                1. [4]
                  avirse
                  Link Parent

                  The patron has a hand, sure, but if that patron claimed to be an artist or to have created the art no one would take them seriously. Why is that different with an LLM when they have the exact same level of involvement? Why does there need to be "an artist" involved at all?

                  An LLM doesn't have emotions, but it does have experiences. It has vast data banks of experiences that it uses to create its output.

                  3 votes
                  1. [2]
                    papasquat
                    Link Parent

                    The patron has a hand, sure, but if that patron claimed to be an artist or to have created the art

                    That's because there was someone else who had a vastly more impact in the art; the actual artist.

                    No different than someone contracted to design a single window in an office of the Empire State Building pointing to the building and going "I designed that!".

                    This isn't really uncommon or new either. Large art installations have dozens or even hundreds of people that assist the artist in doing the grunt work of laying the tile, painting pieces, moving heavy equipment, installing stuff. The artist still gets recognized as the artist though.

                    The only difference here is that an LLM is a tool, not a person, and thus is no more an artist than a paintbrush, or a camera, or a piano, or Adobe Photoshop is. The human using it is the person who made all the artistic decisions, so they're the artist, even if that role is extremely minimal.

                    And as @Drynyn noted, LLMs don't have experiences. They can't, because experience requires consciousness. They can, at best, parrot written accounts of experiences that people have had, but they have none of their own.

                    4 votes
                    1. avirse
                      Link Parent

                      That still presupposes both that there must be an artist involved and that that artist must be human, which is not self-evident.

                  2. Drynyn
                    Link Parent

                    LLMs don't have experiences, they have statistical correlations. Their output is normative and not representative of the breadth of human experience.

                    1 vote
        3. [5]
          sparksbet
          (edited )
          Link Parent

          Even operating under the definition of art you give, when people discuss AI art, they are always discussing AI art that has a human in the loop somewhere, because generative AI is never autonomously generating art without a prompt. I don't think there's a coherent way to define a threshold of creativity involved by the human doing the prompting at which it magically becomes art, nor do I think there's a coherent definition of art that excludes the human expression involved in using generative AI in this way but includes other widely recognized forms of art. Any modicum of human creativity involved entails that it is art imo, even if that creativity is weak af and the result is generic, derivative slop -- otherwise our definition of art would exclude huge swaths of shitty, generic art made by human beings without AI. And I don't think there's a sensible way to define creativity or intentionality that doesn't include the use of prompts to get AI to generate an image as involving a non-zero amount thereof, even if in many cases it's a small amount of thought being put into it.

          If AI were capable of independently making art without being prompted, I think I'd consider your arguments more relevant to the definition. But if AI were capable of autonomously choosing to do so without a prompt, I think some of your other arguments, like a lack of intentionality, would cease to be true of AI art anymore. But that's a hypothetical anyway.

          6 votes
          1. [4]
            papasquat
            Link Parent

            they are always discussing AI art that has a human in the loop somewhere, because generative AI is never autonomously generating art without a prompt.

            Well that's not really true, is it? Right now, social media is absolutely infested by slop that is likely automatically churned out by agentic workflows. A web scraper looks for whatever is trending, sends that info to an LLM, which generates a prompt for a diffusion model, and the output gets posted. As long as the engagement nets you more money than the token costs, you make money. There are literally hundreds of thousands of accounts like this. I imagine the account creation is automated in most cases as well.
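            That kind of pipeline takes almost no code. A minimal sketch of the loop described above, where every function is an invented stand-in (none of these are real scraper, LLM, or image-model APIs):

            ```python
            # Hypothetical sketch of an automated "slop" pipeline.
            # Every name here is a stand-in invented for illustration.

            def top_trending_topic():
                # Stand-in for a web scraper polling a trends feed.
                return "cute capybara"

            def llm_write_prompt(topic):
                # Stand-in for an LLM call that turns a topic into an image prompt.
                return f"photorealistic {topic}, dramatic lighting, trending style"

            def generate_image(prompt):
                # Stand-in for a diffusion-model call; returns fake image bytes.
                return f"<image bytes for: {prompt}>"

            def post(image, caption):
                # Stand-in for a social-media API call.
                return {"posted": True, "caption": caption}

            def run_once():
                topic = top_trending_topic()       # 1. find what's popular
                prompt = llm_write_prompt(topic)   # 2. turn it into a prompt
                image = generate_image(prompt)     # 3. generate the image
                return post(image, caption=topic)  # 4. post it, no human in the loop
            ```

            Wrap `run_once()` in a scheduler across a few thousand accounts and you have the whole operation: no human ever looks at the output.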

            In my view, content like that is not art by any means. It's not even slightly art.

            Something that someone intentionally prompted and selected at least has some human decision making behind it, and I would say it's at least a little bit art, but in the way that my drinking water has a little bit of gasoline in it.

            7 votes
            1. [3]
              sparksbet
              Link Parent

              I think there are arguments for it being able to be art even with agentic workflows under certain circumstances, since these could at least in theory be set up with intentionality by a human to express more interesting things than "whatever is popular". I could also argue that there's plenty of generic art made to just imitate what's popular online coming from human beings without AI and always has been. But honestly, getting further into the weeds on that isn't really relevant to the point I'm trying to make.

              I'm not interested in wishy-washy "more or less" definitions of art here -- I am opposing the assertion that AI generated images cannot be art, which is absolutely what at least some people are saying when they say "AI art isn't art." My point is that there is no way to draw a clean line between AI and human work in a way that classifies all AI art as "not art" and all human art made without AI as "art" without a wildly incoherent and/or inconsistent definition of "art" to begin with, and any amount of "art" in AI art is enough for that assertion.

              4 votes
              1. [2]
                papasquat
                Link Parent

                Well, I think the main issue is that "AI generated" doesn't really mean anything. That term is also a sliding spectrum with no real definition. You have things that everyone would agree are "AI generated", like the agentic workflow previously mentioned, but further along the spectrum you have things like posting the first image that comes back from your prompt, carefully selecting a prompted image, or taking a prompted image and then editing it in Photoshop. At the far end of that spectrum you have things like the magic eraser tool, fuzzy selection, automated smoothing in 3d modeling software, and so on.

                If someone says "AI art isn't art", it's just not at all clear what they really mean.

                If they're talking about images generated with no human in the loop, I'd wholeheartedly agree with them.

                Anything less than that though... yeah I'd agree with you. You can't come up with a consistent definition of art that makes sense that includes human art but excludes "ai assisted" art.

                3 votes
                1. sparksbet
                  Link Parent

                  Ultimately I think most people who say things like "AI art isn't art" are reacting to their material conditions and the effects generative AI has on the labor market, which they're right to worry about. As a result, though, any philosophical arguments tend to be post-hoc justifications for what is ultimately not a position they came to because of anything they believe about art in general in the first place. There are exceptions to this who do think more deeply about it, but I can say from experience that there are a lot of people who arrive at "AI art isn't art" this way and they are very annoying on Tumblr.

                  5 votes
        4. Lia
          Link Parent

          Art is an intentional expression of human experience or emotion.

          While this is a mandatory component in the definition of art, it's not enough in itself. Not every human being is an artist, just like not everyone is a philosopher or a scientist. Every human is inherently expressive though, and self-expression can be beautiful, valuable and interesting without being art.

          I'm an artist and I often feel like I'm in a very small minority here on Tildes (if not completely solitary) with my opinions about art. I'm considering writing a standalone post about the definition but I'm feeling like it'll be a long essay and I don't have time for long essays right now. One day!

          4 votes
      2. [3]
        Gourd
        Link Parent

        It's definitely not black and white. As far as I know, there's no widely agreed upon definition of art, period.

        3 votes
        1. [2]
          sparksbet
          Link Parent

          It's certainly something that has had a lot of words spilled over it over the years! I just don't think it's really true that most people would agree AI art isn't art. Hell, the fact that people are constantly calling it AI art even sometimes when arguing against its being art is telling.

          4 votes
          1. Nsutdwa
            Link Parent

            The line in the OP blog post that most grated, I think, was "by definition machines can't make art". It's a strong argument, but it's certainly not a given. I think the blogger should have said "by my definition...", but that lacks rhetorical punch, of course.

            1 vote
    2. [20]
      PendingKetchup
      Link Parent

      like it or not, this is the future of development. It's already happening at scale, and it's not going to stop.

      I think there's a future for generative models in computer programming, but I don't understand why people think in this teleological way. The technology used to do work is a political decision, and many times in the past technologies that were "the" future turned out not to be, because people stopped liking them.

      Once upon a time, The Future of Development was SOAP web services. Or managed code. Or whatever the hell a "Java Bean" is. Or Web 2.0 "mashups". Just yesterday it was "blockchain".

      These technologies are also "here to stay" in that they have not ceased to exist, but one can lead a full and exciting life as a developer in the modern age without ever actually having to work with a SOAP web service.

      17 votes
      1. [19]
        Gourd
        Link Parent

        Oh god, I still have to regularly work with SOAP and it's a nightmare lol. I've been a bit glib here, but just look at how quickly Stack Overflow is losing traffic. It seems clear that a huge percentage of developers are now regularly using AI in their workflow, and only increasingly so. I get the comparison to blockchain (especially with regards to the hype around it), but LLMs are a much more generally applicable technology.

        You are right though, the future is very uncertain. Model collapse, for instance, is just one of many very real problems that could change everything.

        5 votes
        1. [14]
          OBLIVIATER
          Link Parent

          I think the difference from something like the blockchain fad is that genAI has already proven itself to be a useful tool in programming. Obviously the extent of its usefulness is most likely exaggerated by some people, but it's undeniable that it can be a useful tool for a competent programmer.

          I don't really believe it'll ever be perfect or anything, but it does a good job when used responsibly by someone who knows what they're doing.

          8 votes
          1. [12]
            sandaltree
            (edited )
            Link Parent
            Proven how? Can you point me to those peer-reviewed studies? Here’s one where they found the opposite: https://metr.org/blog/2025-07-10-early-2025-ai-experienced-os-dev-study/ Devs were in their...

            genAI has already proven itself to be a useful tool in programming

            Proven how? Can you point me to those peer-reviewed studies?

            Here’s one where they found the opposite: https://metr.org/blog/2025-07-10-early-2025-ai-experienced-os-dev-study/

            Devs believed they were being more efficient, but when measured it was actually the opposite in the long run.

            7 votes
            1. vord
              Link Parent
              Opus 4.6 is genuinely capable of spitting out a bunch of reasonably functioning code for non-critical applications. Best I can tell that was a generational leap that may have tipped the scales...

              Opus 4.6 (admittedly an obscenely expensive model) is genuinely capable of spitting out a bunch of reasonably functioning code for non-critical applications. Best I can tell that was a generational leap that may have tipped the scales since that study came out. It's definitely better than falling back to the models that were available in July.

              The problem of putting half-baked tech demos out as full products never goes away, but now we can churn them out faster... yay?

              4 votes
            2. OBLIVIATER
              Link Parent
              I mean I haven't extensively researched this, I was more going off of anecdotal evidence I've heard from people I know and people here on Tildes. I'm not trying to argue with you or anything, but...

              I mean I haven't extensively researched this, I was more going off of anecdotal evidence I've heard from people I know and people here on Tildes.

              I'm not trying to argue with you or anything, but I do think the qualifiers I put in my comment make things a little more clear:

              "useful tool for a competent programmer" "when used responsibly by someone who knows what they're doing."

              I'm totally open to being wrong though, obviously anecdotal evidence doesn't trump studies, but I also think that 1 study doesn't totally prove anything.

              4 votes
            3. [9]
              post_below
              Link Parent
              I don't think we need a peer reviewed study. I haven't checked to see if any exist. Instead, just believe people. The sheer number of professionals saying "this is useful to me" is enough. Give...

              I don't think we need a peer reviewed study. I haven't checked to see if any exist.

              Instead, just believe people. The sheer number of professionals saying "this is useful to me" is enough. Give them the benefit of the doubt that they're in a position to objectively determine the utility of the tools they use.

              3 votes
              1. [3]
                kjw
                Link Parent
                Yes, and usually they are people financially tied to tech startups and corporations which create LLM products. Most of my professional friends and I complain about having to review bad LLM-generated...

                Yes, and usually they are people financially tied to tech startups and corporations which create LLM products.

                Most of my professional friends and I complain about having to review bad LLM-generated code sent by our workmates.
                Also, CEOs demand much faster work and more productivity from us because yOu CaN uSe Ai, and they don't understand that we want good-quality code (well, they have no programming competence) and that it takes time. This affects us very negatively, psychologically.

                Also, this shit kills our planet. Period. A very energy-demanding billionaires' dream.

                8 votes
                1. [2]
                  post_below
                  Link Parent
                  I sympathize with your situation. It sounds like AI tools are more bad than they are good for you. I don't see what that has to do with whether other people find the tools useful though? In 2025 I...

                  I sympathize with your situation. It sounds like AI tools are more bad than they are good for you.

                  I don't see what that has to do with whether other people find the tools useful though?

                  In 2025 I got it, everyone was adjusting to rapid change. But in 2026 the idea that everyone who says agents are useful is either lying or delusional is getting old.

                  I don't disagree about the downsides, btw, it's just that's not relevant to the question of utility.

                  3 votes
                  1. kjw
                    Link Parent
                    I can't accept this argument. People find lots of harmful stuff useful for them, I think that people generally can't think and plan longterm, we don't have means, knowledge and nowadays patience...

                    I don't see what that has to do with whether other people find the tools useful though?

                    I can't accept this argument. People find lots of harmful stuff useful. I think people generally can't think and plan long-term; we don't have the means, the knowledge, or, nowadays, the patience to investigate whether something really is useful in the long run.

              2. [5]
                Lia
                Link Parent
                The mentioned study showed that when people say "this is useful to me", while that may be their experience, the opposite may actually be more true when their efficiency is measured. As well, what...

                The mentioned study showed that when people say "this is useful to me", while that may be their experience, the opposite may actually be more true when their efficiency is measured.

                As well, what "sheer number of professionals" are you referring to? What's the number, specifically? Is it a certain subset of professionals you know personally, out of everyone you know, or a bunch of voices echoing online, or what?

                2 votes
                1. [4]
                  post_below
                  Link Parent
                  I feel like I need to make a post about the METR study. No one seems to have even glanced at it! Doesn't stop people from considering it a smoking gun though. The oft cited "study" is from early...

                  I feel like I need to make a post about the METR study. No one seems to have even glanced at it! Doesn't stop people from considering it a smoking gun though.

                  The oft-cited "study" is from early 2025; they used Claude Sonnet 3.5 or 3.7, which is in no way comparable to current-gen models. The commonly cited inflection point didn't happen until the end of 2025 with, depending on who you ask, Sonnet 4.5 or Opus 4.5.

                  The study comprised 16 people! If those 16 were even vaguely representative of the developer population at the time, most of them wouldn't have had significant experience with LLMs for coding.

                  These are not tools that just work out of the box, especially back then. It takes time and experimentation, or instruction, to use them well.

                  It was cool that they did the study, trying to understand LLMs was a good idea. But it's not what anyone would consider a representative study. 16 people!

                  But wait! They did a follow up study later in 2025.

                  This time with about 60 people and newer models and tools. In that study they found the opposite effect: AI tools sped developers up (which is a shock to no one who has used these tools long enough to get a feel for them). In addition they ran into some, kind of entertaining, issues:

                  Due to the severity of these selection effects, we are working on changes to the design of our study.

                  Back to the drawing board, because:

                  Recruitment and retention of developers has become more difficult. An increased share of developers say they would not want to do 50% of their work without AI, even though our study pays them $50/hour to work on tasks of their own choosing. Our study is thus systematically missing developers who have the most optimistic expectations about AI’s value.

                  And...

                  Developers have become more selective in which tasks they submit. When surveyed, 30% to 50% of developers told us that they were choosing not to submit some tasks because they did not want to do them without AI. This implies we are systematically missing tasks which have high expected uplift from AI.

                  And so...

                  Together, these effects make it likely that our estimate reported above is a lower-bound on the true productivity effects of AI on these developers.

                  [...]

                  Some developers were less likely to complete tasks that they submitted if they were assigned to the AI-disallowed condition. One developer did not complete any of the tasks that were assigned to the AI-disallowed condition.

                  [...]

                  Altogether, these issues make it challenging to interpret our central estimate, and we believe it is likely a bad proxy for the real productivity impact of AI tools on these developers.

                  So to summarize, the new study showed a productivity increase and they estimate it's larger than the ~20% increase the study found. Cheers to them for being honest about the issues they encountered.

                  But again, we don't need a study for this; any experienced engineer can readily see it for themselves, and you can find them talking about it pretty much everywhere. It would be interesting, though, to see a well-designed study that attempted to quantify how big the average productivity increase actually is.

                  For that the participants using AI would need to be experienced with it and allowed to use their existing setups.

                  7 votes
                  1. [3]
                    Lia
                    Link Parent
                    I didn't know about the follow-up, thanks for mentioning it! This seems to be in line with a study that found AI assisted work makes people worse in lateral thinking both during the assistance and...

                    I didn't know about the follow-up, thanks for mentioning it!

                    This seems to be in line with a study that found AI-assisted work makes people worse at lateral thinking, both during the assistance and after they no longer receive it, and more competent at deductive tasks while assisted but less competent afterwards (worse than they started out). I'm using off-the-cuff terms and can't link to the study, and haven't checked its credibility myself, but wanted to mention it in case someone can pitch in with more info.

                    It would be interesting, though, to see a well designed study that attempted to quantify how big the average productivity increase actually is.

                    Agreed! I'm not a programmer, so I'm not sure how much the above-mentioned deterioration of creative thinking skills impacts productivity in that type of work. If not a lot, then perhaps it is a net positive, as long as you/we accept that the productivity hike is tied to the tool and can't exist without it (which is of course normal for tools in general, but thinking about AI companies gatekeeping such tools makes me feel.. not that great).

                    1 vote
                    1. [2]
                      post_below
                      Link Parent
                      Yeah that's a real concern, my hope is that open models will continue to keep pace and eventually be good enough to give people a fallback.

                      AI companies gatekeeping

                      Yeah that's a real concern, my hope is that open models will continue to keep pace and eventually be good enough to give people a fallback.

                      1 vote
                      1. kjw
                        Link Parent
                        I see an issue here - even if open models will eventually become good enough, I'm afraid that hardware will be expensive enough not to be affordable. Don't have data on this, but afair there's not...

                        I see an issue here: even if open models eventually become good enough, I'm afraid the hardware to run them won't be affordable. I don't have data on this, but as far as I recall there isn't enough hardware even for current demand (driven by AI companies), let alone for replacing and adding more in the future. Historically, hardware has kept becoming more affordable, but we've never seen a demand spike from corporations like today's.

          2. papasquat
            Link Parent
            I mean... Blockchains have proven themselves useful too. Every day I get to fight off the ransomware attempts that blockchains have provided the financial motivation for. I mean, it's not useful...

            I mean... Blockchains have proven themselves useful too. Every day I get to fight off the ransomware attempts that blockchains have provided the financial motivation for.

            I mean, it's not useful to me, in fact, quite the opposite. It's extremely useful to criminals at least though!

            4 votes
        2. [4]
          vord
          Link Parent
          I think StackOverflow losing traffic is a sign that AI coding will eventually end up eating itself, as the drop in traffic will likely correlate to a drop in updated correct information.

          I think StackOverflow losing traffic is a sign that AI coding will eventually end up eating itself, as the drop in traffic will likely correlate to a drop in updated correct information.

          5 votes
          1. [3]
            Zorind
            Link Parent
            I’m a little worried that it will likely mean the end of “new” programming languages and usage of new programming language features, as those won’t be as heavily reflected (or reflected at all) in...

            I’m a little worried that it will likely mean the end of “new” programming languages and usage of new programming language features, as those won’t be as heavily reflected (or reflected at all) in the training data the LLMs have and recommend/use in generated code.

            8 votes
            1. Gourd
              Link Parent
              This is an interesting point. The ripple effects extend beyond model collapse itself, even if the problem of model collapse is somehow solved.

              This is an interesting point. The ripple effects extend beyond model collapse itself, even if the problem of model collapse is somehow solved.

              5 votes
            2. PendingKetchup
              Link Parent
              I feel like new programming languages and features are also the only possible answer to this sort of LLM code generation. Maybe some extremely-high-level ones for rapid application development....

              I feel like new programming languages and features are also the only possible answer to this sort of LLM code generation. Maybe some extremely-high-level ones for rapid application development.

              People clearly want to be able to say "I want a window to pop up with two buttons and a text box and a picture of a cat" and have that happen, without needing to worry about the details. But articulating that in English can't possibly be the right way to do it, because as soon as you start to add a few details, it quickly extends to paragraphs of dialog about what the buttons should say and do and what breed the cat needs to be, in a weirdly precise dialect that's hard to write.

              Language models might be able to make programming as simple as writing a detailed spec, but a well-designed programming language should be able to make it simpler. Think of the difference between Ruby and Ruby on Rails, and then put the whole thing on another set of rails. The models prove that kind of development is physically possible, but does it have to involve several gigawatts and a slot machine?

              3 votes
    3. [13]
      babypuncher
      (edited )
      Link Parent
      I don't hate generative AI for development because it's another abstraction, I hate it because I'm expected to be more productive for less money. All the benefit of the tech is going to the people...

      I don't hate generative AI for development because it's another abstraction, I hate it because I'm expected to be more productive for less money. All the benefit of the tech is going to the people at the top. All of us should hate this. All of us should passionately hate the billionaire class shoving this down our throats and calling us luddites when we point out obvious problems (like "where are future senior devs going to come from if we stop hiring and training juniors?")

      I will also point out that generative AI, unlike every other abstraction layer in software development, is fundamentally non-deterministic. I can prove that the same C code, compiled with the same compiler and flags, will produce identical assembly every time. I can feed Claude the same prompt 10 times and get 10 different results.
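
      A toy sketch of that contrast (my own illustration, not Claude's actual decoding stack): sampled decoding draws the next token from a probability distribution, so runs can diverge, while temperature-0 greedy decoding is a plain argmax and always yields the same token.

      ```python
      import random

      # Hypothetical next-token distribution from a language model
      VOCAB = ["return", "print", "raise", "pass"]
      WEIGHTS = [0.4, 0.3, 0.2, 0.1]

      def sample(rng: random.Random) -> str:
          # Temperature > 0: draw from the distribution; different RNG
          # states can yield different tokens for the identical prompt
          return rng.choices(VOCAB, weights=WEIGHTS, k=1)[0]

      def greedy() -> str:
          # Temperature 0: always pick the highest-probability token
          return VOCAB[WEIGHTS.index(max(WEIGHTS))]

      sampled = {sample(random.Random(seed)) for seed in range(10)}
      print(sampled)   # typically several distinct tokens
      print(greedy())  # always "return"
      ```

      Compilers behave like `greedy` here: the same input and flags map to the same output, which is what makes their behavior provable in a way sampled generation isn't.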

      I keep seeing my peers submit awful AI-generated code and we have to spend so much more time reviewing it, and people just want to pass it off "because it works" even though it will be a maintenance nightmare. I hate all of this, and it's disheartening seeing people just accept it blindly.

      15 votes
      1. [2]
        rich_27
        Link Parent
        I really don't understand why the general reaction in situations where people get exploited via some new technology being leveraged results in getting mad at the new technology rather than getting...

        I really don't understand why, in situations where people get exploited via some new technology being leveraged, the general reaction is to get mad at the new technology rather than at the exploitative arseholes misusing it for their own gain.

        6 votes
        1. babypuncher
          Link Parent
          I don't think that's a meaningful distinction. I see exploitative assholes misusing this technology, and I see sycophantic assholes all over social media shilling its use while offering no...

          I don't think that's a meaningful distinction. I see exploitative assholes misusing this technology, and I see sycophantic assholes all over social media shilling its use while offering no solutions to the problem at best, and claiming it's not a real problem at all at worst.

          The people defending the billionaire class have positioned themselves on the wrong side of the class war.

          1 vote
      2. [7]
        skybrian
        (edited )
        Link Parent
        Skepticism about who profits is understandable, but I find it strange to hear people arguing against higher productivity, as such. Getting more work done with less labor is basically the...

        Skepticism about who profits is understandable, but I find it strange to hear people arguing against higher productivity, as such. Getting more work done with less labor is basically the foundation of modern prosperity. Without labor-saving improvements we’d all be peasant farmers.

        This is kind of obvious when using a washing machine rather than washing clothes by hand, but it’s also supposed to work that way for industry, too. The result of productivity improvements, combined with competition, is that the main benefits of higher productivity go to neither capitalists nor labor - they go to consumers as lower prices. And there is certainly a lot of cheap stuff available, and cheap software as well.

        That’s the theory anyway. Yes, also, there are huge fortunes to be made as a side effect. But along with Bezo’s wealth there are an awful lot of Amazon deliveries.

        The reality is often messy and it’s going to take years for the software industry to come to grips with this. Yes, AI is nondeterministic but so are programmers. With rigorous automatic testing and static analysis, hopefully it will become less so. Code health is a choice and maybe we could use these new tools to improve quality?

        3 votes
        1. [2]
          babypuncher
          (edited )
          Link Parent
          It was the foundation of modern prosperity. The last 40 years or so of productivity improvements have led to a steadily declining middle class. We faced this problem during the 2nd Industrial...

          Skepticism about who profits is understandable, but I find it strange to hear people arguing against higher productivity, as such. Getting more work done with less labor is basically the foundation of modern prosperity.

          It was the foundation of modern prosperity. The last 40 years or so of productivity improvements have led to a steadily declining middle class.

          We faced this problem during the 2nd Industrial Revolution (known as the Gilded Age in American history). Productivity skyrocketed, as did GDP. But so did poverty rates in the US, and we even had an extensive period of high unemployment called the Long Depression. All of this driven by the rapid rise of automation. This trend did not reverse until we artificially constrained the supply of productivity by banning child labor, standardizing 40 hour work weeks, and empowering labor unions.

          We're in another gilded age today, and I reserve my right to be outright disdainful of anything that I see benefits the 1% more than the rest of us. Because even though I'm a top 20% earner, I'm smart enough to know I have more in common with someone living paycheck to paycheck than I do with Jeff Bezos or Elon Musk.

          I will say, I don't think the solution is "no AI", I think the solution is "AI makes us productive enough that we can move to 30 hour work weeks". However, the class of billionaire pedophiles that run this country aren't going to give that to us without a fight. They aren't sinking trillions of dollars into AI just so we can live more comfortable lives. Their only goal is reducing headcount.

          14 votes
          1. skybrian
            Link Parent
            I’m no expert, but I think there was a lot more going on after the Panic of 1873. For one thing, modern economics hadn’t been invented yet and a result monetary policy was way too tight; in the...

            I’m no expert, but I think there was a lot more going on after the Panic of 1873. For one thing, modern economics hadn’t been invented yet, and as a result monetary policy was way too tight; in the US, there was deflation for much of that time period. But it’s not a clean experiment with a simple explanation. The Wikipedia article you linked to goes into various theories.

            History is terrible and the time period is no exception, but during that era life expectancy in the US and UK started to rise, from about 40 to 45 years by 1900, eyeballing the charts on Our World In Data. And that seems to show that what came before the Gilded Age was worse?

            I didn’t find good data on poverty rates. Do you remember where you saw that?

        2. [2]
          papasquat
          Link Parent
          Has software gotten cheaper though? Every major software producer that sells software to end users has done away with perpetual licenses and instead opted to charge expensive monthly...

          Has software gotten cheaper though?

          Every major software producer that sells software to end users has done away with perpetual licenses and instead opted to charge expensive monthly subscriptions. Microsoft Office 2007 was $400 when it was released. You could pay $400 and use it forever.

          The equivalent today, Microsoft 365, costs about $200 PER YEAR now. That means that after four years, it's more expensive than Office 2007 was, even accounting for inflation.
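
          As a quick sanity check of that break-even claim (prices as quoted above; the inflation multiplier is my own rough assumption):

          ```python
          PERPETUAL = 400   # Office 2007, one-time price in 2007 dollars
          PER_YEAR = 200    # Microsoft 365, approximate yearly subscription
          INFLATION = 1.5   # rough 2007 -> mid-2020s multiplier (assumption)

          adjusted = PERPETUAL * INFLATION   # Office 2007 in today's dollars
          four_year_total = 4 * PER_YEAR     # subscription cost over 4 years

          print(adjusted)         # 600.0
          print(four_year_total)  # 800
          print(four_year_total > adjusted)  # True: pricier within 4 years
          ```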

          Microsoft has been heavily pushing the idea of AI making everyone's workforces more efficient, and pushing very hard on its developers to use the technology to make them more efficient. In theory, that should result in cheaper software, but it really hasn't.

          I suspect the premise is faulty in at least one place.

          6 votes
          1. skybrian
            (edited )
            Link Parent
            Meanwhile, Google’s office suite is free for consumers. (Sure, it’s ad supported. But still.)

            Meanwhile, Google’s office suite is free for consumers. (Sure, it’s ad supported. But still.)

            1 vote
        3. Zorind
          Link Parent
          I hope so, but I’m skeptical. Programming/Software Engineering has had a large problem with “move fast and break things” for a while, that has always worried me in regards to security, safety, and...

          maybe we could use these new tools to improve quality

          I hope so, but I’m skeptical.

          Programming/Software Engineering has had a large problem with “move fast and break things” for a while, that has always worried me in regards to security, safety, and data privacy.

          Non-programmers being able to create useful tools with LLMs is cool IMO, but it becomes a bigger problem when those tools or products get released with lax security. And we’ve had that problem before LLMs but I think LLMs exacerbate it with the speed at which “slop” can get produced.

          There’s just not really a sense of responsibility when it comes to LLMs, but that problem has been around before LLMs too.

          I don’t really have a problem with people using LLMs for code or tools for personal use, or tools being released for free (ideally open-source so they can be reviewed more easily).

          I have a slightly bigger problem with LLMs being used for profit, because IMO they’re profiting heavily off of scraped open-source code and not respecting the licensing of (some of) those open-source projects. I think they should’ve gotten permission first, and given credit where credit is due, but that cat is out of the bag now.

          3 votes
        4. Greg
          Link Parent
          I'm hoping we get some more robust approaches to testing for race conditions, deadlocks, and memory leaks sooner rather than later! I feel like those issues have always been sidelined by a focus...

          I'm hoping we get some more robust approaches to testing for race conditions, deadlocks, and memory leaks sooner rather than later! I feel like those issues have always been sidelined by a focus on unit testing frameworks that don't really have good ways of catching them, and that's going to be an even bigger problem in a "code's a black box, just trust the tests" world than it is now.

          2 votes
      3. [2]
        Greg
        Link Parent
        Even more annoyingly, we absolutely could be using code generating LLMs in a deterministic way, and I think there are likely to be some really technically interesting possibilities down that path...

        Even more annoyingly, we absolutely could be using code-generating LLMs in a deterministic way, and I think there are likely to be some really technically interesting possibilities down that path - it's just unlikely to be commercially viable.

        At a guess, fully deterministic models are going to end up in the same conceptual niche as Haskell: interesting to the small subset of us who really care about the mathematical basis of how things work, but probably a bit too constrained to ever get serious traction.

        But I'd also guess that the absolute lack of consideration for determinism or even basic predictability when generating code is going to end up being the same kind of ticking time bomb as the lack of memory safety in C (not to say the two things are similar per se, just that I expect them to create the same kind of widespread, slow-burn security and functionality problems). We're going to end up with whole swathes of code that not only has a high risk of difficult-to-spot bugs, but that has no standardised approach to finding and fixing those bugs even if we later discover a way to prevent them.

        3 votes
        1. Gourd
          Link Parent
          There is some indication that these massive foundation models may be rapidly approaching a plateau in usefulness as coding agents. I think the true future of serious development could be a mix,...

          There is some indication that these massive foundation models may be rapidly approaching a plateau in usefulness as coding agents. I think the true future of serious development could be a mix, using more traditional workflows but with several smaller, highly-specialized models, possibly small enough to comfortably run locally. Like with current AI-assisted development, a good developer is still needed to produce a good product, but these tools could give more control and still provide the increase in efficiency.

          With regards to commercial viability, I heard someone recently make the comparison to the collapse of the music industry in the late 90s, with commercial recording studios (data centers in the case of AI) basically becoming obsolete overnight as home recording studios became easily attainable.

          But I'm speculating way outside of my area of expertise. I think you're spot-on with the C memory safety analogy.

          4 votes
      4. Gourd
        Link Parent
        I agree wholeheartedly. I review terrible AI-generated code all the time :{

        I agree wholeheartedly. I review terrible AI-generated code all the time :{

        2 votes
    4. [2]
      googs
      Link Parent
      This is sort of off topic and probably a little too pedantic, but this isn't really what happened. I'm not surprised this is what people think happened based on the media coverage at the time. The...

      Or crash all of AWS for 12 hours like back in December

      This is sort of off topic and probably a little too pedantic, but this isn't really what happened. I'm not surprised this is what people think happened based on the media coverage at the time. The truth is, engineers at Amazon, using an AI assistant called Kiro, managed to knock AWS Cost Explorer offline for 13 hours in one region in China.

      Obviously an AI offlining your production system is bad, but there's a big difference between "all of AWS was down for 13 hours" and "one AWS system, used for monitoring account spending, was down in one region in China for 13 hours". But that's a less interesting story; nobody wants to hear about a limited outage to a single tool like Cost Explorer, so to drive engagement the story turns into "AI causes all of AWS to crash for 13 hours" (at least that's the way I see it).

      This level of outage happens all the time, but you only hear reporting about it when an AI tool was to blame. There was even a far wider-spread outage in October 2025 due to a DNS misconfiguration, and it was widely reported on at the time, but now nobody is talking about how "DNS tools crashed all of AWS back in October".

      Some headlines for thought...
      Financial Times: "Amazon Service was taken down by AI coding bot"
      MSN: "13-hour AWS outage reportedly caused by Amazon's own AI tools"
      The Register: "Amazon's vibe-coding tool Kiro reportedly vibed too hard and brought down AWS"
      The Guardian: "Amazon’s cloud ‘hit by two outages caused by AI tools last year’"

      9 votes
      1. Gourd
        Link Parent
        Thanks for the insight! I admittedly was not aware of the details, and I thought it was another issue in us-east. I was out of office for most of December but I was absolutely impacted by the...

        Thanks for the insight! I admittedly was not aware of the details, and I thought it was another issue in us-east. I was out of office for most of December but I was absolutely impacted by the October outage lol

        4 votes
    5. [3]
      sandaltree
      Link Parent
      I view programming as something of an artform; more than just "getting the job done", so it feels a bit hypocritical saying LLMs can replace programming but not art. Couldn't you just as well...

      I view programming as something of an artform; more than just "getting the job done", so it feels a bit hypocritical saying LLMs can replace programming but not art. Couldn't you just as well look at AI-images and say that it got the job done? (not saying I do)

      Its totally reasonable to say that an experienced engineer can produce a virtually identical product with AI that they could have written manually, but much faster. This is not the case with art.

      I think we're still quite far from this. The actual coding is still a relatively small part of the job, so in the big picture it's probably not that much time saved, when you think about the planning and maintaining etc.

      8 votes
      1. rich_27
        Link Parent
        To me, the art of programming is almost all in the high level architecting of a software solution. My understanding is even when using genAI assisted development, the human is generally still...

        To me, the art of programming is almost all in the high level architecting of a software solution. My understanding is even when using genAI assisted development, the human is generally still doing most of the architecting. I know in my experiences with Claude assisted development, I've had to remind Claude about approaches repeatedly and keep it on track, especially when doing niche or novel things.

        In fact, trying to do something novel with Claude resulted in me wasting a day or two of my time, because no matter how much support or guidance I gave it, it kept writing code that didn't implement the functionality I wanted because it was too dissimilar to (I assume) the body of work it was trained on. If I recall correctly, I ended up scrapping that whole thing and starting over, writing it myself far quicker than the time I'd spent trying to coax Claude into getting it working.

        AI assisted development can massively speed up development, but I've yet to see it be able to effectively do the artful bit of software engineering. It seems to me that when it imitates that it is because it is replicating prior design that it has seen others implement before. To me it feels akin to graffiti artists using stencils: it massively speeds up how quickly they can create art, and they could use and combine prior stencils into a novel design (just like a human directing an AI agent can get it to build things that feel new if they're just merging existing approaches), but if they want to create something new they've got to make the stencil themselves first.

        6 votes
      2. Gourd
        Link Parent
        I agree with both points, but I still think there is a difference. I'm a career software engineer so I very much appreciate the creativity that goes into coding. But software engineering is all...

        I agree with both points, but I still think there is a difference. I'm a career software engineer so I very much appreciate the creativity that goes into coding. But software engineering is all about abstraction. If you think of an API as black box, purely in terms of inputs and outputs, an AI-coded product and a human-coded product might be indistinguishable. I can prompt an AI to generate a picture in the style of an oil painting, but the AI can't generate a literal oil painting.

        I know this is possibly arbitrary, and very subjective, but it is my opinion on the matter.

        The actual coding is still a relatively small part of the job

        This is an important point, and something I repeatedly tell my boss when he talks about efficiency. There's a ton of human involvement that is still necessary. As for the code quality AI outputs, though, we're a hell of a lot further along than we were even one year ago.

        6 votes
    6. neige
      Link Parent
      if that proves true, I might actually quit the field. I like thinking about how to build what I need to build and I enjoy writing code. If that goes away, I don’t think I’d care for software...

      but like it or not, this is the future of development. It's already happening at scale, and it's not going to stop.

      if that proves true, I might actually quit the field. I like thinking about how to build what I need to build and I enjoy writing code. If that goes away, I don’t think I’d care for software development anymore.

      Nothing to contribute to the discussion, but I needed to vent :/

      4 votes
  2. [2]
    a_s_k
    Link
    Thank you for making this toolset! Every time I have to convert an image format on my work machine I feel like someone has put bamboo under my fingernails!

    14 votes
    1. delphi
      Link Parent
      Glad you enjoy it!

      10 votes
  3. [10]
    Greg
    Link
    "What are you, a cop?" is probably my favourite take on AI ethics to date, and it's extremely reassuring to see that I'm not the only one who remembers that large copyright holders are also the...

    The copyright office is not your friend, and I think it's frankly a marketing feat of herculean proportions that we all accepted the framing of "copyright infringement equals theft". I really thought we figured out during the F.A.S.T. era that that was just corporate bootlicking. So, is it ethical? Um, I, uh, what are you, a cop? Get a warrant.

    "What are you, a cop?" is probably my favourite take on AI ethics to date, and it's extremely reassuring to see that I'm not the only one who remembers that large copyright holders are also the bad guys, and that we've already seen how they act when the balance of power is tipped more their way.

    I'm not even on the full copyright abolition side of the argument - extreme reform, absolutely, but I think there's enough of a kernel of protection there for the individual creator to be worth keeping - but it's been worrying how many people have jumped on the enemy-of-my-enemy bandwagon in favour of stricter IP law.

    Apropos of nothing, here's the Wikipedia page for a "music rights" group currently suing Valve because they say that licensing audio for distribution in a game soundtrack apparently doesn't cover... distributing that music as part of the game download: https://en.wikipedia.org/wiki/PRS_for_Music#Legal_cases

    10 votes
    1. [7]
      GoatOnPony
      Link Parent
      I didn't find that section of the essay particularly helpful since I dislike copyright law for the same reason I find AI unethical: they're tools larger and more powerful entities use to squash...

      I didn't find that section of the essay particularly helpful since I dislike copyright law for the same reason I find AI unethical: they're tools larger and more powerful entities use to squash the artistic endeavors of the less powerful. Regardless of the legality, AI took valuable labor without compensation and used it to enrich already fantastically wealthy companies and undermine the uniqueness of that labor, likely forever. People may be attempting to use copyright to push back on the unethical actions but that's just the tool to put teeth into the argument not the underlying ethical argument of extraction without credit.

      6 votes
      1. [6]
        Greg
        Link Parent
        I'm with you on the broad strokes there - anger at copyright and anger at AI are both rooted in anger at the corporate entities using those things to do harm, and frankly people are absolutely...

        I'm with you on the broad strokes there - anger at copyright and anger at AI are both rooted in anger at the corporate entities using those things to do harm, and frankly people are absolutely right to be pissed off about that. But I think the idea of tying the ethical concerns you're raising back to copyright law as currently implemented is wrong enough that it would actively make things much worse if people followed that through to its logical conclusion.

        Model training just doesn't fit within the current IP law framework at all - either legally or conceptually. You say they took the valuable labor without compensation, but the works they were taking are already published on the internet*, there to be distributed. The "taking" isn't the problem, that's what you're supposed to do with something that's been published. What they've done that pisses people off is devalued those works, and I'm yet to see any sensible suggestions for how IP law could have combated that without causing vastly larger problems, even with the benefit of hindsight.

        I'm not saying people are wrong to be angry, upset, worried, or anything else about that devaluation. But I absolutely am saying that using the lens of copyright, or even of conceptual ownership more broadly, just doesn't fit the problem. Focusing on whether or not the tech companies were justified, either legally or morally, in running statistical analysis on publicly available text and images is a distraction at best. It's pretty much guaranteed to descend into a fuzzy argument about where exactly the line is between "acceptable things to do with a publicly available document your computer already has in RAM" and "unacceptable things to do with that same document your computer already has in RAM".

        It's a question of economics, of power imbalance, of financial value, of social value, and of technical capability. If you start from the premise that it's extraction without credit - a framing that I do quite like - it's a question of what extraction with credit could or should look like.


        *Yes, it does appear that some of the companies involved torrented huge numbers of books. The hubris and hypocrisy there pisses me off greatly, particularly because they could absolutely have afforded to just buy the books legally - but nothing I'm saying would actually change in a meaningful way if they'd paid for them like they were supposed to and each author got their $3.27 per book in royalties.

        6 votes
        1. [4]
          GoatOnPony
          (edited )
          Link Parent
          Absolutely, I agree with almost all that you've written! My attempted point is not that I think copyright solves anything (agree it wouldn't), but that the essay attempts to address ethical...

          Absolutely, I agree with almost all that you've written! My attempted point is not that I think copyright solves anything (agree it wouldn't), but that the essay attempts to address ethical concerns by saying "I don't like copyright" which is IMO not really a response. Whether or not copyright exists shouldn't really matter to whether an individual considers AI usage ethical.

          I also don't think it's on us to determine a legal or technical framework that would work in all scenarios before we can critique AI companies or their actions. Precise lines of demarcation in the realms of morality, ethics, or law don't exist but we regulate and debate all sorts of things in that area. If you were to press me on a specific course of action I wouldn't look to copyright but to AI rules to require transparency about training datasets, monetary awards for contributors to those datasets, restrictions on requests to output styles that are not already broadly shared, and just compensation for workers who are displaced by the technology. That's assuming we operate in the confines of the current politics. My ideal answer would be that entities should be automatically nationalized and democratized in proportion to their size and influence. Then we as a society can direct the benefits of it in more specific, responsive ways.

          4 votes
          1. [3]
            sparksbet
            Link Parent
            We absolutely need to consider the implications of things like strengthening copyright laws or punishing AI companies for training on publicly available data in other contexts before we do those...

            I also don't think it's on us to determine a legal or technical framework that would work in all scenarios before we can critique AI companies or their actions.

            We absolutely need to consider the implications of things like strengthening copyright laws or punishing AI companies for training on publicly available data in other contexts before we do those things, though, because the results of those things would be more harmful to individuals and small creatives than the AI training was by an enormous margin. It absolutely is on us to not advocate for something that will hurt us just because it will also hurt the companies who train Gen AI models (and even then, it probably will hurt them significantly less than it hurts your own rights and freedoms!). I know you don't explicitly argue for those things here in your comment, but huge swaths of other people who criticize AI on copyright grounds are doing that, and it's wise to not stand with people who are even ignorantly arguing for something harmful like that.

            2 votes
            1. [2]
              GoatOnPony
              Link Parent
              I'm anti copyright law, I'm in no way arguing for its expansion. However, I don't think that for many of the people arguing for more powerful copyright law or for its enforcement against AI...

              I'm anti copyright law; I'm in no way arguing for its expansion. However, I don't think that many of the people arguing for more powerful copyright law, or for its enforcement against AI companies, are doing so irrationally or with malice. I empathize with people who see their work being displaced or otherwise undercut. I don't need to advocate for their position to stand with them.

              Separately, I think AI and AI training are actually likely to do as much harm to free and open access to information and personal freedoms as copyright law. Websites are closing off access and instituting deeper technical countermeasures absent any change in the law, because they see the threat too. I'd almost rather face a legal threat unlikely to be used against me personally than technical hurdles I must interact with constantly.

              1. sparksbet
                Link Parent
                I don't think the people advocating for strengthening copyright law or punishing AI companies for things like web-scraping are irrational or malicious, at least not in aggregate. What they are is...

                I don't think the people advocating for strengthening copyright law or punishing AI companies for things like web-scraping are irrational or malicious, at least not in aggregate. What they are is wrong and advocating for things that would, if implemented as they insist, harm creatives far more than AI does. I think most of the people arguing this are genuinely ignorant of the implications of what they're arguing for, but that's why it's all the more important to spread the word by doing things like leaving comments pointing out what an awful idea it is.

                If you think that the types of barriers people have put up in response to AI which impede your own ability to do things like web-scraping are having bigger negative effects than the copyright law changes that those people are advocating for would, you do not understand how harmful those copyright law changes would be. Things like allowing copyright on style or on ideas, or treating the use of files containing copyrighted material in operations like machine learning training as infringement, would absolutely obliterate your freedoms in terms of not only what you can create without being sued by a media giant but also what you can do with files on your own computer, much less things that have already had opposition from copyright holders in the past like web-scraping. And the examples I chose are things I frequently see people demanding when it comes to enforcing copyright law against AI companies, not made-up examples.

                1 vote
        2. Nsutdwa
          Link Parent
          The torrenting grates, hard. And especially when it turns out they were filthy leechers: "Bashlykov modified the config setting so that the smallest amount of seeding possible could occur." I also...

          The torrenting grates, hard. And especially when it turns out they were filthy leechers: "Bashlykov modified the config setting so that the smallest amount of seeding possible could occur."

          I also love that direct downloads being "too slow" is offered up as a justification for torrenting. Heaven forbid you have to pirate file-by-file at kb/s like an unlawyered pleb, you know?

          2 votes
    2. [2]
      skybrian
      Link Parent
      This “what are you, a cop” line only works for laws you don’t care about. For something more serious like, say, violent crime, people are going to want law enforcement.

      4 votes
      1. Greg
        Link Parent
        Agreed, but I think that's kinda the point - it's a much snappier way of saying "the premise you're using to define ethical behaviour is rooted in an appeal to authority, in the form of existing...

        Agreed, but I think that's kinda the point - it's a much snappier way of saying "the premise you're using to define ethical behaviour is rooted in an appeal to authority, in the form of existing IP law, rather than on a broader view that accounts for how unethical the application of those laws has been".

        12 votes
  4. post_below
    Link
    I loved the writing! I hope you're having fun with the success of your project, that's a lot of people whose lives you've made (at least a little) better. Don't let the haters distract from that...

    I loved the writing!

    I hope you're having fun with the success of your project, that's a lot of people whose lives you've made (at least a little) better. Don't let the haters distract from that too much.

    Everyone's still trying to figure out what to do with AI agents, both literally and conceptually. I think the distinction between art and utility is important, thanks for sharing!

    9 votes
  5. [7]
    kacey
    Link
    Thank you for making this! Truth be told, I was also doing something like it, but failed miserably to do so -- I'm really happy that it got made, in the end, and that it has been helping people....

    Thank you for making this! Truth be told, I was also doing something like it, but failed miserably to do so -- I'm really happy that it got made, in the end, and that it has been helping people.

    ... this isn't your fault, but getting sniped on a project due to my own utter incompetence, and having something I care about (writing software) continue to be commoditized really sets the bulldozers to pasture in the pit of my stomach, however.

    Personal anecdote: I had a similar feeling when a close personal friend stated that (lightly paraphrased) they'd die of boredom if they ever tried to attend a concert of mine. "What's the point; I could just listen to an mp3". I suppose I'd already internalized that, effectively, no one cares about live music anymore, but hearing someone so close dismiss something which felt so dear to my heart was ... unpleasant. A stick of dynamite lobbed into my entrails.

    Anyways. I think the rough thesis of your blog post was to address critique from -- I'm assuming -- TikTok teens that're troubling your DMs. Totally fair if so, and I think your points were made very articulately! I'm curious why you're against using AI in art, however, if it's acceptable in your mind to be used as part of a process? For example: if a film can be art, is the scriptwriter's role in it diminished by the interpretation of their words by actors? Are composers creating art when their work is only appreciated after being performed by musicians? If a person types "catgirl miku, artstation, hq, anime, masterpiece" into a diffusion model's prompt ...

    ... that's probably going a bit far, but where's the line? We're probably going to see the end of inbetweening first, but what if the key frames are made with Krita AI Diffusion? At what point does the human spark of creativity die out, and it becomes unpalatable AI slop ...

    6 votes
    1. [2]
      RoyalHenOil
      (edited )
      Link Parent
      I'm not the OP, but I am a multimedia developer who doesn't like using AI (specifically genAI) in my art. Personally, I'm not opposed to it on a philosophical level; it's just another tool in the...

      I'm curious why you're against using AI in art, however, if it's acceptable in your mind to be used as part of a process?

      I'm not the OP, but I am a multimedia developer who doesn't like using AI (specifically genAI) in my art.

      Personally, I'm not opposed to it on a philosophical level; it's just another tool in the toolset, as far as I'm concerned. I've been using some of Photoshop's (non-genAI) AI tools for years, for example. However, I have so far not found current genAI tools very useful for my purposes. I suspect this is because genAI is largely developed by... well, developers. Not artists. So coding tools have gotten a huge amount of polish and testing, but art tools far less so.

      I can think of a dozen ways I'd love to be using genAI in art, at least for work, but the workflows are kind of wonky and the results are usually pretty bad — not that they look bad necessarily, but that they don't fit into what you're trying to create, and (depending on the type of asset) they often have fundamental flaws that make them too hard to edit and reuse. Particularly for things like photo editing, vector editing, and video editing, I find it's almost always faster — not to mention cheaper and better quality — to just do things manually.

      Mind you, in my field (multimedia development in tech education), technical accuracy is extremely important. If AI alterations are not 100% true to life, I can't use the asset at all. If you're just generating images without a specific goal (e.g., brainstorming for a mood board or making temporary assets for testing), then genAI is decent, but this is such a small part of most artists' workflow. It's not a part of my workflow at all.

      The big exception, in my experience, is using genAI to develop scripts and add-ons. LLMs are genuinely super helpful with that, and the tools they've helped my team build have saved us all a lot of time and frustration.

      As for making art in my personal life, I just have no interest in using AI at all (not even the aforementioned oldschool Photoshop features). That kind of defeats the purpose of a hobby, you know? I want total creative control.

      10 votes
      1. post_below
        Link Parent
        There's probably some truth to that, but there are three main reasons coding tools have gotten so much attention: First, that's where the money is. Developers, and software companies, will pay the...

        I suspect this is because genAI is largely developed by... well, developers. Not artists. So coding tools have gotten a huge amount of polish and testing, but art tools far less so.

        There's probably some truth to that, but there are three main reasons coding tools have gotten so much attention. First, that's where the money is: developers, and software companies, will pay the $100 or $200/month per head more readily than most professions. Second, code is a particularly good use case for LLMs: there's a huge amount of decent-quality code available to train on, and there are good methods to test the results in quantifiable ways, which makes fine-tuning much easier than in something like visual design where the results are more subjective. And third, with every big tech leap in the digital revolution so far, if you want a technology to get adopted, convince the developers first; they'll bring everyone else along. Or alternatively you can convince the pornographers, but that's not really an option for American companies; the US is puritanical that way.

        Anthropic figured all of this out early on, and OpenAI has been sprinting to keep up ever since. Google doesn't seem to be trying as hard, just keeping close enough to stay in the top 3.

        My experience has been the same as yours, the tools just aren't there yet. Possibly some of the third party wrappers aimed at design offer better results but I haven't tried them. In graphic design especially, what's up with the artifacts? It's worse than mild JPEG compression. I guess it's a downside of diffusion that they haven't solved yet.

        No doubt they'll get there eventually, they've been improving fast.

        3 votes
    2. [2]
      tanglisha
      Link Parent
      I care about live music. That's something AI will have trouble taking over.

      7 votes
      1. kacey
        Link Parent
        I'm glad. Thank you for doing so, too!

        4 votes
    3. [2]
      delphi
      Link Parent
      Frankly, I don't know. And I'm far from alone in this. I do believe what I said - I'd personally prefer if AI was tangentially to not at all involved in the creation of art, but like you and many...

      Frankly, I don't know. And I'm far from alone in this. I do believe what I said - I'd personally prefer AI to be only tangentially involved in the creation of art, if at all - but like you and many other scholars put it, what the hell even is an "Art"? Books like 1 the Road by Ross Goodwin are unequivocally, by volume, mostly written by AI, by language models, and I still think it's art. But then as well, if I go to ChatGPT and ask it to "write me a story about a wizard and a talking cat", I don't really think what comes out is art in the same way.

      I guess this is about whether it passes the "sniff test"? If I can see authorial intent (and my radar isn't exactly perfect here, I've been wrong about art many times), I can probably rationalise the piece, including AI's role in it. I'm not gonna name names, but a friend of mine is working on a novel that he's co-writing with an AI, and in contrast to 1 the Road, that's not the explicit point of the text. Still, while I have read passages and it's not very good in a quality sense, I can't deny that he himself put love and thought into it, and I can't really deny that it is - in a way - a form of artistic expression.

      Like I said, I don't have the answers. I know that I believe that the artist generally has a say here, and I say that delphitools isn't art, so I don't think it should be treated as such. But of course, there are degrees to this. If someone looks at that book (or 1 the Road for that matter) and calls it soulless AI slop that would be better if a human, a whole human, and nothing but a human worked on it, I'm not gonna get on my soap box and argue. Their argument is as valid as mine.

      5 votes
      1. kacey
        Link Parent
        Hahaha, I'm no scholar, but I appreciate the kind remark! Oh, fascinating. I'd never read about that before! I think that, overall, I agree with the notion that the sort of shame campaigns we're...

        Hahaha, I'm no scholar, but I appreciate the kind remark!

        [1 the Road by Ross Goodwin]

        Oh, fascinating. I'd never read about that before!

        I think that, overall, I agree with the notion that the sort of shame campaigns we're seeing are being targeted unfairly. Also, that we could sit here and discuss the nature of art all day 😅

        1 vote
  6. [2]
    Carrow
    Link
    Well put, I was struggling to phrase what you've communicated with standing in moving water. Which extends to the whole of the essay. I've been trying to grasp how I feel about AI for resolution...

    "Twitter is busy worshipping the machine god and Bluesky/Tiktok are refusing to see that they're standing in moving water."

      Well put; I was struggling to phrase what you've captured with "standing in moving water". Which extends to the whole of the essay. I've been trying to grasp how I feel about AI for resolution scaling and frame generation in video game graphics. I think the arguments you've made for AI assisted coding apply there too.

    6 votes
    1. delphi
      Link Parent
      It hadn't even occurred to me to think about the frame generation argument, but I suppose, sure. I don't think I have any foundational problems with it, but I do want to point out: If I make a...

      It hadn't even occurred to me to think about the frame generation argument, but I suppose, sure. I don't think I have any foundational problems with it, but I do want to point out: If I make a game, and you run it with frame generation enabled, I don't think that changes my stance on the matter or the piece itself. I've already left the room by the time you're playing it, and with something as inconsequential as inserting statistically probable frames between "real" ones, I can hardly say that my creativity as an original creator is being abstracted away.

      6 votes
  7. Lia
    Link
    Thank you so much for the tools! I was able to get rid of my old bookmarked QR generator which was annoying AF. Getting to replace it with a bookmark that takes me to ALL of these tools at once...

    Thank you so much for the tools! I was able to get rid of my old bookmarked QR generator which was annoying AF. Getting to replace it with a bookmark that takes me to ALL of these tools at once and isn't annoying? You made my day. <3

    Also enjoyed and agree with the article, nothing much to add.

    5 votes
  8. [3]
    vord
    Link
    In defense of those terrible online tools, they do fill an important niche: Providing functionality when using a device that I have 0 control over....like a library computer.

    3 votes
    1. [2]
      delphi
      Link Parent
      In offence of these terrible tools: So does mine, and it's better at it. There has been good free software and the sun hasn't collapsed

      13 votes
      1. vord
        Link Parent
        Yes. I have yours bookmarked and I thank you for filling the niche with proper asphalt instead of a spoonful of dirt.

        4 votes
  9. jmpavlec
    Link
    Great article, really enjoyed the writing style. I will say after having just read the title and reading through the comments here... I was thinking "oh it's another one of these AI is bad articles". I was pleasantly surprised to see that not only are you not in tech, but that you found great value in these tools and made yourself productive with it. (And did it successfully by going viral!)

    I'm a solo developer myself and have really been enjoying working with LLMs while building prototypes quickly. The feedback loop is SO much faster. I'm typing less and thinking more. I get to focus on the fun architectural decisions.

    In the end, it's just another tool in the box. One that even if it never gets significantly better than it is today, will continue to be useful.

    3 votes
  10. [5]
    skybrian
    Link
    I found the start of this essay a bit confusing because first you say that the website with these tools exists, only no it's not real, it's a concept you made up, no actually it does exist. Apparently there's a distinction I'm missing?

    2 votes
    1. rich_27
      Link Parent
      What the author is trying to say, as I understood it, was: "This silly, hyperbolised name is a placeholder, interchangeable with any ad-ridden conversion website. We've all been to them, we all recognise the concept, and this is what I'm talking about. It doesn't matter which one you have used; they're all the same and they're all shit."

      Basically, similar to:
      Eating is ubiquitous, we all do it, multiple times a day. Every day you get up and you eat your Human Food, you go about some activities and stop for another bite of Human Food. You start to watch the clock as the hours tick by, thinking about what flavour of Human Food you might pick out tonight.

      15 votes
    2. Liru
      Link Parent
      You've never looked up how to convert one thing into another, or how to do something that should be simple from a tech POV? If you do, you're definitely going to run into one of those websites. The name is an amalgamation of that type of website.

      11 votes
    3. [2]
      unkz
      Link Parent
      Right?

      Free Convert Image PNG JPEG Conversion Online Free Tools Online Image is that website. It's not real, not strictly speaking, it's just a concept I made up for this essay, but it does spiritually exist.

      Should just be

      Delphitools is that website.

      3 votes
      1. Zorind
        Link Parent
        Well, delphitools is that site, but without all of the ads, tracking, and unnecessary logins that those sites are notorious for.

        16 votes