44 votes

Making or using generative ‘AI’ is, all else being equal, a dick move

48 comments

  1. [25]
    Jordan117
    Link
    • Exemplary

    I agree as far as businesses using it to replace employees they can clearly afford to pay, or individuals using it for creepy or abusive stuff. But I think there's plenty of room for it to be a fun, creative tool for non-commercial and indie use. The technology here is so revolutionary, borderline magical, that it feels wrong to try painting all use of or interest in it as shameful, and just another front in an endless culture war. Like trying to ostracize photography or Photoshop.

    Ideally, raw AI output will be classified as public domain, requiring significant and proven human involvement to be copyrighted. That would make it impractical for large companies to replace artists wholesale while preserving its creative applications for the public good -- appropriate since these models are essentially the reification of common visual concepts.

    102 votes
    1. [6]
      feanne
      (edited )
      Link Parent

      I agree with you! Another type of regulation I'd like to see is requiring corporate-owned generative AI tools to be open source, following the idea that these generative AI tools are taking from the commons at a massive and unprecedented scale and must therefore give back to the commons proportionately. And that corporations in general should not be allowed to excessively hoard/consolidate power.

      I think the linked article could have been more nuanced, for ex. "Making or using generative ‘AI’ that relies on exploitation is, all else being equal, a dick move". At the moment, a lot of generative AI is exploitative because of issues such as not respecting creators' consent, and underpaying laborers who clean up data used for training. But I can imagine future generative AI tools that are more ethical, such as those trained on public domain / properly licensed material.

      As for the question of generative AI and fair use, I recommend this comment and paper by info law scholar Ben Sobel explaining how generative AI is unlikely to be fair use:

      "Consider an AI model that ingests copyrighted musical compositions in order to generate novel musical compositions. To the extent this AI succeeds at its purpose, it has learned to identify and replicate expressive qualities of the copyrighted materials in its training corpus. Thus, to the extent that this AI derives value from its input data, it engages not with mere facts about copyrighted materials, but instead with the protected, expressive aspects of those materials. Such an expressive purpose does not make a use per se non-transformative. But it does make the rationale of non-expressive fair use unavailable."

      22 votes
      1. [5]
        nukacolaholic
        Link Parent

        As much as I agree that AI should not be used to replace human created work, I really struggle with the copyright arguments against AI. It's very hard for me to distinguish between the arguments against AI training and the process of creation by a human. Humans creating artistic works are going to be inspired and influenced by the works that they have seen/read/heard in their lifetimes. It seems nonsensical to claim that a human should be expected to obtain a special training license for every book they have read in their life in order to write. The difference with AI is scale and speed, but the underlying process doesn't seem different and that makes it problematic to me to try to use the same legal framework.

        I do hope that copyright law ends up settling on the side of AI works being non-copyrightable, because I think that will help ensure that humans don't end up entirely replaced, even if the tech continues to improve.

        17 votes
        1. [4]
          feanne
          Link Parent

          I'm not sure if you read the paper by Ben Sobel which I've linked above, but it's a really thorough discussion on copyright and AI, including an explanation on how copyright law alone is inadequate for tackling AI issues. It's a bit long but if the copyright aspect interests you then you might find it helpful!

          As for the speed and scale difference, another comment on this thread has explained why that matters:
          https://tildes.net/~tech/1b0j/making_or_using_generative_ai_is_all_else_being_equal_a_dick_move#comment-at1z

          But yeah overall I agree that I just want to see this technology being shaped in a way that benefits people, and not just the elite few. As Cory Doctorow has said, what a technology does is not as important as who it does it for and whom it does it to.

          6 votes
          1. [3]
            nukacolaholic
            Link Parent

            I think Sobel is ultimately unsuccessful at disambiguating AI training from humans interacting with copyrighted material in cases where there is no intermediate copying. Particularly in the section you quoted, how is that distinct from a human artist doing the same thing? I don't think a human artist doing a work in the style of another artist after hearing their music would be considered infringement.

            On speed and scale, I think my problem lies in that if the fundamental premise (i.e. work trained on the work of other artists is necessarily infringing) is incorrect, the speed and scale should be irrelevant in a legal framework. Once you say that it is okay to draw the line somewhere, it creates a legal precedent that the fundamental premise has some validity, and people are going to start pushing at where that line should be drawn. If a human artist is inspired by the work of another artist and then starts putting out work in that same style but vastly outproducing the inspiring artist, does the inspiring artist have a claim against the one they inspired? I can definitely see big companies trying to push that boundary at the expense of smaller creators.

            In general, I don't know that we are super far off from each other. There probably needs to be some legal framework for the use of AI, but copyright law just seems like a really bad approach to me.

            Edited to add: At least, copyright law at the training level. At the output level, if it's putting out a note for note replication of a song, that's problematic, and preventing AI output from being copyrightable does seem like a beneficial approach.

            4 votes
            1. [2]
              feanne
              Link Parent

              Sobel isn't differentiating between humans and AI, he's differentiating between expressive and non-expressive use within the context of the fair use doctrine in copyright law. He's not addressing the question "how is this different from what humans do", he's addressing "should generative AI be fair use".

              There's no reason why speed and scale can't be relevant in determining harm. For ex., one person taking home a bottle of sand from the beach vs. a million people all taking home sand from the beach over a year vs. a truckload of sand being taken from the beach overnight. But yes, with regards to generative AI, this has less to do with copyright and more to do with corporate greed in general.

              If a human artist is inspired by the work of another artist and then starts putting out work in that same style but vastly outproducing the inspiring artist, does the inspiring artist have a claim against the one they inspired?

              It depends, how close is the style? Are the works counterfeits (fraudulently presented as being by the original artist)? Are the works close enough in style, and numerous enough, that it's causing reputation problems for the original artist?

              And re. style. It's often said that "style is not copyrightable", but it's not as simple as that because "style" itself is such a vague word.

              See for ex. the 1987 case of Steinberg vs Columbia, in which a movie poster was done in the same style as an editorial artwork. The court ruled infringement while citing elements of style such as perspective, linework, typography, and overall impression.

              Or this 2023 ruling where a Belgian artist successfully sued a Chinese artist for infringement on the basis of having a similar style. Unfortunately I couldn't find more examples of images from this case, I think it's really interesting. The article I linked shows two side-by-side examples, the first being a close copy and the second showing something that looks more "inspired" / "remixed". I am not a lawyer, but I am a full-time artist making a living off of licensing and I've successfully pursued infringement cases involving my work (including cases abroad). My opinion is that the second example would have made for a weak case on its own, were it not part of a broader pattern of copying involving multiple works.

              4 votes
              1. nukacolaholic
                Link Parent

                I think what you are saying is persuasive at the output level. If an AI generated work goes through the same tests a human generated work would go through and fails, I think it should be considered infringement. However, this led me to take a closer look at Sobel and I am really struggling to square his arguments about infringement during the training of AI with the Grimmelmann Copyright for Literate Robots article he cited, which seems to argue that machine reading and copies only for machine consumption cannot be infringement. Grimmelmann seems to argue that this should not necessarily be the case, but that it is the current state of the law.

                And ultimately, I tend to be skeptical of copyright law because it in particular seems to be captured by the greed of large corporations against the interest of the public (see Disney and the ever increasing length of time it takes a work to enter public domain). Fundamentally, I am suspicious that any attempt to modify copyright law specifically for the issue of AI would likely be weaponized against small creators and the public good.

                3 votes
    2. [2]
      arch
      (edited )
      Link Parent

      I agree with you entirely. I personally look to use AI image generation to help me rough draft some ideas I have for a tattoo to see if they will even work at all. I would never dream of using AI for the actual tattoo, I will 100% pay a legitimate tattoo artist for the final draft. But paying hundreds of dollars for someone to draw a concept to see if it can even look remotely passable would prevent me from ever going farther.

      20 votes
      1. smoontjes
        Link Parent

        I might actually want to do that because my drawing skills are less than great, that sounds like a pretty good use of AI!

        Do you then plan on using these drafts/sketches to explain to the artist what you want them to make?

        5 votes
    3. blindmikey
      (edited )
      Link Parent

      On the other hand, fighting this particular battle is analogous to fighting to keep our economic shackles. If whole sectors of jobs can be replaced by AI, requiring a human to do it just so that person can have an income is just robbing that human's precious time - it isn't solving the real issue and prevents us from reaching the goal of using AI to obtain economic freedom en masse.

      We've done this time and time again. The rich seek out any resource that allows them to stop paying as much for their workforce, and historically we keep getting mad at the resource (e.g. immigrant labor). The rich win this game every time because they've got us fighting against our own interests from the get go.

      We need to get mad at the rich, not AI.

      18 votes
    4. PuddleOfKittens
      Link Parent
      AKA "X under capitalism is bad, but that doesn't mean there's an inherent problem with X".

      I agree as far as businesses using it to replace employees they can clearly afford to pay, or individuals using it for creepy or abusive stuff. But I think there's plenty of room for it to be a fun, creative

      AKA "X under capitalism is bad, but that doesn't mean there's an inherent problem with X".

      8 votes
    5. [13]
      teaearlgraycold
      Link Parent

      Yes. We need legal precedent that declares a human needs to exist that created a work in order to wrap the output in IP armor. That generally follows the existing laws. Natural chemicals can't be patented. You need an author or company that owns something to copyright it.

      4 votes
      1. [8]
        Greg
        Link Parent
        • Exemplary

        My struggle on this is that hitting the button on a camera is considered sufficient human involvement for a work to be copyrighted - the camera is the tool, the person who hit the button is the creator (further reinforced by the I-promise-this-is-real-legal-precedent monkey selfie copyright case). Not to disparage those who put great effort into their photography, at all, just to really underline that the effort isn’t the bit that’s correlated with the copyright validity - the human involvement of any minimal kind is.

        I’m not able to reconcile that with the idea of the computer somehow being the “creator” in AI cases. It’s no more the creator than the camera is, and the human has put in at least as much effort as hitting a button.

        Of course, I also think the copyright system is fundamentally broken in a ton of ways, so if this prompts a major overhaul I’m more on board - but for now it seems like it’d be a major contradiction that I haven’t been able to justify within the existing system.

        23 votes
        1. teaearlgraycold
          Link Parent

          Hmm you make a good point. And I agree. We should rethink much of IP law.

          7 votes
        2. [3]
          CosmicDefect
          Link Parent

          This is a really good point. Even though my gut feeling says there is a difference, I can't quite logically articulate how or why. Perhaps, in the case of AI, it is just better for society as a whole if we pretend there is a distinction anyway.

          Given that AI was trained on copyrighted materials at an unprecedented, massive scale, it would be catastrophic if you could copyright its output (the situation of the guy who did all the fantasy dinosaur artwork being easily reproduced says it all imo). I generally don't exploit the hard work of the entire world's artists when I use a camera.

          5 votes
          1. [2]
            Greg
            Link Parent

            Yeah, I totally understand the gut feeling - I think a big part of it is that copyright law has been stretched so far beyond breaking point for so long that you've got to simultaneously entertain "oh shit, this new technology has serious and nuanced implications to manage" and "what do you mean the existing system works like that?! It makes no sense" and then try to ball those two very dissonant things back up to get a good outcome. It's why my take here is that serious copyright reform is the only real answer, but I don't see that happening given the amount of money wrapped up in the status quo.

            Trying to drag it back to what we do have, I worry about using the input data ethical argument to prop up any blanket take on the system outputs - even if you think that gets to the right outcome here and now, it's setting a precedent for getting to that outcome in very much the wrong way. If the copyright office says anything about "AI systems", influenced by how (say) Midjourney have done things, that has far reaching implications for technically similar systems with very different input data.

            To me, the answer there is to figure out a clear IP framework for the input data, where it comes from, how it's used, and what that means for the organisation building the system. Separate that question from the philosophical point about human involvement, rather than legislating around the concept as a whole based on the choices just a few companies have made in their approach to data gathering.

            4 votes
            1. CosmicDefect
              Link Parent

              To me, the answer there is to figure out a clear IP framework for the input data, where it comes from, how it's used, and what that means for the organisation building the system. Separate that question from the philosophical point about human involvement, rather than legislating around the concept as a whole based on the choices just a few companies have made in their approach to data gathering.

              This is another good point. The tech is not the method, and the method is the problem -- or at least a good chunk of it. I've seen the term "ethically sourced AI" bouncing around here, and I see those systems as much less problematic. If you're training your AI entirely on public domain or licensed/purchased works then you're taking the "hard" but morally good road to using the technology, in my opinion.

              And I absolutely don't just want to be "AI==bad." I think the technology is amazing and I'm both scared and excited about what people will do with it. It might honestly be another turning point in society like smartphones or the web.

              1 vote
        3. [3]
          merry-cherry
          Link Parent

          With cameras, the real human involvement is being there. Yes, the camera did all the work of capturing the image, but the human did the effort of shlepping it all there. Even the monkey selfie had a lot of human involvement to bring the camera and then recover it later. Humans also do a lot of work culling, curating, and editing images afterwards, though that can be short-circuited by luck.

          Proper non-human photography would be machines that just roam and take pictures on their own. Doing it with a couple of machines would be an interesting project and unlikely to stir the legal waters. Doing it on the scale that AI operates would be like sending out millions of autonomous machines that run around everywhere taking whatever random pictures. A company trying to monetize those is more likely to draw ire.

          3 votes
          1. Greg
            Link Parent

            I agree with pretty much everything you said about photography, it makes logical sense - and that’s kind of the problem, the law disagrees! Take a shitty snapshot from my desk? Automatic copyright. Take all the effort for the monkey selfie to happen, but the monkey presses the button? Cannot be copyrighted.

            I think it’s absurd that the button press is what matters in the eyes of the law rather than all the surrounding effort, but that’s the precedent here.

            When it comes to the robot analogy, I think that’d be quite fair if we’re talking about a fully automated system that scrapes data and makes new content - interesting implications for the copyright status of AI weights there - but most of the time the outputs we’re thinking of were prompted, curated, and often touched up by the human using the software. The human involvement to drive the AI is about the same as that of taking a photo, with similar variability from the most basic to most complex cases.

            6 votes
          2. ewintr
            Link Parent

            like sending out millions of autonomous machines that run around everywhere taking whatever random pictures. A company trying to monetize those is more likely to draw ire.

            But that is not at all how I perceive generative AI. It does not work that way. Because suppose you build these autonomous machines and now you have a million random pictures. How is a company going to sell that? They would at least have to invest in some kind of curation too.

            I see a very different analogy. Yes, the AI can spout all kinds of "stuff" but it is still the human that decides how and what part of it ends up in the final work. In the end, the human is in control.

            It is like buying a smoke generator and taking (with one button press) some pretty pictures of the smoke. Not every random picture will be to your liking. The generator will create endless patterns of smoke, but most of them won't be worth anything by themselves. It is the photographer that makes something out of it.

            1 vote
      2. [4]
        zipf_slaw
        Link Parent

        declares a human needs to exist that created a work in order to wrap the output in IP armor.

        What if a human (or group of humans) created a body of art on which an AI was trained, and then provided a structured prompt to that AI to create additional art of the same type? Could the output be legally attributed to those humans who ushered and guided the AI towards the art?

        4 votes
        1. [3]
          lou
          Link Parent

          Current copyright law is largely unsuitable for this new development. I'd expect many weird or unfair rulings to take place until lawmakers around the world manage to address AI.

          That said, IANAL, but @boxer_dogs_dance is! Maybe our brand new resident lawyer can bring some clarification? ;)

          8 votes
          1. [2]
            boxer_dogs_dance
            Link Parent

            I wish I could. With brand new developments like this, the courts will do their best to apply laws that weren't written with this situation in mind, and lawmakers will then react. None of it is predictable.

            12 votes
            1. lou
              Link Parent

              That is informative in itself ;)

              4 votes
    6. CrazyProfessor02
      Link Parent

      I agree with you that AI should not be used in a commercial way, like outright replacing people. AI should be used as a tool to make jobs easier, like the Cornell bird lab is doing with their AI: they fed it decades of bird migration data they had collected, and what used to take humans hours to comb through now takes minutes.

  2. [3]
    Wes
    Link

    I'm ready to move past this current outrage and onto the inevitable satirical response articles. "Making or using printing presses is killing the hand-scribed manuscript industry". "Clothes produced in a loom are taking jobs, and you should feel bad about it".

    I don't like leaving a non-substantial comment, but this was a non-substantial blog post. There's good discussion to be had about the ramifications of introducing new technologies into a society, but this ain't it.

    32 votes
    1. [2]
      PuddleOfKittens
      Link Parent

      "Clothes produced in a loom are taking jobs, and you should feel bad about it".

      The loom was a common household device in pre-industrial households during the period of the first industrial revolution. I'm not aware of Luddites smashing any production machines that they owned, only the ones owned by factory owners.

      7 votes
      1. public
        Link Parent

        …and AI is in a weird place because the artists whose art was pilfered are not the ones who own the AIs, yet some of the AIs (StableDiffusion and its descendants) are the equivalent of home-owned looms. It's why I take such antipathy to anti-artists: when they start getting upset at AI-generated ponies, they're shouting "stop having fun!", not protecting class interests.

        4 votes
  3. [4]
    Dr_Amazing
    Link

    I get why artists are upset about it, but I'm not going to stop using it. To me it's just another tool and it's too late to pretend it doesn't exist.

    26 votes
    1. [3]
      Crossroads
      Link Parent

      If I were a savvy artist, I'd offer services to help touch up people's AI "creations".

      At a reasonable, market adjusted fee to compete with whatever flood of AI generated artwork is probably out there of course.

      It's the human touch that would set someone's AI art apart from the crowd of other AI generated art. It's another set of tools, too. I hope smart artists start doing this sort of thing, honestly.

      Luckily, I'm a musician, not a drawing or painting artist. For what that is worth, fully AI generated music sounds extremely generic and boring for now. I'm sure that'll change soon enough, too.

      I guess for me it comes down to consuming for the sake of consuming because I think eventually that's going to be more comfortable for people than engaging with actual works made by artists.

      People can generate all the art and music they want, but if it doesn't have any meaning or artistic qualia then at what point are you just shoveling things into your eye and ear holes just to drown out the fact that you're not consuming something with qualitative artistic substance in the first place?

      Eventually, no one will care. Probably, people don't care now about this sort of thing. I'm probably yelling at clouds.

      12 votes
      1. [2]
        Oslypsis
        Link Parent

        You are a genius. I love editing and retouching photos. I'm a graphic designer. I love Photoshop. I'm unemployed, and generally unmotivated by the job industry right now. Maybe once I give this idea a proper try, I'll end up with a job after all!

        Of course, I'll have to edit some AI images to show what I can do first, so would anyone who reads this please send me some photos, drawings, or images in general from AI generated stuff, and try to make it as HD as you can. The only "payment" I ask is letting me use it as a portfolio piece to get a job. I'll undoubtedly make some from my own prompts, but diverse ideas and concepts help a lot in the real world. Just @ me with an imgur link or something, and I'll @ you back when I'm done. Then, maybe everyone can offer any advice they have on where I can improve. :)

        9 votes
        1. Crossroads
          Link Parent

          I'm glad you found some inspiration in my angst ridden rambling about AI.

          Good luck in your venture and I hope you get some traction with it soon!

          1 vote
  4. [5]
    Stranger
    Link

    As a labor-movement issue, I completely sympathize with artists. The term Luddite is used as a pejorative for people "afraid" of technology, but as a labor movement the Luddites were not only justified, their worst fears played out exactly as they anticipated. Industrialization did wipe out their livelihoods, increasing income inequality and economic hardship. AI will do the same without intervention.

    As a labor issue, I sympathize, but as a moral issue I find the anti-AI backlash to be ludicrous. Nearly every single argument I've heard against AI is something you can point to that already exists in the art world. Artists train on each other's work without permission. Artists copy each other's style. Artists literally copy exact elements out of existing works and turn them into other things. Artists use machines and algorithms. Photography recreates the world exactly as it is. Photoshop automates so much of the human element, including using AI.

    People talk about "ethically sourced" AI training and accuse AI companies of violating copyright for scanning images, as if every artist who's ever practiced by tracing an image or drawing their favorite character hasn't also violated copyright in the exact same way. The only difference is monetization, which is an issue with how the tool is used, not the tool itself, and circles back to being a labor issue.

    26 votes
    1. Wes
      Link Parent

      The only difference is monetization...

      And even then, lots of these tools are not monetized. Stable Diffusion is trained on copyrighted images, but is a free and open-source tool. Lots of people are hacking on it to create extensions, fine-tunes, LoRAs, and more. Most users of generative AI seem to be hobbyists.
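
      To give a sense of how accessible that hobbyist tinkering is, here is a minimal sketch of loading a Stable Diffusion checkpoint together with a community LoRA using the Hugging Face diffusers library; the base model ID and the LoRA path are placeholders for illustration, not recommendations of any particular weights:

      # Minimal sketch, assuming the `diffusers` and `torch` packages and a CUDA GPU.
      # The model ID and LoRA path below are illustrative placeholders.
      import torch
      from diffusers import StableDiffusionPipeline

      pipe = StableDiffusionPipeline.from_pretrained(
          "runwayml/stable-diffusion-v1-5",  # placeholder base checkpoint
          torch_dtype=torch.float16,
      )
      pipe.load_lora_weights("path/to/community_lora")  # hypothetical local LoRA file
      pipe = pipe.to("cuda")

      image = pipe("a watercolor landscape at dusk", num_inference_steps=30).images[0]
      image.save("landscape.png")

      A community fine-tune or LoRA slots into those same few lines, which is part of why so much of this activity happens at the hobbyist level rather than inside any one company.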

      "Ethical sourcing" is a bad label in the same way that "pro-life" is a bad label. It poisons the discourse before any conversation can even be had. Should copyright holders be able to control if their work is used for training? I'm not sure yet! It's a brand new question. It seems like a reasonable argument, since they are the original creator and copyright grants them control over how their work can be used. But it's also true that the qualities being understood by AI are often those that cannot be copyrighted, such as a general "style". And as mentioned above, this is also the process by which humans learn; something that has long precedent of being acceptable.

      I feel we need to balance the interests of the content creator with those of society as a whole. It's bad when people lose jobs, but it's also good when technologies come out that empower people or make processes cheaper. If an animation studio can get by with two animators instead of four, the whole production just became cheaper. There is an immediate negative (two people losing a job), but in the longer term it also means that the barrier to entry to becoming an animation studio just went down. You can do it with fewer people, and at cheaper costs.

      My point is, none of it is simple. Picking a side and then claiming moral superiority isn't contributing anything to the conversation, but it's what this blog author is doing, and it's what terms like "ethically sourced" imply. Eventually the legal process will take its course and I expect attitudes will slowly align with that precedent, but until then it seems most of the conversation will be dominated by mudslinging, with very little consideration of the issue itself.

      15 votes
    2. DavesWorld
      Link Parent

      Personally, I'm getting very tired of people who seem to think "but (creator) didn't consent to (their work) being (examined by the AI)" is some sort of valid legal argument. It's not even a moral one. And they want to claim it's both.

      I'm a writer. I make money off words. Just to indicate I have skin in the game here. I'm also against AI (basically, a computer) being allowed to generate a copyright. If sentient AI shows up, we might have to revisit this, but for the moment when people say AI they just mean a highly detailed generative analysis program, and not Skynet or Lt Commander Data.

      Let's take that part first. The copyright, and who can own it. Until now basically, there was no question about copyright. Of course a human owns it, because only a human could create something copyrightable. So copyright law dealt with how to transfer and license copyrights. You can sell them, you can hoard them, you can abandon them, and so on. Simple.

      The US copyright office, so far, has ruled a computer can't own a copyright. And, following that chain of legal logic, any "creation" of an AI can't be protected by copyright since the creator (the AI, a non-human) never owned it in the first place to sell it to you. Basically, making AI generated "creations" public domain from the moment of creation.

      That's the important part, that needs to stay. Because we all know what'll happen if corporations, the wealthy, or even a decently organized fan group with some pooled resources gets together and starts churning out copyrighted creations. Right now, no one really can or will because ... there's no money in it.

      They can't own the stuff the computer generates for them, so no one's scrambling to do that. If the movie scripts, books, art pieces, music, whatever, could be immediately owned by whoever owned the computer ... we'd be buried under "creations" created at the speed of a computer. As James Cameron wrote about computers/robots, "... and it absolutely will not stop ... EVER."

      Before, when only humans could create, figuring out why that's important was unimportant. Now, with computers, we know why; because computers don't stop. They'll keep running the program, cranking out "creations" endlessly. And that would be bad for art in general, for people's consumption of it, and certainly for the economy of the creative industries. There's basically no sane world where it should be allowable for someone to set up their AI program and cash endless checks off its creations.

      As for the "didn't consent to the AI being 'allowed' to scan work" argument ... it demonstrates a lack of understanding in how information is transmitted. Art is information to humans. We look at a painting, and the colors (or clay, or wire, or whatever) has been arranged and we view it as information. Reacting to it. Same for music, for books, for it all. At its core, art boils down to creative information. And information that has been observed has been transmitted.

      Copyright concerns itself with the use of that transmitted information. It's not a violation of copyright to view the painting, to read the book, to receive the transmission created when a work of art (again be it movie, music, book, script, painting, digital piece, whatever) is released into the wild. What copyright coordinates is the ability to profit off the work.

      Every artist is shaped by the art they're exposed to. Every artist usually shapes themselves by seeking out art, narrowing in on the types and formats that seem to call out to them; and these will be what they study most. A kid with a dream of being a singer will consume a lot of music, a budding author reads a lot, and so on. Most artists I know, whatever their medium, often have what can sometimes build to full on explosions of ideas when they interact with someone else's art. Which is to say nothing of when they casually engage with art, or when they formally study it, or anything else of the sort.

      The argument that AI violates copyright by studying art ignores how humans do the same thing. You can't ban it for computers without doing the same for humans. Which would mean humans couldn't actually consume the art. How can you possibly, possibly, craft a law that says something to the effect of "licensor agrees to not be influenced by The Work after consumption of it; and agrees that doing so is a violation of Creator's copyright" and have that hold up in court? Have that not result in chaos?

      If I read your book, I've been influenced by it. It's in my head now. Thousands of books from hundreds upon hundreds of authors are in my head. Each one having impacted and shaped the author I've become. And I'm no different from every other artist, from wannabe movie directors to musicians to painters, heck even actors study ... wait for it ... other actors' performances. You will see novice actors in acting class try to directly emulate more famous ones, as part of how they learn their own acting voices.

      So not wanting corporations to replace and remove human creatives is a fine goal, and one I support. Not wanting (anyone) to set up a computer and crank out an endless stream of "creations" to dump on the market is something human society needs to work to avoid, because it'll be disastrous on so many levels.

      But whining "the AI stole stuff" is a meme that demonstrates a fundamental lack of understanding of copyright and art in general.

      I grew up watching Bob Ross paint wet-on-wet landscapes, and about ten years ago discovered there are digital artists duplicating that same technique using graphical arts programs like GIMP and Photoshop. Same kinds of art as his comes out on their screen, just some of these newer artists didn't need to beat the devil out of their paintbrush in a garbage can when changing colors.

      They're not violating Ross's copyright. They're just not. Neither is a generative program that analyzes the library, the museum, the Internet.

      AI is here. Not coming; here. I look forward to what the next creative era brings as artists of all media incorporate this new tool in their work. Because ultimately, that's what happens with all tools. It takes a skilled operator to get good output from it. The same people who can't write still won't be able to write. And they'll find out, same as now, that "idea" isn't enough to make for a good story. Sure the computer might fill in the grammar and spellcheck it for them, but bad story is still bad story.

      Even with a computer helping, what'll make it art will still be the human element. And for now, unless Congress or the Copyright office lose their minds, a human still has to be in the loop to be able to profit from it. Because that's what copyright is in human society; the right to profit from a creation.

      9 votes
    3. [2]
      CosmicDefect
      (edited )
      Link Parent

      Nearly every single argument I've heard against AI is something you can point to that already exists in the art world. Artists train on each other's work without permission. Artists copy each other's style. Artists literally copy exact elements out of existing works and turn them into other things. Artists use machines and algorithms. Photography recreates the world exactly as it is. Photoshop automates so much of the human element, including using AI.

      I think the key difference is magnitude. A little rain doesn't hurt, and is beneficial, but a lot causes floods, destroying and killing. Humans doing all these things generally leads to more and better artists, which is good for humanity. It is entirely unclear that these AI tools would do the same unless society makes some rules around their usage or training.

      If I spend years becoming talented enough to rival Stephen King in American horror novels, that's fair game. It's a good thing the artistic space has grown to have a robust community of creatives jostling and competing. Likely my own style will contain a unique voice all my own as well, one that wouldn't have existed otherwise.

      If instead I use a fraction of that time training an AI on all of King's works and crowd him out of the market, we've lost something precious. Just look at the absolute wasteland that the self-published section on Amazon has become. The creative space there is being entirely drowned and destroyed, and the AI replacements aren't even "good." They're awful facsimiles of books which exploit the market by sheer volume and speed of production rather than actual merit.

      8 votes
      1. Stranger
        Link Parent

        In regards to your final paragraph, there's a few things to keep in mind:

        1. In spite of the Industrial Revolution effectively eliminating a number of different crafts as viable middle-class careers, bespoke craftsmen still exist to this day. While AI may very well eliminate a lot of creative jobs, it's very unlikely to completely eliminate ALL economic value from human-generated artistic works. It also won't stop people from making art for its own sake, just as most art is made now.

        2. AI training itself on AI generated works creates some interesting problems in the dataset. At some point the models will either reach a point that's "good enough" and get cut off from further training to avoid corruption, or else it will need to be actively maintained to ensure it's receiving continuous original content. In the latter case, some kind of balance would have to be reached between AI and human generated content.

        3. And honestly, even before AI there has been a ton of poor quality self-published art. A lot of art has a very low skill floor to enter and a very high one to get noticed. The ease of flooding the marketplace might be exponentially higher, but you can moderate that with something like an "AI" tag and banning spammers.

        2 votes
  5. [3]
    Handshape
    Link

    Generative models can be ethically sourced. Take a look at what Getty images and Nvidia are working on for an example.

    The issue way down at the bottom of this isn't about the AI, it's about training data used without consent. For the folks in the copyright space, it's about content copied without authorization for the express purpose of creating works that compete with the copyright holders in the marketplace.

    22 votes
    1. smoontjes
      Link Parent

      Partnering with trusted industry leader, NVIDIA, our AI Generator pairs Getty Images' vast content and data with the latest AI technology to unlock endless possibilities for ideation and content creation.

      Hmm I'm wondering if th-- never mind they already addressed it:

      Compensates creators
      We’ve created a model that compensates our world‑class content creators for the use of their work in our AI model, allowing them to continue to create more of the high‑quality pre‑shot imagery you depend on.

      Like you said, that does sound ethical! I agree that it is basically the only acceptable way of doing this. AI companies basically freeload their way to wealth by ripping off the entire internet's worth of images without credit -- monetary wealth through moral bankruptcy is not a new concept, but I am glad it is being addressed so relatively quickly.

      10 votes
    2. [2]
      Comment deleted by author
      Link Parent
      1. Handshape
        Link Parent

        Reiterating: the knowledge and rights are not in the trained weights, they're in the training dataset. The mincing into weights is done after the fact.

        As for the artist's consent, Getty hosts content for artists and photographers who chose to broker their work through Getty for pay. No such compensation happens for Twitter or Facebook.

        2 votes
  6. lou
    (edited )
    Link

    The issue occurs when the AI is trained on materials the developer does not own or license. An ethically sourced AI would not have this problem.

    Some people in /r/mud reportedly used AI to help them flesh out roleplay profiles and generate room descriptions for free MUD games. I assume the author wouldn't be as judgemental about non-commercial uses such as these.

    14 votes
  7. [4]
    UP8
    Link

    I think that's a silly position.

    I use Generative Fill in Photoshop all the time now, usually without a prompt, to do simple and not so simple retouching tasks.

    For instance I used it to make a spot on a brick wall go away

    https://mastodon.social/@UP8/110607460518784045

    and it turned out the image wasn't properly aligned on the wall, so I had it draw another row of bricks along the bottom of the image. Adobe trained the model on images that were part of their stock photo exchange and had a contract, so the copyright issues are resolved too.

    14 votes
    1. [2]
      0xSim
      Link Parent

      I think there's a huge range between using AI to remove a spot on a wall or fill a "repeating but not really" pattern, and using it to generate a whole new image explicitly in the style of another artist.

      6 votes
      1. UP8
        (edited )
        Link Parent

        Yeah and no.

        Accusations of plagiarism and similar crimes are widespread in popular culture.

        For instance a fan artist who uses the IP of Nintendo, Type Moon or Idea Factory will get mad because their images are up on a web site like danbooru.

        Every semester art professors face down students who imitate the red/black/white Futura Bold of feminist artist

        https://en.wikipedia.org/wiki/Barbara_Kruger

        who will earnestly say that they thought it up themselves when really they've been steeped in it from seeing it on bus shelter ads, magazine covers, etc.

        For that matter, so many pop songs sound a lot like some other pop song; "The Dangerous Type" by The Cars has the same intro as T. Rex's "Bang a Gong." There are only so many chords, and music fundamentally walks a line between being familiar enough to be comfortable and different enough to be interesting. There are lawsuits over pop music all the time, and that's one reason why good music-generating models are being held back: even if one were 100% as "creative" as a good songwriter, it would still get accused of ripping people off.

        I even suspect the guy who made that painting isn’t 100% happy that I made a print of it.

        So I think it is not A.I. that is ethical or unethical, it is how you use it.

        7 votes
    2. smoontjes
      Link Parent

      Your office wall looks great! You're keeping it neat yet colorful, which is lovely.

      2 votes
  8. arqalite
    Link

    Unpopular opinion - I don't believe copyright is a good solution for the digital age, and trying to enforce copyright on LLMs and other AI models is a fruitless task.

    I believe all human creation and knowledge should be freely created, shared, modified and discussed. As such, I find LLMs an immensely helpful resource nowadays.

    You get to find and derive knowledge so much faster, and get things done at blazing fast speeds.

    Yes, they have issues, plenty even. Training data biases and hallucinations (that, anecdotally, seem to occur much more often nowadays instead of less often - ChatGPT taught me how to use a Python standard module that doesn't even exist) are my top two problems with them.
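
    As an aside, one cheap guard against that kind of hallucinated module is to ask Python itself whether the name resolves before building anything on it; a minimal sketch (the fake module name is made up for illustration):

    import importlib.util

    def module_exists(name: str) -> bool:
        """Best-effort check that `name` resolves to an importable module."""
        try:
            return importlib.util.find_spec(name) is not None
        except ModuleNotFoundError:
            # Raised when a parent package in a dotted name doesn't exist.
            return False

    print(module_exists("pathlib"))           # True: a real standard library module
    print(module_exists("not_a_module_xyz"))  # False: treat the suggestion with suspicion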

    At the same time, I do believe that people should be compensated for their creations, but I also believe once you put it into the world, it's no longer yours, it's for all of humanity to enjoy.

    Any amateur artist will know that building off other people's work is the quickest way to learn and figure out what you like and form your own style. And even once you form your own style, you'll still unconsciously steal bits and pieces from your favorite work. It's part of the creative process and how our brains do all of this.

    Copyright laws, in my opinion, block all that. I've probably lost count of how many bootlegs and unofficial derivatives and remixes of songs I've enjoyed, only to lose them to copyright strikes, or have them on obscure platforms.

    It's a tough problem to solve, and I won't even dare to come up with one, but our current copyright laws just ain't it.

    EDIT: This is so scattered, I thought I was having a coherent train of thought but nope.

    10 votes
  9. hhh
    Link

    Sweeping statements like this expose a lack of creativity and completely ignore potentially transformative use-cases such as in drug design.

    10 votes
  10. SpruceWillis
    Link

    I generally don't use AI for anything that's going to bother artists or creators.

    I play a lot of D&D as a Dungeon Master, currently playing weekly with one group and fortnightly with another. Being able to use AI to describe a room, building or NPC and have the AI build on the description and really flesh it out is great.

    I also use Foundry Virtual Table Top to play D&D, and use AI to help me create and understand JavaScript macros that produce some really immersive effects for me and my players.

    9 votes