38 votes

Topic deleted by author

18 comments

  1. [2]
    Comment deleted by author
    Link
    1. GalileoPotato
      Link Parent

      Well, the developer is suggesting that it's because of AI-generated content, but they're not showing what game it is, the assets, a trailer, nothing. We don't know if they're being 100% truthful, as you've said. I can't have an opinion because the facts aren't completely laid out. What I am seeing is a supposed game developer trying to get people on Reddit to choose sides against a large publisher without actual evidence.

      39 votes
  2. [6]
    Minithra
    Link

    Makes sense for a storefront that makes money off of hosting people's products to want to be sure that they won't end up involved in a lawsuit or something if the product is... "stolen"? Not sure what the word would be there.

    Hopefully stuff like this will lead to some clearish regulations with AI-produced content.

    14 votes
    1. [3]
      Eji1700
      Link Parent

      The major news here is that this will affect AAA studios the most.

      Some indie developer sneaking in some AI art or text is going to be impossible to catch, but large studios can wind up liable enough that they can't screw around with that. There's 0 way that companies like EA or Blizzard aren't planning on trying to roll out AI everywhere they can, but if it means not launching on Steam that's a big issue.

      I see this leading to either "clearer" regulations on AI (which will more likely come down when Disney's legal team lets the lawmakers know what the rule is, and probably somehow exempt megacorps but still make sure you can't AI a Mickey Mouse), or possibly competitors like Epic saying "well, we're ok with it".

      This is a big chance for Epic to have a "reason" to exist and get natural exclusivity. Rather than just throwing money at products, if they allow AI assets and those assets end up in the latest FIFA/CoD/Diablo/Whatever, then that could be a game changer.

      11 votes
      1. MimicSquid
        Link Parent

        I think it'll depend on the provenance of the art, yeah? A small developer may depend on a publicly available image generator, but a AAA studio can make its own software (or license one) that has cleaner provenance and is legally considered an art tool, or something similar.

        Or they can just lie, really. With some degree of reworking it's easier to hide the origin.

        8 votes
      2. [2]
        Comment deleted by author
        Link Parent
        1. Eji1700
          Link Parent

          Sure, but this just went up. I suspect that by this time next year people will already be able to get around it. It's going to be frustrating as hell, though, when a bunch of indie developers get slammed with auto-rejects, even for non-AI art, and get no help, while the major companies get a direct line to get things hand-waved through (basically the problem with YouTube and every other automated content system ever).

          3 votes
    2. DoomedCivilian
      Link Parent

      The DMCA protects Valve here. But there is still significant risk to Valve, as they'd be forced into a position to refund customers should products like this be pulled. Proactively blocking them is likely the right call, at least until the legalities get worked out.

      That said, the fact that the assets were obviously infringing to the point where they had to be 'improved' by hand (after an initial denial by Valve) likely makes this a case we don't want to base future policy predictions on. It sounds like a very rough asset-flip attempt, only this time they used recognizable assets they didn't own, passed through an AI model that changed them very little.

      8 votes
    3. [2]
      Comment deleted by author
      Link Parent
      1. sparksbet
        Link Parent

        What's interesting is that they're already not responsible as long as they comply with the DMCA takedown requests when received

        IANAL, but is this true of Steam? I know it's true of social media networks and the like, but there is a boundary on which types of online services the safe harbor provision applies to, and I'm not 100% sure Steam qualifies (since it curates which games it offers for sale and directly makes money from game sales). Do you know of anything out there verifying that Steam qualifies for that kind of safe harbor protection? Alternatively, if someone here knows more about the law than I do and can elaborate, I'd appreciate that.

        3 votes
  3. [5]
    Comment deleted by author
    Link
    1. Minithra
      Link Parent

      Firmament also has a lot of AI-assisted content, but it was all basically to give a starting point for artists to further refine, rather than minimally edit or use as-is. It's also original stuff, not "give me Mickey Mouse... but make it sliiiightly different".

      8 votes
    2. [4]
      Comment deleted by author
      Link Parent
      1. [3]
        raze2012
        Link Parent

        From what I can gather, this is what Valve is siding with. The OpenSource models are what will likely be flagged and disallowed, the closed source ones, that have a license to the model owners, will be allowed in.

        Sounds about right. Stomp down on attempts to make open source art accessible, while letting Adobe more or less steal rights to art to be used by others. History truly repeats.

        Wonder if corporate licenses get waved through here. I imagine they would be, lest we see a huge industry movement to some alternative.

        4 votes
        1. [3]
          Comment deleted by author
          Link Parent
          1. sparksbet
            Link Parent

            This would be a strong reading of the current copyright office's interpretation of human authorship, which render the argument of ownership moot.

            This is the current consensus in the copyright law world atm, fwiw. There's quite a bit of clear precedent on that front. I highly recommend looking for AI-related coverage on Leonard French's YouTube channel -- he's an actual copyright attorney and is very good at explaining the current state of laws like this, imo.

            6 votes
          2. raze2012
            Link Parent

            I mean, it's not really that simple. Like with adobe, artists signed a contract when they agreed to EULA of the software. They gave up their rights willingly to these large corporations

            Sure, and the alternative was... not having access to a nigh-required tool to work in the industry. Adobe already has modern users stuck on subscriptions, and as such they can update their EULA to whatever they want, whenever they want. If you signed up a year ago and then suddenly had to sign this EULA in order to continue the subscription you already paid for: well, tough luck.

            It may be legally fine, but the devil is in the details, and it's not like this is a new tactic for Adobe.


            As for your more general analysis: I do indeed fear the third outcome is the most likely. The legal requirements to show that all art in your prompt is okay will become similar to the old-school game industry hurdles Nintendo/Sony required in order to publish a game. It essentially means indies and maybe even AA studios can't compete, making that part of the industry an AAA-exclusive club of scale that widens the gap further (or, as you said, the less regulated parts of the world that already violate copyright on the daily).

            I'm still ambivalent on how exactly I want the dust here to settle, but a result that essentially makes the rich richer and further discourages smaller creators from thriving is my worst nightmare.

            2 votes
  4. vektor
    Link

    Another legal question that I think makes AI copyright legislation inherently problematic is that of "copyright whitewashing".

    There are three things that go into a generative model: the "algorithm", the training data, and the prompt. I can, no problem, set up a generative model that is so overfitted to its training set that I might as well be copying from it (a toy sketch of this follows the list below).

    • If we ever decide that AI outputs are independent of their training data, in terms of copyright, this is an obvious way to abuse that.
    • If we decide against that, then we force the AI community to gather CC or similarly copyleft datasets, or keep it all proprietary. Proprietary training data is bad, of course, for everyone who isn't a megacorp, while CC datasets basically eliminate almost all of the ways in which we previously gathered data. This will either spur on a drive towards more data-efficient models (good) or cripple current AI models (bad).
    • If we decide to go for a "somewhere in the middle" approach, you end up in a giant fucking mess of legal ambiguity about where the line is. Is it whitewashing if I just train on a single sample? What if I train on a thousand copyrighted samples and get something out that is clearly copyrighted, but we don't know by which of the thousand original owners? What if I manage to pull a sample that is clearly from the training data out of a very large model trained on billions of data points? There's no obvious way to draw the line here, and I'd argue there's no way to draw it that makes both sides happy; more likely you'd end up with a line that stifles development while not protecting rights holders. Moreover, drawing a line that makes rights holders happy in the US would just mean that China overtakes the West in AI. They'll just violate US copyright doing it, hurting the rights holders, creating a double whammy.
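
    A toy sketch of that overfitting scenario, assuming PyTorch (the "images" and the model are made-up stand-ins, not anyone's real pipeline): with far more parameters than training samples and enough repetition, the model reproduces its inputs nearly verbatim.

    ```python
    # Deliberately overfit a tiny autoencoder on four stand-in "images" until
    # it reproduces them almost exactly; at that point the model is effectively
    # a compressed copy of its training set, whatever we call it.
    import torch
    import torch.nn as nn

    torch.manual_seed(0)
    images = torch.rand(4, 3 * 32 * 32)   # stand-ins for four copyrighted 32x32 images

    model = nn.Sequential(                # far more parameters than data points
        nn.Linear(3 * 32 * 32, 512),
        nn.ReLU(),
        nn.Linear(512, 3 * 32 * 32),
    )
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)

    for step in range(2000):              # keep hammering the same four samples
        recon = model(images)
        loss = nn.functional.mse_loss(recon, images)
        opt.zero_grad()
        loss.backward()
        opt.step()

    # Near-zero reconstruction error: the "generated" output is basically the
    # original training image.
    print(f"final MSE: {loss.item():.6f}")
    ```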

    As for the algorithms, while they might contribute a lot, they're generally speaking far too liberally licensed to matter. They're also not copyright-protected to the degree you couldn't just reimplement them if needed. No problems here, for the most part.

    As for the prompt, this is similarly tricky if we also consider prompts made of "higher density" media like images, video, or sound that are copyrighted. Can I ask Stable Diffusion to repaint someone else's painting and then claim that as my own work? Surely not! Though admittedly this point is much less interesting than the one about training data, as we can trace lineage more clearly here, and there's also no good reason not to take a hardline stance of considering the output a derivative work of the input, meaning you must secure a license for the input in order to use the output.
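
    For the image-as-prompt case, a hedged sketch of what that looks like in practice, assuming the Hugging Face diffusers library (the checkpoint name, file names, and parameter values are illustrative): the copyrighted input is passed in explicitly, which is why lineage is easy to trace here.

    ```python
    # Image-to-image generation where someone else's painting is the starting
    # point. The input image is an explicit argument, so its lineage is traceable.
    import torch
    from PIL import Image
    from diffusers import StableDiffusionImg2ImgPipeline

    pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5",  # illustrative checkpoint name
        torch_dtype=torch.float16,
    ).to("cuda")

    original = Image.open("someone_elses_painting.png").convert("RGB")

    # A low "strength" keeps most of the original composition intact, which is
    # exactly the "repaint it and claim it as my own" scenario.
    result = pipe(
        prompt="the same scene, in a slightly different style",
        image=original,
        strength=0.3,
    ).images[0]
    result.save("repainted.png")
    ```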

    Frankly, I don't see a good way out of the training data issue that doesn't involve a massive reform of copyright more generally.

    5 votes
  5. [6]
    kallisti
    Link

    Legal quandaries waiting to be resolved aside, I really truly hope this ends up being the final decision and the new cultural norm we come to rest on. It's already shocking how much obvious copypasted ChatGPT content I see floating around the internet purporting to be the work of real people - I see it in comment threads and marketing copy, and I feel like if it proliferates much more it will just make huge swathes of language utterly meaningless, especially as they feed training sets full of GPT output back into the next version of the models.

    The same thing goes for AI art. I don't want this stuff becoming so prevalent that the only images I see are things that were spat out of midjourney. This particular case is pretty good evidence that if you tell someone "no, get rid of that AI generated content" they'll basically just sharpie a smiley face on top of it and call it their own, so good on Valve for telling them to get bent, imo.

    3 votes
    1. [5]
      FeminalPanda
      Link Parent

      So should procedurally generated maps and content be made illegal as well? Why should an AI that learns from others be damned while a human who learns from others gets praised?

      5 votes
      1. [2]
        Comment deleted by author
        Link Parent
        1. raze2012
          Link Parent

          that is typically a human crafted algorithm that's generating data, and you can explain exactly how it works.

          Yes, data which may or may not be assets. The assets may or may not be properly licensed, and if you use enough of them the result can become as much a black box as any AI model.

          I feel this is intent vs. tech. I wouldn't be surprised if you could leverage the core tech in these AI tools, ignore assets, and essentially use them to procedurally generate stuff people care less about copyrighting, like environment props or terrain. Likewise, I'm sure someone smart enough could leverage existing creation tools and basically rig up their own AI tool that samples from existing artwork.
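
          A minimal sketch of the "human-crafted algorithm" side of that comparison (everything here is made up for illustration): a tiny value-noise heightmap generator where every number comes from a seed and simple interpolation, with no training data whose provenance could be questioned.

          ```python
          # Classic procedural generation: a coarse grid of seeded random heights,
          # smoothed with bilinear interpolation. Fully explainable, no training data.
          import random

          def heightmap(size=32, scale=8, seed=42):
              rng = random.Random(seed)
              cells = size // scale + 2
              coarse = [[rng.random() for _ in range(cells)] for _ in range(cells)]
              grid = []
              for y in range(size):
                  row = []
                  for x in range(size):
                      cx, cy = x / scale, y / scale
                      x0, y0 = int(cx), int(cy)
                      fx, fy = cx - x0, cy - y0
                      top = coarse[y0][x0] * (1 - fx) + coarse[y0][x0 + 1] * fx
                      bot = coarse[y0 + 1][x0] * (1 - fx) + coarse[y0 + 1][x0 + 1] * fx
                      row.append(top * (1 - fy) + bot * fy)
                  grid.append(row)
              return grid

          terrain = heightmap()  # 32x32 grid of heights in [0, 1]
          ```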

          imo, they are used to trivialize very real human endeavors that enrich our lives. don't want art to become a thing where you just type a few words in a box and then receive something you can pass off as real effort

          Compared to asset flip games or some games which certain fans may describe as "creatively bankrupt"? I'm honestly not sure.

          And I don't see why not. There is and will always be lazy or outright stolen art, even if we somehow universally ban use of such tools. The best uses of such a tool won't be people who type a few words and throw the result into a game. They will create their own basis, tweak animation sets until they feel alive, use it to fix seams and other oddities in rigs/skins, and overall add polish, which as of now can take longer than the base model itself.

          And this isn't a fantasy world. Machine learning has been doing this stuff for a few years, since before this AI craze kicked up. I can't imagine the tech is radically different: https://developer.nvidia.com/blog/virtual-character-animation-system-uses-ai-to-generate-more-human-like-movements/

          1 vote
      2. [4]
        Comment deleted by author
        Link Parent
        1. jaxoff
          Link Parent

          Just to clarify, your comment has a misunderstanding of how AI image generators work. Yes, they need a dataset of images to train off of, but they are not "reassembling" those images into a collage. The exact tech is not something I can properly explain, but it's more like the AI uses the dataset to fit the parameters that inform its outputs.

          They are "original works" technically speaking. But I wouldn't say the AI has true originality like a human artist does. AI can only synthesize art from previous human works. It's up for debate in the courts whether or not this process is copyright infringement or if it's fair use.

          3 votes
        2. [2]
          raze2012
          Link Parent

          To be fair, you can argue the same of procedural generation: it's only as effective as the assets you feed into it. Would the concept of AI art be fine if I scoured the web for thousands of CC0 (IIRC, could be a different license) art pieces and only used those for my models? Add in a few thousand more photos I take myself and there's almost zero ethical quandary.

          Or at least, no more of an ethical quandary than using a Google Maps API to generate terrain, since I'll inevitably have pictures of real-life locations in there.
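
          A hedged sketch of that "only train on work you're allowed to use" idea (the manifest format, column names, and file name are hypothetical): filter a dataset manifest down to CC0 or self-owned items before training, and keep the manifest as a provenance record.

          ```python
          # Filter a dataset manifest by license before any training happens; the
          # surviving manifest doubles as a provenance record for the model.
          import csv

          ALLOWED_LICENSES = {"CC0-1.0", "self-owned"}  # hypothetical license tags

          def filter_manifest(path):
              """Keep only rows whose license we are comfortable training on."""
              with open(path, newline="", encoding="utf-8") as f:
                  rows = list(csv.DictReader(f))  # expects columns: file, license, source
              return [r for r in rows if r["license"] in ALLOWED_LICENSES]

          training_set = filter_manifest("art_manifest.csv")
          print(f"{len(training_set)} items cleared for training")
          ```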

          1 vote
          1. [2]
            Comment deleted by author
            Link Parent
            1. raze2012
              Link Parent

              It's a very difficult path to take due to the documentation overhead, but it heavily diminishes the risk of legal issues as a result.

              It's sad to wonder if that's genuinely worth the work compared to spending years either honing your own craft or finding a like-minded artist to work with you. But with the demands of modern game audiences, it's not something to immediately dismiss.

              I'll still stick to the latter path for several reasons, but it's a question that's been looming in my head for a while now for my personal missions. I want to help smaller or individual teams utilize higher-fidelity assets and larger-scale worlds, and to some extent I will need to lean on software to help realize that. But finding the line on what is and isn't a creator's "art" while relying on such software can be daunting.