I have to admit I didn't much care for this video. The premise was a nuanced look at a complex issue from a legal perspective, which sounded awesome. But I feel it really didn't deliver on that premise. Instead I found it heavy on moralizing, and light on legal interpretation.
Probably the biggest problem I had is that this is somebody that clearly does not like the AI art community. He speaks with disdain towards them the entire time. He makes the occasional comment that "Sure, this isn't everybody", but then goes right back to using phrases like "These people believe...".
He quotes random reddit users, who are frankly often young teenagers these days, and uses that as an example of how awful these users are. He then shows that a large number of AI generations are for anime pornography, which served little purpose other than to denigrate them. Is it really surprising that people like pornography? Do I need to point out how many nude ladies there are in classic paintings?
Beyond just the users, he attacks AI companies as being overly corporate, which - sure - I get. But he's again painting with broad strokes, and there's no distinction made for open models or open-source in general.
He uses the word "stealing" throughout the video as if it's a foregone conclusion. I'm not convinced that scanning an image to make minor mathematical adjustments to the matrices in a model is actually stealing. I can absolutely see that argument being made, but he doesn't try to make it.
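For anyone who hasn't looked at what that training step actually does, here's a deliberately toy sketch (not any real model's code, just the general shape of a gradient update) of the mechanics I mean: the image contributes a small nudge to the weights and is then discarded.

```python
# Toy illustration of a single training step: the image produces a gradient
# that slightly adjusts the weight matrices; no pixels are stored in the model.
import torch

model = torch.nn.Linear(784, 784)      # stand-in for a denoising network
opt = torch.optim.SGD(model.parameters(), lr=1e-4)

image = torch.rand(784)                 # one training image, flattened
noise = torch.randn(784)
noisy = image + noise

pred_noise = model(noisy)               # model tries to predict the added noise
loss = torch.nn.functional.mse_loss(pred_noise, noise)
loss.backward()                         # compute tiny per-weight adjustments
opt.step()                              # apply them; the image itself is now thrown away
opt.zero_grad()
```

Whether that process legally amounts to copying is exactly the argument I would have liked the video to actually engage with.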
He implies that those using these tools are being unempathic to artists. That users are treating these tools as a means to an end, and not as true expressions of art. This is not necessarily wrong, but it also seems irrelevant to me. Art in all mediums can be produced as an expression of creativity, or as a mass-produced, audience-tested, soulless affair. And that's fine. If people are using AI generation to fill a role rather than to express their own creativity, I don't have a problem with that. I have no interest in Marvel's Superhero #73, but clearly a lot of other people do, so it's still creating value.
This argument also seems rather dismissive of those who cannot create through traditional tools. I don't think it really matters if that's due to physical disability, or simply that they haven't invested the time to learn the skills. I'm sure some will disagree with that comment, but I really dislike the idea that there's a requirement of time or effort for something to "count" as art. Art is expression, whether that's spending years working on your masterpiece, or taping a banana to the wall. Time and effort will likely factor into value provided, but not always.
I'm not saying that punching a few words into a generator and hitting submit makes you an artist, but I also don't discount this tool as a means of expression. For some people, it may be their only means of expression, and I think it's really unfair to dismiss that out of hand.
He touched on this some, but tools like inpainting, ControlNet, DreamBooth, and LoRAs allow for a significant increase in control over these art generators. He argued that only a small number of users are actually making use of these tools right now, and I agree that's likely true. But this technology is still brand new. Everything is rapidly evolving. Those who spent six months learning how to correctly prompt Stable Diffusion 1.5 effectively had to start over when 2.0 released. The plugins and techniques required are evolving along with the tools themselves, and it's unreasonable to expect a majority of users to understand them. Eventually they will become more accessible, and their usage will likely increase as a result.
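To make that "increase in control" concrete, here's a rough sketch using the Hugging Face diffusers library (the checkpoint names and parameters are just illustrative, and I haven't verified this exact snippet): a Canny edge map extracted from a reference image pins down the composition, while the prompt only decides style and content.

```python
# Sketch of ControlNet-style conditioning: the edge map constrains layout,
# the text prompt fills in everything else.
import cv2
import numpy as np
from PIL import Image
from diffusers import ControlNetModel, StableDiffusionControlNetPipeline

ref = np.array(Image.open("reference.png").convert("RGB"))
edges = cv2.Canny(ref, 100, 200)                          # edges from the reference image
edges = Image.fromarray(np.stack([edges] * 3, axis=-1))   # to a 3-channel conditioning image

controlnet = ControlNetModel.from_pretrained("lllyasviel/sd-controlnet-canny")
pipe = StableDiffusionControlNetPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", controlnet=controlnet
)

result = pipe("a watercolor city street at dusk", image=edges).images[0]
result.save("out.png")
```

The point isn't this particular workflow; it's that the prompt stops being the only lever the user has.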
Part 5 was better, and covered some good examples of art with varying levels of control by the artists. His arguments seemed consistent with statements published by the US Copyright Office, namely that it comes down to the level of control exercised by the creator. As mentioned above, that control is likely to increase as the technology evolves, and that will also shape the question of copyright.
I certainly see the pitfalls of AI tools. Not just imagery, but text generation as well. It will be invaluable to spammers and other bad actors who will gladly make life a living hell for everybody else if they think they can earn a dollar doing it. And what can I say? Those people suck. But I really try not to blame the tool when that tool has other, positive applications. And I really believe that AI does. Whether that's being able to create your D&D character, or generating header images for your cat's blog, or creating a custom storybook for your kids. I think it's all very cool, and a lot of fun, and a viable form of creation for those who otherwise may not have had the means to express themselves.
I don't want to criticize this video too harshly because clearly the author put a lot of work into it. But despite the length, I really don't feel it offered much new to the conversation, and in many ways dragged it down further by attacking portions of the community. That seems like a good way to turn away those you're trying to reach, and his closing comments seem to acknowledge that many will have done just that.
Probably the biggest problem I had is that this is somebody that clearly does not like the AI art community. He speaks with disdain towards them the entire time.
I think the disdain is warranted, because the complete lack of empathy towards traditional artists has been astonishing. This is a disruptive technology, so it's understandable that those who have been disrupted are lashing out, and the reaction to that from the AI community has been surprise at best, and mockery at worst.
Sadly because the technology is out there, fully functional and impossible to stop, people are free to be as inconsiderate as they want. It's not like anybody can take their toys away from them if they're behaving badly. The victims have no recourse. So it seems fair to me that tech bros should get all the disdain they deserve, because social shaming is the only tool to keep them in check.
Consider that the disdain towards artists rallying against AI may also be very justified. From my perspective a lot of their arguments are deeply selfish and basically amount to "you shouldn't use this tool because I think it's bad".
And or "I deserve such a strong claim to copyright that I own the right to control who can learn what from my art", basically summoning a new and self beneficial rule from thin air to prevent this progress.
They're literally trying to hold back what I am allowed to do so they can make a profit. I'm not going to have much sympathy for anyone taking such a stance. Especially when it's argued in such an angry and dismissive way.
For those arguing that we should express sympathy for artists as a whole, I understand. I can see why you'd be scared and I wish you all the best.
But I think the forest is being missed for the trees here. 90-plus percent of AI art is going to cases where humans really couldn't do the job. For example, I'm seeing lots of people taking static pictures of characters and using AI art to make their own personal collection of emotions to use in some visual-novel-like experience.
Commissions and real creative works? It's going to require the touch of a human, and while I'm sure all artists will be forced to use AI tools in their craft, I think we will see demand scale quite a lot to account for the extra productivity, and we will see artists' pay and livelihoods increase from here on.
Assuming AI art doesn't go superhuman that is, but we can worry about that when we get to it.
AI art is also already being actively used for music by record labels that have money available to commission actual artists instead, and it's gross. They don't even often edit it much; there are albums coming out with obvious tell-tale signs of AI-generated art just thrown together on a cover with a logo smacked on top.
So, as someone who encounters a lot of new music releases on a weekly basis, I'm already seeing instances where it's being used to replace real creative works. Companies can and will cut corners to save costs if they can generate stuff via AI that they consider "good enough". I do think the threat to artists is real and I think the worry about anger, dismissiveness, civility, etc. are all mostly a distraction. Artists that earn a living through creative work (already a tough thing to do in our society) have a lot to lose, which to them is going to be far more important than whether someone gets to use a shiny new AI toy.
That said, I think it's mostly just a noise vs. signal problem. I won't buy albums that have obvious AI art, and most of the time the music is garbage anyway, because someone who cares for their creative output as a musician will often care enough to use something better than obvious AI art to represent it. So I think we'll mostly just see an acceleration/saturation: just far more volume of low-quality material out in the world that one will have to wade through to find something worthwhile. Whether that is albums, websites, artworks, writings, etc.
Also, I personally think the underlying problem to nearly all of the concerns I see with AI is capitalist structures. From artists worrying about their livelihood to jobs being replaced. The "enshittification" of AI is inevitable under this system. Advertising will infect it. Money will bias it. Ultimately, I think the bad uses of it will outweigh the good until the entire societal structure it is being built under is fundamentally overhauled.
Yet again, like most technologies - it just ends up being a mirror into who we are - and I don't like what I see.
I do think the threat to artists is real and I think the worry about anger, dismissiveness, civility, etc. are all mostly a distraction.
I agree, but I think it's a distraction mostly in favor of the artists. On a general scale the average person stands to benefit from, and will support, access to all forms of AI generated content.
And I understand concerns about spam, but for the most part every social system already deals with this problem. AI art will probably increase the noise, but these systems are pretty well prepared for spam, with only about 1 in 100 works of art generally being good and worth displaying to users.
So the spam will get hit by - and increase load on - the systems built to filter it, while the AI works that get through will probably do so on their merit as interesting works.
And while AI will certainly add to the spam, it's also greatly changing the world of how content is consumed. For now, that's in the form of things like search and recommendation algorithms. ChatGPT is already killing Google when you need to know something obscure or hard to search for.
On a general scale the average person stands to benefit from, and will support, access to all forms of AI generated content
I find this an incredibly bold claim. I hold that AI will be a net negative for humanity unless many aspects of society change. The average person may be gullible enough to be duped into believing it's good for them, though. Maybe my claim is just as bold, and I'll grant that.
As far as spam goes, we are not prepared for AI-enhanced levels of spam. AI detection systems are already having trouble actually determining whether something is AI-generated, and it will only get more difficult as AI progresses. Things can be low quality but much harder to distinguish as spam algorithmically. I don't know that we'll have the tools to properly fight it.
My issue with AI proponents is the assumption of its positive and useful nature, as if that's not piss in the wind before it's corrupted the same way everything else is. I am a tech person; I understand excitement when it comes to technology. I also understand that technology enables bad things at scale along with the good, and I find "generative" technology to be uniquely repugnant. I don't agree with the apocalyptic AI doomers (the sci-fi humanity-ending stuff), but I do agree with people who see how real issues in healthcare, jobs, inequality, and society in general are just waiting to be disrupted in negative ways by AI, with real harm caused. That is honestly scarier to me than any mass-extinction event.
Can AI be used for good things too? Sure, but our society (at least in the US) is not incentivized at all to have the good outweigh the bad. Maybe I'll be proven wrong, but I doubt it.
I honestly just refuse to be hopeful about every new technology. Call me stuck in an era or whatever else but sometimes I think certain tech would be better off not existing at all in our current social structure, and increasingly I feel like AI is one of those.
That said, I don't think we'll see eye to eye on this, so I'm not sure how fruitful continuing discussion will prove to be. The direction technology takes these days is extremely divergent from the directions I would rather it go.
They don't even often edit it much; there are albums coming out with obvious tell-tale signs of AI-generated art just thrown together on a cover with a logo smacked on top.
That's a pretty big claim. Have you got any examples? I'm not saying you're definitely wrong, but aside from pieces with obvious tells like extra hands, most of the art I see people citing as obvious AI art just looks like modern Chinese or Korean art to me. Which mostly just tells me there was a lot of art from China and Korea in the datasets used to train these things. It's suggestive, but not really damning the way it's often painted to be.
I'll see if I can post some later when I look them up, but most of them have tell-tale signs of being from Midjourney. If you've used AI art generators for any amount of time (I spent weeks doing generations and using it before becoming more anti-AI, and I think I might still be okay with it in limited ways, like non-commercial personal use), the majority of the output is incredibly easy to spot. Especially lazy output that hasn't been altered or refined through hundreds of prompts/etc.
Midjourney output usually has many details that a human wouldn't put in. If you look at windows, architecture, etc., you see really strange, off-putting lines that shouldn't be there. If it's a darker-themed image, you often see "spaghetti nightmares". Much of that spaghetti-looking stuff is a key marker of Midjourney output.
I've mostly seen it with metal albums, even albums coming from labels who should be paying artists instead.
I know Trespasser's latest album is AI art + a bit of editing. It basically looks like an old/ancient style painting but with a bunch of details messed up / broken in the unique way AI art breaks.
The reason I can't think of many off the top of my head is that obvious AI art use on an album cover is usually an instant dealbreaker for me, to the point where I won't listen to the band. Trespasser's politics happen to vocally align with my own so I give them a pass. I watch a weekly stream where for a few hours we view all new releases coming out each week based on their release date on Metal Archives, and we encounter AI cover art nearly every week.
Examples:
Trespasser's latest
Red Cain - Naebliss (obvious Midjourney output with no obvious edits)
Seek's latest (more edited, but the windows are tell-tale Midjourney)
Looking at the Trespasser and Red Cain albums (I couldn't find an album by a band named Seek that was recent enough to be what you were talking about), I don't know. It could be AI weirdness, but it could also be intentional weirdness. I've definitely seen art with that kind of dreamy vibe from before AI art was a thing.
Also, thanks for letting me know about a new Wheel of Time concept album. I've got to check that out.
The Red Cain is the worst offender of the ones I listed. It is the most obvious. I don't know if you've used Midjourney some or at all, but I'm pretty sure anyone that's familiar with Midjourney can instantly see the IMHO undeniable Midjourneyisms in that album art.
ἈΠΟΚΆΛΥΨΙΣ by TRESPASSER has a dove and light beam edited over top of it, but the buildings, people, and flags all have Midjourneyisms too, none of which I've seen a human create without AI art. There's a definite "look" and I think, unlike text, a lot of low-effort AI art is incredibly obvious.
I can't find the Seek one I found earlier (I was at work, looking in the Bandcamp app).
And or "I deserve such a strong claim to copyright that I own the right to control who can learn what from my art", basically summoning a new and self beneficial rule from thin air to prevent this progress.
There are actually quite a lot of rules against profiting from someone else's work without paying them. This technology could not exist without enormous amounts of training data, and as compensation for all their time and effort artists are looking to be the ones who lose out big.
They're literally trying to hold back what I am allowed to do so they can make a profit. I'm not going to have much sympathy for anyone taking such a stance. Especially when it's argued in such an angry and dismissive way.
Those greedy artists, they want to be paid for their work so that they can afford useless things like food and rent. There is a massive difference between corporations seeking ever growing profits and people just trying to pay their bills, and artists as a whole tend to overwhelmingly fall into the latter group.
Commissions and real creative works? It's going to require the touch of a human, and while I'm sure all artists will be forced to use AI tools in their craft, I think we will see demand scale quite a lot to account for the extra productivity, and we will see artists' pay and livelihoods increase from here on.
I'm extremely skeptical that for the first time in history, the threat of automation will be used to increase worker wages. I think companies will use the threat of automation, feasible or not, as a cudgel to extract more value from artists for less pay. Some artists will make enough of a name for themselves to still make a good living, but those positions will overwhelmingly go to the privileged, the people who can afford to take low or no pay work for years to build a reputation or who have connections to management.
There is real harm being done to real people by this technology, and I think it's a bad move to simply brush it off, even if you (not you specifically, bioemerl, I'm speaking with a general you) approach the problem from a purely selfish perspective. What's happening to artists now is a story that's going to happen to a lot of people in the future, whether from advances in AI or climate change chaos and forced degrowth. The sooner we figure this problem out, the fewer people are going to suffer, and the less likely it is that you'll be one of them.
And on an emotion-driven level, I think this whole thing is tragic. Art isn't a career field that people get into to make the big bucks; they do it because they like to do it and find satisfaction from doing so. Even if they find new work, I think it's unlikely to be anywhere near as fulfilling, and the conversion of jobs from ones that bring people meaning to ones that don't is sad. Art is also a field that improves the lives of people other than the ones doing it. Most people have been deeply touched by a work of art, a brief moment of true connection to another human and a message that resonates on a core level. Even if AI gets to a level where it can replicate that, to me there's something so terribly lonely about that idea.
The rules protecting copyright are just that: copy rights. To learn from and create novel works was never protected, and contrary to what a lot of people are saying, these AI models don't just copy and reproduce artwork. They create genuinely new bits of art.
and as compensation for all their time and effort artists are looking to be the ones who lose out big.
They aren't really. Even if you assume artists capture the entire profit of AI art, each individual artist would likely make pennies. The data going into these models is absolutely massive, and any attempt to even imagine paying for every data point, even if you assume the artists even deserve that pay, would see the bureaucratic costs render it a moot point.
they want to be paid for their work so that they can afford useless things like food and rent
They want to be required to do work that isn't necessary for them to do, and paid for that unnecessary work.
Artists are already paid for their work. They've been being paid for their work for ages. They'll likely continue to be paid for their work. The advent of AI art doesn't change that, it only changes what other people who aren't artists are able to do.
Artists want to restrict that, because they feel threatened that other people having access to AI art will render their current skills not valuable.
Even if it does, (I don't think it will in the medium term) they can do what every other displaced worker has done and find a new job. The artists making a living on art today are already a very rare occurrence and the vast majority of artists already have to have a side job regardless.
I'm extremely skeptical that for the first time in history, the threat of automation will be used to increase worker wages
Our entire modern lifestyle exists because automation enabled higher wages and standards of living. The historical precedent for this sort of automation generally improving people's lives is plentiful.
The rules protecting copyright are just that: copy rights. To learn from and create novel works was never protected, and contrary to what a lot of people are saying, these AI models don't just copy and reproduce artwork. They create genuinely new bits of art.
Fair. I doubt the legislators who wrote those laws were expecting anything like this to happen, though it is common to create new laws in response to new technology. But overall I'm not familiar enough with the technology to really debate the point, so I'll concede it. It just sits badly with me that artists were a vital part of making these technologies, but even though they never asked for or consented to it, they're the ones who will lose their jobs to it.
They aren't really. Even if you assume artists capture the entire profit of AI art, each individual artist would likely make pennies. The data going into these models is absolutely massive, and any attempt to even imagine paying for every data point, even if you assume the artists even deserve that pay, would see the bureaucratic costs render it a moot point.
Compensation was a poor word choice. The cost paid by artists is the loss of opportunity, the loss of actual paid work that will instead go to AI now.
They want to be required to do work that isn't necessary for them to do, and paid for that unnecessary work.
Artists are already paid for their work. They've been being paid for their work for ages. They'll likely continue to be paid for their work. The advent of AI art doesn't change that, it only changes what other people who aren't artists are able to do.
It depends on what you mean by necessary. If you mean the strict bare minimum necessary for society to function, then no, but there are quite a lot of unnecessary jobs through that lens, and I think it's an entirely different discussion. There is demand for these jobs to be done, and if they're going to be done I'd much rather the benefits go to a human than to a computer, with all of the gains going to the people at the top.
I should clarify: by paid, I mean paid fairly, which I should've specified earlier. My understanding of the situation is that music artists generally make most of their money off of live shows, because so little of the value generated by their music on streaming services goes back to them. The Writers Guild in the US is currently on strike because their wages have failed to keep up with the cost of living to the point that it's no longer a viable career. Pretty much every artist has gotten a "work for exposure" offer to the point where it's a meme.
And then comes AI. To use the writers' strike as an example, one idea that's been bandied about a lot is using AI to generate a script, then bringing in a writer to make punch-ups. The main reason is that it's less money to hire a writer for punch-ups than for script writing. But the writers I've heard talk about it say that it's harder to fix a bad script than to just write a good one in the first place. That's the trajectory I see AI following if we don't work to change things: money taken from working-class folks and redirected to executives.
Even if it does, (I don't think it will in the medium term) they can do what every other displaced worker has done and find a new job. The artists making a living on art today are already a very rare occurrence and the vast majority of artists already have to have a side job regardless.
That's not an easy thing to do for many people, and can take a real toll. Even if it's a supplemental income, losing it is going to have real costs in terms of human misery that I don't think should be brushed off so lightly, especially since this is something that's going to keep happening in various fields.
Our entire modern lifestyle exists because automation enabled higher wages and standards of living. The historical precedent for this sort of automation generally improving people's lives is plentiful.
We're talking past each other a bit here, I think. I wasn't talking long term, I was responding to you saying that you thought artist pay would increase, which I see as a short-to-medium-term affair. For around a decade now, service workers and truckers have been threatened with automation any time they push for improved working conditions: "If we had to pay you more, we'd just use a computer instead." For artists, I see it going much the same way, with some passion-field exploitation thrown in too: "There are a million other artists who would love the job; if you want more money I'll just give it to them, or to a computer."
Long term, automation may benefit society, but that possibility doesn't excuse the preventable short-term harm done. Trusting in things to naturally work out seems to me like a way to end up in a situation like the one farmers in the US are in. There have been strides made in automating farming, but every year more and more small-time operations are pushed out by the big corporations. Equitable outcomes require effort to ensure that the biggest players don't leverage their position to end up with an outsized share.
"you shouldn't use this tool because I think it's bad". And or "I deserve such a strong claim to copyright that I own the right to control who can learn what from my art", basically summoning a new and self beneficial rule from thin air to prevent this progress.
Just note that you're guilty of your first complaint within a single sentence. You're just automatically assuming that AI is "progress" and anything that holds it back is automatically bad. Seems like there's a lot of assumptions being made about the benefits of AI here without justification.
Your second complaint is pretty hyperbolic. Artists don't need to claim any stronger version of copyright than exists already. Copyright doesn't prevent what anyone learns from looking at art or from reading a book, etc. Exploiting a copy of an artist's work to make derivative works would be where copyright would already come into play. And that's pretty explicitly what training AI models is doing.
They're literally trying to hold back what I am allowed to do so they can make a profit. I'm not going to have much sympathy for anyone taking such a stance.
You're pretty vague here, but I'm not sure anyone should sympathize with a position that you should be able to do whatever you want without regard to the rights, lives, and livelihood of others.
What should you be allowed to do? Make a profit? Make a profit by exploiting the work of artists? Convince artists that to survive as artists they'll need to adopt your proprietary AI tools (which were trained using the artist's work in the first place) for a $19.99/mo subscription?
Artists don't even need to desire profits at all. Licenses like Creative Commons BY-NC exist for artists that want to give away their work, even allow it to be remixed, but not for commercial purposes without permission.
Copyright doesn't prevent what anyone learns from looking at art or from reading a book, etc. Exploiting a copy of an artist's work to make derivative works
AI is doing the former. An AI model is trained to perform an action on data - in this case images collected from the internet.
A derivative work implies a person's artwork is floating around in the model somewhere or the person's artwork is somehow part of the AI. Neither are true (barring a few cases where the AI developers screw up and it overfits on data, learning to copy instead of actually learning).
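Some rough back-of-the-envelope numbers on why the artwork can't literally be sitting in there (the figures are approximate and from memory, so treat this as order-of-magnitude only):

```python
# Order-of-magnitude estimate of model capacity per training image.
model_params = 1.0e9        # Stable Diffusion class models: roughly a billion parameters
bytes_per_param = 2         # fp16 weights, so roughly a 2 GB checkpoint
training_images = 2.3e9     # LAION-2B scale dataset

print((model_params * bytes_per_param) / training_images)   # ~0.9 bytes per training image
```

Under a byte of capacity per image isn't enough to store anything; what the model holds is a statistical summary of the whole dataset.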
I'm not sure anyone should sympathize with a position that you should be able to do whatever you want without regard to the rights, lives, and livelihood of others.
Sure, but when what I'm doing doesn't actually get in the way of any of those things, this isn't my position.
I'm not interfering with artists' ability to sell art. They still can do that. Nor with their ability to own the art they produce. They still have copyright just like they did before.
People using public data to train AI aren't stealing anything either. The tools literally just go download the works from the internet using public links.
And this proposition:
What should you be allowed to do? Make a profit? Make a profit by exploiting the work of artists?
Is where you establish that you're the one interested in limiting and interfering. Here's a brand new revolutionary tool and you're trying to shut it down so it can't be used.
If you can prove a work is derivative, do so, and get it shut down with copyright law. 99 percent of the time with AI art you can't prove that, because it's literally not derivative. It's a machine that knows how to make art, and the use of it doesn't exploit artists in any way but being competition to them.
they'll need to adopt your proprietary AI tools (which were trained using the artist's work in the first place) for a $19.99/mo subscription?
Now this I can get behind. These tools should be open source as much as possible and I think everyone paying a subscription for them is a fool.
Copyright law is inherently an imposition: it is the state giving an individual exclusive control over a specific piece of information. As it stands, not speaking for bioemerl, it's hard for me to stomach a "compassionate" argument for the privatization of culture that primarily benefits large firms with legal funds to enforce their copyright, just because it's convenient for artists to squeeze by on selling their copyright to those businesses.
I would challenge the idea that, in the case of generative visual art, it's clear that ML models are creating "derivative works" from their training set. It's an argument being had, and it won't be settled soon. In cases of LLMs like GPT, that argument feels more accurate, but separating derivative works of visual art from vaguely-influenced works seems incredibly difficult.
If your line of reasoning in the second half of this comment rests upon "You should not be guaranteed a profit based on the stolen art of others," that seems to be an argument against our current status quo more broadly, one in which AI art is only a small and not especially exceptional part.
and we will see artists' pay and livelihoods increase from here on
No, we will not. Artistic wages have declined since the 80s. We will see all the young artists in training being put out of business as the commissions they would cut their artistic teeth on will no longer exist (I've already seen this). No pay = no art; no art = no progress as an artist. It's a race to the bottom, the bottom being everyone using AI that is trained (unethically) on the styles of anyone successful. That artist will then have to compete with a machine cranking out the exact same style of their work, and they will be put out of business.
AI as a whole isn't bad. What is bad is the inability to opt out of being included in data sets and to control who can reproduce your work and in turn steal the style you've worked on developing.
The entire argument is about moral judgements around the use of the tool, not the viability of the tool itself. So far every company behind these tools has been morally bankrupt and refused to work with artistic communities to develop their tools in a way that is synergistic. Hence class action lawsuits are under way. Until there are common sense laws in place to protect human output, AI will always be detrimental and morally suspect.
Artistic wages have declined since the 80s
This is just a really terrible statistic in general.
Firstly, it's since the 80s. The relevance of a historical trend like that to a modern world-changing tool just doesn't exist.
It's also a really bad statistic. How do they get that estimation? Is it adjusted for inflation? Does it account for the fact that very few people need to do things like having their photograph taken because they have digital cameras? Does it include the fact that there are many many more small-time artists running around because they have access to the internet where they can just post their art online for some cheap $20 commission?
And I don't think commissions are going to make that big of an impact. I've seen what you have to charge to get commissions in most cases. They're cheap as all hell. There are like 50,000 artists desperate to get any amount of money for their passionate work, and as a result they're engaged in this massive pit fight where they constantly undermine each other and it's basically impossible to make a living in the field.
That's part of the reason I think a lot of this criticism is coming from people of incredible privilege. Any artist who is actually making money off of their art is like one in 10,000. They are the relative ultra-wealthy; they are living the dream. While they complain about having to lose their passion job, hundreds of other would-be artists are working at Starbucks.
Also, I've seen what you have to do to get high-quality work out of an AI. Unless you're looking for something crazy generic, for the most part you're not going to be happy as some random unskilled user punching some prompts into the tool and getting an output.
It takes a genuine amount of effort and devotion to learning the tools that exist today in order to actually get some workable outputs from these AIs. You can't really get consistent characters out of them. You can't really count on hands and anatomy looking alright.
The AI-generated artwork you see out there that's like "wow, look at this, it's just like a person made it" is like one in 1,000 generations, and/or the result of a lot of tweaking and fine-tuning of the generation with stuff like ControlNet.
Barring some sort of revolutionary new generation of AI that can take more generic human prompting and produce an image with live feedback, maybe something like ChatGPT mixed with one of these diffusion models, commissions aren't going to be largely impacted. Especially considering the above, where commissions right now are a total hellscape.
What is bad is the inability to opt out of being included in data sets
And this has nothing to do with pay; this has nothing to do with commissions. This has nothing to do with artists being able to sell commissions. Everything you're complaining about has absolutely nothing to do with copyright.
Adobe already has a model trained on open source and licensed artwork. It's already working, you can already use it for most of the same things you do with stable diffusion.
I think that's the problem: none of the problems you've raised so far are going to be solved by "ethical" datasets.
If you use an AI to reproduce someone's artwork in an exact form, that reproduction is just as protected under copyright as if someone had come along and traced a sketch. AI doesn't change copyright law.
You can use AI to reproduce a style if the AI has been trained on that style. But you could also get some random cheap $20 commission artist to do that as well. Style has never been copyright protected.
So far every company behind these tools has been morally bankrupt and refused to work with artistic communities
Well yeah, I certainly wouldn't. Artistic communities don't want this tool to exist because it's seen as a threat. What are you going to do, work with the people who want to shut you down?
And to be fair, there's only one AI company that I have respect for, and that's Stability AI, because they actually make their models open source for the average person to use and run themselves instead of trying to make big money selling the model as a software-as-a-service thing.
I still don't know how they plan to be sustainable in the long term, but they're the only company I would actually back on this whole deal, because tools like this must not be locked behind web APIs.
Everyone pushing hard against the use and adoption of AI today is going to be seen as a dated Luddite within 20 years. You're fighting a losing battle, and you're fighting a losing battle for the express purpose of holding humanity back so you can make more money doing high labor tasks.
You might even get some draconian copyright law passed, but eventually it will fall. The common good of automated tools that can just create artwork out of thin air with near zero human effort is just too large of a benefit to humanity to pass up in the long term.
It would be like trying to ban music so the artists can still make money doing live performances. It's just not going to happen.
OK, I'll bite.
I am one of those "10,000" you claim are "living the dream." I have scrimped, scrapped and scrounged every art job in existence to get here. It was a proving ground, getting paid 20-50 bucks a commission until my art was good enough to take a step up to the next level. (and it isn't some amazing lifestyle, it's not even breaking middle class) Yeah, we don't have unions protecting us, helping us get better pay. Something I think you can identify with given your comment history.
As a professional surrounded by working professionals, I have intimate, first hand accounts refuting everything you've talked about. From the highest level Marvel comics artist, Sketch artists in the entertainment industry, right on down to the people in my community of young artists struggling to make even 100 bucks on something. I know, not think, that we have been getting screwed for years. As a small example, I began working in comics in 1994. I was paid MORE per page than a beginning artist today. It's the same in entertainment illustration, everyone from the top illustrators to the lowly sketch artists had a higher ceiling back in the 80s and 90s. So, apologies, but it isn't some vacuous "they say" piece of data.
All of the smaller artists I know (not a small number, upwards of 300) lost money to AI the second it became big enough for people to plug in a selfie and do a shitty rip-off of digital art. Again, first-hand data of the impact it has had across the board, because NO ONE asked if they could use their work.
The problems I listed CAN be solved by inclusion of artists in the data set and paying a royalty structure, just like royalties for musical artists that write, produce or participate in the making of a song. But visual artists don't have ASCAP...noticing a trend? Unions to protect us...yeah, they don't exist. So we're getting nickel-and-dimed out of existence. Include an opt-in for a payment for hard-won creativity and expertise. Weird how easy it could be.
Everyone pushing hard against the use and adoption of AI today is going to be seen as a dated Luddite within 20 years. You're fighting a losing battle, and you're fighting a losing battle for the express purpose of holding humanity back so you can make more money doing high labor tasks.
I think there is a nuanced approach where the tech will be helpful to humanity without destroying the positives that we have built over time and supporting the creativity of future generations. If that makes me a luddite, cool, I'll live with that.
and it isn't some amazing lifestyle, it's not even breaking middle class
I'm aware, which is what led me to say the "relative" dream. Demand for any hand-made art was already so low that if you could make a living from it, you're probably someone with a lot of luck and a pretty large network bringing attention to you and away from others.
This is also why I don't buy the idea that a lack of commissions would cause artists to wither. Working on them doesn't provide some temporary lifestyle for people to work their way up through; it's a hellscape of lower-than-minimum-wage work with prices constantly undercut by other desperate artists.
I don't believe commissions will be replaced by AI art, but if they are, at the end of the day it's probably going to cause less suffering when all these people who think they might make it get shoved into the cold earlier, so they can adjust sooner.
This is why I told my brother not to ever go into music. You have to be absolutely amazing to have a chance, and even if you are, without some luck or wealthy parents you're still left high and dry, you're still screwed, and even if you find marginal success you're damned to poverty your whole life.
Arts should remain a hobby. If they become a job by luck, awesome. Otherwise, do something else with your career so you can draw.
This was true regardless of AI.
From the highest level Marvel comics artist, Sketch artists in the entertainment industry, right on down to the people in my community of young artists struggling to make even 100 bucks on something
Well yeah, your point of comparison was the 80s when comics were actually popular and sketch artists were actually valuable.
Sketching and human-drawn art went from a valuable commodity (animation and CAD and comics and so on) to nearly totally worthless, basically only practically useful as a luxury expense for people who specifically want their art to be human (anyone who just wanted pretty art can just Google for it) or for a very niche use case (for example, the furry and porn commission industry).
You're looking at the withered husk of a long dead industry, a few niche use cases for something that was once broadly required.
And that's why AI art isn't going to wipe it out. People who just want pretty art will still go Google it, and get better quality than AI can do.
People who want super personalized art can't use AI because the AI just generally sucks at it. It's more work to generate it that way than the costs of just getting a commission.
The only feasible thing I can see is that artists using AI will immediately squash and push out artists who don't from the commissions market. But that's life. Use the tools that make you more productive, or go obsolete.
The problems I listed CAN be solved by inclusion of artists in the data set and paying a royalty structure
No they can't, because AI art doesn't make any money. Which sounds weird, clearly they are making money, right?
Sure they are, but there are billions of images in these datasets. The AI companies are making such a hilariously minuscule amount of money that any given artist will be making literally pennies.
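To put hypothetical numbers on it (these are made up purely for illustration, not anyone's actual financials):

```python
# Hypothetical: even a generous profit figure, spread over a dataset this size,
# comes out to cents per image per year before any administrative overhead.
annual_profit = 100_000_000             # assume $100M/year in profit
images_in_training_set = 2_000_000_000  # billions of images in the dataset

print(annual_profit / images_in_training_set)   # $0.05 per image per year
```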
And remember that these tools are open source. The people using these models don't, won't, and never will be paying money to companies. Even if you annihilate the companies, the open source community will continue to happily scrape the web and train new AI. You'll get rid of them about as soon as you get rid of piracy.
Eventually AI will be powerful enough that even without training data you could give it an example song and it'll spit out new, similar examples. Even if the "MAFIAA" manages to use lawsuits to hold things back on the music end, they can't stop the tide.
And or "I deserve such a strong claim to copyright that I own the right to control who can learn what from my art", basically summoning a new and self beneficial rule from thin air to prevent this progress.
They're literally trying to hold back what I am allowed to do so they can make a profit. I'm not going to have much sympathy for anyone taking such a stance. Especially when it's argued in such an angry and dismissive way.
I think it's important to remember the context here. AI artists and proponents are just starting down a new road, it's an exciting time and there is lots of new opportunity. I don't think anyone should begrudge them for being excited. Traditional artists have built a career and possibly families and other responsibilities on top of this expected source of income and they're not wrong to be scared and upset. Maybe their history as artists will give them an edge over other AI artists if they choose to use the tools, but maybe not. There is a lot of uncertainty in their future so I don't think we should hold the two groups to the same standard.
It seems deeply wrong to me to both excuse artists behavior while simultaneously using the reaction from AI enthusiasts to that behavior as a means to slander and "disdain" them.
The road of understanding should go both ways.
You can't expect a group to be attacked and everyone in that group just run around being the better person. It never works that way.
It seems deeply wrong to me to both excuse artists behavior while simultaneously using the reaction from AI enthusiasts to that behavior as a means to slander and "disdain" them.
I'm not sure what you mean here, is this directed at me?
You can't expect a group to be attacked and everyone in that group just run around being the better person. It never works that way.
Yea, I understand that. I don't agree with the artist response, it's just predictable is all. I'm mostly just speaking as a third party observer of both groups. I guess my message to AI artist proponents would be that the artist response is no different than how any group would respond when facing an existential threat to their career, and to keep that in mind.
is this directed at me?
To a degree, but mostly in context of what I was originally responding to where someone was saying "disdain for the AI enthusiasts is justified". If you didn't have that original comment in mind what I said there may not apply.
I guess my message to AI artist proponents would be that the artist response is no different than how any group would respond when facing an existential threat to their career, and to keep that in mind.
I agree with you entirely. I've told off a few on the pro AI side who run around acting like absolute children and calling everyone "luddies" as a meme-phrase-insult.
But I have similar feelings for a number of people in threads like this who are using terms like "AI bros" or acting like they're in a place where they have a right to "take away their toys for misbehaving". Those people are acting in a similarly very toxic way and from how I read your message I saw it as ignoring or even justifying their behavior.
Those people are acting in a similarly very toxic way and from how I read your message I saw it as ignoring or even justifying their behavior.
I'm not really involved with any other discussions, but given your characterizations I agree, everyone is being childish. However, I'm still of the opinion that we should expect better from the group that doesn't have their world crashing down around them. Perhaps that's a little optimistic though.
I fully understand why artists who spent years or decades learning skills and building careers aren't too excited about AI art that can do most of what they do with almost no effort.
On the other hand, I'm not an artist. I've been playing around with it a fair bit and I'm excited that I can build a picture I'm seeing in my head. I don't feel bad about this, because I'm not going to sell a picture of my cat wearing power armor and I would never have paid someone else to draw it for me.
When digital art became possible, people sneered and said it wasn't real art, but that sounds ridiculous today. I'm pretty good at photoshop. There's skills I learned in the past that became completely useless as new upgrades automated those processes. I don't mind the fact that anyone can easily pull the background off a picture. I'm excited I don't have to do it the long, boring way.
In the long run I don't think AI art is going to replace artists any more than cameras or Photoshop did. It will just be another tool artists can use. I think we will benefit overall. There are probably kids today with a great idea for a comic book or animation that would never get made if they had to draw it by hand, but they'll be able to share it with the world as making art becomes easier.
This argument also seems rather dismissive of those who cannot create through traditional tools. I don't think it really matters if that's due to physical disability, or simply that they haven't invested the time to learn the skills. I'm sure some will disagree with that comment, but I really dislike the idea that there's a requirement of time or effort for something to "count" as art.
This is what I think will always keep me on the "pro-AI-art" side. There are people that want to make art, but can't. For a small example: I have shaky hands. It's not horrible by any means, but my handwriting has always been terrible. I genuinely don't think drawing/painting/whatever is ever going to go well for me.
However, when it comes to artistic expression, I am a decent programmer and an okay writer (well, typer at least). I'm already good at those, if a little out of practice, and so if I want to make "art" then that's the path I'm going to go down. And that means I am never in my entire life going to see the images that I wish I could create.
Except now I can. And I don't think I should be looked down upon because I spent the first 30 years of my life focusing on other forms of art that I can actually do. I feel the same about things like ChatGPT, as well. If someone has a cool idea for a story but can never seem to get the things they see in their head down on the page, then I am incredibly happy that they have another tool that can help them out.
Digital art is uniquely suited to these early days of AI because an image is a simple bag of bytes with very strong relationships between those bytes, relationships the AI can learn and then depict in all sorts of images in different styles.
Eventually, the abstraction layer will creep higher and AI will be able to understand less discrete relationships in more complex datasets and it will be able to do more complex tasks.
However, when it comes to artistic expression, I am a decent programmer and an okay writer (well, typer at least). I'm already good at those, if a little out of practice, and so if I want to make "art" then that's the path I'm going to go down. And that means I am never in my entire life going to see the images that I wish I could create.
Would you not feel a little bit protective of your skills as a programmer if anyone could hire an AI for pennies and, giving high-level goals to an AI agent, crank out software in a few minutes that would take you days to create? I know I'll probably look down on the 'programmers' of the future, not because they deserve it but because a) my livelihood may be threatened, and b) I invested a bunch of time in the hard way before the easy way existed.
To be clear, I think trying to avoid any of this is trying to close the lid on Pandora's box, but I think people are justifiably upset.
Let me preface everything I say by making it clear I do not professionally code or write. So even though I try to empathize with people that are in this situation, I obviously am not directly threatened by any of this.
For me personally, I think I see the act of programming as a means to an end. There is certainly a skill to it, and in many ways it is also an artform, but the goal of programming is usually not to create good code. The goal (again, just how I see it) is to create either something that completes a task efficiently or something that is usable by a person. The important part is not the code itself; it's what the code does.
And, importantly, I think there will still be skill involved in getting an AI to create a program that does something novel, is usable by a person, and isn't filled with bugs. At least for the near future. Eventually we might be able to tell a computer "Make an app that does x, y, and z" and it spits out a perfect program with no faults, but for the foreseeable future it will probably just be another tool that allows more people to make more things. I think that's kind of always been the goal of humanity.
Reading the discussions in this thread, all that comes to my mind is "the only permanent thing in this world is change". I see people holding on to maintain the status quo, and I see people riding the flow of change.
I myself am a hobby artist, but I am also pro-AI, because I think that with the proliferation of AI images, human-made art will become more valuable in the future, just like sculptures cast from factory moulds versus sculptures commissioned to celebrate something great. The moulds are used for low-quality, unimportant works, but the statue is specifically commissioned because they want an artist to make their interpretation of a certain subject into reality; the artist's emotion, philosophy, and outlook on life will be reflected in that sculpture. The sculpture will have a soul. That will make it infinitely more valuable than a mould, no matter how perfect or exquisite the moulded piece looks. Hell, the commissioned sculpture may look like trash, but it will have soul. That's my take on it: AI art will make human art more valuable.
But there is still the issue of human livelihoods at stake. This I cannot comment on, because I myself do not earn from my art. I leave it to others more knowledgeable than me to think of ways to compromise and meet in the middle: not hampering progress, but also not harming the way of living of the people who are affected.
You can tell he is coming from the perspective of an author who sells his work for money. Every argument reflects this viewpoint. He repeatedly states that this is hurting people making money off their art, while at the same time slamming people who try to sell what they create using AI art.
I see a few uses for AI art, the chief one being as a starting point for a commissioned piece by a human. A "this, but good" sort of thing. The other is art for personal use that wasn't going to be bought anyway. Most people, if they want a particular piece of digital art to store and look at (a picture of, say, Spider-Man fighting Green Goblin), will scour the internet rather than buy it.
To me, the main problem with AI art isn't the supplanting of humans; it's using the art of humans for training without consent. Human artists are always going to be valued by some and not others (today's AI art is yesterday's "for exposure" request).
The video goes fairly in depth on your concerns! He talked a lot about how AI ethically trained on datasets of public domain/donated works definitely has a place in the world.
The video presents a critical overview of the arguments for and against AI art. It seems well informed in terms of both the artistic and technical aspects. I see the scope of this one as similar to what the Line Goes Up video did for NFTs. A lot of what's brought up you have probably all heard before, but the video maker addresses it substantially with his own counterarguments.
The channel's usual content is on creative writing, but he also claimed to have a published book and a background in law, so I trust that he's invested in the issue. His other video, about Japan's WWII war crimes, was also quite good.
I appreciate the nuance around this subject. It seems like so many people are so vitriolic about the topic of AI art, making often baseless and uninformed claims just because they're upset.
This video is longer than I have the patience for, but I like the point he made at 53:20 about the personal-growth aspect of making art being the source of societal/moral value.
I feel like he tried to cover every possible aspect of AI art in one video. 2 hours and 19 minutes is just ridiculously long.
Honestly, I fell asleep about an hour and a half in, and I don't feel like I missed anything.
He should break this up into 6-7 videos that each cover part of the discussion.
I didn't watch the video but I wrote something based on the discussion here so I'm going to post it anyway:
I'm not a lawyer but I think it's useful to think about this from a legal realism perspective. What's at stake?
Copyright law is at least somewhat self-enforcing in the sense that businesses often avoid doing anything that puts them at unnecessary legal risk. Releasing a library under GPL means a lot of businesses say "nope, not touching that, let's pick something else." Movie and record companies often make sure they have all the legal clearances they need and take out anything they can't get. Some businesses are avoiding AI-generated art due to legal uncertainty.
This means there's a market for artists who will charge reasonable prices for clear legal rights to use an image. The average user doesn't care since they're just going to copy something off the Internet, but there are businesses who do care.
Adobe claims that they really do have the rights to all the images that their AI image generator uses, which makes their product attractive to some businesses, even though it's not as impressive as MidJourney. This means that commercial artists are going to have to compete with AI-generated images regardless, because there is already AI-generated art available that's clearly legal. Also, there's little reason to believe Adobe can't improve their image generator and maybe catch up with MidJourney eventually.
Businesses don't always avoid legal risk. In other cases when enough is at stake, they will go ahead and maybe fight it out in court, or negotiate a compromise. Building a web search engine was once a legal grey area. YouTube was built on copyright infringement, and then Google negotiated a messy compromise with rights holders. For Google Books they tried and lost.
This means that court cases will affect us even though we're not directly involved. If the courts rule that what Stable Diffusion or MidJourney is doing really is infringing then it will matter to some customers and that will affect the market for artwork. MidJourney might need to shut down or start over.
I'm not sure it will be all that big a deal to the average Internet user who wants to make an AI-generated image, other than being disappointed that they have to use a somewhat worse AI image generator for a while. They probably weren't thinking about buying original art.
I have to admit I didn't much care for this video. The premise was a nuanced look at a complex issue from a legal perspective, which sounded awesome. But I feel it really didn't deliver on that premise. Instead I found it heavy on moralizing, and light on legal interpretation.
Probably the biggest problem I had is that this is somebody that clearly does not like the AI art community. He speaks with disdain towards them the entire time. He makes the occasional comment that "Sure, this isn't everybody", but then goes right back using phrases like "These people believe...".
He quotes random reddit users, who are frankly often young teenagers these days, and uses that as an example of how awful these users are. He then shows that a large number of AI generations are for anime pornography, which served little purpose other than to denigrate them. Is it really surprising that people like pornography? Do I need to point out how many nude ladies there are in classic paintings?
Beyond just the users, he attacks AI companies as being overly corporate, which - sure - I get. But he's again painting with broad strokes, and there's no distinction made for open models or open-source in general.
He uses the word "stealing" throughout the video as if it's a foregone conclusion. I'm not convinced that scanning an image to make minor mathematical adjustments to matrix in a model is actually stealing. I can absolutely see that argument being made, but he doesn't try to make it.
He implies that those using these tools are being unempathic to artists. That users are treating these tools as a means to an end, and not as true expressions of art. This is not necessarily wrong, but it also seems irrelevant to me. Art in all mediums can be produced as an expression of creativity, or as a mass-produced, audience-tested, soulless affair. And that's fine. If people are using AI generation to fill a role rather than to express their own creativity, I don't have a problem with that. I have no interest in Marvel's Superhero #73, but clearly a lot of other people do, so it's still creating value.
This argument also seems rather dismissive of those who cannot create through traditional tools. I don't think it really matters if that's due to physical disability, or simply that they haven't invested the time to learn the skills. I'm sure some will disagree with that comment, but I really dislike the idea that there's a requirement of time or effort for something to "count" as art. Art is expression, whether that's spending years working on your masterpiece, or taping a banana to the wall. Time and effort will likely factor into value provided, but not always.
I'm not saying that punching a few words into a generator and hitting submit makes you an artist, but I also don't discount this tool as a means of expression. For some people, it may be their only means of expression, and I think it's really unfair to dismiss that out of hand.
He touched on this some, but tools like inpainting, ControlNet, and DreamBooth, and LoRAs allow for a significant increase in control over these art generators. He argued that only a small number of users are actually making use of these tools right now, and I agree that's likely true. But this technology is still brand new. Everything is rapidly evolving. Those who spent six months learning how to correctly prompt Stable Diffusion 1.5 effectively had to start over when 2.0 released. The plugins and techniques required are evolving along with the tools themselves, and it's unreasonable to expect a majority of users to understand them. Eventually they will become more accessible, and their usage will likely increase as a result.
Part 5 was better, and covered some good examples of art with varying levels of control by the artists. His arguments seemed consistent with statements published by the US Copyright Office, namely that it comes down to the level of control exercised by the creator. As mentioned above, that control is likely to increase as the technology evolves, and that will also shape the question of copyright.
I certainly see the pitfalls of AI tools. Not just imagery, but text generation as well. It will be invaluable to spammers and other bad actors who will gladly make life a living hell for everybody else if they think they can earn a dollar doing it. And what can I say? Those people suck. But I really try not to blame the tool when that tool has other, positive applications. And I really believe that AI does. Whether that's being able to create your D&D character, or generating header images for your cat's blog, or creating a custom storybook for your kids. I think it's all very cool, and a lot of fun, and a viable form of creation for those who otherwise may not have had the means to express themselves.
I don't want to criticize this video too harshly because clearly the author put a lot of work into it. But despite the length, I really don't feel it offered much new to the conversation, and in many ways dragged it down further by attacking portions of the community. That seems like a good way to turn away those you're trying to reach, and his closing comments seem to acknowledge that many will have done just that.
I think the disdain is warranted, because the complete lack of empathy towards traditional artists has been astonishing. This is a disruptive technology, so it's understandable that those who have been disrupted are lashing out, and the reaction to that from the AI community has been surprise at best, and mockery at worst.
Sadly because the technology is out there, fully functional and impossible to stop, people are free to be as inconsiderate as they want. It's not like anybody can take their toys away from them if they're behaving badly. The victims have no recourse. So it seems fair to me that tech bros should get all the disdain they deserve, because social shaming is the only tool to keep them in check.
Consider that the disdain towards artists rallying against AI may also be very justified. From my perspective a lot of their arguments are deeply selfish and basically amount to "you shouldn't use this tool because I think it's bad".
And/or "I deserve such a strong claim to copyright that I own the right to control who can learn what from my art", basically summoning a new and self-beneficial rule from thin air to prevent this progress.
They're literally trying to hold back what I am allowed to do so they can make a profit. I'm not going to have much sympathy for anyone taking such a stance. Especially when it's argued in such an angry and dismissive way.
For those arguing that we should express sympathy for artists as a whole, I understand. I can see why you'd be scared and I wish you all the best.
But I think the forest is being missed for the trees here. 90-plus percent of AI art is going to cases where humans really couldn't do the job. For example, I'm seeing lots of people take static pictures of characters and use AI art to make their own personal collection of emotions to use in some visual-novel-like experience.
Commissions and real creative works? They're going to require the touch of a human, and while I'm sure all artists will be forced to use AI tools in their craft, I think we will see demand scale quite a lot to account for the extra productivity, and we will see artists' pay and livelihoods increase from here on.
Assuming AI art doesn't go superhuman that is, but we can worry about that when we get to it.
AI art is also already actively being used for music by record labels that have the money available to commission actual artists instead, and it's gross. They often don't even edit it much: there are albums coming out with obvious tell-tale signs of AI-generated art just thrown together on a cover with a logo smacked on top.
So, as someone that encounters a lot of new music releases on a weekly basis, I'm already seeing instances where it's being used to replace real creative works. Companies can and will cut corners to save costs if they can generate stuff via AI that they consider good "enough". I do think the threat to artists is real, and I think the worry about anger, dismissiveness, civility, etc. is mostly a distraction. Artists that earn a living through creative work (already a tough thing to do in our society) have a lot to lose, which to them is going to be far more important than whether someone gets to use a shiny new AI toy.
That said, I think it's mostly just a noise vs. signal problem. I won't buy albums that have obvious AI art, and most of the time the music is garbage anyway, because someone that cares for their creative output as a musician often will care enough to use something better than obvious AI art to represent it. So I think we'll mostly just see an acceleration/saturation- just far more volume of low-quality material out in the world that one will have to wade through to find something worthwhile. Whether that is albums, websites, artworks, writings, etc.
Also, I personally think the underlying problem to nearly all of the concerns I see with AI is capitalist structures. From artists worrying about their livelihood to jobs being replaced. The "enshittification" of AI is inevitable under this system. Advertising will infect it. Money will bias it. Ultimately, I think the bad uses of it will outweigh the good until the entire societal structure it is being built under is fundamentally overhauled.
Yet again, like most technologies - it just ends up being a mirror into who we are - and I don't like what I see
I agree, but I think it's a distraction mostly in favor of the artists. On a general scale the average person stands to benefit from, and will support, access to all forms of AI generated content.
And I understand concerns about spam, but for the most part every social system already deals with this problem. AI art will probably increase the noise, but these systems are already pretty well prepared for spam, given that generally only 1 in 100 works of art is good and worth displaying to users.
So the spam will get hit by - and increase load on - the systems built to filter it while the AI works that get through will probably do so on their merit as interesting works.
And while AI will certainly add to the spam, it's also greatly changing how content is consumed. For now, that's in the form of things like search and recommendation algorithms. ChatGPT is already killing Google when you need to know something obscure or hard to search.
I find this an incredibly bold claim. I hold that AI will be a net negative for humanity unless many aspects of society change. The average person may be gullible enough to be duped into believing it's good for them, though. Maybe my claim is just as bold, and I'll grant that.
As far as spam, we are not prepared for AI-enhanced levels of spam. AI detection systems are already having trouble actually determining whether something is AI-generated, and it will only get more difficult as AI progresses. Things can be low quality but much harder to distinguish as spam algorithmically. I don't know that we'll have the tools to properly fight it.
My issue with AI proponents is the assumption of its positive and useful nature, as if that's not piss in the wind before it's corrupted the same way everything else is. I am a tech person; I understand excitement when it comes to technology. I also understand that technology enables bad things at scale along with the good, and I find "generative" technology to be uniquely repugnant. I don't agree with the apocalyptic AI doomers (the sci-fi humanity-ending stuff), but I do agree with people who see how real issues in healthcare, jobs, inequality, and society in general are just waiting to be disrupted in negative ways by AI, with real harm caused. That is honestly scarier than any mass extinction event.
Can AI be used for good things too? Sure, but our society (at least in the US) is not incentivized at all to have the good outweigh the bad. Maybe I'll be proven wrong, but I doubt it.
I honestly just refuse to be hopeful about every new technology. Call me stuck in an era or whatever else but sometimes I think certain tech would be better off not existing at all in our current social structure, and increasingly I feel like AI is one of those.
That said, I don't think we'll see eye to eye on this, so I'm not sure how fruitful continuing discussion will prove to be. The direction technology takes these days is extremely divergent from the directions I would rather it go.
That's a pretty big claim. Have you got any examples? I'm not saying you're definitely wrong, but aside from pieces with obvious tells like extra hands, most of the art I see people citing as obvious AI art just looks like modern Chinese or Korean art to me. Which mostly just tells me there was a lot of art from China and Korea in the datasets used to train these things. It's suggestive, but not really damning the way it's often painted to be.
I'll see if I can post some later when I look them up, but most of them have tell-tale signs of being from Midjourney. If you've used AI art generators for any amount of time (I spent weeks doing generations and using it prior to becoming more anti-AI, and I think I might be okay with it in limited ways, like non-commercial personal use), the majority of the output is incredibly easy to spot. Especially lazy output that hasn't been altered or refined through hundreds of prompts, etc.
Midjourney output usually has many details that a human wouldn't put in. If you look at windows, architecture, etc. you see really strange, off-putting lines that shouldn't be there. If it's a darker-themed image, you often see "spaghetti nightmares". Much of that spaghetti-looking stuff is a key marker of Midjourney output.
I've mostly seen it with metal albums, even albums coming from labels who should be paying artists instead.
I know Trespasser's latest album is AI art + a bit of editing. It basically looks like an old/ancient style painting but with a bunch of details messed up / broken in the unique way AI art breaks.
The reason I can't think of many off the top of my head is that obvious AI art use on an album cover is usually an instant dealbreaker for me, where I won't listen to the band. Trespasser's politics happen to vocally align with my own, so I give them a pass. I watch a weekly stream where for a few hours we view all new releases coming out each week based on their release date on Metal Archives, and we encounter AI cover art nearly every week.
Examples:
Trespasser's latest
Red Cain - Naebliss (obvious Midjourney output with no obvious edits)
Seek's latest (more edited, but the windows are tell-tale Midjourney)
Looking at the Trespasser and Red Cain albums (I couldn't find an album by a band named Seek that was recent enough to be what you were talking about), I don't know. It could be AI weirdness, but it could also be intentional weirdness. I've definitely seen art with that kind of dreamy vibe from before AI art was a thing.
Also, thanks for letting me know about a new Wheel of Time concept album. I've got to check that out.
The Red Cain is the worst offender of the ones I listed. It is the most obvious. I don't know if you've used Midjourney some or at all, but I'm pretty sure anyone that's familiar with Midjourney can instantly see the IMHO undeniable Midjourneyisms in that album art.
ἈΠΟΚΆΛΥΨΙΣ by TRESPASSER has a dove and light beam edited over top of it, but the buildings, people, and flags, all have Midjourneyisms too- none of which I've seen a human create without AI art. There's a definite "look" and I think, unlike text, a lot of low-effort AI art is incredibly obvious.
I can't find the Seek one I found earlier (I was at work, looking in the bandcamp app)
Some other examples:
https://www.metal-archives.com/albums/A_Rising_Chapter/Inanimate/1111621
https://www.metal-archives.com/albums/Sagen/Roots_of_Proctor/1104299 (another super obvious one)
https://www.metal-archives.com/albums/Born_Criminal/Until_the_World_Collapse/1132092
https://www.metal-archives.com/albums/Get_Out_of_Nashville/The_Uncaring_Inhuman_Maelstrom/1128753
https://humanityslastbreathofficial.bandcamp.com/album/ashen?from=hp (obvious, says done by an artist inspired by AI examples but... that's sad because the artist wasn't good enough to make it not look AI)
https://awicha.bandcamp.com/album/drag-them-down-from-the-sky?from=discover-new
https://bindingties.bandcamp.com/album/diary-of-a-dying-world?from=discover-new
There are actually quite a lot of rules against profiting from someone else's work without paying them. This technology could not exist without enormous amounts of training data, and as compensation for all their time and effort artists are looking to be the ones who lose out big.
Those greedy artists, they want to be paid for their work so that they can afford useless things like food and rent. There is a massive difference between corporations seeking ever growing profits and people just trying to pay their bills, and artists as a whole tend to overwhelmingly fall into the latter group.
I'm extremely skeptical that for the first time in history, the threat of automation will be used to increase worker wages. I think companies will use the threat of automation, feasible or not, as a cudgel to extract more value from artists for less pay. Some artists will make enough of a name for themselves to still make a good living, but those positions will overwhelmingly go to the privileged, the people who can afford to take low or no pay work for years to build a reputation or who have connections to management.
There is real harm being done to real people by this technology, and I think it's a bad move to simply brush it off, even if you (not you specifically, bioemerl, I'm speaking with a general you) approach the problem from a purely selfish perspective. What's happening to artists now is a story that's going to happen to a lot of people in the future, whether from advances in AI or climate change chaos and forced degrowth. The sooner we figure this problem out, the fewer people are going to suffer, and the less likely it is that you'll be one of them.
And on an emotion driven level, I think this whole thing is tragic. Art isn't a career field that people get into to make the big bucks, they do it because they like to do it and find satisfaction from doing so. Even if they find new work, I think it's unlikely to be anywhere near as fulfilling, and the conversion of jobs from ones that bring people meaning to ones that don't is sad. Art is also a field that improves the lives of other people than the ones doing it. Most people have been deeply touched by a work of art, a brief moment of true connection to another human and a message that resonates on a core level. Even if AI gets to a level where it can replicate that, to me there's something so terribly lonely about that idea.
The rules protecting copyright are just that: the right to copy. Learning from works and creating novel ones was never restricted, and contrary to what a lot of people are saying, these AI models don't just copy and reproduce artwork. They create genuinely new bits of art.
They aren't really. Even if you assume artists capture the entire profit of AI art, each individual artist would likely make pennies. The data going into these models is absolutely massive, and any attempt to even imagine paying for every data point, even if you assume the artists deserve that pay, would see the bureaucratic costs render it a moot point.
They want to be required to do work that isn't necessary for them to do, and paid for that unnecessary work.
Artists are already paid for their work. They've been paid for their work for ages. They'll likely continue to be paid for their work. The advent of AI art doesn't change that; it only changes what other people who aren't artists are able to do.
Artists want to restrict that, because they feel threatened that other people having access to AI art will render their current skills no longer valuable.
Even if it does, (I don't think it will in the medium term) they can do what every other displaced worker has done and find a new job. The artists making a living on art today are already a very rare occurrence and the vast majority of artists already have to have a side job regardless.
Our entire modern lifestyle exists because automation enabled higher wages and standards of living. The historical precedent for this sort of automation generally improving people's lives is plentiful.
Fair. I doubt the legislators who wrote those laws were expecting anything like this to happen, though. It is common to create new laws in response to new technology. But overall I'm not familiar enough with the technology to really debate the point, so I'll concede it. It just sits badly with me that artists were a vital part of making these technologies, but even though they never asked for or consented to it, they're the ones who will lose their jobs to it.
Compensation was a poor word choice. The cost paid by artists is the loss of opportunity, the loss of actual paid work that will instead go to AI now.
It depends on how you mean necessary. If on a strict bare minimum necessary for society to function, then no, but there are quite a lot of unnecessary jobs through that lens, and I think it's an entirely different discussion. There is demand for these jobs to be done, and if they're going to be done I'd much rather the benefits go to a human than a computer with all the benefits going to the people at the top.
I should clarify: by paid, I mean paid fairly, which I should've specified earlier. My understanding of the situation is that music artists generally make most of their money off of live shows, because so little of the value generated by their music on streaming services goes back to them. The Writer's Guild in the US is currently on strike because their wages have failed to keep up with the cost of living to the point that it's no longer a viable career. Pretty much every artist has gotten a "work for exposure" offer to the point where it's a meme.
And then comes AI. To use the writer's strike as an example, one idea that's been bandied about a lot is using AI to generate a script, then bringing in a writer to make punch ups. The main reason is that it's less money to hire a writer for punch ups than for script writing. But the writers I've heard talk about it say that it's harder to fix a bad script than to just write a good one in the first place. That's the trajectory I see AI following if we don't work to change things; money taken from working class folks and redirected to executives.
That's not an easy thing to do for many people, and can take a real toll. Even if it's a supplemental income, losing it is going to have real costs in terms of human misery that I don't think should be brushed off so lightly, especially since this is something that's going to keep happening in various fields.
We're talking past each other a bit here I think. I wasn't talking long term, I was responding to you saying that you thought artist pay would increase, which I see as a short-medium term affair. For around a decade now, service workers and truckers have been threatened with automation any time they push for improved working conditions. If we had to pay you more, we'd just use a computer instead. For artists, I see it going much the same way, with some passion field exploitation thrown in too. There are a million other artists who would love the job, if you want more money I'll just give it to them, or a computer.
Long term, automation may benefit society, but that possibility doesn't excuse the preventable short-term harm done. Trusting in things to naturally work out seems to me like a way to end up in a situation like the one farmers in the US are in. There have been strides made in automating farming, but every year more and more small-time operations are pushed out by the big corporations. Equitable outcomes require effort to ensure that the biggest players don't leverage their position to end up with an outsized share.
Just note that you're guilty of your first complaint within a single sentence. You're just automatically assuming that AI is "progress" and anything that holds it back is automatically bad. Seems like there's a lot of assumptions being made about the benefits of AI here without justification.
Your second complaint is pretty hyperbolic. Artists don't need to claim any stronger version of copyright than exists already. Copyright doesn't prevent what anyone learns from looking at art or from reading a book, etc. Exploiting a copy of an artist's work to make derivative works would be where copyright would already come into play. And that's pretty explicitly what training AI models is doing.
You're pretty vague here, but I'm not sure anyone should sympathize with a position that you should be able to do whatever you want without regard to the rights, lives, and livelihood of others.
What should you be allowed to do? Make a profit? Make a profit by exploiting the work of artists? Convince artists that to survive as artists they'll need to adopt your proprietary AI tools (which were trained using the artist's work in the first place) for a $19.99/mo subscription?
Artists don't even need to desire profits at all. Licenses like Creative Commons BY-NC exist for artists that want to give away their work, even allow it to be remixed, but not for commercial purposes without permission.
AI is doing the former. An AI model is trained to perform an action on data - in this case images collected from the internet.
A derivative work implies a person's artwork is floating around in the model somewhere or the person's artwork is somehow part of the AI. Neither are true (barring a few cases where the AI developers screw up and it overfits on data, learning to copy instead of actually learning).
Sure, but what I'm doing doesn't actually get in the way of any of those things, so this isn't my position.
I'm not interfering with artists' ability to sell art. They can still do that. Nor with their ability to own the art they produce. They still have copyright, just like they did before.
People using public data to train AI aren't stealing anything either. The tools literally just go download the works from the internet using public links.
And this proposition:
Is where you establish that you're the one interested in limiting and interfering. Here's a brand new revolutionary tool and you're trying to shut it down so it can't be used.
If you can prove a work is derivative, do so, and get it shut down with copyright law. 99 percent of the time with AI art you can't prove that, because it's literally not derivative. It's a machine that knows how to make art, and the use of it doesn't exploit artists in any way but being competition to them.
Now this I can get behind. These tools should be open source as much as possible and I think everyone paying a subscription for them is a fool.
Copyright law is inherently an imposition: it is the state giving an individual exclusive control over a specific piece of information. As it stands, not speaking for bioemerl, it's hard for me to stomach a "compassionate" argument for the privatization of culture that primarily benefits large firms with legal funds to enforce their copyright, just because it's convenient for artists to squeeze by on selling their copyright to those businesses.
I would challenge the idea that, in the case of generative visual art, it's clear that MLs are creating "derivative works" from their training set. It's an argument being had, and it won't be settled soon. In cases of LLMs like GPT, that argument feels more accurate, but separating derivative works of visual art from vaguely-influenced works seems incredibly difficult.
If your line of reasoning in the second half of this comment rests upon "You should not be guaranteed a profit based on the stolen art of others," that seems to be an argument against our current status quo in a way that AI art is in only a small way notably exceptional.
No, we will not. Artistic wages have declined since the '80s. We will see all the young artists in training being put out of business as the commissions they would cut their artistic teeth on will no longer exist (I've already seen this). No pay = no art; no art = no progress as an artist. It's a race to the bottom, the bottom being everyone using AI that is trained (unethically) on the styles of anyone successful. That artist will then have to compete with a machine cranking out the exact same style of their work, and they will be put out of business.
AI as a whole isn't bad. What is bad is the inability to opt out of being included in data sets and to control who can reproduce your work and, in turn, steal the style you've worked on developing.
The entire argument is about moral judgements around the use of the tool, not the viability of the tool itself. So far every company behind these tools has been morally bankrupt and refused to work with artistic communities to develop their tools in a way that is synergistic. Hence class action lawsuits are under way. Until there are common sense laws in place to protect human output, AI will always be detrimental and morally suspect.
This is just a really terrible statistic in general.
Firstly, it's since the 80s. The relevance of a historical trend like that to a modern world-changing tool just doesn't exist.
It's also a really bad statistic. How do they get that estimation? Is it adjusted for inflation? Does it account for the fact that very few people need to do things like having their photograph taken because they have digital cameras? Does it include the fact that there are many many more small-time artists running around because they have access to the internet where they can just post their art online for some cheap $20 commission?
And I don't think commissions are going to make that big of an impact. I've seen what you have to charge to get commissions in most cases. They're cheap as all hell. There are like 50,000 artists desperate to get any amount of money for their passionate work, and as a result they're engaged in this massive pit fight where they constantly undermine each other, and it's basically impossible to make a living in the field.
That's part of the reason I think a lot of this criticism is coming from people of incredible privilege. Any artist who is actually making money off of their art is like one in 10,000. They are the relative ultra-wealthy; they are living the dream. While they complain about having to lose their passionate job, hundreds of other would-be artists are working at Starbucks.
Also, I've seen what you have to do to get high-quality work out of an AI. Unless you're looking for something crazy generic, for the most part you're not going to be happy as some random unskilled user punching some prompts into the tool and getting an output.
It takes a genuine amount of effort and devotion to learning the tools that exist today in order to actually get some workable outputs from these AIs. You can't really get consistent characters out of them. You can't really count on hands and anatomy looking alright.
The AI-generated artwork you see out there that's like "wow, look at this, it's just like a person" is like one in 1,000 generations, and/or the result of a lot of tweaking and fine-tuning of the generation with stuff like ControlNet.
Barring some sort of revolutionary new generation of AI that can take more generic human prompting and produce an image with live feedback, maybe something like ChatGPT mixed with one of these diffusion models, commissions aren't going to be largely impacted. Especially considering the above, where commissions right now are a total hellscape.
And this has nothing to do with pay, this has nothing to do with commissions. This has nothing to do with artists being able to sell commissions. Everything you're complaining about has absolutely nothing to do with copyright.
Adobe already has a model trained on open-source and licensed artwork. It's already working; you can already use it for most of the same things you do with Stable Diffusion.
I think that's the problem: none of the problems you've raised so far are going to be solved by "ethical" datasets.
If you use an AI to reproduce someone's artwork in an exact form, that reproduction is just as protected under copyright as if someone had come along and traced a sketch. AI doesn't change copyright law.
You can use AI to reproduce a style if the AI has been trained on that style. But you could also get some random cheap $20 commission artist to do that as well. Style has never been copyright protected.
Well yeah, I certainly wouldn't. Artistic communities don't want this tool to exist because it's seen as a threat. What are you going to do, work with the people who want to shut you down?
And to be fair, there's only one AI company that I have respect for, and that's Stability AI, because they actually make their models open source for the average person to use and run themselves, instead of trying to make big money selling the model as a software-as-a-service thing.
I still don't know how they plan to be sustainable in the long term, but they're the only company I would actually back on this whole deal, because tools like this must not be locked behind web APIs.
Everyone pushing hard against the use and adoption of AI today is going to be seen as a dated Luddite within 20 years. You're fighting a losing battle, and you're fighting a losing battle for the express purpose of holding humanity back so you can make more money doing high labor tasks.
You might even get some draconian copyright law passed, but eventually it will fall. The common good of automated tools that can just create artwork out of thin air with near zero human effort is just too large of a benefit to humanity to pass up in the long term.
It would be like trying to ban music so the artists can still make money doing live performances. It's just not going to happen.
OK, I'll bite.
I am one of those "one in 10,000" you claim are "living the dream." I have scrimped, scraped, and scrounged every art job in existence to get here. It was a proving ground, getting paid 20-50 bucks a commission until my art was good enough to take a step up to the next level (and it isn't some amazing lifestyle; it's not even breaking middle class). Yeah, we don't have unions protecting us, helping us get better pay. Something I think you can identify with given your comment history.
As a professional surrounded by working professionals, I have intimate, first-hand accounts refuting everything you've talked about. From the highest-level Marvel comics artists, to sketch artists in the entertainment industry, right on down to the people in my community of young artists struggling to make even 100 bucks on something. I know, not think, that we have been getting screwed for years. As a small example, I began working in comics in 1994. I was paid MORE per page than a beginning artist today. It's the same in entertainment illustration: everyone from the top illustrators to the lowly sketch artists had a higher ceiling back in the 80s and 90s. So, apologies, but it isn't some vacuous "they say" piece of data.
All of the smaller artists I know (not a small number, upwards of 300) lost money to AI the second it became big enough for people to plug in a selfie and do a shitty rip-off of digital art. Again, first-hand data on the impact it has had across the board, because NO ONE asked if they could use their work.
The problems I listed CAN be solved by the inclusion of artists in the data set and paying a royalty structure, just like royalties for musical artists that write, produce, or participate in the making of a song. But visual artists don't have ASCAP... noticing a trend? Unions to protect us... yeah, they don't exist. So we're getting nickel-and-dimed out of existence. Include an opt-in with payment for hard-won creativity and expertise. Weird how easy it could be.
I think there is a nuanced approach where the tech will be helpful to humanity without destroying the positives that we have built over time and supporting the creativity of future generations. If that makes me a luddite, cool, I'll live with that.
I'm aware, which is what led me to say the "relative" dream. Demand for any hand-made art was already so low that if you could make a living from it, you're probably someone with a lot of luck and a pretty large network bringing attention to you and away from others.
This is also why I don't buy the idea that a lack of commissions would cause artists to wither. Working on them doesn't provide some temporary lifestyle for people to work up through; it's a hellscape of lower-than-minimum-wage work with prices constantly undercut by other desperate artists.
I don't believe commissions will be replaced by AI art, but if they are, at the end of the day it's probably going to cause less suffering when all these people who think they might make it get shoved into the cold earlier so they can adjust.
This is why I told my brother not to ever go into music. You have to be absolutely amazing to have a chance, and even if you are, without some luck or wealthy parents you're still left high and dry, you're still screwed, and even if you find marginal success you're damned to poverty your whole life.
Arts should remain a hobby. If they become a job by luck, awesome. Otherwise, do something else with your career so you can draw.
This was true regardless of AI.
Well yeah, your point of comparison was the 80s when comics were actually popular and sketch artists were actually valuable.
Sketching and human-drawn art went from a valuable commodity (animation and CAD and comics and so on) to nearly totally worthless, basically only practically useful as a luxury expense for people who specifically want their art to be human (anyone who just wanted pretty art can just Google for it), or in a very niche use case (for example, the furry and porn commission industry).
You're looking at the withered husk of a long dead industry, a few niche use cases for something that was once broadly required.
And that's why AI art isn't going to wipe it out. People who wanted pretty art will just go Google it still, and get better quality than AI can do.
People who want super-personalized art can't use AI, because the AI just generally sucks at it. It's more work to generate it that way than it's worth compared to the cost of just getting a commission.
The only feasible thing I can see is that artists using AI will immediately squash and push out artists who don't from the commissions market. But that's life. Use the tools that make you more productive, or go obsolete.
No they can't, because AI art doesn't make any money. Which sounds weird, clearly they are making money, right?
Sure they are, but there are billions of images in these datasets. The AI companies are making such a hilariously minuscule amount of money relative to the size of those datasets that any given artist would be making literally pennies.
And remember that these tools are open source. The people using these models don't, won't, and never will be paying money to companies. Even if you annihilate the companies, the open source community will continue to happily scrape the web and train new AI. You'll get rid of them as soon as you get rid of piracy.
Eventually AI will be powerful enough that, even without training data, you could give it an example song and it'll spit out new, similar examples. Even if the "MAFIAA" manages to use lawsuits to hold things back on the music end, they can't stop the tide.
I think it's important to remember the context here. AI artists and proponents are just starting down a new road, it's an exciting time and there is lots of new opportunity. I don't think anyone should begrudge them for being excited. Traditional artists have built a career and possibly families and other responsibilities on top of this expected source of income and they're not wrong to be scared and upset. Maybe their history as artists will give them an edge over other AI artists if they choose to use the tools, but maybe not. There is a lot of uncertainty in their future so I don't think we should hold the two groups to the same standard.
It seems deeply wrong to me to both excuse artists behavior while simultaneously using the reaction from AI enthusiasts to that behavior as a means to slander and "disdain" them.
The road of understanding should go both ways.
You can't expect a group to be attacked and everyone in that group just run around being the better person. It never works that way.
I'm not sure what you mean here, is this directed at me?
Yea, I understand that. I don't agree with the artist response, it's just predictable is all. I'm mostly just speaking as a third party observer of both groups. I guess my message to AI artist proponents would be that the artist response is no different than how any group would respond when facing an existential threat to their career, and to keep that in mind.
To a degree, but mostly in context of what I was originally responding to where someone was saying "disdain for the AI enthusiasts is justified". If you didn't have that original comment in mind what I said there may not apply.
I agree with you entirely. I've told off a few on the pro AI side who run around acting like absolute children and calling everyone "luddies" as a meme-phrase-insult.
But I have similar feelings for a number of people in threads like this who are using terms like "AI bros" or acting like they're in a place where they have a right to "take away their toys for misbehaving". Those people are acting in a similarly very toxic way and from how I read your message I saw it as ignoring or even justifying their behavior.
I'm not really involved with any other discussions, but given your characterizations I agree, everyone is being childish. However, I'm still of the opinion that we should expect better from the group that doesn't have their world crashing down around them. Perhaps that's a little optimistic though.
I fully understand why artists who spent years or decades learning skills and building careers aren't too excited about AI art that can do most of what they do with almost no effort.
On the other hand, I'm not an artist. I've been playing around with it a fair bit and I'm excited that I can build a picture I'm seeing in my head. I don't feel bad about this, because I'm not going to sell a picture of my cat wearing power armor and I would never have paid someone else to draw it for me.
When digital art became possible, people sneered and said it wasn't real art, but that sounds ridiculous today. I'm pretty good at photoshop. There's skills I learned in the past that became completely useless as new upgrades automated those processes. I don't mind the fact that anyone can easily pull the background off a picture. I'm excited I don't have to do it the long, boring way.
In the long run I don't think AI art is going to replace artists anymore than cameras or photoshop did. It will just be another tool artists can use. I think we will benefit overall. There's probably kids today with a great idea for a comic book, or animations that would never get made if they had to draw it by hand, but they'll be able to share them with the world as making art becomes easier.
This is what I think will always keep me on the "pro-AI-art" side. There are people that want to make art, but can't. For a small example: I have shaky hands. It's not horrible by any means, but my handwriting has always been terrible. I genuinely don't think drawing/painting/whatever is ever going to go well for me.
However, when it comes to artistic expression, I am a decent programmer and an okay writer (well, typer at least). I'm already good at those, if a little out of practice, and so if I want to make "art" then that's the path I'm going to go down. And that means I am never in my entire life going to see the images that I wish I could create.
Except now I can. And I don't think I should be looked down upon because I spent the first 30 years of my life focusing on other forms of art that I can actually do. I feel the same about things like ChatGPT, as well. If someone has a cool idea for a story but can never seem to get the things they see in their head down on the page, then I am incredibly happy that they have another tool that can help them out.
Digital art is uniquely suited to these early days of AI thanks to it being a simple bag of bytes where there is a very strong relation between those bytes that the AI can learn to depict in all sorts of images in different styles.
Eventually, the abstraction layer will creep higher and AI will be able to understand less discrete relationships in more complex datasets and it will be able to do more complex tasks.
Would you not feel a little bit protective of your skills as a programmer if anyone could hire an AI for pennies and, giving high-level goals to an AI agent, crank out software in a few minutes it would take you days to create? I know I'll probably look down on the 'programmers' of the future, not because they deserve it but because a) my livelihood may be threatened and b) I invested a bunch of time in the hard way before the easy way existed
To be clear, I think trying to avoid any of this is trying to close the lid on pandora's box, but I think people are justifiably upset.
Let me preface everything I say by making it clear I do not professionally code or write. So even though I try to empathize with people that are in this situation, I obviously am not directly threatened by any of this.
For me personally, I think I see the act of programming as a means to an end. There is certainly a skill to it, and in many ways it is also an artform, but the goal of programming is usually not to create good code. The goal (again, just how I see it) is to create either something that completes a task efficiently or something that is usable by a person. The important part is not the code itself; it's what the code does.
And, importantly, I think there will still be skill involved in getting an AI to create a program that does something novel, is usable by a person, and isn't filled with bugs. At least for the near future. Eventually we might be able to tell a computer "Make an app that does x, y, and z" and it spits out a perfect program with no faults, but for the foreseeable future it will probably just be another tool that allows more people to make more things. I think that's kind of always been the goal of humanity.
Reading the discussions in this thread, all that comes to my mind is "the only permanent thing in this world is change". I see people holding on to maintain the status quo, and I see people riding the flow of change.
I myself am a hobby artist, but I am also pro-AI, because I think that with the proliferation of AI images, human-made art will become more valuable in the future, much like factory sculptures cast from moulds versus a sculpture commissioned to celebrate something great. The moulds are used for low-quality, unimportant works. But the statue is commissioned specifically because they want an artist to bring their interpretation of a subject to life; the artist's emotion, philosophy, and outlook on life will be reflected in that sculpture. The sculpture will have a soul. That will make it infinitely more valuable than a moulded copy, no matter how perfect or exquisite the copy looks. Hell, the commissioned sculpture may look like trash, but it will have soul. That's my take on it: AI art will make human art more valuable.
But there is still the question of human livelihoods at stake. On this I cannot comment, because I do not earn money from my own art. I leave it to others more knowledgeable than me to think of ways to compromise and meet in the middle: not hampering progress, while also not harming the way of living of the people who are affected.
You can tell he is coming from the perspective of an author who sells his work for money. Every argument reflects this viewpoint. He repeatedly states that this is hurting people making money off their art, while at the same time slamming people who try to sell what they create using AI art.
I see a few uses for AI art, the chief one being as a starting point for a commissioned piece by a human: a "this, but good" sort of thing. The other is art for personal use that was never going to be bought anyway. Most people, if they want a particular piece of digital art to save and look at (a picture of, say, Spider-Man fighting Green Goblin), will scour the internet rather than buy it.
To me, the main problem with AI art isn't the supplanting of humans; it's the use of human artists' work for training without consent. Human artists are always going to be valued by some and not by others (today's AI art is yesterday's "for exposure" request).
The video goes fairly in depth on your concerns! He talked a lot about how AI trained ethically on datasets of public-domain or donated works definitely has a place in the world.
The video presents a critical overview of the arguments for and against AI art. It seems well informed on both the artistic and technical aspects. I see the scope of this one as similar to what the Line Goes Up video did for NFTs. A lot of what's brought up you have probably all heard before, but the video maker addresses it substantially with his own counterarguments.
The channel's usual content is on creative writing, but he also claims to have a published book and a background in law, so I trust that he's invested in the issue. His other video about Japan's WWII war crimes was also quite good.
I appreciate the nuance around this subject. It seems like so many people are so vitriolic about the topic of AI art, making often baseless and uninformed claims just because they're upset.
This video is longer than I have the patience for, but I like the point he made at 53:20 about the personal-growth aspect of making art being the source of societal/moral value.
I feel like he tried to cover every possible aspect of AI art in one video. 2 hours and 19 minutes is just ridiculously long.
Honestly, I fell asleep about an hour and a half in, and I don't feel like I missed anything.
He should break this up into 6-7 videos that each cover part of the discussion.
I didn't watch the video but I wrote something based on the discussion here so I'm going to post it anyway:
I'm not a lawyer but I think it's useful to think about this from a legal realism perspective. What's at stake?
Copyright law is at least somewhat self-enforcing in the sense that businesses often avoid doing anything that puts them at unnecessary legal risk. Releasing a library under GPL means a lot of businesses say "nope, not touching that, let's pick something else." Movie and record companies often make sure they have all the legal clearances they need and take out anything they can't get. Some businesses are avoiding AI-generated art due to legal uncertainty.
This means there's a market for artists who will charge reasonable prices for clear legal rights to use an image. The average user doesn't care since they're just going to copy something off the Internet, but there are businesses who do care.
Adobe claims that they really do have the rights to all the images that their AI image generator uses, which makes their product attractive to some businesses, even though it's not as impressive as MidJourney. This means that commercial artists are going to have to compete with AI-generated images regardless, because there is already AI-generated art available that's clearly legal. Also, there's little reason to believe Adobe can't improve their image generator and maybe catch up with MidJourney eventually.
Businesses don't always avoid legal risk. In other cases, when enough is at stake, they will go ahead and maybe fight it out in court, or negotiate a compromise. Building a web search engine was once a legal grey area. YouTube was built on copyright infringement, and then Google negotiated a messy compromise with rights holders. With Google Books, they tried a similar compromise and lost.
This means that court cases will affect us even though we're not directly involved. If the courts rule that what Stable Diffusion or MidJourney is doing really is infringing, then it will matter to some customers, and that will affect the market for artwork. MidJourney might need to shut down or start over.
I'm not sure it will be all that big a deal to the average Internet user who wants to make an AI-generated image, other than being disappointed that they have to use a somewhat worse AI image generator for a while. They probably weren't thinking about buying original art.
It's a perfect moment to reassess what art even is, then.