Y'know what I hate way more than AI generated art? Being told how I should feel about it by others.
There’s no question that AI generated music lacks the heart and soul you find from artists who live and breathe their music. There’s no depth to the lyrics, no intricacies to the music, and nothing particularly interesting about the crap they’re putting out.
How is that any different than most other top 40 music? New Country music especially has been riddled with soulless, heavily manufactured artists and songs pandering to their core audience's idealized "traditional" values for decades now. Bo Burnham 100% nailed it all the way back in 2015 in his Make Happy special:
Where instead of people actually telling their stories, you got a bunch of millionaire metrosexuals who've never done a hard day's work in their life, but they figured out the words and the phrases they can use to pander to their audience, and they list the same words and phrases off sort of mad libs style in every song, raking in millions of dollars from actual working class people! You know the words, you know the phrases, phrases like--
A dirt road, a cold beer
A blue jeans, a red pickup
A rural noun, simple adjective
No shoes, no shirt
No Jews, you didn't hear that
Sort of a mental typo
I walk and talk like a field hand
But the boots I'm wearing cost three grand
I write songs about riding tractors
From the comfort of a private jet
The only difference between then and now is that it was record execs and their marketing departments doing the manufacturing of the music back then, and now it's a computer algorithm.
There’s no question that AI generated music lacks the heart and soul you find from artists who live and breathe their music. There’s no depth to the lyrics, no intricacies to the music, and nothing particularly interesting about the crap they’re putting out.
Just to pile on, I can reality-check that assumption with this pure-AI track. Those actually are AI-written lyrics, too. From the actual human who created this (imo rather delightful) little track:
Music is mainly all generated using AI. Only 0% to 5% of the generated lyrics get manually adjusted afterwards by me, so there definitely will be some lyrics which will play with your imaginative interpretation skills. I'm mostly intrigued with the AI mathematical manipulation, instead of following the common musical compositions
I'm a software developer who also, as a side project, does coding of AI + RAG using Python. I love listening and appreciating all forms of musical sounds and patterns of all things living (which includes us humans) that internally move me or makes me wonder.
His little AI experiments are all right here. There are a lot of them. So, this track was created by a software developer, not a musician.
If you want to discover the track you've still got to play it live, give it a chance to breathe and echo at a few shows, play with it on tour. I want to hear this AI track covered by The Greyhounds. They would embarrass any AI with the final cut. ;)
I'd also like a tool that could trace the influences the AI used in creating this track back to the original sources. If it's just lifting and remixing, it seems like it's on the scent of some artists I'd like to hear... but no credit is given to whatever sources the AI cribbed this from. Gotta admire capitalism for impossibly reinventing theft yet again, in an entirely new way. I think all the artists suing over AI right now have got a point. No attribution, no compensation.
Another good example that, like the one mentioned in the article, also made the Billboard charts: Xania Monet - How Was I Supposed to Know? And as far as contemporary R&B music goes, IMO it's...
Being told how you should feel about music is basically par for the course in western discourse. This goes across eras and genres. A lot of times the critiques are raising real issues, but it’s really about people having their identity tied to yet another commercial enterprise. Sorry for the tangent, it’s just a pet peeve of mine.
Amusingly, ten years ago, if you talked about AI generated music, people would probably have marveled at what a niche and creative exercise it was. Something reserved for serious academics.
Record companies are probably salivating at the chance to cut artists out even further. But imagine the irony if they get their clock cleaned by some TikToker with an AI generated next gen Gorillaz or Hatsune Miku and a well tuned agent system.
Eh I wouldn’t think so or at least not if they’re smart IMO. You already see tech companies purchasing or jockeying to purchase movie studios.
If they pursue this route, I can see it being a hybrid model of Uber and AWS. Uber (with Lyft) burned VC cash to dominate the market, then extracted all the value once everyone was dependent. Everyone moved to “the cloud” and now half of the internet (hyperbole, but not far off) goes down if AWS, Azure or GCP trips over a wire in the DC.
I expect once the AI dust starts to settle from this Wild West, any company or industry that put too much of their “portfolio” into whoever is left standing is going to be highly susceptible to essentially being held hostage, in a situation of their own creation. Back to the cloud computing reference: trying to convince companies to move back to on-prem is a great example of this.
There I Ruined It did a take last year on the same concept.
Worth mentioning that a lot of AI channels (like TIRI) are genuine musicians in their own right, and are simply having fun and being creative with the new style transfer tools that are available. He's still composing the words, playing the instruments and singing the melody. There's a lot more that goes on than just Suno-generated prompting.
I was drunk the day my mom got out of prison
And I went to pick her up in the rain
But before I could get to the station in my pickup truck
She got runned over by a damned old train
The "music", as it is AI-generated, is directly harmful to the people who make the music in the first place.
Another issue, as I’ve pointed out in the past, is that these AI-generated songs are taking attention – and money – away from actual songwriters and artists. Ella Langley is directly behind Breaking Rust at #2 on the Billboard Country Digital Song Sales chart with her latest single, “Choosin’ Texas,” meaning that she would have the top song on the chart if it weren’t for some AI-generated slop that, I suspect, is being boosted by fake streams and followers. (To be clear, I don’t have any evidence of that other than the massive numbers that these songs are doing for an unknown – and fake – artist).
If it is being boosted by bots or whatnot, that’s an issue, but if it is just more popular because people like it more, then that seems fair game. That’s just competition.
It's "competition" in the same way that introducing an invasive species to an ecosystem is just "competition".
That AI-generated slop was created by stealing, for lack of a better way to frame it, from essentially every country music artist that has music available for download. It would not, could not exist without human creativity and feeling. This AI garbage only exists as a consequence of taking from human creation, without actually giving back to the artists from whom it stole. It can generate far more music than a person could create in the same time period, and the vast majority of it will be garbage. Machine-generated music will be able to flood the proverbial zone with so much garbage that you will have a very hard time finding anything created by people, if this becomes the norm.
The fact that some of it comes out listenable, even popular, is incredible as a technical feat. But it comes at the expense of everyone who cares, even a little, about the artistry behind what they do.
Imagine the worst-case scenario, where commercially-viable music from people is no longer available. People are no longer recording music, because it gets drowned out in the wave of AI-generated, algorithm-pleasing slop. There is no more feeling, no more soul, no more meaning behind anything you listen to anymore - because it's generated by a machine, looking to get the most views/listens/etc instead of trying to create something meaningful.
I dunno, I don't see a world where there's an actual benefit - to people, to culture, to anything - from just hosing down the internet with AI-generated slop like this.
If everything it makes is garbage, then it sounds like that says more about the human artists who are losing to said garbage in the ratings. Like I said, if there is foul play in the ratings like botting, that’s an issue; if it’s just people choosing to listen to it, skill issue?
I don’t see why Billboard should omit them if real people are listening, and I think people are free to listen to what they want. Is what it is.
If everything it makes is garbage, then it sounds like that says more about the human artists who are losing to said garbage in the ratings.
That's a pretty callous thing to say about people potentially losing their livelihoods due to a technology that is actively stealing their work.
What do you want to be done about it?
I think financial recompense to the artists who have had their work stolen and their copyrights infringed is a start. Again, none of this AI-generated music would exist if it weren't for the fact that human artists have had their works stolen and copied by these huge AI companies. These companies aren't actually competing "fairly" in the first place - all of these LLM "AI" chatbots are based on news articles, books, literature, and the like having been published by people, for people, and protected under copyright.
Or, at least, they should be protected under copyright, except that these companies just ... stole the content, shuffled the bits around, and didn't pay for fair use of the material in the first place. There are ongoing legal cases surrounding it, and I imagine similar legal issues will become more prevalent, and find their way into AI-generated music too.
And if paying fair financial recompense to human artists would result in these AI-generated music companies going out of business - isn't that just competition too?
That's a pretty callous thing to say about people potentially losing their livelihoods due to a technology that is actively stealing their work.
It’s more of a response to everyone who keeps saying it’s “garbage slop” or whatever, which seems logically inconsistent. You can’t have your cake and eat it too - either these music models can make music good enough that real humans like as much or more than music made by humans and are therefore deadly competition for artists, OR they just make dumb garbage no one likes and no humans need to be worried.
Personally, I’m fine with it being the former. The artists make good music, and so do the music models. That’s why this is even a relevant topic. If you call all AI music garbage, you’re implicitly calling the human artists who lose to them WORSE than garbage.
I think financial recompense to the artists who have had their work stolen and their copyrights infringed is a start.
I think whether or not training a model counts as non-transformative use under copyright is something that is not yet determined. Those court cases are working on it, and whatever precedent they set will determine future policy. The machinery of government is already in motion on this; we just have to wait for it to finish.
I don't call it garbage because it's "bad music". I call it garbage because it's bad for culture, it's bad for actual artists who do actual work, it's bad for the environment, it's bad for people who want any financial benefit from their work ... it's just bad all around.
Garbage isn't just an assessment of quality or utility. It's an assessment of value. To me, a kids' Dora the Explorer quartz watch I found on the ground is garbage because it was mass produced in a Chinese factory for pennies, just like the millions of other identical models. It's worthless because it's not special. No care was put into its production; it doesn't actually say anything. The fact that it objectively tells time better than a handcrafted million-dollar mechanical-movement Patek Philippe doesn't somehow make it not garbage.
What does “good enough” mean, though? I think it’s logically consistent to believe that many people have bad taste in music. Popularity shows a kind of skill at making stuff that many people like and that’s important professionally, but you don’t have to like the result. Compare with junk food being popular, but known to be bad for you. There is more to life than commercial success.
It’s a lot harder, though, to show that music is objectively bad. Usually this is conditional: if you like X, this is a bad version of X.
Also, listening to music out of context, not knowing how it was made, is only one way to listen. While it can be interesting to do blind auditions and see what you can puzzle out going in cold, it’s also interesting to listen to music in context, knowing something of its history and influences. AI-generated music is going to have a different meaning than historically important music made by musicians.
Compare with sports. A sports movie is different from watching a game. You can admire the skill of the athletes performing in a real game or the skill of the actors and all the other people to fake an exciting game when making a sports movie. These are different kinds of admiration.
I think it’s possible to admire AI-generated music, but it will be for different reasons than admiring recordings by musicians. The AI researchers and programmers building these music generators are good at what they do, and they will get better. It would help me appreciate them if I knew more about how they did it.
(It’s also possible to admire the skill that went into making something and not be all that into it.)
I can't help but notice that you go from "actively stealing" to "being based on". It's just wild how liberally the word theft is used when it comes to conversations about AI. And I don't know about you, but it's not really obvious to me that authors are owed money for any algorithm that uses their works as input. Like if I run a script that takes articles published online, takes a random adjective, verb, noun from each and cleverly arranges them into a legible text - do I now owe financial recompense to every single author of an article that I used? It doesn't make any sense to me, the value of the finished work (if any) would be in the clever arrangement of the words, each one on their own would be worthless. Similarly, a raw dataset is just a pile of garbage, the value comes from filtering it, labeling it, writing an algorithm to train on it, from compute costs, from R&D behind it all. All of that work is not done by the original authors.
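The hypothetical script described above is easy to sketch. Here is a toy version in Python; the "articles" are invented for illustration, and picking a random word stands in for selecting an adjective/verb/noun (no real part-of-speech tagging is attempted):

```python
import random

# Toy sketch of the hypothetical word-scraping script described above.
# The source texts are made-up placeholders, not real articles.
articles = [
    "The quiet river carried old secrets downstream",
    "Engineers rebuilt the ancient bridge overnight",
    "A curious fox watched the glowing city lights",
]

random.seed(0)  # make the "clever arrangement" reproducible

# Take one word from each source and arrange them into a new "text".
words = [random.choice(article.split()) for article in articles]
sentence = " ".join(words).capitalize() + "."
print(sentence)
```

Each source contributes one near-worthless word; whatever value the output has comes from the arrangement, which is the commenter's point.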
You could say, well what about the data that wasn't openly published on the internet, and yes indeed that would be piracy if it was downloaded illegally, something that no one really disagrees with including the courts. Yet it is NEVER talked about in AI criticisms, probably because in the eyes of most people piracy isn't really such a big deal, instead we have the accusations of "callous theft!" and such, which is in my opinion simply disingenuous.
I use different words because I was taught not to use a single word too frequently. Writing style and all that.
Let me make my stance clear: for lack of a better definition for what AI/LLMs/whatever are doing with music, written works, TV, films, etc - all of these AI companies are stealing.
Your argument is confusing, because you're drawing a line where it's okay to take piecemeal from works published and available online, but not okay to take something downloaded illegally or otherwise unavailable online. A lot of these LLMs' training sets include works that were downloaded illegally. In many cases, such as ChatGPT, they produce direct quotes from the works they were trained on, and are not simply cut-and-pasting random words.
I would argue that there is a similarity in the training of an AI to (ideally) regurgitate facts and otherwise be a point of reference ... to educating people. When we get educated, a teacher/professor/whatever often gives you a textbook, an assigned reading, that sort of thing. They are not asking you to repeat what you read verbatim, but rather to read it, consider it, and in the end be able to bring about original ideas and combinations of words that speak to truth, history, whatever.
When a teacher uses those textbooks in the classroom, they (or the school district, whatever) have to pay for the use of those books. They can't just steal them and insist that because they aren't repeating the words from the text verbatim, that it's somehow okay and solely original work. The vast, vast majority of the work used to train AIs was not paid for.
That's fine and good, but do you acknowledge that "being based on" and "stealing from" are fundamentally different concepts and the difference between them makes or breaks your argument?
To also be clear, I think there are a variety of better ways of describing whatever AI are doing with music et al than "stealing". I'm trying to say that it is not at all intuitive and you haven't demonstrated why exactly it is theft.
As I said in the previous post, piracy is not at all the same. I don't even think anyone would ever say "I stole a PDF of a textbook" because it's just not the same thing intuitively as lifting a book from a shop. So yes, in my opinion there's a clear distinction there. Personally I also don't particularly care about piracy that much. In fact I think copyright laws are too draconian and way more works should be in the public domain. I do agree that, just as schools should pay for textbooks, companies should not pirate their training data. Thankfully there's no inherent requirement for training data to be pirated; it is not at all a problem with AI as a technology.
Now if you want to argue that using works publicly posted for free to view and download for AI training is stealing, and every one of those authors is owed compensation, then you really should make a strong argument for it, because I don't see it. I mean, did Google researchers know that AI was gonna become a golden goose when they posted their paper on the Transformer architecture for all to use? No. Are they now retroactively owed compensation because their work is now being used to make billions? I don't think so. Neither do I think that everyone who posted their art/music/whatever online is now owed compensation because it happened to be used to train valuable models. That's just... progress.
Now if you want to say that you should be able to opt out of whatever you're posting to be used in AI training going forward through appropriate licenses, I can definitely see that argument having validity.
Individual piracy is not the same as corporate piracy. While copyright laws may or may not be draconian, and more works should be in the public domain, the fact of the matter is they're not - not yet - and in many cases the authors of copyrighted works are still alive and protected by those laws.
Content publicly posted for free viewing is not the same as content made for public use. I can view a lot of things on YouTube, but created content is protected by the law. And again, there is a difference between a single person ripping a free YouTube video, and a corporation ripping millions of videos.
Ultimately, I don't think we're going to see eye to eye on this, and that's fine.
Content publicly posted for free viewing is not the same as content made for public use. I can view a lot of things on YouTube, but created content is protected by the law. And again, there is a difference between a single person ripping a free YouTube video, and a corporation ripping millions of videos.
Again, you're using the word "ripping", but I don't see how it is that. Remixes and edits exist on YouTube just fine.
Individual piracy is not the same as corporate piracy.
I'm aware, that's why I said that companies should pay for the data. But this is a criticism of individual companies, not of "AI". Again, if that's the extent of your position on why it's "stealing" or "ripping", then it seems extremely misleading to use those terms.
But this is a criticism of individual companies, not of "AI".
To my understanding, and I could be wrong here, but of the dozens (are we up to hundreds yet?) of AI models available today, literally only one company has claimed that their training content was licensed. And even then, I believe their “licensure” was changing their own terms and conditions after having amassed a large library of content.
I think if every player on your favourite baseball team is using illegal performance enhancing drugs except for one player who isn’t, it’s entirely fair to say the team (collectively) is doing the wrong thing. Rather than individually naming each player for doing the wrong thing but the team overall is beyond criticism.
We aren't talking about licensing the training process. It's very debatable that it would be required as opposed to falling under fair use. We're talking about downloading media illegally, i.e. piracy. This only applies to something like downloading a Disney movie. 99% of the time the conversation about AI being "theft" revolves around artists and content creators that post their content online to be freely accessed, so scraping it isn't the same thing as pirating a movie or whatever.
Licensing is the solution to piracy, legally speaking, so I assumed they go hand-in-hand in a discussion about piracy. After all, if I have a licence from the creator to do XYZ then by definition when I do XYZ it isn’t piracy.
I’ll also push back against the claim of fair use, which is an affirmative defence against copyright infringement, but which is often misused across the internet. Caveat that I’m not a lawyer or a judge, but it’s fairly easy to look up what fair use actually means and where it applies and what considerations form part of this defence.
Creating a for-profit business by harvesting entire works without credit and without even attempting to discuss licensing with original creators and not freely sharing your result smashes all four factors of fair use. Illegally downloading the latest Disney+ exclusive movie to watch at home is much closer to fair use than anything OpenAI or their competitors have done in this space.
Now if you want to say that you should be able to opt out of whatever you're posting to be used in AI training going forward through appropriate licenses, I can definitely see that argument having validity.
Why would it only be valid into the future now that these models exist? Surely it’s also valid that anyone should be able to opt out of their existing works also not being up for grabs for AI training?
I’m not sure why that should be an exemption either. If I created a blog 10 years ago and my intention was “nobody should be able to read my blog unless they pay me for it” and then I put up a paywall, and then I go “okay one exception is for google search crawlers to index my blog so more people can find my works and pay me for access” then why should this new technology get an exemption to my initial intentions? Just because it’s new?
I’m not sure what you think “not explicitly stated” means, but if a website has a robots.txt that permits only a google search web crawler, or otherwise enumerates the options that the website owner writes up, I think that’s pretty explicitly stated intentions.
I think it’s ludicrous to assume that a new technology or a new use case for harvesting data somehow gets a free pass by default, instead of a requirement to seek permission first by default. Especially in the context of a for-profit company locking the result behind a paywall but paying for none of the pieces.
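The robots.txt scenario above can be checked mechanically with Python's standard library. A minimal sketch, assuming a hypothetical policy that admits only Google's search crawler (the scraper name is made up for illustration):

```python
from urllib import robotparser

# Hypothetical robots.txt: allow Google's search crawler, deny everyone else.
ROBOTS_TXT = """\
User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# The search indexer the site owner invited is allowed in...
print(rp.can_fetch("Googlebot", "https://example.com/blog/post"))      # True
# ...while an uninvited training-data scraper is not.
print(rp.can_fetch("SomeAIScraper", "https://example.com/blog/post"))  # False
```

Of course, robots.txt is honored voluntarily by well-behaved crawlers; it is a statement of the owner's intentions, not a technical barrier, which is exactly why the permission-by-default question matters.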
Where does the notion come from that you can post anything you want online and then exclusively decide exactly how people use it?
Plagiarism (taking things as is) is a very specific case that we forbid. What does this have to do with AI training? Moreover, for 10 years everyone was free to copy this content, save it on their computers, run algorithms, study that data, transform it, make it into something else, sell the results and it was fine. Why is AI training any different?
Where does the notion come from that you can post anything you want online and then exclusively decide exactly how people use it?
Copyright law is the easiest example I can think of, and various other intellectual property (IP) laws. But given my example included someone not just saying “pretty please” but also enforcing their wishes with a paywall and robots.txt then to my mind it’s clearly also protected by laws against circumventing technological barriers (to grossly simplify, “anti-hacking laws” etc)
Plagiarism (taking things as is) is a very specific case that we forbid.
Plagiarism is a much narrower slice of IP, and I mostly see it discussed in an academic environment; it’s about crediting the original source (which LLMs are also bad at). But it’s certainly not the only case that’s forbidden.
If I start printing and selling copies of the Harry Potter books verbatim (without a licensing agreement), but I make it clear on the front of the book that I’m not the original author, then this isn’t plagiarism, specifically. But this is definitely still illegal under copyright and other IP laws.
Moreover, for 10 years everyone was free to copy this content, save it on their computers, run algorithms, study that data, transform it, make it into something else, sell the results and it was fine.
There’s a huge difference between what’s simply possible and what’s permitted. I think all of this has been possible, but I disagree with the idea that it’s been legally permitted this whole time. If you can give me some examples so that I have a better idea of what you’re referencing here specifically, that would be great.
Oops wrote out a whole tangent on Fair Use which isn’t really relevant to the conversation currently
And there’s also a significant difference in how these actions are viewed (e.g. when examining if Fair Use applies) when the results are for educational purposes, or for personal/private use, or for commercial reasons, or if the results of the data processing are themselves also freely shared or not.
For example, if I scraped the internet for all writing (including copyrighted works, like LLMs have been doing) and then created an LLM that I never distributed and only kept on my home computer for personal use, then there would be a very weak case against me.
Or if I did the same and wrote academic papers about what I’d learned in the process, and provided educational materials for how this new transformer style model could do much more than markov chains, that would also be a fairly weak legal case against me.
Or if I did the same as both of the above, but also released the entire model free to the world to use, then there would be a bit of a stronger case against me, because now I’m making protected works available to the public, albeit in a somewhat scrambled way.
But it’s been proven that LLMs can be prompted to reconstruct entire copyrighted works, so saying “it’s transformative” as my defence is pretty weak when the model is fully capable of recreating entire non-transformed, unaltered works.
But if I do all of the above and then put it all behind a paywall, I’ve now clearly swayed all four factors of Fair Use against myself, and if I don’t have Fair Use to hide behind then I’m pretty clearly in the territory of copyright infringement.
If I start printing and selling copies of the Harry Potter books verbatim (without a licensing agreement), but I make it clear on the front of the book that I’m not the original author, then this isn’t plagiarism, specifically.
I guess you are using some technical definition of plagiarism. When I say plagiarism, I simply mean copying verbatim. This is not at all relevant to the AI discussion because AI is by definition a neural network (its weights), it cannot possibly be a verbatim copy. And if you want to say that you can get it to produce a verbatim copy, I'll respond with the fact that Photoshop can be used to produce a perfect copy of any copyrighted image.
Textbooks strike me as literally publishers paying a clerk to strip the identity from their source texts in order to package and sell them as an all-in-one, in a way morally similar to these language models. Sure, they give citations, but...economically? What's the distinction?
Generally they're written by professors who are experts in their fields and thus are intentional in what they include and how they write about it.
I think your framing redirects ire that's justly aimed at publishers onto the entire concept of a textbook. We don't all always read primary texts about every topic, but we still need to learn about Sociology 101.
I'm not sure I've ever learned something from a textbook that wasn't better delivered in lecture, seminar, or primary texts, so the distaste was intended. But that's definitely subjective.
To the point at hand: so the value is in curation, then? If it's A-okay for a publisher to shop around for an expert willing to write a textbook that fits their series schema, who then borrows liberally from secondary sources (nonderogatory, but not original) to fill the 400 large-format pages of the manuscript, which is then chopped and revised comprehensively by the publisher, wouldn't it stand to reason that if the AI bots were better at curating facts and providing citations, there'd be no problem with it? Is it literally that a person arranged the words? Because I'm not sure that's the value, real or economic.
I am someone who believes people offer value that AI doesn't. So no, I wouldn't agree. (And we certainly wouldn't want Grok writing your history textbooks.)
I'm in favor of the open access textbook movement, but not literally removing all control of what is taught from the experts with terminal degrees in the field and handing it entirely to the publishers to control the narrative. I'm not sure that the pathway is really "publisher seeks author to write the narrative they want," but it would be worse if it were "publisher uses AI to write the narrative with zero oversight." But if you've already decided all textbooks are useless, idk why you'd want them to be blatantly so.
I have had good textbooks and horrible lecturers, and the opposite, so it certainly depends on the topic, the teacher, and the book.
Really, my initial point was that appealing to textbooks is unconvincing. I've outlined my reasoning there. When it comes to the intrinsic value of human construction heavily edited by profit-seeking conglomerates, as opposed to automated construction heavily edited by profit-seeking conglomerates, I'm just not convinced. Beyond that, I don't disagree that it would be bad for a bad textbook generator to replace the currently mediocre textbook industry. I just think that's a matter of time, given how LLMs at their theoretical best can do summary quite well, rather than something intrinsic to the humanity of the drafters of such texts.
Aside: as mentioned in my other reply, I don't think all textbooks are useless. Topic-focused books can be excellent references and primers can be excellent introductions; my gripe is specifically with those Sociology 101 type, overly broad and pseudo-objective, examples.
I don't think I follow. Are you implying that textbooks aren't actually written, but instead cobbled together from other source materials? Because that doesn't seem accurate, at least not for the vast majority of textbooks.
Insofar as a person's Broca's area matches words to the ideas drawn from source materials, yes, they're written. Personally, I think the claim that neurons make the difference in pattern matching has yet to be proven.
I'm being a little overbroad here, I'll admit, and am thinking of the intro textbooks that pretend to articulate a broad number of topics within a subject as a survey, not the more traditional textbooks which are effectively original works focused on pedagogy.
Imagine the worst-case scenario, where commercially-viable music from people is no longer available.
Then we'll listen to the music that isn't commercially viable, that people create for the love of it.
From everything you've said, here and below, it seems like you're angry at the economics (entirely justifiably - I am too), but you're directing it at the technology. It's a problem of culture, and a problem of incentives, and to me at least burning up anger on the tooling rather than the cause is just another means of obscuring where the real problems lie.
It doesn't sound like @Drewbahr is mad at the technology but at the companies who built the technology ingesting music made by people who understand the human condition and make a living off their hard work, then using the technology to output something trivially.
That's a fair way to put it, yeah. I might add that the whole of the AI industry, as I understand it, is effectively exploitative. It is exploiting artists' works without fair (or any) compensation.
I believe that all art - be it painting, music, filmmaking, or whatever - is inherently a conversation between the artist and the person or people appreciating their work. This extends to both the culturally-significant works of art, as well as stuff that is created for commercial use. Even if something is made "soullessly" by a corporation, there are still people working hard to create it. It is a fundamentally human-created thing, that can be appreciated (or not) for what it is.
When an AI model takes in millions of hours of music and regurgitates it into an algorithm-matching earworm, where's the conversation between artist and appreciator? What does the work actually say, other than "Spotify thinks you'll listen to this slop X many times this year, which will make us Y many dollars"?
I also take issue with the technology, making it so that garbage media like this can be churned out ad nauseam. I am angry at the technology, because the technology has been given sufficient incentives to exist and be used in this way in the first place. Yes, it's a problem of culture and incentives - but it's also a problem of technology! We wouldn't be having this conversation if the technology itself did not exist.
I’m incredibly biased, I’ll be the first to admit that. But the technology wasn’t built for music - diffusion models were literally that, scientific models to study the process of diffusion. That’s the problem, and I think why the backlash bothers me so much: because the tech exists for a reason, a damn good one in my opinion, and the fact that it’s been relatively easy to port over to other purposes doesn’t undo that. It’s a problem of shitty incentives, and as far as I’m concerned only a problem of shitty incentives.
I do also get frustrated about the whole “all training is stealing” argument; not just because consuming and analysing data is quite different to copying it wholesale, but because it’s a fundamental misunderstanding of what’s possible with the technology as well. I’d actually love to create a model that makes work truly and unequivocally its own, without guidance from existing human work, simply because I find that a fascinating artistic concept in and of itself. But I have neither the time nor money to spend on that right now, and it would be a tricky and expensive way to create something artistically interesting that I’d never see commercial return from - again, incentive problem, not technical problem.
And for what it’s worth, I agree wholeheartedly about art being about communication, conversation between artist and audience. I’ve actually said that in almost the exact same words myself before, but with the exact opposite conclusion: machines can’t take art away from us, because that desire to communicate will always exist.
I think part of the reason that we reach different conclusions is this:
Then we'll listen to the music that isn't commercially viable, that people create for the love of it.
I don't think it's reasonable to expect music to be created for the sole purpose of "the love of the art form". Music, and other art forms, have always had a financial interest as much as a creative one. The ceiling of the Sistine Chapel, Michelangelo's David, many classical musical works, the list goes on - they were all commissioned art, made not solely for the love of the art itself but made possible by patrons. The artists created the work for money as well as for the art.
Now, I suppose one could argue that patronage would still exist in an AI-makes-art ecosystem. Patreon exists after all. But one could argue that even that - an artist creating a Patreon to keep themselves afloat, or to finance the creation of their music - isn't actually making music for music's sake anymore. They're creating music based on the wants of their supporter(s).
Now, if we're just talking about people making jam bands, or high school/community orchestra/etc. performances ... well, yeah, those may continue to exist too. But I dunno, it seems quite bleak to have music exist solely as a creative outlet, and not as one that could financially support the artist too.
I may not be organizing my thoughts fully on this, so apologies if this is rambly and disorganized.
Nah, I see where you’re coming from, and I don’t entirely disagree. For what it’s worth I’m very much not saying commercial incentive diminishes genuine art, either - I don’t necessarily think that all commercial content rises to the level of being considered art at all, but a huge amount of it does on its own merits and the payment doesn’t negate that. Hell, sometimes commercial constraints genuinely act as part of the art - I don’t think Clerks would be the movie it is with a higher budget, for example.
I’ve just also seen enough genuinely wonderful and moving work put out into the world from people who make their living elsewhere that I don’t really worry about that ever going away.
I think I’ve kind of skipped ahead to the part where the economy is so laughably fucked that I don’t see any link between something being worth doing (in the sense that the world and the people need and want it), and that thing actually making a living - I don’t worry about it being taken away because I’m already operating in a world where that kind of sensible reward structure is dead. Either we fundamentally restructure the economy to recognise a century or so of exponentially increasing productivity per worker and an exponentially increasing number of workers on top of that, in which case the “making a living” part is moot, or we’re heading for a technofeudalist hellscape in which we’re all doomed, in which case the “making a living” part is moot.
you seem reasonable lol—what does anyone actually get from the argument that AI is "wrong" and we shouldn't use it or benefit from it, when giant corporations are 100% going to use it to destroy employment models? Like if I had an opportunity to buy some new hydrogen-powered car that cost $11/year to operate, & everyone was like "No you purchasing that is going to invalidate the work of thousands of oilfield workers who depend on gasoline barrel sales for their livelihood"—idk it feels gross saying it, but what am I really supposed to do individually? No one is going to stop a giant corporation from spending less/making more, and every driving/transport/shipping company will be all over that the second it's feasible (or, if self-driving cars & drone delivery &etc &etc are an indicator, then prob before it's feasible lol). So why do I have to hate the thing, and even if I do then what am I supposed to do about it.
I don't think that the rise and prevalence of AI is a foregone conclusion, considering how much advertising they're forcing on us to insist that we use LLMs and what not, along with how little these companies are actually making.
Furthermore, I argue that we shouldn't use it for moral and ... I dunno, "logic-related" reasons. These models were trained on stolen and copyright-infringed materials, and in some cases with the intent of replacing people in what they do. It would be one thing if the LLMs were being used to replace people in inherently-dangerous jobs - for example, automated control systems in industry have replaced the need to send people into dangerous facilities for the sole purpose of turning valves, activating pumps, etc. What's more, in those cases there needs to be a person at a control board that can watch and control said systems.
No, in these circumstances the AI and LLMs are replacing people in artistic endeavors - be they creating art, music, or written works. And yes, I consider writing things like encyclopedias and textbooks to be "artistic" as well.
So, if the people whose works were stolen and copyright-infringed were actually fairly paid for their works, I would have fewer issues with the use of these LLM models. The companies behind those same models would also probably be bankrupt, and/or prohibitively expensive to use, because they stole a lot of works and should have their pocketbooks raked for it.
As for the "logic" argument, what I mean here is that people are, in some cases, using these LLMs and what not to replace thinking and researching. Back in the day, we'd go to the library and hit the card catalogs to find resource books for research on whatever subject we wanted. Google and other search engines kind of replaced that for a lot of students, but when Googling something you still need to, y'know, actually go to the page in question to read about the topic you're searching. With LLMs, you don't even have to do that - you ask the LLM "explain the 100 Years War to me", and it'll do it. You have no reason to search further; the LLM has done the thinking and researching for you, and you never have to develop those skills.
In short, LLMs can make people worse at reasoning, worse at research, worse at "thinking". This is backed up by ongoing research as well.
No one is saying you, thumbsupemoji, have to hate LLMs or what not. We're just sharing the reasons why we don't like them.
I could dig into the example of a hydrogen-powered car, but I don't want to derail the conversation. Suffice it to say, as someone who works in the oil industry but has significant moral issues with the work they do ... I would welcome a cheap replacement to gas-powered vehicles! But hydrogen ain't it.
Yeah I know nothing about hydrogen lol, and it's a unique situation so I struggled to find something semi-relatable—I have strong feelings about copyright as well, although maybe more controversial ones : ) I still don't know who the "we" is in the argument, though, & what I was trying to say I think was : if I announce "I believe this is bad for humanity," and there are people like Eliezer Yudkowsky who are devoting their whole careers to actively saying that right now, & have been for years, then it just feels like saying something really really common & profitable is terrible & guys we gotta stop—again, who's "we"? I'm not trying to be discouraging or despondent, I just don't know what the end goal is beyond agreeing/disagreeing; right now every company is set on coming up with "their" intranet-style version of copilot or gemini or whatever, like nonstandardized light bulbs, but when someone finally makes it affordable & easy to deal with, that's when the user rate is going to shoot way up. So maybe getting to someone involved with that could improve the outcomes there? Idk I don't have answers just questions lol
As an amateur musician, I sometimes pay for things, but I also borrow a lot from other musicians without compensation. I don’t see what AI firms are doing as significantly worse than what almost all music fans do sometimes.
"competition" is kind of wallpapering over the fact that people who have put in zero effort are stealing from people who have put in a lot of effort. Like, imagine if Taylor Swift put out a new...
"competition" is kind of wallpapering over the fact that people who have put in zero effort are stealing from people who have put in a lot of effort.
Like, imagine if Taylor Swift put out a new song, and I just copied it, somehow promoted it better, and my version of her song sold a hundred million copies to the detriment of the "real" version. That would be just competition as well, but pretty blatantly unfair and immoral.
AI music is doing the same thing except it's cribbing from millions of unverifiable recorded tracks. It's not a consciousness being influenced by those tracks and remixing them based on its own life experiences; it's just taking them, converting them to weights, then composing a song based on a sentence-long prompt with a sprinkle of randomness thrown in.
There are obviously things that cross the line from "well, that's just competition" to blatantly immoral and unfair practices that hurt us all long term.
Often the artist is not creating music with the intent of just pleasing your ears and raking the money in. They tend to use it to communicate something. Go and listen to Voltaire - Industrial...
Now the machines are working tirelessly
Through all night and day
Making garbage of our image
For a world that's "Made our way"
They won't stop! Until every inch
From Peru to Bombay
Looks like a mall in the U.S. of A
I don't think giving more space to generated sounds that game human psychology to get an easy win is a good idea. I would very much prefer to give space to interesting people with a story to tell.
And I don't really care for starving artists. The professionalization of music and the way it has turned into an industry is also bad.
It's probably going to happen more often as well. Music generation is in its own category because, unlike LLMs, image generators, video, etc., AI music is actually pretty good in its own right - morals and ethics aside, of course.
And it's something that big companies like Spotify have been investing in, so I'm expecting we'll see more and more AI music taking center stage in the future.
Whether this is a good or bad or terrible thing... that's another discussion.
AI music to me feels very similar to AI image generation actually (and to some degree LLM generation as well) - there is this quality about both that is hard to put into words exactly, but a sort of uncanny intensity. Like you can feel the training is definitely distilling some deep human sensory processing mechanisms from the vast amount of human-created data it uses, and is able to push those sensory buttons in a very direct way - sometimes uncomfortably so, but with careful curation of the outputs, you can select examples of both that are close to the edge and still feel "realistic".
I think we have more defenses built into our visual processing (as we primarily interact with the world visually, and expect a lot more logical consistency in our visual inputs) vs audio processing, which makes it a bit easier to get AI music past them - most people process music on a more abstract level.
I also think we have a built-in sensitivity to mimicry - if we detect something is trying to fool our senses in certain ways, it can trigger a negative emotional reaction. I find that the more AI-generated content I am exposed to, the more I develop a new level of aversion towards it - what at first glance seems curious, even compelling, later makes me feel uncomfortable and repulsed as my brain recognizes it as mimicry of human expression. I went through this with AI images, and now AI music as well - once I begin to recognize the signs, I start to hate it, and I don't mean as a conscious choice (I actually came into the gen AI space with a very open mind initially); the repulsion comes from some deeper layer of processing. I guess I just don't feel comfortable having my emotional buttons pushed by robots.
I think we have more defenses built into our visual processing (as we primarily interact with the world visually, and expect a lot more logical consistency in our visual inputs) vs audio processing, which makes it a bit easier to get AI music past them - most people process music on a more abstract level.
This has to be an individual thing, because I've already seen many generated images where it was difficult to impossible to say whether they were real or not. And based on the relatively large "image generation Turing test" done by Scott Alexander, which used a number of curated real and generated images in different painted, drawn, digitally painted, 3D-rendered, and other styles, I am well above average in my ability to spot them. There are styles in which AI is really good and others in which it's really bad, plus there are some common styles that are not technically bad but are immediately recognized as AI. But in the styles a model is good at, it's possible to create something where most people cannot tell real from fake even with effort.
However, I have not so far heard a single generated song where I didn't quickly realize it was AI generated. I think most often you hear the artificialness in the color of the voice, and it's no different here. I think it also depends on the quality of reproduction, because many of the artifacts get lost if you just listen to it on your phone, but good headphones or loudspeakers really make a difference.
For the record I am both a musician with an interest in sound reproduction and a graphic artist, both on a "paid hobby" level, and I commonly use generative image AI myself.
Not to detract from the outrage, but is there any significant difference between AI country music and this type of thing? Grift is grift one way or the other, right?
I've had exposure to people who like country, both enjoyers and creators, and I know there's good stuff out there. But the fact of the matter is that there are listeners who don't really seem to care what they listen to anymore. It reminds me of how Netflix is purportedly asking studios to make movies that don't require the viewers to pay any attention to them. I bet AI can do that too.
This may sound like gatekeeping and snobbery, but I agree and I would expand this further: most "top xx" popular music of whatever genre is slop created by a cynical industry. So no matter what we think about generative AI from a moral standpoint, it's not surprising that a new kind of slop can compete with the old kind of slop.
I predict that this is going to become very common in the future, and that generative AI is going to be used at this hyper-mainstream end of the spectrum, where people just don't care, and at the opposite end of the spectrum, in the strongly niche spheres where people are going to use it in creative ways to push the envelope. And the hugely broad middle-of-the-road group, the people who like music more interesting than whatever's on the radio but don't seek out the most out-there stuff, is going to stay strongly anti-AI.
Did you accidentally link the wrong song there? That's the Bo Burnham parody of a country song. Surely you're not asking if there's a significant difference between an AI song and a satirical song made by a comedian?
I don't think @Protected meant "is there any significant difference between AI country music and this Bo Burnham parody song" but rather "is there any significant difference between AI country music and [what this Bo Burnham parody song is trying to say]".
And "what this Bo Burnham parody song is trying to say" is something like "most country music superstars are pandering liars".
Yeah pretty much :D
Grifters gonna grift. I have no respect for the music industry to begin with (as I recently stated right here on Tildes), so I'm not surprised it would facilitate the connection between mainstream listeners and AI-generated content. Nor do I think this is avoidable. Even though I personally dislike generative AI.
In the near future the grifters won't necessarily need musicians - they don't need the artistry of music - at all, but countless people actually love musicians and music and will hopefully be able to seek them out and connect with them more directly, as long as the Internet itself doesn't go to hell entirely. It's not the way things were fifty years ago, but it's a much larger and dispersed and globalized market anyway. Things were never going to be the same. Nvidia killed the radio star hehe
I see @cfabbro went for exactly the same example I did :)
@V17 made an interesting point about the use of AI on the experimental, niche end of things that hadn't occurred to me at all. That honestly bothers me a little. If AI becomes pervasive in the mainstream, how original can anyone be by using it?
@V17 made an interesting point about the use of AI on the experimental, niche end of things that hadn't occurred to me at all. That honestly bothers me a little. If AI becomes pervasive in the mainstream, how original can anyone be by using it?
I wouldn't worry about that. Surely a niche artist isn't going to use basic AI to create unoriginal mainstream music. I was thinking of stuff like training music making AI models on field recordings from a steel mill, experimental jazz musicians improvising with a realtime generating model (surely that's just a matter of time now) or making weird post-club music even weirder and more post-clubby.
Let's note that Billboard has been the textbook definition of slop quality for several decades at least. This is 'secondhand smoke at the airport' music we're dealing with here. The charts are politics and payola, that is all they have ever been. At no time in history did charts ever cover the 'best' music. They listed whatever was being pushed by the industry at the moment, typically via backroom deals with labels for radio exposure and market trend capitalization.
Charts exist as advertising to create popular music, not simply catalog it. The one who controls the catalog gets to choose what is popular, that's the scam. They pick the artists that make them the most money, period. If you didn't pay into that system in some way (usually by giving up more of your rights and a lot more of your profit), you didn't get radio play, and you didn't get to be on the charts. It's 'product' and all about moving 'units'. Even the language used by the people in the industry does its level best to divorce the art from the end product. Now ask yourself... does AI music make them more money than real artists do?
Put another way, Billboard charts are what people who don't listen to music consider to be music. The kind of people who if you asked them to name seven genres, they couldn't do it. I look forward to watching AI slop devour the charts permanently - if there's one thing an AI is going to be good at, it'll be cranking out basic 4/4 popcorn for elevators and bad DJ mixes for dull dancefloors.
I think musicians who make library music are the ones who are truly doomed. That large and quiet segment of the music industry just went up in digital smoke. If artists want to make money in this world, the old way is still the best way - get good live, go on tour, sell tickets, sell your albums and swag at the shows. Retain your rights, remain independent, keep your entire revenue stream to yourself - do not share it with corporations.
Here's an unpopular take - if an AI can kick your ass as a musician, perhaps it's time for a career change. Back in the day, you couldn't just set up a digital audio workstation for a couple grand and compose your magnum opus. You had to have real talent and be at least good enough to keep up with the session musicians without them kicking you out of the million dollar studios where the music was made, because you were a chump who hadn't put in the work to develop chops yet.
On some level, all this kicking and screaming about AI strikes me as panic from pretender musicians who were never good enough at their craft to be offered a record deal in the first place. Cheap production and savvy computer software convinced them they could make millions without bothering to learn their chords and scales. Auto-tune trivialized singing for people with zero vocal control, and now it's replacing those same people. Now that AI is here to challenge them, they don't want to put in the work to do it live and go on tour.
I'm strangely cool with that as a cutoff point. If you can't beat the AI, you don't get to play this game. Turns out it's not hard to beat the AI - just show up as an actual live person and know how to jam. If you can't do that, I haven't got a lot of sympathy. Life is competitive, and learning how to play well is real, hard work even if you have natural talent. You were never promised a record deal as part of your basic human rights package - some things have to be earned the hard way.
That reminds me, have we killed Ticketmaster yet? It's a bigger problem for every touring artist than AI will ever be. Kill that malignant racketeering cancer and musicians get to double their ticket profit at the same time they cut the ticket price in half so the rest of us can afford to see the show.
The charts are politics and payola, that is all they have ever been. At no time in history did charts ever cover the 'best' music. They listed whatever was being pushed by the industry at the moment, typically via backroom deals with labels for radio exposure and market trend capitalization.
Case in point: the VTuber Mori Calliope has charted in multiple countries, but I bet you haven't heard her on the radio in the US. (Her new one, Orpheus, is pretty good.) YOASOBI's Idol and Creepy Nuts' Bling Bang Bang Born were popular enough to be global top songs on YouTube, and YOASOBI even played Coachella, but are ignored by the media gangs in the US. Bad Bunny has been fighting Taylor Swift for the top of streaming plays for years, but people acted like he didn't exist until suddenly this year.
There's a lot of cool music out there, but the US is extremely insular and transparently cultivates a false selection of "popular" music. The silver lining is, with music and movies in that sphere trending so mediocre, it'll hopefully lessen the US's ability to project pop culture to the rest of the world, and other places will get more of a chance.
The chart that this topped is "Country Digital Song Sales", which does not include streaming - meaning this is purely powered through purchases of the track. This is most likely a situation of an AI company buying their own single so they can advertise that "They topped a chart!", in a similar way to wine companies entering tiny competitions so they can legally call their wine "award-winning". Articles like this are advertising for AI companies - if the illusion is kept of AI as an "inevitable future", then people are more likely to invest in order to keep up.
Thank you for mentioning this, I was looking for someone to do so before I did.
This track's status was achieved with 3000 $1 downloads. So far, this is a nothingburger.
I don't understand the issue. Is the point of music the production process, or that it is being enjoyed by the listeners? People have enjoyed canned music that reuses the same recipes over and over again for decades at least, but probably since the first dinosaur made a chirp. AI seems to be extremely good at pandering to popular demand. Whatever people like, AI reproduces it. So this is not surprising at all.
I don't like AI at all and I have never really used it, but if the ability to make orchestral music with a click of a button means we can finally get rid of the music industry and go back to playing guitar for your friends, I'm all for it.
Is the point of music the production process or that it is being enjoyed by the listeners?
It's very much both. Music created by someone for themselves alone has just as much of a point as music created for millions. And in some cases, like educational ensemble settings, it's almost entirely about the process. Art-making is just as core to the human experience as art-appreciating. People get together and make music because they enjoy it, not necessarily because anyone else cares that they're doing it.
Christopher Small talks about how "musicking" is an activity, one that involves not just the listeners or performers or writers/composers, but all of them together. He strongly argues against the popular convention of conceiving of music as discrete objects (songs or pieces) and firmly believes it should be understood as an event. That event may unfold in stages (writing, production, listening) rather than happening all at the same time, but it's still an event. It's about the doing of the thing, and that necessarily includes the creators.
Part of his book Musicking: The Meanings of Performance and Listening is online if anyone wants to explore. The prelude is enough to get the gist, don't feel intimidated by the idea of diving into a whole book. It's good stuff.
Sorry, I don't know why I put that question there. Of course the creation of music is at least as important as listening to it. Some of the best melodies I've ever heard were ones I whistled to myself.
I guess my point was: If music creation is holy and can be desecrated by mindless repetition, countless numbers of musicians have been guilty of doing exactly that throughout history. AI is just making it easier for non-musicians to repeat popular patterns.
What should be more infuriating is that country music for the past several decades has been largely cookie-cutter songs, long before autotune came on the scene. It's no coincidence that the country genre is the first to see an AI work on a chart, albeit digital sales and not one of the main charts. I think it's just a matter of time until we see AI songs on main country charts, and then other genres. Record companies exist for one reason: to make money. Eliminate the artist and you've eliminated a cost. Of course this won't work for music from an artist like Taylor Swift, where there is a persona behind it, touring, etc., but for probably 90% of popular music, it will work well.
Y'know what I hate way more than AI generated art? Being told how I should feel about it by others.
How is that any different than most other top 40 music? New Country music especially has been riddled with soulless, heavily manufactured artists and songs pandering to their core audience's idealized "traditional" values for decades now. Bo Burnham 100% nailed it all the way back in 2015 in his Make Happy special:
The only difference between then and now is that it was record execs and their marketing departments doing the manufacturing of the music back then, and now it's a computer algorithm.
Just to pile on, I can reality-check that assumption with this pure-AI track. Those actually are AI-written lyrics, too. From the actual human who created this (imo rather delightful) little track:
His little AI experiments are all right here. There are a lot of them. So, this track was created by a software developer, not a musician.
If you want to discover the track you've still got to play it live, give it a chance to breathe and echo at a few shows, play with it on tour. I want to hear this AI track covered by The Greyhounds. They would embarrass any AI with the final cut. ;)
I'd also like a tool that could track the influences the AI used creating this track back to the original sources. If it's just lifting and remixing, seems like it's on the scent of some artists I'd like to hear... no credit is given to whatever sources the AI cribbed this from. Gotta admire capitalism for impossibly reinventing theft yet again, in an entirely new way. I think all the artists suing over AI right now have got a point. No attribution, no compensation.
Another good example that, like the one mentioned in the article, also made the Billboard charts: Xania Monet - How Was I Supposed to Know?
And as far as contemporary R&B music goes, IMO it's actually a pretty decent track too despite using an AI generated voice.
Being told how you should feel about music is basically par for the course in western discourse. This goes across eras and genres. A lot of times the critiques are raising real issues, but it’s really about people having their identity tied to yet another commercial enterprise. Sorry for the tangent, it’s just a pet peeve of mine.
Amusingly, ten years ago if you talked about AI generated music people would probably marvel at how niche and creative of an exercise it is. Something reserved for serious academics.
Record companies are probably salivating at the chance to cut artists out even further. But imagine the irony if they get their clock cleaned by some TikToker with an AI generated next gen Gorillaz or Hatsune Miku and a well tuned agent system.
Eh, I wouldn’t think so, or at least not if they’re smart, IMO. You already see tech companies purchasing or jockeying to purchase movie studios.
If they pursue this route, I can see it being a hybrid model of Uber and AWS: Uber (with Lyft) burning VC cash to dominate the market, then extracting all of their value once everyone is dependent. Everyone moved to “the cloud”, and now half of the internet (hyperbole, but not far off) goes down if AWS, Azure or GCP trips over a wire in the DC.
I expect once the AI dust starts to settle from this Wild West, any company or industry that put too much of their “portfolio” into whoever is left standing is going to be highly susceptible to essentially being held hostage - a situation of their own creation. Back to the cloud computing reference, how hard it is to convince companies to move back to on-prem is a great example of this.
Money for Nothing - Dire Straits
It's literally all said in that song.
There I Ruined It did a take last year on the same concept.
Worth mentioning that a lot of AI channels (like TIRI) are genuine musicians in their own right, and are simply having fun and being creative with the new style transfer tools that are available. He's still composing the words, playing the instruments and singing the melody. There's a lot more that goes on than just Suno-generated prompting.
Steve Goodman (and apparently an uncredited John Prine (RIP)) nailed it in 1971 with You Never Called Me By My Name
This comment alone was worth the work to join tildes.
Idk, if people are listening because they enjoy the music, it is what it is.
The "music", as it is AI-generated, is directly harmful to the people who make the music in the first place.
If it is being boosted by bots or whatnot, that’s an issue, but if it is just more popular because people like it more, then that seems fair game. That’s just competition.
It's "competition" in the same way that introducing an invasive species to an ecosystem is just "competition".
That AI-generated slop was created by stealing, for lack of a better way to frame it, from essentially every country music artist that has music available for download. It would not, could not exist without human creativity and feeling. This AI garbage only exists as a consequence of taking from human creation, without actually giving back to the artists from whom it stole. It can generate far more music than a person could create in the same time period, and the vast majority of it will be garbage. Machine-generated music will be able to flood the proverbial zone with so much garbage that you will have a very hard time finding anything created by people, if this becomes the norm.
The fact that some of it comes out listenable, even popular, is incredible as a technical feat. But it comes at the expense of everyone who cares, even a little, about the artistry behind what they do.
Imagine the worst-case scenario, where commercially-viable music from people is no longer available. People are no longer recording music, because it gets drowned out in the wave of AI-generated, algorithm-pleasing slop. There is no more feeling, no more soul, no more meaning behind anything you listen to anymore - because it's generated by a machine, looking to get the most views/listens/etc instead of trying to create something meaningful.
I dunno, I don't see a world where there's an actual benefit - to people, to culture, to anything - from just hosing down the internet with AI-generated slop like this.
If everything it makes is garbage, then it sounds like that says more about the human artists who are losing to said garbage in the ratings. Like I said, if there is foul play in the ratings like botting, that’s an issue; if it’s just people choosing to listen to it, skill issue?
I don’t see why Billboard should omit them if real people are listening, and I think people are free to listen to what they want. Is what it is.
What do you want to be done about it?
That's a pretty callous thing to say about people potentially losing their livelihoods due to a technology that is actively stealing their work.
I think financial recompense to the artists who have had their work stolen and their copyrights infringed is a start. Again, none of this AI-generated music would exist if it weren't for the fact that human artists have had their works stolen and copied by these huge AI companies. These companies aren't actually competing "fairly" in the first place - all of these LLM "AI" chatbots are based on news articles, books, literature, and the like having been published by people, for people, and protected under copyright.
Or, at least, they should be protected under copyright, except that these companies just ... stole the content, shuffled the bits around, and didn't pay for fair use of the material in the first place. There's ongoing legal cases surrounding it, and I imagine similar legal issues will become more prevalent, and find their way into AI-generated music too.
And if paying fair financial recompense to human artists would result in these AI-generated music companies going out of business - isn't that just competition too?
It’s more of a response to everyone who keeps saying it’s “garbage slop” or whatever, which seems logically inconsistent. You can’t have your cake and eat it too - either these music models can make music good enough that real humans like as much or more than music made by humans and are therefore deadly competition for artists, OR they just make dumb garbage no one likes and no humans need to be worried.
Personally, I’m fine with it being the former. The artists make good music, and so do the music models. That’s why this is even a relevant topic. If you call all AI music garbage, you’re implicitly calling the human artists who lose to them WORSE than garbage.
I think whether or not training a model counts as a non-transformative work of copyright is something that is not yet determined. Those court cases are working on it, and what precedence they set will determine future policy. The machines of government are in progress for this already. We just have to wait for them to complete.
I don't call it garbage because it's "bad music". I call it garbage because it's bad for culture, it's bad for actual artists who do actual work, it's bad for the environment, it's bad for people who want any financial benefit from their work ... it's just bad all around.
Garbage isn't just an assessment of quality or utility. It's an assessment of value. To me, a kid's Dora the Explorer quartz watch I found on the ground is garbage because it was mass-produced in a Chinese factory for pennies, just like the millions of other identical models. It's worthless because it's not special. No care was put into its production, and it doesn't actually say anything. The fact that it objectively tells time better than a handcrafted million-dollar mechanical-movement Patek Philippe doesn't somehow make it not garbage.
What does “good enough” mean, though? I think it’s logically consistent to believe that many people have bad taste in music. Popularity shows a kind of skill at making stuff that many people like and that’s important professionally, but you don’t have to like the result. Compare with junk food being popular, but known to be bad for you. There is more to life than commercial success.
It’s a lot harder, though, to show that music is objectively bad. Usually this is conditional: if you like X, this is a bad version of X.
Also, listening to music out of context, not knowing how it was made, is only one way to listen. While it can be interesting to do blind auditions and see what you can puzzle out going in cold, it’s also interesting to listen to music in context, knowing something of its history and influences. AI-generated music is going to have a different meaning than historically important music made by musicians.
Compare with sports. A sports movie is different from watching a game. You can admire the skill of the athletes performing in a real game or the skill of the actors and all the other people to fake an exciting game when making a sports movie. These are different kinds of admiration.
I think it’s possible to admire AI-generated music, but it will be for different reasons than admiring recordings by musicians. The AI researchers and programmers building these music generators are good at what they do, and they will get better. It would help me appreciate them if I knew more about how they did it.
(It’s also possible to admire the skill that went into making something and not be all that into it.)
I can't help but notice that you go from "actively stealing" to "being based on". It's just wild how liberally the word theft is used when it comes to conversations about AI. And I don't know about you, but it's not really obvious to me that authors are owed money for any algorithm that uses their works as input. Like if I run a script that takes articles published online, takes a random adjective, verb, noun from each and cleverly arranges them into a legible text - do I now owe financial recompense to every single author of an article that I used? It doesn't make any sense to me, the value of the finished work (if any) would be in the clever arrangement of the words, each one on their own would be worthless. Similarly, a raw dataset is just a pile of garbage, the value comes from filtering it, labeling it, writing an algorithm to train on it, from compute costs, from R&D behind it all. All of that work is not done by the original authors.
You could say, well what about the data that wasn't openly published on the internet, and yes indeed that would be piracy if it was downloaded illegally, something that no one really disagrees with including the courts. Yet it is NEVER talked about in AI criticisms, probably because in the eyes of most people piracy isn't really such a big deal, instead we have the accusations of "callous theft!" and such, which is in my opinion simply disingenuous.
I use different words because I was taught not to use a single word too frequently. Writing style and all that.
Let me make my stance clear: for lack of a better definition for what AI/LLMs/whatever are doing with music, written works, TV, films, etc - all of these AI companies are stealing.
Your argument is confusing, because you're drawing a line where it's okay to take piecemeal works published and available online, but not okay to take something downloaded illegally or otherwise unavailable online. A lot of these LLM training sets include works that were downloaded illegally. In many cases, such as ChatGPT, the models produce direct quotes from the works they were trained on, and are not simply cutting and pasting random words.
I would argue that there is a similarity in the training of an AI to (ideally) regurgitate facts and otherwise be a point of reference ... to educating people. When we get educated, a teacher/professor/whatever often gives you a textbook, an assigned reading, that sort of thing. They are not asking you to repeat what you read verbatim, but rather to read it, consider it, and in the end be able to bring about original ideas and combinations of words that speak to truth, history, whatever.
When a teacher uses those textbooks in the classroom, they (or the school district, whatever) have to pay for the use of those books. They can't just steal them and insist that because they aren't repeating the words from the text verbatim, that it's somehow okay and solely original work. The vast, vast majority of the work used to train AIs was not paid for.
That's fine and good, but do you acknowledge that "being based on" and "stealing from" are fundamentally different concepts and the difference between them makes or breaks your argument?
To also be clear, I think there are a variety of better ways of describing whatever AI are doing with music et al than "stealing". I'm trying to say that it is not at all intuitive and you haven't demonstrated why exactly it is theft.
As I said in the previous post, piracy is not at all the same. I don't even think anyone would ever say "I stole a PDF of a textbook", because it's just not the same thing intuitively as lifting a book from a shop. So yes, in my opinion there's a clear distinction there. Personally I also don't particularly care about piracy that much. In fact I think copyright laws are too draconian and way more works should be in the public domain. I do agree that, just as schools should pay for textbooks, companies should not pirate their training data. Thankfully there's no inherent requirement for training data to be pirated; it is not at all a problem with AI as a technology.
Now if you want to argue that using works publicly posted for free to view and download for AI training is stealing, and that every one of those authors is owed compensation, then you really should make a strong argument for it, because I don't see it. I mean, did Google researchers know that AI was gonna become a golden goose when they posted their paper on the Transformer architecture for all to use? No. Are they now retroactively owed compensation because their work is now being used to make billions? I don't think so. Neither do I think that everyone who posted their art/music/whatever online is now owed compensation because it happened to be used to train valuable models. That's just... progress.
Now if you want to say that you should be able to opt out of whatever you're posting to be used in AI training going forward through appropriate licenses, I can definitely see that argument having validity.
Individual piracy is not the same as corporate piracy. While copyright laws may or may not be draconian, and more works should be in the public domain, the fact of the matter is they're not - not yet - and in many cases the authors of copyrighted works are still alive and protected by those laws.
Training data need not be pirated, but a significant amount is.
https://www.tomshardware.com/tech-industry/artificial-intelligence/meta-staff-torrented-nearly-82tb-of-pirated-books-for-ai-training-court-records-reveal-copyright-violations
Content publicly posted for free viewing is not the same as content made available for public use. I can view a lot of things on YouTube, but created content is protected by the law. And again, there is a difference between a single person ripping a free YouTube video, and a corporation ripping millions of videos.
Ultimately, I don't think we're going to see eye to eye on this, and that's fine.
Again, you're using the word "ripping", but I don't see how it is that. Remixes and edits exist on YouTube just fine.
I'm aware, that's why I said that companies should pay for the data. But this is a criticism of individual companies, not of "AI". Again, if that's the extent of your position on why it's "stealing" or "ripping", then it seems extremely misleading to use those terms.
To my understanding, and I could be wrong here, but of the dozens (are we up to hundreds yet?) of AI models available today, literally only one company has claimed that their training content was licensed. And even then, I believe their “licensure” was changing their own terms and conditions after having amassed a large library of content.
I think if every player on your favourite baseball team is using illegal performance enhancing drugs except for one player who isn’t, it’s entirely fair to say the team (collectively) is doing the wrong thing. Rather than individually naming each player for doing the wrong thing but the team overall is beyond criticism.
We aren't talking about licensing the training process. It's very debatable that it would be required as opposed to falling under fair use. We're talking about downloading media illegally, i.e. piracy. This only applies to something like downloading a Disney movie. 99% of the time the conversation about AI being "theft" revolves around artists and content creators that post their content online to be freely accessed, so scraping it isn't the same thing as pirating a movie or whatever.
Licensing is the solution to piracy, legally speaking, so I assumed they go hand-in-hand in a discussion about piracy. After all, if I have a licence from the creator to do XYZ then by definition when I do XYZ it isn’t piracy.
I’ll also push back against the claim of fair use, which is an affirmative defence against copyright infringement, but which is often misused across the internet. Caveat that I’m not a lawyer or a judge, but it’s fairly easy to look up what fair use actually means and where it applies and what considerations form part of this defence.
Creating a for-profit business by harvesting entire works without credit and without even attempting to discuss licensing with original creators and not freely sharing your result smashes all four factors of fair use. Illegally downloading the latest Disney+ exclusive movie to watch at home is much closer to fair use than anything OpenAI or their competitors have done in this space.
Why would it only be valid going forward, now that these models exist? Surely it’s also valid that anyone should be able to opt out of their existing works being up for grabs for AI training?
I meant as opposed to making anyone who has already trained a model on their works retroactively pay for it.
I’m not sure why that should be an exemption either. If I created a blog 10 years ago and my intention was “nobody should be able to read my blog unless they pay me for it” and then I put up a paywall, and then I go “okay one exception is for google search crawlers to index my blog so more people can find my works and pay me for access” then why should this new technology get an exemption to my initial intentions? Just because it’s new?
Because it's absurd to demand compensation based on some internal intentions you may have had 10 years ago that weren't explicitly stated.
I’m not sure what you think “not explicitly stated” means, but if a website has a robots.txt that permits only a google search web crawler, or otherwise enumerates the options that the website owner writes up, I think that’s pretty explicitly stated intentions.
I think it’s ludicrous to assume that a new technology or a new use case for harvesting data somehow gets a free pass by default, instead of a requirement to seek permission first by default. Especially in the context of a for-profit company locking the result behind a paywall but paying for none of the pieces.
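For clarity, the kind of explicitly stated enumeration I mean looks something like this minimal robots.txt sketch (the crawler tokens here are illustrative; `Googlebot` and `GPTBot` are the published user-agent names for Google's search crawler and OpenAI's training crawler, respectively):

```
# Allow only Google's search crawler to index the site.
User-agent: Googlebot
Allow: /

# Explicitly refuse a known AI training crawler.
User-agent: GPTBot
Disallow: /

# Default rule: every other crawler is refused.
User-agent: *
Disallow: /
```

The point being that the site owner has enumerated their permissions up front; a new kind of crawler showing up years later doesn't get to claim those stated intentions somehow didn't anticipate it.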
Where does the notion come from that you can post anything you want online and then exclusively decide exactly how people use it?
Plagiarism (taking things as is) is a very specific case that we forbid. What does this have to do with AI training? Moreover, for 10 years everyone was free to copy this content, save it on their computers, run algorithms, study that data, transform it, make it into something else, sell the results and it was fine. Why is AI training any different?
Copyright law is the easiest example I can think of, and various other intellectual property (IP) laws. But given my example included someone not just saying “pretty please” but also enforcing their wishes with a paywall and robots.txt then to my mind it’s clearly also protected by laws against circumventing technological barriers (to grossly simplify, “anti-hacking laws” etc)
Plagiarism is a much more narrow slice of IP, and I mostly see it discussed in an academic environment, but that’s about crediting the original source (which LLMs are also bad at). But it’s certainly not the only case that’s forbidden.
If I start printing and selling copies of the Harry Potter books verbatim (without a licensing agreement), but I make it clear on the front of the book that I’m not the original author, then this isn’t plagiarism, specifically. But this is definitely still illegal under copyright and other IP laws.
There’s a huge difference between what’s simply possible and what’s permitted. I think all of this has been possible, but I disagree with the idea that it’s been legally permitted this whole time. If you can give me some examples so that I have a better idea of what you’re referencing here specifically, that would be great.
Oops, I wrote out a whole tangent on Fair Use which isn’t really relevant to the conversation currently.
And there’s also a significant difference in how these actions are viewed (e.g. when examining if Fair Use applies) when the results are for educational purposes, or for personal/private use, or for commercial reasons, or if the results of the data processing are themselves also freely shared or not.
For example, if I scraped the internet for all writing (including copyrighted works, like LLMs have been doing) and then created an LLM that I never distributed and only kept on my home computer for personal use, then there would be a very weak case against me.
Or if I did the same and wrote academic papers about what I’d learned in the process, and provided educational materials for how this new transformer style model could do much more than markov chains, that would also be a fairly weak legal case against me.
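As an aside for anyone unfamiliar with that comparison: a markov chain text generator is nothing more than a lookup table of which word tends to follow which, which is why transformers were such a leap. A toy sketch in Python (the corpus and function names are mine, purely illustrative):

```python
import random
from collections import defaultdict

def build_chain(text):
    """Map each word to the list of words that follow it in the corpus."""
    words = text.split()
    chain = defaultdict(list)
    for a, b in zip(words, words[1:]):
        chain[a].append(b)
    return chain

def generate(chain, start, length, seed=0):
    """Walk the chain, picking a random successor at each step."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        successors = chain.get(out[-1])
        if not successors:
            break  # dead end: no word ever followed this one
        out.append(rng.choice(successors))
    return " ".join(out)

corpus = "a dirt road a cold beer a blue jeans a red pickup"
chain = build_chain(corpus)
print(generate(chain, "a", 6))
```

The output is locally plausible word-to-word but has no larger structure, which is roughly the gap the academic papers in my hypothetical would be explaining.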
Or if I did the same as both of the above, but also released the entire model free to the world to use, then there would be a bit of a stronger case against me, because now I’m making protected works available to the public, albeit in a somewhat scrambled way.
But it’s been proven that LLMs can be prompted to reconstruct entire copyrighted works, so saying “it’s transformative” as my defence is pretty weak when the model is fully capable of recreating entire non-transformed, unaltered works.
But doing all of the above and then putting it all behind a paywall, I’ve now clearly swayed all four factors of Fair Use against myself, and if I don’t have Fair Use to hide behind then it’s pretty clearly in the territory of copyright infringement.
I guess you are using some technical definition of plagiarism. When I say plagiarism, I simply mean copying verbatim. This is not at all relevant to the AI discussion because AI is by definition a neural network (its weights); it cannot possibly be a verbatim copy. And if you want to say that you can get it to produce a verbatim copy, I'll respond with the fact that Photoshop can be used to produce a perfect copy of any copyrighted image.
Textbooks strike me as literally publishers paying a clerk to strip the identity from their source texts in order to package and sell them as an all-in-one, in a way morally similar to these language models. Sure, they give citations, but...economically? What's the distinction?
Generally they're written by professors who are experts in their fields and thus are intentional in what they include and how they write about it.
I think your framing of it redirects ire that justly belongs to publishers onto the entire concept of a textbook. We don't all always read primary texts about every topic, but we still need to learn about Sociology 101.
I'm not sure I've ever learned something from a textbook that wasn't better delivered in lecture, seminar, or primary texts, so the distaste was intended. But that's definitely subjective.
To the point at hand: so the value is in curation, then? If it's alright for a publisher to shop around for an expert willing to write a textbook that fits their series schema, who then borrows liberally from secondary sources (nonderogatory, but not original) to fill the 400 large format pages of the manuscript, which is chopped and revised by the publisher comprehensively, wouldn't it stand that if the AI bots were better at curating fact and providing citation, then there's no problem with it? Is it literally that a person arranged the words? Because I'm not sure that's the value, real or economic.
I am someone who believes people offer value that AI doesn't. So no I wouldn't agree. (And we certainly wouldn't want Grok to be writing your history textbooks)
I'm in favor of the open access textbook movement, but not literally removing all control of what is taught from the experts with terminal degrees in the field and handing it entirely to the publishers to control the narrative. I'm not sure that the pathway really is publisher seeks author to write the narrative they want, but it would be worse if it was publisher uses AI to write narrative with zero oversight. But if you've already decided all textbooks are useless, idk why you'd want them to be blatantly so.
I have had good text books and horrible lecturers and the opposite so it certainly depends on the topic, the teacher and the book.
Really, my initial point was that appealing to textbooks is unconvincing. I've outlined my reasoning there. When it comes to the intrinsic value of human construction heavily edited by profit-seeking conglomerates as opposed to automated construction heavily edited by profit-seeking conglomerates, I'm just not convinced. Beyond that, I don't disagree that it would be bad for a bad textbook generator to replace the currently mediocre textbook industry. I just think that's a matter of time, given how LLMs at their theoretical best can do summary quite well, rather than something intrinsic to the humanity of the drafters of such texts.
Aside: as mentioned in my other reply, I don't think all textbooks are useless; topic-focused books can be excellent references and primers can be excellent introductions. My gripe is specifically with those Sociology 101 type, overly broad and pseudo-objective, examples.
I don't think I follow. Are you inferring that textbooks aren't actually written, but instead cobbled together from other source materials? Because that doesn't seem accurate, at least not for the vast majority of textbooks.
Insofar as a person's Broca's area matched words to the ideas drawn from source materials, yes, they're written. Personally, I think the claim that neurons make the difference in pattern matching has yet to be proven.
I'm being a little overbroad here, I'll admit, and am thinking of the intro textbooks that pretend to articulate a broad number of topics within a subject as a survey, not the more traditional textbooks which are effectively original works focused on pedagogy.
Then we'll listen to the music that isn't commercially viable, that people create for the love of it.
From everything you've said, here and below, it seems like you're angry at the economics (entirely justifiably - I am too), but you're directing it at the technology. It's a problem of culture, and a problem of incentives, and to me at least burning up anger on the tooling rather than the cause is just another means of obscuring where the real problems lie.
It doesn't sound like @Drewbahr is mad at the technology but at the companies who built the technology ingesting music made by people who understand the human condition and make a living off their hard work, then using the technology to output something trivially.
Edit: it's both :) Sorry for speaking for you!
That's a fair way to put it, yeah. I might add that the whole of the AI industry, as I understand it, is effectively exploitative. It is exploiting artists' works without fair (or any) compensation.
I believe that all art - be it painting, music, filmmaking, or whatever - is inherently a conversation between the artist and the person or people appreciating their work. This extends to both the culturally-significant works of art, as well as stuff that is created for commercial use. Even if something is made "soullessly" by a corporation, there are still people working hard to create it. It is a fundamentally human-created thing, that can be appreciated (or not) for what it is.
When an AI model takes in millions of hours of music and regurgitates it into an algorithm-matching earworm, where's the conversation between artist and appreciator? What does the work actually say, other than "Spotify thinks you'll listen to this slop X many times this year, which will make us Y many dollars."
I also take issue with the technology, making it so that garbage media like this can be churned out ad nauseam. I am angry at the technology, because the technology has been given sufficient incentives to exist and be used in this way in the first place. Yes, it's a problem of culture and incentives - but it's also a problem of technology! We wouldn't be having this conversation if the technology itself did not exist.
I’m incredibly biased, I’ll be the first to admit that. But the technology wasn’t built for music - diffusion models were literally that, scientific models to study the process of diffusion. That’s the problem, and I think why the backlash bothers me so much: because the tech exists for a reason, a damn good one in my opinion, and the fact that it’s been relatively easy to port over to other purposes doesn’t undo that. It’s a problem of shitty incentives, and as far as I’m concerned only a problem of shitty incentives.
I do also get frustrated about the whole “all training is stealing” argument; not just because consuming and analysing data is quite different to copying it wholesale, but because it’s a fundamental misunderstanding of what’s possible with the technology as well. I’d actually love to create a model that makes work truly and unequivocally its own, without guidance from existing human work, simply because I find that a fascinating artistic concept in and of itself. But I have neither the time nor money to spend on that right now, and it would be a tricky and expensive way to create something artistically interesting that I’d never see commercial return from - again, incentive problem, not technical problem.
And for what it’s worth, I agree wholeheartedly about art being about communication, conversation between artist and audience. I’ve actually said that in almost the exact same words myself before, but with the exact opposite conclusion: machines can’t take art away from us, because that desire to communicate will always exist.
I think part of the reason that we reach different conclusions is this:
I don't think it's reasonable to expect music to be created for the sole purpose of "the love of the art form". Music, and other art forms, have always had a financial interest as much as a creative one. The ceiling of The Sistine Chapel, Michelangelo's David, many classical musical works, the list goes on - they were all commissioned art, made not solely for the love of the art itself but made possible by patrons. The artists created the work for money as well as for the arts.
Now, I suppose one could argue that patronage would still exist in an AI-makes-art ecosystem. Patreon exists after all. But one could argue that even that - an artist creating a Patreon to keep themselves afloat, or to finance the creation of their music - isn't actually making music for music's sake anymore. They're creating music based on the wants of their supporter(s).
Now, if we're just talking about people making jam bands, or high school/community orchestra/etc. performances ... well, yeah, those may continue to exist too. But I dunno, it seems quite bleak to have music exist solely as a creative outlet, and not as one that could financially support the artist too.
I may not be organizing my thoughts fully on this, so apologies if this is rambly and disorganized.
Nah, I see where you’re coming from, and I don’t entirely disagree. For what it’s worth I’m very much not saying commercial incentive diminishes genuine art, either - I don’t necessarily think that all commercial content rises to the level of being considered art at all, but a huge amount of it does on its own merits and the payment doesn’t negate that. Hell, sometimes commercial constraints genuinely act as part of the art - I don’t think Clerks would be the movie it is with a higher budget, for example.
I’ve just also seen enough genuinely wonderful and moving work put out into the world from people who make their living elsewhere that I don’t really worry about that ever going away.
I think I’ve kind of skipped ahead to the part where the economy is so laughably fucked that I don’t see any link between something being worth doing (in the sense that the world and the people need and want it), and that thing actually making a living - I don’t worry about it being taken away because I’m already operating in a world where that kind of sensible reward structure is dead. Either we fundamentally restructure the economy to recognise a century or so of exponentially increasing productivity per worker and an exponentially increasing number of workers on top of that, in which case the “making a living” part is moot, or we’re heading for a technofeudalist hellscape in which we’re all doomed, in which case the “making a living” part is moot.
you seem reasonable lol—what does anyone actually get from the argument that AI is "wrong" and we shouldn't use it or benefit from it, when giant corporations are 100% going to use it to destroy employment models? Like if I had an opportunity to buy some new hydrogen-powered car that cost $11/year to operate, & everyone was like "No you purchasing that is going to invalidate the work of thousands of oilfield workers who depend on gasoline barrel sales for their livelihood"—idk it feels gross saying it, but what am I really supposed to do individually? No one is going to stop a giant corporation from spending less/making more, and every driving/transport/shipping company will be all over that the second it's feasible (or, if self-driving cars & drone delivery &etc &etc are an indicator, then prob before it's feasible lol). So why do I have to hate the thing, and even if I do then what am I supposed to do about it.
Actually maybe this question is for @Drewbahr lol
I don't think that the rise and prevalence of AI is a foregone conclusion, considering how much advertising they're forcing on us to insist that we use LLMs and what not, along with how little these companies are actually making.
Furthermore, I argue that we shouldn't use it for moral and ... I dunno, "logic-related" reasons. These models were trained on stolen and copyright-infringed materials, and in some cases with the intent of replacing people in what they do. It would be one thing if the LLMs were being used to replace people in inherently-dangerous jobs - for example, automated control systems in industry have replaced the need to send people into dangerous facilities for the sole purpose of turning valves, activating pumps, etc. What's more, in those cases there needs to be a person at a control board that can watch and control said systems.
No, in these circumstances the AI and LLMs are replacing people in artistic endeavors - be they creating art, music, or written works. And yes, I consider writing things like encyclopedias and textbooks to be "artistic" as well.
So, if the people whose works were stolen and copyright-infringed were actually fairly paid for their works, I would have fewer issues with the use of these LLM models. The companies behind those models would also probably be bankrupt, and/or their models prohibitively expensive to use, because they stole a lot of works and should have their pocketbooks raked for it.
As for the "logic" argument, what I mean here is that people are, in some cases, using these LLMs and what not to replace thinking and researching. Back in the day, we'd go to the library and hit the card catalogs to find resource books for research on whatever subject we wanted. Google and other search engines kind of replaced that for a lot of students, but when Googling something you still need to, y'know, actually go to the page in question to read about the topic you're searching. With LLMs, you don't even have to do that - you ask the LLM "explain the 100 Years War to me", and it'll do it. You have no reason to search further; the LLM has done the thinking and researching for you, and you never have to develop those skills.
In short, LLMs can make people worse at reasoning, worse at research, worse at "thinking". This is backed up by ongoing research as well.
No one is saying you, thumbsupemoji, have to hate LLMs or what not. We're just sharing the reasons why we don't like them.
I could dig into the example of a hydrogen-powered car, but I don't want to derail the conversation. Suffice it to say, as someone who works in the oil industry but has significant moral issues with the work they do ... I would welcome a cheap replacement to gas-powered vehicles! But hydrogen ain't it.
Yeah I know nothing about hydrogen lol, and it's a unique situation so I struggled to find something semi-relatable—I have strong feelings about copyright as well, although maybe more controversial ones : ) I still don't know who the "we" is in the argument, though, & what I was trying to say I think was : if I announce "I believe this is bad for humanity," and there are people like Eliezer Yudkowsky who are devoting their whole careers to actively saying that right now, & have been for years, then it just feels like saying something really really common & profitable is terrible & guys we gotta stop—again, who's "we"? I'm not trying to be discouraging or despondent, I just don't know what the end goal is beyond agreeing/disagreeing; right now every company is set on coming up with "their" intranet-style version of copilot or gemini or whatever, like nonstandardized light bulbs, but when someone finally makes it affordable & easy to deal with, that's when the user rate is going to shoot way up. So maybe getting to someone involved with that could improve the outcomes there? Idk I don't have answers just questions lol
As an amateur musician, I sometimes pay for things, but I also borrow a lot from other musicians without compensation. I don’t see what AI firms are doing as significantly worse than what almost all music fans do sometimes.
Cue Weird Al’s “Don’t download this song”.
"competition" is kind of wallpapering over the fact that people who have put in zero effort are stealing from people who have put in a lot of effort.
Like, imagine if Taylor Swift put out a new song, and I just copied it, somehow promoted it better, and my version of her song sold a hundred million copies to the detriment of the "real" version. That would be just competition as well, but pretty blatantly unfair and immoral.
AI music is doing the same thing except it's cribbing from millions of unverifiable recorded tracks. It's not a consciousness being influenced by those tracks and remixing it based on their own lives experiences, it's just taking them, converting them to weights, then composing a song based on a sentence long prompt with a sprinkle of randomness thrown in.
There are obviously things that cross the line from "well, that's just competition" to blatantly immoral and unfair practices that hurt us all long term.
Maybe, but I don't listen to music as a roundabout way to give donations to artists; I listen to music to hear sounds I like.
Often the artist is not creating music with the intent of just pleasing your ears and raking the money in. They tend to use it to communicate something.
Go and listen to Voltaire - Industrial Revolution.
I don't think giving more space to generated sounds that game human psychology to get an easy win is a good idea. I would very much prefer to give space to interesting people with a story to tell.
And I don't really care for starving artists. The professionalization of music and the way it has turned into an industry is also bad.
It's probably going to happen more often as well. Music generation is in its own category because unlike LLMs, image generators, videos, etc, AI music is actually pretty good in its own right - morals and ethics apart of course.
And it's something that big companies like Spotify have been investing in, so I'm expecting that we'll see more and more AI music taking center stage in the future.
If this is a good or bad or terrible thing.... That's another discussion
AI music to me feels very similar to AI image generation actually (and to some degree LLM generation as well) - there is this quality about both that is hard to put into words exactly, but a sort of uncanny intensity. Like you can feel the training is definitely distilling some deep human sensory processing mechanisms from the vast amount of human created data it uses, and is able to push those sensory buttons in a very direct way - sometimes uncomfortably so, but with careful curation of the outputs, you can select examples of both that are close to the edge and still feel "realistic".
I think we have more defenses built into our visual processing (as we primarily interact with the world visually, and expect a lot more logical consistency in our visual inputs) vs audio processing, which makes it a bit easier to get AI music past them - most people process music on a more abstract level.
I also think we have a built in sensitivity to mimicry - if we detect something is trying to fool our senses in certain ways, it can trigger a negative emotional reaction. I find that the more AI generated content I am exposed to, I develop a new level of aversion towards it - what at first glance seems curious, even compelling, later makes me feel uncomfortable and repulsed as my brain recognizes it as mimicry of human expression. I went through this with AI images, and now AI music as well - once I begin to recognize the signs, I start to hate it, and I don't mean as a conscious choice (I actually came into the gen AI space with very open mind initially), the repulsion comes from some deeper layer of processing. I guess I just don't feel comfortable having my emotional buttons pushed by robots.
This has to be an individual thing, because I've already seen many generated images where it was difficult to impossible to say whether they were real or not. And based on the relatively large "image generation turing test" done by Scott Alexander, which used a number of curated real and generated images in different painted, drawn, digitally painted, 3D rendered, and other styles, I am way above average in my ability to spot them. There are styles in which AI is really good and others in which it's really bad, plus there are some common styles that are not technically bad but are immediately recognized as AI. But in the styles in which a model is good, it's possible to create something where most people cannot distinguish real from fake even with effort.
However I have not so far heard a single generated song where I didn't quickly realize that it's AI generated. I think that most often you hear the artificialness in the color of the voice, and it's no different here. I think it also depends on the quality of reproduction, because many of the artifacts get lost if you just listen to it on your phone, but good headphones or loudspeakers really make a difference.
For the record I am both a musician with an interest in sound reproduction and a graphic artist, both on a "paid hobby" level, and I commonly use generative image AI myself.
Not to detract from the outrage, but is there any significant difference between AI country music and this type of thing? Grift is grift one way or the other, right?
I've had exposure to people who like country, both enjoyers and creators, and I know there's good stuff out there. But the fact of the matter is that there are listeners who don't really seem to care what they listen to anymore. It reminds me of how Netflix is purportedly asking studios to make movies that don't require the viewers to pay any attention to them. I bet AI can do that too.
This may sound like gatekeeping and snobbery, but I agree and I would expand this further: most "top xx" popular music of whatever genre is slop created by a cynical industry. So no matter what we think about generative AI from a moral standpoint, it's not surprising that a new kind of slop can compete with the old kind of slop.
I predict that this is going to become very common in the future, and that generative AI is going to be used at this hyper-mainstream end of the spectrum, where people just don't care, and at the opposite end of the spectrum, in the strongly niche spheres where people are going to use it in creative ways to push the envelope. And the hugely broad middle of the road group, the people who like music more interesting than whatever's on the radio but don't seek out the most out there stuff, is going to stay strongly anti-AI.
Did you accidentally link the wrong song there? That's the Bo Burnham parody of a country song. Surely you're not asking if there's a significant difference between an AI song and a satirical song made by a comedian?
I don't think @Protected meant "is there any significant difference between AI country music and this Bo Burnham parody song" but rather "is there any significant difference between AI country music and [what this Bo Burnham parody song is trying to say]".
And "what this Bo Burnham parody song is trying to say" is something like "most country music superstars are pandering liars".
Yeah pretty much :D
Grifters gonna grift. I have no respect for the music industry to begin with (as I recently stated right here on Tildes), so I'm not surprised it would facilitate the connection between mainstream listeners and AI-generated content. Nor do I think this is avoidable. Even though I personally dislike generative AI.
In the near future the grifters won't necessarily need musicians - they don't need the artistry of music - at all, but countless people actually love musicians and music and will hopefully be able to seek them out and connect with them more directly, as long as the Internet itself doesn't go to hell entirely. It's not the way things were fifty years ago, but it's a much larger and dispersed and globalized market anyway. Things were never going to be the same. Nvidia killed the radio star hehe
I see @cfabbro went for exactly the same example I did :)
@V17 made an interesting point about use of AI on the experimental, niche end of things that hadn't occurred to me at all. That honestly bothers me a little. If AI becomes pervasive in the mainstream, how original can anyone be by using it?
I wouldn't worry about that. Surely a niche artist isn't going to use basic AI to create unoriginal mainstream music. I was thinking of stuff like training music making AI models on field recordings from a steel mill, experimental jazz musicians improvising with a realtime generating model (surely that's just a matter of time now) or making weird post-club music even weirder and more post-clubby.
Let's note that Billboard has been the textbook definition of slop quality for several decades at least. This is 'secondhand smoke at the airport' music we're dealing with here. The charts are politics and payola, that is all they have ever been. At no time in history did charts ever cover the 'best' music. They listed whatever was being pushed by the industry at the moment, typically via backroom deals with labels for radio exposure and market trend capitalization.
Charts exist as advertising to create popular music, not simply catalog it. The one who controls the catalog gets to choose what is popular, that's the scam. They pick the artists that make them the most money, period. If you didn't pay into that system in some way (usually by giving up more of your rights and a lot more of your profit), you didn't get radio play, and you didn't get to be on the charts. It's 'product' and all about moving 'units'. Even the language used by the people in the industry does its level best to divorce the art from the end product. Now ask yourself... does AI music make them more money than real artists do?
Put another way, Billboard charts are what people who don't listen to music consider to be music. The kind of people who if you asked them to name seven genres, they couldn't do it. I look forward to watching AI slop devour the charts permanently - if there's one thing an AI is going to be good at, it'll be cranking out basic 4/4 popcorn for elevators and bad DJ mixes for dull dancefloors.
I think musicians who make library music are the ones who are truly doomed. That large and quiet segment of the music industry just went up in digital smoke. If artists want to make money in this world, the old way is still the best way - get good live, go on tour, sell tickets, sell your albums and swag at the shows. Retain your rights, remain independent, keep your entire revenue stream to yourself - do not share it with corporations.
Here's an unpopular take - if an AI can kick your ass as a musician, perhaps it's time for a career change. It wasn't so easy back in the day to set up a digital audio workstation for a couple grand and compose your magnum opus. You had to have real talent and be at least good enough to keep up with the session musicians without them kicking you out of the million dollar studios where the music was made, because you were a chump who hadn't put in the work to develop chops yet.
On some level, all this kicking and screaming about AI strikes me as panic from pretender musicians who were never good enough at their craft to be offered a record deal in the first place. Cheap production and savvy computer software convinced them they could make millions without bothering to learn their chords and scales. Auto-tune trivialized singing for people with zero vocal control, and now it's replacing those same people. Now that AI is here to challenge them, they don't want to put in the work to do it live and go on tour.
I'm strangely cool with that as a cutoff point. If you can't beat the AI, you don't get to play this game. Turns out it's not hard to beat the AI - just show up as an actual live person and know how to jam. If you can't do that, I haven't got a lot of sympathy. Life is competitive, and learning how to play well is real, hard work even if you have natural talent. You were never promised a record deal as part of your basic human rights package - some things have to be earned the hard way.
That reminds me, have we killed Ticketmaster yet? It's a bigger problem for every touring artist than AI will ever be. Kill that malignant racketeering cancer and musicians get to double their ticket profit at the same time they cut the ticket price in half so the rest of us can afford to see the show.
Case in point: the VTuber Mori Calliope has charted in multiple countries, but I bet you haven't heard her on the radio in the US. (Her new one, Orpheus, is pretty good.) YOASOBI's Idol and Creepy Nuts' Bling Bang Bang Born were popular enough to be global top songs on YouTube, and YOASOBI even played Coachella, but are ignored by the media gangs in the US. Bad Bunny has been fighting Taylor Swift for the top of streaming plays for years, but people acted like he didn't exist until suddenly this year.
There's a lot of cool music out there, but the US is extremely insular and transparently cultivates a false selection of "popular" music. The silver lining is, with music and movies in that sphere trending so mediocre, it'll hopefully lessen the US's ability to project pop culture to the rest of the world, and other places will get more of a chance.
The chart that this topped is the "Country Digital Song Sales", which does not include streaming - meaning this is purely powered through purchases of the track. This is most likely a situation of an AI company buying their own single so they can advertise that "they topped a chart!", in a similar way to wine companies entering tiny competitions so they can legally call their wine "award-winning". Articles like this are advertising for AI companies: if the illusion is kept of AI as an "inevitable future", then people are more likely to invest in order to keep up.
Thank you for mentioning this, I was looking for someone to do so before I did.
This track's status was achieved with 3000 $1 downloads. So far, this is a nothingburger.
I don't understand the issue. Is the point of music the production process, or that it is being enjoyed by the listeners? People have enjoyed canned music that reuses the same recipes over and over for decades at least - probably since the first dinosaur made a chirp. AI seems to be extremely good at pandering to popular demand. Whatever people like, AI reproduces it. So this is not surprising at all.
I don't like AI at all and I have never really used it, but if the ability to make orchestral music with a click of a button means we can finally get rid of the music industry and go back to playing guitar for your friends, I'm all for it.
It's very much both. Music created by someone for themselves alone has just as much of a point as music created for millions. And in some cases, like educational ensemble settings, it's almost entirely about the process. Art-making is just as core to the human experience as art-appreciating. People get together and make music because they enjoy it, not necessarily because anyone else cares that they're doing it.
Christopher Small talks about how "musicking" is an activity, one that involves not just the listeners or performers or writers/composers, but all of them together. He strongly argues against the popular convention of conceiving of music as discrete objects (songs or pieces) and firmly believes it should be understood as an event. That event may unfold in stages (writing, production, listening) rather than happening all at the same time, but it's still an event. It's about the doing of the thing, and that necessarily includes the creators.
Part of his book Musicking: The Meanings of Performance and Listening is online if anyone wants to explore. The prelude is enough to get the gist, don't feel intimidated by the idea of diving into a whole book. It's good stuff.
Sorry, I don't know why I put that question there. Of course the creation of music is at least as important as listening to it. Some of the best melodies I've ever heard were me whistling to myself.
I guess my point was: if music creation is holy and can be desecrated by mindless repetition, countless musicians have been guilty of doing exactly that throughout history. AI is just making it easier for non-musicians to repeat popular patterns.
Because it's ragebait for engagement.
It took 12 years from the airing of this Futurama episode for it to become reality.
What should be more infuriating is that country music for the past several decades has been largely cookie-cutter songs, long before autotune came on the scene. It's no coincidence that country is the first genre to see an AI work on a chart, albeit digital sales and not one of the main charts. I think it's just a matter of time until we see AI songs on the main country charts, and then other genres. Record companies exist for one reason: to make money. Eliminate the artist and you've eliminated a cost. Of course this won't work for music from an artist like Taylor Swift, where there is a persona behind it, touring, etc., but for probably 90% of popular music, it will work well.