When I was a child, my musical taste amounted to the casual enjoyment of fragments of whatever recognizable pop songs played on the car radio on my way home from school. We (well, our parents) had a carpool thing going on, so imagine a car stuffed full of noisy kids, eating, reading, talking, whatever. The radio was on, but maybe it was ads, or comedy, or the adult driving would want to change the station to hear the news or weather report. Whether there was music, whether I was paying attention to it, and whether I found it enjoyable were all random, which was never conducive to developing real taste. Under these circumstances, you merely enjoy what's immediately pleasurable - the musical sugar, let's call it.
I became an adult, and over the years I started paying attention to music and grew to appreciate the talent and skill of the artists who create it. My taste quickly veered towards rock, a genre that features untold heights of virtuosity when it comes to guitar, bass and drums. It grew to encompass metal and prog (and here you start having more keyboards, flutes, violins). But it's not like I require a song to have at least five different time signatures before I respect it. Enter psychedelic. Punk. Broadband Internet arrived and liberalized (in part) music publishing, and now you have new genres, new creativity. There was amazing innovation on display in dubstep, for example.
All the while, the ol' music industry is busy streamlining. It's much more profitable when artists are produced rather than found; new music is pre-planned and designed by a team of people who are very knowledgeable about formulas, appeal and marketing. Variables are eliminated; we want artists who are beautiful, stable, clean, uncontroversial. Can they sing? It doesn't matter, we autotune. Can they play an instrument? Who cares, use pre-recorded tracks. The result is the purest, most refined musical sugar. It is sweet, and sweet is safe, because it's a flavor even a child can immediately enjoy.
I have no trouble believing AI can create this kind of art. It should be able to do it perfectly. Why would it remix clichés any worse than a human? It's literally a remixing machine. That's what it's for.
But as an adult, there is an additional dimension to my enjoyment of music. I want something beyond sweet; let me taste that bitter, that savory, those notes of chocolate and smoke. When I see a traditional (modern) artist shredding their heart off, I think of the years of effort it took them to get that good. When I hear lyrics so touching they marked a generation, I marvel at how there has never been, and never will be, another song quite like that.
When I hear Jon Anderson sing, I think "holy fucking shit, he's literally better than autotune." And I don't give a damn if the artist is ugly, elderly, disabled or wrote every single one of their songs during a three-year-long nonstop drug binge. They are humans who struggled and sweated to create something new, and every single one of their accomplishments is more valuable than anything that will ever come out of the remixing machine, whether the machine is powered by five audio engineers, a PR manager and twenty marketing experts or by three Nvidia GPUs.
Does that mean AI is useless, undesirable or otherwise doomed to fail? No. To a lot of people, it suffices. What do people want out of art? Different things, for sure, and that's OK. There are people who want enough comfortable repetition out of their entertainment to make recurring themes, genres and formulas profitable in all kinds of fields - TV shows, LitRPGs, FPS games, whatever. One might argue - and this is a suspicion of mine rather than anything I have hard data for - that most people want at least some predictable comfort in the content they consume. Actually parsing what's taking place in an Ursula K. Le Guin novel can demand more mental bandwidth than we have on a day-to-day basis. Sometimes we're tired and just want to see yet another anime boy win a martial arts tournament, or something.
But true artists will always be the ones I admire and respect. I don't want to go without their works; I'm definitely willing to pay to experience them. Once in a while I eat chocolates and biscuits, but too much sugar is cloying!
I'm currently helping with Brandon Sanderson's upcoming book, The Fires of December. Some of you may not be fans of his work - on a mental diet? - but I can guarantee that at least he wrote it himself, and a lot of people work very hard to make sure a good book will be published later this year (note: timeline as announced; I have no privileged information and cannot answer any questions). And when you read a passage and think "that was clever!" or "that was surprising!" isn't it cool to think that was a real human being clever and surprising?
(P.S.: Pre-emptively acknowledging the pretentiousness of my disdain for modern pop music ;) )
He also gave a talk about this that he published on his YouTube channel: We Are The Art | Brandon Sanderson’s Keynote Speech.
The surge of AI, large language models, and generated art begs fascinating questions. The industry’s progress so far is enough to force us to explore what art is and why we make it. Brandon Sanderson explores the rise of AI art, the importance of the artistic process, and why he rebels against this new technological and artistic frontier.
"Yes, the message is 'journey before destination.' It's always journey before destination."
I mostly agree with him, though I don't completely get the argument that AIs steal the opportunity for growth. I think that someone who wants to grow can still grow. You can still go through the process of "getting your diploma" even with AIs existing, and you can still become the art.
Also, even though the AI doesn't feel anything itself and doesn't get changed, I think it's possible for a human to get changed by its output.
What do you guys think?
I mostly agree with him though I don't completely get the argument that AIs steal the opportunity for growth.
I interpret that as a warning and not as the full truth. If you're using AI to do X, you aren't "flexing the muscle" that you'd use to create X, and therefore aren't getting better at X. I guess it's actually you robbing yourself of the opportunity for growth, rather than AI rotting your brain - so I read it as a pretty neutral statement about AI.
Yeah, that's more how I feel too. I went back to school this year, and it's really easy to try to skip steps. I've seen people use ChatGPT to study for a test, but they ended up memorising stuff that wasn't in the course. It could generate mock tests, so they felt pretty confident.
I'm curious about how these users generated mock tests: was it fed a wealth of other past papers, or was it generated only from the textbook?
There are many ways, but sometimes the teacher will give you a study guide so you know what to study for an exam. For example, in a marketing class, the teacher might say that you need to know about the four real costs of losing a client. The answer the AI gives sounds plausible, but it's not what the teacher taught in the class.
Testing it right now, ChatGPT's output is: Lost lifetime revenue; Cost to replace the client; Lost growth and upsell potential; Indirect damage through reputation and referrals.
The actual answer from the course would be: The cost of the lost sale; The cost of lost revenue; The cost of lost profit; The cost of negative publicity (reputational damage). It's close but not quite right. Compound those little errors and you lose quite a lot of points and you waste time since you're trying to memorize the wrong stuff.
Of course the AI would make more accurate tests if it was fed all the notes from the class, but people don't really tend to do that.
Also from what I could see, the tests that the teachers make are way more interesting than what the AI generates. The AI makes tests that are way easier and don't capture as much knowledge.
(Actually, some teachers were tasked by the school to make mini tests using AI (probably Copilot). One teacher made us answer out loud. At the end, he said that the AI was rather nice since the difficulty was really low.)
I don’t know about writing novels, but writing code with a coding agent is sort of like managing a software project. (But in easy mode because you don’t have to deal with people issues.) I don’t write the code directly, but I influence it in all sorts of ways, by pointing out bugs, by asking pointed questions, or by asking it to write tools and templates and style guides and other scaffolding.
I’m learning a lot of things, and the project itself is “learning” through evolution. I’m not learning the same things I used to learn by writing code myself, but it’s definitely learning.
There are opportunities for growth that you miss by not using these tools. Of course that’s true of any activity you choose not to take up.
For many skills there are diminishing returns from more practice. I don’t want to discourage anyone from learning by writing code yourself if you’re in the early part of the learning curve, but mixing it up a bit would probably be a good idea too. CS undergrads have few opportunities to work on large-scale projects and could probably learn things from a class where you build something bigger with a coding agent.
I easily agree that art is what we define it to be and that it is ultimately useless, which makes what AI makes not-art.
But my personal issue (a big one) with LLMs is that they can only be owned by very few: they in fact (will) become a means to further control us.
You can set up your own AI instances and run them locally. I've got an LLM set up on my server in my office, which I can then access over my local network on my phone. It keeps my data mine and uses the energy my solar panels generate for compute. I've even got my own local AI image generation. I don't use it much, but it's there if I want it.
This is slower than using something like ChatGPT or Copilot, but at least I'm not feeding into the machine.
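For anyone curious, here's a minimal sketch of one common way to do this. I'm assuming the Ollama runtime and the llama3 model as examples; substitute whatever runtime and model you prefer, and check the Ollama docs for your platform's installer.

```shell
# Pull a model onto your own disk (llama3 here is just an example;
# any model from the Ollama library works the same way).
ollama pull llama3

# Chat with it entirely on your own hardware - nothing leaves your machine.
ollama run llama3 "Explain what a force multiplier is in one sentence."

# To reach it from other devices (e.g. a phone) on your local network,
# bind the server to all interfaces instead of just localhost:
OLLAMA_HOST=0.0.0.0 ollama serve
# ...then point a client app at http://<server-ip>:11434 from the phone.
```

The same pattern (pull once, serve locally, talk to it over the LAN) applies to local image generation tools as well.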
I can do that, but I cannot build/train such a model, which means I can't control it. That goes for everyone who doesn't have access to a whole lot of GPUs and training data.
I find myself more and more frequently referring to a phrase I heard someone use last year: "AI is a force multiplier".
Which to me means that if someone with no skill uses AI, you are multiplying by zero, and you will get something with little use and little meaning (slop).
And to try and apply it to Brandon's essay, I think that AI can be a very useful tool for learning and growing. I think it will be exceptionally useful in some ways for helping people reflect and grow their various skills. And in other ways it will multiply by zero and you will emerge on the other side unchanged.
What exactly the difference is, is not entirely clear yet and I think we're stumbling through figuring that part out together as a society.
I don't completely get the argument that AIs steal the opportunity for growth. I think that someone that wants to grow can still grow.
New grad unemployment as of now is over 25%. Not being able to start their career and get all the things that come with a full-time job definitely hampers people. And then the next batch notice this and don't bother with the huge cost a diploma entails.
Sanderson talks about the journey. The growth that the artist achieves during creation.
Our AI models are experiencing growth. The creation of one piece of content might not have a direct impact on the model that made it. I don't think it works like that. But people are experiencing AI art and responding to it. Other people are developing different AI models, experimenting with different ways to interpret and manipulate data. The growth of AI is humanity's collective growth as well. It has to be; there doesn't appear to be any sentience involved in the models themselves yet.
We are trying to recreate our brains. That was the whole idea of neural networks right? We don't understand how our brains work yet. But trying to design your own version of something is a great way to learn about it. Ideally, the fields of neurology and machine learning should be developing in parallel and informing each other.
I don't believe in God. I think all the amazing things that come from the human brain could be expressed with physics and math.
When AI spits out a soulful blues cover of Warren G's "Regulate" and it actually sparks an emotional response in me, it's kind of a parlour trick. Without really understanding it, we've figured out how to hit all the right buttons in our brains to trigger those emotional responses. I don't believe the current AI models have any significant intelligence or understanding of what they are doing. But, in addition to the mathematically induced emotional response, I feel awe and excitement that human civilization is progressing down this road to understanding how our brains work! What are emotions? Where do they come from? AI art is an artifact of us taking baby steps toward answering those questions.
Being able to understand our brains would allow us to better treat them and improve them. Being able to understand brains in general and intelligence and emotion could allow us to design new and diverse types of brains. It could help our society become something amazing.
I think he hits on the real problem here: These are made to be products. The main incentive to create them is to attract investors and ultimately extract profit.
Because it seems the only way we can do anything in our current civilization is to find a way to make it profitable. Even if a project is started for the betterment of society, we have to find a way to make it profitable or it won't have any resources dedicated to it. Almost immediately, profit is the only goal. The whole project is twisted towards profit regardless of whether it's harmful or helpful.
Like OpenAI starting out open source and non-profit, and all of that getting chucked out the window a few years later.
So we get stuck in this loop where we see tech being used in harmful ways and resent the tools rather than the system that abuses them.
I find AI art really compelling. Even the bad, weird stuff. It makes me think about what art is, where it comes from, how humans make it and experience it. It's a type of art we've never seen before! I don't really care whether it's "bad" or "authentic" or whatever.
It seems clear to me that all the negative feeling and resentment people have about AI stems from the inequality and fear that are so prevalent in the world today. People are right to be angry about those things. I just wish we could direct the anger at the system that creates them instead of the tools it uses.
Tax these giant tech companies out of existence and use that money to fund R&D on AI and a million other things that can help everyone and be owned by all of us collectively. Free markets are great for efficiency and commodities. Just let them run in their own walled-off sandboxes. Important, long-term projects need to be done on purpose and mindfully.