Are we at all surprised? The hunger for growing AI capabilities is becoming a little ridiculous. I hope it ends up becoming a polished turd.
Hope they regulate the crap out of it like GDPR did last decade. You can't double/triple dip on monetizing user data between ads, data collection/selling, and AI training and expect everyone to just be cool with it. At least the former two weren't being pitched as a way to reduce labor.
This all circles back to the same issue of these endless TOS agreements that litigate away your life, but are likely unenforceable for anyone who could actually fight them in court.
Society needs a solution for things like this, and it's going to have to come from a country that gives a shit, which sure looks unlikely to be the US.
I do understand that this is the "blood" that these supposedly free services have been built on, but it's absurd how the need to squeeze every drop out of everything has turned the world into a constant surveillance and plagiarism machine.
The fact that it’s even remotely possible (and actually it seems incredibly easy, not challenging at all) for a company with a bajillion dollars to just keep paying money so a court case never actually concludes is a huge problem with the justice system.
I understand that it’s an incredibly difficult problem to regulate without also breaking some other part of the justice system, but it needs to be changed somehow.
If companies are going to dump resources into busted, inadequate tools to force on us, the least they could do is use the data they're already collecting from me to make it suck less lol
They'll only do that if you pay for the Sucks Less Tier. They're also adding ads to the cheaper Sucks the Same Amount tier and increasing the amount in the Sucks More Somehow tier.
Sign up for the Sucks The Same Amount family plan and save $3 a year!
I think this is pretty much required for any AI product to function. As more and more external sources (e.g., Reddit) start closing their doors to AI scrapers, the only way stuff like Copilot can continue to improve is to suck up data from its users. I suspect if/when too many people decide to opt out, the controls to do so will become more and more obfuscated, and the "kinds" of data that you can choose to keep private will be whittled away to only the most sensitive.
Or, perhaps, they pay for the data they want to consume? Like they would for any other business?
If they weren't so shifty with OpenAI it might have helped, but the short-term aspects of AI are just tarnished in my eyes. They spent decades C&D-ing any fan works, so I won't feel bad just because now they want to walk all over the copyright hell they helped create.
We will soon start using consumer data from Copilot, Bing, and Microsoft Start (MSN) (including interactions with advertisements) to help train the generative AI models in Copilot.
We will also make it simple for consumers to opt-out of their data being used for training, with clear notices displayed in Copilot, Bing, and Microsoft Start. We will start providing these opt-out controls in October, and we won’t begin training our AI models on this data until at least 15 days after we notify consumers that the opt-out controls are available.
So it's their own free services whose data they're using? Seems alright to me, but I don't know what the practical use of search query data is.
The implication is that only business ("commercial") customers are protected. I don't trust that all sensitive information from my personal licensed products will be protected adequately.
These changes will only apply to consumers who are signed into their Microsoft Account
MS considers a "Microsoft Account" a personal account. This would not include customers covered under an enterprise agreement. Those folks with E3/E5/etc. licenses have had "no training" protections since last August unless their admins went out of their way to disable it.
I guess I'm bad at reading between the lines but that's crazy if true.
Lovely, when combined with MS trying to kill local accounts, badgering users into using MS accounts, removing workarounds to avoid MS accounts, and then pushing users to upload their personal files to OneDrive. (Blech!)
So let me get this straight: MS wants to scrape data not only from search queries, but from their AI service that companies use to write in-house code? I can't imagine everyone will be cool with that.
A couple years back I started seeing stickers saying "Microsoft ❤️s Linux" on the work laptops of Microsoft employees. I didn't realize they loved Linux so much they'd tank all their enterprise business for it.
Microsoft Copilot and GitHub Copilot are two different products (confusingly, given that they're both Microsoft products). The business license for GitHub Copilot comes with stronger contractual assurances that they aren't using your data than the personal license does (and it's thus more expensive per user).
Gotcha. I don't know how much I trust such assurances, considering the state of things, but maybe that's just me and my biases.
They're the types of assurances you'd be able to sue over later, so I'd trust them reasonably well. Certainly more than I'd trust anything in the ToS for private use.
It just seems to me that big corps like MS are more willing than ever to change terms, even ignore them completely, if they think they can get away with it. It's worth remembering that the third step in Doctorow's Enshittification process is "abuse those business customers to claw back all the value for themselves."
But I suspect you're right. Microsoft do tend to have a functional sense of how far they can push things. Faint praise, I suppose.
Yeah, I def agree that corps like Microsoft will push things as far as they can. But they can push individuals a lot farther than they can push business customers without pushback. Faint praise for sure, but since a lot of companies would be paying only for the guarantees of not using their code/data for training in this situation, I think Microsoft knows exactly how far they can push it.
I'm sure companies are going to love this new worry about keeping IP in house.
“Oh no worries, tiny one-person freelance business — as long as you’re paying for our Enterprise licenses then you’re safe!
What’s that? You can’t afford the Enterprise license fee every year plus an IT professional to make sure all the group policy stuff is correctly set up to not eat all your data? That’s a shame, I guess it’s to the data mines with you! I sure hope all your past and future clients are okay with us slurping up all their data off your backups and project files too!”
Maybe 2024 will finally be the year of the Linux desktop.
Every year since 1998 has been the year of the Linux desktop! :-D