But I’m increasingly of the opinion that sharing unreviewed content that has been artificially generated with other people is rude.
That echoes my feelings about an experience I had recently, where a coworker submitted a Pull Request (PR) for me to review.
Typically, you'd spend 3-5 minutes writing a description of your PR that gives a useful and concise overview of your work. But instead, they dumped a veritable wall of slop generated by an LLM that very literally summarized their PR (e.g. "This block of code was added to this path/to/file. This achieves [SOME INACCURATE DESCRIPTION OF THE IMPACT]").
I was really frustrated by that interaction. It felt like they were signaling that their time was too valuable to spend on this pull request, but that somehow my time wasn't valuable enough to avoid reading their slop.
This reeks of Copilot Workspace. It loves mentioning what files it modified, as if the diff doesn't make it obvious.
I'm all for using AI to help speed up development, but at the bare minimum people should edit the pull requests to contain only what's needed, and also prevent it from editing the README every single fucking time.
And test, for the love of god. I've lost count of how many times an LLM has hallucinated a library that doesn't exist, or a method/object that isn't real, to the point that I rarely reach out to AI and instead go back to StackOverflow because I feel I'm slower with AI than without it. But maybe I'm just a terrible prompt engineer.
EDIT: Oh, that was at work. I thought it was a FOSS project (in which case, low-effort contributions from randos are nothing new). If a coworker of mine tried to slop all over my codebase I'd probably yell at them in front of their manager, and nitpick to death every single peer review of their future contributions. It's one thing to mess around in your free time, it's another to do that at work.
I don’t know how software development or etiquette or any of that stuff applies to your situation, but if it was me, I’d just reject/delete the change request and ask your least-competent AI to write an inaccurate page-long email explaining why that code change request won’t be used…
But then again maybe you shouldn’t take my advice, being just a random stranger with no skin in the game and no professional reputation to uphold…
I love it. Although the word "slop" doesn't mean this at all (non-native speaker here, Google says it means waste water or spillage), to me it sounds like it means a pile of grease that just stains everything, which is a fitting name for AI-generated crap.
The prevailing connotation here is probably that of low-quality food waste that is fed to animals, especially pigs.