It's cool to see that kernel developers are making use of AI. In particular, the use in AUTOSEL seems pretty novel and very useful!
So who owns the submitted code? The AI prompter, the developers of the AI, or the person who wrote the original code the AI used in its model?
I think you may be misunderstanding the use here. They're not using LLMs to write the Rust code itself. AUTOSEL integrates an LLM (actually, multiple) into its selection logic. It creates embeddings for each commit and narrows down the best options for inclusion in a patch. It's an upgrade over their previous approach, which used an earlier neural-network design.
This article goes into more detail on how it works.
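To make the selection idea concrete: this is a toy, self-contained sketch of "embed each commit, then rank candidates against a query", not AUTOSEL's actual pipeline (which uses real LLM embeddings and multiple models). A simple bag-of-words vector with cosine similarity stands in for the embeddings; the commit ids and messages are made up.

```python
import math
import re
from collections import Counter

def embed(text: str) -> Counter:
    """Toy 'embedding': a bag-of-words token-count vector."""
    return Counter(re.findall(r"\w+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def rank_commits(commits: dict[str, str], query: str, top_k: int = 2) -> list[str]:
    """Return the ids of the top_k commits whose messages best match the query."""
    q = embed(query)
    return sorted(commits,
                  key=lambda cid: cosine(embed(commits[cid]), q),
                  reverse=True)[:top_k]

# Hypothetical commit messages, purely for illustration.
commits = {
    "a1b2c3": "fix null pointer dereference in ext4 mount path",
    "d4e5f6": "docs: update maintainer entries",
    "0789ab": "ext4: fix crash on corrupted superblock mount",
}
print(rank_commits(commits, "ext4 mount crash fix"))  # → ['0789ab', 'a1b2c3']
```

The real system replaces `embed` with embeddings from a language model, which capture meaning rather than exact token overlap, but the narrowing-down step is the same shape: score every candidate commit against what you're looking for and keep the best few.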
I get that that's how it's being used now, but the linked article basically asks the same question going forward, under the "An official kernel AI policy is needed" section.
The disease is spreading everywhere, it was only a matter of time.
This article paints a pretty positive picture of using AI for coding and how it's a useful tool when used properly. Not sure what the 'disease' you are referring to is.
Ceding an incredible power (the ability to create software) to corporations who can restrict access to that power for money or ideology is a terrible thing, even if it is a "useful tool".
If you’re talking about software developers ceding the power to write software for high salaries because AI can do it cheaply, that’s a real potential issue.
But if you’re talking about online LLMs that exist today being restricted in the future, we’ve been writing software without LLMs pre-2021 (before Copilot), and local LLMs are already good enough to do code completion and basic refactors (although online models are still better).
Also, money has always allowed non-technical people to create highly complex software, by hiring technical people. LLMs strictly decrease that cost. If we eventually get a model that's as smart and productive as an experienced coder but costs $2,000/mo, consider that rich people can already pay an experienced coder $10,000/mo (a $120k salary); so the only difference (apart from that job going away) is that slightly-less-rich people gain that power.
I'm not talking about salaries, I'm talking about my power as an individual to build and ship software. In a world where software development becomes dominated by AI tooling to the degree that it diminishes the ecosystem for human programmers, the owners of the AI tooling have incredible power. I hope local LLMs etc. can keep up but I'm skeptical they can.
You'd have the power to build and ship any software you can build today. The difference is that some people would be able to build more complex software, using e.g. libraries that are too complex for humans to understand (but these are new libraries, or new versions of existing libraries). However, we already have this in the sense that very rich people can hire expert programmers in multiple specific fields (e.g. graphics, networking, mobile) to collaborate and build whatever they want; so the real difference is that less rich people can use AI to achieve the same complexity.
Also note that despite having complex word processors, productivity tools, video games, etc. designed by large companies, many people prefer simpler equivalents (Markdown, CLI tools, indie games). And although people have raised their standards for software, it seems to be tapering: most gamers wouldn't be happy with 2000s-era 3D graphics in an AAA game, but more would be satisfied with 2010s-era graphics, and many wouldn't notice a difference between graphics from 2020 vs. 2025. Even if LLMs substantially increase productivity, I'm skeptical they'll make software of current quality unacceptable, especially if they're restricted or still controversial.
Like Microsoft and Office? Can you even picture a world without it anymore? When is the last time you saw a typewriter in an office? I feel AI will have the same effect on coding, making coders hugely more efficient, just as Office did for administrative workers.
At least the last time I checked Office can run locally without an internet connection and without a subscription. If it's been enshittified by Microsoft to require those things, well, there are other options that don't.
Both of these things are possible with LLMs too (download any open-weights model from Hugging Face to see for yourself); and while I'm rather skeptical of the "AGI by 2027" types, to say the very least, it's pretty undeniable that the same output quality has gotten cheaper (more efficient, less power needed) over recent years.
I don't think it's completely unreasonable to assume we'll have an almost-state-of-the-art model in a phone-sized, local version in the future – especially once you factor in consumer hardware (chip and spec) improvements.
In fact, the Office comparison seems pretty apt now that I'm thinking about it: there's the "polished business" version and the "free/libre" version of an otherwise fairly identical product. And if the bubble does not burst entirely, it seems like closed source is, for some reason, winning again in terms of popularity, at least currently.
Edit: Yes, running locally on most consumer-grade hardware will be slower and/or worse quality-wise at the moment. The point is that it’s not impossible in principle, and I have no reason not to have faith in the OSS/open-weights community to improve things for “local model usage” enthusiasts, as has been the case for every piece of open computing/software ever up until the present.