Nice to see them be honest about how this isn't really panning out. Everyone wants AI except the consumer.
That said, are there any functional-as-of-right-now use cases for NPUs on consumer machines? All of the "AI" I see people yammering about are just web APIs to cloud computation services.
Facial recognition in cameras, background removal in video calls, noise suppression, and a bunch of other little niceties that all benefit from an NPU.
I'll add another one as an American living in Germany whose German still isn't good enough to read bureaucratic webpages: local language translation, without having to rely on Google Translate. Firefox has its own language-translation models that run on your machine and work quite well.
I didn't know those used the NPU, but that's pretty cool.
To be clear, I don't know if Firefox's implementation uses NPUs. It may not, as there are so many out there with no standardized interface to them yet. But it is neural-network based, so there's no reason it shouldn't run on an NPU.
macOS's system-wide translation, right in the context menu, is such a useful tool for people who write in more than one language. Its convenience is unbeatable, and it's one of the things I miss the most when using something else.
This reminds me of how Google used machine learning in its many services before LLMs. Sure, it was there and powered many features (including search), but the user didn't need to know or care how the algorithms worked. Similarly, these NPU chips could be in the background.
For developers to care about NPUs and use them in their apps, though, they need to be exposed in an API, hopefully in a standard, portable way. As we've seen with GPUs, that can take some doing.
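In the absence of one standard API, portable apps end up doing backend selection at runtime. A minimal sketch of that fallback pattern (the backend names loosely mirror onnxruntime-web's "webnn"/"webgpu"/"wasm" providers; treat the exact list as illustrative, not any vendor's definitive API):

```javascript
// Prefer an NPU-backed execution provider, then GPU, then CPU.
// Backend names here are assumptions for illustration.
const PREFERRED = ["webnn", "webgpu", "wasm"];

function pickBackend(available) {
  // Take the first preferred backend the runtime actually reports.
  for (const b of PREFERRED) {
    if (available.includes(b)) return b;
  }
  return "wasm"; // CPU fallback always works
}

console.log(pickBackend(["wasm"]));           // wasm
console.log(pickBackend(["webgpu", "wasm"])); // webgpu
```

The point is less the three strings than the shape: until NPUs get a common interface, every app carries its own preference list like this.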
For web developers, I see that Chrome has an experimental Prompt API that’s hidden behind a flag. It looks like there’s been little progress coming up with a web standard.
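Because it's behind a flag, code touching it has to feature-detect. A hedged sketch, assuming the `LanguageModel` global from the current explainer (the surface has already changed once, it used to hang off `window.ai`, and may change again):

```javascript
// Hedged sketch of Chrome's experimental Prompt API (behind a flag).
// `LanguageModel` is an assumption from the explainer, not a settled API.
async function localPrompt(text) {
  if (typeof LanguageModel === "undefined") {
    // Feature-detect: only Chrome with the flag enabled exposes it.
    return "Prompt API not available";
  }
  const session = await LanguageModel.create();
  return session.prompt(text);
}

localPrompt("Summarize NPUs in one sentence.").then(console.log);
```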
More reading: both of these provide access to the NPU in a hardware-agnostic manner on Windows and macOS: https://learn.microsoft.com/en-us/windows/ai/new-windows-ml/overview
On Linux it's all still a bit up in the air. Conflicting userspace components and such. https://docs.kernel.org/accel/index.html
Dell is making a canny choice here. There's no consumer killer app (yet) that demands an NPU for portability (reduced battery consumption compared to a GPU), even if common applications are taking advantage of NPU functions. As NPUs become a standard feature of new laptops and phones, it's just a matter of development time before they're fully used in games, audio, video, and image processing using task-specific small models. There's a good Ars Technica discussion here.
I've been following the vanishingly small "spatial computing" niche market because reasons. It's not LLM "AI", but one of the less visible machine learning uses that might be more quietly transformative.
There's a spatial computing application suite which requires an Intel NPU. Aside from Spacetop's frankly absurd pricing (it's pitched for enterprise use and there are no serious competitors yet), the ability to handle the visual transforms close to the end user makes the experience almost good enough.