In the past year, every tech company has been talking about NPUs.
As you might have guessed, it's all due to the ongoing hype cycle around AI.
Everybody wants a piece of the AI pie.
The neural processing unit is currently part of the CPU and is specifically designed to handle machine learning processes. Image: Microsoft
Either CPU will still offer an NPU with 45 TOPS.
What does that mean?
Well, the new PCs should be able to support on-device AI.
Qualcomm shared how its Snapdragon X Elite chip could handle AI processes like live transcriptions. Photo: Kyle Barr / Gizmodo
The CPU, or central processing unit, is essentially the brain of the computer, processing most of the user's tasks.
It's a specialized kind of processor designed to handle the mathematical computations specific to machine learning algorithms.
It's specifically engineered to handle the intense demands of neural networks without leaning on any of the system's other processors.
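To make that concrete, here is a minimal sketch (purely illustrative, not any chipmaker's actual implementation) of the multiply-accumulate math that neural networks run constantly; an NPU's job is to crunch enormous batches of these operations in parallel:

```python
# A minimal sketch of the multiply-accumulate (MAC) math that dominates
# neural network inference. Real NPUs run millions of these in parallel in
# dedicated hardware; the numbers below are purely illustrative.
def dense_layer(inputs: list[float], weights: list[list[float]]) -> list[float]:
    """Compute one fully connected layer: each output is a sum of input * weight products."""
    outputs = []
    for neuron_weights in weights:
        accumulator = 0.0
        for x, w in zip(inputs, neuron_weights):
            accumulator += x * w  # one multiply-accumulate (MAC) operation
        outputs.append(accumulator)
    return outputs

# Example: 3 inputs feeding 2 neurons
print(dense_layer([0.5, -1.0, 2.0], [[0.1, 0.2, 0.3], [0.4, 0.5, 0.6]]))
```

A single image or voice query can require billions of these operations, which is why offloading them to dedicated silicon matters.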
Apple has had NPU capabilities in its M-series chips for years before the M4. Screenshot: Apple / YouTube
The standard for judging NPU speed is in TOPS, or trillions of operations per second.
Currently, it's the only way big tech companies are comparing their neural processing capabilities with each other.
Its also an incredibly reductive way to compare processing speeds.
Google showed off its new AI-based ‘Ask Photos’ feature at this year’s I/O. Gif: Google
Qualcomm explains that TOPS is just a quick-and-dirty math equation combining the neural processor's speed and accuracy.
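As a rough back-of-envelope version of that math, the sketch below uses the commonly cited formula of MAC (multiply-accumulate) units times two operations per MAC times clock frequency; the unit counts and clock speed are hypothetical, not any vendor's published spec:

```python
# Rough back-of-envelope TOPS estimate, assuming the commonly cited formula
# TOPS = MAC units x 2 operations per MAC x clock frequency.
# The MAC count and frequency below are hypothetical, not real chip specs.
def estimate_tops(mac_units: int, frequency_ghz: float) -> float:
    ops_per_second = mac_units * 2 * frequency_ghz * 1e9  # each MAC = 1 multiply + 1 add
    return ops_per_second / 1e12  # trillions of operations per second

print(f"{estimate_tops(16_384, 1.4):.1f} TOPS")  # ~45.9 TOPS for this hypothetical NPU
```

Precision matters too: a chip quoting TOPS at INT8 isn't directly comparable to one quoting it at INT16, which is part of why the metric is so reductive.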
And even then, none of this delineation of processors is set in stone.
There's also the idea of GPNPUs, which are basically a combo platter of GPU and NPU capabilities.
Google talked about NPUs and AI capabilities as far back as the Pixel 2.
China's Huawei and Taiwan's Asus debuted NPUs on phones like 2017's Mate 10 and the 2018 ZenFone 5.
Computer chips had already sported neural processors for years before 2023.
For instance, Apple's M-series CPUs, the company's proprietary ARM-based chips, already supported neural processing capabilities in 2020.
The M1 chip had 11 TOPS, and the M2 and M3 had 15.8 and 18 TOPS, respectively.
And what iPad Pro AI applications truly make use of that new capability?
Not many, to be honest.
Perhaps we'll see more in a few weeks at WWDC 2024, but we'll have to wait and see.
For the longest time, it was the opposite.
Software makers would push the boundaries of whats available on consumer-end hardware, forcing the chipmakers to catch up.
But since 2023, we've only seen some marginal AI applications capable of running on-device.
Most demos of the AI capabilities of Qualcomm's or Intel's chips usually involve running Zoom's background blur feature.
Instead, we're limited to relatively simple applications with Gemini Nano on Pixel phones, like text and audio summaries.
Google's smallest version of its AI is coming to the Pixel 8 and Pixel 8a.
On-device AI is still hampered by the lack of processing power in consumer-end products.
Having AI run on-device benefits users and the environment.