AI is all the rage, but most people are still trying to figure out what it is and how to best utilize it. As if choosing the right AI software weren’t hard enough, now we’re starting to see hardware that's supposed to supercharge the use of AI. Neural processing units, AI PCs, Copilot+ PCs...what does it all mean for organizations looking to deploy the right hardware to the right teams?
In this episode, we aim to cut through the marketing jargon to get a clearer view of what these devices are and whether they’re worth considering.
A PC by any other name would work as hard
Victoria and Michael spoke to SHI’s resident AI PC expert, Ruva Chimusoro, to discuss whether there are any tangible benefits to buying an AI PC, or if it’s all just marketing buzz.
Generally speaking, AI PCs are computers with a neural processing unit, or NPU. Every computer already has a central processing unit (CPU) and a graphics processing unit (GPU); the NPU works alongside them to handle AI and machine learning tasks. NPU performance is measured in trillions of operations per second, or TOPS.
It’s true that most modern PCs can execute AI and machine learning tasks even without an NPU. Every time you’re on a conference call, your machine is running AI tasks without you realizing it: things like noise suppression, background blur, transcription, and live captions are all utilizing machine learning. Without an NPU, those features get pushed to the CPU and GPU, putting extra strain on the main processors that should be focused on other critical tasks. A device with an NPU can offload those tasks, reducing the load on the CPU and GPU and providing noticeable performance and battery life improvements.
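To make that offloading idea a bit more concrete, here's a minimal sketch of how an application might ask an inference runtime to prefer an NPU-backed execution provider and fall back to the CPU when one isn't available. It assumes ONNX Runtime is installed; the model file name is a placeholder, and which providers actually show up depends on the machine and the onnxruntime build.

```python
# Minimal sketch: prefer an NPU-backed execution provider when present,
# otherwise fall back to the CPU. The model file is hypothetical, and the
# available providers depend on the hardware and onnxruntime build.
import onnxruntime as ort

available = ort.get_available_providers()

# Common NPU-backed providers: QNN (Qualcomm) and DirectML on Windows.
preferred = [p for p in ("QNNExecutionProvider", "DmlExecutionProvider") if p in available]

session = ort.InferenceSession(
    "noise_suppression.onnx",  # hypothetical model file
    providers=preferred + ["CPUExecutionProvider"],
)
print("Running on:", session.get_providers()[0])
```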
We’ll make it up as we go
While the general understanding is that an NPU is what constitutes an “AI PC,” it’s still a bit of a nebulous term. Apple has included NPUs in their in-house designed processors since the iPhone X in 2017, utilizing them for machine learning tasks before the current “AI” hype. So technically all recent Macs, iPhones, and iPads are already AI PCs, though Apple doesn't use that term. Microsoft, however, decided that an AI PC needs not only an NPU, but also its Copilot software and a dedicated Copilot key on the keyboard.
Then there are the recommended specs. The general consensus is that a device needs at least 16 GB of memory and 256 GB of storage to sufficiently run AI tasks, though of course, more is better. For Apple’s part, they don’t seem too concerned with this. The MacBook Air, Mac mini, iMac, and even the new iPhone 16 lineup ship with 8 GB of memory but will still be able to use Apple Intelligence features. Considering Apple’s unified memory architecture, it may not be a fair direct comparison, but it’s worth noting.
On the software side, things are still very much up in the air...literally. Most AI tools that utilize large language models (LLMs), including Microsoft Copilot, offload processing to the cloud. So why bother buying a new device with an NPU if these tools are just sending the work to data centers anyway?
The future is (almost) now
It’s true that most LLMs need more horsepower than is available on modern-day machines, but that won’t last long. With Apple Intelligence, Apple has created a number of small language models (SLMs) that can run on device with the help of the NPUs in their custom silicon. Microsoft looks to be headed down a similar path with Copilot+, an umbrella term for a set of AI tools that require AI PC hardware (that is, an NPU).
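As a rough illustration of what "running a language model on device" means in practice, here's a minimal sketch using the Hugging Face transformers library to load a small model and generate text locally, with no cloud round trip. The model name is a placeholder, and whether the work actually lands on an NPU, GPU, or CPU depends on the installed backend; neither Apple's nor Microsoft's on-device models are exposed this way.

```python
# Minimal sketch: generating text with a small language model entirely on
# the local machine using Hugging Face transformers. "small-slm-of-choice"
# is a placeholder -- substitute any small instruction-tuned model. Where
# the math executes (CPU, GPU, or NPU) depends on the installed backend.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "small-slm-of-choice"  # hypothetical model identifier
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = "Summarize why an NPU helps with on-device AI:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=60)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```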
Enter Copilot+ PCs, yet another entry into the AI PC cage fight. Copilot+ PCs are specially designed to run Copilot+ tools, including Microsoft Recall (discussed back in episode 4), directly on the device. A device must have an NPU capable of 40+ TOPS to be considered a Copilot+ PC. According to Microsoft, Qualcomm, AMD, and Intel all offer capable processors in their Snapdragon X, Ryzen AI 300, and Core Ultra 200V series, respectively.
Where do we go now?
So where does that leave us? Individuals or smaller organizations who are interested in the latest technology and want to future-proof their investments may do well to look at purchasing AI PC hardware. It seems like new AI tools and features are rolling out daily, and having a device that can run AI tasks without sacrificing performance or battery life may be a huge advantage.
For larger organizations, Ruva recommends considering personas and their use cases. “When we're thinking about these devices, it's important to think about the different personas and the types of applications that they may be using on ...their day-to-day basis,” she says. “So someone in admin may not be using the same amount of memory as maybe a developer or somebody that's in sales that has to have multiple applications open at one time.”
Of course, it may make sense to sit back and wait until the picture becomes a little clearer.
---
Be sure to check out WeGotYourMac.com for more episodes and content on Mac adoption and other end-user computing topics.
This episode is sponsored by OtterBusiness. For more than 25 years, innovations from Otter Products have been deployed by some of the largest enterprise and public sector organizations in the world. They have partnered closely with end-customers, strategic partners, and authorized resellers to support their efforts and gather feedback on how they can evolve their products and support model to ensure continued success. These inputs, along with an iterative approach, have become the foundation for the newly created OtterBusiness brand and supporting organization. OtterBusiness is committed to innovating and activating business solutions that break barriers and empower customers and partners to unlock their full potential.