“AI PCs” were everywhere at CES 2024 last week, and companies like Intel, AMD, NVIDIA, and Qualcomm are all touting how great their hardware is at running AI tasks. As Microsoft put it, this is “the year of AI-powered Windows PCs.”
But since these “AI PCs” are already on shelves and you can buy them right now, we need to cut through the hype and focus on what you’re getting for your money.
Spoiler alert: These PCs aren’t all they’re cracked up to be, and if you’re expecting something transformative when you buy one at the start of 2024, you’re going to be disappointed. They might one day deliver a lot of cool features — just not yet.
Let’s talk about what’s actually going on — and why, even if you’re passionate about generative AI (genAI), you should be skipping this wave of AI-accelerating chips and turning to different hardware entirely — hardware you may already own!
Want to stay up to date on Windows and the future of your PC? My free Windows Intelligence newsletter delivers all the best Windows tips and tricks straight to your inbox, along with the news that really matters. Plus, you’ll get free copies of Paul Thurrott’s Windows 11 and Windows 10 Field Guides (a $10 value) just for subscribing!
Current AI isn’t ‘happening on your computer’
Before digging into the details, it’s important to understand how most current genAI tools actually work. When you use Copilot on Windows 11, pull up ChatGPT, turn to Adobe Firefly to generate images, or interact with similar popular genAI tools, the work isn’t actually being done on your computer.
Instead, all that processing is happening in a data center somewhere — incredibly powerful computers are using a lot of resources and electricity to run the AI model and generate an image of a dog having a birthday party or a response to your question.
You can do a lot of this locally on your computer — witness open-source tools such as the Stable Diffusion text-to-image model or the Llama large language model. But, to get good results, you’ll generally need a PC with solid processing power. In particular, you’ll need a very fast GPU — the same kind of hardware that people were recently using to mine cryptocurrencies.
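If you’re curious what “running it locally” actually looks like, here’s a minimal sketch using the open-source diffusers library in Python. Consider it illustrative only: the model checkpoint and prompt are placeholders, and it assumes an NVIDIA GPU with CUDA plus the torch and diffusers packages installed.

```python
# A minimal sketch of local image generation with Stable Diffusion.
# Assumes an NVIDIA GPU with CUDA, plus the `torch` and `diffusers` packages;
# the model ID and prompt below are illustrative placeholders.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # any Stable Diffusion checkpoint works here
    torch_dtype=torch.float16,         # half precision to fit in typical GPU memory
)
pipe = pipe.to("cuda")  # move the model onto the GPU, where the heavy lifting happens

image = pipe("a dog having a birthday party").images[0]
image.save("dog-birthday-party.png")
```

The line that moves the pipeline onto the GPU is the part that matters here: the whole model runs out of the graphics card’s video memory, which is why a fast GPU makes such a difference.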
The dream is a mainstream PC that can run these genAI models locally, on your own hardware — with no data sent to a central processing center. But not everything is going to start happening on your PC overnight. We’re moving toward a more “hybrid” future, where the most demanding tasks are sent to data centers while simpler tasks, such as delivering contextual suggestions or helping you rewrite that email, take place on your computer.
The promise of neural processing units (NPUs)
We’re all talking about AI PCs thanks to recent hardware advances — specifically, the addition of a “neural processing unit” (NPU) in Intel’s newest Meteor Lake chips and recent AMD chips.
The promise is that these NPUs are dedicated, low-power hardware that can run genAI models locally and accelerate AI processing. The idea is that soon every PC you buy will have an NPU, and because the NPU sips power, genAI tasks can run in the background even on battery power.
PC makers aren’t alone here. Do you have a Google Pixel 8 or Pixel 8 Pro phone? Google’s Tensor G3 chip does something similar — in fact, Google says it “[paves] the way for on-device generative AI.” In the real world, it enables AI features such as summarizing audio in the Recorder app and generating smart replies in the Gboard keyboard.
On a Pixel phone, that hardware doesn’t let you run Google’s powerful Bard AI experience locally; instead, it runs a smaller model named Gemini Nano. In the same way, low-power NPUs will enable more AI experiences on your PC — but they won’t let you run Copilot or ChatGPT locally anytime soon.
Yes, as we saw at CES 2024, the hardware is here — you can buy “AI PCs” with NPUs! But just because the hardware is here doesn’t mean anything transformative has happened yet.
Windows isn’t ready for ‘AI PCs’ yet
Given Microsoft’s chatter about how this is the year of AI-powered Windows PCs thanks to NPUs, you might assume there’s a lot you can do right now. If so, you’re going to be disappointed.
In fact, Microsoft’s AI PC marketing strategy is the perfect example of what’s going on. The company is pushing a new Copilot key as part of the AI PC push. That key will launch the same old Copilot sidebar you can access on other Windows PCs, with no extra “AI PC” features that use that fancy hardware. (Maybe that will change in the future.)
Windows 11 can barely do anything with an NPU at the moment. Out of the box, all you get is “Windows Studio Effects” for video and voice meetings — things like blurring your background and making it seem as if your eyes are looking directly at your camera when they’re not.
Paul Thurrott, who’s been covering Windows for decades, just reviewed a 2024 version of the HP Spectre x360, a laptop that comes with an NPU. After trying Studio Effects, he said that he “[assumes] that pushing this work off the CPU and GPU improves general system performance and battery life.” However, he says “this software is not demonstrably superior, quality-wise, to the third-party webcam enhancements I’ve used in review PCs over the past year, none of which require an NPU.”
In other words, Windows Studio Effects, which is the only feature built into Windows 11 that uses an NPU, doesn’t deliver anything you couldn’t do with traditional webcam-tweaking software.
As far as Windows itself goes, we’re all waiting with bated breath for the promised “AI release” of Windows that should launch late this year, according to leaks reported by Windows Central. That release will supposedly deliver a lot of NPU-powered features that will improve how we work with our computers, although Microsoft hasn’t commented yet.
Still, the promise of Windows itself using that NPU is nearly a year away. Don’t buy an AI PC and expect some promised AI transformation of your operating system today.
You can’t do much with an NPU
Since Windows itself can’t do much with an NPU, you’ll have to turn to third-party software. When Intel demonstrates an NPU accelerating genAI workloads, it shows off plugins for open-source applications: Stable Diffusion in the unfortunately named GIMP image editor and AI tools in the Audacity audio program. You can hunt down and configure these plugins yourself to take advantage of an NPU on your PC.
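If you want to check whether your own machine’s NPU is even visible to this kind of software, Intel’s OpenVINO toolkit (the runtime those plugins are built on) can list the compute devices it detects. This is just a quick sketch under a few assumptions: a recent openvino Python package and up-to-date drivers on a Meteor Lake machine.

```python
# Quick check of which AI-capable devices OpenVINO can see on this PC.
# Assumes a recent `openvino` package (2023.2 or later) and current drivers;
# on a Meteor Lake laptop the NPU typically shows up as a device named "NPU".
from openvino.runtime import Core

core = Core()
print(core.available_devices)  # e.g. ['CPU', 'GPU', 'NPU'] -- the exact list varies by machine

for device in core.available_devices:
    # Print a human-readable name for each device the runtime found
    print(device, "->", core.get_property(device, "FULL_DEVICE_NAME"))
```

If “NPU” doesn’t appear in that list, no plugin is going to use it, no matter what the laptop’s marketing says.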
But there’s not a lot else. PCWorld’s Brad Chacos was on the ground at CES, and he notes that “NPU-driving AI isn’t compelling yet.” Manufacturers are showing off little built-in tools that use the NPU, like improved microphone noise reduction and an “intelligent AI engine” that detects whether to spin up (or down) your PC’s fans.
As you might imagine, these are just slight variations on things computers already do, with or without an NPU. (In fact, at CES more than five years ago, I remember seeing a company advertise an “AI laptop” that used some unusual software to control how fast its fans spun.)
GPUs still beat NPUs for AI
So NPUs can’t do much yet for the average person — they’re a way to enable powerful future software. That’s the promise of NPUs right now, anyway.
But things get a little muddier when you start looking at benchmarks. PCWorld’s Mark Hachman was blunt: “You probably already own an AI PC.”
What the heck does that mean? In short, these NPUs aren’t all they’re cracked up to be. Current genAI workloads — like generating images with Stable Diffusion and running open-source LLMs directly on your PC — perform much better using a traditional GPU than one of these new NPUs.
That’s because NPUs really are about efficiency and low power usage — you can imagine a future version of Windows that’s always performing some type of AI task in the background with that NPU, and your laptop still getting good battery life. But if you want to run current AI workloads, benchmarks show you should go with a powerful GPU. That means getting a power-hungry desktop or laptop with a powerful dedicated GPU.
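To make that concrete: when people run open-source LLMs locally today, they typically offload the model’s layers onto the GPU, and performance scales with how much of the model fits in video memory. Here’s a rough sketch using the llama-cpp-python bindings; the model file is a placeholder, and the library has to be built with CUDA support for the GPU offload to do anything.

```python
# A rough sketch of running an open-source LLM locally with GPU offload.
# Assumes llama-cpp-python compiled with CUDA support and a quantized (GGUF)
# model file already downloaded; the path below is only a placeholder.
from llama_cpp import Llama

llm = Llama(
    model_path="./llama-2-7b-chat.Q4_K_M.gguf",  # placeholder: any local GGUF model
    n_gpu_layers=-1,  # -1 offloads every layer to the GPU; set 0 to compare CPU-only speed
    n_ctx=2048,       # context window size
)

result = llm("Q: What does an NPU do? A:", max_tokens=64, stop=["Q:"])
print(result["choices"][0]["text"])
```

Run it once with every layer offloaded and once in CPU-only mode and the gap those benchmarks describe becomes obvious; the NPU never enters the picture here, because tools like this currently target CPUs and GPUs.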
‘AI PC’ buying advice for 2024
So now let’s talk about what you should actually buy!
If you’re looking forward to helpful AI features built into Windows, you should wait. Those features won’t arrive until late 2024. There may be entirely new, better AI laptops out by then — or you may just be able to get a great deal on one next Black Friday!
If you really want to run current AI workloads on your PC, you should avoid the NPU hype and get a computer with a powerful GPU from NVIDIA or perhaps AMD. (I like AMD hardware, but from what I’ve seen, these AI models tend to be better optimized for NVIDIA’s hardware.) The best performance is going to come from a desktop PC with a high-end GPU. You could also go for a laptop with a high-end GPU — just be prepared to plug it in, as you aren’t going to get long battery life while crunching these workloads.
If you just need a new laptop, I do recommend picking up one of those “AI laptops.” It’s likely that a lot of future Windows features will require an NPU in some way — or perhaps they’ll be able to use a powerful GPU, too. When I recommend an AI laptop, I’m really just recommending buying current hardware — a PC with a modern Intel Meteor Lake CPU or an equivalent modern AMD chip, for example.
Whatever you choose, you’re still going to have almost all the current AI features in Windows and other applications. That’s because things like Copilot and ChatGPT do the work in far-away data centers, not on your PC. That’s true whether you have an “AI PC” or not.
All that said, the pieces seem to be coming together. In the future, that NPU should actually be useful for Windows and the applications you use. But for now, I wouldn’t race to buy hardware that won’t start delivering on its promises for nearly a year.
That advice goes beyond PCs: Don’t buy hardware based on what it might do in the future — buy hardware because of what it can do today.
Let’s keep in touch! Get the best Windows PC advice with my free Windows Intelligence newsletter — three things to try every Friday. Plus, get free copies of Paul Thurrott’s Windows 11 and Windows 10 Field Guides (a $10 value) for signing up.