Ask anyone on the street what artificial intelligence is and you’ll likely hear talk of job losses and the dangers of Terminator-like technology that await us in the future. That kind of artificial intelligence is still a long way off, perhaps decades away. For now, many are comparing the birth of artificial intelligence to the birth of the internet in terms of the changes that will come. The real problem, however, is getting consumers to pay more for AI-enabled devices when the benefits are somewhat limited right now.
Chicken-and-egg scenarios come to mind, but the immediate future is an exciting mix of highly useful tools, ranging from real-time translation and medical advances to dramatic improvements in the tools available to content creators on PCs and smartphones.
Hardware companies are investing heavily. Really heavily.
I recently attended an AI conference with AMD and some of Europe’s leading PC journalists, and a few seismic facts were clearly evident. AMD, for example, is making huge changes to the way its workforce is focused. Software support and optimization are absolutely critical in many areas of the PC industry – it’s something Nvidia, for example, has always done well through massive investment.
AMD is looking to bridge that gap. The company is tripling down on software, with a division specifically dedicated to AI. It’s a major shift by any measure, and a testament to the magnitude of change ahead as handling AI tasks locally on your devices, rather than in the cloud, becomes the norm.
AI will be omnipresent on PCs
Neural processing units (NPUs) will be in most PCs within just a few years, and they have been the subject of much discussion at recent tech events like Computex, where AMD, Intel, and Qualcomm have unveiled their latest AI hardware. In many ways, this is similar to the shift to multi-core processors. These days, you’d be hard-pressed to buy a desktop or laptop with fewer than four cores, but 15 years ago, single-core processors were common.
AI performance is measured in trillions (tera) of operations per second (TOPS). Currently, flagship processors offer around 45 TOPS, with AMD’s Ryzen AI 300 series Strix Point processors delivering 50 – comfortably above the 40 TOPS required by Microsoft’s Copilot+, for example. This figure is expected to increase significantly over the next few generations of NPU-equipped devices.
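To make the TOPS figure concrete, here is a minimal sketch of how peak NPU throughput is commonly estimated: each multiply-accumulate (MAC) unit counts as two operations per clock cycle. The MAC count and clock speed below are purely hypothetical, illustrative numbers, not the published specs of any particular chip.

```python
# Rough sketch: estimating peak NPU throughput in TOPS
# (trillions of operations per second).

def tops(mac_units: int, clock_hz: float, ops_per_mac: int = 2) -> float:
    """Peak throughput in TOPS. Each MAC unit is counted as two
    operations (one multiply + one add) per clock cycle."""
    return mac_units * ops_per_mac * clock_hz / 1e12

COPILOT_PLUS_MIN_TOPS = 40  # Microsoft's stated Copilot+ requirement

# Hypothetical NPU: 16,384 MAC units running at 1.5 GHz
peak = tops(mac_units=16_384, clock_hz=1.5e9)
print(f"Peak: {peak:.1f} TOPS")  # ~49.2 TOPS
print("Meets the Copilot+ bar:", peak >= COPILOT_PLUS_MIN_TOPS)
```

This is a theoretical peak, of course: real-world throughput also depends on precision (INT8 vs FP16), memory bandwidth, and how well a given model maps onto the hardware.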
The biggest problem: selling AI to the average consumer
There’s no denying that, whatever brand you go for, a laptop with a cutting-edge NPU will cost several hundred dollars more than a standard model. Right now, I’d bet that the benefits of AI-powered features simply aren’t apparent to the average consumer shopping for a $1,500-$2,000 laptop.
Even those who frequent technology review sites don’t seem willing to shell out extra money for AI features: in a recent survey on TechPowerUp, more than 80% of respondents said they would not pay more for AI hardware.
The comments on the thread were a bit more mixed. Some said that current hardware isn’t powerful enough to handle the tasks they currently rely on cloud-based services for, while others were more positive, arguing that many of the negative comments weren’t up to date with current AI capabilities. I have to agree with the latter: several forum members stated that AI is non-existent in gaming and content creation, when that’s absolutely not the case, as both fields are already seeing AI implemented, and on a massive scale.
Personally, I use AI on a daily basis, mainly to facilitate content creation through apps like Canva, Photoshop, and Photoroom. These are largely cloud-based and as such can be extremely slow, and of course, if you don’t have an internet connection, most of these features won’t work at all.
NPUs have significant advantages outside of AI
We just mentioned one of the big advantages of letting AI do tasks locally rather than using a cloud service: the fact that it could be much faster and wouldn’t require an internet connection. Cloud-based AI also involves sending your data to the cloud, which, in addition to slowing down the whole process, can also raise privacy issues. So, doing these tasks locally would be more secure.
But processing these tasks locally is only part of the story. NPUs can perform many AI tasks far more efficiently than a CPU or GPU, while consuming less power. This means a laptop with an NPU can outperform a CPU-only laptop on AI workloads while drawing less power than either a CPU or a GPU would. A dedicated graphics card can still handle AI tasks, but it will consume considerably more power to do so. This is one reason NPUs are unlikely to reach desktop PCs anytime soon: battery life and temperatures aren’t vital considerations there, and for now, GPUs handle AI tasks just fine.
Because an NPU draws less power, it also generates less heat. As a result, AI laptops can offer significantly longer battery life – we’re talking 20+ hours here – and can be designed to be much thinner, since they require less cooling. In turn, they can also be much lighter. A thinner, lighter laptop with better battery life is a far more appealing prospect than simply “an AI laptop,” and I’ve certainly found myself drooling over the latest models built on Qualcomm and AMD NPUs for exactly this reason.
The funny thing is that if lightness, portability, and battery life are your primary concerns, you might end up adding an AI laptop to your shortlist without specifically looking for one – and these benefits could also help kickstart adoption.
This is just the beginning
One thing is for sure, though: AI on our devices is certainly not a passing fad. The billions of dollars being spent on software and hardware development make it hard to dismiss the benefits of AI out of ignorance or pessimism. And while AI services are still dominated by the cloud, the watchword of almost every company involved in consumer AI in 2024 is enabling AI at the edge – in other words, putting NPUs inside our devices to handle AI tasks locally instead of sending data to and from the cloud.
From what I’ve seen this year, the benefits are clear and obvious, even if software is currently lagging in some areas. Many programs still can’t take advantage of local hardware for AI tasks, or still require an internet connection. And on new AI platforms like the ARM-based Qualcomm Snapdragon X Elite, Adobe’s Premiere Pro video editing software can’t yet run natively on ARM-based Windows devices, although Adobe is working on this.
These gaps will be filled, and the NPU will likely soon sit at the center of spec-sheet discussions, right alongside CPUs and GPUs. Like it or not, AI is here to stay.