The Impact of AI on Next-Gen Computing Power: Expect Higher Specs in Future PCs

Daniel

Key Takeaways

  • AI PCs require significant memory and storage for local processing.
  • New computers with specialized NPUs are necessary for AI features like Microsoft’s Copilot.
  • 16GB RAM is the minimum for AI PCs, with 32GB recommended for those who care about AI features.

The age of the “AI PC” is upon us, and you’ll soon have the option to run some AI features locally without an internet connection even on relatively modest systems. However, what you may not realize is that despite having AI accelerators, these computers will still need a heap of memory and storage just to offer local AI features.

AI PCs Have the Power for Local Processing

Microsoft’s Copilot AI feature is one of the main reasons that a new generation of computers with specialized NPUs (Neural Processing Units) is coming to market. It’s also why new Surface Pro devices ship with Arm-based Snapdragon processors, a chip family usually found in phones. Right now, the minimum requirement for a Copilot AI PC is an NPU capable of 40 TOPS (Trillion Operations Per Second), and that’s exactly what these new chips pack.

The Copilot AI logo on a keyboard key. Microsoft

There are many reasons to want local AI features. It’s fast, you’re not competing for resources with other users, and you don’t have to pay beyond the cost of the hardware. Giving your laptop the smarts of something on the same continuum as ChatGPT and its contemporaries is definitely exciting!

Even “Small” LLMs Are Huge

The thing is, these AI “models” are pretty large. The smallest and most efficient LLMs (Large Language Models) might need 1-3GB of memory while not being all that smart. As a model gains parameters and becomes more capable, it needs more room. The super-smart LLMs you access via the web can require hundreds of gigabytes of RAM just to hold their weights. With hundreds of billions, or in some cases more than a trillion, parameters, these models are not something a scrappy little laptop can run!

Then there are LLMs small enough to run on a smartphone, such as phi-3-mini, which clocks in at 2.4GB. That’s more in line with what will actually run on these AI laptops, though the laptop models probably won’t be quite as anemic.
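To put those sizes in perspective, here’s a rough back-of-the-envelope sketch of how parameter count and quantization translate into the RAM needed just to hold a model’s weights. The parameter counts and bit widths below are illustrative assumptions, not the specs of any particular model:

```python
# Rough estimate of the RAM an LLM's weights occupy.
# Real usage is higher: runtimes also need room for
# activations and the conversation's KV cache.

def model_ram_gb(params_billions: float, bits_per_param: int) -> float:
    """Approximate weight memory in (decimal) gigabytes."""
    total_bytes = params_billions * 1e9 * bits_per_param / 8
    return total_bytes / 1e9

# A ~3.8B-parameter phone-sized model quantized to 4 bits per weight:
print(f"{model_ram_gb(3.8, 4):.1f} GB")   # about 1.9 GB of weights alone

# A hypothetical 500B-parameter cloud model at 16 bits per weight:
print(f"{model_ram_gb(500, 16):.0f} GB")  # about 1000 GB
```

The same arithmetic explains why the cloud giants need server racks: scale the parameter count by a hundred and the memory bill scales with it.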

Models Have to Live in RAM

AI chip hologram with circuits around.

Lucas Gouveia / How-To Geek

While having an AI model take up a few GBs of space might not sound like a big deal, you have to understand that the entire model has to be in RAM for the technology to perform well. Those NPUs are specialized parallel processors that crunch the numbers for these virtual neural networks, working on billions of “parameters” at the same time.

If the model isn’t in RAM, the NPU can’t be fed data quickly enough to give you timely results, which hurts everything from how fast an LLM replies to functions like real-time translation and image generation. So if you want AI features available at the push of a button, or by simply speaking to your computer, the model essentially needs to stay resident, reserving all the RAM it requires.

To be sure, these new PCs all have ultra-fast SSDs that can swap data in and out of RAM quickly. But apart from the extra wear and tear that puts on your SSD, it’s still not fast enough for everything to run smoothly, and don’t forget you still want to run all of your other apps at the same time. If your PC had to dump your browser and other apps from RAM to disk every time you asked the AI to do something, it wouldn’t be a great experience.
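A rough sketch makes the gap concrete: generating each token requires reading more or less every weight in the model once, so throughput is capped by how fast the weights can be streamed to the processor. The bandwidth figures below are illustrative assumptions (a fast consumer SSD versus laptop LPDDR5X memory), not measured specs:

```python
# Upper bound on tokens/second if the model's weights must be
# streamed at a given bandwidth for every generated token.

def max_tokens_per_sec(model_gb: float, bandwidth_gb_s: float) -> float:
    """Bandwidth-limited ceiling on token generation rate."""
    return bandwidth_gb_s / model_gb

MODEL_GB = 2.4  # a phi-3-mini-sized model

# Assumed bandwidths, for illustration only:
print(f"From a fast SSD:  {max_tokens_per_sec(MODEL_GB, 7):.1f} tok/s")
print(f"From LPDDR5X RAM: {max_tokens_per_sec(MODEL_GB, 120):.0f} tok/s")
```

Even with generous assumptions, streaming from the SSD caps you at a few tokens per second, while keeping the model in RAM leaves room for conversational speed.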

16GB Is the Absolute Minimum

While 8GB of RAM is still perfectly fine for a modern office desktop or laptop used for general productivity and media consumption, Windows computers with AI features simply wouldn’t get by with that little memory. That’s why Microsoft has made 16GB of RAM the absolute minimum for Copilot PCs.

While that’s a healthy amount for a normal non-AI PC, I have a feeling that for PCs running local AI assistants in the background all the time, this will be more like an 8GB PC plus AI, unless you disable the AI part of the equation.
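Here’s a toy budget illustrating that point. The individual figures are assumptions for the sake of illustration, not Microsoft’s numbers:

```python
# A hypothetical RAM budget for a 16GB Copilot PC with a
# local AI assistant kept resident in memory at all times.
# All figures below are illustrative assumptions.

total_gb = 16.0
reserved = {
    "Windows + background services": 4.0,
    "Resident AI model (weights + runtime overhead)": 4.0,
}

free_gb = total_gb - sum(reserved.values())
print(f"Left for your own apps: {free_gb:.0f} GB")  # 8 GB
```

Under those assumptions, the “16GB AI PC” leaves roughly what an 8GB machine offers today for your browser, office apps, and everything else.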

Some AI PCs Won’t Have Upgradable RAM

If your new AI Windows PC uses an integrated system-on-a-chip or has its RAM permanently soldered, as is the case with many thin-and-light laptops, then you’re stuck with whatever amount of RAM the system came with. That means you need to get the amount right at the start, or you’ll end up needing a whole new computer if it turns out you’ve done your sums wrong.

Some time ago, I looked at whether 32GB of RAM was the amount to aim for, and my conclusion was that it was still overkill for average users. However, with future AI model sizes in mind, I think 32GB is what anyone who cares about these AI features should aim for.

If You Care About AI, Get More Memory

Whether you care about local AI features is, of course, what actually matters here. If you don’t, then it’s great that these Copilot PCs are pushing the RAM minimum to 16GB. Just disable or ignore the AI part of the equation and you’ll be fine. In fact, unless you have some compelling reason otherwise, skip these AI PCs completely and stick with a more mainstream option.

If, on the other hand, you’re excited about local AI features and see plenty of ways you could use these tools in the future, don’t settle for the bare minimum unless you have the option to add more RAM later.

  • Title: The Impact of AI on Next-Gen Computing Power: Expect Higher Specs in Future PCs
  • Author: Daniel
  • Created at : 2024-12-07 19:30:08
  • Updated at : 2024-12-14 01:43:33
  • Link: https://some-skills.techidaily.com/the-impact-of-ai-on-next-gen-computing-power-expect-higher-specs-in-future-pcs/
  • License: This work is licensed under CC BY-NC-SA 4.0.