How To Run OpenAI’s GPT-OSS 20B and 120B Models on AMD Ryzen™ AI Processors
OpenAI has just released two new AI models – gpt‑oss‑20b and gpt‑oss‑120b – which are the first open‑weight models from the firm since GPT‑2.
AMD has shared PC spec requirements for both gpt‑oss‑20b – the more restrained model, packing 21 billion parameters – and gpt‑oss‑120b, which offers 117 billion parameters. The latter is designed for data center use, but it will run on a high-end PC, whereas gpt‑oss‑20b is the model designed specifically for consumer devices.
These models can be downloaded from Hugging Face (here's gpt‑oss‑20b and here's gpt‑oss‑120b) under the Apache 2.0 license, or for the merely curious, there's an online demo you can check out (no download necessary).
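If you'd rather grab the weights from a script than click through the website, here's a minimal sketch using the huggingface_hub Python package – just one of several ways to do it, and the local folder name is purely an example:

```python
# Minimal download sketch using the huggingface_hub package (pip install huggingface_hub).
# This is one example approach, not an official workflow; the repo ID matches the
# Hugging Face page linked above, and the local folder is just an illustration.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="openai/gpt-oss-20b",   # use "openai/gpt-oss-120b" for the larger model
    local_dir="./gpt-oss-20b",      # where the weights end up on disk
)
print(f"Model files downloaded to: {local_dir}")
```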
You can run gpt-oss-20b on any laptop or PC that has 16GB of system memory (or 16GB of video RAM, or a combination of both). However, it's very much a case of the more, the merrier – or faster, rather. The model may chug along with that bare minimum of 16GB, so ideally you'll want a bit more on tap.
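To give a rough idea of what running the smaller model locally looks like, below is a sketch using the Hugging Face transformers library. It assumes you have transformers and PyTorch installed and enough memory to hold the weights, per the guidance above – it's not an official AMD or OpenAI recipe:

```python
# A minimal local-inference sketch with Hugging Face transformers (an illustrative
# example, not an official recipe). device_map="auto" lets the library spread the
# model across whatever GPU/CPU memory your machine has available.
from transformers import pipeline

pipe = pipeline(
    "text-generation",
    model="openai/gpt-oss-20b",
    torch_dtype="auto",   # pick the model's native precision automatically
    device_map="auto",    # place layers on GPU and/or CPU as memory allows
)

messages = [{"role": "user", "content": "In one paragraph, what is an open-weight model?"}]
result = pipe(messages, max_new_tokens=128)
print(result[0]["generated_text"])
```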
It's the same overall deal with the beefier gpt-oss-120b model, except, as you might guess, you need a lot more memory. Officially, this means 80GB of memory.
AMD's recommendation in this case, CPU-wise, is for its top-of-the-range Ryzen AI Max+ 395 processor coupled with 128GB of system RAM (with 96GB of that allocated as Variable Graphics Memory), a configuration capable of speeds of up to 30 tokens per second, according to AMD.
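If you want to see how your own machine compares to that figure, a simple approach is to time a generation run and divide new tokens by elapsed seconds. The sketch below does exactly that – it's a rough home-grown check, not AMD's benchmark methodology:

```python
# Rough tokens-per-second check (an illustrative sketch, not AMD's benchmark).
import time
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "openai/gpt-oss-20b"  # swap in "openai/gpt-oss-120b" if you have the memory
tok = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

inputs = tok("Write a short note about running AI models locally.", return_tensors="pt").to(model.device)

start = time.time()
out = model.generate(**inputs, max_new_tokens=256)
elapsed = time.time() - start

new_tokens = out.shape[-1] - inputs["input_ids"].shape[-1]
print(f"{new_tokens / elapsed:.1f} tokens/sec")
```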