Heads up, tech fans! OpenAI just dropped two free, open-weight language models you can download and run locally 🧠💻. Whether you're coding in a college dorm at 2 AM or building your next AI-powered startup pitch, these new models have got your back.
Why Open-Weight Matters
An open-weight model gives you the actual trained parameters (aka “weights”), so you can tweak and fine-tune it for your own tasks without needing the original training data. It's not quite the same as open-source (which shares code, data, and methods), but it’s a massive win for privacy and customisation.
Meet the Squad
- gpt-oss-120b: Packs roughly 120 billion parameters and can run on a single high-end GPU (one with around 80 GB of memory) 🚀.
- gpt-oss-20b: A leaner 20-billion-parameter model you can fire up on a laptop or desktop with about 16 GB of memory. No dedicated GPU? No problem.
How They Stack Up
OpenAI says the newcomers perform on par with its own mini reasoning models (the 120b version roughly matching o4-mini, the 20b version o3-mini), shining especially in coding tasks, competition math, and health queries. Imagine having an AI buddy that helps debug your code or solves tricky equations in real time!
Where Were The Others?
Earlier this year, the Chinese mainland's DeepSeek made waves with a cost-effective reasoning model, and Meta was working on Llama 4. Now, OpenAI is back in the open-weight game for the first time since GPT-2 in 2019.
Ready to level up? Download links are live—run these models behind your own firewall, on your own infrastructure, and keep your projects close to the chest 🔒.
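If you want a feel for what "run it locally" looks like in practice, here's a minimal sketch using the Hugging Face transformers library. It assumes the 20B weights are hosted on Hugging Face under the model id openai/gpt-oss-20b and that your machine has enough memory to hold them; treat it as a starting point, not official setup instructions.

```python
# Minimal local-inference sketch (assumes the weights live on Hugging Face
# under "openai/gpt-oss-20b" and that transformers + torch are installed:
#   pip install transformers torch
from transformers import pipeline

# First run downloads the open weights and caches them locally;
# after that, generation happens entirely on your own machine.
generator = pipeline("text-generation", model="openai/gpt-oss-20b")

prompt = "Explain in one sentence why open-weight models matter."
output = generator(prompt, max_new_tokens=60)

# The pipeline returns a list of dicts containing the generated text.
print(output[0]["generated_text"])
```

Because the weights are cached on your own disk, nothing you prompt the model with ever has to leave your machine.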
Reference(s):
"OpenAI releases free, downloadable models in competition catch-up", cgtn.com