For the first time in more than five years, OpenAI has released two open-weight AI reasoning models, amid China's rise in open-source AI technology and questions about OpenAI straying from its initial goal of building openly available technology.
The models released by OpenAI are free to download from Hugging Face and do not need high computing power to run. They have capabilities similar to the company's o-series models. The models come in two sizes: a larger and more capable gpt-oss-120b model that can run on a single Nvidia GPU, and a lighter-weight gpt-oss-20b model that can run on a consumer laptop with 16GB of memory.
This is the company's first 'open' language model since GPT-2 in 2019.
For OpenAI, this is a shift from its focus on building primarily proprietary models, but one that was necessitated by the meteoric rise of China's DeepSeek, which was open source and took the AI world by storm. It also affirmed China's lead in open-source AI, with the US taking a backseat and its administration having to urge developers to open source more technologies.
Earlier this year, OpenAI CEO Sam Altman said that the company had been on the wrong side of history when it comes to open sourcing its technologies.
Open weight vs open source AI models
To be sure, the models released by OpenAI are 'open weight,' not open source models: the former offers less transparency than the latter.
Open source models provide full transparency, sharing source code, model architecture, training algorithms, and weights under a licence permitting free use, modification, and distribution. Ideally, training data is disclosed too, but legal constraints often limit this. In contrast, open weight models include only the trained model weights, not the source code, training data, or full architecture details. This restricts transparency and customisation: users can run the model but cannot fully modify or retrain it.
Why OpenAI is changing tack
After years of focusing on closed source technology, the shift in strategy at OpenAI was triggered by the emergence of China's DeepSeek, which showed the world that an open-sourced language model could be built at a fraction of the cost some of its competitors spent developing theirs. Meta has also found success with its open weight model, Llama, which has crossed more than a billion downloads, even though developers have complained that the model's licence terms can be commercially restrictive.
OpenAI currently offers its AI models through a chatbot and the cloud, unlike some of its rivals, whose models can be downloaded and modified by users.
In a recent Reddit Q&A, OpenAI CEO Sam Altman said that the company had been on the wrong side of history when it comes to open sourcing its technologies. "[I personally think we need to] figure out a different open source strategy," Altman said. "Not everyone at OpenAI shares this view, and it's also not our current highest priority… We will produce better models, but we will maintain less of a lead than we did in previous years."
In a feedback form published by OpenAI on its website, the company invited "developers, researchers, and [members of] the broader community" to weigh in, with questions like, "What would you like to see in an open weight model from OpenAI?" and "What open models have you used in the past?"

