All corrections
1
Claim
The trained models are from the GPT-oss series (20B and 120B parameters).
Correction

OpenAI’s official specifications list gpt-oss-120b at 117B total parameters, not 120B.

Full reasoning

The post states that the trained models are “(20B and 120B parameters).”

However, OpenAI’s own “Introducing gpt-oss” page lists gpt-oss-120b as having 117B total parameters, using “120b” only as the model’s name rather than its exact parameter count. NVIDIA’s OpenAI-compatible model reference for gpt-oss-120b likewise lists 117B total parameters.

Because multiple independent, high-credibility sources give the same figure of 117B total parameters for gpt-oss-120b, the statement that the model has “120B parameters” is incorrect as a literal parameter-count claim, even though “120b” is part of the model’s name.

2 sources
Model: OPENAI_GPT_5 Prompt: v1.6.0