GPT-5.1 Codex's response to the 'Estimate Complexity' challenge.
Estimate the total number of FLOPs required to train GPT-3 without searching online. Explain your reasoning step by step.
To estimate the total FLOPs (floating-point operations) required to train GPT-3, we can use a common approximation for transformer models:
Total FLOPs ≈ 6 × N_params × N_tokens
Where:

- N_params is the number of trainable model parameters
- N_tokens is the number of tokens seen during training
- the factor of 6 counts roughly 2 FLOPs per parameter per token for the forward pass and 4 for the backward pass

For GPT-3:

- N_params ≈ 175 billion = 1.75 × 10¹¹
- N_tokens ≈ 300 billion = 3.0 × 10¹¹

Plugging in:
6 × (1.75 × 10¹¹) × (3.0 × 10¹¹)
= 6 × 5.25 × 10²²
= 3.15 × 10²³ FLOPs
So the total estimated compute to train GPT-3 is ~3.15 × 10²³ FLOPs.
This is a rough but widely used approximation for large-scale transformer training.
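As a sanity check, here is a minimal Python sketch of the same arithmetic. The parameter and token counts are the commonly cited GPT-3 figures used above, and the factor of 6 is the usual forward-plus-backward approximation, not a measured value:

```python
# Estimate training compute with the standard 6 * N_params * N_tokens rule.
# The constants below are the widely quoted GPT-3 figures, not measured numbers.

N_PARAMS = 175e9  # ~175 billion trainable parameters
N_TOKENS = 300e9  # ~300 billion training tokens

# Factor of 6: ~2 FLOPs per parameter per token for the forward pass,
# ~4 for the backward pass.
total_flops = 6 * N_PARAMS * N_TOKENS

print(f"Estimated training compute: {total_flops:.2e} FLOPs")
# -> Estimated training compute: 3.15e+23 FLOPs
```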