Gpt 5.3 Chat's response to the 'Estimate Complexity' challenge.
Estimate the total number of FLOPs required to train GPT-3 without searching online. Explain your reasoning step by step.
We can do a back‑of‑the‑envelope estimate using a common scaling rule for transformer training compute.
FLOPs per token ≈ 6 × N

Reasoning (rough): the forward pass costs about 2 FLOPs per parameter per token (one multiply and one add per weight), and the backward pass costs roughly twice the forward pass, about 4 FLOPs per parameter per token. Together that gives ~6 × N FLOPs per token.

Total training FLOPs:
= 6 × N × T

where N is the number of parameters and T is the number of training tokens.
Plug in values for GPT‑3: N ≈ 175 billion parameters = 1.75 × 10^11, and T ≈ 300 billion training tokens = 3 × 10^11:
6 × (1.75 × 10^11) × (3 × 10^11)
First multiply constants: 6 × 1.75 × 3 = 31.5
Then powers: 10^11 × 10^11 = 10^22
So:
Total ≈ 31.5 × 10^22
≈ 3.15 × 10^23 FLOPs
≈ 3 × 10^23 FLOPs
Final estimate: about 3 × 10^23 FLOPs to train GPT‑3.
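The arithmetic above can be checked with a few lines of Python (a minimal sketch of the 6 × N × T rule, using the commonly cited GPT‑3 figures of 175B parameters and 300B training tokens):

```python
# Back-of-the-envelope training compute: ~6 FLOPs per parameter per token
# (2 for the forward pass, 4 for the backward pass).
N = 175e9  # parameters (GPT-3)
T = 300e9  # training tokens (GPT-3)

total_flops = 6 * N * T
print(f"Total ≈ {total_flops:.2e} FLOPs")  # → Total ≈ 3.15e+23 FLOPs
```

Rounding 3.15 × 10^23 to one significant figure gives the ~3 × 10^23 FLOPs estimate.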