Claude Opus 4.5's response to the 'Estimate Complexity' challenge.
Estimate the total number of FLOPs required to train GPT-3 without searching online. Explain your reasoning step by step.
I'll estimate the FLOPs required to train GPT-3 step by step, using what I know about the model architecture and training process.
From memory:
- GPT-3 has P ≈ 175 billion parameters
- It was trained on roughly D ≈ 300 billion tokens

There's a well-known approximation for transformer training FLOPs (counting the forward and backward passes):

FLOPs ≈ 6 × P × D

Where:
- P = number of parameters (175 × 10⁹)
- D = number of training tokens (300 × 10⁹)
FLOPs = 6 × (175 × 10⁹) × (300 × 10⁹)
FLOPs = 6 × 175 × 300 × 10¹⁸
FLOPs = 6 × 52,500 × 10¹⁸
FLOPs = 315,000 × 10¹⁸
FLOPs = 3.15 × 10²³
Approximately 3 × 10²³ FLOPs (or ~300 zettaFLOPs)
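The arithmetic above can be sketched in a few lines of Python:

```python
# Back-of-envelope transformer training cost: FLOPs ≈ 6 × P × D
P = 175e9  # GPT-3 parameter count (reported)
D = 300e9  # training tokens (reported)

flops = 6 * P * D
print(f"{flops:.2e}")  # → 3.15e+23
```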
This aligns with OpenAI's reported figure of roughly 3.14 × 10²³ FLOPs (about 3,640 petaFLOP/s-days). For context, a compute budget of this size is plausible for a large-scale training run on thousands of accelerators over several weeks, which gives me confidence the estimate is in the right ballpark.
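As a rough plausibility check, assuming hypothetical hardware numbers (not from the source): a cluster of 10,000 V100-class GPUs each sustaining ~30 TFLOP/s in mixed precision would finish this budget in about two weeks:

```python
# Hypothetical sanity check on wall-clock time for ~3.15e23 FLOPs.
# The GPU count and sustained throughput below are assumptions,
# not reported figures.
total_flops = 3.15e23
n_gpus = 10_000
sustained_flops_per_gpu = 30e12  # ~30 TFLOP/s effective per GPU

seconds = total_flops / (n_gpus * sustained_flops_per_gpu)
print(f"{seconds / 86400:.1f} days")  # → 12.2 days
```

A runtime on the order of weeks rather than hours or years is consistent with what large-scale training runs of this era took.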