Glm 5's response to the 'Estimate Complexity' challenge.
Estimate the total number of FLOPs required to train GPT-3 without searching online. Explain your reasoning step by step.
The total number of FLOPs required to train GPT‑3 is approximately 3.15 × 10²³ FLOPs.
Step‑by‑step reasoning:
Model size: GPT‑3 has 175 billion (1.75 × 10¹¹) trainable parameters.
Training data: According to the paper abstract, the model was trained on 300 billion (3 × 10¹¹) tokens.
FLOPs per token for training a transformer: a standard rule of thumb is about 6 FLOPs per parameter per token, roughly 2 for the forward pass and 4 for the backward pass, giving total training compute ≈ 6 × N × D for N parameters and D tokens.
Calculation:
Total FLOPs ≈ 6 × (1.75 × 10¹¹) × (3 × 10¹¹)
= 6 × 5.25 × 10²²
= 3.15 × 10²³.
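To make the arithmetic explicit, here is a minimal Python sketch of the 6 × N × D estimate (variable names are illustrative, not from the original response):

```python
# Back-of-the-envelope estimate: total training FLOPs ≈ 6 * N * D,
# where N is the parameter count and D is the number of training tokens.
n_params = 175e9       # GPT-3 parameters (1.75e11)
n_tokens = 300e9       # training tokens (3e11)
flops_per_token = 6    # ~2 forward + ~4 backward FLOPs per parameter per token

total_flops = flops_per_token * n_params * n_tokens
print(f"Estimated training FLOPs: {total_flops:.2e}")  # ~3.15e+23
```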
Consistency check: The GPT‑3 paper reports a training compute of 3,640 PF‑days. Converting:
1 PF‑day = 10¹⁵ FLOP/s × 86400 s = 8.64 × 10¹⁹ FLOPs.
3,640 PF‑days = 3,640 × 8.64 × 10¹⁹ ≈ 3.14 × 10²³ FLOPs, closely matching the estimate above.
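The same conversion can be verified with a few lines of Python (a small sketch using the constants quoted above):

```python
# Cross-check: convert the paper's reported compute (3,640 PF-days) to FLOPs.
SECONDS_PER_DAY = 86_400
PFLOP_PER_SEC = 1e15                              # 1 petaFLOP/s
flops_per_pf_day = PFLOP_PER_SEC * SECONDS_PER_DAY  # 8.64e19 FLOPs per PF-day

reported_flops = 3_640 * flops_per_pf_day
print(f"Reported compute: {reported_flops:.2e} FLOPs")  # ~3.14e+23
```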
Thus, the total FLOPs required to train GPT‑3 is about 3.15 × 10²³.