
Z.AI: GLM 4.5 vs OpenAI o3

42 challenges

Why Z.AI: GLM 4.5?

17.9x cheaper overall: $0.60/M in · $2.20/M out
3 months newer (Jul 2025)

Why OpenAI o3?

Leads 1 of 1 benchmarks
Stronger on SWE-bench Verified: 69.1% vs 64.2%

Estimate Complexity

reasoning·hard

Tests an AI's ability to make educated estimates based on technical knowledge

Z.AI: GLM 4.5

To estimate the total number of FLOPs (floating-point operations) required to train GPT-3 without searching online, we rely on known architectural details and standard computational models for transformer training. Here's the step-by-step reasoning:

Step 1: Key Parameters of GPT-3

  • Number of parameters (P): GPT-3 has 175 billion parameters (175 × 10⁹).
  • Training dataset size (D): The model was trained on approximately 300 billion tokens (300 × 10⁹ tokens). This is a standard figure associated with GPT-3.

Step 2: FLOPs per Token Calculation

Training a transformer model involves two main operations per token:

  1. Forward pass: Computes the model's output.
  2. Backward pass: Computes gradients for backpropagation.

For transformer architectures, the combined cost of the forward and backward passes is commonly estimated at 6 × P FLOPs per token. This is derived from:

  • Forward pass: Requires ~2 × P FLOPs per token (matrix multiplications dominate, with each parameter used in multiply-add operations).
  • Backward pass: Typically requires ~2× the FLOPs of the forward pass due to gradient calculations (~4 × P FLOPs per token).
  • Total per token: 2P (forward) + 4P (backward) = 6P FLOPs per token.
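
As a minimal sketch of this per-token accounting (the 2P-forward / 4P-backward split is the rule of thumb stated above, not a measured profile):

```python
P = 175e9  # GPT-3 parameter count

forward_flops = 2 * P    # ~2 FLOPs per parameter per token (one multiply-add)
backward_flops = 4 * P   # backward pass is roughly 2x the forward cost
per_token_flops = forward_flops + backward_flops  # = 6P

print(f"FLOPs per token ≈ {per_token_flops:.2e}")  # ≈ 1.05e+12
```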

Step 3: Total FLOPs for the Entire Dataset

The total FLOP count scales linearly with the dataset size:

  • Total FLOPs = FLOPs per token × Number of tokens.
  • Substituting the values:
    Total FLOPs = (6 × P) × D = 6 × (175 × 10⁹) × (300 × 10⁹).

Step 4: Calculation

  • Compute the product:
    6 × 175 × 300 × 10⁹ × 10⁹ = 6 × 52,500 × 10¹⁸.
  • Simplify:
    6 × 52,500 = 315,000.
    Total FLOPs = 315,000 × 10¹⁸ = 3.15 × 10²³ FLOPs.
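
The same arithmetic as a short Python check, with P and D set to the figures assumed above:

```python
P = 175e9   # parameters
D = 300e9   # training tokens

total_flops = 6 * P * D
print(f"Total training FLOPs ≈ {total_flops:.2e}")  # ≈ 3.15e+23
```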

Step 5: Validation with Known Compute

  • GPT-3 training reportedly used ~3,640 petaflop/s-days of compute.
  • Convert petaflop/s-days to FLOPs:
    1 petaflop/s-day = 10¹⁵ FLOP/s × 86,400 s/day = 8.64 × 10¹⁹ FLOPs.
    3,640 petaflop/s-days = 3,640 × 8.64 × 10¹⁹ ≈ 3.14 × 10²³ FLOPs.
  • This closely matches our estimate of 3.15 × 10²³ FLOPs, confirming the calculation.
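
A quick sketch of the unit conversion behind this cross-check (3,640 PF-days is the reported figure; the rest is unit arithmetic):

```python
PFLOP_S_DAY = 1e15 * 86_400      # FLOPs in one petaflop/s-day ≈ 8.64e19
reported = 3_640 * PFLOP_S_DAY   # ≈ 3.14e+23 FLOPs
estimated = 3.15e23              # from the 6 * P * D rule of thumb

print(f"Reported compute ≈ {reported:.2e} FLOPs")
print(f"Estimate / reported ≈ {estimated / reported:.3f}")  # ≈ 1.00
```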

Step 6: Batch Size and Steps Consideration

  • Batch size: GPT-3 used a batch size of approximately 3.2 million tokens per step (≈1,560 sequences × 2,048 tokens each).
  • Number of training steps:
    Steps = Total tokens / Batch size = 300 × 10⁹ / 3.2 × 10⁶ = 93,750 steps.
  • FLOPs per step:
    Per step FLOPs = 6 × P × Batch size = 6 × 175 × 10⁹ × 3.2 × 10⁶ = 3.36 × 10¹⁸ FLOPs/step.
  • Total FLOPs:
    93,750 steps × 3.36 × 10¹⁸ FLOPs/step ≈ 3.15 × 10²³ FLOPs.
    This consistency reinforces the estimate.
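
The step-level bookkeeping can be reproduced in a few lines (the 3.2M-token batch size is the assumption above):

```python
P = 175e9             # parameters
D = 300e9             # total training tokens
batch_tokens = 3.2e6  # tokens per optimizer step (assumed)

steps = D / batch_tokens               # ≈ 93,750 steps
flops_per_step = 6 * P * batch_tokens  # ≈ 3.36e+18 FLOPs/step
total = steps * flops_per_step         # ≈ 3.15e+23 FLOPs

print(f"{steps:,.0f} steps × {flops_per_step:.2e} FLOPs/step ≈ {total:.2e} FLOPs")
```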

Step 7: Negligible Factors

  • Parameter updates: Optimizer steps (e.g., Adam) add O(P) FLOPs per step. For 93,750 steps:
    Update FLOPs ≈ 93,750 × 175 × 10⁹ ≈ 1.64 × 10¹⁶ FLOPs (negligible vs. 3.15 × 10²³).
  • Embedding/softmax layers: Included in the 6P per-token estimate.
  • Attention mechanisms: Captured by the 6P heuristic (validated for transformers).
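
To make the "negligible" claim concrete, a rough comparison treating each Adam update as O(P) FLOPs per step (an assumption; the true constant is a small multiple of P):

```python
P = 175e9
steps = 93_750

update_flops = steps * P   # ≈ 1.6e+16 FLOPs spent on parameter updates
training_flops = 3.15e23   # forward + backward total from above

print(f"Optimizer share of total ≈ {update_flops / training_flops:.1e}")  # ≈ 5e-08
```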

Final Estimate

The total FLOPs required to train GPT-3 is 3.15 × 10²³ FLOPs. This is derived from:

  • Standard transformer FLOPs per token (6P).
  • GPT-3 parameters (175B) and dataset size (300B tokens).
  • Cross-validation with reported compute usage (3,640 petaflop/s-days).

OpenAI o3

Below is a simple "back-of-the-envelope" estimate that uses only a few public facts about GPT-3 and very generic rules of thumb for counting operations.

Step 1. Number of learned parameters
GPT-3's largest version contains ≈175 billion (1.75 × 10¹¹) trainable parameters.

Step 2. How much data was seen during training
OpenAI reports that the model was trained on about 300 billion tokens (3 × 10¹¹).

Step 3. How many floating-point operations does one parameter cost per token?
A practical rule of thumb for Transformer language models is
 Training FLOPs ≈ 6 × (# parameters) × (# training tokens)

Where does the factor 6 come from?

• Forward pass: every parameter is involved in one multiply-accumulate → 2 FLOPs
• Back-prop gradient computation: roughly another 2 FLOPs
• Parameter-update (optimizer) and miscellaneous overhead: ≈2 FLOPs
Total ≈6 FLOPs per parameter per token.

Step 4. Plug in the numbers
Total FLOPs = 6 × (1.75 × 10¹¹) × (3 × 10¹¹)
      = 6 × 5.25 × 10²²
      ≈ 3.15 × 10²³ FLOPs.

Step 5. Express the answer in more familiar units
3.15 × 10²³ ≈ 315 zettaFLOPs (1 ZFLOP = 10²¹ FLOPs).
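
A short Python rendering of Steps 4–5, including the zettaFLOP conversion (1 ZFLOP = 10²¹ FLOPs):

```python
params = 1.75e11   # trainable parameters
tokens = 3e11      # training tokens

total_flops = 6 * params * tokens
print(f"{total_flops:.2e} FLOPs ≈ {total_flops / 1e21:.0f} zettaFLOPs")  # 3.15e+23 ≈ 315 ZFLOPs
```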

Step 6. Sanity check with hardware-time intuition
An NVIDIA A100 delivers ≈ 312 teraFLOP/s (0.312 PFLOP/s) in mixed precision.
Wall-clock time on a single A100:
 3.15 × 10²³ / 3.12 × 10¹⁴ ≈ 1.0 × 10⁹ s ≈ 32 years.
Running on 1,000 A100s in parallel would take ≈12 days, which is the right ballpark for a large industrial training run.
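
A sketch of the same hardware-time check (312 TFLOP/s is the A100's peak mixed-precision throughput; real utilization is lower, so these are optimistic lower bounds on wall-clock time):

```python
total_flops = 3.15e23
a100_flops_per_s = 312e12   # peak mixed-precision throughput of one A100

seconds_single_gpu = total_flops / a100_flops_per_s
years_single_gpu = seconds_single_gpu / (86_400 * 365)
days_on_1000_gpus = seconds_single_gpu / 1_000 / 86_400

print(f"One A100: ≈ {years_single_gpu:.0f} years")      # ≈ 32 years
print(f"1,000 A100s: ≈ {days_on_1000_gpus:.0f} days")   # ≈ 12 days
```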

Final estimate
Training GPT-3 (175 B parameters, 300 B tokens) requires on the order of 3 × 10²³ floating-point operations, i.e. roughly 300 zettaFLOPs.


AI Board Game Logic

reasoning·medium

Tests an AI's ability to understand game rules and strategy

Z.AI: GLM 4.5

OpenAI o3


Logic Puzzle

reasoning·medium

Tests an AI's ability to solve a simple but potentially confusing logic puzzle

Z.AI: GLM 4.5

Sally has 3 brothers, meaning there are 3 boys in the family. Each brother has 2 sisters, indicating that there are exactly 2 girls in the family. Since Sally is one of the girls, there must be one additional sister. Therefore, Sally has one sister.

This can be verified by considering the family composition:

  • Sisters: Sally and one other girl.
  • Brothers: 3 boys.

From Sally's perspective:

  • She has 3 brothers.
  • She has 1 sister (the other girl).

From each brother's perspective:

  • They have 2 sisters (Sally and the other girl).

This satisfies both conditions: Sally has 3 brothers, and each brother has 2 sisters.

If Sally were the only girl, each brother would have just one sister, contradicting the given condition; if there were three girls, each brother would have three sisters, again a contradiction. Thus there must be exactly two girls, meaning Sally has one sister.

Answer: 1

OpenAI o3

Let
• S = number of sisters in the family (girls)
• B = number of brothers in the family (boys)

Given

  1. Sally has 3 brothers ⇒ B = 3.
  2. Each brother has 2 sisters ⇒ every boy sees exactly 2 girls in the sibling group.
    Therefore S = 2.

Since Sally herself is one of those 2 sisters, the number of sisters Sally has (other than herself) is

2 – 1 = 1.

Sally has 1 sister.
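
For completeness, a tiny brute-force check of the same constraints (enumerating small family compositions; Sally is counted among the girls):

```python
# Keep every (boys, girls) combination consistent with both statements:
#   Sally has 3 brothers       -> boys == 3
#   each brother has 2 sisters -> girls == 2
solutions = [
    (boys, girls)
    for boys in range(0, 6)
    for girls in range(1, 6)   # at least one girl: Sally herself
    if boys == 3 and girls == 2
]

for boys, girls in solutions:
    print(f"{boys} boys, {girls} girls -> Sally has {girls - 1} sister")
# 3 boys, 2 girls -> Sally has 1 sister
```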
