
Z.ai: GLM 5 vs OpenAI o3

45 challenges

Why Z.ai: GLM 5?

15.1x cheaper overall: $0.80/M in · $2.56/M out
Leads 1 of 1 benchmarks
Stronger on SWE-bench Verified: 77.8% vs 69.1%
10 months newer: Feb 2026

Why OpenAI o3?

No standout advantages
1 / 45

Estimate Complexity

reasoning·hard

Tests an AI's ability to make educated estimates based on technical knowledge

Z.ai: GLM 5

The total number of FLOPs required to train GPT‑3 is approximately 3.15 × 10²³.

Step‑by‑step reasoning:

  1. Model size: GPT‑3 has 175 billion (1.75 × 10¹¹) trainable parameters.

  2. Training data: According to the paper abstract, the model was trained on 300 billion (3 × 10¹¹) tokens.

  3. FLOPs per token for training a transformer:

    • Forward pass: each parameter is used in one multiply‑add operation → ~2 FLOPs per parameter → 2 N FLOPs per token.
    • Backward pass (gradient computation): roughly twice the forward cost → ~4 N FLOPs per token.
    • Total: ~6 N FLOPs per token.
      This approximation is standard for large transformer models and neglects small overheads like attention quadratic terms (which add only ~1–2% for GPT‑3’s sequence length).
  4. Calculation:
    Total FLOPs ≈ 6 × (1.75 × 10¹¹) × (3 × 10¹¹)
    = 6 × 5.25 × 10²²
    = 3.15 × 10²³.

  5. Consistency check: The GPT‑3 paper reports a training compute of 3,640 PF‑days. Converting:
    1 PF‑day = 10¹⁵ FLOP/s × 86400 s = 8.64 × 10¹⁹ FLOPs.
    3,640 PF‑days = 3,640 × 8.64 × 10¹⁹ ≈ 3.14 × 10²³ FLOPs, matching the estimate.

Thus, the total FLOPs required to train GPT‑3 is about 3.15 × 10²³.
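
The arithmetic above is compact enough to verify mechanically. Here is a minimal Python sketch of the same estimate: the 6 · N · D rule plus the PF-day cross-check, using only the constants quoted in the answer.

```python
# GPT-3 training-compute estimate via the standard 6*N*D rule.
N = 1.75e11        # trainable parameters (175 B)
D = 3.0e11         # training tokens (300 B)

flops = 6 * N * D  # ~2 FLOPs/param/token forward + ~4 backward
print(f"6*N*D estimate: {flops:.3e} FLOPs")      # 3.150e+23

# Cross-check against the 3,640 PF-days reported in the GPT-3 paper.
PF_DAY = 1e15 * 86_400                           # one PFLOP/s sustained for a day
reported = 3_640 * PF_DAY
print(f"Paper's compute: {reported:.3e} FLOPs")  # ~3.145e+23
```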

OpenAI o3

Below is a simple "back-of-the-envelope" estimate that uses only a few public facts about GPT-3 and very generic rules of thumb for counting operations.

Step 1. Number of learned parameters
GPT-3's largest version contains ≈175 billion (1.75 × 10¹¹) trainable parameters.

Step 2. How much data was seen during training
OpenAI reports that the model was trained on about 300 billion tokens (3 × 10¹¹).

Step 3. How many floating-point operations does one parameter cost per token?
A practical rule of thumb for Transformer language models is
 Training FLOPs ≈ 6 × (# parameters) × (# training tokens)

Where does the factor 6 come from?

• Forward pass: every parameter is involved in one multiply-accumulate → 2 FLOPs
• Back-prop gradient computation: roughly twice the forward cost → ≈4 FLOPs
Total ≈ 6 FLOPs per parameter per token. (Optimizer updates happen once per step, not once per token, so they add a negligible amount.)

Step 4. Plug in the numbers
Total FLOPs = 6 × (1.75 × 10¹¹) × (3 × 10¹¹)
      = 6 × 5.25 × 10²²
      ≈ 3.15 × 10²³ FLOPs.

Step 5. Express the answer in more familiar units
3.15 × 10²³ ≈ 315 zettaFLOPs (1 ZFLOP = 10²¹ FLOPs).

Step 6. Sanity check with hardware-time intuition
An NVIDIA A100 delivers ≈ 312 teraFLOP/s (0.312 PFLOP/s) in mixed precision.
Wall-clock time on a single A100:
 3.15 × 10²³ / 3.12 × 10¹⁴ ≈ 1.0 × 10⁹ s ≈ 32 years.
Running on 1,000 A100s in parallel would take ≈12 days, which is the right ballpark for a large industrial training run.
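
The same check in a couple of lines, assuming (as the figures above do) 100% utilization of peak throughput; real training runs achieve well under that, which is why this is only a ballpark:

```python
# Wall-clock sanity check: 3.15e23 FLOPs on A100s at peak mixed precision.
TOTAL_FLOPS = 3.15e23
A100_FLOPS_PER_S = 312e12                                # A100 peak, mixed precision

secs = TOTAL_FLOPS / A100_FLOPS_PER_S                    # ~1.0e9 s on one GPU
print(f"1 A100:      {secs / 3.156e7:.0f} years")        # ~32 years
print(f"1,000 A100s: {secs / 1_000 / 86_400:.0f} days")  # ~12 days
```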

Final estimate
Training GPT-3 (175 B parameters, 300 B tokens) requires on the order of 3 × 10²³ floating-point operations, i.e. roughly 300 zettaFLOPs.

2 / 45

AI Board Game Logic

reasoning·medium

Tests an AI's ability to understand game rules and strategy

Z.ai: GLM 5

OpenAI o3

3 / 45

Logic Puzzle

reasoning·medium

Tests an AI's ability to solve a simple but potentially confusing logic puzzle

Z.ai: GLM 5

Sally has 1 sister.

Here is the breakdown:

  1. Since the brothers are all siblings, they share the same sisters.
  2. If each brother has 2 sisters, that means there are 2 girls in total in the family.
  3. Sally is one of those girls.
  4. Therefore, the other girl is Sally's only sister.

OpenAI o3

Let
• S = number of sisters in the family (girls)
• B = number of brothers in the family (boys)

Given

  1. Sally has 3 brothers ⇒ B = 3.
  2. Each brother has 2 sisters ⇒ every boy sees exactly 2 girls in the sibling group.
    Therefore S = 2.

Since Sally herself is one of those 2 sisters, the number of sisters Sally has (other than herself) is

2 – 1 = 1.

Sally has 1 sister.
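
Both answers agree, and the two constraints (3 brothers, 2 sisters per brother) pin down a unique family composition. As a quick illustration outside either model's output, a brute-force search over small families confirms it:

```python
# Exhaustive check: Sally has 3 brothers; each brother has 2 sisters.
# All children share one family.
for girls in range(1, 11):
    for boys in range(1, 11):
        brothers_of_sally = boys      # Sally is a girl: every boy is her brother
        sisters_per_brother = girls   # every girl counts as each boy's sister
        if brothers_of_sally == 3 and sisters_per_brother == 2:
            print(f"girls={girls}, boys={boys} -> Sally has {girls - 1} sister(s)")
# girls=2, boys=3 -> Sally has 1 sister(s)
```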
