[PR #1630] fix(labs): correct FLOP scaling answer key in lab 05 Part C #9236

Open
opened 2026-05-03 01:29:43 -05:00 by GiteaMirror · 0 comments
Owner

📋 Pull Request Information

Original PR: https://github.com/harvard-edge/cs249r_book/pull/1630
Author: @Shashank-Tripathi-07
Created: 5/3/2026
Status: 🔄 Open

Base: dev ← Head: fix/lab05-flop-scaling-answer


📝 Commits (2)

  • 05f6d8f fix(labs): correct FLOP scaling answer key in lab 05 Part C
  • 1d14fd9 fix(labs): correct FLOPs scaling description in lab 05 learning objectives

📊 Changes

1 file changed (+11 additions, -8 deletions)

View changed files

📝 labs/vol1/lab_05_nn_compute.py (+11 -8)

📄 Description

Summary

Part C asks: "A 3-layer MLP has hidden layers of width 128. You double hidden width to 256. By how much do total FLOPs increase?"

The answer key marked "C) ~4x (quadratic)" as correct, but the code itself computes:

```
_actual_256 = (2*784*256 + 2*256*256 + 2*256*10) / (2*784*128 + 2*128*128 + 2*128*10)
            = 537,600 / 236,032
            ≈ 2.28x
```

The 4x scaling only applies to the hidden-to-hidden layer (2*W^2). The input layer (2*784*W) and output layer (2*W*10) scale linearly, pulling the total to ~2.3x. The correct answer is "A) 2x".
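The ratio above can be reproduced with a short, self-contained sketch (the function name `mlp_flops` is illustrative, not the identifier used in the lab file; the 2x multiply-add convention and the 784/10 input/output dims follow the formula in the PR):

```python
def mlp_flops(width, d_in=784, d_out=10):
    """Total FLOPs (2 per multiply-add) for a 3-layer MLP:
    input(784) -> hidden(W) -> hidden(W) -> output(10)."""
    return 2 * d_in * width + 2 * width * width + 2 * width * d_out

ratio = mlp_flops(256) / mlp_flops(128)
print(f"{mlp_flops(256):,} / {mlp_flops(128):,} = {ratio:.2f}x")
# → 537,600 / 236,032 = 2.28x
```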

The "Correct" callout itself was already printing ~{_actual_256:.1f}x = ~2.3x while saying "Correct" for 4x -- a visible contradiction. The MathPeek also said "(not 2x!)" when it should say "(not 4x!)".

Changes

  • Correct answer: `"4x"` → `"2x"`
  • "2x" callout: now shows "Correct" with accurate explanation
  • "4x" callout: now shows why it's wrong (quadratic layer alone is 4x, but total is ~2.3x)
  • MathPeek note: `(not 2x!)` → `(not 4x!)`
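It is worth noting why the "4x" intuition is still pedagogically interesting: as hidden width grows, the quadratic 2*W^2 term dominates and the doubling ratio approaches 4x; the lab's W=128 just isn't in that regime yet. A minimal sketch (helper name `mlp_flops` is hypothetical, per the FLOP formula above):

```python
def mlp_flops(w, d_in=784, d_out=10):
    # 2 FLOPs per multiply-add, three dense layers: 784 -> W -> W -> 10
    return 2 * d_in * w + 2 * w * w + 2 * w * d_out

# Doubling ratio climbs toward 4x as the 2*W^2 term takes over
for w in (128, 1024, 8192):
    print(f"W={w}: doubling ratio = {mlp_flops(2 * w) / mlp_flops(w):.2f}x")
```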

Test plan

  • Open lab 05 Part C, confirm the live ratio displays ~2.3x
  • Select "A) 2x" -- confirm "Correct" callout appears
  • Select "C) ~4x" -- confirm correction callout explains the ~2.3x actual value

🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2026-05-03 01:29:43 -05:00

Reference: github-starred/cs249r_book#9236