[PR #1336] [MERGED] test(tinytorch): add finite-difference gradient correctness tests for Module 06 #5135

Closed
opened 2026-04-19 12:49:53 -05:00 by GiteaMirror · 0 comments

📋 Pull Request Information

Original PR: https://github.com/harvard-edge/cs249r_book/pull/1336
Author: @Shashank-Tripathi-07
Created: 4/16/2026
Status: Merged
Merged: 4/16/2026
Merged by: @profvjreddi

Base: `dev` ← Head: `test/autograd-gradient-correctness`


📝 Commits (2)

  • 19c7e69 test(tinytorch): add finite-difference gradient correctness tests for Module 06
  • 2b7ff80 fix(tests/06_autograd): rewrite gradient correctness tests to pass CI

📊 Changes

1 file changed (+392 additions, -0 deletions)

View changed files

tinytorch/tests/06_autograd/test_gradient_correctness.py (+392 -0)

📄 Description

Summary

Previously, no backward pass in TinyTorch was verified numerically: existing tests only checked that `param.grad` was non-None or non-zero, so a subtly wrong gradient would pass the tests and silently push training in the wrong direction.

This PR adds a dedicated `test_gradient_correctness.py` using central finite differences as ground truth:

```
df/dx ~= (f(x + e) - f(x - e)) / (2e)
```

If the analytical gradient from `backward()` disagrees, the test fails.
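For reference, a minimal NumPy sketch of this checking pattern. The helper name, epsilon, and tolerances below are illustrative choices, not TinyTorch's actual test code; the real tests make the same comparison against the gradient produced by `backward()`:

```python
import numpy as np

def numerical_grad(f, x, eps=1e-5):
    """Central-difference estimate of df/dx for a scalar-valued f."""
    grad = np.zeros_like(x)
    it = np.nditer(x, flags=["multi_index"])
    while not it.finished:
        idx = it.multi_index
        orig = x[idx]
        x[idx] = orig + eps
        f_plus = f(x)
        x[idx] = orig - eps
        f_minus = f(x)
        x[idx] = orig                     # restore the perturbed entry
        grad[idx] = (f_plus - f_minus) / (2 * eps)
        it.iternext()
    return grad

# Sanity check against a known analytical gradient: f(x) = sum(x^2), df/dx = 2x.
x = np.random.randn(3, 4)
estimate = numerical_grad(lambda t: float((t ** 2).sum()), x)
assert np.allclose(estimate, 2 * x, rtol=1e-4, atol=1e-6)
```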

Coverage

| Category | Tests |
|----------|-------|
| Arithmetic | add, sub, mul, div, matmul (both operands), broadcast add, chained ops |
| Activations | ReLU (incl. zero boundary), Sigmoid, Tanh, GELU |
| Losses | MSELoss, MSELoss batch, BinaryCrossEntropyLoss, CrossEntropyLoss |
| Composed graphs | Linear weight grad, Linear bias grad, two-layer chain, MSE-through-Linear end-to-end |
| Accumulation | gradient accumulates correctly across two backward calls (see the sketch after this table) |
| no_grad() | disables tracking inside the context, does not affect tensors outside it (see the sketch after this table) |
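The accumulation and no_grad() rows assert semantics that mirror PyTorch's. As a reference for what those tests check, here is a minimal PyTorch illustration (an assumption about equivalent behavior, not TinyTorch code):

```python
import torch

# Accumulation: a second backward() adds into .grad instead of overwriting it.
x = torch.ones(3, requires_grad=True)
(x ** 2).sum().backward()   # d/dx sum(x^2) = 2x -> grad = [2, 2, 2]
(x ** 2).sum().backward()   # accumulates        -> grad = [4, 4, 4]
assert torch.allclose(x.grad, 4 * torch.ones(3))

# no_grad(): ops inside the context are untracked; tracking resumes outside.
with torch.no_grad():
    y = x * 2
assert not y.requires_grad   # untracked inside the context
z = x * 2
assert z.requires_grad       # unaffected outside the context
```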

Files

| File | Change |
|------|--------|
| `tinytorch/tests/06_autograd/test_gradient_correctness.py` | New: 393 lines, 32 tests |

🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2026-04-19 12:49:53 -05:00

Reference: github-starred/cs249r_book#5135