mirror of
https://github.com/harvard-edge/cs249r_book.git
synced 2026-05-06 17:49:07 -05:00
[PR #1336] [MERGED] test(tinytorch): add finite-difference gradient correctness tests for Module 06 #7282
📋 Pull Request Information
Original PR: https://github.com/harvard-edge/cs249r_book/pull/1336
Author: @Shashank-Tripathi-07
Created: 4/16/2026
Status: ✅ Merged
Merged: 4/16/2026
Merged by: @profvjreddi
Base: dev ← Head: test/autograd-gradient-correctness

📝 Commits (2)
19c7e69 test(tinytorch): add finite-difference gradient correctness tests for Module 06
2b7ff80 fix(tests/06_autograd): rewrite gradient correctness tests to pass CI

📊 Changes
1 file changed (+392 additions, -0 deletions)
View changed files
➕ tinytorch/tests/06_autograd/test_gradient_correctness.py (+392 -0)

📄 Description
Summary
Previously, no backward pass in TinyTorch was verified numerically. Existing tests only checked that param.grad was non-None or non-zero, so a subtly wrong gradient would train silently in the wrong direction. This PR adds a dedicated test_gradient_correctness.py that uses central finite differences as ground truth: if the analytical gradient from backward() disagrees with the numerical estimate, the test fails.

Coverage
Files
tinytorch/tests/06_autograd/test_gradient_correctness.py

🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.
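The central finite-difference check described in the PR can be sketched roughly as follows. This is a minimal illustration, not the PR's actual code: the helper name `numerical_grad` and the scalar test function are hypothetical, and the real tests exercise TinyTorch's backward() rather than a hand-written analytical gradient.

```python
import numpy as np

def numerical_grad(f, x, eps=1e-5):
    """Central finite-difference estimate of df/dx, elementwise.

    f: function mapping an ndarray to a scalar loss.
    x: ndarray of parameters (perturbed in place, then restored).
    """
    grad = np.zeros_like(x)
    flat_x = x.ravel()   # view into x, so perturbations affect f(x)
    flat_g = grad.ravel()
    for i in range(flat_x.size):
        orig = flat_x[i]
        flat_x[i] = orig + eps
        f_plus = f(x)
        flat_x[i] = orig - eps
        f_minus = f(x)
        flat_x[i] = orig          # restore the original value
        flat_g[i] = (f_plus - f_minus) / (2 * eps)
    return grad

# Example: for f(x) = sum(x**2), the analytical gradient is 2*x.
x = np.array([1.0, -2.0, 3.0])
num = numerical_grad(lambda v: float((v ** 2).sum()), x)
ana = 2 * x
# A test in this style would fail if the two disagree:
assert np.allclose(num, ana, rtol=1e-4, atol=1e-6)
```

In the actual tests, `ana` would come from calling backward() on the framework's computation graph; the finite-difference estimate serves as the independent ground truth the description refers to.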