mirror of
https://github.com/harvard-edge/cs249r_book.git
synced 2026-05-07 02:03:55 -05:00
[PR #1114] [MERGED] fix: initialize parameter's gradient after creating Optimizer object #1124
📋 Pull Request Information
Original PR: https://github.com/harvard-edge/cs249r_book/pull/1114
Author: @minhdang26403
Created: 1/19/2026
Status: ✅ Merged
Merged: 1/19/2026
Merged by: @profvjreddi
Base: dev ← Head: fix/optimizer-integration-test
📝 Commits (1)
8a8936f fix: initialize parameter's gradient after creating Optimizer object
📊 Changes
1 file changed (+16 additions, -15 deletions)
tinytorch/tests/07_optimizers/test_progressive_integration.py (+16 -15)
📄 Description
Currently, three tests in the Optimizers module fail.
The root cause is that the test code initializes each parameter's gradient before passing the parameter to the Optimizer constructor, but the Optimizer's init method resets every parameter's gradient to None, discarding the value the test just set. The fix is to initialize the parameter's gradient after the Optimizer object has been created.
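The ordering issue described above can be sketched as follows. The `Parameter` and `Optimizer` classes here are hypothetical stand-ins, not tinytorch's actual API; the only behavior assumed is the one the PR describes, namely that the optimizer's constructor resets gradients to None.

```python
# Minimal sketch of the bug and fix, assuming only that the Optimizer
# constructor clears gradients (as stated in the PR description).
# Parameter and Optimizer are illustrative stand-ins, not tinytorch's API.

class Parameter:
    def __init__(self, value):
        self.value = value
        self.grad = None  # gradient buffer, filled in by backprop or tests

class Optimizer:
    def __init__(self, params):
        self.params = params
        # Constructor resets any stale gradients.
        for p in self.params:
            p.grad = None

# Buggy order: the gradient set here is wiped by Optimizer.__init__.
p = Parameter(1.0)
p.grad = 0.5
opt = Optimizer([p])
assert p.grad is None  # the test's gradient was silently discarded

# Fixed order: initialize the gradient AFTER creating the optimizer.
p = Parameter(1.0)
opt = Optimizer([p])
p.grad = 0.5
assert p.grad == 0.5  # gradient survives and is visible to the optimizer
```

This is why the patched tests move gradient initialization to after the optimizer is constructed, rather than changing the optimizer itself.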
🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.