[PR #1114] [MERGED] fix: initialize parameter's gradient after creating Optimizer object #1124

Closed
opened 2026-03-22 16:01:18 -05:00 by GiteaMirror · 0 comments

📋 Pull Request Information

Original PR: https://github.com/harvard-edge/cs249r_book/pull/1114
Author: @minhdang26403
Created: 1/19/2026
Status: Merged
Merged: 1/19/2026
Merged by: @profvjreddi

Base: dev ← Head: fix/optimizer-integration-test


📝 Commits (1)

  • 8a8936f (https://github.com/harvard-edge/cs249r_book/commit/8a8936f22544979d31eb6d15dacb2f9806bf48eb) fix: initialize parameter's gradient after creating Optimizer object

📊 Changes

1 file changed (+16 additions, -15 deletions)

View changed files

📝 tinytorch/tests/07_optimizers/test_progressive_integration.py (+16 -15)

📄 Description

Currently, there are three tests that fail in the Optimizers module:

  • test_adam_optimizer_exists
  • test_sgd_optimizer_exists
  • test_zero_grad

The root cause is that the test code initializes a parameter's gradient before passing the parameter to the Optimizer constructor, but the Optimizer's `__init__` method resets each parameter's gradient to None, wiping out the pre-set value. The fix is to initialize the gradients after constructing the optimizer.
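To illustrate the ordering issue, here is a minimal, hypothetical sketch of the failing pattern and the corrected one. The `Tensor` and `SGD` classes below are simplified stand-ins assumed for illustration, not the actual TinyTorch implementations:

```python
import numpy as np

# Simplified stand-ins (assumptions, not the real TinyTorch classes).
class Tensor:
    def __init__(self, data):
        self.data = np.asarray(data, dtype=np.float32)
        self.grad = None

class SGD:
    def __init__(self, params, lr=0.01):
        self.params = list(params)
        self.lr = lr
        # The optimizer clears any pre-existing gradients on construction.
        for p in self.params:
            p.grad = None

param = Tensor([1.0, 2.0])

# Before (failing): the gradient is set first, then wiped by SGD.__init__.
param.grad = np.array([0.1, 0.2], dtype=np.float32)
opt = SGD([param])
assert param.grad is None  # the pre-set gradient is gone

# After (fixed): construct the optimizer first, then set the gradient.
opt = SGD([param])
param.grad = np.array([0.1, 0.2], dtype=np.float32)
assert param.grad is not None  # the gradient survives, so the tests pass
```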


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

Reference: github-starred/cs249r_book#1124