mirror of
https://github.com/harvard-edge/cs249r_book.git
synced 2026-05-06 09:38:33 -05:00
[PR #1338] [MERGED] fix(tests/10_tokenization): replace raw numpy array params with Tensor in DummyModel #5137
📋 Pull Request Information
Original PR: https://github.com/harvard-edge/cs249r_book/pull/1338
Author: @Shashank-Tripathi-07
Created: 4/16/2026
Status: ✅ Merged
Merged: 4/16/2026
Merged by: @profvjreddi
Base: dev ← Head: fix/tokenization-dummy-model-numpy-params
📝 Commits (3)
- 19c7e69 test(tinytorch): add finite-difference gradient correctness tests for Module 06
- 2b7ff80 fix(tests/06_autograd): rewrite gradient correctness tests to pass CI
- 96e672b fix(tests/10_tokenization): replace raw numpy array params with Tensor in DummyModel
📊 Changes
2 files changed (+397 additions, -1 deletions)
View changed files
➕ tinytorch/tests/06_autograd/test_gradient_correctness.py (+392, −0)
📝 tinytorch/tests/10_tokenization/test_10_tokenization_progressive.py (+5, −1)
📄 Description
Summary
`DummyModel.parameters()` in `test_progressive_stability` was returning `[np.array([1.0])]`. Any `Trainer.__init__` that iterates `model.parameters()` and accesses `param.requires_grad` crashes with:
```
AttributeError: 'numpy.ndarray' object has no attribute 'requires_grad'
```
This is a latent bug in the test itself: every other test stub in this file already uses `Tensor(np.array(...), requires_grad=True)` for parameters. `DummyModel` was the one outlier.
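The failure mode can be reproduced in isolation. Below is a minimal, hypothetical reconstruction of the buggy stub (the real `DummyModel` lives in the tinytorch test suite; the loop stands in for any trainer that inspects `requires_grad`):

```python
import numpy as np

# Hypothetical reconstruction of the buggy test stub for illustration.
class DummyModel:
    def parameters(self):
        # BUG: returns a raw numpy array instead of a Tensor
        return [np.array([1.0])]

# Any trainer-style loop that inspects requires_grad will crash here,
# because numpy arrays carry no such attribute.
model = DummyModel()
error_message = None
try:
    for param in model.parameters():
        _ = param.requires_grad
except AttributeError as exc:
    error_message = str(exc)

print(error_message)
# → 'numpy.ndarray' object has no attribute 'requires_grad'
```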
Fix
Give `DummyModel` a proper `Tensor` parameter with `requires_grad=True`, consistent with the rest of the test file and with what `Trainer` actually expects.
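The shape of the fix can be sketched as follows. The `Tensor` class here is a minimal stand-in, not tinytorch's real implementation; the only contract that matters for this bug is that parameters expose a `requires_grad` flag:

```python
import numpy as np

# Minimal Tensor stand-in for illustration only; tinytorch's real Tensor
# carries gradient machinery, but the relevant contract here is the
# requires_grad attribute that trainer initialization reads.
class Tensor:
    def __init__(self, data, requires_grad=False):
        self.data = np.asarray(data, dtype=float)
        self.requires_grad = requires_grad

class DummyModel:
    def parameters(self):
        # Fixed: wrap the parameter in a Tensor, consistent with the
        # other test stubs in the file.
        return [Tensor(np.array([1.0]), requires_grad=True)]

# The trainer-style inspection that previously crashed now succeeds.
flags = [param.requires_grad for param in DummyModel().parameters()]
print(flags)  # → [True]
```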
Files
🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.