mirror of
https://github.com/harvard-edge/cs249r_book.git
synced 2026-05-07 02:03:55 -05:00
[GH-ISSUE #1341] tanh missing from enable_autograd() wiring #5720
Originally created by @profvjreddi on GitHub (Apr 16, 2026).
Original GitHub issue: https://github.com/harvard-edge/cs249r_book/issues/1341
Just noticed that tanh doesn't track gradients: if I do Tanh()(x) with x.requires_grad=True, x.grad is None after backward().
Looked at enable_autograd() in src/06_autograd/06_autograd.py ... sigmoid, relu, gelu, softmax, mse, bce, and crossentropy are all in the patch list, but tanh is missing. Also, there's no TanhBackward class anywhere.
Came up while looking at #1336.
The fix is pretty mechanical: add a TanhBackward node and wire tanh into the enable_autograd() patch list alongside the other activations.
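For illustration, here is a minimal standalone sketch of what the backward node could look like. The class and method names (TanhBackward, backward) are assumptions mirroring the naming pattern described above; the real fix would follow whatever convention SigmoidBackward uses in src/06_autograd/06_autograd.py. The key fact is the derivative identity d/dx tanh(x) = 1 - tanh(x)^2, which lets the backward pass reuse the cached forward output.

```python
import math


class TanhBackward:
    """Hypothetical backward node for tanh.

    Uses d/dx tanh(x) = 1 - tanh(x)**2, expressed in terms of the
    cached forward output so x itself need not be saved.
    """

    def __init__(self, out):
        # Cache the forward result tanh(x); the derivative only needs it.
        self.out = out

    def backward(self, grad_output):
        # Chain rule: upstream grad times local derivative.
        return grad_output * (1.0 - self.out ** 2)


def tanh_forward(x):
    """Forward pass that also builds the backward node (sketch)."""
    out = math.tanh(x)
    return out, TanhBackward(out)


if __name__ == "__main__":
    y, node = tanh_forward(0.5)
    grad = node.backward(1.0)
    # Should match 1 - tanh(0.5)**2 computed directly.
    print(y, grad)
```

Wiring tanh into the patch list in enable_autograd() would then be a one-line change, analogous to the existing sigmoid entry.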