Mirror of https://github.com/harvard-edge/cs249r_book.git (synced 2026-05-08 09:57:21 -05:00)
[PR #1444] [MERGED] fix(tinytorch): constant tensor silently zeroed after quantize/dequantize roundtrip #10055
📋 Pull Request Information
Original PR: https://github.com/harvard-edge/cs249r_book/pull/1444
Author: @Shashank-Tripathi-07
Created: 4/22/2026
Status: ✅ Merged
Merged: 4/22/2026
Merged by: @profvjreddi
Base: dev ← Head: fix/tinytorch-quantize-constant-tensor-zeroed

📝 Commits (1)

20f812f fix(tinytorch): constant tensor quantized to all-zeros, losing original value

📊 Changes
1 file changed (+22 additions, -3 deletions)
📝 tinytorch/src/15_quantization/15_quantization.py (+22 -3)

📄 Description
What this fixes
`quantize_int8()` in `src/15_quantization/15_quantization.py` has a special case for constant tensors (all elements equal, so `max == min`). The guard exists to avoid division by zero when computing `scale`. It sets `scale = 1.0` and `zero_point = 0`, then returns an all-zeros INT8 tensor.

On dequantization the formula is `(quantized - zero_point) * scale`. With `quantized = 0`, `zero_point = 0`, `scale = 1.0` that gives `0.0` for every element -- the original constant value is gone.

Any weight tensor that happens to be uniform (e.g. a bias layer initialised to a constant, or a frozen embedding) is silently zeroed out after quantization. No error is raised.
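The PR body does not inline the original code, so here is a minimal sketch of the failure mode. The guard's behaviour (`scale = 1.0`, `zero_point = 0`, all-zeros payload) is taken from the description above; the general quantization path and NumPy standing in for TinyTorch's tensor type are assumptions for illustration, not the repository's exact code.

```python
import numpy as np

def quantize_int8_buggy(tensor):
    """Pre-fix behaviour: the constant-tensor guard loses the value."""
    min_val, max_val = float(tensor.min()), float(tensor.max())
    if max_val == min_val:
        # Division-by-zero guard as described in the PR: scale=1.0,
        # zero_point=0, all-zeros payload -- nothing encodes the constant.
        return np.zeros_like(tensor, dtype=np.int8), 1.0, 0
    # Ordinary asymmetric INT8 path (illustrative assumption):
    scale = (max_val - min_val) / 255.0
    zero_point = -128 - int(round(min_val / scale))
    q = np.clip(np.round(tensor / scale) + zero_point, -128, 127)
    return q.astype(np.int8), scale, zero_point

bias = np.full(4, 5.0, dtype=np.float32)      # uniform tensor, e.g. a constant bias
q, scale, zp = quantize_int8_buggy(bias)
print((q.astype(np.float32) - zp) * scale)    # [0. 0. 0. 0.] -- value silently lost
```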
Root cause
The contributor correctly avoided the division by zero when `max == min`, but forgot that `zero_point` must encode the constant so dequantization can recover it. Since the quantized payload is all zeros, the invariant is:

`(0 - zero_point) * scale == min_val`, i.e. `zero_point = -min_val / scale`

With `scale = 1.0` this simplifies to `zero_point = round(-min_val)`, clamped to `[-128, 127]`. For example, a tensor filled with `5.0` gets `zero_point = -5`, and dequantization yields `(0 - (-5)) * 1.0 = 5.0`.

Fix
Dequantization now correctly recovers the constant:
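A sketch of the repaired guard, under the same assumptions as above (the actual diff is in the changed file; only the `zero_point` derivation is taken from the PR text):

```python
import numpy as np

def quantize_int8_fixed(tensor):
    """Post-fix constant-tensor guard (sketch; general path as above)."""
    min_val, max_val = float(tensor.min()), float(tensor.max())
    if max_val == min_val:
        scale = 1.0
        # Encode the constant so dequantization can recover it:
        # (0 - zero_point) * scale == min_val  =>  zero_point = round(-min_val),
        # clamped to the INT8 range.
        zero_point = int(np.clip(round(-min_val), -128, 127))
        return np.zeros_like(tensor, dtype=np.int8), scale, zero_point
    scale = (max_val - min_val) / 255.0
    zero_point = -128 - int(round(min_val / scale))
    q = np.clip(np.round(tensor / scale) + zero_point, -128, 127)
    return q.astype(np.int8), scale, zero_point

for const in (5.0, -3.0):
    t = np.full(4, const, dtype=np.float32)
    q, scale, zp = quantize_int8_fixed(t)
    print((q.astype(np.float32) - zp) * scale)   # [5. 5. 5. 5.], then [-3. -3. -3. -3.]
```

Note that with `scale = 1.0` and an integer, clamped `zero_point`, non-integer constants are recovered only to the nearest integer and constants outside `[-127, 128]` are clamped; the sketch uses integer-valued constants so the roundtrip is exact.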
Test gap closed
The existing unit test only asserted `scale_const == 1.0` and never verified the roundtrip, which is why the bug survived undetected. The updated test explicitly dequantizes and asserts value recovery for both positive and negative constants.
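A roundtrip test in the spirit of the one described above might look like this (names and constants are illustrative; it reuses the `quantize_int8_fixed` sketch from the Fix section, not the repository's actual test):

```python
import numpy as np

def test_constant_roundtrip():
    for const in (7.0, -42.0):                     # positive and negative constants
        t = np.full(8, const, dtype=np.float32)
        q, scale, zero_point = quantize_int8_fixed(t)
        assert scale == 1.0                        # the old test stopped here
        restored = (q.astype(np.float32) - zero_point) * scale
        np.testing.assert_allclose(restored, t)    # new: the value must survive

test_constant_roundtrip()
```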