Fix all remaining modules to prevent test execution on import

Wrapped test code in if __name__ == '__main__': guards for:
- Module 02 (activations): 7 test calls protected
- Module 03 (layers): 7 test calls protected
- Module 04 (losses): 10 test calls protected
- Module 05 (autograd): 7 test calls protected
- Module 06 (optimizers): 8 test calls protected
- Module 07 (training): 7 test calls protected
- Module 09 (spatial): 5 test calls protected

Impact:
- All modules can now be imported cleanly without test execution
- Tests still run when modules are executed directly
- Clean dependency chain throughout the framework
- Follows Python best practices for module structure

This completes the fix for the entire module system. Modules can now
properly import from each other without triggering test code execution.
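The guard pattern applied here is the standard Python idiom for separating import-time and script-time behavior. A minimal sketch with a hypothetical module (the `add` / `test_unit_add` names below are illustrative, not from this repository):

```python
# mymodule.py -- illustrative only; mirrors the guard pattern used in this commit
def add(a, b):
    """Library code that other modules import."""
    return a + b

def test_unit_add():
    """Inline unit test, in the same style as the module tests above."""
    assert add(2, 3) == 5
    print("✅ add works correctly!")

if __name__ == "__main__":
    # This block runs only via `python mymodule.py`;
    # `import mymodule` skips it entirely.
    test_unit_add()
```

With the guard in place, `import mymodule` binds `add` and `test_unit_add` without printing anything, while running the file directly still executes the test.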
Author: Vijay Janapa Reddi
Date: 2025-09-30 06:40:45 -04:00
parent 64fb1ae730
commit 682801f7bc
7 changed files with 139 additions and 62 deletions


@@ -345,8 +345,6 @@ def test_unit_optimizer_base():
     print("✅ Base Optimizer works correctly!")
-test_unit_optimizer_base()
 # %% [markdown]
 """
 ## SGD - Stochastic Gradient Descent
@@ -562,8 +560,6 @@ def test_unit_sgd_optimizer():
     print("✅ SGD optimizer works correctly!")
-test_unit_sgd_optimizer()
 # %% [markdown]
 """
 ## Adam - Adaptive Moment Estimation
@@ -807,8 +803,6 @@ def test_unit_adam_optimizer():
     print("✅ Adam optimizer works correctly!")
-test_unit_adam_optimizer()
 # %% [markdown]
 """
 ## AdamW - Adam with Decoupled Weight Decay
@@ -1045,8 +1039,6 @@ def test_unit_adamw_optimizer():
     print("✅ AdamW optimizer works correctly!")
-test_unit_adamw_optimizer()
 # %% [markdown]
 """
 ## 4. Integration: Bringing It Together
@@ -1129,8 +1121,6 @@ def demonstrate_optimizer_integration():
     print("- Adam: Smaller, adaptive steps")
     print("- AdamW: Similar to Adam but with weight decay effects")
-demonstrate_optimizer_integration()
 # %% [markdown]
 """
 ## 5. Systems Analysis: Optimizer Performance and Memory
@@ -1214,8 +1204,6 @@ def analyze_optimizer_memory_usage():
     print("- Memory scales linearly with model size")
     print("- Trade-off: More memory for better convergence")
-analyze_optimizer_memory_usage()
 # %% nbgrader={"grade": false, "grade_id": "optimizer-convergence", "solution": true}
 def analyze_optimizer_convergence_behavior():
     """📊 Analyze convergence behavior of different optimizers."""
@@ -1282,8 +1270,6 @@ def analyze_optimizer_convergence_behavior():
     print("- Adam: Adaptive rates help with different parameter scales")
     print("- AdamW: Similar to Adam with regularization effects")
-analyze_optimizer_convergence_behavior()
 # %% [markdown]
 """
 ## 🧪 Module Integration Test
@@ -1421,12 +1407,26 @@ def test_module():
     print("🎉 ALL TESTS PASSED! Module ready for export.")
     print("Run: tito module complete 06_optimizers")
-test_module()
+# %%
+if __name__ == "__main__":
+    print("🚀 Running Optimizers module...")
+    # Run all unit tests
+    test_unit_optimizer_base()
+    test_unit_sgd_optimizer()
+    test_unit_adam_optimizer()
+    test_unit_adamw_optimizer()
+    # Run integration demonstrations
+    demonstrate_optimizer_integration()
+    # Run analysis functions
+    analyze_optimizer_memory_usage()
+    analyze_optimizer_convergence_behavior()
+    # Run final module test
+    test_module()
+    print("✅ Module validation complete!")
 # %% [markdown]