Commit Graph

11 Commits

Author SHA1 Message Date
Vijay Janapa Reddi
34a59e2064 Fix module test execution issues
- Fixed test functions so they only run when modules are executed directly
- Added proper __name__ == '__main__' guards to all test calls (see the sketch below)
- Fixed syntax errors from incorrect replacements in Modules 13 and 15
- Modules now import cleanly without executing tests
- ProductionBenchmarkingProfiler (Module 14) and ProductionMLSystemProfiler (Module 16) are fully working
- Other profiler classes are present but require a full NumPy environment for complete testing
2025-09-16 00:17:32 -04:00
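
A minimal sketch of the guard pattern described in the commit above, assuming a hypothetical dev module; the matmul helper and file name are illustrative, and only the test name is taken from this history:

    # layers_dev.py -- illustrative layout, not the actual repository file
    import numpy as np

    def matmul(a, b):
        # Toy matrix multiply so the test has something to exercise.
        return a @ b

    def test_unit_matrix_multiplication():
        a = np.array([[1.0, 2.0], [3.0, 4.0]])
        assert np.allclose(matmul(a, np.eye(2)), a)
        print("test_unit_matrix_multiplication passed")

    # The guard keeps importing this module side-effect free; the test runs
    # only when the file is executed directly (python layers_dev.py).
    if __name__ == '__main__':
        test_unit_matrix_multiplication()

Importing the module then only defines the functions, matching the "import without executing tests" behavior the commit describes.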
Vijay Janapa Reddi
c33f62ca79 Updates markdown headers in development files
Updates markdown headers in development files to improve consistency and readability.

Removes the redundant "🔧 DEVELOPMENT" headers and standardizes the subsequent headers to indicate the purpose of the following code, such as "🧪 Test Your Matrix Multiplication". This change enhances the clarity and organization of the development files.
2025-07-20 17:36:32 -04:00
Vijay Janapa Reddi
a2788db356 Add section organization to 07_attention module: Add DEVELOPMENT section header
- Insert ## 🔧 DEVELOPMENT header before first test function
- Organizes module according to educational structure guidelines
- Maintains all existing functionality and test execution
- Improves readability and navigation for educational use
2025-07-20 14:04:31 -04:00
Vijay Janapa Reddi
cc9cdee97d Deprecate AUTO TESTING: Remove run_module_tests_auto from all _dev.py modules. Standardize on full-module test execution for reliable, context-aware testing. 2025-07-20 13:28:10 -04:00
Vijay Janapa Reddi
ede665e2dc Simplify plot handling - remove _should_show_plots functions and plot guards 2025-07-20 12:47:14 -04:00
Vijay Janapa Reddi
b9b9637fa1 Standardize section headers for 07_attention module 2025-07-20 12:28:32 -04:00
Vijay Janapa Reddi
6fef27be01 Separate plot functions from test execution 2025-07-20 12:19:39 -04:00
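
A minimal sketch of the separation that commit describes, assuming matplotlib and a hypothetical ReLU helper; the point is that the test makes assertions only, while plotting lives in its own function that tests never call:

    import numpy as np
    import matplotlib.pyplot as plt

    def relu(x):
        return np.maximum(0, x)

    def test_unit_relu():
        # Assertion-only test: no figures are created, so it runs headless.
        x = np.array([-1.0, 0.0, 2.0])
        assert np.allclose(relu(x), [0.0, 0.0, 2.0])
        print("test_unit_relu passed")

    def plot_relu():
        # Plotting is isolated in its own helper and invoked explicitly,
        # never from test execution.
        x = np.linspace(-3.0, 3.0, 100)
        plt.plot(x, relu(x))
        plt.title("ReLU")
        plt.show()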
Vijay Janapa Reddi
b6e4c2ad1a 🧪 Add missing test function calls in 07_attention module
- Added test_unit_attention_mechanism() call after function definition
- Added test_unit_self_attention_wrapper() call after function definition
- Added test_unit_masking_utilities() call after function definition

Ensures all test functions are executed when cells run, providing immediate feedback to students.
2025-07-20 10:27:34 -04:00
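
A minimal sketch of the define-then-call pattern from that commit, using one of the listed test names; the masking helper itself is hypothetical:

    import numpy as np

    def attention_mask(size):
        # Hypothetical causal-mask helper standing in for the module's utilities.
        return np.triu(np.ones((size, size)), k=1).astype(bool)

    def test_unit_masking_utilities():
        mask = attention_mask(3)
        assert mask.shape == (3, 3) and mask[0, 1] and not mask[1, 0]
        print("test_unit_masking_utilities passed")

    test_unit_masking_utilities()  # called right after the definition -> immediate feedback

The later commit at the top of this graph (34a59e2064) wraps such calls in __name__ == '__main__' guards so that plain imports stay side-effect free.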
Vijay Janapa Reddi
100a683790 Add structural organization headers to 07_attention module
- Added ## 🔧 DEVELOPMENT section before Step 1 where development begins
- Added ## 🤖 AUTO TESTING section before auto testing block
- Updated the module summary header to ## 🎯 MODULE SUMMARY: Attention Mechanisms

Improves notebook organization without changing any code logic or content.
2025-07-20 10:00:33 -04:00
Vijay Janapa Reddi
dfad756278 🧠 Core ML: Standardize test naming in neural network building blocks
- Activations: test_integration_* → test_module_* (module dependency tests)
- Layers: test_matrix_multiplication → test_unit_matrix_multiplication
- Layers: test_dense_layer → test_unit_dense_layer
- Layers: test_layer_activation → test_unit_layer_activation
- Dense: test_integration_* → test_module_* (module dependency tests)
- Spatial: test_integration_* → test_module_* (module dependency tests)
- Attention: test_integration_* → test_module_* (module dependency tests)
- Establishes the unit vs. module test distinction for neural network components (sketched below)
2025-07-20 08:39:00 -04:00
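
A minimal sketch of that naming distinction, assuming test_unit_* checks a single component in isolation while test_module_* exercises behavior that spans components; the function bodies and the test_module_dense_networks name are illustrative:

    import numpy as np

    def dense(x, w, b):
        # Toy dense layer used by both tests.
        return x @ w + b

    def test_unit_dense_layer():
        # Unit test: one component, verified in isolation.
        x, w, b = np.ones((1, 3)), np.zeros((3, 2)), np.ones(2)
        assert np.allclose(dense(x, w, b), np.ones((1, 2)))
        print("test_unit_dense_layer passed")

    def test_module_dense_networks():
        # Module-level test: stacks components the way a dependent module would.
        h = dense(np.ones((1, 3)), np.ones((3, 3)), np.zeros(3))
        y = dense(h, np.ones((3, 1)), np.zeros(1))
        assert y.shape == (1, 1)
        print("test_module_dense_networks passed")

    if __name__ == '__main__':
        test_unit_dense_layer()
        test_module_dense_networks()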
Vijay Janapa Reddi
59d58718f9 refactor: Implement learner-focused module progression with better naming
Renamed modules for clearer pedagogical flow:
- 05_networks → 05_dense (multi-layer dense/fully connected networks)
- 06_cnn → 06_spatial (convolutional networks for spatial patterns)
- 06_attention → 07_attention (attention mechanisms for sequences)

Shifted remaining modules down by 1:
- 07_dataloader → 08_dataloader
- 08_autograd → 09_autograd
- 09_optimizers → 10_optimizers
- 10_training → 11_training
- 11_compression → 12_compression
- 12_kernels → 13_kernels
- 13_benchmarking → 14_benchmarking
- 14_mlops → 15_mlops
- 15_capstone → 16_capstone

Updated module metadata (module.yaml files):
- Updated names, descriptions, dependencies
- Fixed prerequisite chains and enables relationships
- Updated export paths to match new names

New learner progression:
Foundation → Individual Layers → Dense Networks → Spatial Networks → Attention Networks → Training Pipeline

Perfect pedagogical flow: Build one layer → Stack dense layers → Add spatial patterns → Add attention mechanisms → Learn to train them all.
2025-07-18 00:12:50 -04:00