- Delete bin/py_to_notebook.py and tito/tools/py_to_notebook.py
- Update notebooks command to use Jupytext directly
- Jupytext is already configured in all *_dev.py files
- Simpler, more standard workflow using established tools
- Better integration with NBDev ecosystem
Benefits:
- Eliminates duplicate conversion tools
- Uses industry-standard Jupytext instead of custom tool
- Reduces maintenance burden
- Better error handling and compatibility
- Remove version field from all module.yaml files
- Update template generator to exclude version field
- Further simplify metadata to focus on system information only
- Status remains dynamically determined by test results
- Reduce module.yaml files from 100+ lines to ~25 lines focused on system needs
- Remove pedagogical details (learning objectives, difficulty, time estimates)
- Keep only essential fields: name, title, description, status, dependencies, exports, files, components
- Update status command to work with simplified metadata format
- Update metadata generation script to create simplified templates
- Focus on system metadata for CLI tools and build systems, not educational content
Before: Verbose pedagogical metadata with 20+ fields
After: Concise system metadata with 8 core fields
This aligns with the principle that module.yaml should be for systems, not pedagogy.
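A module.yaml trimmed to the eight core fields might look like this (field names come from the commit; all values are illustrative, not taken from a real module file):

```yaml
name: tensor
title: Tensor
description: Core Tensor class with data handling, arithmetic, and properties.
status: complete
dependencies: [setup]
exports: tinytorch.core.tensor
files:
  dev: tensor_dev.py
  tests: tests/test_tensor.py
components:
  - Tensor
```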
- Changed all module titles to be short and clean (e.g., 'Autograd' not 'Autograd - Automatic Differentiation')
- Updated metadata generation template to use concise titles by default
- Fixed CLI reference in metadata generator to use new hierarchical structure
- Titles are now consistent: just the module name capitalized
- Detailed descriptions remain in the description field where they belong
- Add module.yaml files for setup, tensor, activations, layers, and autograd modules
- Enhanced tito status command with --metadata flag for rich information display
- Created metadata schema with learning objectives, dependencies, components, and more
- Added metadata generation script (bin/generate_module_metadata.py)
- Comprehensive documentation in docs/development/module-metadata-system.md
- Status command now shows module status, difficulty, time estimates, and detailed metadata
- Supports dependency tracking, component-level status, and educational information
- Enables rich CLI experience with structured module information
- Rename modules/data/ → modules/dataloader/
- Rename data_dev.py → dataloader_dev.py
- Update NBDev export target: core.data → core.dataloader
- Rename test files: test_data.py → test_dataloader.py
- Update package exports to tinytorch.core.dataloader
- Update module imports and internal references
This makes the module name more descriptive and aligned with ML industry standards.
- Added pytest-timeout configuration with 5-minute timeout for all tests
- Added timeout handling to test command with proper error messages
- Created small local test dataset (50 train + 20 test samples) that mimics CIFAR-10 structure
- Updated data module tests to use local test data instead of downloading CIFAR-10
- Tests now run much faster (~0.1s vs ~30s) and don't require internet connection
- Added TestCIFAR10Dataset class that loads from local pickle files
- All test functionality preserved but using local data for speed and reliability
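The 5-minute timeout could be wired up like this — a sketch only, since the commit does not say which config file carries the setting (pyproject.toml shown, assuming the pytest-timeout plugin is installed):

```toml
[tool.pytest.ini_options]
timeout = 300              # seconds; enforced by the pytest-timeout plugin
timeout_method = "thread"  # interrupt hung tests without killing the process
```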
- Fixed plt.show() call in data module to respect _should_show_plots() check
- Protected test code that calls visualization functions in data module
- Protected test code that calls visualization functions in networks module
- All visualization functions now properly skip during testing
- Tests should no longer block waiting for user interaction
Introduces documentation for TinyTorch module development, including guides for developers and AI assistants.
Provides comprehensive resources for creating high-quality, educational modules, focusing on real-world applications and systems thinking.
Initializes the networks module, enabling the composition of layers into complete neural network architectures.
It introduces sequential networks, MLP creation, and network visualization tools to facilitate architecture understanding and analysis.
Adds practical classification and regression network implementations and network behavior analysis capabilities.
Adds ReLU, Sigmoid, and Tanh activation functions, enabling
non-linearity in neural networks.
Includes testing and visualization of each function to ensure
correct behavior and understanding of their properties.
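The three functions can be sketched as below. Note this is illustrative: the module exports them as classes (`ReLU`, `Sigmoid`, `Tanh`), so the real signatures differ, and the clipping bound here is an assumption standing in for the module's actual numerical-stability strategy.

```python
import numpy as np

def relu(x):
    # Zero out negative values, pass positives through unchanged
    return np.maximum(0.0, x)

def sigmoid(x):
    # Clip to keep exp() in range; the commits note the real versions
    # are numerically stable, this bound is just one way to get there
    x = np.clip(x, -60.0, 60.0)
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Hyperbolic tangent, squashing inputs into (-1, 1)
    return np.tanh(x)
```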
Introduces a comprehensive module for 2D Convolutional Neural Networks.
This module provides a foundational understanding of CNNs through:
- Implementation of a naive Conv2D layer with sliding window convolution
- Visualization of kernel operations and feature map construction
- Composition of Conv2D layers with other layers to build a simple ConvNet
This structure provides a step-by-step guide to building and understanding CNNs, with clear examples and tests.
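The sliding-window convolution at the heart of the module can be sketched as follows — a minimal single-channel version with no padding and stride 1; the module's actual `Conv2D` layer API may differ:

```python
import numpy as np

def conv2d_naive(image, kernel):
    # Naive sliding-window convolution: slide the kernel over every
    # valid position and take the elementwise product-sum
    H, W = image.shape
    kh, kw = kernel.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            window = image[i:i + kh, j:j + kw]
            out[i, j] = np.sum(window * kernel)
    return out
```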
This commit introduces the core building blocks for neural networks,
including a naive matrix multiplication implementation and a Dense layer.
It provides a foundation for constructing and experimenting with
neural networks, emphasizing the concept of layers as tensor
transformations and function composition.
The module includes thorough testing and performance comparisons
to demonstrate the importance of optimized operations.
Implements the core Tensor class with data handling and properties, including initialization, shape, size, dtype, and string representation.
Adds element-wise addition and multiplication functions for tensors.
Implements tensor addition and multiplication as methods within the Tensor class.
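The class described above can be sketched in miniature — the real implementation likely carries more validation and a richer `__repr__`, but the shape of it is:

```python
import numpy as np

class Tensor:
    """Minimal sketch of the Tensor class: data handling plus
    element-wise add and multiply as operator methods."""

    def __init__(self, data):
        self.data = np.asarray(data)

    @property
    def shape(self):
        return self.data.shape

    @property
    def size(self):
        return self.data.size

    @property
    def dtype(self):
        return self.data.dtype

    def __add__(self, other):
        # Element-wise addition, wrapping the result in a new Tensor
        return Tensor(self.data + other.data)

    def __mul__(self, other):
        # Element-wise (Hadamard) multiplication
        return Tensor(self.data * other.data)

    def __repr__(self):
        return f"Tensor({self.data.tolist()})"
```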
Enhances the notebook by replacing non-standard Unicode characters with universally compatible symbols, improving readability and the user experience.
- Add template section to tensor, layers, activations, and cnn modules
- Create docs/development/module-template.md for future reference
- Clarify learning vs building structure consistently
- Show students where their code will live in the final package
- Decouple learning modules from production organization
- Add cnn_dev.py with NBDev educational pattern, Conv2D for-loop TODO, and all scaffolding
- Add README.md explaining learning goals, what is implemented vs provided, and rationale
- Add tests/test_cnn.py for basic correctness and shape tests
- Generate cnn_dev.ipynb for notebook workflow
- Add matmul_naive function with for-loop implementation for learning
- Update Dense layer to support both NumPy (@) and naive matrix multiplication
- Add comprehensive tests comparing both implementations (correctness & performance)
- Include step-by-step computation visualization for 2x2 matrices
- Fix missing imports in tensor.py and activations.py
- Export both tensor and activations modules to package
This provides students with immediate success using NumPy while allowing them to
understand the underlying computation through explicit for-loops. The scaffolding
includes performance comparisons and educational insights about why NumPy is faster.
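The for-loop implementation described above amounts to the triple loop below (the function name matches the commit, but the exact signature is an assumption):

```python
import numpy as np

def matmul_naive(A, B):
    # Explicit triple-loop matrix multiply, for learning: each output
    # cell C[i, j] is the dot product of row i of A and column j of B
    n, k = A.shape
    k2, m = B.shape
    assert k == k2, "inner dimensions must match"
    C = np.zeros((n, m))
    for i in range(n):
        for j in range(m):
            for p in range(k):
                C[i, j] += A[i, p] * B[p, j]
    return C
```

Comparing against `A @ B` makes the performance point concrete: the operator dispatches to optimized, vectorized routines, which is why it is orders of magnitude faster than the explicit loops.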
- Add modules/networks/networks_dev.py and networks_dev.ipynb (Jupytext/nbdev educational pattern)
- Add comprehensive visualizations: architecture, data flow, layer analysis, network comparison
- Add modules/networks/README.md with learning goals, usage, and visualization docs
- Add modules/networks/tests/test_networks.py with thorough tests for composition, MLPs, and visualizations
- Register 'networks' in CLI info and test commands
- Update CLI info command to check layers/networks status
- This module focuses on forward pass only (no training yet)
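The forward-pass-only composition can be sketched like this; the module's actual sequential class is surely richer (visualization, analysis helpers), but the core idea is just function composition:

```python
class Sequential:
    """Minimal sketch: a network is a list of callable layers,
    and the forward pass threads the input through each in turn."""

    def __init__(self, layers):
        self.layers = layers

    def __call__(self, x):
        # Forward pass as function composition: each layer transforms
        # the previous layer's output
        for layer in self.layers:
            x = layer(x)
        return x
```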
- Restored tools/py_to_notebook.py as a focused, standalone tool
- Updated tito notebooks command to use subprocess to call the separate tool
- Maintains clean separation of concerns: tito.py for CLI orchestration, py_to_notebook.py for conversion logic
- Updated documentation to use 'tito notebooks' command instead of direct tool calls
- Benefits: easier debugging, better maintainability, focused single-responsibility modules
🏗️ Major architectural improvement implementing clean separation of concerns:
✨ NEW: Activations Module
- Complete activations module with ReLU, Sigmoid, Tanh implementations
- Educational NBDev structure with student TODOs + instructor solutions
- Comprehensive testing suite (24 tests) with mathematical correctness validation
- Visual learning features with matplotlib plotting (disabled during testing)
- Clean export to tinytorch.core.activations
🔧 REFACTOR: Layers Module
- Removed duplicate activation function implementations
- Clean import from activations module: 'from tinytorch.core.activations import ReLU, Sigmoid, Tanh'
- Updated documentation to reflect modular architecture
- Preserved all existing functionality while improving code organization
🧪 TESTING: Comprehensive Test Coverage
- All 24 activations tests passing ✅
- All 17 layers tests passing ✅
- Integration tests verify clean architecture works end-to-end
- CLI testing with 'tito test --module' works for both modules
📦 ARCHITECTURE: Clean Dependency Graph
- activations (math functions) → layers (building blocks) → networks (applications)
- Separation of concerns: pure math vs. neural network components
- Reusable components across future modules
- Single source of truth for activation implementations
🎓 PEDAGOGY: Enhanced Learning Experience
- Week-sized chunks: students master activations, then build layers
- Clear progression from mathematical foundations to applications
- Real-world software architecture patterns
- Modular design principles in practice
This establishes the foundation for scalable, maintainable ML systems education.
✨ Features:
- Dense layer with Xavier initialization (y = Wx + b)
- Activation functions: ReLU, Sigmoid, Tanh
- Layer composition for building neural networks
- Comprehensive test suite (17 passed, 5 skipped stretch goals)
- Package-level integration tests (14 passed)
- Complete documentation and examples
🎯 Educational Design:
- Follows 'Build → Use → Understand' pedagogical framework
- Immediate visual feedback with working examples
- Progressive complexity from simple layers to full networks
- Students see neural networks as function composition
🧪 Testing Architecture:
- Module tests: 17/17 core tests pass, 5 stretch goals available
- Package tests: 14/14 integration tests pass
- Dual testing supports both learning and validation
📚 Complete Implementation:
- Dense layer with proper weight initialization
- Numerically stable activation functions
- Batch processing support
- Real-world examples (image classification network)
- CLI integration: 'tito test --module layers'
This establishes the fundamental building blocks students need
to understand neural networks before diving into training.
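The Dense layer with Xavier initialization described above (y = Wx + b) can be sketched as below; the constructor arguments and naming are assumptions, not the module's actual API:

```python
import numpy as np

class Dense:
    """Sketch of a fully connected layer: y = xW + b, with
    Xavier/Glorot uniform initialization."""

    def __init__(self, in_features, out_features, rng=None):
        if rng is None:
            rng = np.random.default_rng(0)
        # Xavier initialization keeps activation variance roughly
        # constant across layers
        limit = np.sqrt(6.0 / (in_features + out_features))
        self.W = rng.uniform(-limit, limit, size=(in_features, out_features))
        self.b = np.zeros(out_features)

    def __call__(self, x):
        # Batch processing: x has shape (batch, in_features)
        return x @ self.W + self.b
```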
- Fixed module-level tests to work with current implementation
- Added stretch goal testing pattern with pytest.skip()
- Created comprehensive testing documentation
- Both test levels now pass: 29 package tests + 22 module tests
- 11 stretch goals available for student implementation
Package tests (integration): ✅ 29/29 passed
Module tests (development): ✅ 22/22 passed, 11 skipped (stretch goals)
This provides immediate functionality for students while offering
clear implementation targets for advanced features.
- Restore complete setup module with system information and developer profiles
- Add NBDev educational patterns with #| export and #| hide directives
- Implement SystemInfo class for platform detection and compatibility checking
- Add DeveloperProfile class with ASCII art customization and personalization
- Include comprehensive educational content with step-by-step learning progression
- Expand test suite to 20 comprehensive tests covering all functionality:
- Function execution and output validation
- Arithmetic operations (basic, negative, floating-point)
- System information collection and compatibility
- Developer profile creation and customization
- ASCII art file loading and fallback behavior
- Error recovery and integration testing
- Update README with complete feature documentation and usage examples
- Convert to executed notebook showing rich output and system information
- Maintain file-based ASCII art system with tinytorch_flame.txt
- All tests passing with professional pytest structure
- Replace manual testing with pytest test classes and assertions
- Add comprehensive test coverage:
- Function execution without errors
- Correct output content validation
- ASCII art file existence and content checks
- Error handling and fallback behavior
- Missing file graceful handling
- Update README with pytest testing instructions
- Make testing consistent with other TinyTorch modules
- All 5 tests passing with proper pytest integration
- Replace complex setup module with simple hello_tinytorch() function
- Keep only the ASCII art file (tinytorch_flame.txt) for visual appeal
- Simplify tests to just verify function runs and file exists
- Update README to reflect simplified purpose and usage
- Remove complex developer profiles and system info classes
- Focus on minimal introduction to TinyTorch workflow
🔧 Notebook Compatibility:
- Fix __file__ NameError in notebook environment
- Add try/except for file path detection
- Handle both script and notebook execution contexts
- Use os.getcwd() fallback when __file__ not available
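The fix above amounts to a small guard — scripts have `__file__`, notebook cells do not:

```python
import os

# Resolve the module directory in both execution contexts
try:
    module_dir = os.path.dirname(os.path.abspath(__file__))
except NameError:
    # Running inside a notebook: __file__ is not defined,
    # so fall back to the current working directory
    module_dir = os.getcwd()
```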
📓 Notebook Execution:
- Execute setup_dev.ipynb with all outputs generated
- Beautiful ASCII art displays properly in notebook
- All test cells show expected results
- Student-friendly interactive experience
🎨 ASCII Art Display:
- Stunning flame ASCII art renders perfectly in notebook
- Full profile display with Tiny🔥Torch branding
- Interactive testing cells with clear outputs
- Students can see the ASCII art immediately
🧪 Testing:
- All 17 tests still pass after notebook conversion
- File loading works in both environments
- Comprehensive coverage maintained
- Ready for student use
Students can now run the setup_dev.ipynb notebook and immediately see the beautiful ASCII art output, making their first TinyTorch experience memorable and engaging.
🎨 File-Based ASCII Art System:
- Add beautiful tinytorch_flame.txt with sophisticated flame design
- Dynamic file loading with fallback to simple flame
- Easy customization - students can edit the file directly
- Proper error handling for missing files
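The loader with fallback can be sketched like this; the filename comes from the commit, but the function body and the fallback art are illustrative:

```python
def get_ascii_art(path="tinytorch_flame.txt"):
    """Load ASCII art from a file, falling back to a simple
    built-in flame when the file is missing."""
    try:
        with open(path, "r", encoding="utf-8") as f:
            return f.read()
    except OSError:
        # Graceful fallback when the art file is missing or unreadable
        return "  )\n ( \n🔥 TinyTorch"
```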
🔥 Enhanced Visual Experience:
- Stunning detailed flame ASCII art from file
- 'Tiny🔥Torch' branding with 'Build ML Systems from Scratch!' tagline
- Much more sophisticated design than hardcoded version
- Students can easily customize by editing tinytorch_flame.txt
📚 Educational Benefits:
- Two customization options: parameter override or file editing
- Teaches file I/O and error handling concepts
- Encourages creative expression and ownership
- Professional development practices
🧪 Testing:
- Updated all tests to work with file-based system
- All 17 tests pass successfully
- Comprehensive coverage of file loading and fallback
- Verified both custom and default ASCII art functionality
Students can now easily create their own ASCII art by editing the tinytorch_flame.txt file or providing custom art via parameters.
🎨 ASCII Art Features:
- Add beautiful TinyTorch flame ASCII art as default
- Support custom ASCII art for student personalization
- New get_ascii_art() and get_full_profile() methods
- Enhanced visual appeal and student engagement
👨‍💻 Profile Updates:
- Update instructor info: vj@eecs.harvard.edu, @profvjreddi
- Add comprehensive ASCII art examples and ideas
- Encourage creative student customization
- Professional signature generation with visual flair
🧪 Testing:
- Add 2 new ASCII art tests (17 total tests pass)
- Test default flame art and custom art functionality
- Verify full profile display with proper formatting
- Comprehensive coverage of all new features
Students can now create truly personalized TinyTorch experiences with custom ASCII art, making their learning journey unique and memorable.
- Add DeveloperProfile class with customizable developer information
- Default to course instructor (Vijay Janapa Reddi) but allow student customization
- Include professional signature generation for code attribution
- Add comprehensive test coverage (6 new tests, 15 total tests pass)
- Encourage student ownership and personalization of their TinyTorch journey
- Maintain educational structure with TODO sections and hidden solutions
- Fix syntax errors in setup_dev.py test cells (proper indentation)
- Add comprehensive test suite for setup module (test_setup.py)
- Test coverage: functions, SystemInfo class, integration tests
- Custom test runner (no pytest dependency required)
- All 9 tests pass successfully
- Update .gitignore to allow modules/ directory for educational structure
- Fix CLI tool to look for tests in modules/{module}/tests/ instead of tests/
- Update test imports to use parent directory module imports
- Update Cursor rules to reflect new test structure:
* Project structure shows tests/ subdirectory
* Testing patterns show correct paths and import patterns
* Development workflow shows updated test locations
- Test imports now work: from tensor_dev import Tensor
- CLI commands now find tests in correct locations
Tests are now properly organized and discoverable
- Move test_tensor.py to modules/tensor/tests/test_tensor.py
- Create tests/ directories in setup and tensor modules
- Clean up setup module: remove setup_dev.html, setup_dev_files/, __init__.py
- Establish clean module structure pattern:
* {module}_dev.py - Main development file
* {module}_dev.ipynb - Generated notebook
* README.md - Module documentation
* tests/ - Test directory with test_{module}.py
This makes it crystal clear to students where tests are located.
🎯 Student Version (Visible):
- All functions raise NotImplementedError('Student implementation required')
- Clear TODO instructions with specific guidance
- Test cells handle NotImplementedError gracefully
- Students must implement everything from scratch
🔧 Instructor Version (Hidden with #|hide):
- Complete working implementations
- Same function signatures as student version
- Production-ready code for package export
📚 Added MODULE_GENERATION_GUIDE.md:
- Clear instructions for creating student/instructor modules
- NBDev workflow documentation
- Module structure patterns and templates
- Generation commands and checklist
🔄 Generation Flow:
1. Write .py with student stubs + hidden instructor solutions
2. Convert to notebook: jupytext --to notebook module_dev.py
3. Export package: python bin/tito.py sync --module name
4. Package gets instructor solutions, students see exercises
Perfect foundation for all TinyTorch modules
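The student/instructor pattern looks roughly like this in the dev file — in the real workflow these are separate notebook cells, and the `#| hide` cell's complete definition is what gets exported to the package (the function name matches the setup module; the bodies are illustrative):

```python
#| export
def add_numbers(a, b):
    """Students implement this from scratch."""
    # TODO: return the sum of a and b
    raise NotImplementedError("Student implementation required")

#| hide
#| export
def add_numbers(a, b):
    # Instructor solution, hidden from students by the #| hide directive;
    # in plain Python this redefinition simply shadows the stub above
    return a + b
```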
- Removed modules/setup/setup_nbdev_educational.* files
- Keep only the clean setup_dev.* files
- Setup module now focused and clean
- NBDev features work behind scenes without clutter
- Focused on original purpose: just setting up development environment
- Students don't see NBDev educational features explanations
- Simple workflow: hello world, basic class, export, test, progress
- NBDev #|hide directive works behind scenes for instructor solutions
- Clean and simple, just like the original but with hidden solutions
Back to basics: setup is about setup, not teaching NBDev features
SINGLE SOURCE approach: One notebook serves both instructors and students
🎯 Key Features:
- #|hide directive hides complete solutions from students
- #|code-fold creates collapsible sections for details
- #|export with educational metadata
- Progressive learning: simple → ML-relevant complexity
- Vector operations (add, dot product) as ML foundations
- ML-aware SystemInfo class with library checking
- Comprehensive testing for both student/instructor versions
🔄 Workflow:
- Students see exercises with TODOs and hints
- Instructors see complete solutions (hidden by default)
- Package exports get instructor (complete) implementations
- Single notebook maintains both audiences
This establishes the pattern for all TinyTorch modules
- Create tensor_nbdev_educational.py with NBDev directives
- Demonstrate #|hide, #|code-fold, #|filter_stream features
- Convert using standard Jupytext (# %% markers)
- Add comprehensive guides and comparisons
- Show superior approach vs custom generator
Key insight: NBDev already has mature educational capabilities!
No need to build custom tools when industry standards exist.
Introduces a README with project overview, setup instructions,
and course structure.
Adds a VISION document outlining the project's goals, conventions,
and architecture.
Includes updates to the setup module's README to clarify module to
package mapping.
✅ Setup Module Implementation:
- Created comprehensive setup_dev.ipynb with TinyTorch workflow tutorial
- Added hello_tinytorch(), add_numbers(), and SystemInfo class
- Updated README with clear learning objectives and development workflow
- All 11 tests passing for complete workflow validation
🔧 CLI Enhancements:
- Added --module flag to 'tito sync' for module-specific exports
- Implemented 'tito reset' command with --force option
- Smart auto-generated file detection and cleanup
- Interactive confirmation with safety preservations
📚 Documentation Updates:
- Updated all references to use [module]_dev.ipynb naming convention
- Enhanced test coverage for new functionality
- Clear error handling and user guidance
This establishes the foundation workflow that students will use throughout TinyTorch development.
Updates references to the development notebook naming convention from `[module].ipynb` to `[module]_dev.ipynb` in documentation. This change ensures consistency across the project and aligns with the intended naming scheme for development notebooks.
Introduces a standardized module structure with README, notebooks, tutorials, tests, and solutions.
Refactors the project to emphasize a modular learning path, enhancing clarity and consistency across the TinyTorch course.
Changes the virtual environment path to ".venv".
Updates the project to use `.venv` as the standard virtual environment directory. This change:
- Updates `.gitignore` to ignore `.venv/`.
- Modifies the activation script to create and activate `.venv`.
- Adjusts the `tito.py` script to check for `.venv`'s existence and activation.
- Updates documentation and setup scripts to reflect the new virtual environment naming convention.
This change streamlines environment management and aligns with common Python practices.