Commit Graph

674 Commits

Vijay Janapa Reddi
e07ed66548 assets(pdf): add logo images for PDF cover page
- fire-emoji.png for Tiny🔥Torch branding in LaTeX
- logo-mlsysbook.png for MLSysBook badge reference
2025-12-07 13:39:32 -08:00
Vijay Janapa Reddi
3cbe46e21c build(pdf): add XeLaTeX PDF build target to Makefile
- Add 'make pdf' target for PDF generation via XeLaTeX
- Include dependency checks for jupyter-book and xelatex
- Run latex_postprocessor.py for emoji cleanup
- Copy logo assets to build directory
- Add restore-emoji target for interrupted builds
2025-12-07 13:39:24 -08:00
Vijay Janapa Reddi
b48cba6c81 feat(pdf): add LaTeX postprocessor for emoji cleanup
- Remove emojis for clean professional PDF output
- Replace fire emoji with inline image for branding
- Convert Unicode subscripts to LaTeX math
- Clear duplicate Sphinx title page metadata
- Add regex patterns for escaped LaTeX commands
2025-12-07 13:39:14 -08:00
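The Unicode-subscript conversion mentioned above could be sketched as a small regex pass. This is an illustrative guess at the approach, not the actual `latex_postprocessor.py` code; the function name `convert_unicode_subscripts` and the exact output format are assumptions.

```python
import re

# Hypothetical sketch of the subscript-conversion step; the real
# latex_postprocessor.py implementation may differ in detail.
SUBSCRIPT_MAP = str.maketrans("₀₁₂₃₄₅₆₇₈₉", "0123456789")

def convert_unicode_subscripts(text: str) -> str:
    """Replace runs of Unicode subscript digits with LaTeX math subscripts."""
    def repl(match: re.Match) -> str:
        digits = match.group(0).translate(SUBSCRIPT_MAP)
        return f"$_{{{digits}}}$"
    return re.sub(r"[₀₁₂₃₄₅₆₇₈₉]+", repl, text)

print(convert_unicode_subscripts("x₁₂"))  # x$_{12}$
```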
Vijay Janapa Reddi
15b0fee1a5 style(pdf): refine cover page typography and badge layout
- Badge spacing: 114pt from author block for breathing room
- Internal line spacing: 10pt (lines 1-2), 6pt (lines 2-3)
- Bottom padding: 14pt for softer feel
- Draft Edition gap: 5pt (lifted closer to badge)
- Badge text: 'The Build-It-Yourself Companion to the'
- Lowercase 'textbook' for academic consistency
2025-12-07 13:39:04 -08:00
Vijay Janapa Reddi
ceb384e863 refactor(paper): improve consistency and add memory_footprint to Tensor
- Add memory_footprint() method to Tensor class matching paper Listing 1
- Fix milestone numbering: use 'Milestone 1-6' instead of confusing 'M03/M06' format
- Remove unvalidated hour estimates (60-80 hours) from abstract and configurations
- Simplify NBGrader language, removing 'unvalidated' caveats
- Clean up time-to-completion language in validation roadmap
2025-12-07 13:35:42 -08:00
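A `memory_footprint()` method like the one described could be as simple as reporting the underlying array's byte count. The sketch below is illustrative only; the actual `Tensor` class and the paper's Listing 1 may differ.

```python
import numpy as np

# Minimal sketch of a Tensor with memory_footprint(); TinyTorch's real
# Tensor class is richer than this.
class Tensor:
    def __init__(self, data):
        self.data = np.asarray(data, dtype=np.float32)

    def memory_footprint(self) -> int:
        """Return the number of bytes occupied by the underlying array."""
        return self.data.nbytes

t = Tensor([[1.0, 2.0], [3.0, 4.0]])
print(t.memory_footprint())  # 4 elements * 4 bytes (float32) = 16
```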
Vijay Janapa Reddi
3571c0104a docs(paper): add pedagogical precedent subsection to Related Work
Add new subsection positioning TinyTorch within the canonical tradition
of build-to-understand systems education: MINIX, SICP, Tiger compiler,
Nachos, and Pintos. This strengthens the paper by showing TinyTorch
follows a proven 50-year pedagogical pattern.

New references: tanenbaum1987minix, abelson1996sicp, appel2004tiger,
christopher1993nachos, pfaff2004pintos
2025-12-07 13:15:39 -08:00
Vijay Janapa Reddi
24178f6cbd fix(tinytorch): remove unused matplotlib import from memoization module
The matplotlib import in profile_naive_generation() was unused and caused
import errors when matplotlib was not installed. Removed it to fix the module tests.
2025-12-07 12:06:20 -08:00
Vijay Janapa Reddi
a774c7e4bb refactor(tinytorch): standardize all module ABOUT.md structure
- Move 'Getting Started' section earlier (position 6, after Build → Use → Reflect)
- Add 'Common Pitfalls' section to all modules (3-5 pitfalls with code examples)
- Add 'Production Context' section to all modules (framework comparisons, real-world usage)
- Verify professional emoji usage (no emoji in section headers)
- Apply consistent structure across all 20 modules
2025-12-07 11:13:11 -08:00
Vijay Janapa Reddi
6ed685534d refactor(site): enhance PDF build with XeLaTeX and TinyTorch branding
- Switch from LuaLaTeX to XeLaTeX for better font handling and Unicode support
- Add comprehensive TinyTorch brand colors matching the logo
- Implement syntax-highlighted code blocks with flame accent
- Enhance title page with professional logo placement
- Add clean headers/footers with branded styling
- Reorganize TOC structure with semantic parts and captions
- Improve chapter titles for better pedagogical clarity
- Update build process to use latexmk for robust compilation
2025-12-07 10:09:03 -08:00
Vijay Janapa Reddi
deb2ced61d refactor(site): consolidate build scripts into Makefile
Simplified build system by removing redundant scripts:
- Removed build.sh (functionality moved to Makefile)
- Removed build_pdf.sh (consolidated into Makefile)
- Removed build_pdf_simple.sh (consolidated into Makefile)

- Enhanced Makefile with better organization and PDF build support
- Updated README with clearer build instructions
- Improved _config_pdf.yml with better PDF generation settings
2025-12-07 06:07:46 -08:00
Vijay Janapa Reddi
11a55101be refactor(module 03): remove redundant code and fix docstrings
Additional cleanup following module review:
- Removed redundant __call__ method from Linear (inherits from Layer)
- Fixed Dropout docstrings to correctly describe inference behavior
- Simplified Sequential.parameters() by removing unnecessary hasattr check

All 61 tests still passing after cleanup
2025-12-07 06:05:32 -08:00
Vijay Janapa Reddi
c0af0eb36b refactor(tinytorch): cleanup modules 02, 09, 12, 17, 18 following module 03 principles
Applied API simplification and consistency improvements across multiple modules:

Module 02 (Activations):
- Added __all__ export list to control public API
- Removed redundant import statement
- Prevents internal constants from polluting namespace

Module 09 (Spatial):
- Fixed test naming to use PyTorch conventions (Conv2d not Conv2D)
- Fixed AvgPool2d gradient tracking (added requires_grad parameter)
- Updated all test imports to use lowercase 'd' naming

Module 12 (Attention):
- Fixed progressive integration tests to use correct Trainer API
- Added missing loss_fn parameter to Trainer calls

Module 17 (Memoization):
- Removed redundant create_kv_cache() function (use KVCache() directly)
- Made internal constants private (_BYTES_PER_FLOAT32, _MB_TO_BYTES)
- Simplified API from 6 exports to 3 core components
- 50% reduction in public API surface

Module 18 (Acceleration):
- Fixed test suite to match function-based API
- Added tests for vectorized_matmul, fused_gelu, tiled_matmul
- All 6 tests now passing

Rationale:
- API simplicity: one clear way to do things
- Progressive disclosure: hide implementation details
- Consistent naming: follow established conventions
- Test coverage: validate all exported functionality

All module tests passing after changes
2025-12-07 06:05:15 -08:00
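The Module 17 change (constructing KVCache() directly, with private byte-size constants) could look roughly like the sketch below. Names and interface are assumptions based on the commit text; the real TinyTorch KVCache may differ.

```python
import numpy as np

# Illustrative key/value cache; the actual Module 17 class is fuller.
_BYTES_PER_FLOAT32 = 4  # kept private, as described in the commit above

class KVCache:
    def __init__(self):
        self.keys, self.values = [], []

    def append(self, k: np.ndarray, v: np.ndarray) -> None:
        """Store the key/value projections for one generated token."""
        self.keys.append(k)
        self.values.append(v)

    def memory_bytes(self) -> int:
        """Estimate cache size from stored element counts."""
        n = sum(a.size for a in self.keys + self.values)
        return n * _BYTES_PER_FLOAT32

# Used directly, without a create_kv_cache() factory:
cache = KVCache()
cache.append(np.zeros(8, dtype=np.float32), np.zeros(8, dtype=np.float32))
print(cache.memory_bytes())  # 16 elements * 4 bytes = 64
```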
Vijay Janapa Reddi
0cf91ee0c3 refactor(tinytorch): simplify module 03 API and remove confusing aliases
Simplifies the layers module API by removing alias proliferation that could confuse students in a pedagogical framework.

Changes:
- Rename SimpleModel → Sequential (matches PyTorch naming)
- Remove create_mlp() and MLP alias (taught in milestones, not core modules)
- Remove input_size/output_size aliases from Linear (keep only in_features/out_features)
- Update all tests to use explicit Sequential composition
- Fix dtype test to validate float32 normalization (TinyTorch's design)

Module focus: Individual building blocks (Linear, Dropout, Sequential container)
MLP construction: Taught in Milestone 03 (1986 MLP) using manual composition

Rationale:
- Progressive disclosure: students learn explicit composition first
- API clarity: one way to do things reduces cognitive load
- Separation of concerns: modules teach primitives, milestones teach patterns

All tests passing: 48/48 in module 03, 214/221 across all modules
2025-12-07 05:31:05 -08:00
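The explicit-composition style this commit moves toward could be sketched as follows. The `Linear`/`Sequential` classes here are simplified stand-ins for the real ones in tinytorch.core.layers.

```python
import numpy as np

# Simplified sketch of explicit Sequential composition; the real layer
# classes carry gradients, dtype handling, and more.
class Linear:
    def __init__(self, in_features: int, out_features: int):
        self.weight = np.random.randn(in_features, out_features).astype(np.float32)
        self.bias = np.zeros(out_features, dtype=np.float32)

    def forward(self, x):
        return x @ self.weight + self.bias

class Sequential:
    def __init__(self, *layers):
        self.layers = layers

    def forward(self, x):
        for layer in self.layers:
            x = layer.forward(x)
        return x

# Students compose layers explicitly instead of calling a create_mlp() helper:
model = Sequential(Linear(4, 8), Linear(8, 2))
out = model.forward(np.ones((1, 4), dtype=np.float32))
print(out.shape)  # (1, 2)
```

One clear construction path keeps the module focused on primitives; MLP patterns are then taught in the milestone.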
Vijay Janapa Reddi
fa6d951a8a fix(tinytorch): fix test flakiness and coverage requirements
- Add np.random.seed(42) to test_deep_network_gradient_chain for reproducibility
- Add --no-cov to tito module test to avoid root pyproject.toml coverage requirements
- Skip test_layers_networks_integration.py when tinytorch.core.dense is not implemented
2025-12-07 04:32:14 -08:00
Vijay Janapa Reddi
edfe9d9a77 fix(tinytorch): add missing exports and fix benchmark tests
- Module 19: Add #| export to import block so dataclass is exported
- Fix benchmark tests to use correct Benchmark API (requires models/datasets)
2025-12-06 21:42:14 -08:00
Vijay Janapa Reddi
56a5abff0f docs: add volume structure proposal and distribution design docs
- book/docs/VOLUME_STRUCTURE_PROPOSAL.md: Proposal for textbook volume structure
- tinytorch/docs/DISTRIBUTION_DESIGN.md: Design document for TinyTorch pip distribution
2025-12-06 21:20:57 -08:00
Vijay Janapa Reddi
85393c66bc fix(tinytorch): update benchmarking demo to use correct API
- Add missing imports for Tensor and Linear
- Fix parameter names: warmup_iterations -> warmup_runs
- Fix attribute names: min/max -> min_val/max_val
- Use run_latency_benchmark method with input_shape parameter
2025-12-06 21:20:24 -08:00
Vijay Janapa Reddi
ddb2683268 refactor(tito): extract export utilities and add venv guard
- Extract shared export logic to export_utils.py to reduce duplication
  between export.py and src.py commands
- Add virtual environment check to prevent running tito outside venv
  (can be bypassed with TITO_ALLOW_SYSTEM=1 for advanced users)
2025-12-06 21:20:10 -08:00
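The virtual environment guard described above might be implemented along these lines. This is a sketch under the assumptions stated in the commit (a bypass via TITO_ALLOW_SYSTEM=1); the actual check in tito may differ.

```python
import os
import sys

# Illustrative venv guard; the real tito implementation may differ.
def ensure_virtualenv() -> None:
    """Abort unless running inside a virtual environment, with an escape hatch."""
    if os.environ.get("TITO_ALLOW_SYSTEM") == "1":
        return  # advanced users can opt out explicitly
    # Inside a venv, sys.prefix differs from the base interpreter's prefix.
    in_venv = sys.prefix != getattr(sys, "base_prefix", sys.prefix)
    if not in_venv:
        sys.exit("tito: please run inside a virtual environment "
                 "(set TITO_ALLOW_SYSTEM=1 to bypass)")
```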
Vijay Janapa Reddi
e839e9658f feat(tinytorch): add SimpleModel utility class to layers module
Add SimpleModel as a minimal container for explicit layer composition.
Used by quantization, compression, and capstone modules for:
- Collecting parameters from multiple layers
- Running integration tests
- Enabling optimization functions that need a model object

This consolidates SimpleModel definitions that were scattered across modules.
2025-12-06 21:19:19 -08:00
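The parameter-collection role described above could look roughly like this sketch. The interface is an assumption; the consolidated class in the layers module may expose more.

```python
import numpy as np

# Minimal sketch of a SimpleModel-style container focused on parameter
# collection; the real class in tinytorch.core.layers may differ.
class Linear:
    def __init__(self, in_features, out_features):
        self.weight = np.zeros((in_features, out_features), dtype=np.float32)
        self.bias = np.zeros(out_features, dtype=np.float32)

    def parameters(self):
        return [self.weight, self.bias]

class SimpleModel:
    def __init__(self, layers):
        self.layers = layers

    def parameters(self):
        """Gather parameters from every contained layer."""
        params = []
        for layer in self.layers:
            params.extend(layer.parameters())
        return params

model = SimpleModel([Linear(4, 8), Linear(8, 2)])
print(len(model.parameters()))  # 4 arrays: two weights, two biases
```

A single shared container lets quantization and compression functions accept one model object instead of ad-hoc layer lists.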
Vijay Janapa Reddi
4556d225b5 fix(tinytorch): add missing export directives for quantization/compression functions
Add #| export directives to ensure functions are properly exported to package:
- Module 15: quantize_int8, dequantize_int8, quantize_model
- Module 16: measure_sparsity

These functions were defined but not exported, causing import errors when
using the perf/ package path.
2025-12-06 21:12:14 -08:00
Vijay Janapa Reddi
744f1499e6 feat(tinytorch): add verify_quantization_works and verify_pruning_works functions
Port verification functions from mlsysbook/TinyTorch standalone repo.
These functions prove optimizations work using real .nbytes measurements.

Module 15 (quantization):
- Add verify_quantization_works() function
- Measures actual FP32 vs INT8 memory reduction
- Asserts >= 3.5x reduction (targeting 4x)

Module 16 (compression):
- Add verify_pruning_works() function
- Counts actual zeros in parameter arrays
- Honestly reports memory unchanged (dense storage)
- Explains compute savings vs memory savings

Both functions:
- Are exported to tinytorch package
- Return dicts with verification results
- Include educational messaging about production usage
2025-12-06 21:05:52 -08:00
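The .nbytes-based verification idea can be sketched as below. This is an illustration of the measurement principle, not the actual verify_quantization_works() code; the quantization scheme shown (symmetric linear INT8) is an assumption.

```python
import numpy as np

# Sketch of measuring a real FP32 -> INT8 memory reduction via .nbytes;
# Module 15's verify_quantization_works() does the full check.
def quantize_int8(w: np.ndarray):
    """Symmetric linear quantization of a float32 array to int8."""
    scale = np.abs(w).max() / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

weights = np.random.randn(256, 256).astype(np.float32)
q, scale = quantize_int8(weights)
reduction = weights.nbytes / q.nbytes  # actual memory measurement
print(reduction)  # 4.0: float32 (4 bytes/elem) vs int8 (1 byte/elem)
assert reduction >= 3.5  # mirrors the >= 3.5x assertion described above
```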
Vijay Janapa Reddi
14c7b58c02 fix(tinytorch): resolve function shadowing in modules 15 and 16
Modules 15 (quantization) and 16 (compression) had a bug where
'convenience wrapper' functions at the end of the file shadowed
the main implementations, causing test failures.

Changes:
- Module 15: Import SimpleModel from tinytorch.core.layers
- Module 15: Quantizer class now delegates to standalone functions
- Module 15: Remove shadowing wrappers (quantize_model, dequantize_int8)
- Module 16: Import SimpleModel from tinytorch.core.layers
- Module 16: Compressor class now delegates to standalone functions
- Module 16: Remove shadowing wrappers (measure_sparsity, magnitude_prune, etc.)

The pattern now is:
- Standalone functions: Primary implementations students build
- Quantizer/Compressor classes: OOP interface that delegates to standalone functions
- No duplicate definitions that could shadow each other

All 20 modules now pass their tests.
2025-12-06 21:00:56 -08:00
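The shadowing bug pattern this commit fixes is worth illustrating: in Python, a later `def` with the same name silently replaces the earlier one at import time. The names below are simplified stand-ins for the real module functions.

```python
# Illustration of the shadowing pattern described above (simplified names).

def measure_sparsity(weights):
    """Main implementation: fraction of exactly-zero entries."""
    zeros = sum(1 for w in weights if w == 0.0)
    return zeros / len(weights)

# ...later in the same file, a "convenience wrapper" with the SAME name
# silently replaces the definition above when the module is imported:
def measure_sparsity(weights):
    return 0.0  # stub wrapper: this is what callers actually got

print(measure_sparsity([0.0, 1.0, 0.0, 2.0]))  # 0.0, not the expected 0.5
```

Delegating from a class to uniquely named standalone functions, as the commit does, removes the duplicate definitions entirely.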
Vijay Janapa Reddi
0bd42e3b98 refactor: simplify TinyTorch workflows
- Remove tinytorch-ci.yml (not needed for now)
- Remove tinytorch-release-check.yml (not needed for now)
- Remove tinytorch-build-pdf.yml (not needed for now)
- Simplify tinytorch-publish-dev.yml and tinytorch-publish-live.yml
- Update _config.yml to point to new repo location

Deployments:
- Dev: mlsysbook.ai/tinytorch-dev/
- Live: mlsysbook.ai/tinytorch/
2025-12-05 19:35:49 -08:00
Vijay Janapa Reddi
c602f97364 feat: integrate TinyTorch into MLSysBook repository
TinyTorch educational deep learning framework now lives at tinytorch/

Structure:
- tinytorch/src/         - Source modules (single source of truth)
- tinytorch/tito/        - CLI tool
- tinytorch/tests/       - Test suite
- tinytorch/site/        - Jupyter Book website
- tinytorch/milestones/  - Historical ML implementations
- tinytorch/datasets/    - Educational datasets (tinydigits, tinytalks)
- tinytorch/assignments/ - NBGrader assignments
- tinytorch/instructor/  - Teaching materials

Workflows (with tinytorch- prefix):
- tinytorch-ci.yml           - CI/CD pipeline
- tinytorch-publish-dev.yml  - Dev site deployment
- tinytorch-publish-live.yml - Live site deployment
- tinytorch-build-pdf.yml    - PDF generation
- tinytorch-release-check.yml - Release validation

Repository Variables added:
- TINYTORCH_ROOT  = tinytorch
- TINYTORCH_SRC   = tinytorch/src
- TINYTORCH_SITE  = tinytorch/site
- TINYTORCH_TESTS = tinytorch/tests

All workflows use ${{ vars.TINYTORCH_* }} for path configuration.

Note: tinytorch/site/_static/favicon.svg kept as SVG (valid for favicons)
2025-12-05 19:23:18 -08:00