Fix documentation links after site → docs reorganization

- Replace all .html → .md in markdown source files (43 instances)
- Fix broken links: tito-essentials.md → tito/overview.md
- Remove broken links to non-existent leaderboard/olympics-rules pages
- Fix PDF_BUILD_GUIDE reference in website-README.md

Website rebuilt successfully with 46 warnings.

Changes:
- All markdown files now use .md extension for internal links
- Removed references to missing/planned files
- Website builds cleanly and all links are functional
Vijay Janapa Reddi
2025-11-28 05:01:44 +01:00
parent a5d9ed3984
commit c058ab9419
65 changed files with 701 additions and 2052 deletions


@@ -289,7 +289,7 @@ After each tier, you become the team member who:
**Next Steps**:
- **New to TinyTorch**: Start with [Quick Start Guide](../quickstart-guide.md) for immediate hands-on experience
- **Ready to Commit**: Begin [Module 01: Tensor](../modules/01_tensor_ABOUT.html) to start building
- **Ready to Commit**: Begin [Module 01: Tensor](../modules/01_tensor_ABOUT.md) to start building
- **Teaching a Course**: Review [Getting Started Guide - For Instructors](../getting-started.html#instructors) for classroom integration
```{admonition} Your Three-Tier Journey Awaits


@@ -152,7 +152,7 @@ Every checkpoint completion unlocks a concrete capability:
The checkpoint system provides comprehensive progress tracking and capability validation through automated testing infrastructure.
**📖 See [Essential Commands](tito-essentials.md)** for complete command reference and usage examples.
**📖 See [Essential Commands](tito/overview.md)** for complete command reference and usage examples.
### Integration with Development
The checkpoint system connects directly to your actual development work:
@@ -248,7 +248,7 @@ The checkpoint progression **Foundation → Architecture → Training → Infere
- **Problem**: Modules don't work together due to missing dependencies
- **Solution**: Verify prerequisite capabilities before testing advanced features
**📖 See [Essential Commands](tito-essentials.md)** for complete debugging command reference.
**📖 See [Essential Commands](tito/overview.md)** for complete debugging command reference.
### Checkpoint Test Structure
@@ -299,4 +299,4 @@ print("🏆 Foundation checkpoint PASSED")
- Analyze memory usage during testing
- Identify bottlenecks in capability validation
**📖 See [Essential Commands](tito-essentials.md)** for complete command reference and advanced usage examples.
**📖 See [Essential Commands](tito/overview.md)** for complete command reference and advanced usage examples.


@@ -117,7 +117,7 @@ tito benchmark capstone
**Submission:** After benchmarks complete, you'll be prompted to submit results (optional). Submissions are saved locally and can be shared with the community.
See [TITO CLI Reference](tito/overview.html) for complete command documentation.
See [TITO CLI Reference](tito/overview.md) for complete command documentation.
---


@@ -228,7 +228,7 @@ Each milestone proves your framework works by running actual ML experiments.
- Helpful for self-assessment
- Use `tito checkpoint status` to view progress
**📖 See [Module Workflow](tito/modules.html)** for the core development cycle.
**📖 See [Module Workflow](tito/modules.md)** for the core development cycle.
---
@@ -255,7 +255,7 @@ cd modules/01_tensor
jupyter lab tensor_dev.py
```
**📖 See [Getting Started Guide](getting-started.html)** for detailed setup.
**📖 See [Getting Started Guide](getting-started.md)** for detailed setup.
### What's the typical workflow?
@@ -272,7 +272,7 @@ cd ../../milestones/01_1957_perceptron
python rosenblatt_forward.py # Uses YOUR implementation!
```
**📖 See [Module Workflow](tito/modules.html)** for complete details.
**📖 See [Module Workflow](tito/modules.md)** for complete details.
### Can I use this in my classroom?

docs/for-instructors.md (new file, 443 lines)

@@ -0,0 +1,443 @@
# 👥 For Instructors & TAs
**Complete guide for teaching ML Systems Engineering with TinyTorch**
<div style="background: #f0f9ff; padding: 1.5rem; border-radius: 0.5rem; border-left: 4px solid #3b82f6; margin: 2rem 0;">
<h3 style="margin: 0 0 0.5rem 0;">📋 Quick Course Assessment</h3>
<p style="margin: 0.5rem 0;">
<strong>Duration:</strong> 14-16 weeks (flexible pacing)<br>
<strong>Prerequisites:</strong> Python + basic linear algebra<br>
<strong>Student Outcome:</strong> Complete ML framework supporting vision AND language models<br>
<strong>Grading:</strong> 70% auto-graded (NBGrader), 30% manual (systems thinking)
</p>
</div>
## For Instructors: Course Setup
### 30-Minute Initial Setup
**Step 1: Environment Setup (10 minutes)**
```bash
# Clone repository
git clone https://github.com/mlsysbook/TinyTorch.git
cd TinyTorch
# Create virtual environment
python -m venv .venv
source .venv/bin/activate # or `.venv\Scripts\activate` on Windows
# Install dependencies
pip install -r requirements.txt
pip install nbgrader
# Verify installation
tito system doctor
```
**Step 2: Initialize Grading (10 minutes)**
```bash
# Setup NBGrader integration
tito grade setup
# Verify grading commands
tito grade --help
```
**Step 3: Prepare First Assignment (10 minutes)**
```bash
# Generate instructor version (with solutions)
tito grade generate 01_tensor
# Create student version (solutions removed)
tito grade release 01_tensor
# Student assignments ready in: release/01_tensor/
```
### Assignment Workflow
TinyTorch wraps NBGrader behind simple `tito grade` commands:
**1. Prepare Assignments**
```bash
# Generate instructor version with solutions
tito grade generate MODULE_NAME
# Create student version (auto-removes solutions)
tito grade release MODULE_NAME
```
**2. Distribute to Students**
- **Option A: GitHub Classroom** (recommended)
- Create assignment repository from TinyTorch template
- Students clone and work in their repos
- Automatic submission via GitHub
- **Option B: Direct Distribution**
- Share `release/` directory contents
- Students download and submit via LMS
**3. Collect Submissions**
```bash
# Collect all students
tito grade collect MODULE_NAME
# Or specific student
tito grade collect MODULE_NAME --student student_id
```
**4. Auto-Grade**
```bash
# Grade all submissions
tito grade autograde MODULE_NAME
# Grade specific student
tito grade autograde MODULE_NAME --student student_id
```
**5. Manual Review**
```bash
# Open browser-based grading interface
tito grade manual MODULE_NAME
```
**6. Generate Feedback**
```bash
# Create feedback files for students
tito grade feedback MODULE_NAME
```
**7. Export Grades**
```bash
# Export all grades to CSV
tito grade export
# Or specific module
tito grade export --module MODULE_NAME --output grades.csv
```
### Grading Components
**Auto-Graded (70%)**
- Code implementation correctness
- Test passing
- Function signatures
- Output validation
- Edge case handling
**Manually Graded (30%)**
- ML Systems Thinking questions (3 per module)
- Each question: 10 points
- Focus on understanding, not perfection
### Grading Rubric for Systems Thinking Questions
| Points | Criteria |
|--------|----------|
| 9-10 | Deep understanding, specific code references, discusses systems implications (memory, scaling, trade-offs) |
| 7-8 | Good understanding, some code references, basic systems thinking |
| 5-6 | Surface understanding, generic response, limited systems perspective |
| 3-4 | Attempted but misses key concepts |
| 0-2 | No attempt or completely off-topic |
**What to Look For:**
- References to actual implemented code
- Memory/performance analysis
- Scaling considerations
- Production system comparisons (PyTorch, TensorFlow)
- Understanding of trade-offs
### Sample 16-Week Schedule
| Week | Module | Focus | Teaching Notes |
|------|--------|-------|----------------|
| 1 | 01 Tensor | Data Structures, Memory | Demo: memory profiling, copying behavior |
| 2 | 02 Activations | Non-linearity, Stability | Demo: gradient vanishing/exploding |
| 3 | 03 Layers | Neural Components | Demo: forward/backward passes |
| 4 | 04 Losses | Optimization Objectives | Demo: loss landscapes |
| 5 | 05 Autograd | Auto Differentiation | ⚠️ Most challenging - allocate extra TA hours |
| 6 | 06 Optimizers | Training Algorithms | Demo: optimizer comparisons |
| 7 | 07 Training | Complete Training Loop | Milestone: Train first network! |
| 8 | **Midterm Project** | Build and Train Network | Assessment: End-to-end system |
| 9 | 08 DataLoader | Data Pipeline | Demo: batching, shuffling |
| 10 | 09 Spatial | Convolutions, CNNs | ⚠️ High demand - O(N²) complexity |
| 11 | 10 Tokenization | Text Processing | Demo: vocabulary building |
| 12 | 11 Embeddings | Word Representations | Demo: embedding similarity |
| 13 | 12 Attention | Attention Mechanisms | ⚠️ Moderate-high demand |
| 14 | 13 Transformers | Transformer Architecture | Milestone: Text generation! |
| 15 | 14-19 Optimization | Profiling, Quantization | Focus on production trade-offs |
| 16 | 20 Capstone | **Torch Olympics** | Final Competition |
### Critical Modules (Extra TA Support Needed)
1. **Module 05: Autograd** - Most conceptually challenging
- Pre-record debugging walkthroughs
- Create FAQ document
- Schedule additional office hours
2. **Module 09: Spatial (CNNs)** - Complex nested loops
- Focus on memory profiling
- Loop optimization strategies
- Padding/stride calculations
3. **Module 12: Attention** - Attention mechanisms
- Scaling factor importance
- Numerical stability
- Positional encoding issues
### Module-Specific Teaching Notes
**Module 01: Tensor**
- **Key Concept:** Memory layout is crucial for ML performance
- **Demo:** Show `memory_footprint()`, compare copying vs views
- **Watch For:** Students hardcoding float32 instead of using `dtype`
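Both teaching points above can be demonstrated in plain NumPy before students touch their own Tensor class (the `zeros_like_sketch` helper is illustrative, not part of TinyTorch):

```python
import numpy as np

# A view shares the underlying buffer; a copy allocates new memory.
a = np.arange(6, dtype=np.float32).reshape(2, 3)
view = a[0]          # view: no new allocation
copy = a[0].copy()   # copy: independent buffer

view[0] = 99.0       # mutating the view mutates `a` as well
copy[1] = -1.0       # mutating the copy leaves `a` untouched
print(a[0, 0], a[0, 1])  # 99.0 1.0

# Respect the caller's dtype instead of hardcoding float32
def zeros_like_sketch(x):
    return np.zeros(x.shape, dtype=x.dtype)

print(zeros_like_sketch(np.arange(3, dtype=np.float64)).dtype)  # float64
```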
**Module 05: Autograd**
- **Key Concept:** Computational graphs enable deep learning
- **Demo:** Visualize computational graphs, show gradient flow
- **Watch For:** Gradient shape mismatches, disconnected graphs
**Module 09: Spatial (CNNs)**
- **Key Concept:** O(N²) operations become bottlenecks
- **Demo:** Profile convolution memory usage
- **Watch For:** Index out of bounds, missing padding
**Module 12: Attention**
- **Key Concept:** Attention is compute-intensive but powerful
- **Demo:** Profile attention with different sequence lengths
- **Watch For:** Missing scaling factor (1/√d_k), softmax errors
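For office-hour reference, a minimal scaled dot-product attention in NumPy showing where the 1/√d_k factor and the softmax belong (a sketch, not the module's required API):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scores are scaled by 1/sqrt(d_k) so softmax inputs
    don't grow with the head dimension."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # (L_q, L_k)
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((4, 8)) for _ in range(3))
out, w = scaled_dot_product_attention(Q, K, V)
print(w.sum(axis=-1))  # every row sums to 1
```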
**Module 20: Capstone**
- **Key Concept:** Production requires optimization across ALL components
- **Project:** Torch Olympics Competition (4 tracks: Speed, Compression, Accuracy, Efficiency)
### Assessment Strategy
**Continuous Assessment (70%)**
- Module completion: 4% each × 16 modules = 64%
- Checkpoint achievements: 6%
**Projects (30%)**
- Midterm: Build and train CNN on CIFAR-10 (15%)
- Final: Torch Olympics Competition (15%)
### Tracking Student Progress
```bash
# Check specific student
tito checkpoint status --student student_id
# Export class progress
tito checkpoint export --output class_progress.csv
# View module completion rates
tito module status --comprehensive
```
**Identify Struggling Students:**
- Missing checkpoint achievements
- Low scores on systems thinking questions
- Incomplete module submissions
- Late milestone completions
---
## For Teaching Assistants: Student Support
### TA Preparation
**Develop Deep Familiarity With:**
1. **Module 05: Autograd** - Most student questions
2. **Module 09: CNNs** - Complex implementation
3. **Module 13: Transformers** - Advanced concepts
**Preparation Process:**
1. Complete all three critical modules yourself
2. Introduce bugs intentionally
3. Practice debugging scenarios
4. Review past student submissions
### Common Student Errors
#### Module 05: Autograd
**Error 1: Gradient Shape Mismatches**
- Symptom: `ValueError: shapes don't match for gradient`
- Cause: Incorrect gradient accumulation
- Debug: Check gradient shapes match parameter shapes
**Error 2: Disconnected Computational Graph**
- Symptom: Gradients are None or zero
- Cause: Operations not tracked
- Debug: Verify `requires_grad=True`, check graph construction
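A framework-agnostic way to localize both errors above is a finite-difference gradient check: if the numerical estimate disagrees with autograd's output (or autograd returns None where the estimate is nonzero), the bug is in graph construction or accumulation. The helper name is illustrative:

```python
import numpy as np

def finite_diff_grad(f, x, eps=1e-5):
    """Numerically estimate df/dx via central differences;
    compare element-wise against the autograd result."""
    g = np.zeros_like(x)
    for i in np.ndindex(x.shape):
        xp = x.copy(); xp[i] += eps
        xm = x.copy(); xm[i] -= eps
        g[i] = (f(xp) - f(xm)) / (2 * eps)
    return g

x = np.array([1.0, 2.0, 3.0])
f = lambda v: (v ** 2).sum()      # analytic gradient: 2v
print(finite_diff_grad(f, x))     # ≈ [2. 4. 6.]
```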
**Error 3: Broadcasting Failures**
- Symptom: Shape errors during backward pass
- Cause: Incorrect handling of broadcasted operations
- Debug: Check gradient accumulation for broadcasted dims
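The standard fix for broadcast-related shape errors is to sum the upstream gradient back down to the parameter's original shape. A sketch (the `unbroadcast` name is illustrative; many frameworks implement this internally):

```python
import numpy as np

def unbroadcast(grad, shape):
    """Reduce a gradient to the original parameter shape
    after a broadcasted forward op."""
    # Sum out leading axes that broadcasting prepended
    while grad.ndim > len(shape):
        grad = grad.sum(axis=0)
    # Sum over axes that were size-1 in the original
    for axis, dim in enumerate(shape):
        if dim == 1:
            grad = grad.sum(axis=axis, keepdims=True)
    return grad

b = np.zeros((1, 3))                  # bias broadcast over a batch of 4
upstream = np.ones((4, 3))            # gradient w.r.t. (x + b)
print(unbroadcast(upstream, b.shape)) # shape (1, 3), each entry 4.0
```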
#### Module 09: CNNs (Spatial)
**Error 1: Index Out of Bounds**
- Symptom: `IndexError` in convolution loops
- Cause: Incorrect padding/stride calculations
- Debug: Verify output shape calculations
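The output-shape formula is worth checking before writing any convolution loop; most `IndexError`s trace back to it (helper name is illustrative):

```python
def conv_output_size(n, k, padding=0, stride=1):
    """Spatial output size of a conv/pool window:
    out = (n + 2*padding - k) // stride + 1."""
    return (n + 2 * padding - k) // stride + 1

print(conv_output_size(32, 3, padding=1))           # 32 ("same" padding)
print(conv_output_size(32, 3))                      # 30 (valid convolution)
print(conv_output_size(28, 2, stride=2))            # 14 (2x2 pooling)
```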
**Error 2: Memory Issues**
- Symptom: Out of memory errors
- Cause: Creating unnecessary intermediate arrays
- Debug: Profile memory, look for unnecessary copies
#### Module 13: Transformers
**Error 1: Attention Scaling Issues**
- Symptom: Attention weights don't sum to 1
- Cause: Missing softmax or incorrect scaling
- Debug: Verify softmax, check scaling factor (1/√d_k)
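A reference implementation TAs can hand students for the softmax half of this bug: subtracting the row max changes nothing mathematically but prevents `exp` overflow on large logits (a sketch, not the module's required code):

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    z = x - x.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

scores = np.array([[1000.0, 1001.0], [0.0, 0.0]])  # naive exp overflows here
w = softmax(scores)
print(w.sum(axis=-1))   # [1. 1.] -- rows sum to 1, no NaNs
```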
**Error 2: Positional Encoding Errors**
- Symptom: Model doesn't learn positional information
- Cause: Incorrect implementation
- Debug: Verify sinusoidal patterns
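To verify sinusoidal patterns, compare a student's table against a reference implementation. This sketch assumes the standard Vaswani et al. convention (even columns sin, odd columns cos, geometrically increasing wavelengths):

```python
import numpy as np

def sinusoidal_encoding(max_len, d_model):
    """Reference positional encoding:
    PE[pos, 2i]   = sin(pos / 10000^(2i/d_model))
    PE[pos, 2i+1] = cos(pos / 10000^(2i/d_model))"""
    pos = np.arange(max_len)[:, None]
    i = np.arange(d_model // 2)[None, :]
    angles = pos / (10000 ** (2 * i / d_model))
    pe = np.zeros((max_len, d_model))
    pe[:, 0::2] = np.sin(angles)
    pe[:, 1::2] = np.cos(angles)
    return pe

pe = sinusoidal_encoding(50, 16)
print(pe[0, :4])   # position 0 alternates sin(0)=0, cos(0)=1 -> [0. 1. 0. 1.]
```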
### Debugging Strategy
**Guide students with questions, not answers:**
1. "What error message are you seeing?" - Read full traceback
2. "What did you expect to happen?" - Clarify mental model
3. "What actually happened?" - Compare expected vs actual
4. "What have you tried?" - Avoid repeating failed approaches
5. "Can you test with a simpler case?" - Reduce complexity
### Productive vs Unproductive Struggle
**Productive Struggle (encourage):**
- Trying different approaches
- Making incremental progress
- Understanding error messages
- Passing more tests over time
**Unproductive Frustration (intervene):**
- Repeated identical errors
- Random code changes
- Unable to articulate the problem
- No progress after 30+ minutes
### Office Hour Patterns
**Expected Demand Spikes:**
- **Weeks 5-6 (Module 05: Autograd)**: Highest demand
- Schedule 2× TA capacity
- Pre-record debugging walkthroughs
- Create FAQ document
- **Week 10 (Module 09: CNNs)**: High demand
- Focus on memory profiling
- Loop optimization
- Padding/stride help
- **Week 13 (Module 13: Transformers)**: Moderate-high
- Attention debugging
- Scaling problems
- Architecture questions
### Manual Review Focus
While auto-grading handles 70%, focus manual review on:
1. **Code Quality**
- Readability
- Design choices
- Documentation
2. **Edge Case Handling**
- Appropriate checks
- Error handling
- Boundary conditions
3. **Systems Thinking**
- Memory analysis
- Performance understanding
- Scaling awareness
### Teaching Tips
1. **Encourage Exploration** - Let students try different approaches
2. **Connect to Production** - Reference PyTorch equivalents
3. **Make Systems Visible** - Profile memory, analyze complexity together
4. **Build Confidence** - Acknowledge progress and validate understanding
---
## Troubleshooting Common Issues
### Environment Problems
```bash
# Student fix:
tito system doctor
tito system reset
```
### Module Import Errors
```bash
# Rebuild package
tito module complete N
```
### Test Failures
```bash
# Detailed test output
tito module test N --verbose
```
### NBGrader Issues
**Database Locked**
```bash
# Clear and reinitialize
rm gradebook.db
tito grade setup
```
**Missing Submissions**
```bash
# Check submission directory
ls submitted/*/MODULE/
```
---
## Additional Resources
- **[Complete Course Structure](chapters/00-introduction.md)** - Full curriculum overview
- **[Student Getting Started](getting-started.md)** - Send this to students
- **[CLI Documentation](tito/overview.md)** - Detailed command reference
- **[Troubleshooting Guide](tito/troubleshooting.md)** - Common issues and solutions
- **[GitHub Discussions](https://github.com/mlsysbook/TinyTorch/discussions)** - Community support
- **[Issue Tracker](https://github.com/mlsysbook/TinyTorch/issues)** - Report bugs
---
## Contact & Support
**Need help?**
- Open an issue on GitHub
- Join discussions forum
- Email: support@mlsysbook.ai (if available)
**Contributing:**
- Sample solutions welcome
- Teaching material improvements
- Bug fixes and enhancements
---
<div style="background: #d4edda; border: 1px solid #28a745; padding: 1.5rem; border-radius: 0.5rem; margin: 2rem 0;">
<h3 style="margin: 0 0 0.5rem 0; color: #155724;">✅ You're Ready to Teach!</h3>
<p style="margin: 0; color: #155724;">
With NBGrader integration, automated grading, and comprehensive teaching materials, you have everything needed to run a successful ML systems course.
</p>
</div>


@@ -71,7 +71,7 @@ tito community join
tito benchmark baseline
```
All community data is stored locally in `.tinytorch/` directory. See **[Community Guide](community.html)** for complete features.
All community data is stored locally in `.tinytorch/` directory. See **[Community Guide](community.md)** for complete features.
### The TinyTorch Build Cycle
@@ -136,7 +136,7 @@ Each milestone has a README explaining:
- Expected results
- What you're learning
**📖 See [Historical Milestones](chapters/milestones.html)** for the complete progression through ML history.
**📖 See [Historical Milestones](chapters/milestones.md)** for the complete progression through ML history.
### Your First Module (15 Minutes)
@@ -170,7 +170,7 @@ TinyTorch has 20 modules organized in progressive tiers:
- **Optimization (14-19)**: Production optimization - profiling, quantization, benchmarking
- **Capstone (20)**: Torch Olympics Competition
**📖 See [Complete Course Structure](chapters/00-introduction.html)** for detailed module descriptions.
**📖 See [Complete Course Structure](chapters/00-introduction.md)** for detailed module descriptions.
### Essential Commands Reference
@@ -191,7 +191,7 @@ tito community join
tito benchmark baseline
```
**📖 See [TITO CLI Reference](tito/overview.html)** for complete command documentation.
**📖 See [TITO CLI Reference](tito/overview.md)** for complete command documentation.
### Notebook Platform Options
@@ -416,7 +416,7 @@ tito grade export --module 01_tensor --output grades_module01.csv
- **System health monitoring** (`tito module status --comprehensive`)
- **Community support** via GitHub Issues
**📖 See [Complete Course Structure](chapters/00-introduction.html)** for full curriculum overview.
**📖 See [Complete Course Structure](chapters/00-introduction.md)** for full curriculum overview.
---


@@ -82,7 +82,7 @@ tito checkpoint status # View your progress
This provides 21 capability checkpoints corresponding to modules and validates your understanding. Helpful for self-assessment but **not required** for the core workflow.
**📖 See [Essential Commands](tito-essentials.md)** for checkpoint commands.
**📖 See [Essential Commands](tito/overview.md)** for checkpoint commands.
---


@@ -99,7 +99,7 @@ You need TWO things to start building:
### Python & NumPy
**[NumPy Quickstart Tutorial](https://numpy.org/doc/stable/user/quickstart.html)**
**[NumPy Quickstart Tutorial](https://numpy.org/doc/stable/user/quickstart.md)**
- Essential NumPy operations and array manipulation
- **Review before Module 01** if NumPy is unfamiliar
@@ -121,14 +121,14 @@ You need TWO things to start building:
## Next Steps
**Ready to Build:**
- See [Quick Start Guide](quickstart-guide.html) for hands-on experience
- See [Student Workflow](student-workflow.html) for development process
- See [Course Structure](chapters/00-introduction.html) for full curriculum
- See [Quick Start Guide](quickstart-guide.md) for hands-on experience
- See [Student Workflow](student-workflow.md) for development process
- See [Course Structure](chapters/00-introduction.md) for full curriculum
**Need More Context:**
- See [Additional Resources](resources.html) for broader ML learning materials
- See [FAQ](faq.html) for common questions about TinyTorch
- See [Community](community.html) to connect with other learners
- See [Additional Resources](resources.md) for broader ML learning materials
- See [FAQ](faq.md) for common questions about TinyTorch
- See [Community](community.md) to connect with other learners
---


@@ -74,7 +74,7 @@ tito benchmark baseline
- Your "Hello World" moment!
- Generates score and saves results locally
See [Community Guide](community.html) for complete features.
See [Community Guide](community.md) for complete features.
</div>


@@ -208,7 +208,7 @@ tito community profile
tito community update
```
**Privacy:** All information is optional. Data is stored locally in `.tinytorch/` directory. See [Community Guide](community.html) for details.
**Privacy:** All information is optional. Data is stored locally in `.tinytorch/` directory. See [Community Guide](community.md) for details.
### Benchmark Your Progress
@@ -226,7 +226,7 @@ tito benchmark capstone --track all
**Capstone Benchmark:** Full performance evaluation across speed, compression, accuracy, and efficiency tracks.
See [Community Guide](community.html) for complete community and benchmarking features.
See [Community Guide](community.md) for complete community and benchmarking features.
## Instructor Integration


@@ -382,4 +382,4 @@ tito olympics leaderboard
---
**[← Back to Home](../intro)** • **[View Leaderboard](../leaderboard)** • **[Competition Rules](../olympics-rules)**
**[← Back to Home](../intro)**


@@ -187,13 +187,13 @@ tito milestone run 03
| Command | Description | Guide |
|---------|-------------|-------|
| `tito community join` | Join the community (optional info) | [Community Guide](../community.html) |
| `tito community update` | Update your community profile | [Community Guide](../community.html) |
| `tito community profile` | View your community profile | [Community Guide](../community.html) |
| `tito community stats` | View community statistics | [Community Guide](../community.html) |
| `tito community leave` | Remove your community profile | [Community Guide](../community.html) |
| `tito community join` | Join the community (optional info) | [Community Guide](../community.md) |
| `tito community update` | Update your community profile | [Community Guide](../community.md) |
| `tito community profile` | View your community profile | [Community Guide](../community.md) |
| `tito community stats` | View community statistics | [Community Guide](../community.md) |
| `tito community leave` | Remove your community profile | [Community Guide](../community.md) |
**See**: [Community Guide](../community.html) for complete details
**See**: [Community Guide](../community.md) for complete details
### Benchmark Commands
@@ -201,10 +201,10 @@ tito milestone run 03
| Command | Description | Guide |
|---------|-------------|-------|
| `tito benchmark baseline` | Quick setup validation ("Hello World") | [Community Guide](../community.html) |
| `tito benchmark capstone` | Full Module 20 performance evaluation | [Community Guide](../community.html) |
| `tito benchmark baseline` | Quick setup validation ("Hello World") | [Community Guide](../community.md) |
| `tito benchmark capstone` | Full Module 20 performance evaluation | [Community Guide](../community.md) |
**See**: [Community Guide](../community.html) for complete details
**See**: [Community Guide](../community.md) for complete details
### Developer Commands
@@ -370,9 +370,9 @@ tito milestone run --help
## Related Resources
- **[Getting Started Guide](../getting-started.html)** - Complete setup and first steps
- **[Module Workflow](modules.html)** - Day-to-day development cycle
- **[Datasets Guide](../datasets.html)** - Understanding TinyTorch datasets
- **[Getting Started Guide](../getting-started.md)** - Complete setup and first steps
- **[Module Workflow](modules.md)** - Day-to-day development cycle
- **[Datasets Guide](../datasets.md)** - Understanding TinyTorch datasets
---


@@ -0,0 +1,217 @@
# ⚡ Quick Reference
**One-page cheatsheet for experienced developers**
## Essential Commands
### Setup & Verification
```bash
# Initial setup
git clone https://github.com/mlsysbook/TinyTorch.git
cd TinyTorch
./setup-environment.sh
source activate.sh
# Verify installation
tito system doctor
# System information
tito system info
```
### Core Workflow
```bash
# 1. Edit module in Jupyter
cd modules/NN_name
jupyter lab NN_name.ipynb
# 2. Export to package
tito module complete N
# 3. Validate (run milestone scripts)
cd milestones/MM_YYYY_name
python script.py
```
### Module Management
```bash
# Export specific module
tito module complete 01
# Check module status
tito checkpoint status
# System reset (if needed)
tito system reset
```
### Community & Benchmarking
```bash
# Join community (optional)
tito community join
# Run baseline benchmark
tito benchmark baseline
# View profile
tito community profile
```
## Module Dependencies
### Foundation (Required for All)
**Modules 01-07**: Tensor, Activations, Layers, Losses, Autograd, Optimizers, Training
**Unlocks:**
- Milestone 01: Perceptron (1957)
- Milestone 02: XOR Crisis (1969)
- Milestone 03: MLP (1986)
### Architecture (Vision + Language)
**Modules 08-13**: DataLoader, Convolutions, Tokenization, Embeddings, Attention, Transformers
**Unlocks:**
- Milestone 04: CNN (1998) - requires M01-09
- Milestone 05: Transformer (2017) - requires M01-13
### Optimization (Production)
**Modules 14-19**: Profiling, Quantization, Compression, Memoization, Acceleration, Benchmarking
**Unlocks:**
- Milestone 06: Torch Olympics (2018) - requires M01-19
### Capstone
**Module 20**: Torch Olympics Competition
## Common Workflows
### Starting a New Module
```bash
# Navigate to module directory
cd modules/05_autograd
# Open in Jupyter
jupyter lab 05_autograd.ipynb
# Implement required functions
# Run inline tests
# Add docstrings
# Export when complete
cd ../..
tito module complete 05
```
### Debugging Module Errors
```bash
# Check system health
tito system doctor
# View detailed error logs
tito module complete N --verbose
# Reset if corrupted
tito system reset
# Reimport in Python
python
>>> from tinytorch import *
>>> # Test your implementation
```
### Running Milestones
```bash
# After completing Foundation (M01-07)
cd milestones/01_1957_perceptron
python 01_rosenblatt_forward.py
# After completing Architecture (M01-09)
cd milestones/04_1998_cnn
python 01_lenet_inference.py
python 02_lenet_training.py
# After completing Optimization (M01-19)
cd milestones/06_2018_mlperf
python benchmark.py
```
## File Structure
```
TinyTorch/
├── modules/ # Source notebooks (edit these)
│ ├── 01_tensor/
│ │ └── 01_tensor.ipynb
│ └── ...
├── tinytorch/ # Exported package (auto-generated)
│ ├── core/
│ ├── nn/
│ └── ...
├── milestones/ # Validation scripts (run these)
│ ├── 01_1957_perceptron/
│ ├── 02_1969_xor/
│ └── ...
└── tito/ # CLI tool
```
## Import Patterns
```python
# After exporting modules
from tinytorch.core.tensor import Tensor # M01
from tinytorch.nn.activations import ReLU, Softmax # M02
from tinytorch.nn.layers import Linear # M03
from tinytorch.nn.losses import CrossEntropyLoss # M04
from tinytorch.autograd import backward # M05
from tinytorch.optim import SGD, Adam # M06
from tinytorch.training import Trainer # M07
```
## Troubleshooting Quick Fixes
| Issue | Quick Fix |
|-------|-----------|
| `ModuleNotFoundError: tinytorch` | Run `tito module complete N` to export |
| `tito: command not found` | Run `source activate.sh` |
| Import works in notebook, fails in Python | Restart Python kernel after export |
| Tests pass in notebook, fail in milestone | Check module dependencies (M01-07 required) |
| OOM errors | Profile memory usage, check for unnecessary copies |
| NaN losses | Check gradient flow, activation stability |
## Performance Expectations
### Baseline Benchmarks (Your Hardware May Vary)
- **M01-07**: Perceptron trains in <1 second
- **M01-09**: CIFAR-10 CNN trains in 10-30 minutes (CPU)
- **M01-13**: Small transformer inference in 1-5 seconds
- **M01-19**: Optimized models run 10-40× faster
### Memory Usage
- **M01**: Tensor operations ~100 MB
- **M01-09**: CNN training ~1-2 GB
- **M01-13**: Transformer training ~2-4 GB
- **M01-19**: Optimized models use 50-90% less memory
## NBGrader (For Students in Courses)
```bash
# If using NBGrader in classroom setting
# Submit your completed notebook
# Do NOT submit the exported package
# Grading components:
# - 70% Auto-graded (tests)
# - 30% Manual (systems thinking questions)
```
## Next Steps
- **New to TinyTorch?** Start with [Getting Started](../getting-started.md)
- **Stuck on a module?** Check [Troubleshooting](troubleshooting.md)
- **Need detailed docs?** See [CLI Documentation](overview.md)
- **Teaching a course?** See [For Instructors & TAs](../for-instructors.md)
---
**💡 Pro Tip**: Bookmark this page for quick command reference while building!


@@ -34,7 +34,7 @@
</div>
<div>
<ul style="margin: 0; padding-left: 1rem;">
<li><strong>Complete instructor guide</strong> with setup & grading ([available now](../instructor-guide.html))</li>
<li><strong>Complete instructor guide</strong> with setup & grading ([available now](../instructor-guide.md))</li>
<li><strong>Flexible pacing</strong> (14-18 weeks depending on depth)</li>
<li><strong>Industry practices</strong> (Git, testing, documentation)</li>
<li><strong>Academic foundation</strong> from university research</li>
@@ -46,7 +46,7 @@
**Planned Course Duration:** 14-16 weeks (flexible pacing)
**Student Outcome:** Complete ML framework supporting vision AND language models
**Current Status:** Complete NBGrader integration available! See the [Instructor Guide](../instructor-guide.html) for setup, grading workflows, and sample solutions.
**Current Status:** Complete NBGrader integration available! See the [Instructor Guide](../instructor-guide.md) for setup, grading workflows, and sample solutions.
---


@@ -62,7 +62,7 @@ Using scripts directly:
## 📖 Detailed Documentation
See **[PDF_BUILD_GUIDE.md](PDF_BUILD_GUIDE.md)** for:
See PDF build documentation for:
- Complete setup instructions
- Troubleshooting guide
- Configuration options
@@ -78,7 +78,7 @@ site/
├── _static/ # Images, CSS, JavaScript
├── intro.md # Book introduction
├── quickstart-guide.md # Quick start for students
├── tito-essentials.md # CLI reference
├── tito/overview.md # CLI reference
└── ... # Additional course pages
```


@@ -1,2 +0,0 @@
# This file tells GitHub Pages not to use Jekyll processing
# Required for Jupyter Book deployment


@@ -1,61 +0,0 @@
# TinyTorch Book Build Makefile
# Convenient shortcuts for building HTML and PDF versions
.PHONY: help html pdf pdf-simple clean install test
help:
	@echo "TinyTorch Book Build Commands"
	@echo "=============================="
	@echo ""
	@echo " make html - Build HTML version (default website)"
	@echo " make pdf - Build PDF via LaTeX (requires LaTeX installation)"
	@echo " make pdf-simple - Build PDF via HTML (no LaTeX required)"
	@echo " make clean - Remove all build artifacts"
	@echo " make install - Install Python dependencies"
	@echo " make install-pdf - Install dependencies for PDF building"
	@echo " make test - Test build configuration"
	@echo ""
	@echo "Quick start for PDF:"
	@echo " make install-pdf && make pdf-simple"
	@echo ""

html:
	@echo "🌐 Building HTML version..."
	@echo "📓 Preparing notebooks for launch buttons..."
	@./prepare_notebooks.sh || echo "⚠️ Notebook preparation skipped (tito not available)"
	@echo ""
	jupyter-book build .

pdf:
	@echo "📚 Building PDF via LaTeX..."
	@./build_pdf.sh

pdf-simple:
	@echo "📚 Building PDF via HTML..."
	@./build_pdf_simple.sh

clean:
	@echo "🧹 Cleaning build artifacts..."
	jupyter-book clean . --all
	rm -rf _build/

install:
	@echo "📦 Installing base dependencies..."
	pip install -U pip
	pip install "jupyter-book<1.0"
	pip install -r requirements.txt

install-pdf:
	@echo "📦 Installing PDF dependencies..."
	pip install -U pip
	pip install "jupyter-book<1.0" pyppeteer
	pip install -r requirements.txt

test:
	@echo "🧪 Testing build configuration..."
	jupyter-book config sphinx .
	@echo "✅ Configuration valid"

# Default target
.DEFAULT_GOAL := help


@@ -1,104 +0,0 @@
# TinyTorch: Build ML Systems from Scratch
# Interactive Jupyter Book Configuration
# Branding: Use stylized "Tiny🔥Torch" for public-facing site branding
# This matches the branding convention for memorable, personality-driven presentation
title: "Tiny🔥Torch"
author: "Prof. Vijay Janapa Reddi (Harvard University)"
copyright: "2025"
# Logo: Updated to use standard logo (replaces white version for better visibility)
logo: _static/logos/logo-tinytorch.png
# Book description and metadata
description: >-
An interactive course for building machine learning systems from the ground up.
Learn by implementing your own PyTorch-style framework with hands-on coding,
real datasets, and production-ready practices.
# Execution settings for interactive notebooks
execute:
execute_notebooks: "cache"
allow_errors: true
timeout: 300
# Exclude patterns - don't scan these directories/files
exclude_patterns:
- _build
- .venv
- appendices
- "**/.venv/**"
- "**/__pycache__/**"
- "**/.DS_Store"
- "modules/**/*.md"
- "!modules/*_ABOUT.md"
# GitHub repository configuration for GitHub Pages
repository:
url: https://github.com/mlsysbook/TinyTorch
path_to_book: site
branch: main
# HTML output configuration
html:
use_issues_button: true
use_repository_button: true
use_edit_page_button: true
use_download_button: true
use_fullscreen_button: true
# Custom styling
extra_css:
- _static/custom.css
# Custom JavaScript
extra_js:
- _static/wip-banner.js
- _static/ml-timeline.js
- _static/hero-carousel.js
- _static/sidebar-link.js
- _static/marimo-badges.js
# Favicon configuration
favicon: "_static/favicon.svg"
# Binder integration for executable notebooks
launch_buttons:
binderhub_url: "https://mybinder.org"
colab_url: "https://colab.research.google.com"
# LaTeX/PDF output
latex:
latex_documents:
targetname: tinytorch-course.tex
# Bibliography support
bibtex_bibfiles:
- references.bib
# Sphinx extensions for enhanced functionality
sphinx:
extra_extensions:
- sphinxcontrib.mermaid
config:
mermaid_version: "10.6.1"
# Sidebar collapsible sections configuration
html_theme_options:
show_navbar_depth: 1 # Initial expanded depth (1 = top-level only)
collapse_navigation: false # Allow navigation to be collapsible
navigation_depth: 4 # Maximum depth for navigation tree
# Parse configuration for MyST Markdown
parse:
myst_enable_extensions:
- "colon_fence"
- "deflist"
- "html_admonition"
- "html_image"
- "linkify"
- "replacements"
- "smartquotes"
- "substitution"
- "tasklist"
# Advanced options
only_build_toc_files: true

View File

@@ -1,117 +0,0 @@
# TinyTorch: Build ML Systems from Scratch
# Table of Contents Structure
format: jb-book
root: intro
title: "TinyTorch Course"
parts:
# Getting Started - Consolidated single entry point
- caption: 🚀 Getting Started
chapters:
- file: getting-started
title: "Complete Guide"
# Foundation Tier - Collapsible section
- caption: 🏗 Foundation Tier (01-07)
chapters:
- file: tiers/foundation
title: "📖 Tier Overview"
- file: modules/01_tensor_ABOUT
title: "01. Tensor"
- file: modules/02_activations_ABOUT
title: "02. Activations"
- file: modules/03_layers_ABOUT
title: "03. Layers"
- file: modules/04_losses_ABOUT
title: "04. Losses"
- file: modules/05_autograd_ABOUT
title: "05. Autograd"
- file: modules/06_optimizers_ABOUT
title: "06. Optimizers"
- file: modules/07_training_ABOUT
title: "07. Training"
# Architecture Tier - Collapsible section
- caption: 🏛️ Architecture Tier (08-13)
chapters:
- file: tiers/architecture
title: "📖 Tier Overview"
- file: modules/08_dataloader_ABOUT
title: "08. DataLoader"
- file: modules/09_spatial_ABOUT
title: "09. Convolutions"
- file: modules/10_tokenization_ABOUT
title: "10. Tokenization"
- file: modules/11_embeddings_ABOUT
title: "11. Embeddings"
- file: modules/12_attention_ABOUT
title: "12. Attention"
- file: modules/13_transformers_ABOUT
title: "13. Transformers"
# Optimization Tier - Collapsible section
- caption: ⏱️ Optimization Tier (14-19)
chapters:
- file: tiers/optimization
title: "📖 Tier Overview"
- file: modules/14_profiling_ABOUT
title: "14. Profiling"
- file: modules/15_quantization_ABOUT
title: "15. Quantization"
- file: modules/16_compression_ABOUT
title: "16. Compression"
- file: modules/17_memoization_ABOUT
title: "17. Memoization"
- file: modules/18_acceleration_ABOUT
title: "18. Acceleration"
- file: modules/19_benchmarking_ABOUT
title: "19. Benchmarking"
# Capstone Competition - Collapsible section
- caption: 🏅 Capstone Competition
chapters:
- file: tiers/olympics
title: "📖 Competition Overview"
- file: modules/20_capstone_ABOUT
title: "20. Torch Olympics"
# Course Orientation - Collapsible section
- caption: 🧭 Course Orientation
chapters:
- file: chapters/00-introduction
title: "Course Structure"
- file: prerequisites
title: "Prerequisites & Resources"
- file: chapters/learning-journey
title: "Learning Journey"
- file: chapters/milestones
title: "Historical Milestones"
- file: faq
title: "FAQ"
# TITO CLI Reference - Collapsible section
- caption: 🛠️ TITO CLI Reference
chapters:
- file: tito/overview
title: "Command Overview"
- file: tito/modules
title: "Module Workflow"
- file: tito/milestones
title: "Milestone System"
- file: tito/data
title: "Progress & Data"
- file: tito/troubleshooting
title: "Troubleshooting"
- file: datasets
title: "Datasets Guide"
# Community - Collapsible section
- caption: 🤝 Community
chapters:
- file: community
title: "Ecosystem"
- file: resources
title: "Learning Resources"
- file: credits
title: "Credits & Acknowledgments"

View File

@@ -1,70 +0,0 @@
#!/bin/bash
# TinyTorch Website Build Script
# Jupyter Book 1.x (Sphinx) Build System
# Quick and easy: ./site/build.sh (from root) or ./build.sh (from site/)
set -e # Exit on error
echo "🏗️ Building TinyTorch documentation website (Jupyter Book 1.x)..."
echo ""
# Detect where we're running from and navigate to site directory
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
SITE_DIR=""
PROJECT_ROOT=""
if [ -f "_config.yml" ]; then
# Already in site directory
SITE_DIR="$(pwd)"
PROJECT_ROOT="$(dirname "$SITE_DIR")"
elif [ -f "site/_config.yml" ]; then
# In root directory
PROJECT_ROOT="$(pwd)"
SITE_DIR="$(pwd)/site"
cd "$SITE_DIR"
echo "📂 Changed to site directory: $SITE_DIR"
else
echo "❌ Error: Cannot find site directory with _config.yml"
echo " Run from project root or site/ directory"
exit 1
fi
# Activate virtual environment if it exists and we're not already in it
if [ -z "$VIRTUAL_ENV" ] && [ -f "$PROJECT_ROOT/.venv/bin/activate" ]; then
echo "🔧 Activating virtual environment..."
source "$PROJECT_ROOT/.venv/bin/activate"
elif [ -z "$VIRTUAL_ENV" ]; then
echo "⚠️ Warning: No virtual environment detected"
echo " Recommend running: source scripts/activate-tinytorch"
fi
# Verify jupyter-book is available
if ! command -v jupyter-book &> /dev/null; then
echo "❌ Error: jupyter-book not found"
echo " Install with: pip install jupyter-book"
exit 1
fi
echo "📦 Using: $(which jupyter-book)"
echo " Version: $(jupyter-book --version | head -1)"
echo ""
# Clean previous build
if [ -d "_build" ]; then
echo "🧹 Cleaning previous build..."
jupyter-book clean .
echo ""
fi
# Build the site
echo "🚀 Building Jupyter Book site..."
echo ""
jupyter-book build . --all
echo ""
echo "✅ Build complete!"
echo ""
echo "📖 To view the site locally:"
echo " python -m http.server 8000 --directory _build/html"
echo " Then open: http://localhost:8000"
echo ""

View File

@@ -1,73 +0,0 @@
#!/bin/bash
# Build PDF version of TinyTorch book
# This script builds the LaTeX/PDF version using jupyter-book
set -e # Exit on error
echo "🔥 Building TinyTorch PDF..."
echo ""
# Check if we're in the site directory
if [ ! -f "_config.yml" ]; then
echo "❌ Error: Must run from site/ directory"
echo "Usage: cd site && ./build_pdf.sh"
exit 1
fi
# Check dependencies
echo "📋 Checking dependencies..."
if ! command -v jupyter-book &> /dev/null; then
echo "❌ Error: jupyter-book not installed"
echo "Install with: pip install jupyter-book"
exit 1
fi
if ! command -v pdflatex &> /dev/null; then
echo "⚠️ Warning: pdflatex not found"
echo "PDF build requires LaTeX installation:"
echo " - macOS: brew install --cask mactex-no-gui"
echo " - Ubuntu: sudo apt-get install texlive-latex-extra texlive-fonts-recommended"
echo " - Windows: Install MiKTeX from miktex.org"
echo ""
echo "Alternatively, use HTML-to-PDF build (doesn't require LaTeX):"
echo " jupyter-book build . --builder pdfhtml"
exit 1
fi
echo "✅ Dependencies OK"
echo ""
# Clean previous builds
echo "🧹 Cleaning previous builds..."
jupyter-book clean . --all || true
echo ""
# Prepare notebooks (for consistency, though PDF doesn't need launch buttons)
echo "📓 Preparing notebooks..."
./prepare_notebooks.sh || echo "⚠️ Notebook preparation skipped"
# Build PDF via LaTeX
echo "📚 Building LaTeX/PDF (this may take a few minutes)..."
jupyter-book build . --builder pdflatex
# Check if build succeeded
if [ -f "_build/latex/tinytorch-course.pdf" ]; then
PDF_SIZE=$(du -h "_build/latex/tinytorch-course.pdf" | cut -f1)
echo ""
echo "✅ PDF build complete!"
echo "📄 Output: site/_build/latex/tinytorch-course.pdf"
echo "📊 Size: ${PDF_SIZE}"
echo ""
echo "To view the PDF:"
echo " open _build/latex/tinytorch-course.pdf # macOS"
echo " xdg-open _build/latex/tinytorch-course.pdf # Linux"
echo " start _build/latex/tinytorch-course.pdf # Windows"
else
echo ""
echo "❌ PDF build failed - check errors above"
echo ""
echo "📝 Build artifacts in: _build/latex/"
echo "Check _build/latex/tinytorch-course.log for detailed errors"
exit 1
fi

View File

@@ -1,70 +0,0 @@
#!/bin/bash
# Build PDF version of TinyTorch book (Simple HTML-to-PDF method)
# This script builds PDF via HTML conversion - no LaTeX installation required
set -e # Exit on error
echo "🔥 Building TinyTorch PDF (Simple Method - No LaTeX Required)..."
echo ""
# Check if we're in the site directory
if [ ! -f "_config.yml" ]; then
echo "❌ Error: Must run from site/ directory"
echo "Usage: cd site && ./build_pdf_simple.sh"
exit 1
fi
# Check dependencies
echo "📋 Checking dependencies..."
if ! command -v jupyter-book &> /dev/null; then
echo "❌ Error: jupyter-book not installed"
echo "Install with: pip install jupyter-book pyppeteer"
exit 1
fi
# Check if pyppeteer is installed
python3 -c "import pyppeteer" 2>/dev/null || {
echo "❌ Error: pyppeteer not installed"
echo "Install with: pip install pyppeteer"
echo ""
echo "Note: First run will download Chromium (~170MB)"
exit 1
}
echo "✅ Dependencies OK"
echo ""
# Clean previous builds
echo "🧹 Cleaning previous builds..."
jupyter-book clean . --all || true
echo ""
# Prepare notebooks (for consistency, though PDF doesn't need launch buttons)
echo "📓 Preparing notebooks..."
./prepare_notebooks.sh || echo "⚠️ Notebook preparation skipped"
# Build PDF via HTML
echo "📚 Building PDF from HTML (this may take a few minutes)..."
echo " First run will download Chromium browser (~170MB)"
jupyter-book build . --builder pdfhtml
# Check if build succeeded
if [ -f "_build/pdf/book.pdf" ]; then
# Copy to standard location with better name
cp "_build/pdf/book.pdf" "_build/tinytorch-course.pdf"
PDF_SIZE=$(du -h "_build/tinytorch-course.pdf" | cut -f1)
echo ""
echo "✅ PDF build complete!"
echo "📄 Output: site/_build/tinytorch-course.pdf"
echo "📊 Size: ${PDF_SIZE}"
echo ""
echo "To view the PDF:"
echo " open _build/tinytorch-course.pdf # macOS"
echo " xdg-open _build/tinytorch-course.pdf # Linux"
echo " start _build/tinytorch-course.pdf # Windows"
else
echo ""
echo "❌ PDF build failed - check errors above"
exit 1
fi

View File

@@ -1,73 +0,0 @@
# TinyTorch PDF Book Generation
This directory contains the configuration for generating the TinyTorch course as a PDF book.
## Building the PDF
To build the PDF version of the TinyTorch course:
```bash
# Install Jupyter Book if not already installed
pip install jupyter-book
# Build the PDF (from the docs/ directory)
jupyter-book build . --builder pdflatex
# Or from the repository root:
jupyter-book build docs --builder pdflatex
```
The generated PDF will be in `docs/_build/latex/tinytorch-course.pdf`.
## Structure
- `_config_pdf.yml` - Jupyter Book configuration optimized for PDF output
- `_toc_pdf.yml` - Linear table of contents for the PDF book
- `cover.md` - Cover page for the PDF
- `preface.md` - Preface explaining the book's approach and philosophy
## Content Sources
The PDF pulls content from:
- **Module ABOUT.md files**: `../modules/XX_*/ABOUT.md` - Core technical content
- **Site files**: `../site/*.md` - Introduction, quick start guide, resources
- **Site chapters**: `../site/chapters/*.md` - Course overview and milestones
All content is sourced from a single location and reused for both the website and PDF, ensuring consistency.
## Customization
### PDF-Specific Settings
The `_config_pdf.yml` includes PDF-specific settings:
- Disabled notebook execution (`execute_notebooks: "off"`)
- LaTeX engine configuration
- Custom page headers and formatting
- Paper size and typography settings
### Chapter Ordering
The `_toc_pdf.yml` provides linear chapter ordering suitable for reading cover-to-cover, unlike the website's multi-section structure.
## Dependencies
Building the PDF requires:
- `jupyter-book`
- `pyppeteer` (for HTML to PDF conversion)
- LaTeX distribution (e.g., TeX Live, MiKTeX)
- `latexmk` (usually included with LaTeX distributions)
## Troubleshooting
**LaTeX errors**: Ensure you have a complete LaTeX distribution installed
**Missing fonts**: Install the required fonts for the logo and styling
**Build timeouts**: Increase the timeout in `_config_pdf.yml` if needed
## Future Enhancements
Planned improvements for the PDF:
- Custom LaTeX styling for code blocks
- Better figure placement and captions
- Index generation
- Cross-reference optimization
- Improved table formatting

File diff suppressed because it is too large

View File

@@ -1,39 +0,0 @@
###############################################################################
# Auto-generated by `jupyter-book config`
# If you wish to continue using _config.yml, make edits to that file and
# re-generate this one.
###############################################################################
author = 'Prof. Vijay Janapa Reddi (Harvard University)'
bibtex_bibfiles = ['references.bib']
comments_config = {'hypothesis': False, 'utterances': False}
copyright = '2025'
exclude_patterns = ['**.ipynb_checkpoints', '**/.DS_Store', '**/.venv/**', '**/__pycache__/**', '.DS_Store', '.venv', 'Thumbs.db', '_build', 'appendices']
extensions = ['sphinx_togglebutton', 'sphinx_copybutton', 'myst_nb', 'jupyter_book', 'sphinx_thebe', 'sphinx_comments', 'sphinx_external_toc', 'sphinx.ext.intersphinx', 'sphinx_design', 'sphinx_book_theme', 'sphinxcontrib.mermaid', 'sphinxcontrib.bibtex', 'sphinx_jupyterbook_latex', 'sphinx_multitoc_numbering']
external_toc_exclude_missing = True
external_toc_path = '_toc.yml'
html_baseurl = ''
html_css_files = ['custom.css']
html_favicon = '_static/favicon.svg'
html_js_files = ['wip-banner.js', 'ml-timeline.js', 'hero-carousel.js']
html_logo = 'logo-tinytorch-white.png'
html_sourcelink_suffix = ''
html_static_path = ['_static']
html_theme = 'sphinx_book_theme'
html_theme_options = {'search_bar_text': 'Search this book...', 'launch_buttons': {'notebook_interface': 'classic', 'binderhub_url': 'https://mybinder.org', 'jupyterhub_url': '', 'thebe': False, 'colab_url': 'https://colab.research.google.com', 'deepnote_url': ''}, 'path_to_docs': 'site', 'repository_url': 'https://github.com/mlsysbook/TinyTorch', 'repository_branch': 'main', 'extra_footer': '', 'home_page_in_toc': True, 'announcement': '', 'analytics': {'google_analytics_id': '', 'plausible_analytics_domain': '', 'plausible_analytics_url': 'https://plausible.io/js/script.js'}, 'use_repository_button': True, 'use_edit_page_button': True, 'use_issues_button': True}
html_title = 'TinyTorch'
latex_engine = 'pdflatex'
mermaid_version = '10.6.1'
myst_enable_extensions = ['colon_fence', 'deflist', 'html_admonition', 'html_image', 'linkify', 'replacements', 'smartquotes', 'substitution', 'tasklist']
myst_url_schemes = ['mailto', 'http', 'https']
nb_execution_allow_errors = True
nb_execution_cache_path = ''
nb_execution_excludepatterns = []
nb_execution_in_temp = False
nb_execution_mode = 'cache'
nb_execution_timeout = 300
nb_output_stderr = 'show'
numfig = True
pygments_style = 'sphinx'
suppress_warnings = ['myst.domains']
use_jupyterbook_latex = True
use_multitoc_numbering = True

View File

@@ -1 +0,0 @@
../../src/01_tensor/ABOUT.md

View File

@@ -1 +0,0 @@
../../src/02_activations/ABOUT.md

View File

@@ -1 +0,0 @@
../../src/03_layers/ABOUT.md

View File

@@ -1 +0,0 @@
../../src/04_losses/ABOUT.md

View File

@@ -1 +0,0 @@
../../src/05_autograd/ABOUT.md

View File

@@ -1 +0,0 @@
../../src/06_optimizers/ABOUT.md

View File

@@ -1 +0,0 @@
../../src/07_training/ABOUT.md

View File

@@ -1 +0,0 @@
../../src/08_dataloader/ABOUT.md

View File

@@ -1 +0,0 @@
../../src/09_spatial/ABOUT.md

View File

@@ -1 +0,0 @@
../../src/10_tokenization/ABOUT.md

View File

@@ -1 +0,0 @@
../../src/11_embeddings/ABOUT.md

View File

@@ -1 +0,0 @@
../../src/12_attention/ABOUT.md

View File

@@ -1 +0,0 @@
../../src/13_transformers/ABOUT.md

View File

@@ -1 +0,0 @@
../../src/14_profiling/ABOUT.md

View File

@@ -1 +0,0 @@
../../src/15_quantization/ABOUT.md

View File

@@ -1 +0,0 @@
../../src/16_compression/ABOUT.md

View File

@@ -1 +0,0 @@
../../src/17_memoization/ABOUT.md

View File

@@ -1 +0,0 @@
../../src/18_acceleration/ABOUT.md

View File

@@ -1 +0,0 @@
../../src/19_benchmarking/ABOUT.md

View File

@@ -1 +0,0 @@
../../src/20_capstone/ABOUT.md

View File

@@ -1,77 +0,0 @@
#!/bin/bash
# Prepare notebooks for site build
# This script ensures notebooks exist in site/ for launch buttons to work
# Called automatically during site build
#
# Workflow:
# 1. Uses existing assignment notebooks if available (from tito nbgrader generate)
# 2. Falls back to generating notebooks from modules if needed
# 3. Copies notebooks to site/chapters/modules/ for Jupyter Book launch buttons
set -e
# Get the site directory (where this script lives)
SITE_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
REPO_ROOT="$(cd "$SITE_DIR/.." && pwd)"
echo "📓 Preparing notebooks for site build..."
# Create notebooks directory in site if it doesn't exist
NOTEBOOKS_DIR="$SITE_DIR/chapters/modules"
mkdir -p "$NOTEBOOKS_DIR"
cd "$REPO_ROOT"
# Strategy: Use existing assignment notebooks if available, otherwise generate
# This is faster and uses already-processed notebooks
echo "🔄 Looking for existing assignment notebooks..."
MODULES=$(ls -1 modules/ 2>/dev/null | grep -E "^[0-9]" | sort -V || echo "")
if [ -z "$MODULES" ]; then
echo "⚠️ No modules found. Skipping notebook preparation."
exit 0
fi
NOTEBOOKS_COPIED=0
NOTEBOOKS_GENERATED=0
for module in $MODULES; do
TARGET_NB="$NOTEBOOKS_DIR/${module}.ipynb"
# Check if assignment notebook already exists
ASSIGNMENT_NB="$REPO_ROOT/assignments/source/$module/${module}.ipynb"
if [ -f "$ASSIGNMENT_NB" ]; then
# Use existing assignment notebook
cp "$ASSIGNMENT_NB" "$TARGET_NB"
echo " ✅ Copied existing notebook: $module"
NOTEBOOKS_COPIED=$((NOTEBOOKS_COPIED + 1))
elif command -v tito &> /dev/null; then
# Try to generate notebook if tito is available
echo " 🔄 Generating notebook for $module..."
if tito nbgrader generate "$module" >/dev/null 2>&1; then
if [ -f "$ASSIGNMENT_NB" ]; then
cp "$ASSIGNMENT_NB" "$TARGET_NB"
echo " ✅ Generated and copied: $module"
NOTEBOOKS_GENERATED=$((NOTEBOOKS_GENERATED + 1))
fi
else
echo " ⚠️ Could not generate notebook for $module (module may not be ready)"
fi
else
echo " ⚠️ No notebook found for $module (install tito CLI to generate)"
fi
done
echo ""
if [ $NOTEBOOKS_COPIED -gt 0 ] || [ $NOTEBOOKS_GENERATED -gt 0 ]; then
echo "✅ Notebook preparation complete!"
echo " Copied: $NOTEBOOKS_COPIED | Generated: $NOTEBOOKS_GENERATED"
echo " Notebooks available in: $NOTEBOOKS_DIR"
echo " Launch buttons will now work on notebook pages!"
else
echo "⚠️ No notebooks prepared. Launch buttons may not appear."
echo " Run 'tito nbgrader generate --all' first to create assignment notebooks."
fi

View File

View File

@@ -1,36 +0,0 @@
# TinyTorch Course Dependencies for Site Documentation Builds
# Note: For Binder/Colab environments, see binder/requirements.txt
# Keep synchronized with main requirements.txt
# Core numerical computing
numpy>=1.24.0,<3.0.0
matplotlib>=3.5.0
# Data handling
PyYAML>=6.0
# Rich terminal formatting (for development feedback)
rich>=13.0.0
# Jupyter Book for building documentation
jupyter-book>=1.0.0,<2.0.0
# Jupyter environment
jupyter>=1.0.0
jupyterlab>=4.0.0
ipykernel>=6.0.0
ipywidgets>=8.0.0
# Sphinx extensions
sphinxcontrib-mermaid>=0.9.2
# Type checking support
typing-extensions>=4.0.0
# For executing TinyTorch code
setuptools>=70.0.0
wheel>=0.42.0
# Optional: for advanced visualizations
# plotly>=5.0.0
# seaborn>=0.11.0