🔥 Module: Tensor
📊 Module Info
- Difficulty: ⭐⭐ Intermediate
- Time Estimate: 4-6 hours
- Prerequisites: Setup module
- Next Steps: Activations, Layers
Build the foundation of TinyTorch! This module implements the core Tensor class - the fundamental data structure that powers all neural networks and machine learning operations.
🎯 Learning Objectives
By the end of this module, you will:
- Understand what tensors are and why they're essential for ML
- Implement a complete Tensor class with core operations
- Handle tensor shapes, data types, and memory management efficiently
- Implement element-wise operations and reductions with proper broadcasting
- Have a solid foundation for building neural networks
🧠 Build → Use → Understand
- Build: Complete Tensor class with arithmetic operations, shape management, and reductions
- Use: Create tensors, perform operations, and validate with real data
- Understand: How tensors serve as the foundation for all neural network computations
📚 What You'll Build
Core Tensor Class
```python
# Creating tensors
x = Tensor([[1.0, 2.0], [3.0, 4.0]])
y = Tensor([[0.5, 1.5], [2.5, 3.5]])

# Properties
print(x.shape)   # (2, 2)
print(x.size)    # 4
print(x.dtype)   # float64

# Element-wise operations
z = x + y        # Addition
w = x * y        # Multiplication
p = x ** 2       # Exponentiation

# Shape manipulation
reshaped = x.reshape(4, 1)  # (4, 1)
transposed = x.T            # (2, 2) transposed

# Reductions
total = x.sum()             # Scalar sum
means = x.mean(axis=0)      # Mean along axis 0
```
Essential Operations
- Arithmetic: Addition, subtraction, multiplication, division, powers
- Shape management: Reshape, transpose, broadcasting rules
- Reductions: Sum, mean, min, max along any axis
- Memory handling: Efficient data storage and copying
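To make the target concrete, here is a minimal sketch of how such a Tensor class might wrap NumPy. The names mirror the API shown above; the actual module's implementation will differ in details and completeness:

```python
import numpy as np

class Tensor:
    """Minimal illustrative tensor that wraps a NumPy array."""

    def __init__(self, data):
        # Store everything as a float64 NumPy array for uniform behavior
        self.data = np.asarray(data, dtype=np.float64)

    @property
    def shape(self):
        return self.data.shape

    @property
    def size(self):
        return self.data.size

    @property
    def dtype(self):
        return self.data.dtype

    def _unwrap(self, other):
        # Accept both Tensor and plain scalars/arrays in arithmetic
        return other.data if isinstance(other, Tensor) else other

    def __add__(self, other):
        return Tensor(self.data + self._unwrap(other))

    def __mul__(self, other):
        return Tensor(self.data * self._unwrap(other))

    def __pow__(self, exponent):
        return Tensor(self.data ** exponent)

    def reshape(self, *shape):
        return Tensor(self.data.reshape(*shape))

    @property
    def T(self):
        return Tensor(self.data.T)

    def sum(self, axis=None):
        result = self.data.sum(axis=axis)
        return result if axis is None else Tensor(result)

    def mean(self, axis=None):
        result = self.data.mean(axis=axis)
        return result if axis is None else Tensor(result)
```

Delegating storage and arithmetic to NumPy keeps the class small while you focus on the API design; later modules can swap in more sophisticated internals.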
🚀 Getting Started
Prerequisites Check
```bash
tito checkpoint test 00  # Environment setup should pass ✅
```
Development Workflow
```bash
# Navigate to the tensor module
cd modules/01_tensor

# Open the development file
jupyter lab tensor_dev.py
# OR edit directly: code tensor_dev.py
```
Step-by-Step Implementation
- Basic Tensor class - Constructor and properties
- Shape management - Understanding tensor dimensions
- Arithmetic operations - Addition, multiplication, etc.
- Utility methods - Reshape, transpose, sum, mean
- Error handling - Robust edge case management
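For step 5, one common failure mode is combining tensors with incompatible shapes. A hedged sketch of a shape check (the helper name is hypothetical, not part of the module's actual code):

```python
import numpy as np

def check_broadcastable(a: np.ndarray, b: np.ndarray) -> None:
    """Raise a clear error when two arrays cannot be combined element-wise."""
    try:
        # np.broadcast_shapes applies NumPy's broadcasting rules without
        # allocating any data; it raises ValueError on a mismatch
        np.broadcast_shapes(a.shape, b.shape)
    except ValueError:
        raise ValueError(
            f"Shapes {a.shape} and {b.shape} are not broadcast-compatible"
        ) from None

check_broadcastable(np.zeros((2, 3)), np.zeros((3,)))   # OK: broadcastable
try:
    check_broadcastable(np.zeros((2, 3)), np.zeros((4,)))
except ValueError as e:
    print(e)  # clear, user-facing message instead of a cryptic failure
```

Catching shape errors early, with a message that names both shapes, makes debugging much faster than letting a raw NumPy exception surface deep inside a computation.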
🧪 Testing Your Implementation
Inline Testing
```python
# Test in the notebook or Python REPL
x = Tensor([[1.0, 2.0], [3.0, 4.0]])
print(f"Shape: {x.shape}")  # Should be (2, 2)
print(f"Sum: {x.sum()}")    # Should be 10.0
```
Module Tests
```bash
# Complete and export your tensor implementation
tito module complete 01_tensor

# Test a specific checkpoint
tito checkpoint test 01  # Foundation checkpoint
```
Manual Verification
```python
# Create and test tensors
from tinytorch.core.tensor import Tensor

x = Tensor([1, 2, 3, 4, 5])
y = Tensor([2, 4, 6, 8, 10])

# Test operations
assert (x + y).data.tolist() == [3, 6, 9, 12, 15]
assert (x * 2).data.tolist() == [2, 4, 6, 8, 10]
print("✅ Basic operations working!")
```
🎯 Key Concepts
Tensors as Universal Data Structures
- Scalars: 0-dimensional tensors (single numbers)
- Vectors: 1-dimensional tensors (arrays)
- Matrices: 2-dimensional tensors (common in ML)
- Higher dimensions: Images (3D), video (4D), etc.
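This dimension hierarchy is easy to see with NumPy, which the Tensor class builds on:

```python
import numpy as np

scalar = np.array(3.14)                       # 0-D: a single number
vector = np.array([1.0, 2.0, 3.0])            # 1-D: an array
matrix = np.array([[1.0, 2.0], [3.0, 4.0]])   # 2-D: a matrix
image = np.zeros((28, 28, 3))                 # 3-D: height x width x channels

print(scalar.ndim, vector.ndim, matrix.ndim, image.ndim)  # 0 1 2 3
print(image.shape)                                        # (28, 28, 3)
```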
Why Tensors Matter in ML
- Neural networks: All computations operate on tensors
- GPU acceleration: GPUs execute ML workloads as batched tensor primitives
- Broadcasting: Efficient operations across different shapes
- Vectorization: Process entire datasets simultaneously
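Broadcasting and vectorization are easiest to see in a small example, shown here with NumPy:

```python
import numpy as np

# Broadcasting: a (3,) bias is applied to every row of a (2, 3) batch
batch = np.array([[1.0, 2.0, 3.0],
                  [4.0, 5.0, 6.0]])
bias = np.array([10.0, 20.0, 30.0])

out = batch + bias  # no explicit loop, no materialized copies of `bias`
print(out)
# [[11. 22. 33.]
#  [14. 25. 36.]]
```

The same pattern is how a neural network layer adds one bias vector to a whole batch of inputs in a single vectorized operation.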
Real-World Connections
- PyTorch/TensorFlow: Your implementation mirrors production frameworks
- NumPy: Foundation for scientific computing (we build similar abstractions)
- Production systems: Understanding tensors is essential for ML engineering
Memory and Performance
- Data layout: How tensors store data efficiently
- Broadcasting: Smart operations without data copying
- View vs Copy: Understanding memory management
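The view-versus-copy distinction can be demonstrated with NumPy, whose semantics the Tensor class inherits:

```python
import numpy as np

a = np.arange(6, dtype=np.float64)  # [0. 1. 2. 3. 4. 5.]
v = a.reshape(2, 3)                 # view: shares memory with `a`
c = a.copy()                        # copy: independent storage

a[0] = 99.0
print(v[0, 0])  # 99.0 -- the view sees the change
print(c[0])     # 0.0  -- the copy does not
```

Views make reshapes and transposes essentially free, but they also mean a mutation in one place can surface somewhere unexpected, which is why knowing which operations copy matters.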
🎉 Ready to Build?
The tensor module is where TinyTorch really begins. You're about to create the fundamental building block that will power neural networks, training loops, and production ML systems.
Take your time, test thoroughly, and enjoy building something that really works! 🔥