# 🔥 Module: Tensor
## 📊 Module Info
- **Difficulty**: ⭐⭐ Intermediate
- **Time Estimate**: 4-6 hours
- **Prerequisites**: Setup module
- **Next Steps**: Activations, Layers
Build the foundation of TinyTorch! This module implements the core Tensor class - the fundamental data structure that powers all neural networks and machine learning operations.
## 🎯 Learning Objectives
By the end of this module, you will:
- ✅ Understand what tensors are and why they're essential for ML
- ✅ Implement a complete Tensor class with core operations
- ✅ Handle tensor shapes, data types, and memory management
- ✅ Implement element-wise operations and reductions
- ✅ Have a solid foundation for building neural networks
## 📋 Module Structure

```
modules/tensor/
├── README.md          # 📖 This file - Module overview
├── tensor_dev.ipynb   # 📓 Main development notebook
├── test_tensor.py     # 🧪 Automated tests
└── check_tensor.py    # ✅ Manual verification (coming soon)
```
## 🚀 Getting Started

### Step 1: Complete Prerequisites
Make sure you've completed the setup module:

```bash
python bin/tito.py test --module setup  # Should pass
```
### Step 2: Open the Tensor Notebook

```bash
# Start from the tensor module directory
cd modules/tensor/

# Open the development notebook
jupyter lab tensor_dev.ipynb
```
### Step 3: Work Through the Implementation
The notebook guides you through building:
- **Basic Tensor class** - Constructor and properties
- **Shape management** - Understanding tensor dimensions
- **Arithmetic operations** - Addition, multiplication, etc.
- **Utility methods** - Reshape, transpose, sum, mean
- **Error handling** - Robust edge-case management
### Step 4: Export and Test

```bash
# Export your tensor implementation
python bin/tito.py sync

# Test your implementation
python bin/tito.py test --module tensor
```
## 📚 What You'll Implement

### Core Tensor Class
You'll build a complete `Tensor` class that supports:

**1. Construction and Properties**

```python
# Creating tensors
a = Tensor([1, 2, 3])          # 1D tensor
b = Tensor([[1, 2], [3, 4]])   # 2D tensor
c = Tensor(5.0)                # Scalar tensor

# Properties
print(a.shape)   # (3,)
print(b.size)    # 4
print(c.dtype)   # float32
```
**2. Arithmetic Operations**

```python
# Element-wise operations
result = a + b   # Addition
result = a * 2   # Scalar multiplication
result = a @ b   # Matrix multiplication (bonus)
```
**3. Utility Methods**

```python
# Shape manipulation
reshaped = b.reshape(1, 4)   # Change shape
transposed = b.transpose()   # Swap dimensions

# Reductions
total = a.sum()       # Sum all elements
mean_val = a.mean()   # Average value
max_val = a.max()     # Maximum value
```
### Technical Requirements
Your `Tensor` class must:
- Handle multiple data types (int, float)
- Support N-dimensional arrays
- Implement proper error checking
- Work with NumPy arrays internally
- Export to `tinytorch.core.tensor`
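The requirements above boil down to normalizing whatever the user passes in to a NumPy array. Here is a minimal sketch of that idea; the helper name `as_tensor_array` and the float32 default are assumptions for illustration, not part of the required API:

```python
import numpy as np

def as_tensor_array(data, dtype=None):
    # Hypothetical helper: accept lists, scalars, or existing NumPy
    # arrays and return a NumPy array for internal storage.
    arr = np.asarray(data, dtype=dtype)
    # Many tensor libraries default floats to float32, which matches
    # the float32 dtype shown in the examples above.
    if dtype is None and arr.dtype == np.float64:
        arr = arr.astype(np.float32)
    return arr

print(as_tensor_array(5.0).dtype)               # float32
print(as_tensor_array([[1, 2], [3, 4]]).shape)  # (2, 2)
print(as_tensor_array([1, 2, 3]).dtype.kind)    # i (platform integer)
```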
## 🧪 Testing Your Implementation

### Automated Tests

```bash
python bin/tito.py test --module tensor
```
Tests verify:
- ✅ Tensor creation (scalars, vectors, matrices)
- ✅ Property access (shape, size, dtype)
- ✅ Arithmetic operations (all combinations)
- ✅ Utility methods (reshape, transpose, reductions)
- ✅ Error handling (invalid operations)
### Interactive Testing

```python
# Test in the notebook or Python REPL
from tinytorch.core.tensor import Tensor

# Create and test tensors
a = Tensor([1, 2, 3])
b = Tensor([[1, 2], [3, 4]])
print(a + 5)     # Should work
print(a.sum())   # Should return a scalar
```
## 🎯 Success Criteria
Your tensor module is complete when:
- **All tests pass**: `python bin/tito.py test --module tensor`
- **Tensor imports correctly**: `from tinytorch.core.tensor import Tensor`
- **Basic operations work**: You can create tensors and do arithmetic
- **Properties work**: `shape`, `size`, and `dtype` return correct values
- **Utilities work**: Reshape, transpose, and reductions function properly
## 💡 Implementation Tips

### Start with the Basics
- **Simple constructor** - Handle lists and NumPy arrays
- **Basic properties** - `shape`, `size`, `dtype`
- **One operation** - Start with addition
- **Test frequently** - Verify each feature works
### Design Patterns

```python
class Tensor:
    def __init__(self, data, dtype=None):
        # Convert input to a NumPy array
        # Store shape, size, dtype
        ...

    def __add__(self, other):
        # Handle tensor + tensor
        # Handle tensor + scalar
        # Return a new Tensor
        ...

    def sum(self, axis=None):
        # Reduce along the specified axis
        # Return a scalar or tensor
        ...
```
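One way to flesh out a skeleton like the one above is a thin wrapper over NumPy. This is an illustrative sketch, not the required implementation; the class name `MiniTensor` is hypothetical, and it omits the error checking and dtype policy your real `Tensor` needs:

```python
import numpy as np

class MiniTensor:
    """Minimal NumPy-backed tensor sketch (hypothetical class name)."""

    def __init__(self, data, dtype=None):
        # np.asarray accepts scalars, lists, and existing arrays alike
        self.data = np.asarray(data, dtype=dtype)

    @property
    def shape(self):
        return self.data.shape

    @property
    def size(self):
        return self.data.size

    @property
    def dtype(self):
        return self.data.dtype

    def __add__(self, other):
        # Unwrap MiniTensor operands; NumPy handles the scalar case
        other = other.data if isinstance(other, MiniTensor) else other
        return MiniTensor(self.data + other)

    def sum(self, axis=None):
        return MiniTensor(self.data.sum(axis=axis))

a = MiniTensor([1, 2, 3])
print((a + 5).data)      # [6 7 8]
print(a.sum().data)      # 6
print(a.shape, a.size)   # (3,) 3
```

Notice that each operation returns a *new* tensor rather than mutating `self` - keeping operations pure makes later modules (especially autograd) much easier to reason about.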
### Common Challenges
- **Shape compatibility** - Check dimensions before operating
- **Data type handling** - Convert inputs consistently
- **Memory efficiency** - Don't create unnecessary copies
- **Error messages** - Provide helpful debugging info
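The first and last challenges go together: a shape check is most useful when its error message names both shapes. A small sketch of the idea, with an illustrative (not required) helper name:

```python
import numpy as np

def check_same_shape(a, b):
    # Illustrative helper: fail fast, and name both shapes in the
    # message so the mismatch is obvious when debugging
    if a.shape != b.shape:
        raise ValueError(
            f"shape mismatch: {a.shape} vs {b.shape}; "
            "element-wise ops require identical shapes"
        )

x = np.zeros((2, 3))
y = np.zeros((3, 2))
try:
    check_same_shape(x, y)
except ValueError as err:
    print(err)  # shape mismatch: (2, 3) vs (3, 2); element-wise ops ...
```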
## 🔧 Advanced Features (Optional)
If you finish early, try implementing:
- **Broadcasting** - Operations on different-shaped tensors
- **Slicing** - `tensor[1:3, :]` syntax
- **In-place operations** - `tensor += other`
- **Matrix multiplication** - `tensor @ other`
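Since your class stores a NumPy array internally, broadcasting can largely be inherited from NumPy. A quick illustration of NumPy's rule (shapes are compared trailing-dimension first; dimensions are compatible when equal or when one of them is 1):

```python
import numpy as np

m = np.array([[1, 2], [3, 4]])   # shape (2, 2)
row = np.array([10, 20])         # shape (2,) stretches across each row
print(m + row)                   # [[11 22]
                                 #  [13 24]]

col = np.array([[100], [200]])   # shape (2, 1) stretches across columns
print(m + col)                   # [[101 102]
                                 #  [203 204]]
```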
## 🚀 Next Steps
Once you complete the tensor module:
- **Move to Autograd**: `cd modules/autograd/`
- **Build automatic differentiation**: Enable gradient computation
- **Combine with tensors**: Make tensors differentiable
- **Prepare for neural networks**: Get ready for the MLP module
## 🔗 Why Tensors Matter
Tensors are the foundation of all ML systems:
- **Neural networks** store weights and activations as tensors
- **Training** computes gradients on tensors
- **Data processing** represents batches as tensors
- **GPU acceleration** operates on tensor primitives
Your tensor implementation will power everything else in TinyTorch!
## 🎉 Ready to Build?
The tensor module is where TinyTorch really begins. You're about to create the fundamental building block that will power neural networks, training loops, and production ML systems.
Take your time, test thoroughly, and enjoy building something that really works! 🔥