feat: implement selective exports for modules 09-11

- 09_spatial: Export Conv2d, MaxPool2d, AvgPool2d only
- 10_tokenization: Export Tokenizer, CharTokenizer, BPETokenizer only
- 11_embeddings: Export Embedding, PositionalEncoding only

Continues the selective export pattern: public APIs stay clean, while
development utilities remain in the development environment.
This commit is contained in:
Vijay Janapa Reddi
2025-09-30 09:56:50 -04:00
parent b678fe8f77
commit 956efe76a7
2 changed files with 5 additions and 0 deletions
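The `#| export` lines added below are nbdev-style directives: only notebook cells carrying the directive are collected into the package's public module, while unmarked cells stay in the development notebook. A minimal sketch of that selection logic (an illustration of the idea, not nbdev's actual implementation):

```python
def select_exported_cells(source: str) -> str:
    """Collect the bodies of '# %%' cells marked with the '#| export' directive."""
    exported = []
    for cell in source.split("# %%"):
        lines = cell.splitlines()
        marks = [i for i, l in enumerate(lines) if l.strip() == "#| export"]
        if marks:  # cell is public: keep everything after the directive
            exported.append("\n".join(lines[marks[0] + 1:]).strip())
    return "\n\n".join(exported)
```

Running the real `nbdev_export` over the modules below would, in the same spirit, emit only the marked classes into the installed package.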


@@ -195,6 +195,7 @@ This ensures consistent behavior across different tokenization strategies.
"""
# %% nbgrader={"grade": false, "grade_id": "base-tokenizer", "solution": true}
#| export
class Tokenizer:
"""
Base tokenizer class providing the interface for all tokenizers.
@@ -303,6 +304,7 @@ Result: "hello"
"""
# %% nbgrader={"grade": false, "grade_id": "char-tokenizer", "solution": true}
#| export
class CharTokenizer(Tokenizer):
"""
Character-level tokenizer that treats each character as a separate token.
@@ -513,6 +515,7 @@ BPE discovers natural word boundaries and common patterns automatically!
"""
# %% nbgrader={"grade": false, "grade_id": "bpe-tokenizer", "solution": true}
#| export
class BPETokenizer(Tokenizer):
"""
Byte Pair Encoding (BPE) tokenizer that learns subword units.

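The hunks above show only the exported class headers. As a hedged illustration of the interface the docstrings describe (the `encode`/`decode` method names and constructor signature are assumptions, not taken from the diff), a character-level tokenizer might look like:

```python
class CharTokenizer:
    """Character-level tokenizer: every character is its own token."""

    def __init__(self, text: str):
        # build the vocabulary from the characters seen in the corpus
        chars = sorted(set(text))
        self.stoi = {ch: i for i, ch in enumerate(chars)}
        self.itos = {i: ch for ch, i in self.stoi.items()}

    def encode(self, text: str) -> list[int]:
        return [self.stoi[ch] for ch in text]

    def decode(self, ids: list[int]) -> str:
        return "".join(self.itos[i] for i in ids)
```

This gives the round-trip behavior the docstring alludes to: decoding the encoded ids of `"hello"` returns `"hello"`.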

@@ -201,6 +201,7 @@ Now let's build the core embedding layer that performs efficient token-to-vector
"""
# %% nbgrader={"grade": false, "grade_id": "embedding-class", "solution": true}
#| export
class Embedding:
"""
Learnable embedding layer that maps token indices to dense vectors.
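The docstring describes a learnable lookup from token indices to dense vectors. A minimal NumPy sketch of that idea (the `forward` name, shapes, and initialization scale are assumptions for illustration):

```python
import numpy as np

class Embedding:
    """Learnable lookup table mapping token indices to dense vectors."""

    def __init__(self, vocab_size: int, embed_dim: int, seed: int = 0):
        rng = np.random.default_rng(seed)
        # one trainable row per vocabulary entry
        self.weight = rng.normal(0, 0.02, size=(vocab_size, embed_dim))

    def forward(self, token_ids):
        # fancy indexing performs the lookup: (batch, seq) -> (batch, seq, embed_dim)
        return self.weight[np.asarray(token_ids)]
```

The lookup is just array indexing, which is why embedding layers are efficient compared to a one-hot matrix multiply.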
@@ -402,6 +403,7 @@ Let's build trainable positional embeddings that can learn position-specific pat
"""
# %% nbgrader={"grade": false, "grade_id": "positional-encoding", "solution": true}
#| export
class PositionalEncoding:
"""
Learnable positional encoding layer.
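Following the hunk's description of trainable positional embeddings that learn position-specific patterns, here is a hedged sketch (the `forward` signature and broadcast-add design are assumptions, not the module's actual code):

```python
import numpy as np

class PositionalEncoding:
    """Learnable positional encoding: one trainable vector per position."""

    def __init__(self, max_len: int, embed_dim: int, seed: int = 0):
        rng = np.random.default_rng(seed)
        # one trainable vector per position up to max_len
        self.weight = rng.normal(0, 0.02, size=(max_len, embed_dim))

    def forward(self, x):
        # add the position vectors for the first seq_len positions (broadcast over batch)
        seq_len = x.shape[-2]
        return x + self.weight[:seq_len]
```

Unlike fixed sinusoidal encodings, these position vectors are parameters and can be updated by gradient descent along with the embedding table.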