[GH-ISSUE #1298] [TinyTorch] #5710

Closed
opened 2026-04-21 21:44:20 -05:00 by GiteaMirror · 2 comments

Originally created by @Cohegen on GitHub (Apr 9, 2026).
Original GitHub issue: https://github.com/harvard-edge/cs249r_book/issues/1298

Module

01 Tensor

Type of Improvement

Other

Description

I’ve been working with TinyTorch and had a few ideas that could make it feel more intuitive and closer to a PyTorch-like experience, especially when building ML models.

Proposed Solution

1. PyTorch-like API Aliases

To make the API more familiar and easier to use:

  • It might be helpful to add @property support for:

    • ndim
    • numel (to align with common tensor usage patterns)
  • Adding view() as an alias for reshape() would also improve consistency with PyTorch.

  • A simple contiguous() pass-through method could be useful too, mainly to avoid attribute errors when running training pipelines that expect it.
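The aliases above could be sketched roughly as follows, assuming a minimal NumPy-backed `Tensor` with a `.data` attribute (the actual TinyTorch class layout may differ):

```python
import numpy as np

class Tensor:
    """Minimal illustrative Tensor; not the real TinyTorch implementation."""

    def __init__(self, data):
        self.data = np.asarray(data, dtype=np.float32)

    @property
    def shape(self):
        return self.data.shape

    @property
    def ndim(self):
        # Number of dimensions, mirroring torch.Tensor.ndim.
        return self.data.ndim

    @property
    def numel(self):
        # Total element count. Note PyTorch exposes numel() as a method;
        # a property is shown here for brevity.
        return self.data.size

    def reshape(self, *shape):
        return Tensor(self.data.reshape(*shape))

    def view(self, *shape):
        # Alias for reshape(), so PyTorch-style call sites keep working.
        return self.reshape(*shape)

    def contiguous(self):
        # Pass-through no-op: NumPy handles memory layout internally,
        # so this only exists to satisfy pipelines that call it.
        return self
```

With this in place, `Tensor([[1, 2], [3, 4]]).ndim` returns `2` and `t.view(4)` behaves like `t.reshape(4)`.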

2. More Robust Tensor Initialization

The __init__ method could be made a bit more flexible when handling different types of inputs:

  • For example, if a list of Tensors is passed in, it could recursively extract their .data.

  • It would also be nice if requires_grad were preserved automatically when initializing from an existing Tensor.
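One way the two initialization behaviors could look, again assuming a hypothetical NumPy-backed `Tensor` with `.data` and `.requires_grad` attributes rather than the real TinyTorch internals:

```python
import numpy as np

def _unwrap(x):
    # Recursively replace any nested Tensors with their raw .data arrays.
    if isinstance(x, Tensor):
        return x.data
    if isinstance(x, (list, tuple)):
        return [_unwrap(e) for e in x]
    return x

class Tensor:
    """Minimal illustrative Tensor; not the real TinyTorch implementation."""

    def __init__(self, data, requires_grad=False):
        if isinstance(data, Tensor):
            # Preserve requires_grad when wrapping an existing Tensor.
            requires_grad = requires_grad or data.requires_grad
            data = data.data
        else:
            # Handle lists that may contain Tensors at any depth.
            data = _unwrap(data)
        self.data = np.asarray(data, dtype=np.float32)
        self.requires_grad = requires_grad
```

Under this sketch, `Tensor([Tensor([1, 2]), Tensor([3, 4])])` stacks into a 2x2 array, and `Tensor(t)` keeps `t.requires_grad`.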

3. Advanced Indexing Utility

  • It could be useful to add a masked_fill(mask, value) method, since this comes up often in transformer models (e.g., attention masking).
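A possible out-of-place `masked_fill`, mirroring the behavior of `torch.Tensor.masked_fill`, sketched on the same hypothetical NumPy-backed `Tensor`:

```python
import numpy as np

class Tensor:
    """Minimal illustrative Tensor; not the real TinyTorch implementation."""

    def __init__(self, data):
        self.data = np.asarray(data, dtype=np.float32)

    def masked_fill(self, mask, value):
        # Return a new Tensor with positions where mask is True set to
        # value; the original tensor is left unchanged (out-of-place).
        mask_arr = np.asarray(
            mask.data if isinstance(mask, Tensor) else mask, dtype=bool
        )
        out = self.data.copy()
        out[mask_arr] = value
        return Tensor(out)
```

A typical transformer use would be `scores.masked_fill(pad_mask, float("-inf"))` before a softmax, so padded positions receive zero attention weight.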
GiteaMirror added the labels type: improvement, area: tinytorch 2026-04-21 21:44:20 -05:00

@Shashank-Tripathi-07 commented on GitHub (Apr 14, 2026):

I agree with this. We could expand TinyTorch to be closer to PyTorch, and even add optional modules where learners explore more advanced topics that take them further. I think you should mention this to the professor and ask him about it.


@profvjreddi commented on GitHub (Apr 17, 2026):

Thanks guys, appreciate all the help!

Reference: github-starred/cs249r_book#5710