[PR #1769] Add JAX #1535

opened 2025-11-06 13:18:12 -06:00 by GiteaMirror · 0 comments

📋 Pull Request Information

Original PR: https://github.com/vinta/awesome-python/pull/1769
Author: @n2cholas
Created: 5/29/2021
Status: 🔄 Open

Base: master ← Head: patch-1


📝 Commits (1)

  • 218c133 Add JAX
📊 Changes

1 file changed (+1 additions, -0 deletions)


📝 README.md (+1 -0)

📄 Description

What is this Python project?

JAX is a machine learning library with a NumPy-like interface designed to run on accelerators such as GPUs and TPUs. Its distinguishing feature is arbitrarily composable function transformations, which enable JIT compilation, higher-order gradients, automatic batching, simple multi-device parallelism, and more.
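
A minimal sketch of how those transformations compose (assuming JAX is installed; the `loss` function and arrays below are illustrative, not part of the PR):

```python
import jax
import jax.numpy as jnp

# An ordinary NumPy-style function...
def loss(x):
    return jnp.sum(x ** 2)

# ...can be transformed: differentiated, JIT-compiled, and batched.
grad_loss = jax.grad(loss)       # gradient function: returns 2 * x
fast_grad = jax.jit(grad_loss)   # same gradient, compiled via XLA
# vmap auto-batches a scalar gradient over an array of inputs
per_elem_grad = jax.vmap(jax.grad(lambda x: x ** 3))

print(grad_loss(jnp.array([1.0, 2.0, 3.0])))   # [2. 4. 6.]
print(per_elem_grad(jnp.array([1.0, 2.0])))    # [ 3. 12.]
```

Each transformation takes a plain function and returns a plain function, which is why they can be stacked freely.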

What's the difference between this Python project and similar ones?

Enumerate comparisons.

  • PyTorch: JAX provides higher-order gradients, automatic batching, and JIT compilation at higher performance than PyTorch.
  • TensorFlow: JAX is much simpler and narrower in scope, which means all its components are better integrated, easier to use, and have fewer bugs. It also exclusively leverages the XLA compiler instead of pre-compiled kernels.
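
The higher-order gradients point, for instance, follows from `jax.grad` returning an ordinary function that can itself be transformed again (a sketch, assuming JAX is installed; `f` is an illustrative function):

```python
import jax

f = lambda x: x ** 3          # f(x)   = x^3
df = jax.grad(f)              # f'(x)  = 3x^2
d2f = jax.grad(df)            # f''(x) = 6x
d3f = jax.jit(jax.grad(d2f))  # f'''(x) = 6, and JIT-compiled too

print(d3f(2.0))  # 6.0
```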

Anyone who agrees with this pull request can submit an Approve review to it.


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2025-11-06 13:18:12 -06:00

Reference: github-starred/awesome-python#1535