---
title: "Batch Size Balancer"
subtitle: "Push throughput to the edge of OOM."
description: "A browser mini-game from MLSysBook Playground. Dynamically adjust batch size to process images quickly, without exhausting your GPU memory."
page-layout: article
format:
  html:
    include-in-header:
      - text: |
---
```{=html}
batch 1 · score 0 · ↑/↓ adjust batch size · keep memory under 100% · R retry
```
## How to play

Increase the batch size to raise throughput, but watch the memory meter. Larger batches drain the queue faster, but if queue pressure plus batch memory exceeds 100%, the run OOMs immediately and your score stops there. Find the largest safe batch size as demand changes.
## The Systems Concept

This is the classic throughput-versus-memory tradeoff. Larger batches improve hardware utilization, but each step also reserves more activation memory. The optimal batch size is therefore not simply "as large as possible": it is the largest batch that keeps throughput high without crossing the memory limit.
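The game's rule can be sketched as a tiny capacity calculation. This is a minimal model, assuming (as the instructions state) that a run OOMs when queue pressure plus per-batch memory exceeds 100%; the function name, the linear per-sample memory model, and the specific percentages are illustrative, not taken from the game's actual code.

```python
def largest_safe_batch(per_sample_mem: float, queue_pressure: float,
                       limit: float = 100.0, max_batch: int = 64) -> int:
    """Largest batch size whose memory footprint stays within the limit.

    Hypothetical model of the game's OOM rule: a run fails when
    queue_pressure + batch * per_sample_mem exceeds `limit` (in percent).
    Assumes activation memory grows linearly with batch size.
    """
    best = 0
    for batch in range(1, max_batch + 1):
        if queue_pressure + batch * per_sample_mem <= limit:
            best = batch  # still safe at this size
    return best

# With 8% of memory per sample and 20% queue pressure, the largest
# safe batch is floor((100 - 20) / 8) = 10.
print(largest_safe_batch(8.0, 20.0))
```

Note how the answer shifts with demand: at 60% queue pressure the same per-sample cost only leaves room for a batch of 5, which is exactly the adjustment the game asks you to make in real time.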
---
*Part of [MLSysBook Playground](/games/). Found a bug? [Report an issue](https://github.com/harvard-edge/cs249r_book/issues/new?labels=bug&title=Bug+in+Game).*