diff --git a/book/quarto/contents/vol1/frontmatter/notation.qmd b/book/quarto/contents/vol1/frontmatter/notation.qmd
index 7fc078cdd..261347512 100644
--- a/book/quarto/contents/vol1/frontmatter/notation.qmd
+++ b/book/quarto/contents/vol1/frontmatter/notation.qmd
@@ -86,9 +86,9 @@ We follow standard deep learning conventions (@goodfellow2016deep) with explicit
 
 ## Units and Precision {#sec-notation-conventions-units-precision-fdaf}
 
-* **Physical Units**: This book uses SI (metric) units throughout---meters, kilograms, seconds, watts, °C---consistent with standard engineering and scientific practice. Where source data was originally reported in imperial units, we convert to SI and note the original values parenthetically.
-* **Storage**: We use decimal prefixes (1 KB = 1000 bytes, 1 GB = $10^9$ bytes) throughout for consistency with ML literature. In memory contexts, industry convention uses these same symbols for binary values ($2^{10}$, $2^{30}$); the difference is negligible for our purposes.
-* **Compute**: We use decimal prefixes for operations.
+* **Physical Units**: This book uses SI (metric) units throughout---meters, kilograms, seconds, watts, °C---consistent with standard engineering and scientific practice. Where source data was originally reported in imperial units, we convert to SI and note the original values parenthetically. A space always separates the number from the unit (e.g., 100 ms, 2 TB/s).
+* **Data and memory**: We use **decimal SI prefixes only**: KB = $10^3$ bytes, MB = $10^6$, GB = $10^9$, TB = $10^{12}$. We do not use binary units (KiB, MiB, GiB) in prose; all capacities, throughputs, and model sizes are reported in decimal units (e.g., 80 GB, 2 TB/s, 102 MB).
+* **Compute**: We use decimal prefixes for operations (e.g., GFLOPs, TFLOPs).
   * 1 TFLOP = $10^{12}$ FLOPs
 * **Precision**:
   * **FP32**: Single precision (4 bytes)