mirror of
https://github.com/harvard-edge/cs249r_book.git
synced 2026-03-11 17:49:25 -05:00
style: Vol2 register pass follow-up #2 — fix two more violations in sustainable_ai
Flagged by the sustainable_ai editor agent as newly discovered during fixing:

- line 635: "If your cluster consumes...how much...actually went...how much was wasted?" → impersonal declarative; removes "your", two embedded rhetorical questions, two "actually"
- line 2261: "You want to fine-tune a small language model" in .callout-notebook → "Consider fine-tuning a small language model" (impersonal)
@@ -632,7 +632,7 @@ The convergence of exponential computational demands with hard physical efficien
## Energy Measurement and Modeling {#sec-sustainable-ai-part-ii-measurement-assessment-fb0b}
-Engineers cannot optimize what they cannot measure. If your cluster consumes five megawatts of power during a large language model training run, how much of that power actually went into matrix multiplications, and how much was wasted spinning cooling fans to remove the resulting heat? Effective energy modeling requires decomposing the monolithic datacenter power bill into granular, component-level metrics that engineers can actually target for optimization.
+Engineers cannot optimize what they cannot measure. A cluster consuming five megawatts during a large language model training run directs only a fraction of that power into matrix multiplications; the remainder is consumed by cooling fans removing the resulting heat. Effective energy modeling requires decomposing the monolithic datacenter power bill into granular, component-level metrics that engineers can target for optimization.
The datacenter infrastructure foundations from @sec-compute-infrastructure established power and cooling as dominant engineering constraints. Systematic measurement now transforms these constraints into sustainability metrics. This part develops quantitative frameworks for three critical areas: energy consumption tracking during training and inference, carbon footprint analysis across system lifecycles, and resource usage assessment for hardware and infrastructure. These measurement tools enable engineers to identify optimization opportunities, compare alternative designs, and validate that sustainability improvements achieve their intended effects. Just as performance engineering requires profiling before optimization, sustainable AI engineering requires measurement before mitigation.
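The component-level decomposition this paragraph calls for is commonly anchored by Power Usage Effectiveness (PUE), the ratio of total facility power to power delivered to IT equipment. A minimal sketch of that metric for the five-megawatt example; the 3.8 MW IT share is an illustrative assumption, not a figure from the text:

```python
def pue(total_facility_w: float, it_equipment_w: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power.

    A PUE of 1.0 would mean every watt reaches IT equipment; anything
    above that is cooling, power conversion, and other facility overhead.
    """
    return total_facility_w / it_equipment_w

total_w = 5_000_000  # the 5 MW cluster from the example above
it_w = 3_800_000     # assumed IT share (illustrative)

print(round(pue(total_w, it_w), 2))  # → 1.32
```

Under these assumed numbers, roughly 0.32 W of overhead is spent per watt of useful IT power, which is exactly the kind of component-level figure the paragraph says engineers need in order to target optimizations.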
@@ -2258,7 +2258,7 @@ While inference on TinyML devices is highly efficient, **on-device learning** in
The thermal design power (TDP) of mobile processors creates hard constraints that shape every aspect of on-device learning strategies. Modern smartphones typically maintain sustained processing at 2--3 W for ML workloads to prevent thermal discomfort, but can burst to 5--10 W for brief periods before thermal throttling occurs. This thermal design power determines the entire feasible space of adaptive algorithms.
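The 2--3 W sustained / 5--10 W burst envelope quoted above implies that burst-heavy training must be duty-cycled so its time-averaged power stays under the sustained limit. A hedged sketch of that check; all specific values are illustrative assumptions within the text's ranges:

```python
SUSTAINED_W = 3.0  # upper end of the 2-3 W sustained range in the text
BURST_W = 8.0      # within the 5-10 W burst range in the text
IDLE_W = 0.5       # assumed near-idle draw during cooldown (illustrative)

def avg_power(burst_s: float, idle_s: float) -> float:
    """Time-averaged power of a repeating burst/cooldown duty cycle."""
    return (BURST_W * burst_s + IDLE_W * idle_s) / (burst_s + idle_s)

# 10 s training bursts followed by 20 s of cooldown average out to
# exactly the sustained limit:
print(avg_power(10, 20))  # → 3.0
```

Lengthening the burst without lengthening the cooldown pushes the average above `SUSTAINED_W`, triggering the thermal throttling the paragraph describes.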
::: {.callout-notebook title="The Energy of Learning"}
-**Problem**: You want to fine-tune a small language model (1B parameters) on a user's smartphone overnight. Is this feasible within a **5% battery budget**?
+**Problem**: Consider fine-tuning a small language model (1B parameters) on a user's smartphone overnight. Is this feasible within a **5% battery budget**?
**The Math**:
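The notebook's math is truncated in this excerpt, so the following is not the notebook's actual derivation; it is a hedged back-of-envelope sketch under assumed values: a ~15 Wh smartphone battery, the 3 W sustained draw from the surrounding text, and an 8-hour overnight window.

```python
BATTERY_WH = 15.0              # assumed smartphone battery capacity
BUDGET_WH = 0.05 * BATTERY_WH  # 5% battery budget → 0.75 Wh

draw_w = 3.0                   # sustained ML draw from the text
hours = 8.0                    # assumed overnight training window
energy_wh = draw_w * hours     # 24 Wh consumed overnight

print(energy_wh / BUDGET_WH)   # → 32.0
```

Under these assumptions a naive overnight run overshoots the 5% budget by more than an order of magnitude, which is presumably why the notebook poses the feasibility question at all: making the answer "yes" requires drastically shorter training, lower power, or charging while training.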