refactor: improve Kits content organization per MIT Press standards

Getting Started:
- Restructured as step-by-step guide (Select → Setup → Choose Lab → Start)
- Added textbook connection section for academic context
- Clearer prerequisite expectations

Platforms:
- Renamed title from "Hardware Kits" to "Hardware Platforms" (avoid site title collision)
- Removed system requirements (moved to IDE Setup where they belong)
- Focused purely on hardware specifications and comparisons

IDE Setup:
- Added System Requirements section (moved from Platforms)
- Streamlined introduction
- Now contains all setup-related prerequisites in one place

Style:
- Improved carousel caption bar legibility (bolder text, larger font)
Author: Vijay Janapa Reddi
Date: 2025-12-27 18:07:54 -05:00
parent 32e96e88a0
commit baf11336f8
4 changed files with 102 additions and 80 deletions


@@ -1179,7 +1179,7 @@ figure figcaption {
   background: $kits-accent;
   color: white;
   text-align: center;
-  padding: 16px 24px;
+  padding: 20px 30px;
   border-radius: 8px;
   margin-top: 15px;
   margin-bottom: 20px;
@@ -1189,20 +1189,22 @@ figure figcaption {
   justify-content: center;
   h5 {
-    font-weight: 700;
-    margin: 0 0 6px 0;
-    font-size: 1.2rem;
+    font-weight: 800;
+    margin: 0 0 8px 0;
+    font-size: 1.4rem;
     border: none !important;
     padding: 0 !important;
-    color: white;
-    letter-spacing: 0.02em;
+    color: white !important;
+    letter-spacing: 0.03em;
+    text-shadow: 0 1px 2px rgba(0, 0, 0, 0.2);
   }
   p {
     margin: 0;
-    font-size: 0.95rem;
-    opacity: 0.9;
-    color: white;
+    font-size: 1.05rem;
+    font-weight: 500;
+    color: white !important;
+    opacity: 1;
   }
 }


@@ -1,61 +1,83 @@
 # Getting Started {.unnumbered}
-Ready to deploy machine learning on embedded hardware? This guide helps you choose the right platform and get your development environment running.
+This guide walks you through selecting hardware, configuring your development environment, and running your first embedded ML application. Most students complete setup in under an hour.
-## Choose Your Hardware
+## Step 1: Select Your Platform
-Select a platform based on your budget and learning goals:
+Your choice depends on budget, learning objectives, and the types of applications you want to build.
-| Platform | Price | Best For |
-|----------|-------|----------|
-| [Grove Vision AI V2](seeed/grove_vision_ai_v2/grove_vision_ai_v2.qmd) | ~$20 | Beginners, plug-and-play vision AI |
-| [XIAO ESP32S3](seeed/xiao_esp32s3/xiao_esp32s3.qmd) | ~$25 | Best value, multi-modal sensing |
-| [Raspberry Pi](raspi/raspi.qmd) | ~$80 | Advanced users, LLMs and VLMs |
-| [Nicla Vision](arduino/nicla_vision/nicla_vision.qmd) | ~$100 | Professional, battery-powered |
+**For beginners or budget-conscious learners:**
+| Platform | Cost | Why Choose It |
+|----------|------|---------------|
+| [Grove Vision AI V2](seeed/grove_vision_ai_v2/grove_vision_ai_v2.qmd) | ~$20 | No-code interface, fastest path to running models |
+| [XIAO ESP32S3](seeed/xiao_esp32s3/xiao_esp32s3.qmd) | ~$25 | Best value, supports vision, audio, and motion |
+: Quick platform selection guide {.striped .hover}
+**For advanced applications:**
+| Platform | Cost | Why Choose It |
+|----------|------|---------------|
+| [Raspberry Pi](raspi/raspi.qmd) | ~$80 | Full Linux environment, LLMs and VLMs |
+| [Nicla Vision](arduino/nicla_vision/nicla_vision.qmd) | ~$100 | Professional-grade, ultra-low power design |
-For detailed specifications and comparisons, see [Platforms](platforms.qmd).
+For detailed specifications and technical comparisons, see [Platforms](platforms.qmd).
-## Set Up Your Environment
+## Step 2: Set Up Your Environment
-Development environment setup typically takes 30-60 minutes. Follow the [IDE Setup Guide](ide-setup.qmd) for step-by-step instructions covering:
+Development environment configuration is platform-dependent but follows a common pattern: install software tools, configure communication with hardware, and verify the setup works.
+**Time estimate:** 30-60 minutes depending on platform and internet speed.
+Follow the [IDE Setup Guide](ide-setup.qmd) for complete procedures covering:
 - System requirements for your development computer
 - Arduino IDE installation for microcontroller platforms
-- Python environment setup for Raspberry Pi
+- Python environment configuration for Raspberry Pi
 - SenseCraft AI web interface for Grove Vision AI V2
-- Serial communication and debugging tools
+- Serial communication and hardware verification
-## Available Labs
+## Step 3: Choose Your First Lab
-Each platform supports different exercise types. Choose exercises that match your hardware:
+Each platform supports different exercise categories. Select labs that match both your hardware and learning goals.
-| Exercise | Grove Vision | XIAO ESP32S3 | Nicla Vision | Raspberry Pi |
-|----------|:------------:|:------------:|:------------:|:------------:|
-| Image Classification | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: |
-| Object Detection | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: |
-| Keyword Spotting | | :white_check_mark: | :white_check_mark: | |
-| Motion Classification | | :white_check_mark: | :white_check_mark: | |
-| Large Language Models | | | | :white_check_mark: |
-| Vision Language Models | | | | :white_check_mark: |
-: Lab compatibility by platform {.striped .hover}
+| Lab Category | Grove Vision | XIAO | Nicla | Raspberry Pi |
+|--------------|:------------:|:----:|:-----:|:------------:|
+| Image Classification | ✓ | ✓ | ✓ | ✓ |
+| Object Detection | ✓ | ✓ | ✓ | ✓ |
+| Keyword Spotting | | ✓ | ✓ | |
+| Motion Classification | | ✓ | ✓ | |
+| Large Language Models | | | | ✓ |
+| Vision Language Models | | | | ✓ |
+: Exercise availability by platform {.striped .hover}
+## Step 4: Start Your First Lab
+**Grove Vision AI V2:** Begin with [Setup and No-Code Apps](seeed/grove_vision_ai_v2/setup_and_no_code_apps/setup_and_no_code_apps.qmd). You'll deploy a pre-trained model in minutes using the visual interface.
+**XIAO ESP32S3:** Start with [Setup](seeed/xiao_esp32s3/setup/setup.qmd), then proceed to [Image Classification](seeed/xiao_esp32s3/image_classification/image_classification.qmd) to train and deploy your first custom model.
+**Nicla Vision:** Complete [Setup](arduino/nicla_vision/setup/setup.qmd) to configure your board, then try [Image Classification](arduino/nicla_vision/image_classification/image_classification.qmd).
+**Raspberry Pi:** Follow [Setup](raspi/setup/setup.qmd), then choose your path:
+- [Image Classification](raspi/image_classification/image_classification.qmd) for computer vision fundamentals
+- [LLM Deployment](raspi/llm/llm.qmd) to run language models on edge hardware
 ## Prerequisites
-**Programming:** Python proficiency required. C/C++ familiarity helpful but not required.
-**Math:** Basic linear algebra and probability. No advanced math needed.
-**Hardware:** No prior embedded experience assumed. Labs include complete setup procedures.
+These labs assume:
+- **Programming:** Proficiency in Python. Familiarity with C/C++ is helpful for microcontroller platforms but not required.
+- **Mathematics:** Working knowledge of linear algebra and basic probability at the undergraduate level.
+- **Hardware:** No prior embedded systems experience. Each lab includes complete setup and troubleshooting procedures.
-## Start Building
+## Connection to ML Systems Textbook
-Once your environment is configured, jump into the labs for your platform:
+These laboratories complement specific chapters in the ML Systems textbook:
-- **Grove Vision AI V2:** Start with [Setup and No-Code Apps](seeed/grove_vision_ai_v2/setup_and_no_code_apps/setup_and_no_code_apps.qmd) for the fastest path to running AI models
-- **XIAO ESP32S3:** Begin with [Setup](seeed/xiao_esp32s3/setup/setup.qmd), then try [Image Classification](seeed/xiao_esp32s3/image_classification/image_classification.qmd)
-- **Nicla Vision:** Follow [Setup](arduino/nicla_vision/setup/setup.qmd) to configure your board
-- **Raspberry Pi:** Complete [Setup](raspi/setup/setup.qmd), then explore [Image Classification](raspi/image_classification/image_classification.qmd) or jump to [LLMs](raspi/llm/llm.qmd)
+- **Image Classification labs** reinforce concepts from the Computer Vision and Model Optimization chapters
+- **Keyword Spotting labs** connect to Audio Processing and Real-time Inference
+- **Motion Classification labs** demonstrate Sensor Fusion and Time-series Analysis
+- **LLM/VLM labs** extend Large Model Deployment to resource-constrained environments
-Each lab includes implementation procedures, expected results, and troubleshooting guidance.
+Each lab identifies relevant textbook sections for deeper theoretical understanding.
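For site tooling or quick scripting, the lab-compatibility table in the revised Getting Started page can be expressed as a lookup structure. A minimal sketch; the dictionary and function names are illustrative and not part of the labs themselves:

```python
# Illustrative sketch: the lab-compatibility table as a Python lookup.
# Platform and lab names mirror the Getting Started table; the structure
# itself is hypothetical, not part of the labs.
SUPPORTED_LABS = {
    "Grove Vision AI V2": {"Image Classification", "Object Detection"},
    "XIAO ESP32S3": {"Image Classification", "Object Detection",
                     "Keyword Spotting", "Motion Classification"},
    "Nicla Vision": {"Image Classification", "Object Detection",
                     "Keyword Spotting", "Motion Classification"},
    "Raspberry Pi": {"Image Classification", "Object Detection",
                     "Large Language Models", "Vision Language Models"},
}


def platforms_for(lab: str) -> list:
    """Return the platforms that support a given lab category, sorted by name."""
    return sorted(p for p, labs in SUPPORTED_LABS.items() if lab in labs)
```

For example, `platforms_for("Keyword Spotting")` returns the two microcontroller boards with microphones, matching the table above.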


@@ -1,10 +1,34 @@
 # IDE Setup {.unnumbered}
-Setting up your interactive development environment (IDE) is a critical first step that determines your success throughout the laboratory sequence. Unlike cloud-based ML development, where infrastructure is abstracted away, embedded systems require you to understand the complete toolchain from code compilation to hardware deployment. This hands-on setup process introduces fundamental concepts about embedded development workflows while preparing your workstation for laboratory exercises.
+Setting up your development environment is a critical first step that determines your success throughout the laboratory sequence. Unlike cloud-based ML development where infrastructure is abstracted away, embedded systems require understanding the complete toolchain from code compilation to hardware deployment.
-Environment setup typically takes 30-60 minutes, depending on the platform choice and internet connection speed. The procedures below are designed to be completed by students with no prior embedded systems experience, with each step building the skills needed for subsequent laboratory work.
+Environment setup typically takes 30-60 minutes, depending on platform choice and internet speed. These procedures are designed for students with no prior embedded systems experience.
 After completing hardware selection as outlined in the [Platforms](platforms.qmd) chapter, these procedures will establish the development tools, libraries, and verification methods needed for embedded ML programming.
+## System Requirements {#sec-ide-setup-system-requirements}
+Before beginning installation, verify your development computer meets these requirements:
+**Development Computer:**
+- **Operating System:** Windows 10/11, macOS 10.15+, or Linux (Ubuntu 18.04+)
+- **Memory:** 8GB RAM minimum (16GB recommended for Raspberry Pi development)
+- **Storage:** 10GB free space for development tools and libraries
+- **USB Ports:** At least one USB 2.0/3.0 port for device connection
+- **Internet Connection:** Required for software installation and library downloads
+**Software Prerequisites:**
+- **Arduino IDE 2.0+** for Arduino-based platforms (XIAO, Nicla Vision)
+- **Python 3.8+** for Raspberry Pi development
+- **Git** for version control and example code access
+- **Text Editor/IDE** such as VS Code or PyCharm
+**Hardware Accessories:**
+- **USB cables:** USB-C or Micro-USB (must support data transfer, not power-only)
+- **SD Card:** 32GB+ Class 10 for Raspberry Pi
+- **Power adapters:** Appropriate for each platform
+- **Camera modules:** Included with most kits or available separately
 ## Platform-Specific Software Installation {#sec-ide-setup-platformspecific-software-installation-f432}
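The System Requirements list added to IDE Setup lends itself to a quick automated pre-flight check. A minimal stdlib sketch; the thresholds mirror the listed minimums (Python 3.8+, 10GB free disk, Git installed), but the script itself is illustrative and not part of the setup guide:

```python
import shutil
import sys

# Illustrative pre-flight check mirroring the System Requirements section.
# Thresholds come from the requirements list; the script is hypothetical.
MIN_PYTHON = (3, 8)
MIN_FREE_GB = 10


def check_requirements(path: str = ".") -> list:
    """Return human-readable problems; an empty list means all checks passed."""
    problems = []
    if sys.version_info < MIN_PYTHON:
        problems.append(
            f"Python {MIN_PYTHON[0]}.{MIN_PYTHON[1]}+ required, found "
            f"{sys.version_info.major}.{sys.version_info.minor}")
    free_gb = shutil.disk_usage(path).free / 1e9  # bytes -> GB
    if free_gb < MIN_FREE_GB:
        problems.append(f"Need {MIN_FREE_GB}GB free disk space, found {free_gb:.1f}GB")
    if shutil.which("git") is None:
        problems.append("git not found on PATH")
    return problems


if __name__ == "__main__":
    issues = check_requirements()
    print("All checks passed" if not issues else "\n".join(issues))
```

RAM is omitted because the standard library has no portable way to query it; a platform-specific check (e.g. `os.sysconf` on Linux) could be added.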


@@ -1,42 +1,16 @@
-# Hardware Kits {.unnumbered}
+# Hardware Platforms {.unnumbered}
-This section introduces the four hardware platforms selected for the TinyML curriculum. Each platform represents a different point along the spectrum of embedded computing capabilities, from ultra-low-power microcontrollers to full-featured edge computers. These platforms illustrate distinct engineering trade-offs in power consumption, computational capability, and development complexity.
+This chapter provides detailed technical specifications for the four hardware platforms used in these laboratories. Each platform represents a different point along the spectrum of embedded computing capabilities, from ultra-low-power microcontrollers to full-featured edge computers.
-The selected platforms are widely used in commercial applications, thereby ensuring that the skills developed through these exercises translate directly to embedded systems development.
+These platforms were selected because they illustrate distinct engineering trade-offs in power consumption, computational capability, and development complexity. All are widely used in commercial applications, ensuring that skills developed here transfer directly to professional embedded systems work.
-## Our Featured Platform {#sec-hardware-kits-featured-platform-73d8}
+## Featured Platform {#sec-hardware-kits-featured-platform-73d8}
 ![Complete XIAOML Kit with all components](seeed/xiao_esp32s3/images/png/xiaoml_kit_complete.png){fig-align="center"}
 The [XIAOML Kit](https://www.seeedstudio.com/blog/2025/08/05/introducing-the-xiaoml-kit-your-tinyml-journey-starts-here/) is the most recent addition to our educational hardware platforms (released on July 31st, 2025). It offers a comprehensive TinyML development environment for learning about ML systems, featuring integrated wireless connectivity, a camera, multiple sensors, and extensive documentation. This compact board exemplifies how contemporary embedded systems can efficiently provide advanced machine learning capabilities within a cost-effective framework.
-## System Requirements and Prerequisites {#sec-hardware-kits-system-requirements-prerequisites-9767}
-Before selecting a hardware platform, ensure your development environment meets the following requirements:
-**Development Computer Requirements:**
-- **Operating System:** Windows 10/11, macOS 10.15+, or Linux (Ubuntu 18.04+)
-- **Memory:** 8GB RAM minimum (16GB recommended for Raspberry Pi development)
-- **Storage:** 10GB free space for development tools and libraries
-- **USB Ports:** At least one USB 2.0/3.0 port for device connection
-- **Internet Connection:** Required for software installation and library downloads
-**Software Prerequisites:**
-- **Arduino IDE 2.0+** for Arduino-based platforms
-- **Python 3.8+** for Raspberry Pi development
-- **Git** for version control and example code access
-- **Text Editor/IDE** (VS Code, PyCharm, or similar)
-**Hardware Accessories:**
-- **USB-C or Micro-USB cables** (data transfer capable, not power-only)
-- **SD Card** (32GB+ Class 10) for Raspberry Pi
-- **Power adapters** appropriate for each platform
-- **Camera modules** (included with most kits or available separately)
-## Hardware Platform Overview {#sec-hardware-kits-hardware-platform-overview-9f77}
+## Platform Overview {#sec-hardware-kits-hardware-platform-overview-9f77}
 Our curriculum features four carefully selected platforms that span the full spectrum of embedded computing capabilities. Each platform shown in @tbl-platform-selection has been chosen to illustrate specific engineering trade-offs and learning objectives.