Mirror of https://github.com/harvard-edge/cs249r_book.git, synced 2026-05-05 17:18:48 -05:00.
Merge branch 'main' into exercises-2-and-4
@@ -7,19 +7,19 @@
   ],
   "contributors": [
     {
-      "login": "mpstewart1",
-      "name": "Matthew Stewart",
-      "avatar_url": "https://avatars.githubusercontent.com/mpstewart1",
-      "profile": "https://github.com/mpstewart1",
+      "login": "ShvetankPrakash",
+      "name": "Shvetank Prakash",
+      "avatar_url": "https://avatars.githubusercontent.com/ShvetankPrakash",
+      "profile": "https://github.com/ShvetankPrakash",
       "contributions": [
         "doc"
       ]
     },
     {
-      "login": "uchendui",
-      "name": "Ikechukwu Uchendu",
-      "avatar_url": "https://avatars.githubusercontent.com/uchendui",
-      "profile": "https://github.com/uchendui",
+      "login": "mpstewart1",
+      "name": "Matthew Stewart",
+      "avatar_url": "https://avatars.githubusercontent.com/mpstewart1",
+      "profile": "https://github.com/mpstewart1",
       "contributions": [
         "doc"
       ]
@@ -34,19 +34,19 @@
       ]
     },
     {
-      "login": "jveejay",
-      "name": "Vijay Janapa Reddi",
-      "avatar_url": "https://avatars.githubusercontent.com/jveejay",
-      "profile": "https://github.com/jveejay",
+      "login": "uchendui",
+      "name": "Ikechukwu Uchendu",
+      "avatar_url": "https://avatars.githubusercontent.com/uchendui",
+      "profile": "https://github.com/uchendui",
       "contributions": [
         "doc"
       ]
     },
     {
-      "login": "ShvetankPrakash",
-      "name": "Shvetank Prakash",
-      "avatar_url": "https://avatars.githubusercontent.com/ShvetankPrakash",
-      "profile": "https://github.com/ShvetankPrakash",
+      "login": "profvjreddi",
+      "name": "Vijay Janapa Reddi",
+      "avatar_url": "https://avatars.githubusercontent.com/profvjreddi",
+      "profile": "https://github.com/profvjreddi",
       "contributions": [
         "doc"
       ]
@@ -6,7 +6,7 @@ import requests

 CONTRIBUTORS_FILE = '.all-contributorsrc'

-EXCLUDED_USERS = {'web-flow', 'github-actions[bot]', 'mrdragonbear'}
+EXCLUDED_USERS = {'web-flow', 'github-actions[bot]', 'mrdragonbear', 'jveejay'}

 OWNER = "harvard-edge"
 REPO = "cs249r_book"
@@ -68,7 +68,7 @@ def main(_):
     existing_contributor_logins.append(existing_contributor['login'])
   existing_contributor_logins_set = set(existing_contributor_logins)
   print('Existing contributors: ', existing_contributor_logins_set)
-
+  existing_contributor_logins_set -= EXCLUDED_USERS
   # All contributors in the file should be in the API
   assert existing_contributor_logins_set.issubset(
       users_from_api), 'All contributors in the .all-contributorsrc file should be pulled using the API'
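The hunk above adds `jveejay` to the exclusion set before the consistency assert runs. The core check reduces to plain set arithmetic, which can be sketched as a pure function (`check_contributors` is a hypothetical helper name; the real script pulls logins from `.all-contributorsrc` and the GitHub commits API):

```python
# Sketch of the consistency check the script performs: every login listed in
# the contributors file, minus the excluded bot/maintainer accounts, must
# also appear in the logins reported by the API.
EXCLUDED_USERS = {'web-flow', 'github-actions[bot]', 'mrdragonbear', 'jveejay'}

def check_contributors(file_logins, api_logins, excluded=EXCLUDED_USERS):
    """Return the file logins the API did not report, ignoring excluded users."""
    return (set(file_logins) - excluded) - set(api_logins)

missing = check_contributors(
    ['mpstewart1', 'jveejay', 'web-flow'],  # logins listed in the file
    ['mpstewart1', 'uchendui'])             # logins returned by the API
assert not missing, f'Logins missing from the API: {missing}'
```

Here `jveejay` and `web-flow` are dropped by the exclusion set, so the only remaining file login (`mpstewart1`) is covered by the API and the check passes.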
@@ -86,11 +86,11 @@ quarto render
 <table>
   <tbody>
     <tr>
-      <td align="center" valign="top" width="14.28%"><a href="https://github.com/mpstewart1"><img src="https://avatars.githubusercontent.com/mpstewart1?s=100" width="100px;" alt="Matthew Stewart"/><br /><sub><b>Matthew Stewart</b></sub></a><br /><a href="https://github.com/harvard-edge/cs249r_book/commits?author=mpstewart1" title="Documentation">📖</a></td>
-      <td align="center" valign="top" width="14.28%"><a href="https://github.com/uchendui"><img src="https://avatars.githubusercontent.com/uchendui?s=100" width="100px;" alt="Ikechukwu Uchendu"/><br /><sub><b>Ikechukwu Uchendu</b></sub></a><br /><a href="https://github.com/harvard-edge/cs249r_book/commits?author=uchendui" title="Documentation">📖</a></td>
-      <td align="center" valign="top" width="14.28%"><a href="https://github.com/jessicaquaye"><img src="https://avatars.githubusercontent.com/jessicaquaye?s=100" width="100px;" alt="Jessica Quaye"/><br /><sub><b>Jessica Quaye</b></sub></a><br /><a href="https://github.com/harvard-edge/cs249r_book/commits?author=jessicaquaye" title="Documentation">📖</a></td>
-      <td align="center" valign="top" width="14.28%"><a href="https://github.com/jveejay"><img src="https://avatars.githubusercontent.com/jveejay?s=100" width="100px;" alt="Vijay Janapa Reddi"/><br /><sub><b>Vijay Janapa Reddi</b></sub></a><br /><a href="https://github.com/harvard-edge/cs249r_book/commits?author=jveejay" title="Documentation">📖</a></td>
+      <td align="center" valign="top" width="14.28%"><a href="https://github.com/ShvetankPrakash"><img src="https://avatars.githubusercontent.com/ShvetankPrakash?s=100" width="100px;" alt="Shvetank Prakash"/><br /><sub><b>Shvetank Prakash</b></sub></a><br /><a href="https://github.com/harvard-edge/cs249r_book/commits?author=ShvetankPrakash" title="Documentation">📖</a></td>
+      <td align="center" valign="top" width="14.28%"><a href="https://github.com/mpstewart1"><img src="https://avatars.githubusercontent.com/mpstewart1?s=100" width="100px;" alt="Matthew Stewart"/><br /><sub><b>Matthew Stewart</b></sub></a><br /><a href="https://github.com/harvard-edge/cs249r_book/commits?author=mpstewart1" title="Documentation">📖</a></td>
+      <td align="center" valign="top" width="14.28%"><a href="https://github.com/jessicaquaye"><img src="https://avatars.githubusercontent.com/jessicaquaye?s=100" width="100px;" alt="Jessica Quaye"/><br /><sub><b>Jessica Quaye</b></sub></a><br /><a href="https://github.com/harvard-edge/cs249r_book/commits?author=jessicaquaye" title="Documentation">📖</a></td>
+      <td align="center" valign="top" width="14.28%"><a href="https://github.com/uchendui"><img src="https://avatars.githubusercontent.com/uchendui?s=100" width="100px;" alt="Ikechukwu Uchendu"/><br /><sub><b>Ikechukwu Uchendu</b></sub></a><br /><a href="https://github.com/harvard-edge/cs249r_book/commits?author=uchendui" title="Documentation">📖</a></td>
+      <td align="center" valign="top" width="14.28%"><a href="https://github.com/profvjreddi"><img src="https://avatars.githubusercontent.com/profvjreddi?s=100" width="100px;" alt="Vijay Janapa Reddi"/><br /><sub><b>Vijay Janapa Reddi</b></sub></a><br /><a href="https://github.com/harvard-edge/cs249r_book/commits?author=profvjreddi" title="Documentation">📖</a></td>
     </tr>
   </tbody>
 </table>
24 _quarto.yml
@@ -1,6 +1,11 @@
 project:
   type: book
   output-dir: _book
+  preview:
+    browser: true
+    navigate: true
+  render:
+    - "*.qmd"

 website:
   comments:
@@ -8,13 +13,14 @@ website:
     theme: clean
     openSidebar: true

-#abstract: Machine Learning Systems for TinyML offers comprehensive guidance on deploying machine learning on embedded devices. As edge computing and the Internet of Things proliferate, this textbook provides professionals and students the expertise to implement performant AI on resource-constrained hardware. A unique aspect of this book elucidates the entire machine learning workflow, from data engineering through training, optimization, acceleration, and production deployment. Key topics covered include deep learning and classical ML algorithms for embedded systems, efficient neural network architectures, hardware-aware training techniques, model compression, benchmarking for tinyML, and on-device learning. Additional chapters highlight cutting-edge advances like on-device data generation and crucial considerations around reliability, privacy, security, and responsible AI. With its rigorous approach spanning theory and practice across diverse tinyML application domains like smart homes, wearables, and industrial IoT, the book enables readers to develop specialized knowledge. Using concrete use cases and hands-on examples, readers will learn to apply machine learning to transform embedded and IoT systems. Overall, this indispensable guide provides a research-based foundation for leveraging machine learning in embedded systems.
   include-in-header:
     text: <script src="https://hypothes.is/embed.js" async></script>

 book:
+  page-navigation: true
   title: "MACHINE LEARNING SYSTEMS for TinyML"
+  abstract: Machine Learning Systems for TinyML offers comprehensive guidance on deploying machine learning on embedded devices. As edge computing and the Internet of Things proliferate, this textbook provides professionals and students the expertise to implement performant AI on resource-constrained hardware. A unique aspect of this book elucidates the entire machine learning workflow, from data engineering through training, optimization, acceleration, and production deployment. Key topics covered include deep learning and classical ML algorithms for embedded systems, efficient neural network architectures, hardware-aware training techniques, model compression, benchmarking for tinyML, and on-device learning. Additional chapters highlight cutting-edge advances like on-device data generation and crucial considerations around reliability, privacy, security, and responsible AI. With its rigorous approach spanning theory and practice across diverse tinyML application domains like smart homes, wearables, and industrial IoT, the book enables readers to develop specialized knowledge. Using concrete use cases and hands-on examples, readers will learn to apply machine learning to transform embedded and IoT systems. Overall, this indispensable guide provides a research-based foundation for leveraging machine learning in embedded systems.
   subtitle: ""
   search: true
   repo-url: https://github.com/harvard-edge/cs249r_book
   repo-actions: [edit, issue, source]
@@ -25,7 +31,7 @@ book:
   favicon: cover.png
   page-footer:
     left: |
-      Edited by Prof. Vijay Janapa Reddi (Harvard University) and Prof. Song Han (MIT).
+      Edited by Prof. Vijay Janapa Reddi (Harvard University)
     right: |
       This book was built with <a href="https://quarto.org/">Quarto</a>.
@@ -57,7 +63,12 @@ book:
     - generative_ai.qmd
     - ai_for_good.qmd
     - sustainable_ai.qmd
     - references.qmd

+  - part: HANDS-ON EXERCISES
+    chapters:
+      - embedded_sys_exercise.qmd
+
   references: references.qmd

   appendices:
     - tools.qmd
@@ -66,6 +77,7 @@ book:
     - learning_resources.qmd
     - community.qmd
     - case_studies.qmd
+    - embedded_sys_exercise.qmd

 citation: true
 bibliography: references.bib
@@ -94,9 +106,11 @@ format:
     link-external-newwindow: true
     callout-appearance: simple
     anchor-sections: true
-    smooth-scroll: true
+    smooth-scroll: false
     toc: true
+    citations-hover: false
+    fig-width: 8
+    fig-height: 6

 editor:
   render-on-save: true
@@ -8,11 +8,11 @@ We extend our sincere thanks to the diverse group of individuals who have genero
 <table>
   <tbody>
     <tr>
-      <td align="center" valign="top" width="14.28%"><a href="https://github.com/mpstewart1"><img src="https://avatars.githubusercontent.com/mpstewart1?s=100" width="100px;" alt="Matthew Stewart"/><br /><sub><b>Matthew Stewart</b></sub></a><br /><a href="https://github.com/harvard-edge/cs249r_book/commits?author=mpstewart1" title="Documentation">📖</a></td>
-      <td align="center" valign="top" width="14.28%"><a href="https://github.com/uchendui"><img src="https://avatars.githubusercontent.com/uchendui?s=100" width="100px;" alt="Ikechukwu Uchendu"/><br /><sub><b>Ikechukwu Uchendu</b></sub></a><br /><a href="https://github.com/harvard-edge/cs249r_book/commits?author=uchendui" title="Documentation">📖</a></td>
-      <td align="center" valign="top" width="14.28%"><a href="https://github.com/jessicaquaye"><img src="https://avatars.githubusercontent.com/jessicaquaye?s=100" width="100px;" alt="Jessica Quaye"/><br /><sub><b>Jessica Quaye</b></sub></a><br /><a href="https://github.com/harvard-edge/cs249r_book/commits?author=jessicaquaye" title="Documentation">📖</a></td>
-      <td align="center" valign="top" width="14.28%"><a href="https://github.com/jveejay"><img src="https://avatars.githubusercontent.com/jveejay?s=100" width="100px;" alt="Vijay Janapa Reddi"/><br /><sub><b>Vijay Janapa Reddi</b></sub></a><br /><a href="https://github.com/harvard-edge/cs249r_book/commits?author=jveejay" title="Documentation">📖</a></td>
+      <td align="center" valign="top" width="14.28%"><a href="https://github.com/ShvetankPrakash"><img src="https://avatars.githubusercontent.com/ShvetankPrakash?s=100" width="100px;" alt="Shvetank Prakash"/><br /><sub><b>Shvetank Prakash</b></sub></a><br /><a href="https://github.com/harvard-edge/cs249r_book/commits?author=ShvetankPrakash" title="Documentation">📖</a></td>
+      <td align="center" valign="top" width="14.28%"><a href="https://github.com/mpstewart1"><img src="https://avatars.githubusercontent.com/mpstewart1?s=100" width="100px;" alt="Matthew Stewart"/><br /><sub><b>Matthew Stewart</b></sub></a><br /><a href="https://github.com/harvard-edge/cs249r_book/commits?author=mpstewart1" title="Documentation">📖</a></td>
+      <td align="center" valign="top" width="14.28%"><a href="https://github.com/jessicaquaye"><img src="https://avatars.githubusercontent.com/jessicaquaye?s=100" width="100px;" alt="Jessica Quaye"/><br /><sub><b>Jessica Quaye</b></sub></a><br /><a href="https://github.com/harvard-edge/cs249r_book/commits?author=jessicaquaye" title="Documentation">📖</a></td>
+      <td align="center" valign="top" width="14.28%"><a href="https://github.com/uchendui"><img src="https://avatars.githubusercontent.com/uchendui?s=100" width="100px;" alt="Ikechukwu Uchendu"/><br /><sub><b>Ikechukwu Uchendu</b></sub></a><br /><a href="https://github.com/harvard-edge/cs249r_book/commits?author=uchendui" title="Documentation">📖</a></td>
+      <td align="center" valign="top" width="14.28%"><a href="https://github.com/profvjreddi"><img src="https://avatars.githubusercontent.com/profvjreddi?s=100" width="100px;" alt="Vijay Janapa Reddi"/><br /><sub><b>Vijay Janapa Reddi</b></sub></a><br /><a href="https://github.com/harvard-edge/cs249r_book/commits?author=profvjreddi" title="Documentation">📖</a></td>
     </tr>
   </tbody>
 </table>
@@ -1,14 +1,3 @@
----
-title: "4 Embedded AI - Exercise: Image Classification"
-format:
-  html:
-    code-fold: false
-execute:
-  eval: false
-jupyter: python3
----
-
-
 ### **Introduction**

 As we initiate our studies into embedded machine learning or tinyML,
@@ -305,7 +294,7 @@ or rotating the images).
 Under the hood, here you can see how Edge Impulse implements a data
 augmentation policy on your data:

-```{python}
+```{{python}}
 # Implements the data augmentation policy
 def augment_image(image, label):
     # Flips the image randomly
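The `augment_image` listing in the hunk is truncated after the first comment. As a rough illustration of what such a policy does, here is a minimal NumPy sketch (random horizontal flip plus brightness jitter); this is an illustration only, not Edge Impulse's actual implementation, which is built on TensorFlow image ops:

```python
import random

import numpy as np

def augment_image(image, label):
    """Randomly flip and brightness-jitter an image; shape is preserved."""
    # Flip the image horizontally with 50% probability
    if random.random() < 0.5:
        image = image[:, ::-1, :]
    # Jitter brightness by up to +/-10%, clipping to the valid [0, 1] range
    image = np.clip(image * random.uniform(0.9, 1.1), 0.0, 1.0)
    return image, label

img = np.full((96, 96, 3), 0.5, dtype=np.float32)
aug, lbl = augment_image(img, label=1)
assert aug.shape == img.shape and lbl == 1
```

The key property of any such policy is that the label passes through untouched while the pixels are perturbed, so the model sees a slightly different image on every epoch.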
@@ -414,23 +403,19 @@ height="4.263888888888889in"}

 On your computer, you will find a ZIP file. Open it:

-{width="6.5in"
-height="2.625in"}
+{width="6.5in" height="2.625in"}

 Use the Bootloader tool on the OpenMV IDE to load the FW on your board:

-{width="6.5in"
-height="3.625in"}
+{width="6.5in" height="3.625in"}

 Select the appropriate file (.bin for Nicla-Vision):

-{width="6.5in"
-height="1.9722222222222223in"}
+{width="6.5in" height="1.9722222222222223in"}

 After the download is finished, press OK:

-{width="3.875in"
-height="5.708333333333333in"}
+{width="3.875in" height="5.708333333333333in"}

 If a message says that the FW is outdated, DO NOT UPGRADE. Select
 \[NO\].
@@ -460,7 +445,7 @@ on the OpenMV IDE.
 GitHub,]{.underline}](https://github.com/Mjrovai/Arduino_Nicla_Vision/blob/main/Micropython/nicla_image_classification.py)
 or modify it as below:

-```{python}
+```{{python}}
 # Marcelo Rovai - NICLA Vision - Image Classification
 # Adapted from Edge Impulse - OpenMV Image Classification Example
 # @24Aug23
@@ -563,7 +548,7 @@ For that, we should [[upload the code from
 GitHub]{.underline}](https://github.com/Mjrovai/Arduino_Nicla_Vision/blob/main/Micropython/nicla_image_classification_LED.py)
 or change the last code to include the LEDs:

-```{python}
+```{{python}}
 # Marcelo Rovai - NICLA Vision - Image Classification with LEDs
 # Adapted from Edge Impulse - OpenMV Image Classification Example
 # @24Aug23
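The LED variant of the script lights a different LED depending on the predicted class. The decision step can be sketched as a pure function; the class names, LED colors, and the 0.8 confidence threshold below are assumptions for illustration, not values taken from the truncated listing:

```python
# Illustrative sketch of LED selection: pick an LED color from the
# highest-scoring class, but only when the model is confident enough.
# Class names and the 0.8 threshold are assumptions for this example.
LED_FOR_LABEL = {'periquito': 'green', 'robot': 'blue', 'background': 'red'}

def led_for_prediction(scores, threshold=0.8):
    """Return the LED color for the top class, or None below the threshold."""
    label = max(scores, key=scores.get)
    if scores[label] < threshold:
        return None
    return LED_FOR_LABEL.get(label)

print(led_for_prediction({'periquito': 0.93, 'robot': 0.05, 'background': 0.02}))
```

Returning `None` below the threshold keeps all LEDs off on uncertain frames, which is usually preferable to flickering between colors.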
@@ -377,3 +377,9 @@ The table provides a side-by-side comparison between these two distinct types of

 As we gaze into the future, it's clear that the realm of embedded systems stands on the cusp of a transformative era, characterized by groundbreaking innovations, abundant opportunities, and formidable challenges. The horizon is replete with the promise of enhanced connectivity, heightened intelligence, and superior efficiency, carving out a trajectory where embedded systems will serve as the guiding force behind society's technological progress. The path forward is one of discovery and adaptability, where the confluence of technological prowess and creative ingenuity will sculpt a future that is not only rich in technological advancements but also attuned to the intricate and continually shifting needs of a dynamic global landscape. It's a field teeming with possibilities, inviting trailblazers to embark on a journey to define the parameters of a bright and flourishing future.
+
+## Exercises
+
+coming soon.
+
+[Setup Nicla Vision](./embedded_sys_exercise.qmd)
@@ -1,14 +1,4 @@
----
-title: "2 Embedded Systems - Exercise: The Nicla Vision"
-format:
-  html:
-    code-fold: false
-execute:
-  eval: false
-jupyter: python3
----
-
-## Introduction ##
+# Introduction

 The [Arduino Nicla
 Vision](https://docs.arduino.cc/hardware/nicla-vision) (sometimes called
@@ -265,7 +255,7 @@ height="3.9722222222222223in"}

 Let\'s go through the [helloworld.py](http://helloworld.py/) script:

-```{python}
+```{{python}}
 # Hello World Example 2
 #
 # Welcome to the OpenMV IDE! Click on the green run arrow button below to run the script!
@@ -400,7 +390,7 @@ The Nicla Pins 3 (Tx) and 4 (Rx) are connected with the Shield Serial
 connector. The UART communication is used with the LoRaWan device. Here
 is a simple code to use the UART:

-```{python}
+```{{python}}
 # UART Test - By: marcelo_rovai - Sat Sep 23 2023

 import time
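The UART test itself runs on the board, so it cannot be exercised here; the newline-terminated send/receive pattern it relies on can, however, be sketched host-side with a stand-in transport. `FakeUART` below is invented for illustration; on the Nicla, MicroPython's `machine.UART` object plays this role:

```python
# Host-side sketch of the newline-terminated send/receive pattern used by
# the UART test. FakeUART is a loopback stand-in for machine.UART.
class FakeUART:
    def __init__(self):
        self._buf = b''

    def write(self, data):
        self._buf += data  # loopback: written bytes become readable
        return len(data)

    def readline(self):
        # Consume bytes up to and including the first newline
        line, sep, rest = self._buf.partition(b'\n')
        self._buf = rest
        return line + sep

uart = FakeUART()
uart.write(b'Hello World!\n')
print(uart.readline())  # b'Hello World!\n'
```

On real hardware, the same `write`/`readline` calls would carry the bytes over pins 3 (Tx) and 4 (Rx) to the LoRaWAN shield instead of a local buffer.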
@@ -435,7 +425,7 @@ uploaded to the Nicla (the
 can be found in GitHub).

-```{python}
+```{{python}}
 # Nicla_OLED_Hello_World - By: marcelo_rovai - Sat Sep 30 2023

 #Save on device: MicroPython SSD1306 OLED driver, I2C and SPI interfaces created by Adafruit
@@ -454,7 +444,7 @@ oled.show()

 Finally, here is a simple script to read the ADC value on pin \"PC4\"
 (Nicla pin A0):

-```{python}
+```{{python}}

 # Light Sensor (A0) - By: marcelo_rovai - Wed Oct 4 2023
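The light-sensor script reads a raw count from the ADC on pin PC4 (A0). Turning that count into a voltage is a one-line scaling; the 3.3 V reference and 12-bit resolution used below are assumptions about the board for this sketch, not values taken from the truncated listing:

```python
# Convert a raw ADC count to volts, assuming a 3.3 V reference and a
# 12-bit converter (counts 0..4095). Both values are assumptions here.
ADC_VREF = 3.3
ADC_MAX = 4095  # 2**12 - 1

def adc_to_volts(raw, vref=ADC_VREF, full_scale=ADC_MAX):
    return raw * vref / full_scale

print(round(adc_to_volts(4095), 2))  # 3.3
```

A brighter light on the sensor yields a larger raw count and therefore a higher computed voltage; check your board's datasheet for the actual reference voltage and ADC width before relying on the numbers.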
@@ -1,5 +1,7 @@
 # Introduction

+[//]: [testing](test.qmd)
+
 ## Overview

 Welcome to this comprehensive exploration of Tiny Machine Learning (TinyML). This book aims to bridge the gap between intricate machine learning theories and their practical applications on small devices. Whether you're a newcomer, an industry professional, or an academic researcher, this book offers a balanced mix of essential theory and hands-on insights into TinyML.
@@ -102,4 +104,4 @@ As we navigate the multifaceted world of embedded AI, we'll cover a broad range

 ## Contribute Back

 Learning in the fast-paced world of embedded AI is a collaborative journey. This book aims to nurture a vibrant community of learners, innovators, and contributors. As you explore the concepts and engage with the exercises, we encourage you to share your insights and experiences. Whether it's a novel approach, an interesting application, or a thought-provoking question, your contributions can enrich the learning ecosystem. Engage in discussions, offer and seek guidance, and collaborate on projects to foster a culture of mutual growth and learning. By sharing knowledge, you play a pivotal role in fostering a globally connected, informed, and empowered community.