Optimization Level 0: Baseline

Results:
- Perceptron: (1.86s) 100.0%
- XOR: (1.92s) 54.5%
- MNIST: (2.03s) 15.0%
- CIFAR: (60.00s) timeout
- TinyGPT: (1.85s)
Vijay Janapa Reddi
2025-09-28 21:42:40 -04:00
parent bac4d0f99a
commit 7ad19905aa
3 changed files with 23 additions and 6 deletions


@@ -14,3 +14,4 @@ Testing Optimization Level 0: Baseline
 [2025-09-28 21:31:27] ✅ Complete in 2.00s
 [2025-09-28 21:31:27]
 Committing results for Baseline...
+[2025-09-28 21:31:27] Committed results


@@ -0,0 +1,16 @@
+[2025-09-28 21:41:32]
+Testing Optimization Level 0: Baseline
+[2025-09-28 21:41:32] Description: No optimizations
+[2025-09-28 21:41:32] ------------------------------------------------------------
+[2025-09-28 21:41:32] Testing Perceptron with Baseline...
+[2025-09-28 21:41:34] ✅ Complete in 1.86s
+[2025-09-28 21:41:34] Testing XOR with Baseline...
+[2025-09-28 21:41:36] ✅ Complete in 1.92s
+[2025-09-28 21:41:36] Testing MNIST with Baseline...
+[2025-09-28 21:41:38] ✅ Complete in 2.03s
+[2025-09-28 21:41:38] Testing CIFAR with Baseline...
+[2025-09-28 21:42:38] ⏱️ Timeout after 60s
+[2025-09-28 21:42:38] Testing TinyGPT with Baseline...
+[2025-09-28 21:42:40] ✅ Complete in 1.85s
+[2025-09-28 21:42:40]
+Committing results for Baseline...
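The 60s CIFAR cutoff in the log above implies the harness enforces a per-test timeout. A minimal sketch of how such a cutoff could work, assuming each milestone runs as its own subprocess (`run_test` is a hypothetical helper, not the repo's actual runner):

```python
import subprocess
import sys
import time

def run_test(cmd, timeout_s=60):
    """Run one benchmark command; return (success, elapsed_seconds).

    A sketch only -- the harness's real runner is not shown in this diff.
    """
    start = time.monotonic()
    try:
        subprocess.run(cmd, check=True, timeout=timeout_s, capture_output=True)
        return True, time.monotonic() - start
    except subprocess.TimeoutExpired:
        # Corresponds to a "Timeout after 60s" log line: the child is killed
        # and the test is recorded as unsuccessful at the full budget.
        return False, float(timeout_s)
    except subprocess.CalledProcessError:
        return False, time.monotonic() - start
```

`subprocess.run(..., timeout=...)` raises `TimeoutExpired` after killing the child, which also explains why the CIFAR entry in the JSON below has `"success": false` with no accuracy.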


@@ -1,24 +1,24 @@
 {
   "Perceptron": {
     "success": true,
-    "time": 1.8516879081726074,
+    "time": 1.8552052974700928,
     "output_preview": "ion\n\n\ud83d\ude80 Next Steps:\n \u2022 Continue to XOR 1969 milestone after Module 06 (Autograd)\n \u2022 YOUR foundation enables solving non-linear problems!\n \u2022 With 100.0% accuracy, YOUR perceptron works perfectly!\n",
     "loss": 0.2038,
     "accuracy": 100.0
   },
   "XOR": {
     "success": true,
-    "time": 1.9161672592163086,
+    "time": 1.9177570343017578,
     "output_preview": "ayer networks\n\n\ud83d\ude80 Next Steps:\n \u2022 Continue to MNIST MLP after Module 08 (Training)\n \u2022 YOUR XOR solution scales to real vision problems!\n \u2022 Hidden layers principle powers all modern deep learning!\n",
     "loss": 0.2497,
     "accuracy": 54.5
   },
   "MNIST": {
     "success": true,
-    "time": 2.042603015899658,
+    "time": 2.02604603767395,
     "output_preview": " a scalar is deprecated, and will error in future. Ensure you extract a single element from your array before performing this operation. (Deprecated NumPy 1.25.)\n one_hot[i, int(labels_np[i])] = 1.0\n",
     "loss": 0.0,
-    "accuracy": 9.0
+    "accuracy": 15.0
   },
   "CIFAR": {
     "success": false,
@@ -27,8 +27,8 @@
   },
   "TinyGPT": {
     "success": true,
-    "time": 2.003012180328369,
+    "time": 1.851945161819458,
     "output_preview": "ining\n \u2022 Complete transformer architecture from first principles\n\n\ud83c\udfed Production Note:\n Real PyTorch uses optimized CUDA kernels for attention,\n but you built and understand the core mathematics!\n",
-    "loss": 0.2696
+    "loss": 0.3688
   }
 }
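The MNIST `output_preview` above has captured a NumPy 1.25 deprecation warning rather than the model's own output: `int()` is being called on an array with ndim > 0 at `one_hot[i, int(labels_np[i])] = 1.0`. A minimal sketch of the fix, assuming the labels arrive as shape (N, 1) arrays (the sample data here is hypothetical; the real loader isn't shown in this diff):

```python
import numpy as np

# Hypothetical labels of shape (N, 1): labels_np[i] is a 1-element array,
# so int(labels_np[i]) triggers the NumPy 1.25 deprecation warning.
labels_np = np.array([[2], [0], [1]])
num_classes = 3

one_hot = np.zeros((len(labels_np), num_classes))
for i in range(len(labels_np)):
    # Fix: extract the scalar explicitly with .item() before indexing,
    # which is exactly what the warning message asks for.
    one_hot[i, int(labels_np[i].item())] = 1.0
```

`ndarray.item()` returns a plain Python scalar, so the conversion stays valid when NumPy turns this deprecation into a hard error.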