Description
Designed and implemented a GPU-accelerated MNIST training engine fully in WebGPU. Built complete forward and backward passes using 6 custom WGSL compute shaders:
- Dense layer forward kernels
- ReLU activation
- Softmax + cross-entropy backward
- Dense layer gradient kernels
- SGD optimizer

Implemented GPU memory layouts, buffer management, shader orchestration, and dispatch scaling. Achieved ~40% training accuracy after 20 epochs with a batch size of 64, running entirely in the browser. Demonstrates the ability to build GPU compute pipelines and AI runtimes from low-level operations.
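As a rough illustration of the kind of kernel involved, here is a minimal sketch of an SGD update step in WGSL. It assumes weights and gradients live in flat f32 storage buffers and that the learning rate and element count arrive via a uniform; the binding layout, struct, and names are illustrative and not taken from this repo's actual shaders.

```wgsl
// Hypothetical SGD update kernel (not the repo's actual shader).
// Assumes flat f32 buffers for weights and gradients.

struct Params {
    learning_rate : f32,
    n_elements : u32,
};

@group(0) @binding(0) var<storage, read_write> weights : array<f32>;
@group(0) @binding(1) var<storage, read> gradients : array<f32>;
@group(0) @binding(2) var<uniform> params : Params;

@compute @workgroup_size(64)
fn main(@builtin(global_invocation_id) gid : vec3<u32>) {
    let i = gid.x;
    // Guard against the final, partially filled workgroup.
    if (i >= params.n_elements) {
        return;
    }
    // Plain SGD: w <- w - lr * dw
    weights[i] = weights[i] - params.learning_rate * gradients[i];
}
```

On the JavaScript side, a kernel like this would be dispatched with roughly ceil(n_elements / 64) workgroups per parameter buffer, which is where the "dispatch scaling" mentioned above comes in.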
12 MiB
Languages
JavaScript 91.2%
WGSL 7.4%
HTML 1.4%