D. Ray
2024-10-21 20:06:28 UTC
…quantization-aware training on the CH32V003, I wondered how far this can be
pushed. How much can we compress a neural network while still achieving
good test accuracy on the MNIST dataset? When it comes to absolutely
low-end microcontrollers, there is hardly a more compelling target than the
Padauk 8-bit microcontrollers. These are microcontrollers optimized for the
simplest and lowest-cost applications there are. The smallest device of the
portfolio, the PMS150C, sports 1024 words of 13-bit one-time-programmable
memory and 64 bytes of RAM, more than an order of magnitude smaller than
on the CH32V003. In addition, it has a proprietary accumulator-based 8-bit
architecture, as opposed to the much more powerful RISC-V instruction set.
Is it possible to implement an MNIST inference engine, which can classify
handwritten digits, on a PMS150C as well?
…
<https://cpldcpu.wordpress.com/2024/05/02/machine-learning-mnist-inference-on-the-3-cent-microcontroller/>
<https://archive.md/DzqzL>
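
For a sense of what quantization-aware training has to deliver here: the
PMS150C's 1024 × 13-bit OTP words amount to just 1664 bytes for code and
weights combined, so the weights have to shrink to a couple of bits each.
Below is a minimal sketch (mine, not the post's code; the 2-bit width,
symmetric scaling, and layer shape are illustrative assumptions) of the
fake-quantization step that QAT typically inserts into the forward pass,
while full-precision shadow weights keep receiving the gradient updates.

    import numpy as np

    def fake_quantize(w, bits=2):
        # Symmetric per-tensor quantization: snap weights onto a grid of
        # at most 2**bits levels; the forward pass sees the low-precision
        # values, while training keeps updating the full-precision w.
        qmax = 2 ** (bits - 1) - 1
        max_abs = np.max(np.abs(w))
        scale = max_abs / qmax if max_abs > 0 else 1.0
        q = np.clip(np.round(w / scale), -qmax - 1, qmax)
        return q * scale

    # Hypothetical layer: 8x8 downsampled MNIST input -> 10 classes.
    rng = np.random.default_rng(0)
    w = rng.normal(0.0, 0.1, size=(64, 10))
    w_q = fake_quantize(w, bits=2)
    print("distinct weight levels:", np.unique(w_q).size)  # at most 2**bits = 4

At 2 bits per weight, the 640 weights of such a layer would need about 160
bytes (ignoring the awkward packing into 13-bit words), leaving the bulk of
the OTP for the inference code itself.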