Discussion:
Neural Networks (MNIST inference) on the “3-cent” Microcontroller
D. Ray
2024-10-21 20:06:28 UTC
Buoyed by the surprisingly good performance of neural networks with
quantization-aware training on the CH32V003, I wondered how far this can
be pushed. How much can we compress a neural network while still
achieving good test accuracy on the MNIST dataset? When it comes to
absolutely low-end microcontrollers, there is hardly a more compelling
target than the Padauk 8-bit microcontrollers. These are
microcontrollers optimized for the simplest and lowest-cost applications
there are. The smallest device of the portfolio, the PMS150C, sports
1024 13-bit words of one-time-programmable memory and 64 bytes of RAM,
more than an order of magnitude less than the CH32V003. In addition, it
has a proprietary accumulator-based 8-bit architecture, as opposed to
the CH32V003's much more powerful RISC-V instruction set.
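To put those numbers in perspective, here is a rough, host-side budget
calculation in C. The PMS150C figures (1024 x 13-bit OTP words, 64 bytes
of RAM) are the ones quoted above; the 8x8 downscaled input, the 16-unit
hidden layer and the 2-bit weights are purely illustrative assumptions
to get a feel for what could fit, not figures taken from the linked
article.

#include <stdio.h>

/* PMS150C figures from the post above. */
#define OTP_WORDS      1024      /* one-time-programmable program memory */
#define OTP_WORD_BITS  13
#define RAM_BYTES      64

/* Illustrative assumptions, not figures from the linked article. */
#define INPUT_PIXELS   (8 * 8)   /* MNIST downscaled from 28x28 to 8x8 */
#define HIDDEN_UNITS   16        /* assumed hidden-layer width */
#define OUTPUT_UNITS   10        /* one score per digit class */
#define WEIGHT_BITS    2         /* assumed quantization width */

int main(void)
{
    long otp_bits    = (long)OTP_WORDS * OTP_WORD_BITS;
    long weight_bits = (long)WEIGHT_BITS *
                       (INPUT_PIXELS * HIDDEN_UNITS +
                        HIDDEN_UNITS * OUTPUT_UNITS);

    printf("OTP capacity : %ld bits (~%ld bytes)\n", otp_bits, otp_bits / 8);
    printf("weights      : %ld bits, leaving %ld bits for code\n",
           weight_bits, otp_bits - weight_bits);
    printf("RAM          : %d bytes for input, activations and stack\n",
           RAM_BYTES);
    return 0;
}

Even under these assumptions, roughly 300 bytes of the ~1.6 KB OTP
equivalent would go to weights alone, and the 64 bytes of RAM have to be
shared between the input image, the activations and the stack.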

Is it possible to implement an MNIST inference engine that can classify
handwritten digits on a PMS150C as well?
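For a feel of what the inner loop of such an engine might look like,
below is a minimal fully-connected layer in portable C, using 2-bit
signed weights packed four to a byte and 16-bit accumulation, which maps
reasonably well onto an 8-bit accumulator machine. It reuses the
illustrative assumptions from the budget sketch above (8x8 input, 2-bit
weights); it is not the code from the linked article, and a real build
for the PMS150C would need a Padauk-capable compiler such as SDCC plus a
scheme for storing the weight table in the 13-bit OTP words.

#include <stdint.h>

/* Illustrative layer sizes, matching the assumptions above. */
#define IN_DIM   64u   /* 8x8 downscaled input */
#define OUT_DIM  10u   /* one score per digit class */

/* Decode one 2-bit field (0..3) into a signed weight in {-2, -1, 0, 1}. */
static int8_t decode_w2(uint8_t packed, uint8_t idx)
{
    uint8_t field = (uint8_t)((packed >> (2u * idx)) & 0x03u);
    return (int8_t)(field - 2);
}

/* activations: IN_DIM input values, one byte each (e.g. unpacked 2-bit pixels)
 * weights:     IN_DIM*OUT_DIM 2-bit fields, row-major, packed four per byte
 * scores:      OUT_DIM signed 16-bit accumulators; the argmax is the
 *              predicted digit                                              */
void fc_forward(const uint8_t *activations, const uint8_t *weights,
                int16_t *scores)
{
    for (uint8_t o = 0; o < OUT_DIM; o++) {
        int16_t acc = 0;
        for (uint8_t i = 0; i < IN_DIM; i++) {
            uint16_t idx = (uint16_t)(o * IN_DIM + i);   /* flat weight index */
            int8_t w = decode_w2(weights[idx >> 2], (uint8_t)(idx & 0x3u));
            acc += (int16_t)(w * activations[i]);
        }
        scores[o] = acc;
    }
}

With only 64 bytes of RAM, even the unpacked 64-byte input buffer shown
here would leave no room for the scores or the stack; in practice the
input image itself would have to be packed (for instance four 2-bit
pixels per byte) and unpacked on the fly. The unpacked buffer is kept
only to make the sketch readable.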

<https://cpldcpu.wordpress.com/2024/05/02/machine-learning-mnist-inference-on-the-3-cent-microcontroller/>

<https://archive.md/DzqzL>
olcott
2024-10-27 01:43:01 UTC
Post by D. Ray
Is it possible to implement an MNIST inference engine that can classify
handwritten digits on a PMS150C as well?


test to see if this posts or I should dump this paid provider.
--
Copyright 2024 Olcott

"Talent hits a target no one else can hit;
Genius hits a target no one else can see."
Arthur Schopenhauer
D. Ray
2024-10-28 15:42:41 UTC
Post by olcott
test to see if this posts or I should dump this paid provider.
It worked.
