For running our deep learning model, we need to store and retrieve a 35 MB TF Lite model and then run the model for inference. Since the on-device RAM is only 1 MB, is it possible to interface external RAM and ROM?
Hello,
I'm not familiar with the interfaces typically used by external RAM. If there is a RAM device that connects to one of the standard interfaces available on this device (UART, SPI, etc.), then this may be possible.
Best Regards,
Josh