TDA4VM: [TIDL] Leaky ReLU implementation in AI model?

Part Number: TDA4VM

Dear Champs,

Is leaky ReLU supported in TIDL? My customer is a bit confused about whether leaky ReLU is supported in TIDL after reading the E2E thread below.

Could you please clarify this?

https://e2e.ti.com/support/processors-group/processors/f/processors-forum/1026188/tda4vm-support-for-tanh-activaton/3801522?tisearch=e2e-sitesearch&keymatch=Leaky%20relu#3801522 

"TIDL supports ReLU, ReLU6 and leaky ReLU activation functions, and ReLU and ReLU6 would be more optimal for performance, so you can use these activation functions."

My customer is trying to implement leaky ReLU in their AI model on TDA4, and they found that accuracy and performance are worse on TDA4 than on PC. Could you please provide guidance on how leaky ReLU should be implemented?

Should this be replaced with ReLU?
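
For reference, here is a minimal sketch of how the customer could check which activation ops actually ended up in the exported ONNX graph before handing it to the TIDL import tool (the file name "model.onnx" is just an example, not the customer's actual model):

```python
# Minimal sketch: list the activation nodes in an exported ONNX model so we
# can confirm whether LeakyRelu is really present in the graph given to TIDL.
import onnx

model = onnx.load("model.onnx")  # hypothetical file name

for node in model.graph.node:
    # Note: ReLU6 is usually exported to ONNX as a Clip node.
    if node.op_type in ("LeakyRelu", "Relu", "Clip"):
        print(node.op_type, node.name)
```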

I suggested that the customer check our YoloV5 implementation webinar, but it would be very helpful if you could provide additional comments on this.

Thanks and Best Regards,

SI.

  • Hi SI,

    TI does support the leaky ReLU operator in TIDL, but as mentioned in the E2E thread you referred to, ReLU and ReLU6 activations are better in terms of performance. If the customer is looking for better performance, I would suggest not using leaky ReLU and using ReLU/ReLU6 instead.
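
    As a starting point, here is a minimal sketch of how the activation could be swapped in a PyTorch model before re-export (the model name is a placeholder, and the network would normally need to be fine-tuned after the change to recover accuracy):

    ```python
    import torch.nn as nn

    def replace_leaky_relu(module: nn.Module) -> None:
        """Recursively replace nn.LeakyReLU with nn.ReLU in place."""
        for name, child in module.named_children():
            if isinstance(child, nn.LeakyReLU):
                setattr(module, name, nn.ReLU(inplace=True))
            else:
                replace_leaky_relu(child)

    # Hypothetical usage:
    # model = MyDetector()        # placeholder for the customer's network
    # replace_leaky_relu(model)
    # ...fine-tune, then re-export to ONNX for the TIDL import tool...
    ```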

    Regards,

    Anand