
TDA4VM: Need some clarity on reshape layer merge in TIDL

Part Number: TDA4VM

I have been using this part of the TIDL user guide as a reference, even though our setup uses the TFLite delegate import scripts.

While standalone reshape layers are not supported in TIDL, it seems that reshape/flatten layers are supported when they occur immediately before certain other layers, in which case they are merged into those layers.

For example, this sample flatten + FC TFLite model imports without errors:

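Roughly, the model looks like the following (a sketch only; the attached sample_models.zip is the authoritative version, and the shapes, layer sizes, and file name here are illustrative):

import tensorflow as tf

# Illustrative only -- the real model is in the attached sample_models.zip.
inputs = tf.keras.Input(shape=(4, 4, 1), name="input_data")
x = tf.keras.layers.Flatten(name="reshape")(inputs)      # typically exported as a TFLite RESHAPE (builtin code 22)
outputs = tf.keras.layers.Dense(8, name="dense")(x)      # typically exported as FULLY_CONNECTED (builtin code 9)
model = tf.keras.Model(inputs, outputs, name="TEST")

converter = tf.lite.TFLiteConverter.from_keras_model(model)
with open("flatten_fc.tflite", "wb") as f:
    f.write(converter.convert())
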
and I can see that reshape is listed as a supported TIDL layer type:

Supported TIDL layer type --- 39 Tflite layer type --- 22 layer output name--- TEST/dense/Tensordot/Reshape;TEST/reshape/Reshape 
Supported TIDL layer type --- 6 Tflite layer type --- 9 layer output name---        Identity 

 Number of subgraphs:1 , 2 nodes delegated out of 2 nodes 
 
In TIDL_tfliteRtImportInit subgraph_id=5
Layer 0, subgraph id 5, name=Identity
Layer 1, subgraph id 5, name=input_data
In TIDL_tfliteRtImportNode  TIDL Layer type 39   Tflite builtin code type 22 
In TIDL_tfliteRtImportNode  TIDL Layer type 6   Tflite builtin code type 9 
In TIDL_runtimesOptimizeNet: LayerIndex = 4, dataIndex = 3 

 ************** Frame index 1 : Running float import *************
 
 ...
 
****************************************************
**                ALL MODEL CHECK PASSED          **
****************************************************


But this sample reshape + Conv2D TFLite model imports with the error detailed below (despite reshape still being listed as a supported layer):

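Again, roughly (a sketch; the shapes, filter counts, and file name are illustrative and the attachment is authoritative):

import tensorflow as tf

# Illustrative only -- the real model is in the attached sample_models.zip.
inputs = tf.keras.Input(shape=(16,), name="input_data")
x = tf.keras.layers.Reshape((4, 4, 1), name="reshape")(inputs)            # typically exported as a TFLite RESHAPE (builtin code 22)
outputs = tf.keras.layers.Conv2D(4, 3, padding="same", name="conv")(x)    # typically exported as CONV_2D (builtin code 3)
model = tf.keras.Model(inputs, outputs, name="TEST")

converter = tf.lite.TFLiteConverter.from_keras_model(model)
with open("reshape_conv2d.tflite", "wb") as f:
    f.write(converter.convert())
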
Supported TIDL layer type --- 39 Tflite layer type --- 22 layer output name--- TEST/reshape/Reshape 
Supported TIDL layer type --- 1 Tflite layer type --- 3 layer output name---        Identity 

 Number of subgraphs:1 , 2 nodes delegated out of 2 nodes 
 
In TIDL_tfliteRtImportInit subgraph_id=5
Layer 0, subgraph id 5, name=Identity
Layer 1, subgraph id 5, name=input_data
In TIDL_tfliteRtImportNode  TIDL Layer type 39   Tflite builtin code type 22 
In TIDL_tfliteRtImportNode  TIDL Layer type 1   Tflite builtin code type 3 
In TIDL_runtimesOptimizeNet: LayerIndex = 4, dataIndex = 3 
convParams.numInChannels Is not multiple of convParams.numGroups -  Exiting

Based on this, I have several questions:

  • Why is this error happening? Is TI able to reproduce it?
  • What is the full list of layers that support reshape merge and flatten merge? Are the two lists different? Is Conv2D included?
  • Is any of this behaviour different between the native TIDL-RT model import tool and the TFLite delegate import? 
  • Is there a way we can distinguish between reshape and flatten layers using the deny_list feature of TFLite delegates, given that they use the same TFLite operator?
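
For context on the last point: we pass the deny_list through the TIDL delegate options, roughly as below. This is only a minimal sketch, assuming the library and option names used in the edgeai-tidl-tools example scripts; paths and file names are placeholders.

import tflite_runtime.interpreter as tflite

# Option names follow the edgeai-tidl-tools example scripts; paths/values are placeholders.
delegate_options = {
    "tidl_tools_path": "/path/to/tidl_tools",
    "artifacts_folder": "./tidl_artifacts",
    "deny_list": "22",   # kTfLiteBuiltinReshape -- hits flatten and reshape alike, since both map to code 22
}

# Model compilation/import step (inference later uses libtidl_tfl_delegate.so with similar options).
tidl_delegate = [tflite.load_delegate("tidl_model_import_tflite.so", delegate_options)]
interpreter = tflite.Interpreter(model_path="sample_model.tflite",
                                 experimental_delegates=tidl_delegate)
interpreter.allocate_tensors()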

Sample models are attached: 
sample_models.zip

  • To expand on my previous question about the deny_list: it seems that flatten layers specifically cannot be added to the deny list at all, because one of the resulting subgraphs ends up with a 2D tensor as an input (which is not allowed).

    Here is what happens when we add the operator kTfLiteBuiltinReshape = 22 to the deny list for the above models:

    For flatten + FC TFLite model:

    Layer 'TEST/dense/Tensordot/Reshape;TEST/reshape/Reshape' added to unsupported nodes as specified in deny list 
    Supported TIDL layer type --- 6 Tflite layer type --- 9 layer output name---        Identity 
    
     Number of subgraphs:1 , 1 nodes delegated out of 2 nodes 
     
    Node in deny list...delegated to ARM --- tflite layer code - 22, tensor name - TEST/dense/Tensordot/Reshape;TEST/reshape/Reshape  
    In TIDL_tfliteRtImportInit subgraph_id=5
    Layer 0, subgraph id 5, name=Identity
    Layer 1, subgraph id 5, name=TEST/dense/Tensordot/Reshape;TEST/reshape/Reshape
    
     Invalid inWidth parameter setting : set it to >0 Validation of TIDL tflite runtime import config parameters failed!
    In TIDL_tfliteRtImportNode  TIDL Layer type 6   Tflite builtin code type 9 
    In TIDL_runtimesOptimizeNet: LayerIndex = 3, dataIndex = 2 
    ****************************************************
    **   All the Input Tensor Dimensions has to be greater then Zero 
    **   DIM Error - For Tensor 0, Dim 3 is 0
    ****************************************************

    For reshape + Conv2D TFLite model:

    Layer 'TEST/reshape/Reshape' added to unsupported nodes as specified in deny list 
    Supported TIDL layer type --- 1 Tflite layer type --- 3 layer output name---        Identity 
    
     Number of subgraphs:1 , 1 nodes delegated out of 2 nodes 
     
    Node in deny list...delegated to ARM --- tflite layer code - 22, tensor name - TEST/reshape/Reshape  
    In TIDL_tfliteRtImportInit subgraph_id=5
    Layer 0, subgraph id 5, name=Identity
    Layer 1, subgraph id 5, name=TEST/reshape/Reshape
    In TIDL_tfliteRtImportNode  TIDL Layer type 1   Tflite builtin code type 3 
    In TIDL_runtimesOptimizeNet: LayerIndex = 3, dataIndex = 2 
    
     ************** Frame index 1 : Running float import ************* 
     
     ...
     
    ****************************************************
    **                ALL MODEL CHECK PASSED          **
    ****************************************************
    

    This issue, along with the error in the original post, means that we can't come up with a consistent method of importing models that include such reshape/flatten layers, even with the use of the deny_list.
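
    To illustrate why the deny_list cannot separate the two cases: both the flatten and the reshape are exported as the same RESHAPE builtin (code 22), which can be confirmed with TensorFlow's model analyzer (a sketch, assuming TF 2.7 or newer; the .tflite file names are placeholders for the attached samples):

    import tensorflow as tf

    # Prints each model's operator list; both report RESHAPE (builtin code 22),
    # so a deny_list entry of "22" necessarily denies the flatten and the reshape alike.
    tf.lite.experimental.Analyzer.analyze(model_path="flatten_fc.tflite")
    tf.lite.experimental.Analyzer.analyze(model_path="reshape_conv2d.tflite")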

    How can we proceed here?

  • Hi Michael,

    Regarding your question on the deny_list for the flatten layer, we have a Jira issue (https://jira.itg.ti.com/browse/TIDL-1816) being tracked internally for this, and the fix is slated for the SDK 8.4 release, so that the 2-dimensional input tensor here gets converted to 4 dimensions (1x16 to 1x1x1x16) before being consumed by TIDL.

    For the convolution issue, I am checking the model on my end and will post an update soon.

    Regards,

    Anand

  • Thanks for the update, I hope you're able to reproduce that Conv2D issue 👍