
TDA4VL-Q1: PSDK 9.2 :: Model Validation Issues

Part Number: TDA4VL-Q1
Other Parts Discussed in Thread: AM69A


Dear Sir,

We have a custom ONNX model which works as expected with tidl_j721e_08_02_00_11 (PSDK 8.2).

However, when the same model is imported with PSDK 9.2 (J721S2), we observe garbage values in its output in PC emulation.

We have also observed that the number of layers detected in the same model is 95 with PSDK 8.2, whereas with 9.2 it is 107.

What could be the possible reason for this, and how can we debug and resolve the issue?

Thanks and Regards,

Vyom Mishra

  • Hi,

    However, when the same model is imported with PSDK 9.2 (J721S2), we observe garbage values in its output in PC emulation.

    Please share some insights on this issue: are you using TIDLRT (the config-file-based flow) for model compilation and inference? And is it the 9.2.0.5 SDK that you are using here?

  • Dear Sir,

    Yes, I am using TIDLRT for compilation and inference, and the SDK is ti-processor-sdk-rtos-j721s2-evm-09_02_00_05.

    Thanks and Regards,

    Vyom Mishra

  • Vyom,

    Some details of the issue are still missing, but we will come back to those later. First, let's try model compilation with the latest TIDL tools:

    git clone https://github.com/TexasInstruments/edgeai-tidl-tools.git
    cd edgeai-tidl-tools
    git checkout <TAG Compatible with your SDK version>
    # Supported SOC name strings am62, am62a, am68a, am68pa, am69a, am67a
    export SOC=<Your SOC name>
    source ./setup.sh

     Here, <TAG Compatible with your SDK version> = 09_02_09_00

    Once the setup script runs, it will download the latest tools binaries (import and inference) into the tidl_tools folder.

    Use these binaries for model compilation and inference, and report your observations.

  • Dear Sir,

    Thanks for the response!

    I am trying to set this up; please find the issue in the log below.

    vyom@ai-lab:~/j721s2/ti-processor-sdk-rtos-j721s2-evm-09_02_00_05/edgeai-tidl-tools$ source ./setup.sh 
    Defaulting to CPU tools
    X64 Architecture
    Installing python packages...
    Defaulting to user installation because normal site-packages is not writeable
    Requirement already satisfied: pybind11[global] in /home/vyom/.local/lib/python3.10/site-packages (2.13.1)
    Requirement already satisfied: pybind11-global==2.13.1 in /home/vyom/.local/lib/python3.10/site-packages (from pybind11[global]) (2.13.1)
    Defaulting to user installation because normal site-packages is not writeable
    Collecting https://github.com/TexasInstruments/edgeai-caffe2onnx/archive/refs/heads/tidl.zip (from -r ./requirements_pc.txt (line 16))
      Using cached https://github.com/TexasInstruments/edgeai-caffe2onnx/archive/refs/heads/tidl.zip
      Preparing metadata (setup.py) ... done
    Requirement already satisfied: numpy==1.23.0 in /home/vyom/.local/lib/python3.10/site-packages (from -r ./requirements_pc.txt (line 1)) (1.23.0)
    Requirement already satisfied: pyyaml in /usr/lib/python3/dist-packages (from -r ./requirements_pc.txt (line 2)) (5.4.1)
    Requirement already satisfied: protobuf in /home/vyom/.local/lib/python3.10/site-packages (from -r ./requirements_pc.txt (line 3)) (3.20.3)
    Requirement already satisfied: onnx==1.13.0 in /home/vyom/.local/lib/python3.10/site-packages (from -r ./requirements_pc.txt (line 4)) (1.13.0)
    Requirement already satisfied: torch in /home/vyom/.local/lib/python3.10/site-packages (from -r ./requirements_pc.txt (line 5)) (2.3.1)
    Requirement already satisfied: timm in /home/vyom/.local/lib/python3.10/site-packages (from -r ./requirements_pc.txt (line 6)) (1.0.7)
    Requirement already satisfied: tflite in /home/vyom/.local/lib/python3.10/site-packages (from -r ./requirements_pc.txt (line 7)) (2.10.0)
    Requirement already satisfied: pillow in /usr/lib/python3/dist-packages (from -r ./requirements_pc.txt (line 8)) (9.0.1)
    Requirement already satisfied: flatbuffers==1.12.0 in /home/vyom/.local/lib/python3.10/site-packages (from -r ./requirements_pc.txt (line 9)) (1.12)
    Requirement already satisfied: requests in /usr/lib/python3/dist-packages (from -r ./requirements_pc.txt (line 10)) (2.25.1)
    Requirement already satisfied: opencv-python in /home/vyom/.local/lib/python3.10/site-packages (from -r ./requirements_pc.txt (line 11)) (4.10.0.84)
    Requirement already satisfied: pytest in /home/vyom/.local/lib/python3.10/site-packages (from -r ./requirements_pc.txt (line 12)) (8.2.2)
    Requirement already satisfied: graphviz in /home/vyom/.local/lib/python3.10/site-packages (from -r ./requirements_pc.txt (line 13)) (0.20.3)
    Requirement already satisfied: dataclasses in /home/vyom/.local/lib/python3.10/site-packages (from -r ./requirements_pc.txt (line 14)) (0.6)
    Requirement already satisfied: gluoncv in /home/vyom/.local/lib/python3.10/site-packages (from -r ./requirements_pc.txt (line 15)) (0.10.5.post0)
    Requirement already satisfied: typing-extensions>=3.6.2.1 in /home/vyom/.local/lib/python3.10/site-packages (from onnx==1.13.0->-r ./requirements_pc.txt (line 4)) (4.12.2)
    Requirement already satisfied: filelock in /home/vyom/.local/lib/python3.10/site-packages (from torch->-r ./requirements_pc.txt (line 5)) (3.15.4)
    Requirement already satisfied: sympy in /home/vyom/.local/lib/python3.10/site-packages (from torch->-r ./requirements_pc.txt (line 5)) (1.13.0)
    Requirement already satisfied: networkx in /home/vyom/.local/lib/python3.10/site-packages (from torch->-r ./requirements_pc.txt (line 5)) (3.3)
    Requirement already satisfied: jinja2 in /home/vyom/.local/lib/python3.10/site-packages (from torch->-r ./requirements_pc.txt (line 5)) (3.1.4)
    Requirement already satisfied: fsspec in /home/vyom/.local/lib/python3.10/site-packages (from torch->-r ./requirements_pc.txt (line 5)) (2024.6.1)
    Requirement already satisfied: nvidia-cuda-nvrtc-cu12==12.1.105 in /home/vyom/.local/lib/python3.10/site-packages (from torch->-r ./requirements_pc.txt (line 5)) (12.1.105)
    Requirement already satisfied: nvidia-cuda-runtime-cu12==12.1.105 in /home/vyom/.local/lib/python3.10/site-packages (from torch->-r ./requirements_pc.txt (line 5)) (12.1.105)
    Requirement already satisfied: nvidia-cuda-cupti-cu12==12.1.105 in /home/vyom/.local/lib/python3.10/site-packages (from torch->-r ./requirements_pc.txt (line 5)) (12.1.105)
    Requirement already satisfied: nvidia-cudnn-cu12==8.9.2.26 in /home/vyom/.local/lib/python3.10/site-packages (from torch->-r ./requirements_pc.txt (line 5)) (8.9.2.26)
    Requirement already satisfied: nvidia-cublas-cu12==12.1.3.1 in /home/vyom/.local/lib/python3.10/site-packages (from torch->-r ./requirements_pc.txt (line 5)) (12.1.3.1)
    Requirement already satisfied: nvidia-cufft-cu12==11.0.2.54 in /home/vyom/.local/lib/python3.10/site-packages (from torch->-r ./requirements_pc.txt (line 5)) (11.0.2.54)
    Requirement already satisfied: nvidia-curand-cu12==10.3.2.106 in /home/vyom/.local/lib/python3.10/site-packages (from torch->-r ./requirements_pc.txt (line 5)) (10.3.2.106)
    Requirement already satisfied: nvidia-cusolver-cu12==11.4.5.107 in /home/vyom/.local/lib/python3.10/site-packages (from torch->-r ./requirements_pc.txt (line 5)) (11.4.5.107)
    Requirement already satisfied: nvidia-cusparse-cu12==12.1.0.106 in /home/vyom/.local/lib/python3.10/site-packages (from torch->-r ./requirements_pc.txt (line 5)) (12.1.0.106)
    Requirement already satisfied: nvidia-nccl-cu12==2.20.5 in /home/vyom/.local/lib/python3.10/site-packages (from torch->-r ./requirements_pc.txt (line 5)) (2.20.5)
    Requirement already satisfied: nvidia-nvtx-cu12==12.1.105 in /home/vyom/.local/lib/python3.10/site-packages (from torch->-r ./requirements_pc.txt (line 5)) (12.1.105)
    Requirement already satisfied: triton==2.3.1 in /home/vyom/.local/lib/python3.10/site-packages (from torch->-r ./requirements_pc.txt (line 5)) (2.3.1)
    Requirement already satisfied: nvidia-nvjitlink-cu12 in /home/vyom/.local/lib/python3.10/site-packages (from nvidia-cusolver-cu12==11.4.5.107->torch->-r ./requirements_pc.txt (line 5)) (12.5.82)
    Requirement already satisfied: torchvision in /home/vyom/.local/lib/python3.10/site-packages (from timm->-r ./requirements_pc.txt (line 6)) (0.18.1)
    Requirement already satisfied: huggingface_hub in /home/vyom/.local/lib/python3.10/site-packages (from timm->-r ./requirements_pc.txt (line 6)) (0.23.4)
    Requirement already satisfied: safetensors in /home/vyom/.local/lib/python3.10/site-packages (from timm->-r ./requirements_pc.txt (line 6)) (0.4.3)
    Requirement already satisfied: iniconfig in /home/vyom/.local/lib/python3.10/site-packages (from pytest->-r ./requirements_pc.txt (line 12)) (2.0.0)
    Requirement already satisfied: packaging in /home/vyom/.local/lib/python3.10/site-packages (from pytest->-r ./requirements_pc.txt (line 12)) (24.1)
    Requirement already satisfied: pluggy<2.0,>=1.5 in /home/vyom/.local/lib/python3.10/site-packages (from pytest->-r ./requirements_pc.txt (line 12)) (1.5.0)
    Requirement already satisfied: exceptiongroup>=1.0.0rc8 in /home/vyom/.local/lib/python3.10/site-packages (from pytest->-r ./requirements_pc.txt (line 12)) (1.2.2)
    Requirement already satisfied: tomli>=1 in /home/vyom/.local/lib/python3.10/site-packages (from pytest->-r ./requirements_pc.txt (line 12)) (2.0.1)
    Requirement already satisfied: tqdm in /home/vyom/.local/lib/python3.10/site-packages (from gluoncv->-r ./requirements_pc.txt (line 15)) (4.66.4)
    Requirement already satisfied: matplotlib in /home/vyom/.local/lib/python3.10/site-packages (from gluoncv->-r ./requirements_pc.txt (line 15)) (3.9.1)
    Requirement already satisfied: portalocker in /home/vyom/.local/lib/python3.10/site-packages (from gluoncv->-r ./requirements_pc.txt (line 15)) (2.10.1)
    Requirement already satisfied: scipy in /home/vyom/.local/lib/python3.10/site-packages (from gluoncv->-r ./requirements_pc.txt (line 15)) (1.13.1)
    Requirement already satisfied: yacs in /home/vyom/.local/lib/python3.10/site-packages (from gluoncv->-r ./requirements_pc.txt (line 15)) (0.1.8)
    Requirement already satisfied: pandas in /home/vyom/.local/lib/python3.10/site-packages (from gluoncv->-r ./requirements_pc.txt (line 15)) (2.2.2)
    Requirement already satisfied: autocfg in /home/vyom/.local/lib/python3.10/site-packages (from gluoncv->-r ./requirements_pc.txt (line 15)) (0.0.8)
    Requirement already satisfied: MarkupSafe>=2.0 in /usr/lib/python3/dist-packages (from jinja2->torch->-r ./requirements_pc.txt (line 5)) (2.0.1)
    Requirement already satisfied: contourpy>=1.0.1 in /home/vyom/.local/lib/python3.10/site-packages (from matplotlib->gluoncv->-r ./requirements_pc.txt (line 15)) (1.2.1)
    Requirement already satisfied: cycler>=0.10 in /home/vyom/.local/lib/python3.10/site-packages (from matplotlib->gluoncv->-r ./requirements_pc.txt (line 15)) (0.12.1)
    Requirement already satisfied: fonttools>=4.22.0 in /home/vyom/.local/lib/python3.10/site-packages (from matplotlib->gluoncv->-r ./requirements_pc.txt (line 15)) (4.53.1)
    Requirement already satisfied: kiwisolver>=1.3.1 in /home/vyom/.local/lib/python3.10/site-packages (from matplotlib->gluoncv->-r ./requirements_pc.txt (line 15)) (1.4.5)
    Requirement already satisfied: pyparsing>=2.3.1 in /usr/lib/python3/dist-packages (from matplotlib->gluoncv->-r ./requirements_pc.txt (line 15)) (2.4.7)
    Requirement already satisfied: python-dateutil>=2.7 in /home/vyom/.local/lib/python3.10/site-packages (from matplotlib->gluoncv->-r ./requirements_pc.txt (line 15)) (2.9.0.post0)
    Requirement already satisfied: pytz>=2020.1 in /usr/lib/python3/dist-packages (from pandas->gluoncv->-r ./requirements_pc.txt (line 15)) (2022.1)
    Requirement already satisfied: tzdata>=2022.7 in /home/vyom/.local/lib/python3.10/site-packages (from pandas->gluoncv->-r ./requirements_pc.txt (line 15)) (2024.1)
    Requirement already satisfied: mpmath<1.4,>=1.1.0 in /home/vyom/.local/lib/python3.10/site-packages (from sympy->torch->-r ./requirements_pc.txt (line 5)) (1.3.0)
    Requirement already satisfied: six>=1.5 in /usr/lib/python3/dist-packages (from python-dateutil>=2.7->matplotlib->gluoncv->-r ./requirements_pc.txt (line 15)) (1.16.0)
    Defaulting to user installation because normal site-packages is not writeable
    Requirement already satisfied: pybind11[global] in /home/vyom/.local/lib/python3.10/site-packages (2.13.1)
    Requirement already satisfied: pybind11-global==2.13.1 in /home/vyom/.local/lib/python3.10/site-packages (from pybind11[global]) (2.13.1)
    Installing python osrt packages...
    Defaulting to user installation because normal site-packages is not writeable
    Requirement already satisfied: pip in /home/vyom/.local/lib/python3.10/site-packages (24.1.2)
    Requirement already satisfied: setuptools in /home/vyom/.local/lib/python3.10/site-packages (70.3.0)
    Installing python packages...
    pip3 install --no-input wheel
    Defaulting to user installation because normal site-packages is not writeable
    Requirement already satisfied: wheel in /usr/lib/python3/dist-packages (0.37.1)
    pip3 install --no-input numpy==1.23.0
    Defaulting to user installation because normal site-packages is not writeable
    Requirement already satisfied: numpy==1.23.0 in /home/vyom/.local/lib/python3.10/site-packages (1.23.0)
    pip3 install --no-input protobuf==3.20.3
    Defaulting to user installation because normal site-packages is not writeable
    Requirement already satisfied: protobuf==3.20.3 in /home/vyom/.local/lib/python3.10/site-packages (3.20.3)
    pip3 install --no-input onnx==1.13.0
    Defaulting to user installation because normal site-packages is not writeable
    Requirement already satisfied: onnx==1.13.0 in /home/vyom/.local/lib/python3.10/site-packages (1.13.0)
    Requirement already satisfied: numpy>=1.16.6 in /home/vyom/.local/lib/python3.10/site-packages (from onnx==1.13.0) (1.23.0)
    Requirement already satisfied: protobuf<4,>=3.20.2 in /home/vyom/.local/lib/python3.10/site-packages (from onnx==1.13.0) (3.20.3)
    Requirement already satisfied: typing-extensions>=3.6.2.1 in /home/vyom/.local/lib/python3.10/site-packages (from onnx==1.13.0) (4.12.2)
    pip3 install --no-input onnxsim==0.4.35
    Defaulting to user installation because normal site-packages is not writeable
    Requirement already satisfied: onnxsim==0.4.35 in /home/vyom/.local/lib/python3.10/site-packages (0.4.35)
    Requirement already satisfied: onnx in /home/vyom/.local/lib/python3.10/site-packages (from onnxsim==0.4.35) (1.13.0)
    Requirement already satisfied: rich in /home/vyom/.local/lib/python3.10/site-packages (from onnxsim==0.4.35) (13.7.1)
    Requirement already satisfied: numpy>=1.16.6 in /home/vyom/.local/lib/python3.10/site-packages (from onnx->onnxsim==0.4.35) (1.23.0)
    Requirement already satisfied: protobuf<4,>=3.20.2 in /home/vyom/.local/lib/python3.10/site-packages (from onnx->onnxsim==0.4.35) (3.20.3)
    Requirement already satisfied: typing-extensions>=3.6.2.1 in /home/vyom/.local/lib/python3.10/site-packages (from onnx->onnxsim==0.4.35) (4.12.2)
    Requirement already satisfied: markdown-it-py>=2.2.0 in /home/vyom/.local/lib/python3.10/site-packages (from rich->onnxsim==0.4.35) (3.0.0)
    Requirement already satisfied: pygments<3.0.0,>=2.13.0 in /home/vyom/.local/lib/python3.10/site-packages (from rich->onnxsim==0.4.35) (2.18.0)
    Requirement already satisfied: mdurl~=0.1 in /home/vyom/.local/lib/python3.10/site-packages (from markdown-it-py>=2.2.0->rich->onnxsim==0.4.35) (0.1.2)
    pip3 install --no-input git+https://github.com/NVIDIA/TensorRT@release/8.5#subdirectory=tools/onnx-graphsurgeon
    Defaulting to user installation because normal site-packages is not writeable
    Collecting git+https://github.com/NVIDIA/TensorRT@release/8.5#subdirectory=tools/onnx-graphsurgeon
      Cloning https://github.com/NVIDIA/TensorRT (to revision release/8.5) to /tmp/pip-req-build-7tqqmvxs
      Running command git clone --filter=blob:none --quiet https://github.com/NVIDIA/TensorRT /tmp/pip-req-build-7tqqmvxs
      Running command git checkout -b release/8.5 --track origin/release/8.5
      Switched to a new branch 'release/8.5'
      Branch 'release/8.5' set up to track remote branch 'release/8.5' from 'origin'.
      Resolved https://github.com/NVIDIA/TensorRT to commit 68b5072fdb9df6b6edab1392b02a705394b2e906
      Running command git submodule update --init --recursive -q
      Preparing metadata (setup.py) ... done
    Requirement already satisfied: numpy in /home/vyom/.local/lib/python3.10/site-packages (from onnx_graphsurgeon==0.3.26) (1.23.0)
    Requirement already satisfied: onnx in /home/vyom/.local/lib/python3.10/site-packages (from onnx_graphsurgeon==0.3.26) (1.13.0)
    Requirement already satisfied: protobuf<4,>=3.20.2 in /home/vyom/.local/lib/python3.10/site-packages (from onnx->onnx_graphsurgeon==0.3.26) (3.20.3)
    Requirement already satisfied: typing-extensions>=3.6.2.1 in /home/vyom/.local/lib/python3.10/site-packages (from onnx->onnx_graphsurgeon==0.3.26) (4.12.2)
    installing the onnx graph optimization toolkit...
    version_file=/home/vyom/j721s2/ti-processor-sdk-rtos-j721s2-evm-09_02_00_05/edgeai-tidl-tools/scripts/osrt_model_tools/onnx_tools/tidl-onnx-model-optimizer/version.py
    running develop
    /home/vyom/.local/lib/python3.10/site-packages/setuptools/command/develop.py:42: EasyInstallDeprecationWarning: easy_install command is deprecated.
    !!
    
            ********************************************************************************
            Please avoid running ``setup.py`` and ``easy_install``.
            Instead, use pypa/build, pypa/installer or other
            standards-based tools.
    
            See https://github.com/pypa/setuptools/issues/917 for details.
            ********************************************************************************
    
    !!
      easy_install.initialize_options(self)
    /home/vyom/.local/lib/python3.10/site-packages/setuptools/_distutils/cmd.py:66: SetuptoolsDeprecationWarning: setup.py install is deprecated.
    !!
    
            ********************************************************************************
            Please avoid running ``setup.py`` directly.
            Instead, use pypa/build, pypa/installer or other
            standards-based tools.
    
            See https://blog.ganssle.io/articles/2021/10/setup-py-deprecated.html for details.
            ********************************************************************************
    
    !!
      self.initialize_options()
    error: can't create or remove files in install directory
    
    The following error occurred while trying to add or remove files in the
    installation directory:
    
        [Errno 13] Permission denied: '/usr/local/lib/python3.10/dist-packages/test-easy-install-40755.write-test'
    
    The installation directory you specified (via --install-dir, --prefix, or
    the distutils default setting) was:
    
        /usr/local/lib/python3.10/dist-packages/
    
    Perhaps your account does not have write access to this directory?  If the
    installation directory is a system-owned directory, you may need to sign in
    as the administrator or "root" account.  If you do not have administrative
    access to this machine, you may wish to choose a different installation
    directory, preferably one that is listed in your PYTHONPATH environment
    variable.
    
    For information on other options, you may wish to consult the
    documentation at:
    
      https://setuptools.pypa.io/en/latest/deprecated/easy_install.html
    
    Please make the appropriate changes for your system and try again.
    

    The setup halts.

    Please let me know how I can resolve this. I have tried to grant write permission to the Python directory with sudo, but it is not allowed.

    Thanks and Regards,

    Vyom Mishra

  • Hi Vyom,

    This seems to be a minor setup issue. Can you please go through the user guide and check the environment-setup aspects of edgeai-tidl-tools?

    https://github.com/TexasInstruments/edgeai-tidl-tools/tree/master
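
    The permission error indicates that the easy_install step of the onnx graph optimization toolkit is trying to write into the system-wide /usr/local/lib/python3.10/dist-packages. One general (not TI-specific) way to avoid this is to run setup.sh from inside a Python virtual environment so that all packages land in a user-writable location. A minimal sketch, where the environment name tidl_env is an arbitrary choice:

    # create_env.py -- create a user-writable virtual environment for edgeai-tidl-tools
    # (the environment name "tidl_env" is arbitrary)
    import venv

    # Create the environment with pip available; then activate it in the shell
    # (source tidl_env/bin/activate) and re-run: source ./setup.sh
    venv.create("tidl_env", with_pip=True)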

  • Dear Sir,

    Thanks for the response!

    We have observed the same behaviour with the EdgeAI import and inference flow.

    The number of layers detected with PSDK 9.2 (both the TIDL and EdgeAI flows) is still higher than with PSDK 8.2, and no detections are observed.

    FYI:

    - The import configuration file, the post-processing code, and the ONNX model file are the same for the PSDK 8.2 and PSDK 9.2 experiments.

    If you have some suggestions for debugging, do let me know.

    Thanks and Regards,

    Vyom Mishra

  • Hi,

    I hope you have tried the 9.2.9.0 tools and are drawing this conclusion from those.

    As a next step, can we do a layer-level diff of the 8.2 model data against 9.2 and figure out the first layer that shows a deviation?

    Also, on the 9.2 SDK, have you verified the layer-level dump comparison between the PC and target flows? Does it match?

    Here is the documentation on that: https://github.com/TexasInstruments/edgeai-tidl-tools/blob/09_02_09_00/docs/tidl_osr_debug.md#steps-to-debug-error-scenarios-for-targetevmdevice-execution
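
    For reference, a minimal sketch of such a layer-level diff, assuming both runs have trace dumps enabled and write per-layer float traces as raw binary files with matching file names into two directories (the directory names, dtype, and tolerance below are assumptions, not the exact TIDL trace naming convention):

    # compare_traces.py -- report the first per-layer trace that deviates between two runs
    import os
    import numpy as np

    DIR_82 = "traces_psdk_8_2"   # assumed location of the 8.2 layer traces
    DIR_92 = "traces_psdk_9_2"   # assumed location of the 9.2 layer traces

    for name in sorted(os.listdir(DIR_82)):
        ref = np.fromfile(os.path.join(DIR_82, name), dtype=np.float32)
        new = np.fromfile(os.path.join(DIR_92, name), dtype=np.float32)
        if ref.size != new.size:
            print(f"{name}: size mismatch ({ref.size} vs {new.size})")
            break
        max_abs_diff = float(np.abs(ref - new).max())
        print(f"{name}: max abs diff = {max_abs_diff:.6f}")
        if max_abs_diff > 1e-2:           # tolerance is an assumption; tune as needed
            print(f"First deviating layer trace: {name}")
            break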

  • Dear Sir,

    Thanks for the response!

    I found that the Deconvolution layer has an issue.

    Please find the observations below.

    - In both the 8.2 and 9.2 PSDKs, the import tool gives the suggestion below (an illustrative sketch of the suggested alternative follows the checker output).

     

    ****************************************************
    **               TIDL Model Checker               **
    ****************************************************
    SUGGESTION: [TIDL_Deconv2DLayer] ConvTranspose_115 Please change to Upsample/Resize if possible. Upsample/Resize will be more efficient.
    SUGGESTION: [TIDL_Deconv2DLayer] ConvTranspose_103 Please change to Upsample/Resize if possible. Upsample/Resize will be more efficient.
    SUGGESTION: [TIDL_Deconv2DLayer] ConvTranspose_118 Please change to Upsample/Resize if possible. Upsample/Resize will be more efficient.
    SUGGESTION: [TIDL_Deconv2DLayer] ConvTranspose_97 Please change to Upsample/Resize if possible. Upsample/Resize will be more efficient.
    SUGGESTION: [TIDL_Deconv2DLayer] ConvTranspose_106 Please change to Upsample/Resize if possible. Upsample/Resize will be more efficient.
    SUGGESTION: [TIDL_Deconv2DLayer] ConvTranspose_121 Please change to Upsample/Resize if possible. Upsample/Resize will be more efficient.
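
    For context on what the checker is suggesting, here is a minimal PyTorch sketch of a 2x upsampling block built from Upsample/Resize plus a regular convolution instead of a ConvTranspose. Note that this is a model-design change and would require retraining; the kernel size, stride, and channel counts below are illustrative, not taken from the actual model:

    import torch
    import torch.nn as nn

    # Deconvolution-based 2x upsampling (roughly what the model uses today)
    deconv_up = nn.ConvTranspose2d(24, 24, kernel_size=4, stride=2, padding=1)

    # Upsample/Resize-based alternative suggested by the TIDL Model Checker:
    # a nearest-neighbour resize followed by a plain 3x3 convolution
    resize_up = nn.Sequential(
        nn.Upsample(scale_factor=2, mode="nearest"),
        nn.Conv2d(24, 24, kernel_size=3, padding=1),
    )

    x = torch.randn(1, 24, 40, 64)
    print(deconv_up(x).shape)   # torch.Size([1, 24, 80, 128])
    print(resize_up(x).shape)   # torch.Size([1, 24, 80, 128])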

    - In 9.2 inference, the output after the TIDL_DataConvertLayer has garbage values

      64        85.33334         0.00000         2.75625 0
    End of Layer # -   64 with outPtrs[0] = 0x7adfc3e4f500
    Core 0 Alg Process for Layer # -   65, layer type 11 
    Processing Layer # -   65
      65        42.66667        -1.45313         2.45625 1
    End of Layer # -   65 with outPtrs[0] = 0x7adfc371d500
    Core 0 Alg Process for Layer # -   66, layer type 11 
    Processing Layer # -   66
      66         0.00000         0.00000 187701539358850270197811201655701504.00000 1

    Could you please suggest how to debug this further?

    Thanks and Regards,

    Vyom Mishra

  • Hi,

    - In 9.2 inference, the output after the TIDL_DataConvertLayer has garbage values

    Can you share how you figured this out?

    Also, is it possible to share the layer-level dumps from the 8.6 and 9.2 SDKs, and the artifacts? Is it possible to get the model file?

    Can you share the graph (as mentioned in the troubleshooting document) of a few layers prior to the layer where the issue occurs, along with the layer where the issue is visible, for our reference?

    Thanks

  • Dear Sir,

    Thanks for the response!

    Can you share how you figured this out?

    I compared the inference logs generated with 8.2 and 9.2 and found that some garbage values are observed in the 9.2 layer dumps, which is not the case with 8.2. For reference, I have already shown this in my previous reply.

    Also, is it possible to share the layer-level dumps from the 8.6 and 9.2 SDKs, and the artifacts? Is it possible to get the model file?

    Apologies, the custom model file cannot be shared.

    Can you share the graph (as mentioned in the troubleshooting document) of a few layers prior to the layer where the issue occurs, along with the layer where the issue is visible, for our reference?

    Yes, please find the reference below.

    Imported TIDL Model

    61|TIDL_ConvolutionLayer         |onnx::Conv_545                                    |     0|     1|     1| 60   x   x   x   x   x   x   x | 61       |       1        1        1      160       10       16 |       1        1        1      960       10       16 |  24729600 |
    62|TIDL_ConvolutionLayer         |onnx::Conv_548                                    |     0|     1|     1| 61   x   x   x   x   x   x   x | 62       |       1        1        1      960       10       16 |       1        1        1      960       10       16 |   1536000 |
    63|TIDL_ConvolutionLayer         |input.404                                         |     0|     1|     1| 62   x   x   x   x   x   x   x | 63       |       1        1        1      960       10       16 |       1        1        1      320       10       16 |  49152000 |
    64|TIDL_ConvolutionLayer         |onnx::ConvTranspose_553_netFormat                 |     0|     1|     1| 63   x   x   x   x   x   x   x | 64       |       1        1        1      320       10       16 |       1        1        1       96       10       16 |   4915200 |
    65|TIDL_DataConvertLayer         |onnx::ConvTranspose_577                           |     0|     1|     1| 23   x   x   x   x   x   x   x | 65       |       1        1        1       24       40       64 |       1        1        1       24       40       64 |    245760 |
    66|TIDL_DataConvertLayer         |onnx::ConvTranspose_561                           |     0|     1|     1| 50   x   x   x   x   x   x   x | 66       |       1        1        1       32       20       32 |       1        1        1       32       20       32 |     81920 |
    67|TIDL_DataConvertLayer         |onnx::ConvTranspose_553                           |     0|     1|     1| 64   x   x   x   x   x   x   x | 67       |       1        1        1       96       10       16 |       1        1        1       96       10       16 |     61440 |
    68|TIDL_Deconv2DLayer            |onnx::Concat_578                                  |     0|     1|     1| 65   x   x   x   x   x   x   x | 68       |       1        1        1       24       40       64 |       1        1        1       24       80      128 |   3932160 |
    69|TIDL_Deconv2DLayer            |onnx::Concat_562                                  |     0|     1|     1| 66   x   x   x   x   x   x   x | 69       |       1        1        1       32       20       32 |       1        1        1       32       40       64 |   1310720 |

    Thanks and Regards,

    Vyom Mishra

  • Vyom,

    It would be great if you could share the toy model (not the original) so that we can also validate this at our end; it will serve as a test case for us.

    Given the current situation, I am hoping (please confirm as well) that you have generated the layer-level diff using model artifacts generated with the 9.2.9.0 SDK tools and are doing target inference on the 9.2 baseline firmware.

    Based on your response,

    - In 9.2 inference, the output after the TIDL_DataConvertLayer has garbage values

    Is this the TIDL_Deconv2DLayer you are talking about? Do all the layers above it match when compared with 8.6, except the TIDL_Deconv2DLayer? Is my understanding correct here?

    68|TIDL_Deconv2DLayer            |onnx::Concat_578                                  |     0|     1|     1| 65   x   x   x   x   x   x   x | 68       |       1        1        1       24       40       64 |       1        1        1       24       80      128 |   3932160 |

    If that's the case, I would recommend creating a toy model file where the highlighted issue is visible and sharing it with us, so that we can reproduce it and perform RCA.

    Thanks 

  • Dear Sir,

    Thanks for your response!

    We have explored a few open-source models and tried to replicate the same issue with them. We have compared the 8.2 and 9.2 import behaviour.

    We are in the process of generating a TOY model for reference. Please allow us some time for this.

    Thanks and Regards,

    Vyom Mishra

    Sure, looking forward to the toy model as a test case.

    Also, please help me understand the details asked here:

    Given the current situation, I am hoping (please confirm as well) that you have generated the layer-level diff using model artifacts generated with the 9.2.9.0 SDK tools and are doing target inference on the 9.2 baseline firmware.

    Based on your response,

    - In 9.2 inference, the output after the TIDL_DataConvertLayer has garbage values

    Is this the TIDL_Deconv2DLayer you are talking about? Do all the layers above it match when compared with 8.6, except the TIDL_Deconv2DLayer? Is my understanding correct here?

    68|TIDL_Deconv2DLayer            |onnx::Concat_578                                  |     0|     1|     1| 65   x   x   x   x   x   x   x | 68       |       1        1        1       24       40       64 |       1        1        1       24       80      128 |   3932160 |

    Also requesting you to share, 

    As a next step, can we do a layer-level diff of the 8.2 model data against 9.2 and figure out the first layer that shows a deviation?

    Also, on the 9.2 SDK, have you verified the layer-level dump comparison between the PC and target flows? Does it match?

    Here is the documentation on that: https://github.com/TexasInstruments/edgeai-tidl-tools/blob/09_02_09_00/docs/tidl_osr_debug.md#steps-to-debug-error-scenarios-for-targetevmdevice-execution

    1) Have you done a float vs. fixed-point trace diff for the suspicious layer? (Asking one more time: is it the TIDL_Deconv2DLayer?)

    2) Can you share the diff of the 8.2 fixed-point trace of the suspicious layer against 9.2.9.0? Please check the links above to understand more about this.

  • Dear Sir,

    I am not able to upload the toy model here due to its size. Could you please share an email ID so that I can mail it to you?

    Thanks and Regards,

    Vyom Mishra

  • Dear Sir,

    Please find the toy model for your reference.

    toy.zip

    Thanks and Regards,

    Vyom Mishra

  • Vyom,

    Good to see that you were able to attach the file.

    Let me add a few things here.

    Since this thread has been active for the last month, we have released the 10.0 SDK with robustness fixes. Since you already have the setup ready, can you please verify the results with it and let us know?

    There are one or two outcomes we can expect here: if the latest tools and firmware resolve the issue, well and good; if not, I have the model file and can continue the RCA on the latest SDK and support it.

    Thanks for your kind understanding 

    Pratik

  • Dear Sir,

    Thanks for the response!

    We experimented with the model on PSDK 10, and the same behaviour is observed for the Deconv2D layer.

    Regards,

    Vyom Mishra

  • Vyom,

    Thanks for the quick confirmation. I will take it from here and update you on the progress.

  • Dear Sir,

    Any Updates?


    Thanks and Regards,

    Vyom Mishra

  • Dear Sir,

    Any Updates?

    Is this issue observed on your side also?


    Thanks and Regards,

    Vyom Mishra

  • Hi Vyom,

    Due to limited bandwidth, I was unable to perform the RCA for the scenario shared above. I plan to spend some time on it in the coming week and will get back to you once I have an update.

    Thanks 

  • Dear Sir,

    Is there any update?

    Thanks and Regards,

    Vyom Mishra

  • Dear Sir,

    Did you get any time to look into this?

    Thanks and Regards,

    Vyom Mishra

  • Dear Sir,

    Gentle Reminder!

    It is important for us to resolve this issue; could you please spend some time on this scenario?

    Thanks and Regards,

    Vyom Mishra

  • Hi Vyom;

    Are you still waiting for an answer to this issue? If so, please let us know. We will get the team involved to answer it.

    Thanks and regards

    Wen Li 

  • Hi Wen Li,

    I am still waiting for the response.

    As per the request, I shared the toy (dummy) model earlier; please do check it.

    Thanks and Regards,

    Vyom Mishra

  • Hi Vyom;

    This is an old thread on which Pratik communicated with you, but Pratik is no longer working on this, so I will look into it. I could not find the dummy model you mentioned. Could you send it to us again?

    Thank you!

    Wen Li 

  • Hi Wen Li,

    Thanks for the response.

    Please find the dummy model for your reference: /cfs-file/__key/communityserver-discussions-components-files/791/5444.toy.zip

    Thanks and Regards,

    Vyom Mishra

  • Hi Vyom;

    Thanks for the model. We will look into it.

    Due to the holiday season, we may take a longer time to investigate this.

    Thanks and regards

    Wen Li  

  • Hi Vyom,

    I will be working on this thread and try out your model shortly.  

    Chris

    The model has an array size that requires 100 GiB of memory. That is why it is failing. There may be more issues.

    Unable to allocate 100. GiB for an array with shape (320, 512, 320, 512) and data type float32
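
    For reference, that figure follows directly from the reported tensor shape:

    # 320 * 512 * 320 * 512 float32 elements, 4 bytes each
    bytes_needed = 320 * 512 * 320 * 512 * 4
    print(bytes_needed / 2**30)   # 100.0 (GiB)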

    Chris

  • Hey Chris, 

    Thanks for the response!

    We were unable to share our original model, so we tried an open-source model to replicate our issue; that is what was shared as the toy model.

    We have found the information below:

    TIDL-4455

    TransposeConvolution(Deconv) output is not functionally correct on host emulation and target if TIDL-RT based config files are used during compilation

    Source: https://github.com/TexasInstruments/edgeai-tidl-tools/releases

    This is the issue with our model; could you please help us with an alternative?


    Thanks and Regards,

    Vyom Mishra

  • Hi Vyom,

    This issue was resolved in release 10.00.02.00.

    Chris

  • Dear Chris,

    We have tried versions 10_00_02_00 and 10_00_08_00, but the Deconvolution issue still exists.

    FYI, we are following this process for the import: https://github.com/TexasInstruments/edgeai-tidl-tools/blob/master/examples/jupyter_notebooks/colab/tidlrt_tools.ipynb


    Thanks and Regards,

    Vyom Mishra

  • Hi Vyom,

    It is supposed to work. Can you please send me the output of the compilation so we can see the exact version being run? Also, if possible, can you please send me the ONNX model, and I will try it on my end?

    Chris

  • Dear Chris, 

    We have found a toy model which passes with PSDK 8.2 but fails with the 10_00_08_00 SDK.

    Please check and let us know your opinion.

    dlav0_34_241218.zip

    Thanks and Regards,

    Vyom Mishra

  • Hi Vyom,

    The model you sent appears to work on 10_00_08. Here is the compilation on the host, and the execution on the host and device.

    Compile:

    Compiling:  dlav0_34_241218.onnx
    ========================= [Model Compilation Started] =========================
    
    Model compilation will perform the following stages:
    1. Parsing
    2. Graph Optimization
    3. Quantization & Calibration
    4. Memory Planning
    
    ============================== [Version Summary] ==============================
    
    -------------------------------------------------------------------------------
    |          TIDL Tools Version          |              10_00_08_00             |
    -------------------------------------------------------------------------------
    |         C7x Firmware Version         |              10_00_02_00             |
    -------------------------------------------------------------------------------
    
    ONNX model (Proto) file      : dlav0_34_241218.onnx  
    TIDL network file            : dlav0_34_241218//tidl_net.bin  
    TIDL IO info file            : dlav0_34_241218//tidl_io_buff  
    Current ONNX OpSet version   : 11  
    ============================ [Optimization started] ============================
    
    ----------------------------- Optimization Summary -----------------------------
    --------------------------------------------------------------------------------
    |         Layer         | Nodes before optimization | Nodes after optimization |
    --------------------------------------------------------------------------------
    | TIDL_EltWiseLayer     |                        12 |                       12 |
    | TIDL_Deconv2DLayer    |                         6 |                        6 |
    | TIDL_ConcatLayer      |                        12 |                       12 |
    | TIDL_ReLULayer        |                        48 |                        0 |
    | TIDL_ConvolutionLayer |                        55 |                       55 |
    | TIDL_PoolingLayer     |                         6 |                        6 |
    --------------------------------------------------------------------------------
    
    =========================== [Optimization completed] ===========================
    
    
    -------- Running Calibration in Float Mode to Collect Tensor Statistics --------
    [=============================================================================] 100 %
    
    ------------------ Fixed-point Calibration Iteration [1 / 1]: ------------------
    [=============================================================================] 100 %
    
    ==================== [Quantization & Calibration Completed] ====================
    
    ========================== [Memory Planning Started] ==========================
    
    
    ------------------------- Network Compiler Traces ------------------------------
    Successful Memory Allocation
    Successful Workload Creation
    
    ========================= [Memory Planning Completed] =========================
    
    [TIDL Import]  WARNING: Change to Upsample/Resize if possible instead of Deconvolution. It will be more efficient
    [TIDL Import]  WARNING: Change to Upsample/Resize if possible instead of Deconvolution. It will be more efficient
    [TIDL Import]  WARNING: Change to Upsample/Resize if possible instead of Deconvolution. It will be more efficient
    [TIDL Import]  WARNING: Change to Upsample/Resize if possible instead of Deconvolution. It will be more efficient
    [TIDL Import]  WARNING: Change to Upsample/Resize if possible instead of Deconvolution. It will be more efficient
    [TIDL Import]  WARNING: Change to Upsample/Resize if possible instead of Deconvolution. It will be more efficient
    ======================== Subgraph Compiled Successfully ========================
    

    Host Execution:

    *************** Running dlav0_34_241218 **************************
    
    Processing config file #0 : /opt/dlav0_34_241218/config 
    Input : dataId=0, name=input.1_original, elementType 0, scale=1.000000, zero point=0, layout=0
    Ouput : dataId=101, name=508, elementType 1, scale=3.222945, zero point=0, layout=0 
     Ouput : dataId=102, name=511, elementType 1, scale=0.770903, zero point=0, layout=0 
     Ouput : dataId=103, name=514, elementType 1, scale=85.225433, zero point=0, layout=0 
         19589468,     18.682 0xffff3db42010
    worstCaseDelay for Pre-emption is 0.8372200 
    Network File Read done
    APP: Init ... !!!
    2918920.712111 s: MEM: Init ... !!!
    2918920.712153 s: MEM: Initialized DMA HEAP (fd=5) !!!
    2918920.712299 s: MEM: Init ... Done !!!
    2918920.712310 s: IPC: Init ... !!!
    2918920.742932 s: IPC: Init ... Done !!!
    REMOTE_SERVICE: Init ... !!!
    REMOTE_SERVICE: Init ... Done !!!
    2918920.751528 s: GTC Frequency = 200 MHz
    APP: Init ... Done !!!
    2918920.751599 s:  VX_ZONE_INIT:Enabled
    2918920.751602 s:  VX_ZONE_ERROR:Enabled
    2918920.751605 s:  VX_ZONE_WARNING:Enabled
    2918920.752188 s:  VX_ZONE_INIT:[tivxPlatformCreateTargetId:124] Added target MPU-0 
    2918920.752298 s:  VX_ZONE_INIT:[tivxPlatformCreateTargetId:124] Added target MPU-1 
    2918920.752370 s:  VX_ZONE_INIT:[tivxPlatformCreateTargetId:124] Added target MPU-2 
    2918920.752434 s:  VX_ZONE_INIT:[tivxPlatformCreateTargetId:124] Added target MPU-3 
    2918920.752438 s:  VX_ZONE_INIT:[tivxInitLocal:136] Initialization Done !!!
    2918920.752774 s:  VX_ZONE_INIT:[tivxHostInitLocal:106] Initialization Done for HOST !!!
    RT-Profile: TIDLRT_init_profiling 
    tidlrt_create            :      159203682 ns,
    tidl_rt_ovx_Init         :       42698695 ns,
    vxCreateContext          :         749154 ns,
    init_tidl_tiovx          :       22017236 ns,
    create_graph_tidl_tiovx  :         554480 ns,
    verify_graph_tidl_tiovx  :       90045893 ns,
    tivxTIDLLoadKernels      :          17721 ns,
    mapConfig                :         776004 ns,
    tivxAddKernelTIDL        :          88717 ns,
    mapNetwork               :       20585883 ns,
    setCreateParams          :         250820 ns,
    setArgs                  :         295761 ns,
    vxCreateUserDataObject   :          28466 ns,
    vxMapUserDataObject      :       13327571 ns,
    memcopy_network_buffer   :        7222626 ns,
    vxUnmapUserDataObject    :           4635 ns,
    
    # NETWORK_INIT_TIME =   159.67 (in ms, c7x @1GHz)
     Freeing memory for user provided Net
    
     Instance created for  /opt/dlav0_34_241218/config
    
    Processing Cnt :    0, InstCnt :    0 /opt/dlav0_34_241218/tidl_net.bin!
    /home/a0194920local/colab-notebooks/dlav0_34_241218//jet.jpeg
    peg file read not supported. Only BMP file read is supported in target 
     Freeing memory for user provided Net
     ----------------------- TIDL Process with TARGET DATA FLOW ------------------------
    Error opening performance debug file
     Layer,   Layer Cycles,kernelOnlyCycles, coreLoopCycles,LayerSetupCycles,dmaPipeupCycles, dmaPipeDownCycles, PrefetchCycles,copyKerCoeffCycles,LayerDeinitCycles,LastBlockCycles, paddingTrigger,    paddingWait,LayerWithoutPad,LayerHandleCopy,   BackupCycles,  RestoreCycles,Multic7xContextCopyCycles,
         1,         245069,         154238,         156640,          11726,          11826,                 0,              0,                 0,              0,              0,           2192,              1,              0,           4599,              0,              0,              0,
         2,         232358,         155561,         160380,           9882,           4794,                 0,              0,                 0,              0,              0,           1765,              1,              0,           3553,              0,              0,              0,
         3,         124577,          51159,          55460,           8647,           4929,                 0,              0,                 0,              0,              0,           1375,              1,              0,           3560,              0,              0,              0,
         4,          77752,           5604,           7160,           6885,           7119,                 0,              0,                 0,              0,              0,           1445,              1,              0,           3911,              0,              0,              0,
         6,          71776,           4982,           6614,           7519,           3138,                 0,              0,                 0,              0,              0,           1402,              1,              0,           3400,              0,              0,              0,
         5,          94952,          23996,          26641,           8675,           5556,                 0,              0,                 0,              0,              0,           1469,              1,              0,           3277,              0,              0,              0,
         7,         101319,          30552,          31917,           7984,           6122,                 0,              0,                 0,              0,              0,           1210,              1,              0,           3111,              0,              0,              0,
         8,          75904,           6082,           8612,           7135,           4189,                 0,              0,                 0,              0,              0,           1322,              1,              0,           3492,              0,              0,              0,
         9,         102124,          30630,          32276,           8079,           6009,                 0,              0,                 0,              0,              0,           1429,              1,              0,           2906,              0,              0,              0,
        10,         100501,          30614,          31784,           7766,           5859,                 0,              0,                 0,              0,              0,           1038,              1,              0,           2735,              0,              0,              0,
        11,          74263,           4613,           7269,           6705,           4028,                 0,              0,                 0,              0,              0,           1684,              1,              0,           3191,              0,              0,              0,
        12,          81864,           7992,          11463,           8395,           4251,                 0,              0,                 0,              0,              0,           1471,              1,              0,           3457,              0,              0,              0,
        13,          82289,           8671,          11794,           9368,           4850,                 0,              0,                 0,              0,              0,           1360,              1,              0,           3750,              0,              0,              0,
        14,          71502,           4466,           5674,           6251,           4086,                 0,              0,                 0,              0,              0,           1110,              1,              0,           3592,              0,              0,              0,
        15,          72591,           3853,           5470,           6837,           3923,                 0,              0,                 0,              0,              0,           1519,              1,              0,           3895,              0,              0,              0,
        17,          71290,           3065,           4434,           8783,           2745,                 0,              0,                 0,              0,              0,           1151,              1,              0,           3530,              0,              0,              0,
        16,         113549,          38733,          41166,           8523,           7716,                 0,              0,                 0,              0,              0,           1371,              1,              0,           3132,              0,              0,              0,
        18,         107922,          32827,          34109,           8041,          10120,                 0,              0,                 0,              0,              0,           1150,              1,              0,           3341,              0,              0,              0,
        19,          71040,           3079,           5321,           6858,           2925,                 0,              0,                 0,              0,              0,           1161,              1,              0,           3594,              0,              0,              0,
        20,         108294,          32876,          34222,           8148,           9234,                 0,              0,                 0,              0,              0,           1214,              1,              0,           3263,              0,              0,              0,
        21,         106446,          32797,          34410,           7468,           9280,                 0,              0,                 0,              0,              0,           1481,              1,              0,           2701,              0,              0,              0,
        22,          70479,           3068,           5123,           7728,           3031,                 0,              0,                 0,              0,              0,           1050,              1,              0,           3829,              0,              0,              0,
        23,          74319,           4328,           6742,           8028,           2982,                 0,              0,                 0,              0,              0,           1406,              1,              0,           3964,              0,              0,              0,
        24,          80409,           7963,           9274,           8801,           6473,                 0,              0,                 0,              0,              0,           1153,              1,              0,           3554,              0,              0,              0,
        25,         108360,          32814,          34344,           7710,           9709,                 0,              0,                 0,              0,              0,           1318,              1,              0,           2813,              0,              0,              0,
        26,         106063,          32690,          34032,           8889,           9238,                 0,              0,                 0,              0,              0,           1135,              1,              0,           3786,              0,              0,              0,
        27,          70785,           3205,           5524,           6798,           2917,                 0,              0,                 0,              0,              0,           1271,              1,              0,           3559,              0,              0,              0,
        28,         107115,          32811,          34414,           8372,           9265,                 0,              0,                 0,              0,              0,           1471,              1,              0,           3455,              0,              0,              0,
        29,         108048,          32667,          34106,           7677,           9767,                 0,              0,                 0,              0,              0,           1307,              1,              0,           3205,              0,              0,              0,
        30,          69082,           2950,           5089,           6385,           2930,                 0,              0,                 0,              0,              0,           1110,              1,              0,           3235,              0,              0,              0,
        31,          87975,           8149,          15946,          11332,           2041,                 0,              0,                 0,              0,              0,           1381,              1,              0,           3650,              0,              0,              0,
        32,          89182,          15262,          18590,           7610,           6854,                 0,              0,                 0,              0,              0,           1458,              1,              0,           2859,              0,              0,              0,
        33,          68836,           2285,           3711,           6581,           2888,                 0,              0,                 0,              0,              0,           1328,              1,              0,           4040,              0,              0,              0,
        34,          67945,           2213,           3430,           6166,           2779,                 0,              0,                 0,              0,              0,           1119,              1,              0,           3645,              0,              0,              0,
        37,          71939,           3705,           5457,           7604,           3174,                 0,              0,                 0,              0,              0,           1610,              1,              0,           3243,              0,              0,              0,
        35,         154876,          73898,          75991,           7868,          15700,                 0,              0,                 0,              0,              0,           1086,              1,              0,           3072,              0,              0,              0,
        38,         141452,          41972,          46349,           8359,          31753,                 0,              0,                 0,              0,              0,           1314,              1,              0,           2955,              0,              0,              0,
        39,          70622,           2456,           5243,           6457,           2274,                 0,              0,                 0,              0,              0,           1762,              1,              0,           2748,              0,              0,              0,
        40,         143461,          42119,          46251,           8760,          32017,                 0,              0,                 0,              0,              0,           1226,              1,              0,           3547,              0,              0,              0,
        41,         143055,          41937,          46075,           9675,          31286,                 0,              0,                 0,              0,              0,           1369,              1,              0,           3602,              0,              0,              0,
        42,          70662,           2478,           5063,           6570,           2280,                 0,              0,                 0,              0,              0,           1614,              1,              0,           2845,              0,              0,              0,
        43,          72187,           3091,           5672,           7971,           2349,                 0,              0,                 0,              0,              0,           1644,              1,              0,           3559,              0,              0,              0,
        44,          84052,           9945,          11264,           8063,           9112,                 0,              0,                 0,              0,              0,           1169,              1,              0,           3581,              0,              0,              0,
        45,         143700,          42118,          46434,           9544,          31577,                 0,              0,                 0,              0,              0,           1424,              1,              0,           3794,              0,              0,              0,
        46,         141056,          42027,          46013,           8245,          31323,                 0,              0,                 0,              0,              0,           1207,              1,              0,           2701,              0,              0,              0,
        47,          70101,           2409,           4837,           6844,           2187,                 0,              0,                 0,              0,              0,           1348,              1,              0,           3361,              0,              0,              0,
        48,         141870,          42157,          46356,           8428,          31458,                 0,              0,                 0,              0,              0,           1373,              1,              0,           2869,              0,              0,              0,
        49,         144422,          42164,          46432,           9411,          32459,                 0,              0,                 0,              0,              0,           1448,              1,              0,           3552,              0,              0,              0,
        50,          70570,           2296,           4418,           7179,           2289,                 0,              0,                 0,              0,              0,           1043,              1,              0,           3884,              0,              0,              0,
        51,          85274,           6170,          13307,          11209,           1721,                 0,              0,                 0,              0,              0,           1352,              1,              0,           3704,              0,              0,              0,
        52,         105378,          19984,          24519,           8790,          15225,                 0,              0,                 0,              0,              0,           1797,              1,              0,           3367,              0,              0,              0,
        53,          69080,           2386,           4194,           5838,           2074,                 0,              0,                 0,              0,              0,           1710,              1,              0,           3546,              0,              0,              0,
        54,          80477,           4740,           6161,           9553,           8710,                 0,              0,                 0,              0,              0,           1271,              1,              0,           3924,              0,              0,              0,
        55,         283415,         137805,         145790,           9938,          71185,                 0,              0,                 0,              0,              0,           1567,              1,              0,           3510,              0,              0,              0,
        57,         301785,          85572,         100002,          10027,         134486,                 0,              0,                 0,              0,              0,           1600,              1,              0,           3912,              0,              0,              0,
        58,          70717,           2329,           4441,           7381,           1726,                 0,              0,                 0,              0,              0,           1073,              1,              0,           3835,              0,              0,              0,
        59,         294438,          85882,          99897,          10191,         126426,                 0,              0,                 0,              0,              0,           1412,              1,              0,           2914,              0,              0,              0,
        60,         286437,          85465,          99669,           9029,         122438,                 0,              0,                 0,              0,              0,           1476,              1,              0,           3561,              0,              0,              0,
        61,          69091,           1852,           4245,           6856,           1625,                 0,              0,                 0,              0,              0,           1329,              1,              0,           3087,              0,              0,              0,
        62,          75083,           3015,           7960,           9911,           1189,                 0,              0,                 0,              0,              0,           1221,              1,              0,           3425,              0,              0,              0,
        63,         120269,          15834,          19929,           8806,          35222,                 0,              0,                 0,              0,              0,           1309,              1,              0,           3212,              0,              0,              0,
        64,          75964,           3697,           5123,           8116,           7608,                 0,              0,                 0,              0,              0,           1276,              1,              0,           3346,              0,              0,              0,
        67,          98207,          21067,          34808,           6234,           1035,                 0,              0,                 0,              0,              0,          13581,              1,              0,           3849,              0,              0,              0,
        70,         197137,          11645,         113686,          22465,           3467,                 0,              0,                 0,              0,              0,           1034,              1,              0,           5156,              0,              0,              0,
        73,          88070,          21629,          23166,           5714,           1855,                 0,              0,                 0,              0,              0,           1406,              1,              0,           3575,              0,              0,              0,
        79,          72265,           3123,           5689,           8045,           2297,                 0,              0,                 0,              0,              0,           1725,              1,              0,           3262,              0,              0,              0,
        80,         315058,         166501,         191550,          11076,          57120,                 0,              0,                 0,              0,              0,           1332,              1,              0,           3346,              0,              0,              0,
        81,          73152,           3285,           4795,           7830,           3727,                 0,              0,                 0,              0,              0,           1368,              1,              0,           2960,              0,              0,              0,
        83,          83778,          11766,          20283,           5952,           1335,                 0,              0,                 0,              0,              0,           8368,              1,              0,           3373,              0,              0,              0,
        85,         140239,           6546,          64409,          17353,           2056,                 0,              0,                 0,              0,              0,           1449,              1,              0,           5637,              0,              0,              0,
        87,          80665,          12927,          15013,           6072,           2740,                 0,              0,                 0,              0,              0,           1805,              1,              0,           3405,              0,              0,              0,
        56,          72818,           3347,           4663,           7912,           3796,                 0,              0,                 0,              0,              0,           1184,              1,              0,           3385,              0,              0,              0,
        66,          82547,          11456,          19577,           6107,           1378,                 0,              0,                 0,              0,              0,           7897,              1,              0,           4027,              0,              0,              0,
        69,         139498,           6143,          63586,          16940,           2269,                 0,              0,                 0,              0,              0,           1347,              1,              0,           5706,              0,              0,              0,
        72,          80439,          13133,          14828,           6351,           2778,                 0,              0,                 0,              0,              0,           1489,              1,              0,           3500,              0,              0,              0,
        76,          74366,           4323,           6569,           8159,           3045,                 0,              0,                 0,              0,              0,           1454,              1,              0,           3253,              0,              0,              0,
        77,         150902,          65773,          69116,           9525,          16039,                 0,              0,                 0,              0,              0,           1389,              1,              0,           3509,              0,              0,              0,
        78,          72337,           3082,           4279,           8453,           3365,                 0,              0,                 0,              0,              0,           1047,              1,              0,           3204,              0,              0,              0,
        82,          78605,           7298,          15290,           6177,           1781,                 0,              0,                 0,              0,              0,           7765,              1,              0,           3663,              0,              0,              0,
        84,         115499,          10354,          38675,          17857,           2054,                 0,              0,                 0,              0,              0,           1374,              1,              0,           5519,              0,              0,              0,
        86,          78067,           9276,          10984,           5802,           4158,                 0,              0,                 0,              0,              0,           1502,              1,              0,           3599,              0,              0,              0,
        90,          74557,           4740,           7146,           8299,           3007,                 0,              0,                 0,              0,              0,           1212,              1,              0,           3526,              0,              0,              0,
        91,         150878,          65408,          68728,           9042,          15963,                 0,              0,                 0,              0,              0,           1390,              1,              0,           2930,              0,              0,              0,
        92,          71837,           2940,           4545,           7759,           3753,                 0,              0,                 0,              0,              0,           1455,              1,              0,           2737,              0,              0,              0,
        93,          78605,           7273,          14948,           6265,           1780,                 0,              0,                 0,              0,              0,           7526,              1,              0,           3879,              0,              0,              0,
        94,         115159,          10077,          38248,          17631,           2058,                 0,              0,                 0,              0,              0,           1077,              1,              0,           5348,              0,              0,              0,
        95,          77361,           8917,          10472,           5326,           4138,                 0,              0,                 0,              0,              0,           1437,              1,              0,           3345,              0,              0,              0,
        36,          72059,           3129,           4398,           8169,           3295,                 0,              0,                 0,              0,              0,           1131,              1,              0,           2896,              0,              0,              0,
        65,          78826,           7417,          14734,           6019,           2036,                 0,              0,                 0,              0,              0,           7168,              1,              0,           3015,              0,              0,              0,
        68,         115703,          10007,          38367,          18134,           1947,                 0,              0,                 0,              0,              0,           1119,              1,              0,           5924,              0,              0,              0,
        71,          76777,           8878,          10436,           5850,           4193,                 0,              0,                 0,              0,              0,           1452,              1,              0,           3243,              0,              0,              0,
        74,          79755,           7182,          11097,           7205,           4131,                 0,              0,                 0,              0,              0,           1280,              1,              0,           2871,              0,              0,              0,
        75,         136890,          62122,          65540,           9462,           5315,                 0,              0,                 0,              0,              0,           1449,              1,              0,           3379,              0,              0,              0,
        88,          77932,           7329,           9215,           7726,           4146,                 0,              0,                 0,              0,              0,           1129,              1,              0,           3527,              0,              0,              0,
        89,         138125,          62030,          65387,           9393,           7049,                 0,              0,                 0,              0,              0,           1467,              1,              0,           3397,              0,              0,              0,
        96,          78935,           7130,          10939,           7104,           4154,                 0,              0,                 0,              0,              0,           1195,              1,              0,           2711,              0,              0,              0,
        97,         136960,          62052,          65130,           8552,           7655,                 0,              0,                 0,              0,              0,           1304,              1,              0,           3090,              0,              0,              0,
        98,         198668,         122042,         124422,           8829,          10283,                 0,              0,                 0,              0,              0,           1348,              1,              0,           3512,              0,              0,              0,
       101,         112626,          30667,          40568,          10353,           5218,                 0,              0,                 0,              0,              0,            949,              1,              0,           3205,              0,              0,              0,
        99,         198756,         122083,         124526,           8184,           9791,                 0,              0,                 0,              0,              0,           1207,              1,              0,           3134,              0,              0,              0,
       102,          95137,          16697,          24509,          10388,           4696,                 0,              0,                 0,              0,              0,            943,              1,              0,           3227,              0,              0,              0,
       100,         199849,         121996,         124349,           8593,          10270,                 0,              0,                 0,              0,              0,           1365,              1,              0,           3272,              0,              0,              0,
       103,          96483,          16709,          24656,          11304,           4765,                 0,              0,                 0,              0,              0,           1202,              1,              0,           3365,              0,              0,              0,
     Sum of Layer Cycles 11432940 
    
    # NETWORK_EXECUTION_TIME =    15.96 (in ms, c7x @1GHz) with DDR_BANDWIDTH (Read + Write) =     0.00,     0.00,     0.00 (in Mega Bytes/frame) .../home/a0194920local/colab-notebooks/dlav0_34_241218//jet.jpeg
    895
    
     A :   895, 0.0000, 0.0000,  3136 .... .....2918920.898354 s:  VX_ZONE_INIT:[tivxHostDeInitLocal:120] De-Initialization Done for HOST !!!
    2918920.902764 s:  VX_ZONE_INIT:[tivxDeInitLocal:206] De-Initialization Done !!!
    APP: Deinit ... !!!
    REMOTE_SERVICE: Deinit ... !!!
    REMOTE_SERVICE: Deinit ... Done !!!
    2918920.903642 s: IPC: Deinit ... !!!
    2918920.904774 s: IPC: DeInit ... Done !!!
    2918920.904788 s: MEM: Deinit ... !!!
    2918920.904795 s: DDR_SHARED_MEM: Alloc's: 9 alloc's of 20781288 bytes 
    2918920.904798 s: DDR_SHARED_MEM: Free's : 9 free's  of 20781288 bytes 
    2918920.904801 s: DDR_SHARED_MEM: Open's : 0 allocs  of 0 bytes 
    2918920.904808 s: MEM: Deinit ... Done !!!
    APP: Deinit ... Done !!!
    *************** Finished dlav0_34_241218 **************************
    

    Device Execution:

    Processing config file #0 : dlav0_34_241218//config 
    Input : dataId=0, name=input.1_original, elementType 0, scale=1.000000, zero point=0, layout=0
    Ouput : dataId=101, name=508, elementType 1, scale=3.222945, zero point=0, layout=0 
     Ouput : dataId=102, name=511, elementType 1, scale=0.770903, zero point=0, layout=0 
     Ouput : dataId=103, name=514, elementType 1, scale=85.225433, zero point=0, layout=0 
         19589468,     18.682 0x75319d600010
    worstCaseDelay for Pre-emption is 0.8372200 
    Network File Read done
    Calling algAlloc
    
    --------------------------------------------
    TIDL Memory size requiement (record wise):
    MemRecNum   , Space               , Attribute   , Alignment   , Size(KBytes), BasePtr     
    0           , DDR Cacheable       , Persistent  ,  128, 19.67   , 0x00000000
    1           , DDR Cacheable       , Persistent  ,  128, 0.65    , 0x00000000
    2           , DDR Cacheable       , Scratch     ,  128, 16.00   , 0x00000000
    3           , DDR Cacheable       , Scratch     ,  128, 448.00  , 0x00000000
    4           , DDR Cacheable       , Scratch     ,  128, 2944.00 , 0x00000000
    5           , DDR Cacheable       , Persistent  ,  128, 435.77  , 0x00000000
    6           , DDR Cacheable       , Scratch     ,  128, 26.00   , 0x00000000
    7           , DDR Cacheable       , Scratch     ,  128, 326.25  , 0x00000000
    8           , DDR Cacheable       , Scratch     ,  128, 4710.12 , 0x00000000
    9           , DDR Cacheable       , Scratch     ,  128, 1573.25 , 0x00000000
    10          , DDR Cacheable       , Persistent  ,  128, 1220.60 , 0x00000000
    11          , DDR Cacheable       , Scratch     ,  128, 512.25  , 0x00000000
    12          , DDR Cacheable       , Persistent  ,  128, 0.12    , 0x00000000
    13          , DDR Cacheable       , Persistent  ,  128, 19130.46, 0x00000000
    14          , DDR Cacheable       , Persistent  ,  128, 0.00    , 0x00000000
    15          , DDR Cacheable       , Persistent  ,  128, 17520.75, 0x00000000
    --------------------------------------------
    Total memory size requirement (space wise):
    Mem Space , Size(KBytes)
    DDR Cacheable, 48883.91
    --------------------------------------------
    NOTE: Memory requirement in host emulation can be different from the same on EVM
          To get the actual TIDL memory requirement make sure to run on EVM with 
          debugTraceLevel = 2
    
    --------------------------------------------
    Num,    Space,     SizeinBytes,   SineInMB
       0,    17,        20144,      0.019 0x5cf79049e080
       1,    17,          664,      0.001 0x5cf79049d080
       2,    17,        16384,      0.016 0x75319ea00080
       3,    17,       458752,      0.438 0x75319ea04080
       4,    17,      3014656,      2.875 0x75319ea74080
       5,    17,       446232,      0.426 0x753232c4e080
       6,    17,        26624,      0.025 0x75319ed54080
       7,    17,       334080,      0.319 0x75319ed5a880
       8,    17,      4823168,      4.600 0x75319edac180
       9,    17,      1611008,      1.536 0x75319f245a00
      10,    17,      1249896,      1.192 0x75319e8ce080
      11,    17,       524544,      0.500 0x75319f3cef00
      12,    17,          128,      0.000 0x5cf7904a3000
      13,    17,     19589596,     18.682 0x75319c200080
      14,    17,            1,      0.000 0x5cf7904a3100
      15,    17,     17941248,     17.110 0x75319b000080
    Total External Memory (DDR) Size =     50057125,     47.738 
    TIDL init call from ivision API 
    
    --------------------------------------------
    TIDL Memory size requiement (record wise):
    MemRecNum   , Space               , Attribute   , Alignment   , Size(KBytes), BasePtr     
    0           , DDR Cacheable       , Persistent  ,  128, 19.67   , 0x9049e080
    1           , DDR Cacheable       , Persistent  ,  128, 0.65    , 0x9049d080
    2           , DDR Cacheable       , Scratch     ,  128, 16.00   , 0x9ea00080
    3           , DDR Cacheable       , Scratch     ,  128, 448.00  , 0x9ea04080
    4           , DDR Cacheable       , Scratch     ,  128, 2944.00 , 0x9ea74080
    5           , DDR Cacheable       , Persistent  ,  128, 435.77  , 0x32c4e080
    6           , DDR Cacheable       , Scratch     ,  128, 26.00   , 0x9ed54080
    7           , DDR Cacheable       , Scratch     ,  128, 326.25  , 0x9ed5a880
    8           , DDR Cacheable       , Scratch     ,  128, 4710.12 , 0x9edac180
    9           , DDR Cacheable       , Scratch     ,  128, 1573.25 , 0x9f245a00
    10          , DDR Cacheable       , Persistent  ,  128, 1220.60 , 0x9e8ce080
    11          , DDR Cacheable       , Scratch     ,  128, 512.25  , 0x9f3cef00
    12          , DDR Cacheable       , Persistent  ,  128, 0.12    , 0x904a3000
    13          , DDR Cacheable       , Persistent  ,  128, 19130.46, 0x9c200080
    14          , DDR Cacheable       , Persistent  ,  128, 0.00    , 0x904a3100
    15          , DDR Cacheable       , Persistent  ,  128, 17520.75, 0x9b000080
    --------------------------------------------
    Total memory size requirement (space wise):
    Mem Space , Size(KBytes)
    DDR Cacheable, 48883.91
    --------------------------------------------
    NOTE: Memory requirement in host emulation can be different from the same on EVM
          To get the actual TIDL memory requirement make sure to run on EVM with 
          debugTraceLevel = 2
    
    --------------------------------------------
    Alg Init for Layer # -    1
    Alg Init for Layer # -    2
    Alg Init for Layer # -    3
    Alg Init for Layer # -    4
    Alg Init for Layer # -    6
    Alg Init for Layer # -    5
    Alg Init for Layer # -    7
    Alg Init for Layer # -    8
    Alg Init for Layer # -    9
    Alg Init for Layer # -   10
    Alg Init for Layer # -   11
    Alg Init for Layer # -   12
    Alg Init for Layer # -   13
    Alg Init for Layer # -   14
    Alg Init for Layer # -   15
    Alg Init for Layer # -   17
    Alg Init for Layer # -   16
    Alg Init for Layer # -   18
    Alg Init for Layer # -   19
    Alg Init for Layer # -   20
    Alg Init for Layer # -   21
    Alg Init for Layer # -   22
    Alg Init for Layer # -   23
    Alg Init for Layer # -   24
    Alg Init for Layer # -   25
    Alg Init for Layer # -   26
    Alg Init for Layer # -   27
    Alg Init for Layer # -   28
    Alg Init for Layer # -   29
    Alg Init for Layer # -   30
    Alg Init for Layer # -   31
    Alg Init for Layer # -   32
    Alg Init for Layer # -   33
    Alg Init for Layer # -   34
    Alg Init for Layer # -   37
    Alg Init for Layer # -   35
    Alg Init for Layer # -   38
    Alg Init for Layer # -   39
    Alg Init for Layer # -   40
    Alg Init for Layer # -   41
    Alg Init for Layer # -   42
    Alg Init for Layer # -   43
    Alg Init for Layer # -   44
    Alg Init for Layer # -   45
    Alg Init for Layer # -   46
    Alg Init for Layer # -   47
    Alg Init for Layer # -   48
    Alg Init for Layer # -   49
    Alg Init for Layer # -   50
    Alg Init for Layer # -   51
    Alg Init for Layer # -   52
    Alg Init for Layer # -   53
    Alg Init for Layer # -   54
    Alg Init for Layer # -   55
    Alg Init for Layer # -   57
    Alg Init for Layer # -   58
    Alg Init for Layer # -   59
    Alg Init for Layer # -   60
    Alg Init for Layer # -   61
    Alg Init for Layer # -   62
    Alg Init for Layer # -   63
    Alg Init for Layer # -   64
    Alg Init for Layer # -   67
    Alg Init for Layer # -   70
    Alg Init for Layer # -   73
    Alg Init for Layer # -   79
    Alg Init for Layer # -   80
    Alg Init for Layer # -   81
    Alg Init for Layer # -   83
    Alg Init for Layer # -   85
    Alg Init for Layer # -   87
    Alg Init for Layer # -   56
    Alg Init for Layer # -   66
    Alg Init for Layer # -   69
    Alg Init for Layer # -   72
    Alg Init for Layer # -   76
    Alg Init for Layer # -   77
    Alg Init for Layer # -   78
    Alg Init for Layer # -   82
    Alg Init for Layer # -   84
    Alg Init for Layer # -   86
    Alg Init for Layer # -   90
    Alg Init for Layer # -   91
    Alg Init for Layer # -   92
    Alg Init for Layer # -   93
    Alg Init for Layer # -   94
    Alg Init for Layer # -   95
    Alg Init for Layer # -   36
    Alg Init for Layer # -   65
    Alg Init for Layer # -   68
    Alg Init for Layer # -   71
    Alg Init for Layer # -   74
    Alg Init for Layer # -   75
    Alg Init for Layer # -   88
    Alg Init for Layer # -   89
    Alg Init for Layer # -   96
    Alg Init for Layer # -   97
    Alg Init for Layer # -   98
    Alg Init for Layer # -  101
    Alg Init for Layer # -   99
    Alg Init for Layer # -  102
    Alg Init for Layer # -  100
    Alg Init for Layer # -  103
    PREEMPTION: Adding a new priority object for targetPriority = 0, handle = 0x5cf79049e080
    PREEMPTION: Now total number of priority objects = 1 at priorityId = 0,    with new memRec of base = 0x5cf7904a3000 and size = 128
    PREEMPTION: Requesting context memory addr for handle 0x5cf79049e080, return Addr = 0x5cf7850499f8
     Freeing memory for user provided Net
    
     Instance created for  dlav0_34_241218//config
    
    Processing Cnt :    0, InstCnt :    0 dlav0_34_241218//tidl_net.bin!
    TIDL_RT: Set default TIDLRT tensor done
    TIDL_RT: Set default TIDLRT tensor done
    TIDL_RT: Set default TIDLRT tensor done
    TIDL_RT: Set default TIDLRT tensor done
    /home/a0194920local/colab-notebooks/dlav0_34_241218//jet.jpeg
          401408,      0.383 0x5cf790644380
          341120,      0.325 0x5cf79054e880
           16640,      0.016 0x5cf790549000
           16640,      0.016 0x5cf790522f00
     ----------------------- TIDL Process with REF_ONLY FLOW------------------------ 
    
    #    0 . ..TIDL_process is started with handle : 0x5cf79049e080 
    TIDL_activate is called with handle : 0x5cf79049e080 - Copying handle of size 20144 from 0x5cf79049e080 to 0x75319ed46080 
    Core 0 Alg Process for Layer # -    1, layer type 1
    Processing Layer # -    1
    Core 0 End of Layer # -    1 with outPtrs[0] = 0x75319ea74080
    Core 0 Alg Process for Layer # -    2, layer type 1
    Processing Layer # -    2
    Core 0 End of Layer # -    2 with outPtrs[0] = 0x75319ea74080
    Core 0 Alg Process for Layer # -    3, layer type 1
    Processing Layer # -    3
    Core 0 End of Layer # -    3 with outPtrs[0] = 0x75319eb38500
    Core 0 Alg Process for Layer # -    4, layer type 2
    Processing Layer # -    4
    Core 0 End of Layer # -    4 with outPtrs[0] = 0x75319ea74080
    Core 0 Alg Process for Layer # -    6, layer type 1
    Processing Layer # -    6
    Core 0 End of Layer # -    6 with outPtrs[0] = 0x75319ea74080
    Core 0 Alg Process for Layer # -    5, layer type 1
    Processing Layer # -    5
    Core 0 End of Layer # -    5 with outPtrs[0] = 0x75319eaa5100
    Core 0 Alg Process for Layer # -    7, layer type 1
    Processing Layer # -    7
    Core 0 End of Layer # -    7 with outPtrs[0] = 0x75319eaa5100
    Core 0 Alg Process for Layer # -    8, layer type 5
    Processing Layer # -    8
    Core 0 End of Layer # -    8 with outPtrs[0] = 0x75319ea74080
    Core 0 Alg Process for Layer # -    9, layer type 1
    Processing Layer # -    9
    Core 0 End of Layer # -    9 with outPtrs[0] = 0x75319eaa5100
    Core 0 Alg Process for Layer # -   10, layer type 1
    Processing Layer # -   10
    Core 0 End of Layer # -   10 with outPtrs[0] = 0x75319eaa5100
    Core 0 Alg Process for Layer # -   11, layer type 5
    Processing Layer # -   11
    Core 0 End of Layer # -   11 with outPtrs[0] = 0x75319ead6180
    Core 0 Alg Process for Layer # -   12, layer type 12
    Processing Layer # -   12
    Core 0 End of Layer # -   12 with outPtrs[0] = 0x75319eaa5100
    Core 0 Alg Process for Layer # -   13, layer type 1
    Processing Layer # -   13
    Core 0 End of Layer # -   13 with outPtrs[0] = 0x75319ea74080
    Core 0 Alg Process for Layer # -   14, layer type 2
    Processing Layer # -   14
    Core 0 End of Layer # -   14 with outPtrs[0] = 0x75319eaa5100
    Core 0 Alg Process for Layer # -   15, layer type 2
    Processing Layer # -   15
    Core 0 End of Layer # -   15 with outPtrs[0] = 0x75319eab2180
    Core 0 Alg Process for Layer # -   17, layer type 1
    Processing Layer # -   17
    Core 0 End of Layer # -   17 with outPtrs[0] = 0x75319eab2180
    Core 0 Alg Process for Layer # -   16, layer type 1
    Processing Layer # -   16
    Core 0 End of Layer # -   16 with outPtrs[0] = 0x75319eacc200
    Core 0 Alg Process for Layer # -   18, layer type 1
    Processing Layer # -   18
    Core 0 End of Layer # -   18 with outPtrs[0] = 0x75319eacc200
    Core 0 Alg Process for Layer # -   19, layer type 5
    Processing Layer # -   19
    Core 0 End of Layer # -   19 with outPtrs[0] = 0x75319eab2180
    Core 0 Alg Process for Layer # -   20, layer type 1
    Processing Layer # -   20
    Core 0 End of Layer # -   20 with outPtrs[0] = 0x75319eacc200
    Core 0 Alg Process for Layer # -   21, layer type 1
    Processing Layer # -   21
    Core 0 End of Layer # -   21 with outPtrs[0] = 0x75319eacc200
    Core 0 Alg Process for Layer # -   22, layer type 5
    Processing Layer # -   22
    Core 0 End of Layer # -   22 with outPtrs[0] = 0x75319eae6280
    Core 0 Alg Process for Layer # -   23, layer type 12
    Processing Layer # -   23
    Core 0 End of Layer # -   23 with outPtrs[0] = 0x75319eacc200
    Core 0 Alg Process for Layer # -   24, layer type 1
    Processing Layer # -   24
    Core 0 End of Layer # -   24 with outPtrs[0] = 0x75319eab2180
    Core 0 Alg Process for Layer # -   25, layer type 1
    Processing Layer # -   25
    Core 0 End of Layer # -   25 with outPtrs[0] = 0x75319eacc200
    Core 0 Alg Process for Layer # -   26, layer type 1
    Processing Layer # -   26
    Core 0 End of Layer # -   26 with outPtrs[0] = 0x75319eae6280
    Core 0 Alg Process for Layer # -   27, layer type 5
    Processing Layer # -   27
    Core 0 End of Layer # -   27 with outPtrs[0] = 0x75319eacc200
    Core 0 Alg Process for Layer # -   28, layer type 1
    Processing Layer # -   28
    Core 0 End of Layer # -   28 with outPtrs[0] = 0x75319eae6280
    Core 0 Alg Process for Layer # -   29, layer type 1
    Processing Layer # -   29
    Core 0 End of Layer # -   29 with outPtrs[0] = 0x75319eae6280
    Core 0 Alg Process for Layer # -   30, layer type 5
    Processing Layer # -   30
    Core 0 End of Layer # -   30 with outPtrs[0] = 0x75319eb00300
    Core 0 Alg Process for Layer # -   31, layer type 12
    Processing Layer # -   31
    Core 0 End of Layer # -   31 with outPtrs[0] = 0x75319eae6280
    Core 0 Alg Process for Layer # -   32, layer type 1
    Processing Layer # -   32
    Core 0 End of Layer # -   32 with outPtrs[0] = 0x75319eaa5100
    Core 0 Alg Process for Layer # -   33, layer type 2
    Processing Layer # -   33
    Core 0 End of Layer # -   33 with outPtrs[0] = 0x75319eabf180
    Core 0 Alg Process for Layer # -   34, layer type 2
    Processing Layer # -   34
    Core 0 End of Layer # -   34 with outPtrs[0] = 0x75319eac9200
    Core 0 Alg Process for Layer # -   37, layer type 1
    Processing Layer # -   37
    Core 0 End of Layer # -   37 with outPtrs[0] = 0x75319eac9200
    Core 0 Alg Process for Layer # -   35, layer type 1
    Processing Layer # -   35
    Core 0 End of Layer # -   35 with outPtrs[0] = 0x75319eadd280
    Core 0 Alg Process for Layer # -   38, layer type 1
    Processing Layer # -   38
    Core 0 End of Layer # -   38 with outPtrs[0] = 0x75319eb85700
    Core 0 Alg Process for Layer # -   39, layer type 5
    Processing Layer # -   39
    Core 0 End of Layer # -   39 with outPtrs[0] = 0x75319eac9200
    Core 0 Alg Process for Layer # -   40, layer type 1
    Processing Layer # -   40
    Core 0 End of Layer # -   40 with outPtrs[0] = 0x75319eadd280
    Core 0 Alg Process for Layer # -   41, layer type 1
    Processing Layer # -   41
    Core 0 End of Layer # -   41 with outPtrs[0] = 0x75319eb85700
    Core 0 Alg Process for Layer # -   42, layer type 5
    Processing Layer # -   42
    Core 0 End of Layer # -   42 with outPtrs[0] = 0x75319eadd280
    Core 0 Alg Process for Layer # -   43, layer type 12
    Processing Layer # -   43
    Core 0 End of Layer # -   43 with outPtrs[0] = 0x75319eadd280
    Core 0 Alg Process for Layer # -   44, layer type 1
    Processing Layer # -   44
    Core 0 End of Layer # -   44 with outPtrs[0] = 0x75319eac9200
    Core 0 Alg Process for Layer # -   45, layer type 1
    Processing Layer # -   45
    Core 0 End of Layer # -   45 with outPtrs[0] = 0x75319eadd280
    Core 0 Alg Process for Layer # -   46, layer type 1
    Processing Layer # -   46
    Core 0 End of Layer # -   46 with outPtrs[0] = 0x75319eb85700
    Core 0 Alg Process for Layer # -   47, layer type 5
    Processing Layer # -   47
    Core 0 End of Layer # -   47 with outPtrs[0] = 0x75319eadd280
    Core 0 Alg Process for Layer # -   48, layer type 1
    Processing Layer # -   48
    Core 0 End of Layer # -   48 with outPtrs[0] = 0x75319eaf1300
    Core 0 Alg Process for Layer # -   49, layer type 1
    Processing Layer # -   49
    Core 0 End of Layer # -   49 with outPtrs[0] = 0x75319eb99780
    Core 0 Alg Process for Layer # -   50, layer type 5
    Processing Layer # -   50
    Core 0 End of Layer # -   50 with outPtrs[0] = 0x75319eaf1300
    Core 0 Alg Process for Layer # -   51, layer type 12
    Processing Layer # -   51
    Core 0 End of Layer # -   51 with outPtrs[0] = 0x75319eaf1300
    Core 0 Alg Process for Layer # -   52, layer type 1
    Processing Layer # -   52
    Core 0 End of Layer # -   52 with outPtrs[0] = 0x75319eabf180
    Core 0 Alg Process for Layer # -   53, layer type 2
    Processing Layer # -   53
    Core 0 End of Layer # -   53 with outPtrs[0] = 0x75319ead3200
    Core 0 Alg Process for Layer # -   54, layer type 1
    Processing Layer # -   54
    Core 0 End of Layer # -   54 with outPtrs[0] = 0x75319ead6380
    Core 0 Alg Process for Layer # -   55, layer type 1
    Processing Layer # -   55
    Core 0 End of Layer # -   55 with outPtrs[0] = 0x75319eadc600
    Core 0 Alg Process for Layer # -   57, layer type 1
    Processing Layer # -   57
    Core 0 End of Layer # -   57 with outPtrs[0] = 0x75319ed2b280
    Core 0 Alg Process for Layer # -   58, layer type 5
    Processing Layer # -   58
    Core 0 End of Layer # -   58 with outPtrs[0] = 0x75319ead6380
    Core 0 Alg Process for Layer # -   59, layer type 1
    Processing Layer # -   59
    Core 0 End of Layer # -   59 with outPtrs[0] = 0x75319eadc600
    Core 0 Alg Process for Layer # -   60, layer type 1
    Processing Layer # -   60
    Core 0 End of Layer # -   60 with outPtrs[0] = 0x75319ed2b080
    Core 0 Alg Process for Layer # -   61, layer type 5
    Processing Layer # -   61
    Core 0 End of Layer # -   61 with outPtrs[0] = 0x75319eadc600
    Core 0 Alg Process for Layer # -   62, layer type 12
    Processing Layer # -   62
    Core 0 End of Layer # -   62 with outPtrs[0] = 0x75319eadc600
    Core 0 Alg Process for Layer # -   63, layer type 1
    Processing Layer # -   63
    Core 0 End of Layer # -   63 with outPtrs[0] = 0x75319ead3200
    Core 0 Alg Process for Layer # -   64, layer type 1
    Processing Layer # -   64
    Core 0 End of Layer # -   64 with outPtrs[0] = 0x75319eadf280
    Core 0 Alg Process for Layer # -   67, layer type 29
    Processing Layer # -   67
    Core 0 End of Layer # -   67 with outPtrs[0] = 0x75319ead326f
    Core 0 Alg Process for Layer # -   70, layer type 11
    Processing Layer # -   70
    Core 0 End of Layer # -   70 with outPtrs[0] = 0x75319eae72de
    Core 0 Alg Process for Layer # -   73, layer type 29
    Processing Layer # -   73
    Core 0 End of Layer # -   73 with outPtrs[0] = 0x75319ead3200
    Core 0 Alg Process for Layer # -   79, layer type 12
    Processing Layer # -   79
    Core 0 End of Layer # -   79 with outPtrs[0] = 0x75319eae7280
    Core 0 Alg Process for Layer # -   80, layer type 1
    Processing Layer # -   80
    Core 0 End of Layer # -   80 with outPtrs[0] = 0x75319ead3200
    Core 0 Alg Process for Layer # -   81, layer type 1
    Processing Layer # -   81
    Core 0 End of Layer # -   81 with outPtrs[0] = 0x75319eadd340
    Core 0 Alg Process for Layer # -   83, layer type 29
    Processing Layer # -   83
    Core 0 End of Layer # -   83 with outPtrs[0] = 0x75319ead3261
    Core 0 Alg Process for Layer # -   85, layer type 11
    Processing Layer # -   85
    Core 0 End of Layer # -   85 with outPtrs[0] = 0x75319eb0a2c2
    Core 0 Alg Process for Layer # -   87, layer type 29
    Processing Layer # -   87
    Core 0 End of Layer # -   87 with outPtrs[0] = 0x75319eaf0200
    Core 0 Alg Process for Layer # -   56, layer type 1
    Processing Layer # -   56
    Core 0 End of Layer # -   56 with outPtrs[0] = 0x75319eac92c0
    Core 0 Alg Process for Layer # -   66, layer type 29
    Processing Layer # -   66
    Core 0 End of Layer # -   66 with outPtrs[0] = 0x75319eabf1e1
    Core 0 Alg Process for Layer # -   69, layer type 11
    Processing Layer # -   69
    Core 0 End of Layer # -   69 with outPtrs[0] = 0x75319eb0a2c2
    Core 0 Alg Process for Layer # -   72, layer type 29
    Processing Layer # -   72
    Core 0 End of Layer # -   72 with outPtrs[0] = 0x75319eabf180
    Core 0 Alg Process for Layer # -   76, layer type 12
    Processing Layer # -   76
    Core 0 End of Layer # -   76 with outPtrs[0] = 0x75319eb24300
    Core 0 Alg Process for Layer # -   77, layer type 1
    Processing Layer # -   77
    Core 0 End of Layer # -   77 with outPtrs[0] = 0x75319eb0a280
    Core 0 Alg Process for Layer # -   78, layer type 1
    Processing Layer # -   78
    Core 0 End of Layer # -   78 with outPtrs[0] = 0x75319eace200
    Core 0 Alg Process for Layer # -   82, layer type 29
    Processing Layer # -   82
    Core 0 End of Layer # -   82 with outPtrs[0] = 0x75319eabf1c5
    Core 0 Alg Process for Layer # -   84, layer type 11
    Processing Layer # -   84
    Core 0 End of Layer # -   84 with outPtrs[0] = 0x75319eb2430a
    Core 0 Alg Process for Layer # -   86, layer type 29
    Processing Layer # -   86
    Core 0 End of Layer # -   86 with outPtrs[0] = 0x75319eabf180
    Core 0 Alg Process for Layer # -   90, layer type 12
    Processing Layer # -   90
    Core 0 End of Layer # -   90 with outPtrs[0] = 0x75319eb0a280
    Core 0 Alg Process for Layer # -   91, layer type 1
    Processing Layer # -   91
    Core 0 End of Layer # -   91 with outPtrs[0] = 0x75319eaf0200
    Core 0 Alg Process for Layer # -   92, layer type 1
    Processing Layer # -   92
    Core 0 End of Layer # -   92 with outPtrs[0] = 0x75319eaff280
    Core 0 Alg Process for Layer # -   93, layer type 29
    Processing Layer # -   93
    Core 0 End of Layer # -   93 with outPtrs[0] = 0x75319eaf0245
    Core 0 Alg Process for Layer # -   94, layer type 11
    Processing Layer # -   94
    Core 0 End of Layer # -   94 with outPtrs[0] = 0x75319eb2128a
    Core 0 Alg Process for Layer # -   95, layer type 29
    Processing Layer # -   95
    Core 0 End of Layer # -   95 with outPtrs[0] = 0x75319eaf0200
    Core 0 Alg Process for Layer # -   36, layer type 1
    Processing Layer # -   36
    Core 0 End of Layer # -   36 with outPtrs[0] = 0x75319eb21280
    Core 0 Alg Process for Layer # -   65, layer type 29
    Processing Layer # -   65
    Core 0 End of Layer # -   65 with outPtrs[0] = 0x75319eaa5145
    Core 0 Alg Process for Layer # -   68, layer type 11
    Processing Layer # -   68
    Core 0 End of Layer # -   68 with outPtrs[0] = 0x75319eb2128a
    Core 0 Alg Process for Layer # -   71, layer type 29
    Processing Layer # -   71
    Core 0 End of Layer # -   71 with outPtrs[0] = 0x75319eb58300
    Core 0 Alg Process for Layer # -   74, layer type 12
    Processing Layer # -   74
    Core 0 End of Layer # -   74 with outPtrs[0] = 0x75319eb89380
    Core 0 Alg Process for Layer # -   75, layer type 1
    Processing Layer # -   75
    Core 0 End of Layer # -   75 with outPtrs[0] = 0x75319ea74080
    Core 0 Alg Process for Layer # -   88, layer type 12
    Processing Layer # -   88
    Core 0 End of Layer # -   88 with outPtrs[0] = 0x75319eb21280
    Core 0 Alg Process for Layer # -   89, layer type 1
    Processing Layer # -   89
    Core 0 End of Layer # -   89 with outPtrs[0] = 0x75319ea74080
    Core 0 Alg Process for Layer # -   96, layer type 12
    Processing Layer # -   96
    Core 0 End of Layer # -   96 with outPtrs[0] = 0x75319ea74080
    Core 0 Alg Process for Layer # -   97, layer type 1
    Processing Layer # -   97
    Core 0 End of Layer # -   97 with outPtrs[0] = 0x75319ea74080
    Core 0 Alg Process for Layer # -   98, layer type 1
    Processing Layer # -   98
    Core 0 End of Layer # -   98 with outPtrs[0] = 0x75319eaa5100
    Core 0 Alg Process for Layer # -  101, layer type 1
    Processing Layer # -  101
    Core 0 End of Layer # -  101 with outPtrs[0] = 0x5cf79054e880
    Core 0 Alg Process for Layer # -   99, layer type 1
    Processing Layer # -   99
    Core 0 End of Layer # -   99 with outPtrs[0] = 0x75319eaa5100
    Core 0 Alg Process for Layer # -  102, layer type 1
    Processing Layer # -  102
    Core 0 End of Layer # -  102 with outPtrs[0] = 0x5cf790549000
    Core 0 Alg Process for Layer # -  100, layer type 1
    Processing Layer # -  100
    Core 0 End of Layer # -  100 with outPtrs[0] = 0x75319ea74080
    Core 0 Alg Process for Layer # -  103, layer type 1
    Processing Layer # -  103
    Core 0 End of Layer # -  103 with outPtrs[0] = 0x5cf790522f00
    TIDL_process is completed with handle : 0x5cf79049e080 
     T    4419.25 Skipping static gen-set function
     .... ..... .../home/a0194920local/colab-notebooks/dlav0_34_241218//jet.jpeg
    895
    
     A :   895, 0.0000, 0.0000, 78791 .... .....TIDL_deactivate is called with handle : 0x5cf79049e080 - Copying handle of size 20144 from 0x75319ed46080 to 0x5cf79049e080 
    

    There were some warnings, e.g. "WARNING: Change to Upsample/Resize if possible instead of Deconvolution. It will be more efficient", but they did not appear critical.
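
    For reference only, that warning suggests replacing a deconvolution-based upsampling stage with an interpolation (Resize/Upsample) followed by a convolution. Below is a minimal PyTorch illustration of the two patterns; the channel counts and kernel sizes are made up and are not taken from the model in question.

    import torch.nn as nn

    # Deconvolution-based 2x upsampling (the pattern the warning flags)
    deconv_up = nn.ConvTranspose2d(64, 64, kernel_size=4, stride=2, padding=1)

    # Resize/Upsample-based alternative the warning recommends:
    # interpolate first, then apply a regular convolution
    resize_up = nn.Sequential(
        nn.Upsample(scale_factor=2, mode="nearest"),
        nn.Conv2d(64, 64, kernel_size=3, padding=1),
    )

    Both blocks produce a 2x spatial upsample with 64 output channels; the second is the Upsample/Resize path the warning refers to.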

    Regards,

    Chris

  • Dear Chris,

    Thanks for the response!

    We were able to convert/quantize and port the model, but because some of the parameters contain garbage values, the model output is completely different from what we get with SDK 8.2.

     

    Can you please check the paramDebug.csv file for the toy model (dlav0_34_241218) that we sent you? If you see the same problem we do, some of the deconv layers contain garbage values.

     

    If there is no problem, please send us the config settings you used. We'll take a closer look to see if there is an issue on our end.

    Thanks and Regards,

    Vyom Mishra

  • Hi Vyom,

    The values look okay to me, but I am unsure what they should look like for your application. The outputs look like a bunch of sparse matrices. I have included a zip file of the output artifacts.

    Here is how I ran it:

    /home/a0194920local/10_0/edgeai-tidl-tools/tidl_tools/tidl_model_import.out {modeldir}/config --modelType 2 \
    --inputNetFile {modelname} --outputNetFile {modeldir}/tidl_net.bin \
    --outputParamsFile {modeldir}/tidl_io_buff --inDataNorm 1 \
    --inMean 123.675 116.28 103.53 --inScale 0.017125 0.017507 0.017429 \
    --inData {modeldir}/in_data_list.txt --inFileFormat 2 \
    --tidlStatsTool /home/a0194920local/10_0/edgeai-tidl-tools/tidl_tools//PC_dsp_test_dl_algo.out \
    --perfSimTool /home/a0194920local/10_0/edgeai-tidl-tools/tidl_tools/ti_cnnperfsim.out \
    --graphVizTool /home/a0194920local/10_0/edgeai-tidl-tools/tidl_tools/tidl_graphVisualiser.out \
    --inHeight 224 --inWidth 224 --inNumChannels 3 --numFrames 1

    The config file in modeldir is:

    perfSimConfig = /home/a0194920local/10_0/edgeai-tidl-tools/tidl_tools/device_config.cfg

    The contents of /home/a0194920local/10_0/edgeai-tidl-tools/tidl_tools/device_config.cfg are just the standard am69a config file:

    # Size of L2 SRAM Memory in KB which can be used by TIDL, Recommended value is
    # 448KB considering that 64KB of L2 shall be configured as cache. TIDL test bench
    # configures L2 cache as 64 KB, so any value higher than 448 KB would require
    # user to change the L2 cache setting in TIDL test bench
    L2MEMSIZE_KB = 448
    # Size of L3 (MSMC) SRAM Memory in KB which can be used by TIDL
    MSMCSIZE_KB = 2944
    #ID for a Device, TDA4VMID = 0, TIDL_TDA4AEP = 1, TIDL_TDA4AHP = 1, TIDL_TDA4AM = 2, TIDL_TDA4AMPlus = 3
    DEVICE_NAME = 1
    ENABLE_PERSIT_WT_ALLOC = 1
    DDRFREQ_MHZ = 4266

    compiled_dlav0_34_241218.zip

    Chris

  • Dear Chris,

    Thanks for the response!

    We have checked the files you sent us and found that, within the ParamDebug.csv file, the debug values for the Deconv layers (e.g. layers 68, 69, 70, etc.) are significantly different from those of the other layers. The same issue appears in the models we use, and we believe this is why we are getting completely different model outputs compared to SDK 8.2.
    (When we converted/quantized the model in SDK 8.2 and checked the ParamDebug.csv file, we did not see these values bouncing around in the Deconv layers.)
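
    For reference, this is roughly how we compare the per-layer value ranges between the two ParamDebug.csv files. This is only a sketch: it assumes the file is a plain CSV with the layer index in the first column and numeric values in the remaining columns, and the file paths are placeholders, which may not match the actual format.

    import csv

    def layer_ranges(path):
        # Return {layer_index: (min, max)} over the numeric columns of a
        # ParamDebug.csv-style file (column layout assumed, see note above).
        ranges = {}
        with open(path) as f:
            for row in csv.reader(f):
                try:
                    layer = int(row[0])
                    vals = [float(v) for v in row[1:] if v.strip()]
                except ValueError:
                    continue  # skip header or non-numeric rows
                if vals:
                    ranges[layer] = (min(vals), max(vals))
        return ranges

    old = layer_ranges("ParamDebug_sdk82.csv")  # placeholder path (SDK 8.2 output)
    new = layer_ranges("ParamDebug_sdk92.csv")  # placeholder path (SDK 9.2 output)
    for layer in sorted(set(old) & set(new)):
        lo_o, hi_o = old[layer]
        lo_n, hi_n = new[layer]
        # Flag layers whose value range exploded between the two SDK versions
        if (hi_n - lo_n) > 10 * (hi_o - lo_o + 1e-6):
            print(f"Layer {layer}: 8.2 range {lo_o:.3f}..{hi_o:.3f}, "
                  f"9.2 range {lo_n:.3f}..{hi_n:.3f}")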

    I would like to know what is causing the values to be so different for the Deconv layers in this version compared to SDK 8.2, and whether this affects the model results.

    Thanks and Regards,

    Vyom Mishra

  • Dear Chris,

    Any update? Are you looking into this?

    Thanks and Regards,

    Vyom Mishra

  • Hi Vyom,

    I have assigned this to a person on our development team.

    Regards,
    Chris

  • Dear Chris,

    Any updates?

    Is the issue diagnosed?

    Thanks and Regards,

    Vyom Mishra

  • Dear Chris,

    Any development on this issue?

    Thanks and Regards,

    Vyom Mishra

  • Vyom,
    We're looking into this thread and will get back to you soon.

    Regards,
    Varun

  • Vyom,
    We identified an issue with the grouped deconvolution layer in our implementation, which seems to be causing this behaviour. Is it possible for you to modify your ConvTranspose blocks to be ungrouped (i.e. group = 1)? This is being tracked by us (TIDL-6889) and should be fixed in our next release (SDK 11.0, mid May).
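
    For illustration only, here is a rough PyTorch sketch of how a grouped ConvTranspose2d can be expanded into a numerically equivalent group = 1 layer before re-exporting to ONNX. This is not TIDL code; it assumes a standard nn.ConvTranspose2d whose in/out channel counts are divisible by groups.

    import torch
    import torch.nn as nn

    def ungroup_conv_transpose(layer: nn.ConvTranspose2d) -> nn.ConvTranspose2d:
        # Return a groups=1 ConvTranspose2d that computes the same output as the
        # given grouped layer, by expanding the weights block-diagonally.
        if layer.groups == 1:
            return layer
        g = layer.groups
        new = nn.ConvTranspose2d(
            layer.in_channels, layer.out_channels, layer.kernel_size,
            stride=layer.stride, padding=layer.padding,
            output_padding=layer.output_padding, groups=1,
            bias=layer.bias is not None, dilation=layer.dilation)
        with torch.no_grad():
            new.weight.zero_()
            in_per_g = layer.in_channels // g
            out_per_g = layer.out_channels // g
            for grp in range(g):
                # Copy each group's weights into its block; cross-group weights stay zero
                new.weight[grp * in_per_g:(grp + 1) * in_per_g,
                           grp * out_per_g:(grp + 1) * out_per_g] = \
                    layer.weight[grp * in_per_g:(grp + 1) * in_per_g]
            if layer.bias is not None:
                new.bias.copy_(layer.bias)
        return new

    The expanded layer produces the same outputs as the grouped one (at the cost of a larger weight tensor), so the model can be re-exported to ONNX with group = 1 ConvTranspose nodes and re-imported without retraining.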

    Regards,
    Varun

    Vyom, can this thread be closed now? We will close it at the end of the week if there is no further update. Dave C

  • Dear Dave,

    Yes, you can close the thread; we are waiting for the mid-May release.

    Thanks and Regards,

    Vyom Mishra

  • Vyom, thank you, closing as agreed, Dave C