TMS320F28P550SG: TI NNC for F28P55

Part Number: TMS320F28P550SG
Other Parts Discussed in Thread: C2000WARE, TI-CGT

Hello TI E2E Team,

I’m trying to compile and offload a quantized ONNX model to a TI C2000 F28P55 MCU using the on-chip NPU (“ti-npu”) via Apache TVM’s TINIE backend. I’ve spent several days on both Windows (Git Bash) and WSL2 but keep running into blockers. Below is a concise, plain-text summary of my setup, steps taken, errors encountered, and specific questions. Any guidance or example workflow would be greatly appreciated.

  1. Hardware & SDK

    • MCU: C2000 F28P55 with embedded DSP/NPU

    • C2000Ware installed under C:\ti\C2000Ware_5_04_00_00

    • TI-CGT c2000 v22.6.2 installed under C:\ti\ti_cgt_c2000_22.6.2.LTS

  2. ONNX Model

    • model_int8.onnx (opset 11)

    • Input shape [1,1,1,256], output shape [1,2]

    • Validated with onnx.checker; onnxruntime cannot run ConvInteger nodes (expected)
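
    • For reference, the validation step was roughly the following (the print is only there to confirm the [1,1,1,256] input shape):

        import onnx

        model = onnx.load("model_int8.onnx")
        onnx.checker.check_model(model)   # raises if the model is malformed; passes here
        print(model.graph.input[0])       # shows the [1,1,1,256] input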

  3. Build Environment

    • Windows 10 Git Bash and Ubuntu 22.04 WSL2

    • Python 3.10 venv with cython, psutil, typing_extensions, numpy 1.23.5 installed

    • Cloned apache/tvm (dev tip), attempted CMake with:
      • -DUSE_LLVM=ON
      • -DUSE_TINIE=ON
      • -DUSE_TVMC=ON
      but saw “USE_TINIE not used” warnings from CMake, and hit Cython errors until I manually installed Cython

  4. Python Binding Problems

    • “pip install -e python/” repeatedly fails to locate libtvm.so / libtvm_runtime.so, even with LD_LIBRARY_PATH (WSL) or TVM_LIBRARY_PATH (Windows) set

    • Installing the ti_mcu_nnc-1.3.0 wheel on Windows succeeds, but no nnc.exe appears and “which nnc” returns nothing
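
    • As a sanity check on the missing-library issue, a minimal way to confirm whether libtvm.so is even loadable from a given directory (the fallback path below is only an example from my layout, not a required location):

        import ctypes, os

        lib_dir = os.environ.get("TVM_LIBRARY_PATH") or os.path.expanduser("~/tvm/build")
        ctypes.CDLL(os.path.join(lib_dir, "libtvm.so"))   # raises OSError if the library cannot be loaded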

  5. tvmc compile Attempts & Errors

    a) Missing ONNX frontend:
       “Package ‘onnx’ is not installed. Hint: pip install tlcpack[tvmc]” (tlcpack[tvmc] isn’t published)

    b) CPU fallback:
       tvmc compile --target="c" …
       AssertionError: Target triple should not be empty

    c) TI-NPU partition crash:
       AttributeError: <class 'tvm.ir.type.TupleType'> has no attribute dtype
       (deep in relay/backend/contrib/tinie/prepare.py during ConvInteger padding)
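
    As far as I understand, the ONNX import that tvmc performs under the hood boils down to TVM's Relay frontend, roughly as below; the input tensor name "input" is a placeholder, not necessarily what my model actually uses:

        import onnx
        from tvm import relay

        onnx_model = onnx.load("model_int8.onnx")
        mod, params = relay.frontend.from_onnx(onnx_model, shape={"input": (1, 1, 1, 256)})
        print(mod)   # Relay IR of the imported graph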

  6. Key Questions

    1. What exact CMake flags and TVM_TARGET syntax enable TINIE + TVMC for “c;ti-npu” under WSL2 and Windows?

    2. How do I correctly link libtvm.so/libtvm_runtime.so so that “pip install -e python/” succeeds?

    3. Is there a standalone NNC workflow (nnc.exe) I can install on Git Bash for Windows without rebuilding TVM from source?

    4. What is the precise tvmc compile command to produce a .a library for F28P55 + TI NPU, including target triple, target-host, cross-compiler options, etc.?

    5. Any known patch or workaround for the TupleType.dtype AttributeError in the TINIE padding hoisting pass?

Thank you for your time and any pointers to official examples or scripts for ONNX→TVM→C2000 F28P55 + TI NPU integration.

Best regards,
Junsuk
MS/PhD candidate

  • Hi Junsuk,

        Please refer to our NNC compiler user's guide: https://software-dl.ti.com/mctools/nnc/mcu/users_guide/

        We also have ModelMaker, which can help you apply TI-NPU QAT training and includes some generic model examples: https://github.com/TexasInstruments/tinyml-tensorlab (this should probably be your entry point rather than using NNC directly)

        Some detailed answers to your list:

    2. Please retrain your model with input shape [1,1,256,1] and use 2D conv kernels shaped [kernel_height, 1] (see the sketch after this list)

    3. We do not support building TVM from source; please use our distribution

    4. Run "which tvmc" as specified in the user's guide

    5.a pip install onnx
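
        For item 2 above, a minimal PyTorch sketch of a layer matching that layout (assuming standard NCHW tensors; the channel count and kernel height below are placeholders, not values from your model):

            import torch
            import torch.nn as nn

            x = torch.randn(1, 1, 256, 1)                               # [batch, channels, height=256, width=1]
            conv = nn.Conv2d(1, 8, kernel_size=(3, 1), padding=(1, 0))  # [kernel_height, 1] kernel
            print(conv(x).shape)                                        # torch.Size([1, 8, 256, 1])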

        Just curious, which TI documentation did you follow when you started this exploration? We definitely need to point users to the correct starting places. Thanks!

    -Yuan