AM67: OpenVX Support in custom Yocto image

Part Number: AM67
Other Parts Discussed in Thread: SYSBIOS

Hello, 

One of our module customers wants to use OpenVX with a custom Yocto build environment instead of the AM67 Processor SDK. Is there any guidance on how to integrate the OpenVX components into a custom build environment?

thank you and best regards 

  • Hi Tim,

    I am routing your query to our expert. Please expect a response soon.

    Best Regards,

    Suren

  • Hi Suren, 

    Do you have any update for me?

  • Hi TimL,

    For our Yocto build, the TI OpenVX components come via the ti-vision-apps.bb recipe [0]. The resulting libtivision_apps.so will include the core TIOVX components/interfaces, as well as some basic test applications. The binaries for remote cores running target kernels are part of the adjacent ti-edgeai-firmware.bb [1] -- these firmwares are prebuilt and will be applicable without modification so long as you have at least 2 GB DDR.

    Thus, your module customer should be able to use these two recipes from meta-edgeai to get the core components for TIOVX.

    The ti-tidl [3] recipes will be necessary as well if their OpenVX application plans to leverage AI acceleration with TIDL on the C7xMMA accelerator/NPU.

    There are additional components for user-level applications [2]. These are not strictly necessary to create an application, but all SDK examples rely on them, e.g. edgeai-gst-apps, edgeai-tiovx-kernels, and edgeai-tiovx-modules.
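    Purely as a sketch, pulling the two core recipes into a customer image could look like the local.conf fragment below. The package names follow the recipe names discussed above; the commented ti-tidl line is an assumption (the exact packages produced by the ti-tidl recipes may differ) and is only needed for TIDL acceleration:

```conf
# local.conf sketch -- assumes meta-edgeai is already in bblayers.conf.
# ti-vision-apps     -> libtivision_apps.so plus the basic test applications
# ti-edgeai-firmware -> prebuilt remote-core firmwares
IMAGE_INSTALL:append = " ti-vision-apps ti-edgeai-firmware"

# Only if the OpenVX application uses TIDL on the C7xMMA (assumed package
# name -- check what the ti-tidl recipes actually provide):
# IMAGE_INSTALL:append = " ti-tidl"
```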

    [0] https://git.ti.com/cgit/edgeai/meta-edgeai/tree/recipes-tisdk/ti-psdk-rtos/ti-vision-apps.bb?h=scarthgap-next 

    • Versions are important here! e.g. branch=refs/tags/REL.PSDK.ANALYTICS.AM62A.11.01.00.04 in SRC_URI. These versions need to match the SDK versions used by the psdk_fw / ti-edgeai-firmware and ti-tidl [3] recipes. It is best to use an SDK installation as the basis here so that the versions are already in sync. The HEAD of some branches does not seem to be 100% consistent across all recipes (I see this on the current scarthgap-next).

    [1] https://git.ti.com/cgit/edgeai/meta-edgeai/tree/recipes-tisdk/ti-psdk-rtos/ti-edgeai-firmware.bb?h=scarthgap-next 

    [2] https://git.ti.com/cgit/edgeai/meta-edgeai/tree/recipes-tisdk/edgeai-components?h=scarthgap-next 

    [3] https://git.ti.com/cgit/edgeai/meta-edgeai/tree/recipes-tisdk/ti-tidl?h=scarthgap
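    As a quick sanity check that the pinned versions line up, something like the following lists every release tag referenced across the recipes. The layer path is an assumed example for a typical build tree -- adjust it to your workspace:

```shell
# Sketch: list the REL.PSDK.* tags pinned in the meta-edgeai recipes so any
# mismatch between ti-vision-apps, ti-edgeai-firmware and ti-tidl stands out.
# LAYER is an assumed checkout path -- adjust for your build tree.
LAYER=sources/meta-edgeai/recipes-tisdk
grep -rho 'REL\.PSDK\.[A-Z0-9._]*' "$LAYER" 2>/dev/null | sort | uniq -c
```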

    BR,
    Reese

  • Hello Reese, 

    Thank you for the reply, and sorry for the delay. I get stuck when compiling ti-vision-apps: the aarch64-oe-linux-gcc compiler is required. I have noticed that the AM67 Processor SDK has a linux-devkit folder which contains the aarch64-oe-linux-gcc binary. Is this something special from the TI Processor SDK, or can I use a generic Yocto SDK to compile ti-vision-apps?

    I have attached the build error below:   


    ERROR: ti-vision-apps-11.01.00-r0_edgeai_13 do_compile: oe_runmake failed
    ERROR: ti-vision-apps-11.01.00-r0_edgeai_13 do_compile: ExecutionError('/home/embedded/workspace/scarthgap-next/am67xx_build/tmp/work/armv8a-linux/ti-vision-apps/11.01.00/temp/run.do_compile.2026994', 1, None, None)
    ERROR: Logfile of failure stored in: /home/embedded/workspace/scarthgap-next/am67xx_build/tmp/work/armv8a-linux/ti-vision-apps/11.01.00/temp/log.do_compile.2026994
    Log data follows:
    | DEBUG: Executing shell function do_compile
    | NOTE: make -j 16 yocto_build
    | cp -Rf /home/embedded/workspace/scarthgap-next/am67xx_build/tmp/work/armv8a-linux/ti-vision-apps/11.01.00/repo/psdk_include/* /home/embedded/workspace/scarthgap-next/am67xx_build/tmp/work/armv8a-linux/ti-vision-apps/11.01.00/repo/.
    | PROFILE=release BUILD_EMULATION_MODE=no TARGET_CPU=A53 TARGET_OS=LINUX TIDL_PATH=/home/embedded/workspace/scarthgap-next/am67xx_build/tmp/work/armv8a-linux/ti-vision-apps/11.01.00/repo/tidl_j7 make app_utils
    | make[1]: Entering directory '/home/embedded/workspace/scarthgap-next/am67xx_build/tmp/work/armv8a-linux/ti-vision-apps/11.01.00/repo/sdk_builder'
    | make -C /home/embedded/workspace/scarthgap-next/am67xx_build/tmp/work/armv8a-linux/ti-vision-apps/11.01.00/repo/app_utils
    | make[2]: Entering directory '/home/embedded/workspace/scarthgap-next/am67xx_build/tmp/work/armv8a-linux/ti-vision-apps/11.01.00/repo/app_utils'
    | SHELL=/bin/sh
    | TARGET_MAKEFILES=utils/console_io/src/concerto.mak utils/file_io/src/concerto.mak utils/ipc/src/concerto.mak utils/ipc_test/src/concerto.mak utils/mem/src/concerto.mak utils/misc/src/concerto.mak utils/perf_stats/src/concerto.mak utils/remote_service/src/concerto.mak utils/rtos/src/concerto.mak utils/sciclient/src/concerto.mak utils/timer/src/concerto.mak utils/udma/src/concerto.mak
    | Keep only LINUX OS in TARGET_COMBOS
    | Keep only A53 CPU in TARGET_COMBOS
    | undefined TIARMCGT_ROOT=
    | file TIARMCGT_LLVM_ROOT=$(PSDK_TOOLS_PATH)/ti-cgt-armllvm_3.2.2.LTS
    | undefined GCC_SYSBIOS_ARM_ROOT=
    | file CGT6X_ROOT=$(PSDK_TOOLS_PATH)/ti-cgt-c6000_8.3.7
    | file CGT7X_ROOT=$(PSDK_TOOLS_PATH)/ti-cgt-c7000_5.0.0.LTS
    | undefined GCC_WINDOWS_ROOT=
    | file GCC_LINUX_ROOT=/usr/
    | environment GCC_QNX_ROOT=/home/embedded/qnx800/host/linux/x86_64/usr/bin
    | #######################################################################
    | TARGET_COMBO=J722S:LINUX:A53:1:release:GCC_LINUX_ARM
    | [GCC] Compiling C99 app_log_writer.c
    | [GCC] Compiling C99 app_log_reader.c
    | /bin/sh: 1: aarch64-oe-linux-gcc: not found
    | make[2]: *** [/home/embedded/workspace/scarthgap-next/am67xx_build/tmp/work/armv8a-linux/ti-vision-apps/11.01.00/repo/sdk_builder/concerto/finale.mak:318: /home/embedded/workspace/scarthgap-next/am67xx_build/tmp/work/armv8a-linux/ti-vision-apps/11.01.00/repo/app_utils/out/J722S/A53/LINUX/release/module/utils.console_io.src/app_log_writer.o] Error 127
    | make[2]: *** Waiting for unfinished jobs....
    | /bin/sh: 1: aarch64-oe-linux-gcc: not found
    | make[2]: Leaving directory '/home/embedded/workspace/scarthgap-next/am67xx_build/tmp/work/armv8a-linux/ti-vision-apps/11.01.00/repo/app_utils'
    | make[2]: *** [/home/embedded/workspace/scarthgap-next/am67xx_build/tmp/work/armv8a-linux/ti-vision-apps/11.01.00/repo/sdk_builder/concerto/finale.mak:318: /home/embedded/workspace/scarthgap-next/am67xx_build/tmp/work/armv8a-linux/ti-vision-apps/11.01.00/repo/app_utils/out/J722S/A53/LINUX/release/module/utils.console_io.src/app_log_reader.o] Error 127
    | make[1]: *** [makerules/makefile_tiovx_ptk_imaging.mak:8: app_utils] Error 2
    | make[1]: Leaving directory '/home/embedded/workspace/scarthgap-next/am67xx_build/tmp/work/armv8a-linux/ti-vision-apps/11.01.00/repo/sdk_builder'
    | make: *** [makerules/makefile_linux_arm.mak:347: yocto_build] Error 2
    | ERROR: oe_runmake failed
    | WARNING: exit code 1 from a shell command.
    ERROR: Task (/home/embedded/workspace/scarthgap-next/sources/meta-edgeai/recipes-tisdk/ti-psdk-rtos/ti-vision-apps.bb:do_compile) failed with exit code '1'
    NOTE: Tasks Summary: Attempted 6489 tasks of which 6483 didn't need to be rerun and 1 failed.

  • Hi Tim,

    Right, it is looking for the compiler "aarch64-oe-linux-gcc", and it seems it did not find the one it was looking for in the default paths.

    Is this sdk something special from the TI Processor SDK or can I use a generic Yocto SDK to compile the ti-vision-apps ? 

    Do you mean: is this compiler (not SDK) something special from our SDK? I expect the compiler used by default within Yocto is sufficient.

    Had any previous components within this vision_apps successfully compiled? I'm curious whether one specific subcomponent (still within the ti-vision-apps recipe) is looking for a different compiler (perhaps hardcoded / a bug).

    BR,
    Reese

  • Hi Tim,

    aarch64-oe-linux is the default cross-compiler prefix which Yocto uses to build the SDK. The linux-devkit directory you mentioned above is the toolchain that is packaged into the SDK by one of the Yocto recipes. So I think a generic Yocto build will be sufficient to build the ti-vision-apps recipe.

    May I know what command you are running to build this recipe?

    Ideally the command should be,

    MACHINE=j722s-evm bitbake -k ti-vision-apps

    Also, what is the output of the below command in your case?

    MACHINE=j722s-evm bitbake -e | grep "^TARGET_PREFIX"

    Please make sure the MACHINE variable points to j722s-evm and not am67 (since there is no such machine in the meta-ti layer).

    Regards,

    Shreyash

  • Hello Shreyash, 

    Thank you for following up. For the machine configuration I have used our custom SOM configuration from our meta-tq Yocto layer:
    https://github.com/tq-systems/meta-tq/blob/scarthgap/meta-tq/conf/machine/tqma67xx-mba67xx.conf

    Our BSP is based on the poky reference distribution, which includes meta-ti by default for the TI-module-based builds. In addition, I have added the following layers to build the BSP:

    BB_VERSION = "2.8.0"
    BUILD_SYS = "x86_64-linux"
    NATIVELSBSTRING = "universal"
    TARGET_SYS = "arm-tq-eabi"
    MACHINE = "tqma67xx-mba67xx-k3r5"
    DISTRO = "dumpling-wayland-ti"
    DISTRO_VERSION = "5.0.10"
    TUNE_FEATURES = "arm armv7a vfp thumb callconvention-hard"
    TARGET_FPU = "hard"
    meta
    meta-poky = "HEAD:ac257900c33754957b2696529682029d997a8f28"
    meta-oe
    meta-python = "HEAD:491671faee11ea131feab5a3a451d1a01deb2ab1"
    meta-arm
    meta-arm-toolchain = "HEAD:8e0f8af90fefb03f08cd2228cde7a89902a6b37c"
    meta-ti-bsp = "HEAD:8c258e731e62954ff41460febc2c036fb5ca552c"
    meta-rauc = "HEAD:a0f4a8b9986954239850b9d4256c003c91e6b931"
    meta-qt6 = "HEAD:c58fdf7af5d92f5dc0a3446a9865580511ae8691"
    meta-tq
    meta-dumpling = "HEAD:fe1884498d531d0363710fc0c98de8bc7f014885"
    meta-edgeai = "scarthgap:dd0dd867bb84e1872001cd4669fd6139b8257712"
    meta-arago-distro
    meta-arago-extras
    meta-arago-test = "scarthgap:5caa2748e530c78c316b47044da163e51ad775df"
    meta-ti-extras = "HEAD:8c258e731e62954ff41460febc2c036fb5ca552c"
    meta-ti-foundational = "scarthgap:3265fc78d1dddd1642c112a2f3f42507bfa176d9"
    meta-networking
    meta-filesystems
    meta-multimedia = "HEAD:491671faee11ea131feab5a3a451d1a01deb2ab1"
    meta-clang = "scarthgap:aef850f7fa53121c74b244b7ae40d31fb9809ccf"

    I'm building the custom image tq-image-weston-debug from our meta-dumpling layer:
    https://github.com/tq-systems/meta-tq/blob/scarthgap/meta-dumpling/recipes-images/images/tq-image-weston-debug.bb

    To initialize the build space I'm using the custom setup-environment script from ci-meta-tq:
    https://github.com/tq-systems/ci-meta-tq/blob/scarthgap/setup-environment

    I have added the ti-vision-apps and ti-edgeai-firmware packages to the image via the IMAGE_INSTALL:append variable in my local.conf:

    MACHINE ??= "tqma67xx-mba67xx"
    DISTRO ?= "dumpling-wayland-ti"
    PACKAGE_CLASSES ?= "package_rpm"
    USER_CLASSES ?= "buildstats"
    PATCHRESOLVE = "noop"
    BB_DISKMON_DIRS ??= "\
    STOPTASKS,${TMPDIR},1G,100K \
    STOPTASKS,${DL_DIR},1G,100K \
    STOPTASKS,${SSTATE_DIR},1G,100K \
    STOPTASKS,/tmp,100M,100K \
    HALT,${TMPDIR},100M,1K \
    HALT,${DL_DIR},100M,1K \
    HALT,${SSTATE_DIR},100M,1K \
    HALT,/tmp,10M,1K"
    PACKAGECONFIG:append:pn-qemu-system-native = " sdl"
    CONF_VERSION = "2"
    ACCEPT_TI_EULA = "1"
    ACCEPT_QT6_EULA = "1"
    QT_EDITION = "opensource"
    #IMAGE_INSTALL:append = " ti-vision-apps ti-edgeai-firmware"

    To build the image I'm running the command bitbake tq-image-weston-debug.
    Below you will find the output of the requested bitbake environment command:

     MACHINE=tqma67xx-mba67xx bitbake -e | grep "^TARGET_PREFIX"
    TARGET_PREFIX="aarch64-tq-linux-"  


    thank you and best regards

    Tim


  • Hi Tim,

    Can you please try changing the below line under do_compile in the ti-vision-apps recipe,

    CROSS_COMPILE_LINARO=aarch64-oe-linux-
    to 
    CROSS_COMPILE_LINARO=aarch64-tq-linux-
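    A slightly more portable variant (a sketch -- the exact do_compile wiring in the recipe may differ) would be to pass Yocto's own TARGET_PREFIX instead of hardcoding either prefix, so the fix survives a distro rename:

```conf
# Sketch for ti-vision-apps.bb do_compile: let BitBake expand the build's
# own cross prefix (aarch64-tq-linux- on this distro, aarch64-oe-linux- on
# the default Arago-based SDK) instead of a hardcoded value:
CROSS_COMPILE_LINARO=${TARGET_PREFIX}
```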

    Regards,

    Shreyash

  • Hi Shreyash,

    Thanks, that helped a lot. I had to fix several other packages, but now I have the ti-vision-apps available in /opt/:

    /opt/vision_apps# ls -al
    total 296
    drwxrwxr-x 2 root root 4096 Mar 9 2018 .
    drwxr-xr-x 5 root root 4096 Mar 9 2018 ..
    -rw-r--r-- 1 root root 345 Mar 9 2018 app_c7x.cfg
    -rwxr-xr-x 1 root root 117 Mar 9 2018 run_app_arm_fd_exchange.sh
    -rwxr-xr-x 1 root root 57 Mar 9 2018 run_app_c7x.sh
    -rw-r--r-- 1 root root 147 Mar 9 2018 run_app_heterogeneous.sh
    -rwxr-xr-x 1 root root 765 Mar 9 2018 run_app_load_test.sh
    -rwxr-xr-x 1 root root 266 Mar 9 2018 run_app_viss.sh
    -rwxr-xr-x 1 root root 518 Mar 9 2018 vision_apps_init.sh
    -rwxr-xr-x 1 root root 68072 Mar 9 2018 vx_app_arm_fd_exchange_consumer.out
    -rwxr-xr-x 1 root root 68152 Mar 9 2018 vx_app_arm_fd_exchange_producer.out
    -rwxr-xr-x 1 root root 67664 Mar 9 2018 vx_app_arm_ipc.out
    -rwxr-xr-x 1 root root 67648 Mar 9 2018 vx_app_arm_mem.out
    -rwxr-xr-x 1 root root 67640 Mar 9 2018 vx_app_arm_remote_log.out
    -rwxr-xr-x 1 root root 67920 Mar 9 2018 vx_app_c7x_kernel.out
    -rwxr-xr-x 1 root root 71112 Mar 9 2018 vx_app_conformance.out
    -rwxr-xr-x 1 root root 67632 Mar 9 2018 vx_app_conformance_core.out
    -rwxr-xr-x 1 root root 67800 Mar 9 2018 vx_app_conformance_hwa.out
    -rwxr-xr-x 1 root root 67656 Mar 9 2018 vx_app_conformance_tidl.out
    -rwxr-xr-x 1 root root 67712 Mar 9 2018 vx_app_conformance_video_io.out
    -rwxr-xr-x 1 root root 67696 Mar 9 2018 vx_app_heap_stats.out
    -rwxr-xr-x 1 root root 67696 Mar 9 2018 vx_app_load_test.out
    -rwxr-xr-x 1 root root 67952 Mar 9 2018 vx_app_viss.out 

    When I try to execute vision_apps_init.sh I get some memory mapping errors. Can you please also help me with that?

    /opt/vision_apps# ./vision_apps_init.sh
    /opt/vision_apps# 0.000000 s: APP_LOG: ERROR: Unable to map memory @ 0xa7000000 of size 262144 bytes !!!

    thank you in advance

    Tim

  • Hi Tim,

    Did you alter the memory map at all, i.e. in firmware-builder / PSDK-RTOS's vision_apps/platform/j722s/rtos? Or were you using the default configuration?

    I see references to SDK 11.1 in some of the Yocto recipes, but this SDK is not out yet. Is your Yocto setup based on a specific TI SDK, or did you grab the recipes and repos directly from the latest commit on some branch? I wonder if there is some discrepancy in versioning from grabbing latest (or at least from when you started); I see we had a commit that bumped up to the 11.1-relevant release [1] (the parent commit before it shows 11.1).

    You may be missing a .dtbo that tells Linux which DMA carveouts it should use. We've altered how this is applied somewhat over the last couple of releases, and I wonder if something is out of sync and causing the issue. Do you have a file called "k3-j722s-edgeai-apps.dtbo" or "k3-j722s-vision-apps.dtbo" in your /boot/dtb/ti/ directory? These overlays are used to set those DMA regions for Linux; the remote cores (e.g. C7x for AI acceleration) should have them baked into their firmware already.

    • This overlay would nominally be applied from uEnv.txt in your boot partition (at runtime, see /run/media/BOOT-mmcblk1p1/uEnv.txt)

    [1] git.ti.com/.../
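    On the target, the checks above can be run with something like this (paths are the defaults mentioned in the thread; the overlay names vary by SDK version, so adjust as needed):

```shell
# Sketch: look for the edgeai/vision-apps carveout overlay and check whether
# uEnv.txt references any overlay. Paths are the defaults from this thread.
DTB_DIR=/boot/dtb/ti
UENV=/run/media/BOOT-mmcblk1p1/uEnv.txt
for f in k3-j722s-edgeai-apps.dtbo k3-j722s-vision-apps.dtbo; do
    if [ -e "$DTB_DIR/$f" ]; then
        echo "found: $DTB_DIR/$f"
    fi
done
if ! grep -qi 'dtbo' "$UENV" 2>/dev/null; then
    echo "no overlay reference found in $UENV"
fi
```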