This thread has been locked.
Following the instructions in the QNX IPC documentation, I built the firmware for the remote cores and added it to the rootfs partition under /lib/firmware. Then I started the test on QNX by running ./ipc_test -s. I am getting the following output:
J7EVM@QNX:/ti_fs/tibin# ./ipc_test -s
IPC_echo_test (core : mpu1_0) .....
Creating thread 2 for Core mcu2_0
Creating thread 3 for Core mcu2_1
Creating thread 4 for Core mcu3_0
Creating thread 5 for Core mcu3_1
Creating thread 6 for Core C66X_1
Creating thread 7 for Core C66X_2
Creating thread 8 for Core C7X_1
waiting for thread 2
SendTask7: RPMessage_recv returned with code 0
SendTask3: RPMessage_recv returned with code 0
SendTask9: RPMessage_recv returned with code 0
SendTask8: RPMessage_recv returned with code 0
SendTask4: RPMessage_recv returned with code 0
SendTask7: RPMessage_recv returned with code 0
SendTask3: RPMessage_recv returned with code 0
SendTask9: RPMessage_recv returned with code 0
SendTask8: RPMessage_recv returned with code 0
SendTask4: RPMessage_recv returned with code 0
SendTask7: RPMessage_recv returned with code 0
SendTask3: RPMessage_recv returned with code 0
SendTask9: RPMessage_recv returned with code 0
SendTask8: RPMessage_recv returned with code 0
SendTask4: RPMessage_recv returned with code 0
SendTask7: RPMessage_recv returned with code 0
SendTask3: RPMessage_recv returned with code 0
SendTask9: RPMessage_recv returned with code 0
SendTask8: RPMessage_recv returned with code 0
SendTask7: RPMessage_recv returned with code 0
SendTask4: RPMessage_recv returned with code 0
SendTask3: RPMessage_recv returned with code 0
SendTask9: RPMessage_recv returned with code 0
SendTask8: RPMessage_recv returned with code 0
SendTask7: RPMessage_recv returned with code 0
SendTask4: RPMessage_recv returned with code 0
SendTask3: RPMessage_recv returned with code 0
SendTask9: RPMessage_recv returned with code 0
SendTask8: RPMessage_recv returned with code 0
SendTask7: RPMessage_recv returned with code 0
SendTask4: RPMessage_recv returned with code 0
SendTask9: RPMessage_recv returned with code 0
SendTask3: RPMessage_recv returned with code 0
SendTask8: RPMessage_recv returned with code 0
SendTask7: RPMessage_recv returned with code 0
SendTask4: RPMessage_recv returned with code 0
SendTask9: RPMessage_recv returned with code 0
SendTask3: RPMessage_recv returned with code 0
SendTask8: RPMessage_recv returned with code 0
SendTask7: RPMessage_recv returned with code 0
SendTask4: RPMessage_recv returned with code 0
SendTask9: RPMessage_recv returned with code 0
SendTask3: RPMessage_recv returned with code 0
SendTask8: RPMessage_recv returned with code 0
SendTask7: RPMessage_recv returned with code 0
SendTask4: RPMessage_recv returned with code 0
SendTask9: RPMessage_recv returned with code 0
SendTask7: mpu1_0 <--> C66X_1, Ping- 10, pong - 10 completed
SendTask3: RPMessage_recv returned with code 0
SendTask8: RPMessage_recv returned with code 0
SendTask9: mpu1_0 <--> C7X_1, Ping- 10, pong - 10 completed
SendTask3: mpu1_0 <--> mcu2_0, Ping- 10, pong - 10 completed
SendTask8: mpu1_0 <--> C66X_2, Ping- 10, pong - 10 completed
SendTask4: RPMessage_recv returned with code 0
SendTask4: mpu1_0 <--> mcu2_1, Ping- 10, pong - 10 completed
waiting for thread 3
waiting for thread 4
SendTask5: RPMessage_recv failed with code -4
SendTask6: RPMessage_recv failed with code -4
SendTask5: RPMessage_recv failed with code -4
SendTask6: RPMessage_recv failed with code -4
SendTask5: RPMessage_recv failed with code -4
SendTask6: RPMessage_recv failed with code -4
SendTask5: RPMessage_recv failed with code -4
SendTask6: RPMessage_recv failed with code -4
SendTask5: RPMessage_recv failed with code -4
SendTask6: RPMessage_recv failed with code -4
SendTask5: RPMessage_recv failed with code -4
SendTask6: RPMessage_recv failed with code -4
SendTask5: RPMessage_recv failed with code -4
SendTask6: RPMessage_recv failed with code -4
SendTask5: RPMessage_recv failed with code -4
SendTask6: RPMessage_recv failed with code -4
SendTask5: RPMessage_recv failed with code -4
SendTask6: RPMessage_recv failed with code -4
SendTask5: RPMessage_recv failed with code -4
SendTask6: RPMessage_recv failed with code -4
SendTask5: mpu1_0 <--> mcu3_0, Ping- 10, pong - 0 completed
SendTask6: mpu1_0 <--> mcu3_1, Ping- 10, pong - 0 completed
SendTask5: mpu1_0 <--> mcu3_0 failure
waiting for thread 5
SendTask6: mpu1_0 <--> mcu3_1 failure
waiting for thread 6
waiting for thread 7
waiting for thread 8
!!!Exiting the test with failures!!!
It is not clear to me what I should make of this output. What does "RPMessage_recv returned with code 0" mean? What does "Ping- 10, pong - 10 completed" mean?
Hello Deepankar,
Are all the remote core firmware images loaded properly? To confirm, can you share the full boot log, including the messages from the bootloader?
Thanks.
U-Boot SPL 2023.04-g71b8c840ca (Nov 27 2023 - 08:55:35 +0000)
SYSFW ABI: 3.1 (firmware rev 0x0009 '9.1.2--v09.01.02 (Kool Koala)')
Trying to boot from MMC2
Warning: Detected image signing certificate on GP device. Skipping certificate d
Warning: Detected image signing certificate on GP device. Skipping certificate d
Warning: Detected image signing certificate on GP device. Skipping certificate d
Warning: Detected image signing certificate on GP device. Skipping certificate d
Warning: Detected image signing certificate on GP device. Skipping certificate d
Loading Environment from nowhere... OK
Starting ATF on ARM64 core...
NOTICE:  BL31: v2.9(release):v2.9.0-614-gd7a7135d32-dirty
NOTICE:  BL31: Built : 09:34:15, Aug 24 2023
I/TC:
I/TC: OP-TEE version: 4.0.0 (gcc version 11.4.0 (GCC)) #1 Fri Oct 20 18:29:31 U4
I/TC: WARNING: This OP-TEE configuration might be insecure!
I/TC: WARNING: Please check https://optee.readthedocs.io/en/latest/architecturel
I/TC: Primary CPU initializing
I/TC: SYSFW ABI: 3.1 (firmware rev 0x0009 '9.1.2--v09.01.02 (Kool Koala)')
I/TC: HUK Initialized
I/TC: Activated SA2UL device
I/TC: Fixing SA2UL firewall owner for GP device
I/TC: Enabled firewalls for SA2UL TRNG device
I/TC: SA2UL TRNG initialized
I/TC: SA2UL Drivers initialized
I/TC: Primary CPU switching to normal world boot

U-Boot SPL 2023.04-g71b8c840ca (Nov 27 2023 - 08:55:35 +0000)
SYSFW ABI: 3.1 (firmware rev 0x0009 '9.1.2--v09.01.02 (Kool Koala)')
Trying to boot from MMC2
Warning: Detected image signing certificate on GP device. Skipping certificate d
Warning: Detected image signing certificate on GP device. Skipping certificate d

U-Boot 2023.04-g71b8c840ca (Nov 27 2023 - 08:55:35 +0000)
SoC:   J721E SR1.1 GP
Model: Texas Instruments J721E SK A72
Board: J721EX-SK rev A
DRAM:  4 GiB
Core:  121 devices, 34 uclasses, devicetree: separate
Flash: 0 Bytes
MMC:   mmc@4fb0000: 1
Loading Environment from nowhere... OK
In:    serial@2800000
Out:   serial@2800000
Err:   serial@2800000
am65_cpsw_nuss ethernet@46000000: K3 CPSW: nuss_ver: 0x6BA00101 cpsw_ver: 0x6BA0
Net:   eth0: ethernet@46000000port@1
Hit any key to stop autoboot: 0
switch to partitions #0, OK
mmc1 is current device
SD/MMC found on device 1
Failed to load 'boot.scr'
556 bytes read in 22 ms (24.4 KiB/s)
Loaded env from uEnv.txt
Importing environment from mmc1 ...
Running uenvcmd ...
k3_r5f_rproc r5f@41000000: Core 1 is already in use. No rproc commands work
Failed to load '/lib/firmware/j7-mcu-r5f0_1-fw'
127992 bytes read in 25 ms (4.9 MiB/s)
Load Remote Processor 2 with data@addr=0x82000000 127992 bytes: Success!
127992 bytes read in 25 ms (4.9 MiB/s)
Load Remote Processor 3 with data@addr=0x82000000 127992 bytes: Success!
Failed to load '/lib/firmware/j7-main-r5f1_0-fw'
Failed to load '/lib/firmware/j7-main-r5f1_1-fw'
218572 bytes read in 27 ms (7.7 MiB/s)
Load Remote Processor 6 with data@addr=0x82000000 218572 bytes: Success!
218572 bytes read in 27 ms (7.7 MiB/s)
Load Remote Processor 7 with data@addr=0x82000000 218572 bytes: Success!
10488944 bytes read in 35 ms (285.8 MiB/s)
Load Remote Processor 8 with data@addr=0x82000000 10488944 bytes: Success!
47199156 bytes read in 1066 ms (42.2 MiB/s)
## Starting application at 0x80080000 ...
MMU: 16-bit ASID 44-bit PA TCR_EL1=b5183519
ARM GIC-500 r1p1, arch v3.0 detected
gic_v3_lpi_add_entry for vectors 8192 -> 8447, Ok
gic_v3_lpi_add_entry for vectors 8448 -> 65535, Ok
No SPI intrinfo. Add default entry for 32 -> 991 vectors, Ok
LPI config table #1 @ 000000008000f000, callout vaddr: ffffff8040251000
aarch64_cpuspeed: core speed 2000
cpu0: MPIDR=80000000
cpu0: MIDR=411fd080 Cortex-A72 r1p0
cpu0: CWG=4 ERG=4 Dminline=4 Iminline=4 PIPT
cpu0: CLIDR=a200023 LoUU=1 LoC=2 LoUIS=1
cpu0: L1 Icache 48K linesz=64 set/way=256/3
cpu0: L1 Dcache 32K linesz=64 set/way=256/2
cpu0: L2 Unified 1024K linesz=64 set/way=1024/16
Enabling ITS 0
ITS queue at 0000000080020000, num slots: 256
Issue MAPC/SYNC/INVALL commands for ICID 0
update CWRITER to 0x00000060
Waiting for all commands to be processed ... Done in 1 tries
Enable LPIs in GICR_CTLR @ 0000000001900000 for CPU0
Total Available L3 cache (MSMC SRAM): 0 bytes
Loading IFS...decompressing...done
I/TC: Secondary CPU 1 initializing
I/TC: Secondary CPU 1 switching to normal world boot
aarch64_cpuspeed: core speed 2000
cpu1: MPIDR=80000001
cpu1: MIDR=411fd080 Cortex-A72 r1p0
cpu1: CWG=4 ERG=4 Dminline=4 Iminline=4 PIPT
cpu1: CLIDR=a200023 LoUU=1 LoC=2 LoUIS=1
cpu1: L1 Icache 48K linesz=64 set/way=256/3
cpu1: L1 Dcache 32K linesz=64 set/way=256/2
cpu1: L2 Unified 1024K linesz=64 set/way=1024/16
ITS 0 already Enabled
ITS queue at 0000000080020000, num slots: 256
Issue MAPC/SYNC/INVALL commands for ICID 1
update CWRITER to 0x000000c0
Waiting for all commands to be processed ... Done in 1 tries
Enable LPIs in GICR_CTLR @ 0000000001920000 for CPU1
System page at phys:0000000080023000 user:ffffff8040275000 kern:ffffff8040272000
Starting next program at vffffff8060089280
All ClockCycles offsets within tolerance
Welcome to QNX Neutrino 7.1.0 on the TI J721E Starter Kit Board!!
Starting random service ...
start serial driver
Starting MMC/SD memory card driver... SD
Starting Network driver...
Starting Flash driver...
Starting XHCI driver on USB3SS0 and USB3SS1
Setting environment variables... done..
Mounting the sd ..
Looking for user script to run: /ti_fs/scripts/user.sh
Running user script...
user.sh called...
Setting additional environment variables...
Starting tisci-mgr..
Initializing sciclient in interupt mode
Starting shmemallocator..
Starting tiipc-mgr..
Starting tiudma-mgr..
Starting VXE-ENC resource manager...
Start screen..
screen started with dss_on_r5 configuration..
Starting sshd
done...
J7EVM@QNX:/#
I see that some cores were not loaded. I updated the rootfs folder again with the new firmware, and here is what I get now. It seems to work now.
What does "RPMessage_recv returned with code 0" mean? Where can I read more about the RPMessage API and its source code?
J7EVM@QNX:/ti_fs/tibin# ./ipc_test -s
IPC_echo_test (core : mpu1_0) .....
Creating thread 2 for Core mcu2_0
Creating thread 3 for Core mcu2_1
Creating thread 4 for Core mcu3_0
Creating thread 5 for Core mcu3_1
Creating thread 6 for Core C66X_1
Creating thread 7 for Core C66X_2
Creating thread 8 for Core C7X_1
waiting for thread 2
SendTask3: RPMessage_recv returned with code 0
SendTask4: RPMessage_recv returned with code 0
SendTask7: RPMessage_recv returned with code 0
SendTask5: RPMessage_recv returned with code 0
SendTask9: RPMessage_recv returned with code 0
SendTask8: RPMessage_recv returned with code 0
SendTask6: RPMessage_recv returned with code 0
SendTask3: RPMessage_recv returned with code 0
SendTask4: RPMessage_recv returned with code 0
SendTask7: RPMessage_recv returned with code 0
SendTask5: RPMessage_recv returned with code 0
SendTask9: RPMessage_recv returned with code 0
SendTask8: RPMessage_recv returned with code 0
SendTask6: RPMessage_recv returned with code 0
SendTask3: RPMessage_recv returned with code 0
SendTask4: RPMessage_recv returned with code 0
SendTask7: RPMessage_recv returned with code 0
SendTask5: RPMessage_recv returned with code 0
SendTask9: RPMessage_recv returned with code 0
SendTask8: RPMessage_recv returned with code 0
SendTask6: RPMessage_recv returned with code 0
SendTask3: RPMessage_recv returned with code 0
SendTask4: RPMessage_recv returned with code 0
SendTask7: RPMessage_recv returned with code 0
SendTask5: RPMessage_recv returned with code 0
SendTask9: RPMessage_recv returned with code 0
SendTask8: RPMessage_recv returned with code 0
SendTask6: RPMessage_recv returned with code 0
SendTask3: RPMessage_recv returned with code 0
SendTask4: RPMessage_recv returned with code 0
SendTask7: RPMessage_recv returned with code 0
SendTask9: RPMessage_recv returned with code 0
SendTask5: RPMessage_recv returned with code 0
SendTask8: RPMessage_recv returned with code 0
SendTask6: RPMessage_recv returned with code 0
SendTask3: RPMessage_recv returned with code 0
SendTask4: RPMessage_recv returned with code 0
SendTask7: RPMessage_recv returned with code 0
SendTask9: RPMessage_recv returned with code 0
SendTask5: RPMessage_recv returned with code 0
SendTask8: RPMessage_recv returned with code 0
SendTask6: RPMessage_recv returned with code 0
SendTask3: RPMessage_recv returned with code 0
SendTask4: RPMessage_recv returned with code 0
SendTask7: RPMessage_recv returned with code 0
SendTask9: RPMessage_recv returned with code 0
SendTask8: RPMessage_recv returned with code 0
SendTask5: RPMessage_recv returned with code 0
SendTask6: RPMessage_recv returned with code 0
SendTask3: RPMessage_recv returned with code 0
SendTask7: RPMessage_recv returned with code 0
SendTask4: RPMessage_recv returned with code 0
SendTask9: RPMessage_recv returned with code 0
SendTask8: RPMessage_recv returned with code 0
SendTask5: RPMessage_recv returned with code 0
SendTask6: RPMessage_recv returned with code 0
SendTask3: RPMessage_recv returned with code 0
SendTask7: RPMessage_recv returned with code 0
SendTask4: RPMessage_recv returned with code 0
SendTask9: RPMessage_recv returned with code 0
SendTask8: RPMessage_recv returned with code 0
SendTask5: RPMessage_recv returned with code 0
SendTask6: RPMessage_recv returned with code 0
SendTask3: RPMessage_recv returned with code 0
SendTask7: RPMessage_recv returned with code 0
SendTask4: RPMessage_recv returned with code 0
SendTask9: RPMessage_recv returned with code 0
SendTask8: RPMessage_recv returned with code 0
SendTask3: mpu1_0 <--> mcu2_0, Ping- 10, pong - 10 completed
SendTask5: RPMessage_recv returned with code 0
SendTask7: mpu1_0 <--> C66X_1, Ping- 10, pong - 10 completed
SendTask6: RPMessage_recv returned with code 0
SendTask4: mpu1_0 <--> mcu2_1, Ping- 10, pong - 10 completed
SendTask9: mpu1_0 <--> C7X_1, Ping- 10, pong - 10 completed
SendTask8: mpu1_0 <--> C66X_2, Ping- 10, pong - 10 completed
SendTask5: mpu1_0 <--> mcu3_0, Ping- 10, pong - 10 completed
waiting for thread 3
SendTask6: mpu1_0 <--> mcu3_1, Ping- 10, pong - 10 completed
waiting for thread 4
waiting for thread 5
waiting for thread 6
waiting for thread 7
waiting for thread 8
Exiting the test successfully!!
Hello Deepankar,
The engineer is out of the office until 5/8; please expect a delay of ~1-2 days.
-Josue
Where can I see the source code for the tiipc-mgr and shmemallocator binaries?
The source code for the tiipc-mgr and sharedmemallocator modules can be found under the SDK install path:
<PSDK_INSTALL_PATH>/psdkqa/qnx/resmgr/ipc_qnx_rsmgr/resmgr
<PSDK_INSTALL_PATH>/psdkqa/qnx/sharedmemallocator
Thanks.