
SysLink error using M3 firmware while streaming live video and DVR



Has anyone experienced this issue using the M3 firmware, and does anyone know why it happens or have a resolution for it?  I found a similar post, but they were running on the DSP instead of the M3.

http://e2e.ti.com/support/embedded/bios/f/355/p/250552/878377.aspx

Assertion at Line no: 380 in /home/builder/autobuilds/default/openembed/touchlink.1-tmp/work/touchlink-none-linux-gnueabi/ti-syslink-2_21_01_05-r26j/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/syslink/utils/hlos/knl/ResTrack.c: (elem == NULL) : failed

Assertion at Line no: 380 in /home/builder/autobuilds/default/openembed/touchlink.1-tmp/work/touchlink-none-linux-gnueabi/ti-syslink-2_21_01_05-r26j/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/syslink/utils/hlos/knl/ResTrack.c: (elem == NULL) : failed

Thanks,

Johny.

  • Hi Johny,

    That assertion means that a MessageQ or HeapBufMP instance has not been deleted, or that an event was not
    unregistered.  Are you creating any of these objects on the M3 without cleaning them up?

    Best regards,

        Janet
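
    The cleanup described above can be sketched roughly as follows. This is a hypothetical C fragment against the SysLink/IPC headers from the EZSDK, not code from this thread; the handle names, heap id, and event number are made-up placeholders:

    ```c
    /* Hypothetical teardown matching the rule above: every MessageQ and
     * HeapBufMP created on a core must be deleted, and every Notify event
     * unregistered, before Ipc_detach()/shutdown, or ResTrack asserts.
     * All identifiers here are illustrative placeholders. */
    #include <ti/ipc/MessageQ.h>
    #include <ti/ipc/HeapBufMP.h>
    #include <ti/ipc/Notify.h>

    extern Void myNotifyCallback(UInt16 procId, UInt16 lineId,
                                 UInt32 eventId, UArg arg, UInt32 payload);

    Void teardownIpcObjects(MessageQ_Handle msgQ, HeapBufMP_Handle heap,
                            UInt16 remoteProcId)
    {
        MessageQ_delete(&msgQ);      /* undo the MessageQ_create()  */
        HeapBufMP_delete(&heap);     /* undo the HeapBufMP_create() */

        /* Undo Notify_registerEvent(); line 0 and event 5 are assumed. */
        Notify_unregisterEvent(remoteProcId, 0, 5u, myNotifyCallback, NULL);
    }
    ```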

  • Hi Janet,

    Thanks for your quick reply.  We are not doing anything inside the M3 as far as I know.  We just load the M3 firmware and use it as is.

    Do we have to patch the M3 firmware to handle the things you mentioned?  According to the DSP post:

    “Also, in your DSP side code, you need to register the HeapBufMP heap with MessageQ so that you can allocate the
    message. You would need to call HeapBufMP_open() to get the heap handle, and then MessageQ_registerHeap(),
    passing the heap id, 1, and the handle returned from HeapBufMP_open().”

    Do we have to do that inside the M3 firmware code?

    Thanks,

    Johny.
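
    The registration the quoted post describes can be sketched roughly as follows. This is a hypothetical C fragment against the SysLink/IPC headers; the heap name "myHeap" and heap id 1 are placeholders taken from the quoted advice, not values from this thread:

    ```c
    /* Open a shared heap created by the other core and register it with
     * MessageQ under heap id 1, as the quoted DSP-side advice describes.
     * The heap name and id are hypothetical. */
    #include <ti/ipc/HeapBufMP.h>
    #include <ti/ipc/MessageQ.h>

    Void registerMessageHeap(Void)
    {
        HeapBufMP_Handle heapHandle;
        Int status;

        /* Retry until the creator core has actually created the heap. */
        do {
            status = HeapBufMP_open("myHeap", &heapHandle);
        } while (status == HeapBufMP_E_NOTFOUND);

        /* Allow MessageQ_alloc(1, size) to allocate from this heap. */
        MessageQ_registerHeap((Ptr)heapHandle, 1);
    }
    ```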

  • Hi Janet,

    Just to verify: we are using VPSS, OMX, and GStreamer to stream live video.

    We load the following firmware images on boot: dm814x_hdvicp.xem3 and dm814x_hdvpss.xem3.

    I hope that may give some more info.

    Thanks,

    Johny.

  • Hi Johny,

    Are you using an SDK?  If so, which version?

    Thanks,

        Janet

  • Hi Janet,

    We are using TI EZSDK 5.05.02 release.

    Thanks,

    Johny.

  • More info with trace turned on: 

    What we use for bootcmd: 

    console=ttyO0,115200n8 rootwait root=/dev/mmcblk0p5 rw mem=364M@0x80000000 mem=320M@0x9FC00000 vmalloc=500M noinitrd notifyk.vpssm3_sva=0xbf900000 vram=16M wdt_enabled_on_boot omap_wdt.timer_margin=360 no_console_suspend

    SysLink version : 2.21.01.05

    SysLink module created on Date:Apr  1 2013 Time:13:26:17
    Trace enabled
    Trace SetFailureReason enabled
    Trace class 3
    MemoryOS_map: pa=0x480ca800, va=0xfa0ca800, sz=0x1000
    FIRMWARE: Memory map bin file not passed
    Usage : firmware_loader <Processor Id> <Location of Firmware> <start|stop> [-mmap <memory_map_file>] [-i2c <0|1>]
    ===Mandatory arguments=== 
    <Processor Id>         0: DSP, 1: Video-M3, 2: Vpss-M3 
    <Location of Firmware> firmware binary file 
    <start|stop>           to start/stop the firmware
    ===Optional arguments=== 
    -mmap            input memory map bin file name 
    -i2c             0: i2c init not done by M3, 1(default): i2c init done by M3 
    FIRMWARE: isI2cInitRequiredOnM3: 0
    FIRMWARE: Default memory configuration is used
    Firmware Loader debugging not configured
    Default FL_DEBUG: warning
    Allowed FL_DEBUG levels: error, warning, info, debug, log
    MemCfg: DCMM (Dynamically Configurable Memory Map) Version :  2.1.2.1
    NameServer Module already initialized!
    SharedRegion Module already initialized!
    GateMP Module already initialized!
    MessageQ Module already initialized!
    HeapBufMP Module already initialized!
    HeapMemMP Module already initialized!
    ListMP Module already initialized!
    ClientNotifyMgr Module already initialized!
    FrameQBufMgr Module already initialized!
    FrameQ Module already initialized!
        ProcMgr_getProcInfo: bootMode: [0]
    MemoryOS_map: pa=0x48180000, va=0xfa180000, sz=0x2fff
    MemoryOS_map: pa=0x55080000, va=0xf9080000, sz=0xfff
    MemoryOS_map: pa=0x55020000, va=0xf9020000, sz=0x8
    DM8168VIDEOM3PROC_attach: Mapping memory regions
    MemoryOS_map: pa=0x55020004, va=0xf9020004, sz=0x4
    MemoryOS_map: entry already exists
        mapInfo->src  [0x48180000]
        mapInfo->dst  [0xfa180000]
        mapInfo->size [0x2fff]
    DM8168VIDEOM3PROC_attach: slave is now in reset
    MemoryOS_map: pa=0x55020000, va=0xf9020000, sz=0x4000
    MemoryOS_map: pa=0x55024000, va=0xf9024000, sz=0xc000
    MemoryOS_map: pa=0x40300000, va=0xd9080000, sz=0x40000
    _ProcMgr_map for SlaveVirt:
        dstAddr       [0x300000]
        sgList.paddr  [0x40300000]
        sgList.offset [0x0]
        sgList.size [0x40000]
    DM8168VIDEOM3PROC_map: found static entry: [2] sva=0x300000, mpa=0x40300000 size=0x40000
    MemoryOS_map: pa=0x40400000, va=0xd9100000, sz=0x40000
    _ProcMgr_map for SlaveVirt:
        dstAddr       [0x400000]
        sgList.paddr  [0x40400000]
        sgList.offset [0x0]
        sgList.size [0x40000]
    DM8168VIDEOM3PROC_map: found static entry: [3] sva=0x400000, mpa=0x40400000 size=0x40000
        ProcMgr_getProcInfo: bootMode: [0]
    OsalDrv_mmap(): setting cache disabled for physical address 55020000
    OsalDrv_mmap(): setting cache disabled for physical address 55024000
    OsalDrv_mmap(): setting cache disabled for physical address 40300000
    OsalDrv_mmap(): setting cache disabled for physical address 40400000
    DLOAD: ELF: ELF
    DLOAD: ELF file header entry point: 9e3c2481
    target_address=0x00000000
    memsz_in_bytes=0x3c
    objsz_in_bytes=0x3c
    DM8168VIDEOM3PROC_translate: translated [0] srcAddr=0x0 to dstAddr=0x55020000
    translated 0x00000000 (sva) to 0x55020000 (mpa)
    MemoryOS_map: pa=0x55020000, va=0xf9020000, sz=0x3c
    _ProcMgr_map for SlaveVirt:
        dstAddr       [0x0]
        sgList.paddr  [0x55020000]
        sgList.offset [0x0]
        sgList.size [0x3c]
    DM8168VIDEOM3PROC_map: found static entry: [0] sva=0x0, mpa=0x55020000 size=0x4000
    target_address=0x00000400
    memsz_in_bytes=0x140
    objsz_in_bytes=0x140
    DM8168VIDEOM3PROC_translate: translated [0] srcAddr=0x400 to dstAddr=0x55020400
    translated 0x00000400 (sva) to 0x55020400 (mpa)
    MemoryOS_map: pa=0x55020400, va=0xf9020400, sz=0x140
    _ProcMgr_map for SlaveVirt:
        dstAddr       [0x0]
        sgList.paddr  [0x55020000]
        sgList.offset [0x400]
        sgList.size [0x540]
    DM8168VIDEOM3PROC_map: found static entry: [0] sva=0x0, mpa=0x55020000 size=0x4000
    target_address=0x000007f0
    memsz_in_bytes=0x10
    objsz_in_bytes=0x10
    DM8168VIDEOM3PROC_translate: translated [0] srcAddr=0x7f0 to dstAddr=0x550207f0
    translated 0x000007f0 (sva) to 0x550207f0 (mpa)
    MemoryOS_map: pa=0x550207f0, va=0xf90207f0, sz=0x10
    _ProcMgr_map for SlaveVirt:
        dstAddr       [0x0]
        sgList.paddr  [0x55020000]
        sgList.offset [0x7f0]
        sgList.size [0x800]
    DM8168VIDEOM3PROC_map: found static entry: [0] sva=0x0, mpa=0x55020000 size=0x4000
    target_address=0x9dd00000
    memsz_in_bytes=0x2c4f91
    objsz_in_bytes=0x2c4f91
    DM8168VIDEO3PROC_translate: (default) srcAddr=0x9dd00000 to dstAddr=0x9dd00000
    translated 0x9dd00000 (sva) to 0x9dd00000 (mpa)
    MemoryOS_map: pa=0x9dd00000, va=0xd9400000, sz=0x2c4f91
    _ProcMgr_map for SlaveVirt:
        dstAddr       [0x9dd00000]
        sgList.paddr  [0x9dd00000]
        sgList.offset [0x0]
        sgList.size [0x2c4f91]
    DM8168VIDEOM3PROC_map: adding dynamic entry: [4] sva=0x9dd00000, mpa=0x9dd00000, size=0x2c4f91
    target_address=0x9dfc4f98
    memsz_in_bytes=0x291be4
    objsz_in_bytes=0x291be4
    DM8168VIDEO3PROC_translate: (default) srcAddr=0x9dfc4f98 to dstAddr=0x9dfc4f98
    translated 0x9dfc4f98 (sva) to 0x9dfc4f98 (mpa)
    MemoryOS_map: pa=0x9dfc4f98, va=0xd9800f98, sz=0x291be4
    _ProcMgr_map for SlaveVirt:
        dstAddr       [0x9dfc4000]
        sgList.paddr  [0x9dfc4000]
        sgList.offset [0xf98]
        sgList.size [0x292b7c]
    DM8168VIDEOM3PROC_map: adding dynamic entry: [5] sva=0x9dfc4000, mpa=0x9dfc4000, size=0x292b7c
    target_address=0x9e256b80
    memsz_in_bytes=0x34
    objsz_in_bytes=0x34
    DM8168VIDEO3PROC_translate: (default) srcAddr=0x9e256b80 to dstAddr=0x9e256b80
    translated 0x9e256b80 (sva) to 0x9e256b80 (mpa)
    MemoryOS_map: pa=0x9e256b80, va=0xd9160b80, sz=0x34
    _ProcMgr_map for SlaveVirt:
        dstAddr       [0x9e256000]
        sgList.paddr  [0x9e256000]
        sgList.offset [0xb80]
        sgList.size [0xbb4]
    DM8168VIDEOM3PROC_map: adding dynamic entry: [6] sva=0x9e256000, mpa=0x9e256000, size=0xbb4
    target_address=0x9e258000
    memsz_in_bytes=0x75038
    objsz_in_bytes=0x75038
    DM8168VIDEO3PROC_translate: (default) srcAddr=0x9e258000 to dstAddr=0x9e258000
    translated 0x9e258000 (sva) to 0x9e258000 (mpa)
    MemoryOS_map: pa=0x9e258000, va=0xd9180000, sz=0x75038
    _ProcMgr_map for SlaveVirt:
        dstAddr       [0x9e258000]
        sgList.paddr  [0x9e258000]
        sgList.offset [0x0]
        sgList.size [0x75038]
    DM8168VIDEOM3PROC_map: adding dynamic entry: [7] sva=0x9e258000, mpa=0x9e258000, size=0x75038
    target_address=0x9e330000
    memsz_in_bytes=0x986f6
    objsz_in_bytes=0x986f6
    DM8168VIDEO3PROC_translate: (default) srcAddr=0x9e330000 to dstAddr=0x9e330000
    translated 0x9e330000 (sva) to 0x9e330000 (mpa)
    MemoryOS_map: pa=0x9e330000, va=0xd9200000, sz=0x986f6
    _ProcMgr_map for SlaveVirt:
        dstAddr       [0x9e330000]
        sgList.paddr  [0x9e330000]
        sgList.offset [0x0]
        sgList.size [0x986f6]
    DM8168VIDEOM3PROC_map: adding dynamic entry: [8] sva=0x9e330000, mpa=0x9e330000, size=0x986f6
    target_address=0x9e510000
    memsz_in_bytes=0xed000
    objsz_in_bytes=0xed000
    DM8168VIDEO3PROC_translate: (default) srcAddr=0x9e510000 to dstAddr=0x9e510000
    translated 0x9e510000 (sva) to 0x9e510000 (mpa)
    MemoryOS_map: pa=0x9e510000, va=0xd9300000, sz=0xed000
    _ProcMgr_map for SlaveVirt:
        dstAddr       [0x9e510000]
        sgList.paddr  [0x9e510000]
        sgList.offset [0x0]
        sgList.size [0xed000]
    DM8168VIDEOM3PROC_map: adding dynamic entry: [9] sva=0x9e510000, mpa=0x9e510000, size=0xed000
    target_address=0xbffff000
    memsz_in_bytes=0x4
    objsz_in_bytes=0x4
    DM8168VIDEO3PROC_translate: (default) srcAddr=0xbffff000 to dstAddr=0xbffff000
    translated 0xbffff000 (sva) to 0xbffff000 (mpa)
    MemoryOS_map: pa=0xbffff000, va=0xd9170000, sz=0x4
    _ProcMgr_map for SlaveVirt:
        dstAddr       [0xbffff000]
        sgList.paddr  [0xbffff000]
        sgList.offset [0x0]
        sgList.size [0x4]
    DM8168VIDEOM3PROC_map: adding dynamic entry: [10] sva=0xbffff000, mpa=0xbffff000, size=0x4
    DLOAD: write_arguments_to_args_section: c_args=ffffffff
    DLOAD: WARNING - .args section not properly aligned
    DLOAD: ERROR : Couldn't write to .args section
    *** ElfLoader_load: Failed to write args! (ensure .args section is big enough)
            Error [0x0] at Line no: 1967 in file /home/johny/CodeHG/openembed/touchlink.1-tmp/work/touchlink-none-linux-gnueabi/ti-syslink-2_21_01_05-r29j/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/syslink/procMgr/common/loaders/Elf/ElfLoader.c
    ElfLoader_getSymbolAddress: symName [_Ipc_ResetVector]
    DM8168VIDEOM3PROC_translate: translated [6] srcAddr=0x9e256b80 to dstAddr=0x9e256b80
    ProcMgr_translateAddr: srcAddr [0x9e256b80] dstAddr [0xd9160b80]
    DM8168VIDEOM3PROC_translate: translated [6] srcAddr=0x9e256b9c to dstAddr=0x9e256b9c
    ProcMgr_translateAddr: srcAddr [0x9e256b9c] dstAddr [0xd9160b9c]
    handle->slaveSRCfg[0].entryBase 9f700000
    DM8168VIDEO3PROC_translate: (default) srcAddr=0x9f700000 to dstAddr=0x9f700000
    Platform_loadCallback:
        No SharedRegion.entry[0].cacheEnable configuration value found, using default FALSE
    Platform_loadCallback:
        Mapping SharedRegion 0
        addr[ProcMgr_AddrType_MasterPhys] [0x9f700000]
        addr[ProcMgr_AddrType_SlaveVirt]  [0x9f700000]
        size                              [0x200000]
        isCached                          [0]
    MemoryOS_map: pa=0x9f700000, va=0xd9c00000, sz=0x200000
    _ProcMgr_map for SlaveVirt:
        dstAddr       [0x9f700000]
        sgList.paddr  [0x9f700000]
        sgList.offset [0x0]
        sgList.size [0x200000]
    DM8168VIDEOM3PROC_map: adding dynamic entry: [11] sva=0x9f700000, mpa=0x9f700000, size=0x200000
    DM8168VIDEOM3PROC_translate: translated [6] srcAddr=0x9e256b80 to dstAddr=0x9e256b80
    ProcMgr_translateAddr: srcAddr [0x9e256b80] dstAddr [0xd9160b80]
        DM8168VIDEOM3PROC_start: Configuring boot register
            Reset vector [0x9e3c2481]!
        DM8168VIDEOM3PROC_start: Slave successfully started!
    Ipc_attach: Ipc_procSyncStart failed!
        [previous message repeated 64 times in total]
    OsalDrv_mmap(): setting cache disabled for physical address 9f700000
    OsalDrv_mmap(): setting cache disabled for physical address 9a100000
    MemoryOS_map: pa=0x9a100000, va=0xda000000, sz=0x100000
    OsalDrv_mmap(): setting cache disabled for physical address b3d00000
    MemoryOS_map: pa=0xb3d00000, va=0xdb000000, sz=0xbc00000
    _NotifyDrv: Termination packet
    FIRMWARE: 1 start Successful
    Loading HDVPSS Firmware
    FIRMWARE: Memory map bin file not passed
    Usage : firmware_loader <Processor Id> <Location of Firmware> <start|stop> [-mmap <memory_map_file>] [-i2c <0|1>]
    ===Mandatory arguments=== 
    <Processor Id>         0: DSP, 1: Video-M3, 2: Vpss-M3 
    <Location of Firmware> firmware binary file 
    <start|stop>           to start/stop the firmware
    ===Optional arguments=== 
    -mmap            input memory map bin file name 
    -i2c             0: i2c init not done by M3, 1(default): i2c init done by M3 
    FIRMWARE: isI2cInitRequiredOnM3: 0
    FIRMWARE: Default memory configuration is used
    Firmware Loader debugging not configured
    Default FL_DEBUG: warning
    Allowed FL_DEBUG levels: error, warning, info, debug, log
    MemCfg: DCMM (Dynamically Configurable Memory Map) Version :  2.1.2.1
    NameServer Module already initialized!
    SharedRegion Module already initialized!
    OsalDrv_mmap(): setting cache disabled for physical address 9f700000
        NameServer_getLocal name [uiaStarted]
        NameServer_getLocal: Entry not found!
    OsalDrv_mmap(): setting cache disabled for physical address 9a100000
    OsalDrv_mmap(): setting cache disabled for physical address b3d00000
    GateMP Module already initialized!
    MessageQ Module already initialized!
    HeapBufMP Module already initialized!
    HeapMemMP Module already initialized!
    ListMP Module already initialized!
    ClientNotifyMgr Module already initialized!
    FrameQBufMgr Module already initialized!
    FrameQ Module already initialized!
        ProcMgr_getProcInfo: bootMode: [0]
    MemoryOS_map: entry already exists
        mapInfo->src  [0x48180000]
        mapInfo->dst  [0xfa180000]
        mapInfo->size [0x2fff]
    MemoryOS_map: entry already exists
        mapInfo->src  [0x55080000]
        mapInfo->dst  [0xf9080000]
        mapInfo->size [0xfff]
    MemoryOS_map: entry already exists
        mapInfo->src  [0x55020000]
        mapInfo->dst  [0xf9020000]
        mapInfo->size [0x8]
    DM8168VPSSM3PROC_attach: Mapping memory regions
    MemoryOS_map: entry already exists
        mapInfo->src  [0x55020004]
        mapInfo->dst  [0xf9020004]
        mapInfo->size [0x4]
    MemoryOS_map: entry already exists
        mapInfo->src  [0x48180000]
        mapInfo->dst  [0xfa180000]
        mapInfo->size [0x2fff]
    DM8168VPSSM3PROC_attach: slave is now in reset
    MemoryOS_map: entry already exists
        mapInfo->src  [0x55020000]
        mapInfo->dst  [0xf9020000]
        mapInfo->size [0x4000]
    MemoryOS_map: entry already exists
        mapInfo->src  [0x55024000]
        mapInfo->dst  [0xf9024000]
        mapInfo->size [0xc000]
    MemoryOS_map: entry already exists
        mapInfo->src  [0x40300000]
        mapInfo->dst  [0xd9080000]
        mapInfo->size [0x40000]
    _ProcMgr_map for SlaveVirt:
        dstAddr       [0x300000]
        sgList.paddr  [0x40300000]
        sgList.offset [0x0]
        sgList.size [0x40000]
    DM8168VPSSM3PROC_map: found static entry: [2] sva=0x300000, mpa=0x40300000 size=0x40000
    MemoryOS_map: entry already exists
        mapInfo->src  [0x40400000]
        mapInfo->dst  [0xd9100000]
        mapInfo->size [0x40000]
    _ProcMgr_map for SlaveVirt:
        dstAddr       [0x400000]
        sgList.paddr  [0x40400000]
        sgList.offset [0x0]
        sgList.size [0x40000]
    DM8168VPSSM3PROC_map: found static entry: [3] sva=0x400000, mpa=0x40400000 size=0x40000
        NameServer_getLocal name [uiaStarted]
        NameServer_getLocal: Entry not found!
        ProcMgr_getProcInfo: bootMode: [0]
    OsalDrv_mmap(): setting cache disabled for physical address 55020000
    OsalDrv_mmap(): setting cache disabled for physical address 55024000
    OsalDrv_mmap(): setting cache disabled for physical address 40300000
    OsalDrv_mmap(): setting cache disabled for physical address 40400000
    DLOAD: ELF: ELF
    DLOAD: ELF file header entry point: 9f668c3d
    target_address=0x00000000
    memsz_in_bytes=0x3c
    objsz_in_bytes=0x3c
    DM8168VPSSM3PROC_translate: translated [0] srcAddr=0x0 to dstAddr=0x55020000
    translated 0x00000000 (sva) to 0x55020000 (mpa)
    MemoryOS_map: entry already exists
        mapInfo->src  [0x55020000]
        mapInfo->dst  [0xf9020000]
        mapInfo->size [0x3c]
    _ProcMgr_map for SlaveVirt:
        dstAddr       [0x0]
        sgList.paddr  [0x55020000]
        sgList.offset [0x0]
        sgList.size [0x3c]
    DM8168VPSSM3PROC_map: found static entry: [0] sva=0x0, mpa=0x55020000 size=0x4000
        NameServer_getLocal name [uiaStarted]
        NameServer_getLocal: Entry not found!
    target_address=0x000007f0
    memsz_in_bytes=0x10
    objsz_in_bytes=0x10
    DM8168VPSSM3PROC_translate: translated [0] srcAddr=0x7f0 to dstAddr=0x550207f0
    translated 0x000007f0 (sva) to 0x550207f0 (mpa)
    MemoryOS_map: entry already exists
        mapInfo->src  [0x550207f0]
        mapInfo->dst  [0xf90207f0]
        mapInfo->size [0x10]
    _ProcMgr_map for SlaveVirt:
        dstAddr       [0x0]
        sgList.paddr  [0x55020000]
        sgList.offset [0x7f0]
        sgList.size [0x800]
    DM8168VPSSM3PROC_map: found static entry: [0] sva=0x0, mpa=0x55020000 size=0x4000
    target_address=0x00000800
    memsz_in_bytes=0x140
    objsz_in_bytes=0x140
    DM8168VPSSM3PROC_translate: translated [0] srcAddr=0x800 to dstAddr=0x55020800
    translated 0x00000800 (sva) to 0x55020800 (mpa)
    MemoryOS_map: pa=0x55020800, va=0xf9020800, sz=0x140
    _ProcMgr_map for SlaveVirt:
        dstAddr       [0x0]
        sgList.paddr  [0x55020000]
        sgList.offset [0x800]
        sgList.size [0x940]
    DM8168VPSSM3PROC_map: found static entry: [0] sva=0x0, mpa=0x55020000 size=0x4000
    target_address=0x9e700000
    memsz_in_bytes=0x5dcf34
    objsz_in_bytes=0x5dcf34
    DM8168VPSSM3PROC_translate: (default) srcAddr=0x9e700000 to dstAddr=0x9e700000
    translated 0x9e700000 (sva) to 0x9e700000 (mpa)
    MemoryOS_map: pa=0x9e700000, va=0xda800000, sz=0x5dcf34
    _ProcMgr_map for SlaveVirt:
        dstAddr       [0x9e700000]
        sgList.paddr  [0x9e700000]
        sgList.offset [0x0]
        sgList.size [0x5dcf34]
    DM8168VPSSM3PROC_map: adding dynamic entry: [4] sva=0x9e700000, mpa=0x9e700000, size=0x5dcf34
    target_address=0x9ecdcf38
    memsz_in_bytes=0x30c8
    objsz_in_bytes=0x30c8
    DM8168VPSSM3PROC_translate: (default) srcAddr=0x9ecdcf38 to dstAddr=0x9ecdcf38
    translated 0x9ecdcf38 (sva) to 0x9ecdcf38 (mpa)
    MemoryOS_map: pa=0x9ecdcf38, va=0xd9728f38, sz=0x30c8
    _ProcMgr_map for SlaveVirt:
        dstAddr       [0x9ecdc000]
        sgList.paddr  [0x9ecdc000]
        sgList.offset [0xf38]
        sgList.size [0x4000]
    DM8168VPSSM3PROC_map: adding dynamic entry: [5] sva=0x9ecdc000, mpa=0x9ecdc000, size=0x4000
        NameServer_getLocal name [uiaStarted]
        NameServer_getLocal: Entry not found!
    target_address=0x9ece0000
    memsz_in_bytes=0x7325f8
    objsz_in_bytes=0x7325f8
    DM8168VPSSM3PROC_translate: (default) srcAddr=0x9ece0000 to dstAddr=0x9ece0000
    translated 0x9ece0000 (sva) to 0x9ece0000 (mpa)
    MemoryOS_map: pa=0x9ece0000, va=0xe7000000, sz=0x7325f8
    _ProcMgr_map for SlaveVirt:
        dstAddr       [0x9ece0000]
        sgList.paddr  [0x9ece0000]
        sgList.offset [0x0]
        sgList.size [0x7325f8]
    DM8168VPSSM3PROC_map: adding dynamic entry: [6] sva=0x9ece0000, mpa=0x9ece0000, size=0x7325f8
    target_address=0x9f4125f8
    memsz_in_bytes=0x8
    objsz_in_bytes=0x8
    DM8168VPSSM3PROC_translate: (default) srcAddr=0x9f4125f8 to dstAddr=0x9f4125f8
    translated 0x9f4125f8 (sva) to 0x9f4125f8 (mpa)
    MemoryOS_map: pa=0x9f4125f8, va=0xd97345f8, sz=0x8
    _ProcMgr_map for SlaveVirt:
        dstAddr       [0x9f412000]
        sgList.paddr  [0x9f412000]
        sgList.offset [0x5f8]
        sgList.size [0x600]
    DM8168VPSSM3PROC_map: adding dynamic entry: [7] sva=0x9f412000, mpa=0x9f412000, size=0x600
    target_address=0x9f412600
    memsz_in_bytes=0xed000
    objsz_in_bytes=0xed000
    DM8168VPSSM3PROC_translate: (default) srcAddr=0x9f412600 to dstAddr=0x9f412600
    translated 0x9f412600 (sva) to 0x9f412600 (mpa)
    MemoryOS_map: pa=0x9f412600, va=0xd9b00600, sz=0xed000
    _ProcMgr_map for SlaveVirt:
        dstAddr       [0x9f412000]
        sgList.paddr  [0x9f412000]
        sgList.offset [0x600]
        sgList.size [0xed600]
    DM8168VPSSM3PROC_map: adding dynamic entry: [8] sva=0x9f412000, mpa=0x9f412000, size=0xed600
    target_address=0x9f4ff600
    memsz_in_bytes=0x13e15
    objsz_in_bytes=0x13e15
    DM8168VPSSM3PROC_translate: (default) srcAddr=0x9f4ff600 to dstAddr=0x9f4ff600
    translated 0x9f4ff600 (sva) to 0x9f4ff600 (mpa)
    MemoryOS_map: pa=0x9f4ff600, va=0xd9740600, sz=0x13e15
    _ProcMgr_map for SlaveVirt:
        dstAddr       [0x9f4ff000]
        sgList.paddr  [0x9f4ff000]
        sgList.offset [0x600]
        sgList.size [0x14415]
    DM8168VPSSM3PROC_map: adding dynamic entry: [9] sva=0x9f4ff000, mpa=0x9f4ff000, size=0x14415
        NameServer_getLocal name [uiaStarted]
        NameServer_getLocal: Entry not found!
    target_address=0x9f513420
    memsz_in_bytes=0x17000
    objsz_in_bytes=0x17000
    DM8168VPSSM3PROC_translate: (default) srcAddr=0x9f513420 to dstAddr=0x9f513420
    translated 0x9f513420 (sva) to 0x9f513420 (mpa)
    MemoryOS_map: pa=0x9f513420, va=0xd9760420, sz=0x17000
    _ProcMgr_map for SlaveVirt:
        dstAddr       [0x9f513000]
        sgList.paddr  [0x9f513000]
        sgList.offset [0x420]
        sgList.size [0x17420]
    DM8168VPSSM3PROC_map: adding dynamic entry: [10] sva=0x9f513000, mpa=0x9f513000, size=0x17420
    target_address=0x9f52a480
    memsz_in_bytes=0x34
    objsz_in_bytes=0x34
    DM8168VPSSM3PROC_translate: (default) srcAddr=0x9f52a480 to dstAddr=0x9f52a480
    translated 0x9f52a480 (sva) to 0x9f52a480 (mpa)
    MemoryOS_map: pa=0x9f52a480, va=0xd975a480, sz=0x34
    _ProcMgr_map for SlaveVirt:
        dstAddr       [0x9f52a000]
        sgList.paddr  [0x9f52a000]
        sgList.offset [0x480]
        sgList.size [0x4b4]
    DM8168VPSSM3PROC_map: adding dynamic entry: [11] sva=0x9f52a000, mpa=0x9f52a000, size=0x4b4
    target_address=0x9f590000
    memsz_in_bytes=0xde91a
    objsz_in_bytes=0xde91a
    DM8168VPSSM3PROC_translate: (default) srcAddr=0x9f590000 to dstAddr=0x9f590000
    translated 0x9f590000 (sva) to 0x9f590000 (mpa)
    MemoryOS_map: pa=0x9f590000, va=0xd9f00000, sz=0xde91a
    _ProcMgr_map for SlaveVirt:
        dstAddr       [0x9f590000]
        sgList.paddr  [0x9f590000]
        sgList.offset [0x0]
        sgList.size [0xde91a]
    DM8168VPSSM3PROC_map: adding dynamic entry: [12] sva=0x9f590000, mpa=0x9f590000, size=0xde91a
        NameServer_getLocal name [uiaStarted]
        NameServer_getLocal: Entry not found!
    target_address=0xbfd00000
    memsz_in_bytes=0x1ff060
    objsz_in_bytes=0x1ff060
    DM8168VPSSM3PROC_translate: (default) srcAddr=0xbfd00000 to dstAddr=0xbfd00000
    translated 0xbfd00000 (sva) to 0xbfd00000 (mpa)
    MemoryOS_map: pa=0xbfd00000, va=0xda400000, sz=0x1ff060
    _ProcMgr_map for SlaveVirt:
        dstAddr       [0xbfd00000]
        sgList.paddr  [0xbfd00000]
        sgList.offset [0x0]
        sgList.size [0x1ff060]
    DM8168VPSSM3PROC_map: adding dynamic entry: [13] sva=0xbfd00000, mpa=0xbfd00000, size=0x1ff060
    target_address=0xbffff800
    memsz_in_bytes=0x4
    objsz_in_bytes=0x4
    DM8168VPSSM3PROC_translate: (default) srcAddr=0xbffff800 to dstAddr=0xbffff800
    translated 0xbffff800 (sva) to 0xbffff800 (mpa)
    MemoryOS_map: pa=0xbffff800, va=0xd9780800, sz=0x4
    _ProcMgr_map for SlaveVirt:
        dstAddr       [0xbffff000]
        sgList.paddr  [0xbffff000]
        sgList.offset [0x800]
        sgList.size [0x804]
    DM8168VPSSM3PROC_map: adding dynamic entry: [14] sva=0xbffff000, mpa=0xbffff000, size=0x804
    DLOAD: write_arguments_to_args_section: c_args=ffffffff
    DLOAD: WARNING - .args section not properly aligned
    DLOAD: ERROR : Couldn't write to .args section
    *** ElfLoader_load: Failed to write args! (ensure .args section is big enough)
            Error [0x0] at Line no: 1967 in file /home/johny/CodeHG/openembed/touchlink.1-tmp/work/touchlink-none-linux-gnueabi/ti-syslink-2_21_01_05-r29j/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/syslink/procMgr/common/loaders/Elf/ElfLoader.c
    ElfLoader_getSymbolAddress: symName [_Ipc_ResetVector]
    DM8168VPSSM3PROC_translate: translated [11] srcAddr=0x9f52a480 to dstAddr=0x9f52a480
    ProcMgr_translateAddr: srcAddr [0x9f52a480] dstAddr [0xd975a480]
    DM8168VPSSM3PROC_translate: translated [11] srcAddr=0x9f52a49c to dstAddr=0x9f52a49c
    ProcMgr_translateAddr: srcAddr [0x9f52a49c] dstAddr [0xd975a49c]
    handle->slaveSRCfg[0].entryBase 9f700000
    DM8168VPSSM3PROC_translate: (default) srcAddr=0x9f700000 to dstAddr=0x9f700000
    Platform_loadCallback:
        No SharedRegion.entry[0].cacheEnable configuration value found, using default FALSE
    Platform_loadCallback:
        Mapping SharedRegion 0
        addr[ProcMgr_AddrType_MasterPhys] [0x9f700000]
        addr[ProcMgr_AddrType_SlaveVirt]  [0x9f700000]
        size                              [0x200000]
        isCached                          [0]
    MemoryOS_map: entry already exists
        mapInfo->src  [0x9f700000]
        mapInfo->dst  [0xd9c00000]
        mapInfo->size [0x200000]
    _ProcMgr_map for SlaveVirt:
        dstAddr       [0x9f700000]
        sgList.paddr  [0x9f700000]
        sgList.offset [0x0]
        sgList.size [0x200000]
    DM8168VPSSM3PROC_map: adding dynamic entry: [15] sva=0x9f700000, mpa=0x9f700000, size=0x200000
    DM8168VPSSM3PROC_translate: translated [11] srcAddr=0x9f52a480 to dstAddr=0x9f52a480
    ProcMgr_translateAddr: srcAddr [0x9f52a480] dstAddr [0xd975a480]
        DM8168VPSSM3PROC_start: Configuring boot register
            Reset vector [0x9f668c3d]!
        DM8168VPSSM3PROC_start: Slave successfully started!
    Ipc_attach: Ipc_procSyncStart failed!
    [previous message repeated 93 more times]
    MemoryOS_map: entry already exists
        mapInfo->src  [0x9a100000]
        mapInfo->dst  [0xda000000]
        mapInfo->size [0x100000]
    MemoryOS_map: entry already exists
        mapInfo->src  [0xb3d00000]
        mapInfo->dst  [0xdb000000]
        mapInfo->size [0xbc00000]
    _NotifyDrv: Termination packet
    FIRMWARE: 2 start Successful

    ....

    MessageQ_get
    DM8168DSPPROC_translate: translated [2] srcAddr=0x10f000c8 to dstAddr=0x40f000c8
    ProcMgr_translateAddr: srcAddr [0x40f000c8] dstAddr [0xda3600c8]
    DM8168DSPPROC_translate: translated [2] srcAddr=0x10f000c8 to dstAddr=0x40f000c8
    ProcMgr_translateAddr: srcAddr [0x40f000c8] dstAddr [0xda3600c8]
    DM8168DSPPROC_translate: translated [2] srcAddr=0x10f00090 to dstAddr=0x40f00090
    ProcMgr_translateAddr: srcAddr [0x40f00090] dstAddr [0xda360090]
    DM8168DSPPROC_translate: translated [2] srcAddr=0x10f0110c to dstAddr=0x40f0110c
    ProcMgr_translateAddr: srcAddr [0x40f0110c] dstAddr [0xda36110c]
    MessageQ_get
    MessageQ_get    
    MessageQ_get
    Assertion at Line no: 380 in /home/johny/CodeHG/openembed/touchlink.1-tmp/work/touchlink-none-linux-gnueabi/ti-syslink-2_21_01_05-r29j/syslink_2_21_01_0
    5/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/syslink/utils/hlos/knl/ResTrack.c: (elem == NULL) : failed
    Assertion at Line no: 380 in /home/johny/CodeHG/openembed/touchlink.1-tmp/work/touchlink-none-linux-gnueabi/ti-syslink-2_21_01_05-r29j/syslink_2_21_01_0
    5/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/syslink/utils/hlos/knl/ResTrack.c: (elem == NULL) : failed
    DM8168DSPPROC_translate: translated [2] srcAddr=0x10f000c8 to dstAddr=0x40f000c8
    ProcMgr_translateAddr: srcAddr [0x40f000c8] dstAddr [0xda3600c8]
  • we also see this, prior to the ResTrack.c error:

    *** ProcMgr_read: target address is not mapped
    Error [0xfffffff3] at Line no: 1711 in file /home/johny/CodeHG/openembed/touchlink.1-tmp/work/touchlink-none-linux-gnueabi/ti-syslink-2_21_01_05
    -r29j/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/syslink/procMgr/common/ProcMgr.c
    *** ProcMgrDrv_ioctl: Kernel-side ProcMgr_read failed
    Error [0xfffffff3] at Line no: 1338 in file /home/johny/CodeHG/openembed/touchlink.1-tmp/work/touchlink-none-linux-gnueabi/ti-syslink-2_21_01_05
    -r29j/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/syslink/procMgr/hlos/knl/Linux/ProcMgrDrv.c

    Johny.

  • More info: we get a segfault if we stop the app that shows the video stream without first exiting the video screen.

    *** MessageQ_registerHeap: Specified heap is already registered.!
    Error [0xfffffffc] at Line no: 1749 in file /home/johny/CodeHG/openembed/touchlink.1-tmp/work/touchlink-none-linux-gnueabi/ti-syslink-2_21_01_05
    -r29j/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/syslink/ipc/hlos/knl/MessageQ.c
    *** MessageQ_registerHeap: Specified heap is already registered.!
    Error [0xfffffffc] at Line no: 1749 in file /home/johny/CodeHG/openembed/touchlink.1-tmp/work/touchlink-none-linux-gnueabi/ti-syslink-2_21_01_05
    -r29j/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/syslink/ipc/hlos/knl/MessageQ.c
    *** MessageQ_registerHeap: Specified heap is already registered.!
    Error [0xfffffffc] at Line no: 1749 in file /home/johny/CodeHG/openembed/touchlink.1-tmp/work/touchlink-none-linux-gnueabi/ti-syslink-2_21_01_05
    -r29j/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/syslink/ipc/hlos/knl/MessageQ.c
    *** NameServer_add: duplicate entry found!
    Error [0xfffffffe] at Line no: 1055 in file /home/johny/CodeHG/openembed/touchlink.1-tmp/work/touchlink-none-linux-gnueabi/ti-syslink-2_21_01_05
    -r29j/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/syslink/utils/hlos/knl/NameServer.c
    *** MessageQ_create: Failed in NameServer_addUInt32
    Error [0xffffffff] at Line no: 769 in file /home/johny/CodeHG/openembed/touchlink.1-tmp/work/touchlink-none-linux-gnueabi/ti-syslink-2_21_01_05-
    r29j/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/syslink/ipc/hlos/knl/MessageQ.c
    *** MessageQDrv_ioctl: MessageQ_create failed
    Error [0xffffffff] at Line no: 508 in file /home/johny/CodeHG/openembed/touchlink.1-tmp/work/touchlink-none-linux-gnueabi/ti-syslink-2_21_01_05-
    r29j/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/syslink/ipc/hlos/knl/Linux/MessageQDrv.c
    *** NameServer_add: duplicate entry found!
    Error [0xfffffffe] at Line no: 1055 in file /home/johny/CodeHG/openembed/touchlink.1-tmp/work/touchlink-none-linux-gnueabi/ti-syslink-2_21_01_05
    -r29j/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/syslink/utils/hlos/knl/NameServer.c
    *** MessageQ_create: Failed in NameServer_addUInt32
    Error [0xffffffff] at Line no: 769 in file /home/johny/CodeHG/openembed/touchlink.1-tmp/work/touchlink-none-linux-gnueabi/ti-syslink-2_21_01_05-
    r29j/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/syslink/ipc/hlos/knl/MessageQ.c
    *** MessageQDrv_ioctl: MessageQ_create failed
    Error [0xffffffff] at Line no: 508 in file /home/johny/CodeHG/openembed/touchlink.1-tmp/work/touchlink-none-linux-gnueabi/ti-syslink-2_21_01_05-
    r29j/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/syslink/ipc/hlos/knl/Linux/MessageQDrv.c
    *** NameServer_add: duplicate entry found!
    Error [0xfffffffe] at Line no: 1055 in file /home/johny/CodeHG/openembed/touchlink.1-tmp/work/touchlink-none-linux-gnueabi/ti-syslink-2_21_01_05
    -r29j/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/syslink/utils/hlos/knl/NameServer.c
    *** MessageQ_create: Failed in NameServer_addUInt32
    Error [0xffffffff] at Line no: 769 in file /home/johny/CodeHG/openembed/touchlink.1-tmp/work/touchlink-none-linux-gnueabi/ti-syslink-2_21_01_05-
    r29j/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/syslink/ipc/hlos/knl/MessageQ.c
    *** MessageQDrv_ioctl: MessageQ_create failed
    Error [0xffffffff] at Line no: 508 in file /home/johny/CodeHG/openembed/touchlink.1-tmp/work/touchlink-none-linux-gnueabi/ti-syslink-2_21_01_05-
    r29j/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/syslink/ipc/hlos/knl/Linux/MessageQDrv.c
    Assertion at Line no: 1239 in /h*** NameServer_add: duplicate entry found!
    Error [0xfffffffe] at Line no: 1055 in file /home/johny/CodeHG/openembed/touchlink.1-tmp/work/touchlink-none-linux-gnueabi/ti-syslink-2_21_01_05
    -r29j/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/syslink/utils/hlos/knl/NameServer.c
    ome/builder/auto*** MessageQ_create: Failed in NameServer_addUInt32
    Error [0xffffffff] at Line no: 769 in file /home/johny/CodeHG/openembed/touchlink.1-tmp/work/touchlink-none-linux-gnueabi/ti-syslink-2_21_01_05-
    r29j/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/syslink/ipc/hlos/knl/MessageQ.c
    builds/default/o*** MessageQDrv_ioctl: MessageQ_create failed
    Error [0xffffffff] at Line no: 508 in file /home/johny/CodeHG/openembed/touchlink.1-tmp/work/touchlink-none-linux-gnueabi/ti-syslink-2_21_01_05-
    r29j/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/syslink/ipc/hlos/knl/Linux/MessageQDrv.c
    penembed/touchlink.1-tmp/work/touchlink-none-linux-gnueabi/ti-syslink-2_21_01_05-r29j/syslink_2_21_01_05/packages/ti/syslink/ipc/hlos/usr/MessageQ.c: (h
    andle != NULL) : failed
    Assertion at Line no: 695 in /home/builder/autobuilds/default/openembed/touchlink.1-tmp/work/touchlink-none-linux-gnueabi/ti-syslink-2_21_01_05-r29j/sys
    link_2_21_01_05/packages/ti/syslink/ipc/hlos/usr/MessageQ.c: (queueId != MessageQ_INVALIDMESSAGEQ) : failed
    ServiceMgr_prime: MessageQ_put failed: status = 0xfffffffe
    Assertion at Line no: 1239 in /home/builder/autobuilds/default/openembed/touchlink.1-tmp/work/touchlink-none-linux-gnueabi/ti-syslink-2_21_01_05-r29j/sy
    slink_2_21_01_05/packages/ti/syslink/ipc/hlos/usr/MessageQ.c: (handle != NULL) : failed
    Assertion at Line no: 695 in /home/builder/autobuilds/default/openembed/touchlink.1-tmp/work/touchlink-none-linux-gnueabi/ti-syslink-2_21_01_05-r29j/sys
    link_2_21_01_05/packages/ti/syslink/ipc/hlos/usr/MessageQ.c: (queueId != MessageQ_INVALIDMESSAGEQ) : failed
    ServiceMgr_prime: MessageQ_put failed: status = 0xfffffffe
    Assertion at Line no: 1239 in /home/builder/autobuilds/default/openembed/touchlink.1-tmp/work/touchlink-none-linux-gnueabi/ti-syslink-2_21_01_05-r29j/sy
    slink_2_21_01_05/packages/ti/syslink/ipc/hlos/usr/MessageQ.c: (handle != NULL) : failed
    Assertion at Line no: 695 in /home/builder/autobuilds/default/openembed/touchlink.1-tmp/work/touchlink-none-linux-gnueabi/ti-syslink-2_21_01_05-r29j/sys
    link_2_21_01_05/packages/ti/syslink/ipc/hlos/usr/MessageQ.c: (queueId != MessageQ_INVALIDMESSAGEQ) : failed
    ServiceMgr_prime: MessageQ_put failed: status = 0xfffffffe
    Assertion at Line no: 1239 in /home/builder/autobuilds/default/openembed/touchlink.1-tmp/work/touchlink-none-linux-gnueabi/ti-syslink-2_21_01_05-r29j/sy
    slink_2_21_01_05/packages/ti/syslink/ipc/hlos/usr/MessageQ.c: (handle != NULL) : failed
    Assertion at Line no: 695 in /home/builder/autobuilds/default/openembed/touchlink.1-tmp/work/touchlink-none-linux-gnueabi/ti-syslink-2_21_01_05-r29j/sys
    link_2_21_01_05/packages/ti/syslink/ipc/hlos/usr/MessageQ.c: (queueId != MessageQ_INVALIDMESSAGEQ) : failed
    ServiceMgr_prime: MessageQ_put failed: status = 0xfffffffe
    Assertion at Line no: 1239 in /home/builder/autobuilds/default/openembed/touchlink.1-tmp/work/touchlink-none-linux-gnueabi/ti-syslink-2_21_01_05-r29j/sy
    slink_2_21_01_05/packages/ti/syslink/ipc/hlos/usr/MessageQ.c: (handle != NULL) : failed
    Assertion at Line no: 695 in /home/builder/autobuilds/default/openembed/touchlink.1-tmp/work/touchlink-none-linux-gnueabi/ti-syslink-2_21_01_05-r29j/sys
    link_2_21_01_05/packages/ti/syslink/ipc/hlos/usr/MessageQ.c: (queueId != MessageQ_INVALIDMESSAGEQ) : failed
    ServiceMgr_prime: MessageQ_put failed: status = 0xfffffffe
    Assertion at Line no: 1239 in /home/builder/autobuilds/default/openembed/touchlink.1-tmp/work/touchlink-none-linux-gnueabi/ti-syslink-2_21_01_05-r29j/sy
    slink_2_21_01_05/packages/ti/syslink/ipc/hlos/usr/MessageQ.c: (handle != NULL) : failed
    ServiceMgr_rxThreadFxn: bind() failed with errno = 98
    Assertion at Line no: 761 in /home/builder/autobuilds/default/openembed/touchlink.1-tmp/work/touchlink-none-linux-gnueabi/ti-syslink-2_21_01_05-r29j/sys
    link_2_21_01_05/packages/ti/syslink/ipc/hlos/usr/MessageQ.c: (handle != NULL) : failed
    Segmentation fault

  • Hi Johny:

       Per:

    http://e2e.ti.com/support/dsp/davinci_digital_media_processors/f/716/p/254766/891438.aspx#891438

       Try removing notifyk.vpssm3_sva from bootargs.

       If this works, it may be that the Linux kernel notify is not playing nice with the SysLink notify module in terms of resource cleanup.

       Looking at the ResTrack code, the assert occurs on Linux side SysLink driver under a few conditions:

       1) if there is a failure in a SysLink setup routine (but this is unlikely since I didn't see any such failures in the trace, and resources would probably not be allocated *before* module setup); and

        2) if processes are closing the driver without releasing resources.  (Although, looking at the code, closing the driver *should* also release any abandoned resources).  The resources being tracked (by processes ID) in the SysLink driver appear to be objects created for NameServer,  HeapBufMp, MessageQ, and Notify.

      This related thread:

    http://e2e.ti.com/support/dsp/davinci_digital_media_processors/f/716/t/218725.aspx?pi239031349=2

      implies there might be an issue with multiple pipelines (multiple processes) allocating and freeing resources as well.

       So, if the bootargs solution doesn't work, we may need to add trace to the ResTrack module to see which processes are freeing/allocating in what order. 

    Regards,

    - Gil

  • Gil, thanks for your reply.

    I tried removing notifyk.vpssm3_sva from the bootargs, but that prevents our vpss.ko from loading (log below).  So I will try adding trace to the ResTrack module as you suggested.  Do you have any other thoughts on this issue?  We need multiple pipelines, so if there is a way to clean them up properly that would be great.

    Thanks,

    Johny.

    VPSS_FVID2: Unable to get firmware version
    VPSS_CORE : Failed to init fvid2 interface,
    vpss: probe of vpss failed with error -22
    VPSS_CORE : failed to register ti81xx-vpss driver
    FATAL: Error inserting vpss (/lib/modules/2.6.37-touchlink-07/kernel/drivers/video/ti81xx/vpss/vpss.ko): No such device
    BUG: Your driver calls ioremap() on system memory. This leads
    to architecturally unpredictable behaviour on ARMv6+, and ioremap()
    will fail in the next kernel release. Please fix your driver.
    ------------[ cut here ]------------
    WARNING: at arch/arm/mm/ioremap.c:211 __arm_ioremap_pfn_caller+0x58/0x190()
    Modules linked in: vpss(+) syslink zwave rt3352 uhf345 secio
    Backtrace:
    [<c0050bf4>] (dump_backtrace+0x0/0x110) from [<c03e3a58>] (dump_stack+0x18/0x1c)
    r7:00000000 r6:c00540cc r5:c04a54a6 r4:000000d3
    [<c03e3a40>] (dump_stack+0x0/0x1c) from [<c0075c80>] (warn_slowpath_common+0x54/0x6c)
    [<c0075c2c>] (warn_slowpath_common+0x0/0x6c) from [<c0075cbc>] (warn_slowpath_null+0x24/0x2c)
    r9:d5c1e000 r8:c053cbe0 r7:bf19a624 r6:000a0200 r5:00200000
    r4:00000000
    [<c0075c98>] (warn_slowpath_null+0x0/0x2c) from [<c00540cc>] (__arm_ioremap_pfn_caller+0x58/0x190)
    [<c0054074>] (__arm_ioremap_pfn_caller+0x0/0x190) from [<c0054288>] (__arm_ioremap_caller+0x64/0x6c)
    [<c0054224>] (__arm_ioremap_caller+0x0/0x6c) from [<c00663bc>] (omap_ioremap+0x60/0x64)
    r6:00000000 r5:00200000 r4:a0200000
    [<c006635c>] (omap_ioremap+0x0/0x64) from [<bf19a624>] (vps_sbuf_init+0x108/0x1c0 [vpss])
    r7:bf193c78 r6:00200000 r5:a0200000 r4:d585bd40
    [<bf19a51c>] (vps_sbuf_init+0x0/0x1c0 [vpss]) from [<bf1840b0>] (vps_probe+0x4c/0x174 [vpss])
    r8:c0553a18 r7:d67f9c80 r6:bf193c10 r5:c053cbe8 r4:c053cbe0
    [<bf184064>] (vps_probe+0x0/0x174 [vpss]) from [<c023ae28>] (platform_drv_probe+0x20/0x24)
    r7:d67f9c80 r6:bf193768 r5:c053cbe8 r4:c053cbe8
    [<c023ae08>] (platform_drv_probe+0x0/0x24) from [<c0239d94>] (driver_probe_device+0xd0/0x190)
    [<c0239cc4>] (driver_probe_device+0x0/0x190) from [<c0239ebc>] (__driver_attach+0x68/0x8c)
    r7:d67f9c80 r6:bf193768 r5:c053cc1c r4:c053cbe8
    [<c0239e54>] (__driver_attach+0x0/0x8c) from [<c0239540>] (bus_for_each_dev+0x50/0x84)
    r7:d67f9c80 r6:bf193768 r5:c0239e54 r4:00000000
    [<c02394f0>] (bus_for_each_dev+0x0/0x84) from [<c0239bb8>] (driver_attach+0x20/0x28)
    r6:bf193768 r5:bf193754 r4:00000000
    [<c0239b98>] (driver_attach+0x0/0x28) from [<c0238e18>] (bus_add_driver+0xb4/0x234)
    [<c0238d64>] (bus_add_driver+0x0/0x234) from [<c023a1f4>] (driver_register+0xb0/0x13c)
    [<c023a144>] (driver_register+0x0/0x13c) from [<c023b2cc>] (platform_driver_register+0x4c/0x60)
    r9:d5c1e000 r8:bf199000 r7:4006c000 r6:00016160 r5:bf193754
    r4:00000000
    [<c023b280>] (platform_driver_register+0x0/0x60) from [<c023b300>] (platform_driver_probe+0x20/0x70)
    [<c023b2e0>] (platform_driver_probe+0x0/0x70) from [<bf199030>] (vps_init+0x30/0x5c [vpss])
    r5:bf193aec r4:00000000
    [<bf199000>] (vps_init+0x0/0x5c [vpss]) from [<c004242c>] (do_one_initcall+0xd0/0x1a4)
    [<c004235c>] (do_one_initcall+0x0/0x1a4) from [<c00a1a98>] (sys_init_module+0x9c/0x1bc)
    [<c00a19fc>] (sys_init_module+0x0/0x1bc) from [<c004ce00>] (ret_fast_syscall+0x0/0x30)
    r7:00000080 r6:00000000 r5:00000000 r4:00000020
    ---[ end trace d6cae975e4e7a6c9 ]---

  • Johny:

       Before going down the route of instrumenting the code, could you please point me to the GST plugIn code that would be doing the SysLink calls?

       Maybe there is something we can spot by code inspection.

    Thanks,

    - Gil

  • Hello Johny:

        Looking more closely at the code (and a colleague verified this by example), not "deleting" a resource before "destroying" the module that manages it will cause the assert in ResTrack.c:

    Assertion at Line no: 380 in /home/builder/autobuilds/default/openembed/touchlink.1-tmp/work/touchlink-none-linux-gnueabi/ti-syslink-2_21_01_05-r26j/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/syslink/utils/hlos/knl/ResTrack.c: (elem == NULL) : failed

        (I was incorrect in assuming that closing the driver would automatically cleanup abandoned resources in this scenario).

        The most likely cause is that some process is failing to call HeapBufMP_delete(), MessageQ_delete(), Notify_unregisterEvent(), Notify_unregisterEventSingle(), or NameServer_delete() before calling SysLink_destroy() (or the respective <Module>_destroy() function).

        One simple thing we can do to narrow down which resource is not being freed would be to temporarily add the marked line in ./ti/syslink/utils/hlos/knl/ResTrack.c:

        /* make sure resource list is empty */
        do {
            elem = List_dequeue(proc->resList);
            GT_assert(curTrace, (elem == NULL));
            status = ResTrack_E_FAIL;    /* <-- add this line */
        } while (elem != NULL);
       Then rebuild SysLink.
       Note that currently, even though the assertion is thrown, the ResTrack_unregister() function still returns ResTrack_E_SUCCESS.  The added line makes it return an error instead, which will be caught later.
       This error (when it happens) will then trigger an assert in one of the HeapBufMP, MessageQ, or other IPC kernel drivers during processing of the CMD_<MODULE>_DESTROY IOCTL; for example, in ./ti/syslink/ipc/hlos/knl/Linux/MessageQDrv.c:
          case CMD_MESSAGEQ_DESTROY: {
                /* unregister process from resource tracker */
                status = ResTrack_unregister(MessageQDrv_state.resTrack, pid);
                GT_assert(curTrace, (status >= 0));

                /* finalize the module */
                status = MessageQ_destroy();
                GT_assert(curTrace, (status >= 0));
            }
            break;

        Once you get the assert, you can then add GT_trace() statements in the module where the new assert occurred to print the pid (Process ID), to see which process is the culprit.

        Let me know how that works.

    Regards,
    - Gil
  • Hi Gil,

    gstreamer uses  omx (omx-ti81xx-src_05_02_00_48) with openmax and qt-mobility.

    I will try what you suggested.

    We found out some more info on our end too.  When it fails, two more elements are pushed than are removed: in our example it pushes 23 but removes only 21, and we get the assert.  When it works, we see 21 pushes and 21 removes.

    Thanks,

    Johny.

  • Hi Gil,

    I tried what you suggested; here are the results.  I am a little confused, so hopefully you can help explain them to me.

    Normal case (no ResTrack assertion): registers the pid 4 times, pushes 21 times, removes 21 times, and no ResTrack assertion occurs.

    !!! JOHNY ResTrack_pop
    !!! JOHNY ResTrack_pop
    !!! JOHNY ResTrack_register
    !!! JOHNY Restrac_register with pid 1405
    !!! JOHNY Restrac_register proc==NULL status 0
    !!! JOHNY ResTrack_register
    !!! JOHNY Restrac_register with pid 1405
    !!! JOHNY Restrac_register proc==NULL status 0
    !!! JOHNY ResTrack_register
    !!! JOHNY Restrac_register with pid 1405
    !!! JOHNY Restrac_register proc==NULL status 0
    !!! JOHNY ResTrack_register
    !!! JOHNY Restrac_register with pid 1405
    !!! JOHNY Restrac_register proc==NULL status 0

    Normal remove/unregister, no ResTrack assertion.

    !!! JOHNY ResTrack_unregister
    !!! JOHNY proc->pid 1405
    !!! JOHNY Elem list_dequeue (null)
    !!! JOHNY ResTrack_unregister leave, status -1
    Assertion at Line no: 650 in /home/johny/CodeHG/bsp-ti81xx-hg/mods/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/sysl
    ink/ipc/hlos/knl/Linux/HeapBufMPDrv.c: (status >= 0) : failed
    !!! JOHNY ResTrack_pop
    !!! JOHNY ResTrack_unregister
    !!! JOHNY proc->pid 1405
    !!! JOHNY Elem list_dequeue (null)
    !!! JOHNY ResTrack_unregister leave, status -1
    Assertion at Line no: 679 in /home/johny/CodeHG/bsp-ti81xx-hg/mods/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/sysl
    ink/ipc/hlos/knl/Linux/MessageQDrv.c: (status >= 0) : failed
    !!! JOHNY ResTrack_unregister
    !!! JOHNY proc->pid 1405
    !!! JOHNY Elem list_dequeue (null)
    !!! JOHNY ResTrack_unregister leave, status -1
    Assertion at Line no: 1216 in /home/johny/CodeHG/bsp-ti81xx-hg/mods/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/sys
    link/ipc/hlos/knl/Linux/NotifyDrv.c: (status >= 0) : failed
    !!! JOHNY ResTrack_pop
    !!! JOHNY ResTrack_unregister
    !!! JOHNY proc->pid 1405
    !!! JOHNY Elem list_dequeue (null)
    !!! JOHNY ResTrack_unregister leave, status -1
    Assertion at Line no: 932 in /home/johny/CodeHG/bsp-ti81xx-hg/mods/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/NameServerDrv.c: (status
    >= 0) : failed
    !!! JOHNY ResTrack_pop

    RESTRACK ASSERTION case - pushes 23 times

    !!! JOHNY ResTrack_pop
    !!! JOHNY ResTrack_register
    !!! JOHNY Restrac_register with pid 1405
    !!! JOHNY Restrac_register proc==NULL status 0
    !!! JOHNY ResTrack_register
    !!! JOHNY Restrac_register with pid 1405
    !!! JOHNY Restrac_register proc==NULL status 0
    !!! JOHNY ResTrack_register
    !!! JOHNY Restrac_register with pid 1405
    !!! JOHNY Restrac_register proc==NULL status 0
    !!! JOHNY ResTrack_register
    !!! JOHNY Restrac_register with pid 1405
    !!! JOHNY Restrac_register proc==NULL status 0


    get ResTrack assertion on unregister

    !!! JOHNY ResTrack_unregister
    !!! JOHNY proc->pid 1405
    !!! JOHNY Elem list_dequeue (null)
    !!! JOHNY ResTrack_unregister leave, status -1
    Assertion at Line no: 650 in /home/johny/CodeHG/bsp-ti81xx-hg/mods/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/sysl
    ink/ipc/hlos/knl/Linux/HeapBufMPDrv.c: (status >= 0) : failed
    !!! JOHNY ResTrack_pop
    !!! JOHNY ResTrack_unregister
    !!! JOHNY proc->pid 1405
    !!! JOHNY Elem list_dequeue e7c2a000
    Assertion at Line no: 404 in /home/johny/CodeHG/bsp-ti81xx-hg/mods/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/sysl
    ink/utils/hlos/knl/ResTrack.c: (elem == NULL) : failed
    !!! JOHNY Elem list_dequeue e7c21000
    Assertion at Line no: 404 in /home/johny/CodeHG/bsp-ti81xx-hg/mods/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/sysl
    ink/utils/hlos/knl/ResTrack.c: (elem == NULL) : failed
    !!! JOHNY Elem list_dequeue (null)
    !!! JOHNY ResTrack_unregister leave, status -1
    Assertion at Line no: 679 in /home/johny/CodeHG/bsp-ti81xx-hg/mods/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/sysl
    ink/ipc/hlos/knl/Linux/MessageQDrv.c: (status >= 0) : failed
    !!! JOHNY ResTrack_unregister
    !!! JOHNY proc->pid 1405
    !!! JOHNY Elem list_dequeue (null)
    !!! JOHNY ResTrack_unregister leave, status -1
    Assertion at Line no: 1216 in /home/johny/CodeHG/bsp-ti81xx-hg/mods/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/sys
    link/ipc/hlos/knl/Linux/NotifyDrv.c: (status >= 0) : failed
    !!! JOHNY ResTrack_pop
    !!! JOHNY ResTrack_unregister
    !!! JOHNY proc->pid 1405
    !!! JOHNY Elem list_dequeue (null)
    !!! JOHNY ResTrack_unregister leave, status -1
    Assertion at Line no: 932 in /home/johny/CodeHG/bsp-ti81xx-hg/mods/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/NameServerDrv.c: (status
    >= 0) : failed
    !!! JOHNY ResTrack_pop


  • Looking at that output, it kind of makes sense why the assert happens; the question is why it pushes 23 times but removes only 21.  The assertion fires because the resource list is not empty (elem is not NULL).

    Thanks,

    Johny.

  • Hi Gil,

    So I added a few more debug prints and found that the leftover resource does not belong to any process: its pid is -622927872.

    Can we just clean this up somehow and not assert?

    thanks,

    Johny.

    !!! JOHNY ResTrack_remove, found elem da77c000, pid 1588
    !!! JOHNY ResTrack_remove, check elem dadeb000
    !!! JOHNY ResTrack_remove, check elem dade2000
    !!! JOHNY ResTrack_remove, check elem da7b2000
    !!! JOHNY ResTrack_remove, removing elem da7b2000
    !!! JOHNY ResTrack_remove, resource found, elem da7b2000
    !!! JOHNY ResTrack_remove, done with elem da7b2000, status 0
    !!! JOHNY ResTrack_remove leave, status 0, elem da7b2000
    !!! JOHNY ResTrack_unregister
    !!! JOHNY ResTrack_unregister proc->pid 1588
    !!! JOHNY ResTrack_unregister Elem list_dequeue (null)
    !!! JOHNY ResTrack_unregister leave, status 0
    !!! JOHNY ResTrack_pop
    !!! JOHNY MessageQDrv_ioctl unregister pid 1588
    !!! JOHNY ResTrack_unregister
    !!! JOHNY ResTrack_unregister proc->pid 1588
    !!! JOHNY ResTrack_unregister Elem list_dequeue dadeb000
    !!! JOHNY ResTrack_unregister, elem dadeb000 still in list with pid -622927872
    !!! JOHNY ResTrack_unregister, elem dadeb000 still in list with pid -622927872 not match pid 1588

  • Hi Johny:

        Thanks for instrumenting the code.  It is helpful.

        A few things don't make sense here:

    !!! JOHNY ResTrack_unregister leave, status -1

        This should only happen with an accompanying assert from ResTrack_unregister() if the line returning E_FAIL was placed as I suggested; but in the log it appears without an assert.

        Also, this bit of trace doesn't make sense:

    !!! JOHNY ResTrack_unregister, elem dadeb000 still in list with pid -622927872

        Can you please upload the complete trace log, and the modified source modules with the new trace statements?

        I was under the impression there were multiple processes in the gst pipeline, but the trace only shows the same PID (!!! JOHNY proc->pid 1405).   Is there only one process running under GST?

         However, this trace seems to have caught the resource that is not being freed: it appears MessageQ_destroy() (or SysLink_destroy()) is being called before two MessageQ_delete() calls are made:

    !!! JOHNY ResTrack_unregister
    !!! JOHNY proc->pid 1405
    !!! JOHNY Elem list_dequeue e7c2a000
    Assertion at Line no: 404 in /home/johny/CodeHG/bsp-ti81xx-hg/mods/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/sysl
    ink/utils/hlos/knl/ResTrack.c: (elem == NULL) : failed
    !!! JOHNY Elem list_dequeue e7c21000
    Assertion at Line no: 404 in /home/johny/CodeHG/bsp-ti81xx-hg/mods/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/sysl
    ink/utils/hlos/knl/ResTrack.c: (elem == NULL) : failed
    !!! JOHNY Elem list_dequeue (null)
    !!! JOHNY ResTrack_unregister leave, status -1
    Assertion at Line no: 679 in /home/johny/CodeHG/bsp-ti81xx-hg/mods/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/sysl
    ink/ipc/hlos/knl/Linux/MessageQDrv.c: (status >= 0) : failed

       This is happening from a component running in process ID 1405, but unfortunately it appears ALL the plugins are running in that same process, so that doesn't help identify which one.

       So, at this point, you can add some GT_trace statements in the CMD_MESSAGEQ_CREATE and CMD_MESSAGEQ_DELETE cases of the switch statement in MessageQDrv.c to print the names of the MessageQs being created and deleted, so we can see which ones are NOT getting deleted at the end.   This might point to the misbehaving GST plugin (we can then locate the source code for the plugins that create MessageQs with those names).

       One other question:  when the ResTrack assert occurs, does the program crash or hang, or is it just a benign assertion?
    Regards,
    - Gil
  • Hi Gil,

    When the assertion happens we can no longer stream video, and we have to reboot the system to recover.

    Ignore "!!! JOHNY ResTrack_unregister, elem dadeb000 still in list with pid -622927872" - my bad, I set up the wrong value there.

    I will post a clean trace with the E_FAIL change for you.

    It does not matter whether there are 1, 2, 3, or 4 streams; it always registers 4 times with the same pid.

    I commented out the assert line in ResTrack.c and the system is behaving better: it does not crash, since the assert does not happen.  Do you know if the two lines of code below in ResTrack.c (in the ResTrack_unregister function)

    /* destroy the list object */
    List_delete(&(proc->resList));

    /* free the resource process object */
    Memory_free(NULL, proc, sizeof(ResTrack_Proc));

    will clean up all the resources?   I am assuming they do, because the system no longer crashes now that I took the assert out.

    I will try your other suggestion to find out what is being created/deleted in MessageQDrv.c.

    Thanks,

    Johny.

  • Hi Gil,

    Hope these messages make sense to you - I will add more prints.  I think the assertion happens in CMD_MESSAGEQ_DESTROY.

    is there a way to enable certain TRACE? if I enable TRACEENTER or TRACECLASS the system is flooded with message and not come up so it hard to debug.

    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: untrack resource
    !!! MessageQDrv_ioctl, pid 1498, elem 0000ffff
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: untrack resource
    !!! MessageQDrv_ioctl, pid 1498, elem dae66000
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: untrack resource
    !!! MessageQDrv_ioctl, pid 1498, elem 0000ffff
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: untrack resource
    !!! MessageQDrv_ioctl, pid 1498, elem dae7b000
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: untrack resource
    !!! MessageQDrv_ioctl, pid 1498, elem 0000ffff
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: untrack resource
    !!! MessageQDrv_ioctl, pid 1498, elem dae51000
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: untrack resource
    !!! MessageQDrv_ioctl, pid 1498, elem dae48000
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: untrack resource
    !!! MessageQDrv_ioctl, pid 1498, elem 0000ffff
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: untrack resource
    !!! MessageQDrv_ioctl, pid 1498, elem daea5000
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: untrack resource
    !!! MessageQDrv_ioctl, pid 1498, elem 0000ffff
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: untrack resource
    !!! MessageQDrv_ioctl, pid 1498, elem daeba000
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: untrack resource
    !!! MessageQDrv_ioctl, pid 1498, elem 0000ffff
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: untrack resource
    !!! MessageQDrv_ioctl, pid 1498, elem dae90000
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: untrack resource
    !!! MessageQDrv_ioctl, pid 1498, elem dae87000
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: untrack resource
    !!! MessageQDrv_ioctl, pid 1498, elem 0000ffff
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: untrack resource
    !!! MessageQDrv_ioctl, pid 1498, elem daee4000
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: untrack resource
    !!! MessageQDrv_ioctl, pid 1498, elem 0000ffff
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: untrack resource
    !!! MessageQDrv_ioctl, pid 1498, elem daef9000
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: untrack resource
    !!! MessageQDrv_ioctl, pid 1498, elem 0000ffff
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: untrack resource
    !!! MessageQDrv_ioctl, pid 1498, elem daecf000
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: untrack resource
    !!! MessageQDrv_ioctl, pid 1498, elem c00c0934
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: untrack resource
    !!! MessageQDrv_ioctl, pid 1498, elem 0000ffff
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: untrack resource
    !!! MessageQDrv_ioctl, pid 1498, elem dadfd000
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: untrack resource
    !!! MessageQDrv_ioctl, pid 1498, elem dadf4000
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: untrack resource
    !!! MessageQDrv_ioctl, pid 1498, elem da7ee000
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: untrack resource
    !!! MessageQDrv_ioctl, pid 1498, elem (null)
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: untrack resource
    !!! MessageQDrv_ioctl, pid 1498, elem da79a000
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: untrack resource
    !!! MessageQDrv_ioctl, pid 1498, elem (null)
    !!! HeapBufMPDrv_ioctl, CMD_HEAPBUFMP_DESTROY: ResTrack_unregister pid 1498
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DESTROY: ResTrack_unregister pid 1498
    Assertion at Line no: 386 in /home/johny/CodeHG/bsp-ti81xx-hg/mods/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/sysli
    nk/utils/hlos/knl/ResTrack.c: (elem == NULL) : failed
    Assertion at Line no: 386 in /home/johny/CodeHG/bsp-ti81xx-hg/mods/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/sysli
    nk/utils/hlos/knl/ResTrack.c: (elem == NULL) : failed
    !!! NotifyDrv_ioctl, CMD_NOTIFY_DESTROY: ResTrack_unregister pid 1498
    !!! NameServerDrv_ioctl, CMD_NAMESERVER_DESTROY: ResTrack_unregister pid 1498

  • Hi Gil, 

    Notice the two non-NULL elems; they caused the assertion.  There are no errors before that, so maybe the question is why these two resources got added at all, since they do not seem to belong to this process and they failed to be removed from the list by this code in ResTrack_remove:

    /* invoke the supplied compare function */
    if ((*cmpFxn)((Void *)ref, (Void *)elem)) {
        List_remove(proc->resList, elem); /* found it, remove from list */
        break;
    }

    Output when the assert fires:

    CMD_MESSAGEQ_DELETE: ResTrack_remove
    !!! MessageQDrv_ioctl, pid 1500, elem 00000080
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: ResTrack_remove
    !!! MessageQDrv_ioctl, pid 1500, elem e6d64000
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: ResTrack_remove
    !!! MessageQDrv_ioctl, pid 1500, elem 0000ffff
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: ResTrack_remove
    !!! MessageQDrv_ioctl, pid 1500, elem e6d79000
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: ResTrack_remove
    !!! MessageQDrv_ioctl, pid 1500, elem 0000ffff
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: ResTrack_remove
    !!! MessageQDrv_ioctl, pid 1500, elem e6d4f000
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: ResTrack_remove
    !!! MessageQDrv_ioctl, pid 1500, elem 00000100
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: ResTrack_remove
    !!! MessageQDrv_ioctl, pid 1500, elem 0000ffff
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: ResTrack_remove
    !!! MessageQDrv_ioctl, pid 1500, elem e6ce6000
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: ResTrack_remove
    !!! MessageQDrv_ioctl, pid 1500, elem 0000ffff
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: ResTrack_remove
    !!! MessageQDrv_ioctl, pid 1500, elem e6cfb000
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: ResTrack_remove
    !!! MessageQDrv_ioctl, pid 1500, elem 0000ffff
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: ResTrack_remove
    !!! MessageQDrv_ioctl, pid 1500, elem e6cd1000
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: ResTrack_remove
    !!! MessageQDrv_ioctl, pid 1500, elem c00c0934
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: ResTrack_remove
    !!! MessageQDrv_ioctl, pid 1500, elem 0000ffff
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: ResTrack_remove
    !!! MessageQDrv_ioctl, pid 1500, elem e6d25000
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: ResTrack_remove
    !!! MessageQDrv_ioctl, pid 1500, elem 0000ffff
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: ResTrack_remove
    !!! MessageQDrv_ioctl, pid 1500, elem e6d3a000
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: ResTrack_remove
    !!! MessageQDrv_ioctl, pid 1500, elem 0000ffff
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: ResTrack_remove
    !!! MessageQDrv_ioctl, pid 1500, elem e6d10000
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: ResTrack_remove
    !!! MessageQDrv_ioctl, pid 1500, elem c00c0934
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: ResTrack_remove
    !!! MessageQDrv_ioctl, pid 1500, elem 0000ffff
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: ResTrack_remove
    !!! MessageQDrv_ioctl, pid 1500, elem e6c7d000
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: ResTrack_remove
    !!! MessageQDrv_ioctl, pid 1500, elem e6c74000
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: ResTrack_remove
    !!! MessageQDrv_ioctl, pid 1500, elem e6c4d000
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: ResTrack_remove
    !!! MessageQDrv_ioctl, pid 1500, elem (null)
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: ResTrack_remove
    !!! MessageQDrv_ioctl, pid 1500, elem daff8000
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DELETE: ResTrack_remove
    !!! MessageQDrv_ioctl, pid 1500, elem (null)

    !!! HeapBufMPDrv_ioctl, CMD_HEAPBUFMP_DESTROY: ResTrack_unregister pid 1500
    !!! ResTrack_unregister, proc->resList dafe0000, pid 1500, elem is (null)
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DESTROY: ResTrack_unregister pid 1500
    !!! ResTrack_unregister, proc->resList dafda000, pid 1500, elem is e6c68000
    Assertion at Line no: 386 in /home/johny/CodeHG/bsp-ti81xx-hg/mods/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/sysli
    nk/utils/hlos/knl/ResTrack.c: (elem == NULL) : failed
    !!! ResTrack_unregister, proc->resList dafda000, pid 1500, elem is e6c5f000
    Assertion at Line no: 386 in /home/johny/CodeHG/bsp-ti81xx-hg/mods/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/sysli
    nk/utils/hlos/knl/ResTrack.c: (elem == NULL) : failed
    !!! ResTrack_unregister, proc->resList dafda000, pid 1500, elem is (null)
    !!! MessageQDrv_ioctl, CMD_MESSAGEQ_DESTROY: MessageQ_destroy pid 1500
    !!! NotifyDrv_ioctl, CMD_NOTIFY_DESTROY: ResTrack_unregister pid 1500
    !!! ResTrack_unregister, proc->resList dafd4000, pid 1500, elem is (null)
    !!! NameServerDrv_ioctl, CMD_NAMESERVER_DESTROY: ResTrack_unregister pid 1500
    !!! ResTrack_unregister, proc->resList dafc2000, pid 1500, elem is (null)

  • Johny:

    >  I commented out the assert line in ResTrack.c and the system is behaving better.  It does not crash since the assert does not happen.

        The GT_assert() should not be the cause of the crash: looking at the code, the GT_assert() is purely informational on the SysLink HLOS side.   From packages/ti/syslink/utils/Trace.h:

    #if defined(SYSLINK_BUILD_DEBUG)
    #if defined(SYSLINK_BUILD_HLOS)
    #define GT_assert(x, y)                                                 \
    do {                                                                    \
        if (!(y)) {                                                         \
            Osal_printf ("Assertion at Line no: %d in %s: %s : failed\n",   \
                         __LINE__, __FILE__, #y);                           \
        }                                                                   \
    } while (0);
    #endif /* defined(SYSLINK_BUILD_HLOS) */


    > It does not matter if there is one stream, 2 stream, 3 tream or 4 stream, it always create 4 process with the same pid.

         Each process should have its own unique process ID.   To get some visibility into the system, can you run something like "top" or "htop" and include a screen shot of the running processes and threads before the assert fires?

       http://www.howtogeek.com/howto/ubuntu/using-htop-to-monitor-system-processes-on-linux/


    >  Do you know if the two line of code below in ResTrack.c (Restrack_unregister function) [--snip--] will clean up all the resources?

       The ResTrack module does not actually remove the resources.  It is up to the driver to do a ResTrack_pop() to get the remaining resources to free when the driver closes. 

        For MessageQ this would happen in packages/ti/syslink/ipc/hlos/knl/Linux/MessageQDrv.c:MessageQDrv_releaseResources().    But in this case, since the ResTrack_unregister() call already removed the ResTrack_Proc object for this process ID in the MessageQDrv.c:CMD_MESSAGEQ_DESTROY switch case, the resource is not later freed during the driver close method, in MessageQDrv.c:MessageQDrv_releaseResources().

        However, looking at the code for MessageQ_destroy(), it appears to clean up any abandoned message queues there.  And MessageQ_destroy() is called from SysLink_destroy().  So, if it is MessageQ objects that are not being freed by the application, it appears that MessageQ_destroy() takes care of cleaning them up for you.

    from packages/ti/syslink/ipc/hlos/knl/MessageQ.c:MessageQ_destroy():

    /* Delete any Message Queues that have not been deleted so far. */
    for (i = 0; i < MessageQ_module->numQueues; i++) {
        GT_assert (curTrace, (MessageQ_module->queues [i] == NULL));
        if (MessageQ_module->queues [i] != NULL) {
            tmpStatus = MessageQ_delete (&(MessageQ_module->queues [i]));
            GT_assert (curTrace, (tmpStatus >= 0));
            if (EXPECT_FALSE ((tmpStatus < 0) && (status >= 0))) {
                status = tmpStatus;
    #if !defined(SYSLINK_BUILD_OPTIMIZE)
                GT_setFailureReason (curTrace,
                                     GT_4CLASS,
                                     "MessageQ_destroy",
                                     status,
                                     "MessageQ_delete failed!");
    #endif /* if !defined(SYSLINK_BUILD_OPTIMIZE) */
            }
        }
    }


    > I am assuming it does because the system does not crash since I took the assert out.

       So, the code supports this observation.  Looking at MessageQ_destroy(), it appears that MessageQ cleans up after clients that don't call MessageQ_delete().   However, in this case, we should see the first GT_assert line from the code above in the log when the application shuts down.  (Do you see it?)

       It might be interesting to put a GT_trace() line in MessageQ_destroy(), and see the abandoned resources getting cleaned up.

       Also, I searched the omx-ti81xx-src_05_02_00_48/ directory in the EZSDK, and it appears that the two missing MessageQ objects may be the result of a missing call to RcmClient_delete().   An RCM client is created in ./src/ti/omx/domx/OmxRpc.c:OmxRpc_rcmClientCreate().   I know that RcmClient_create() creates two (unnamed) MessageQ objects.   Those normally get deleted when RcmClient_delete() gets called.

       So, I'm thinking that OmxRPC is not calling RcmClient_delete() before MessageQ_destroy() gets called. 

       I notice there is some kind of OMX trace in OmxRpc.   I also found this, for enabling trace in EZSDK OMX components: 

    http://processors.wiki.ti.com/index.php/EZ_SDK_FAQ#How_to_enable_debug.2Ftraces_from_Linux_side_modules_of_OMX_.28OMX_base.2C_D-OMX.2C_etc....29.3F

    http://processors.wiki.ti.com/index.php/Using_slog_in_EZSDK

      This may be another avenue to get some visibility into who is using MessageQ.

    Regards,
    - Gil
  • Hi Gil,

    I was unable to find more info on MessageQ_destroy and RcmClient_delete, but my colleague Craig found the error below in the application space, so it may be in the OMX code like you said.  We will look through the OMX code to see if we can find it; if you or someone at TI can help us look for this, that would be great.  Thanks for all your help.

    Johny.

    ** (<unknown>:21407): CRITICAL **: g_omx_port_allocate_buffers: assertion `port->buffers[i]' failed

    unrecoverable error: There were insufficient resources to perform the requested operation (0x80001000)

    ** (<unknown>:21407): CRITICAL **: g_omx_port_allocate_buffers: assertion `port->buffers[i]' failed

    unrecoverable error: There were insufficient resources to perform the requested operation (0x80001000)

    ** (<unknown>:21407): CRITICAL **: g_omx_port_allocate_buffers: assertion `port->buffers[i]' failed

    ** (<unknown>:21407): CRITICAL **: g_omx_port_allocate_buffers: assertion `port->buffers[i]' failed

    unrecoverable error: There were insufficient resources to perform the requested operation (0x80001000)

    ** (<unknown>:21407): CRITICAL **: g_omx_port_allocate_buffers: assertion `port->buffers[i]' failed

    ** (<unknown>:21407): CRITICAL **: g_omx_port_allocate_buffers: assertion `port->buffers[i]' failed

    ** (<unknown>:21407): CRITICAL **: g_omx_port_allocate_buffers: assertion `port->buffers[i]' failed

    unrecoverable error: There were insufficient resources to perform the requested operation (0x80001000)

    ** (<unknown>:21407): CRITICAL **: g_omx_port_allocate_buffers: assertion `port->buffers[i]' failed

  • Hi Gil,

    I have a system that ran for about 15 hours streaming video, and it ran out of memory.  So we are definitely not cleaning up properly, or we have big memory leaks.

    Here is the sys_top information:

    sys_top Ver : 0.2.0.0
    Number Of Running Cores: 3
    DSP    :
    Not Running or Do not integrate sys_top functionality
    MC.HDVICP2:
    Firmware Version: UNKNOWN/INTERNAL VERSION
     0 Heap:Size 1572864    Used 509392     MaxU 509392     Free 1063472    LarF 1012544    
     1 Heap:Size 37748736   Used 31842400   MaxU 31842400   Free 5906336    LarF 5905920    
    Num SR :4         
    SRIn 0 :PhyA 0x9f700000 Virt 0x0        Size 0x200000   
     SRHeap:Size 2095488    Used 18304      MaxU 18304      Free 2077184    LarF 2077184    
    SRIn 1 :PhyA 0x9a100000 Virt 0x0        Size 0x100000   
     SRHeap:Size 1048448    Used 0          MaxU 0          Free 1048448    LarF 1048448    
    SRIn 2 :PhyA 0xb3d00000 Virt 0x0        Size 0xbc00000  
     SRHeap:Size 197132160  Used 69932544   MaxU 69932544   Free 127199616  LarF 127199616  
    MC.HDVPSS:
    Firmware Version: UNKNOWN/INTERNAL VERSION
     0 Heap:Size 2097152    Used 119656     MaxU 119656     Free 1977496    LarF 1977496    
     1 Heap:Size 15728640   Used 0          MaxU 0          Free 15728640   LarF 15728640   
    Num SR :4         
    SRIn 0 :PhyA 0x9f700000 Virt 0x0        Size 0x200000   
     SRHeap:Size 2095488    Used 18304      MaxU 18304      Free 2077184    LarF 2077184    
    SRIn 1 :PhyA 0x9a100000 Virt 0x0        Size 0x100000   
     SRHeap:Size 1048448    Used 0          MaxU 0          Free 1048448    LarF 1048448    
    SRIn 2 :PhyA 0xb3d00000 Virt 0x0        Size 0xbc00000  
     SRHeap:Size 197132160  Used 69932544   MaxU 69932544   Free 127199616  LarF 127199616  
    Legend:
    LarF - Largest Free size   SRIn - Shared Region Index 
    PhyA - Physical Address    Virt - Virtual Address     
    MaxU - Maximum Used Size   SR   - Shared Region       
    MC   - Media Controller   
    Johny:

    > ** (<unknown>:21407): CRITICAL **: g_omx_port_allocate_buffers: assertion `port->buffers[i]' failed

      There are two places in ti-ezsdk_dm814x-evm_5_05_02_00/component-sources/gst-openmax_GST_DM81XX_00_07_00_00/omx/gstomx_port.c:g_omx_port_allocate_buffers() where g_return_if_fail (port->buffers[i]) will be called.   

      To find out where, I recommend you turn on GStreamer debug:

    http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gstreamer/html/gst-running.html

      The error code is OMX_ErrorInsufficientResources.

      It may be coming from the OMX_AllocateBuffer() call, which asks the OMX component to allocate a buffer.

      The GST debug may tell which component is making the call, and what buffer size is being requested.
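      As a concrete sketch of turning on that debug output (the category pattern and the pipeline below are placeholders - adjust them to the elements you actually use), GStreamer reads the GST_DEBUG environment variable, a comma-separated list of category:level entries with an optional default level:

```shell
# Raise the gst-openmax "omx" categories to DEBUG (level 5), keep every
# other category at WARNING (level 2), and capture stderr to a log file.
# "omx*" and the fakesrc pipeline are placeholders for your setup.
GST_DEBUG=2,omx*:5 GST_DEBUG_NO_COLOR=1 \
    gst-launch fakesrc num-buffers=1 ! fakesink 2> gst-debug.log
```

      Each log line is tagged with the debug category and the element that produced it, which should reveal which element is making the allocation calls.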

    Regards,
    - Gil
  • Hi Gil,

    I am going to create a separate thread for the last two posts since they may not be related to this ResTrack issue.

    We put a workaround in ResTrack.c and the problem does not seem to happen anymore; at least we no longer see the assertion.  Can you tell us whether this is the correct thing to do?  Our change is the block marked with the FIX ME comment below.

    /*
     * ======== ResTrack_unregister ========
     */
    Int ResTrack_unregister(ResTrack_Handle handle, Osal_Pid pid)
    {
        Int status;
        ResTrack_Object *obj;
        ResTrack_Proc *proc;
        List_Elem *elem;
        IGateProvider_Handle gate;
        IArg key;

        /* setup local context */
        status = ResTrack_S_SUCCESS;
        obj = (ResTrack_Object *)handle;
        gate = (IGateProvider_Handle)(obj->gate);

        /* enter the gate */
        key = IGateProvider_enter(gate);

        /* search process list for the given pid */
        elem = NULL;
        while ((elem = List_next(obj->procList, elem)) != NULL) {
            proc = (ResTrack_Proc *)elem;
            if (proc->pid == pid) {
                break; /* found it */
            }
        }

        /* leave if process object was not found */
        if (elem == NULL) {
            status = ResTrack_E_PID;
            goto leave;
        }

        /* remove proc object from process list */
        List_remove(obj->procList, elem);

        /****************
         ** FIX ME: we going to remove the elem that belong to proc->resList anyway
         ** even though it failed the comparison.
         *****************/
        /* make sure resource list is empty */
        do {
            elem = List_dequeue(proc->resList);

            if (elem != NULL) {
                List_remove(proc->resList, elem);
            }

        } while (elem != NULL);

        /* make sure resource list is empty */
        do {
            elem = List_dequeue(proc->resList);

            GT_assert(curTrace, (elem == NULL));

        } while (elem != NULL);

        /* destroy the list object */
        List_delete(&(proc->resList));

        /* free the resource process object */
        Memory_free(NULL, proc, sizeof(ResTrack_Proc));

    leave:
        /* leave the gate */
        IGateProvider_leave(gate, key);

        return(status);
    }

  • Johny:

       The largest free size available in each of those heaps is larger than 1 MB, so if you're running out of memory, it really depends on who is allocating how much from where.  See my previous post on enabling GST debug.

    Regards,
    - Gil
  • Johny:

       The assert in ResTrack_unregister() is highlighting a programming error in the client code above it.   It would be preferable to find the component which is failing to delete its MessageQs, rather than mask the assertion.

      Also, as mentioned before:  The ResTrack module does not actually remove the resources.  It is up to the driver to do a ResTrack_pop() to get the remaining resources to free when the driver closes.  The List_remove operation does not actually free the resource which the dequeued (or removed) list element holds.

       The logic in syslink_2_20_02_20/packages/ti/syslink/ipc/hlos/knl/Linux/MessageQDrv.c:MessageQDrv_releaseResources(), however, will free abandoned resources, using ResTrack_pop().  But this is only called when the process terminates (usually abnormally) without having freed its resources (i.e. without calling SysLink_destroy).

       Per my earlier post, MessageQ_destroy() should also remove the abandoned MessageQ objects, and it appears from your earlier log that MessageQ_destroy() is being called twice (?).   Again, the GT_assert in MessageQ_destroy() should fire if there are any remaining MessageQs.  Does it?


    Regards,
    - Gil


  • Hi Gil,

    The GT_assert in MessageQ_destroy did not fire.

    I will look into the other logic as you mentioned.

    Thanks,

    Johny.

  • Johny:

       If the assert didn't fire, that would mean there were no MessageQ objects to be cleaned up during the MessageQ_destroy(), and so none were abandoned.

    > We put in a work around in the Restrack.c and the problem does not seems to be happen anymore

    Is that also the case after running the app overnight? 

    Thanks,
    - Gil
  • Hi Gil,

    We do not see the ResTrack issue anymore with the workaround, but we do think we may be masking the problem.  The system ran overnight just fine, and starting and stopping video also works fine.  I believe that because we call List_remove on the elem again when we check it, the assert is fixed, but the question still remains why the element is different in the first place.

    The removal fails in the ResTrack_remove function below: the comparison fails, so the elem does not get removed.  This causes the assert down the road in ResTrack_unregister.

    if ((*cmpFxn)((Void *)ref, (Void *)elem)) {
        //Osal_printf ("%s, Elem %p match ref %p \n", __func__, elem, ref);
        List_remove(proc->resList, elem); /* found it, remove from list */
        break;
    }

    We then remove it again in ResTrack_unregister, if the elem is not NULL - this fixes the assert check because the list is empty by then:

    if (elem != NULL) {
        List_remove(proc->resList, elem);
    }

    So we may be masking the issue.  Is there a way to find out why this compare fails: if ((*cmpFxn)((Void *)ref, (Void *)elem))?  Is there a way to print out the actual data when the element is created, and to see whether that data has changed?

    Thanks,

    Johny.

  • Johny:

      In syslink_2_20_02_20/packages/ti/syslink/ipc/hlos/knl/Linux/MessageQDrv.c you could trace the element fields being passed into ResTrack_push() (at CMD_MESSAGEQ_CREATE) and returned from ResTrack_remove() (at CMD_MESSAGEQ_DELETE), and compare them in the trace log.

      It may be possible that a client is passing in a bad handle to the driver CMD_MESSAGEQ_DELETE ioctl, which would cause the ResTrack_remove() call to be skipped.  But if that's the case, we should see some trace (it could have been earlier in the log):

        if (status < 0) {
            /* use MessageQDrv_ioctl in macro */
            GT_setFailureReason(curTrace, GT_4CLASS, "MessageQDrv_ioctl",
                    status, "MessageQ_delete failed");
        }

      Can you post the trace log?

    http://processors.wiki.ti.com/index.php/SysLink_UserGuide#Trace_types

    insmod syslink.ko TRACE=1 TRACEFAILURE=1

    Thanks,
    - Gil
  • Hi Gil,

    Sorry, I have been looking into other areas, so I did not get this to you sooner.   Below is the log of resources being created and deleted by MessageQ.

    WORKS

    MessageQDrv_ioctl, MessageQ_create name ServiceMgr_inMsgs, params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList da628000 and res da646000
    MessageQDrv_ioctl, MessageQ_create name ServiceMgr_outMsgs, params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList da628000 and res da65b000
    MessageQDrv_ioctl, MessageQ_create name uiaMaster, params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList da628000 and res da670000
    MessageQDrv_ioctl, MessageQ_create name uiaStarted, params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList da628000 and res da685000
    MessageQDrv_ioctl, MessageQ_create name OmxRpcRcmServer_3, params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList da628000 and res da69a000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList da628000 and res da6ac000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList da628000 and res da6b5000
    MessageQDrv_ioctl, MessageQ_create name OmxRpcRcmServer_OMX.TI.DUCATI.VIDDEC_Cb_3_0, params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList da628000 and res da6c4000
    MessageQDrv_ioctl, MessageQ_create name OmxRpcRcmServer_OMX.TI.DUCATI.VIDDEC_Cb_3_1, params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList da628000 and res da6d9000
    MessageQDrv_ioctl, MessageQ_create name OmxRpcRcmServer_OMX.TI.DUCATI.VIDDEC_Cb_3_2, params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList da628000 and res da6ee000
    MessageQDrv_ioctl, MessageQ_create name OmxRpcRcmServer_OMX.TI.DUCATI.VIDDEC_Cb_3_3, params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList da628000 and res da703000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList da628000 and res da715000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList da628000 and res da71e000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList da628000 and res da727000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList da628000 and res da730000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList da628000 and res da73f000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList da628000 and res da748000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList da628000 and res da754000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList da628000 and res da75d000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList da628000 and res da766000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList da628000 and res da76f000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList da628000 and res da778000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList da628000 and res da784000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList da628000 and res da78d000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList da628000 and res da799000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList da628000 and res da7a2000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList da628000 and res da7ae000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList da628000 and res da7bd000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList da628000 and res da7c6000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList da628000 and res da7cf000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList da628000 and res da7db000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList da628000 and res da7e4000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList da628000 and res da7f0000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList da628000 and res daddf000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList da628000 and res dadeb000

    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_unregister, ResTrack_Handle d727e000, pid 1701
    ResTrack_unregister, elem after List_deqeue is da62e000
    ResTrack_unregister, ResTrack_Handle d7296000, pid 1701
    ResTrack_unregister, elem after List_deqeue is da628000
    ResTrack_unregister, ResTrack_Handle d725a000, pid 1701
    ResTrack_unregister, elem after List_deqeue is da622000
    ResTrack_unregister, ResTrack_Handle d726c000, pid 1701
    ResTrack_unregister, elem after List_deqeue is da610000

    ASSERTION

    MessageQDrv_ioctl, MessageQ_create name ServiceMgr_inMsgs, params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList e7e4c000 and res e7e6a000
    MessageQDrv_ioctl, MessageQ_create name ServiceMgr_outMsgs, params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList e7e4c000 and res e7e7f000
    MessageQDrv_ioctl, MessageQ_create name uiaMaster, params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList e7e4c000 and res e7e94000
    MessageQDrv_ioctl, MessageQ_create name uiaStarted, params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList e7e4c000 and res e7ea9000
    MessageQDrv_ioctl, MessageQ_create name OmxRpcRcmServer_3, params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList e7e4c000 and res e7ebe000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList e7e4c000 and res e7ed0000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList e7e4c000 and res e7ed9000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList e7e4c000 and res e7ee5000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList e7e4c000 and res e7eee000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList e7e4c000 and res e7efa000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList e7e4c000 and res e7f03000
    MessageQDrv_ioctl, MessageQ_create name OmxRpcRcmServer_OMX.TI.DUCATI.VIDDEC_Cb_3_24, params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList e7e4c000 and res e7f12000
    MessageQDrv_ioctl, MessageQ_create name OmxRpcRcmServer_OMX.TI.DUCATI.VIDDEC_Cb_3_27, params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList e7e4c000 and res e7f27000
    MessageQDrv_ioctl, MessageQ_create name OmxRpcRcmServer_OMX.TI.DUCATI.VIDDEC_Cb_3_26, params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList e7e4c000 and res e7f3c000
    MessageQDrv_ioctl, MessageQ_create name OmxRpcRcmServer_OMX.TI.DUCATI.VIDDEC_Cb_3_25, params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList e7e4c000 and res e7f51000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList e7e4c000 and res e7f63000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList e7e4c000 and res e7f6c000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList e7e4c000 and res e7f78000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList e7e4c000 and res e7f81000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList e7e4c000 and res e7f8a000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList e7e4c000 and res e7f93000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList e7e4c000 and res e7f9c000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList e7e4c000 and res e7fa5000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList e7e4c000 and res e7fb7000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList e7e4c000 and res e7fc0000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList e7e4c000 and res e7fc9000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList e7e4c000 and res e7fd2000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList e7e4c000 and res e7fe1000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList e7e4c000 and res e7fea000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList e7e4c000 and res e7ff3000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList e7e4c000 and res e8c02000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList e7e4c000 and res e8c0e000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList e7e4c000 and res e8c17000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList e7e4c000 and res e8c20000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList e7e4c000 and res e8c29000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList e7e4c000 and res e8c35000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList e7e4c000 and res e8c41000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList e7e4c000 and res e8c53000
    MessageQDrv_ioctl, MessageQ_create name (null), params
    MessageQDrv_ioctl, Calling ResTrack_push handle d7296000, pid 1701
    ResTrack_push, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_push, Calling List_putHead with proc->resList e7e4c000 and res e8c5f000

    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    MessageQDrv_cmd_delete, MessageQ_delete handlePtr d3dbde98, status 0
    MessageQDrv_ioctl, Calling ResTrack_remove handle d7296000, pid 1701
    ResTrack_remove, ResTrack_Handle d7296000, pid 000006a5
    ResTrack_unregister, ResTrack_Handle d727e000, pid 1701
    ResTrack_unregister, elem after List_deqeue is e7e52000
    ResTrack_unregister, ResTrack_Handle d7296000, pid 1701
    ResTrack_unregister, elem after List_deqeue is e7e4c000
    Assertion at Line no: 409 in /home/johny/CodeHG/bsp-ti81xx-hg/mods/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/syslink/utils/hlos/knl/ResTrack.c: (elem == NULL) : failed
    ResTrack_unregister, elem after List_deqeue is e7e4c000
    Assertion at Line no: 409 in /home/johny/CodeHG/bsp-ti81xx-hg/mods/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/syslink/utils/hlos/knl/ResTrack.c: (elem == NULL) : failed
    ResTrack_unregister, elem after List_deqeue is e7e4c000
    Assertion at Line no: 409 in /home/johny/CodeHG/bsp-ti81xx-hg/mods/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/syslink/utils/hlos/knl/ResTrack.c: (elem == NULL) : failed
    ResTrack_unregister, elem after List_deqeue is e7e4c000
    Assertion at Line no: 409 in /home/johny/CodeHG/bsp-ti81xx-hg/mods/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/syslink/utils/hlos/knl/ResTrack.c: (elem == NULL) : failed
    ResTrack_unregister, elem after List_deqeue is e7e4c000
    ResTrack_unregister, ResTrack_Handle d725a000, pid 1701
    ResTrack_unregister, elem after List_deqeue is e7e46000
    ResTrack_unregister, ResTrack_Handle d726c000, pid 1701
    ResTrack_unregister, elem after List_deqeue is e7e34000
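    For what it's worth, here is a tiny standalone model of what the ResTrack bookkeeping in the trace above seems to be doing (an illustrative sketch only, not SysLink source; the names restrack_push/restrack_remove are made up to mirror the log): each MessageQ_create ioctl pushes a record onto the process's resource list, each MessageQ_delete removes one, and at unregister time the list must be empty. If the creates in the trace outnumber the deletes, an element is left behind and the (elem == NULL) check fails:

    ```c
    /*
     * Standalone model of the ResTrack bookkeeping (illustration only,
     * not SysLink code). A queue that is created but never deleted
     * leaves an element on the list and trips the (elem == NULL) check.
     */
    #include <assert.h>
    #include <stdio.h>

    typedef struct Res { struct Res *next; } Res;

    static Res *resList = NULL;             /* per-process resource list */

    static void restrack_push(Res *r)       /* models push on MessageQ_create */
    {
        r->next = resList;
        resList = r;
    }

    static void restrack_remove(void)       /* models remove on MessageQ_delete */
    {
        if (resList != NULL) {
            resList = resList->next;
        }
    }

    static int restrack_is_balanced(void)   /* models the unregister-time check */
    {
        return resList == NULL;             /* (elem == NULL) must hold */
    }

    int main(void)
    {
        static Res a, b, c;

        restrack_push(&a);                  /* three queues created ... */
        restrack_push(&b);
        restrack_push(&c);

        restrack_remove();                  /* ... but only two deleted */
        restrack_remove();

        printf("balanced: %d\n", restrack_is_balanced());  /* prints "balanced: 0" */
        return 0;
    }
    ```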

    Thanks,

    Johny.

  • Hi Gil,

    While debugging, we turned on GST_DEBUG=*omx*:5 and used gdb to investigate, and we hit this crash. Do you know why?

    Thanks,
    Johny.

    0:00:06.229963500 21031  0x1e62340 INFO                     omx gstomx_base_videodec.c:362:src_setcaps:<omxh264dec0> G_OMX_PORT_SET_DEFINITION

    0:00:06.230672650 21031  0x1e62340 DEBUG        video-connector gstvideoconnector.c:241:gst_video_connector_buffer_alloc:<videoconnector0> using pad videoconnector0:src for alloc

    0:00:06.406611600 21031  0x1e72f08 INFO                     omx gstomx_base_filter.c:150:change_state:<omxh264dec0> begin: changing state PAUSED -> PLAYING

    0:00:06.413456300 21031   0x4b1c68 INFO         video-connector gstvideoconnector.c:309:gst_video_connector_resend_new_segment:<videoconnector0> New segment requested, failed signal enabled: 1

    0:00:06.446810450 21031  0x1e62340 DEBUG        video-connector gstvideoconnector.c:250:gst_video_connector_buffer_alloc:<videoconnector0> buffer alloc finished: ok

    0:00:06.474934401 21031  0x1e62340 INFO                     omx gstomx_base_filter.c:587:pad_chain:<omxh264dec0> omx: play

    0:00:06.523867950 21031  0x1e72f08 INFO                     omx gstomx_base_filter.c:150:change_state:<omxh264dec1> begin: changing state PAUSED -> PLAYING

    0:00:06.586825200 21031  0x1e0b340 DEBUG        video-connector gstvideoconnector.c:267:gst_video_connector_setcaps:<videoconnector0> gst_video_connector_setcaps video/x-raw-yuv-strided, width=(int)1280, height=(int)720, format=(fourcc)NV12, rowstride=(int)1408, interlaced=(boolean)false 1

    0:00:06.587369050 21031  0x1e0b340 DEBUG        video-connector gstvideoconnector.c:371:gst_video_connector_chain:<videoconnector0> Pushing new segment event

    0:00:06.587443650 21031  0x1e0b340 DEBUG        video-connector gstvideoconnector.c:372:gst_video_connector_chain:<videoconnector0> ==> srcpad: 0x1d060c8

    0:00:06.858258450 21031  0x1e72f08 INFO                     omx gstomx_base_filter.c:150:change_state:<omxh264dec0> begin: changing state PLAYING -> PLAYING

    0:00:07.006599650 21031  0x1e95ed0 WARN                basesink gstbasesink.c:2875:gst_base_sink_is_too_late:<qvideosurfacegstsink2> warning: A lot of buffers are being dropped.

    0:00:07.006829150 21031  0x1e95ed0 WARN                basesink gstbasesink.c:2875:gst_base_sink_is_too_late:<qvideosurfacegstsink2> warning: There may be a timestamping problem, or this computer is too

    slow.

    0:00:07.040916250 21031  0x1e95ed0 WARN                basesink gstbasesink.c:2875:gst_base_sink_is_too_late:<qvideosurfacegstsink2> warning: A lot of buffers are being dropped.

    0:00:07.041207800 21031  0x1e95ed0 WARN                basesink gstbasesink.c:2875:gst_base_sink_is_too_late:<qvideosurfacegstsink2> warning: There may be a timestamping problem, or this computer is too

    slow.

    0:00:07.269344100 21031  0x1e0b340 WARN                basesink gstbasesink.c:2875:gst_base_sink_is_too_late:<qvideosurfacegstsink0> warning: A lot of buffers are being dropped.

    0:00:07.269544700 21031  0x1e0b340 WARN                basesink gstbasesink.c:2875:gst_base_sink_is_too_late:<qvideosurfacegstsink0> warning: There may be a timestamping problem, or this computer is too

    slow.

    0:00:07.338509700 21031  0x1e0b340 WARN                basesink gstbasesink.c:2875:gst_base_sink_is_too_late:<qvideosurfacegstsink0> warning: A lot of buffers are being dropped.

    0:00:07.338766300 21031  0x1e0b340 WARN                basesink gstbasesink.c:2875:gst_base_sink_is_too_late:<qvideosurfacegstsink0> warning: There may be a timestamping problem, or this computer is too

    slow.

    0:00:11.907251550 21031   0x4b1c68 INFO                     omx gstomx_base_filter.c:150:change_state:<omxh264dec0> begin: changing state PLAYING -> PAUSED

    0:00:11.933665250 21031 0x49204ff8 INFO                     omx gstomx_base_filter.c:514:output_loop:<omxh264dec0> pause task, reason:  wrong-state

    0:00:11.937895050 21031   0x4b1c68 INFO                     omx gstomx_base_filter.c:150:change_state:<omxh264dec0> begin: changing state PAUSED -> READY

    0:00:11.970297200 21031   0x4b1c68 INFO                     omx gstomx_base_filter.c:150:change_state:<omxh264dec0> begin: changing state READY -> NULL

    0:00:12.057307100 21031   0x4b1c68 INFO         video-connector gstvideoconnector.c:309:gst_video_connector_resend_new_segment:<videoconnector0> New segment requested, failed signal enabled: 1

    0:00:12.206400750 21031   0x4b1c68 INFO                     omx gstomx_base_filter.c:150:change_state:<omxh264dec1> begin: changing state PLAYING -> PAUSED

    0:00:12.232152350 21031 0x4923e7d0 INFO                     omx gstomx_base_filter.c:514:output_loop:<omxh264dec1> pause task, reason:  wrong-state

    0:00:12.236249050 21031   0x4b1c68 INFO                     omx gstomx_base_filter.c:150:change_state:<omxh264dec1> begin: changing state PAUSED -> READY

    0:00:12.263124000 21031   0x4b1c68 INFO                     omx gstomx_base_filter.c:150:change_state:<omxh264dec1> begin: changing state READY -> NULL

    ServiceMgr_rxThreadFxn: accept() failed with errno = 4

    ServiceMgr_rxThreadFxn: accept() failed with errUnable to handle kernel paging request at virtual address 00005904

    pgd = d17e0000

    [00005904] *pgd=967a2031, *pte=00000000, *ppte=00000000

    Internal error: Oops: 817 [#1]

    last sysfs file: /sys/devices/virtual/amp/amp0/amp

    Modules linked in: bufferclass_ti omaplfb pvrsrvkm power plm ipv6 gsm gpakLoader gc0308 amp touchlink_bl ti81xxvo ti81xxfb vpss syslink zwave secio rt3352 uhf345 lm75

    CPU: 0    Not tainted  (2.6.37-touchlink-07 #1)

    PC is at List_dequeue+0x8c/0xd4 [syslink]

    LR is at List_dequeue+0x2c/0xd4 [syslink]

    pc : [<bf0517b4>]    lr : [<bf051754>]    psr: 80000093

    sp : d2a51e08  ip : 00000000  fp : d2a51e2c

    r10: e94fc008  r9 : 00000000  r8 : e94fc010

    r7 : e94fc000  r6 : bf122b8c  r5 : d9c05400  r4 : e94fc008

    r3 : 00005900  r2 : e94fc008  r1 : 00050000  r0 : bf0c5cbc

    Flags: Nzcv  IRQs off  FIQs on  Mode SVC_32  ISA ARM  Segment user

    Control: 10c5387d  Table: 917e0019  DAC: 00000015

    Process python3 roubaix (pid: 21031, stack limit = 0xd2a502e8)

    Stack: (0xd2a51e08 to 0xd2a52000)

    1e00:                   d2a51e3c d72d5000 00000000 a0000013 e94fc008 d2a51ecc

    1e20: d2a51e4c d2a51e30 bf051ed8 bf051734 d2a51e5c d2a51e40 00000000 c018f361

    1e40: d2a51e8c d2a51e50 bf065c40 bf051e64 d2a51ecc 00000000 bf0774cc bf051244

    1e60: d9c05400 00000000 c018f361 00000050 00000050 00005227 d2a50000 bee04fb0

    1e80: d2a51ef4 d2a51e90 bf0a7c40 bf065b04 c018f361 bee04fb0 e94fc000 00000000

    1ea0: 54ba4400 000000a0 bee04fc8 000000a0 00000000 48812a70 00060000 00000000

    1ec0: d2a51f04 d2a51ed0 c00c0b80 00000000 00000000 d2a49880 00000050 00000050

    1ee0: bee04fb0 00000000 d2a51f04 d2a51ef8 c00e1ccc bf0a7b14 d2a51f74 d2a51f08

    1f00: c00e23dc c00e1cb0 d17d24d0 00000000 d17d24d0 d17d2420 d2a51f3c d2a51f28

    1f20: c00c2740 c00d19fc d17d24d0 d29aed80 d2a51f84 d2a51f40 c00c375c c00c26e8

    1f40: 674ab000 c009c6fc 00000000 00000000 bee04fb0 c018f361 00000050 d2a49880

    1f60: d2a50000 00000000 d2a51fa4 d2a51f78 c00e2474 c00e1ee8 d2a51fa4 00000001

    1f80: c00c37d4 48880e5c 48880e5c 488404b0 00000036 c004cfa8 00000000 d2a51fa8

    1fa0: c004ce00 c00e2428 48880e5c 48880e5c 00000050 c018f361 bee04fb0 00000050

    1fc0: 48880e5c 48880e5c 488404b0 00000036 bee04fdc 00000390 01c04f58 bee04f94

    1fe0: 01c04f58 bee04f78 487d0788 40510aec 20000010 00000050 00000000 00000000

    Backtrace:

    [<bf051728>] (List_dequeue+0x0/0xd4 [syslink]) from [<bf051ed8>] (List_get+0x80/0xd0 [syslink])

    r6:d2a51ecc r5:e94fc008 r4:a0000013

    [<bf051e58>] (List_get+0x0/0xd0 [syslink]) from [<bf065c40>] (MessageQ_get+0x148/0x290 [syslink])

    r5:c018f361 r4:00000000

    [<bf065af8>] (MessageQ_get+0x0/0x290 [syslink]) from [<bf0a7c40>] (MessageQDrv_ioctl+0x138/0x9b8 [syslink])

    [<bf0a7b08>] (MessageQDrv_ioctl+0x0/0x9b8 [syslink]) from [<c00e1ccc>] (vfs_ioctl+0x28/0x44)

    [<c00e1ca4>] (vfs_ioctl+0x0/0x44) from [<c00e23dc>] (do_vfs_ioctl+0x500/0x540)

    [<c00e1edc>] (do_vfs_ioctl+0x0/0x540) from [<c00e2474>] (sys_ioctl+0x58/0x7c)

    [<c00e241c>] (sys_ioctl+0x0/0x7c) from [<c004ce00>] (ret_fast_syscall+0x0/0x30)

    r8:c004cfa8 r7:00000036 r6:488404b0 r5:48880e5c r4:48880e5c

    Code: 15953000 03a05000 15843000 15953000 (15834004)

    no = 4

    ---[ end trace b970274322347bbd ]---

  • Hi Gil, 

We have more information and hope that you or Margarita can answer our question.

We found that MessageQDrv_ioctl gets called to create the resource:

     MessageQDrv_ioctl - ResTrack_push handle d7297000 - pid 2898 - resoure da6ba000 -  res->args.create.handle da6bd000

but there was no call to remove it, as you said:

    MessageQDrv_ioctl - ResTrack_remove handle d7297000 - pid 2898 - resoure d5f0fea8, compare bf0a7b74, elem d5f0fec4, res.args.create.handle da79b000
    MessageQDrv_resCmpFxn - handleA da79b000 - handleB da7cb000
    MessageQDrv_resCmpFxn - handleA da79b000 - handleB da7c2000
    MessageQDrv_resCmpFxn - handleA da79b000 - handleB da79b000
    ResTrack_remove - Found
    MessageQDrv_ioctl - ResTrack_remove handle d7297000 - pid 2898 - resoure d5f0fea8, compare bf0a7b74, elem d5f0fec4, res.args.create.handle da7cb000
    MessageQDrv_resCmpFxn - handleA da7cb000 - handleB da7cb000
    ResTrack_remove - Found
    MessageQDrv_ioctl - ResTrack_remove handle d7297000 - pid 2898 - resoure d5f0fea8, compare bf0a7b74, elem d5f0fec4, res.args.create.handle da7c2000
    MessageQDrv_resCmpFxn - handleA da7c2000 - handleB da7c2000
    ResTrack_remove - Found
    MessageQDrv_ioctl - ResTrack_remove handle d7297000 - pid 2898 - resoure d5f0fea8, compare bf0a7b74, elem d5f0fec4, res.args.create.handle da738000
    MessageQDrv_resCmpFxn - handleA da738000 - handleB da738000
    ResTrack_remove - Found
    MessageQDrv_ioctl - ResTrack_remove handle d7297000 - pid 2898 - resoure d5f0fea8, compare bf0a7b74, elem d5f0fec4, res.args.create.handle da72f000
    MessageQDrv_resCmpFxn - handleA da72f000 - handleB da72f000
    ResTrack_remove - Found
    MessageQDrv_ioctl - ResTrack_remove handle d7297000 - pid 2898 - resoure d5f0fea8, compare bf0a7b74, elem d5f0fec4, res.args.create.handle da6f6000
    MessageQDrv_resCmpFxn - handleA da6f6000 - handleB da6f6000
    ResTrack_remove - Found
    MessageQDrv_ioctl - ResTrack_remove handle d7297000 - pid 2898 - resoure d5f0fea8, compare bf0a7b74, elem d5f0fec4, res.args.create.handle da6d2000
    MessageQDrv_resCmpFxn - handleA da6d2000 - handleB da6d2000
    ResTrack_remove - Found
    MessageQDrv_ioctl - ResTrack_remove handle d7297000 - pid 2898 - resoure d5f0fea8, compare bf0a7b74, elem d5f0fec4, res.args.create.handle da6c9000
    MessageQDrv_resCmpFxn - handleA da6c9000 - handleB da6c9000
    ResTrack_remove - Found
    MessageQDrv_ioctl - ResTrack_remove handle d7297000 - pid 2898 - resoure d5f0fea8, compare bf0a7b74, elem d5f0fec4, res.args.create.handle da6a2000
    MessageQDrv_resCmpFxn - handleA da6a2000 - handleB da6bd000
    MessageQDrv_resCmpFxn - handleA da6a2000 - handleB da6b4000
    MessageQDrv_resCmpFxn - handleA da6a2000 - handleB da6a2000
    ResTrack_remove - Found
    MessageQDrv_ioctl - ResTrack_remove handle d7297000 - pid 2898 - resoure d5f0fea8, compare bf0a7b74, elem d5f0fec4, res.args.create.handle da68d000
    MessageQDrv_resCmpFxn - handleA da68d000 - handleB da6bd000
    MessageQDrv_resCmpFxn - handleA da68d000 - handleB da6b4000
    MessageQDrv_resCmpFxn - handleA da68d000 - handleB da68d000
    ResTrack_remove - Found
    MessageQDrv_ioctl - ResTrack_remove handle d7297000 - pid 2898 - resoure d5f0fea8, compare bf0a7b74, elem d5f0fec4, res.args.create.handle da64e000
    MessageQDrv_resCmpFxn - handleA da64e000 - handleB da6bd000
    MessageQDrv_resCmpFxn - handleA da64e000 - handleB da6b4000
    MessageQDrv_resCmpFxn - handleA da64e000 - handleB da678000
    MessageQDrv_resCmpFxn - handleA da64e000 - handleB da663000
    MessageQDrv_resCmpFxn - handleA da64e000 - handleB da64e000
    ResTrack_remove - Found
    MessageQDrv_ioctl - ResTrack_remove handle d7297000 - pid 2898 - resoure d5f0fea8, compare bf0a7b74, elem d5f0fec4, res.args.create.handle da678000
    MessageQDrv_resCmpFxn - handleA da678000 - handleB da6bd000
    MessageQDrv_resCmpFxn - handleA da678000 - handleB da6b4000
    MessageQDrv_resCmpFxn - handleA da678000 - handleB da678000
    ResTrack_remove - Found
    MessageQDrv_ioctl - ResTrack_remove handle d7297000 - pid 2898 - resoure d5f0fea8, compare bf0a7b74, elem d5f0fec4, res.args.create.handle da663000
    MessageQDrv_resCmpFxn - handleA da663000 - handleB da6bd000
    MessageQDrv_resCmpFxn - handleA da663000 - handleB da6b4000
    MessageQDrv_resCmpFxn - handleA da663000 - handleB da663000
    ResTrack_remove - Found
    HeapBufMPDrv_ioctl - CMD_HEAPBUFMP_DESTROY
    ResTrack_unregister - unregister pid 2898 - handle d727f000 - register count 7
    MessageQDrv_ioctl - CMD_MESSAGEQ_DESTROY
    ResTrack_unregister - unregister pid 2898 - handle d7297000 - register count 6
    ResTrack_unregister - unregister pid 2898 - handle d7297000 - elem da6ba000
    Assertion at Line no: 402 in /home/johny/CodeHG/bsp-ti81xx-hg/mods/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/sysl
    ink/utils/hlos/knl/ResTrack.c: (elem == NULL) : failed
    ResTrack_unregister - unregister pid 2898 - handle d7297000 - elem da6b1000
    Assertion at Line no: 402 in /home/johny/CodeHG/bsp-ti81xx-hg/mods/syslink_2_21_01_05/packages/ti/syslink/utils/hlos/knl/Linux/../../../../../../ti/sysl
    ink/utils/hlos/knl/ResTrack.c: (elem == NULL) : failed

So, do you know who is calling MessageQDrv_ioctl to do the create and the cleanup? Is it gst-openmax or ti-omx? We are looking through our code but cannot find it.

    Thanks,

    Johny.

  • Hi Gil and Gunter,

Attached are the logs for SysLink and OMX; you can see the handles match up as they get created and deleted.

The assert happens because there is no matching delete call on the OMX side.

    Thanks,

    Johny.

    log.zip
  • Johny:

       From the syslink.log, it appears the MessageQ handles which are not getting deleted in this run are:

    [2013-05-09 12:35:55]  knl/Linux/MessageQDrv.c MessageQDrv_resCmpFxn - handleA da660000 - handleB da6cf000 
    [2013-05-09 12:35:55]  knl/Linux/MessageQDrv.c MessageQDrv_resCmpFxn - handleA da660000 - handleB da6c6000
    [2013-05-09 12:35:55]  knl/Linux/MessageQDrv.c MessageQDrv_resCmpFxn - handleA da660000 - handleB da6ba000
    [2013-05-09 12:35:55]  knl/Linux/MessageQDrv.c MessageQDrv_resCmpFxn - handleA da660000 - handleB da6b1000

       That corresponds to the 6th to the 9th instance of MessageQ_create() since the start of the logging.

       If the omx.log is coordinated with the syslink.log, the 6th MessageQ_create() corresponds to:

    In OMX_GetHandle, component OMX.TI.DUCATI.VIDDEC, omxhandle 0x1fb5470
    Module<ti.omx> Entering<OmxProxy_commonInit> @line<2491>
    Module<ti.omx> @<OmxProxy_commonInit> @line<2492> msg<OMX.TI.DUCATI.VIDDEC>
    Module<ti.omx> Entering<omxproxy_map_component_name2info> @line<747>
    Module<ti.omx> Leaving<omxproxy_map_component_name2info> @line<764> with error<0:ErrorNone>
    Module<ti.omx> Entering<omxproxy_get_component_custom_config_info> @line<784>
    Module<ti.omx> Leaving<omxproxy_get_component_custom_config_info> @line<801> with error<0:ErrorNone>
    Module<ti.omx> @<OmxProxy_commonInit> @line<2565> msg<Before OmxRpc_Params_init>
    Module<ti.omx> Entering<OmxRpc_Params_init> @line<93>
    Module<ti.omx> Leaving<OmxRpc_Params_init> @line<99> with error<0:ErrorNone>
    Module<ti.omx> @<OmxProxy_commonInit> @line<2569> msg<After OmxRpc_Params_init>
    Module<ti.omx> @<OmxProxy_commonInit> @line<2579> msg<Before OmxRpc_create>
    Module<ti.omx> Entering<OmxRpc_object_create> @line<109>
    Module<ti.omx> Entering<OmxRpc_Instance_init> @line<574>
    Module<ti.omx> Entering<omxrpc_module_init_client> @line<326>
    Entered function:omxrpc_module_init_client (1)
    Module<ti.omx> Entering<OmxRpc_rcmClientCreate> @line<980>
    Entered function:OmxRpc_rcmClientCreate (0x48d9e850, OmxRpcRcmServer_1, 4)
    Module<ti.omx> @<OmxRpc_rcmClientCreate> @line<987> msg<Before RcmClient_Params_init>
    Module<ti.omx> @<OmxRpc_rcmClientCreate> @line<989> msg<After RcmClient_Params_init>
    Module<ti.omx> @<OmxRpc_rcmClientCreate> @line<994> msg<Before RcmClient_create>
    OmxRpc_rcmClientCreate - Before RcmClient_create
    Thu May  9 12:35:43 2013
    [ 1368124543.674754 ] MessageQ_Params_init - calling MessageQDrv_ioctl
    Thu May  9 12:35:43 2013
    ipc/hlos/usr/Linux/MessageQDrv.c MessageQDrv_ioctl - MessageQDrv_refCount 2
    Thu May  9 12:35:43 2013
    [ 1368124543.674954 ] MessageQ_create - calling MessageQDrv_ioctl CMD_MESSAGEQ_CREATE (null),
    Thu May  9 12:35:43 2013

       So, the MessageQ_create() appears to be associated with RcmClient_create() calls.

      I count 12 RcmClient_create() calls.    But I only see 4 RcmClient_delete() calls (it's possible the OMX code is not tracing all the deletes).

      This seems to substantiate my previous guess: that RcmClient_delete() is not being called enough times.  Each RcmClient instance creates 2 message queues of NULL name.

       See component-sources/framework_components_3_22_01_07/packages/ti/sdo/rcm/RcmClient.c:RcmClient_Instance_init():

        /* create the message queue for return messages */
        MessageQ_Params_init(&mqParams);
        obj->msgQue = MessageQ_create(NULL, &mqParams);

        /* create the message queue for error messages */
        MessageQ_Params_init(&mqParams);
        obj->errorMsgQue = MessageQ_create(NULL, &mqParams);

   You can turn on trace in Framework Components (http://processors.wiki.ti.com/index.php/Trace_in_Framework_Components) and rebuild the whole system to get better tracing on where RcmClient_create() and RcmClient_delete() are really being called.

        From the omx.log, I see three (3) calls to:  component-sources/omx_05_02_00_48/src//ti/omx/domx/OmxRpc.c:omxrpc_module_init_client()  (which calls RcmClient_create()).

    2623       Module<ti.omx> Leaving<omxrpc_module_init_client> @line<389> with error<0:ErrorNone>
    2643       Module<ti.omx> Leaving<omxrpc_module_init_client> @line<389> with error<0:ErrorNone>
    2719       Module<ti.omx> Leaving<omxrpc_module_init_client> @line<389> with error<0:ErrorNone>

       But, only one (1) call to:  omxrpc_module_deinit_client():  (which calls RcmClient_delete()).

    52812      omxrpc_module_deinit_client - Calling omxrpc_module_free_client_rsrcModule<ti.omx> Entering<omxrpc_module_free_client_rsrc> @line<191>

   The 2 extra RcmClient_create() calls without a corresponding RcmClient_delete() call, with 2 MessageQ_create() calls per RcmClient_create(), would account for the 4 missing MessageQ_delete() calls in the log.

       I hope this gives some further direction to search.   It appears to be somewhere in OmxRpc initialization and finalization.

    Regards,

    - Gil

  • Gil,

Hi, this is Johny's colleague Craig. I have been looking at our resource issues with regard to GStreamer and gst-openmax. After reading your post, I decided to take a closer look at the gst-openmax element plugin. One key thing I noticed is that every call into OMX happens on a separate thread for each pipeline. This means calls to stop(), which call g_omx_core_* (stop, unload, free), run on each of these threads. The same is true for g_omx_core_* (init, deinit). I had observed segfaults when we stopped a pipeline or changed its state from READY to NULL. In each case the g_omx_core_* functions are called.

Given that each pipeline runs in its own thread and the elements are used in each pipeline, I suspect calls into g_omx_core_* must be mutexed from the elements; in other words, lock, call g_omx_core_*, then unlock in each thread, to serialize the calls back into g_omx_core_*.

I assume the g_omx_core_* functions are NOT THREAD SAFE, because they call the OMX_* APIs (FreeHandle, SendCommand, DeInit), which appear not to be thread safe.

We have only seen problems when running with multiple pipelines; with a single pipeline we have no problems.

So I added GST_OBJECT_LOCK/UNLOCK to the plugin in the areas where we call into g_omx_core_*, and I am no longer experiencing the segfaults, but I still see the ResTrack assert. So are you saying the order in which the cleanup occurs could explain the ResTrack asserts? MessageQ_destroy() before an RcmClient_delete()? And if MessageQ_destroy() is called, should we not see any resource leaks?
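Roughly, the locking I added looks like this. This is only a sketch with a plain pthread mutex; the core_* names and the counter are illustrative, not the actual gst-openmax code:

```c
#include <assert.h>
#include <pthread.h>

/* Sketch: wrap each non-thread-safe g_omx_core_*-style call in one
 * process-wide lock so that pipeline threads cannot interleave inside
 * it.  core_users stands in for whatever shared state the real core
 * functions mutate without their own protection. */

static pthread_mutex_t core_lock = PTHREAD_MUTEX_INITIALIZER;
static int core_users = 0;

static void core_init_locked(void)
{
    pthread_mutex_lock(&core_lock);
    core_users++;               /* non-atomic work, done under the lock */
    pthread_mutex_unlock(&core_lock);
}

static void core_deinit_locked(void)
{
    pthread_mutex_lock(&core_lock);
    core_users--;
    pthread_mutex_unlock(&core_lock);
}
```

With every entry point funneled through the same mutex, the init/deinit calls from different pipeline threads are serialized instead of racing.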

    Thanks,

    Craig

  • Craig:

    > So are you saying the order we call clean occurs could explain the ResTrack Asserts?

      All MessageQ objects should be deleted before calling MessageQ_destroy().   As stated here:  http://e2e.ti.com/support/embedded/linux/f/354/p/254951/896888.aspx#896888

    "Looking more closely at the code (and a colleague verified this by example), not deleting a resource before destroying the module managing that resource will cause the assert in ResTrack.c."

    > MessageQ_destroy before a RcmClient_delete? 

       It appears that RcmClient_delete() is not being called for each RcmClient_create().    By adding the GST_OBJECT_LOCK/UNLOCK, were you able to verify that the number of RcmClient_delete() calls now match the number of RcmClient_create() calls?

    > And if MessageQ_destroy is called we should not see any resource leaks?

  The code in MessageQ_destroy() appears to clean up any abandoned MessageQ objects, but that doesn't prevent the asserts, because those come from the ResTrack module, and the MessageQ module doesn't use ResTrack directly. There shouldn't be any memory leaks once MessageQ_destroy() is called.
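To illustrate why the assert fires, here is a toy model of ResTrack-style bookkeeping (this is illustrative, not the SysLink source): each create pushes a record onto a per-process list, each delete removes it, and module destroy expects the list to be empty.

```c
#include <assert.h>
#include <stdlib.h>

/* Toy resource tracker: mimics the create-push / delete-remove /
 * destroy-must-be-empty pattern behind the ResTrack.c assertion. */

typedef struct Res {
    struct Res *next;
    void       *handle;
} Res;

static Res *resList = NULL;

static void res_push(void *handle)          /* on create */
{
    Res *r = malloc(sizeof(Res));
    r->handle = handle;
    r->next = resList;
    resList = r;
}

static int res_remove(void *handle)         /* on delete */
{
    Res **p = &resList;
    while (*p) {
        if ((*p)->handle == handle) {
            Res *dead = *p;
            *p = dead->next;
            free(dead);
            return 1;                       /* found and removed */
        }
        p = &(*p)->next;
    }
    return 0;                               /* created but never deleted */
}

/* Mirrors the (elem == NULL) check on unregister/destroy: any element
 * still queued means some create had no matching delete. */
static int module_destroy_is_clean(void)
{
    return resList == NULL;
}
```

A create without a matching delete leaves an element on the list, and the destroy-time emptiness check fails, which is exactly the assert pattern seen in the log.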

  Still, the thing to find out is whether the number of RcmClient_create() calls equals the number of RcmClient_delete() calls. Enabling FC trace (as mentioned in a previous post) can help track that; or use a Linux user-space debugger to look at the stack trace.

    Regards,
    - Gil


  • Hi Gil,

Attached is the log that I also emailed earlier when telling Gunter about the prints. I did a grep, and the calls do not seem to match up; I am studying the log to see what is missing.

It is odd that some of the Before and After prints are not in the same order; you can see three consecutive Before lines (highlighted in red in the attachment).

    Thanks,

    JOHNY.

    johny@johny-desktop:~$ grep RcmClient_create omx8.log
    Module<ti.omx> @<OmxRpc_rcmClientCreate> @line<1001> msg<Before RcmClient_create>
    <---> JOHNY <--->OmxRpc_rcmClientCreate - RcmClient_create status 0
    Module<ti.omx> @<OmxRpc_rcmClientCreate> @line<1004> msg<After RcmClient_create>
    Module<ti.omx> @<OmxRpc_rcmClientCreate> @line<1001> msg<Before RcmClient_create>
    <---> JOHNY <--->OmxRpc_rcmClientCreate - RcmClient_create status 0
    Module<ti.omx> @<OmxRpc_rcmClientCreate> @line<1004> msg<After RcmClient_create>
    Module<ti.omx> @<OmxRpc_rcmClientCreate> @line<1001> msg<Before RcmClient_create>
    Module<ti.omx> @<OmxRpc_rcmClientCreate> @line<1001> msg<Before RcmClient_create>
    Module<ti.omx> @<OmxRpc_rcmClientCreate> @line<1001> msg<Before RcmClient_create>
    <---> JOHNY <--->OmxRpc_rcmClientCreate - RcmClient_create status 0
    Module<ti.omx> @<OmxRpc_rcmClientCreate> @line<1004> msg<After RcmClient_create>
    Module<ti.omx> @<OmxRpc_rcmClientCreate> @line<1001> msg<Before RcmClient_create>
    <---> JOHNY <--->OmxRpc_rcmClientCreate - RcmClient_create status 0
    Module<ti.omx> @<OmxRpc_rcmClientCreate> @line<1004> msg<After RcmClient_create>
    Module<ti.omx> @<OmxRpc_rcmClientCreate> @line<1001> msg<Before RcmClient_create>
    <---> JOHNY <--->OmxRpc_rcmClientCreate - RcmClient_create status 0
    Module<ti.omx> @<OmxRpc_rcmClientCreate> @line<1004> msg<After RcmClient_create>
    Module<ti.omx> @<OmxRpc_rcmClientCreate> @line<1001> msg<Before RcmClient_create>
    <---> JOHNY <--->OmxRpc_rcmClientCreate - RcmClient_create status 0
    Module<ti.omx> @<OmxRpc_rcmClientCreate> @line<1004> msg<After RcmClient_create>
    Module<ti.omx> @<OmxRpc_rcmClientCreate> @line<1001> msg<Before RcmClient_create>
    <---> JOHNY <--->OmxRpc_rcmClientCreate - RcmClient_create status 0
    Module<ti.omx> @<OmxRpc_rcmClientCreate> @line<1004> msg<After RcmClient_create>
    Module<ti.omx> @<OmxRpc_rcmClientCreate> @line<1001> msg<Before RcmClient_create>
    <---> JOHNY <--->OmxRpc_rcmClientCreate - RcmClient_create status 0
    Module<ti.omx> @<OmxRpc_rcmClientCreate> @line<1004> msg<After RcmClient_create>
    Module<ti.omx> @<OmxRpc_rcmClientCreate> @line<1001> msg<Before RcmClient_create>
    <---> JOHNY <--->OmxRpc_rcmClientCreate - RcmClient_create status 0
    Module<ti.omx> @<OmxRpc_rcmClientCreate> @line<1004> msg<After RcmClient_create>
    <---> JOHNY <--->OmxRpc_rcmClientCreate - RcmClient_create status 0
    Module<ti.omx> @<OmxRpc_rcmClientCreate> @line<1004> msg<After RcmClient_create>
    <---> JOHNY <--->OmxRpc_rcmClientCreate - RcmClient_create status 0
    Module<ti.omx> @<OmxRpc_rcmClientCreate> @line<1004> msg<After RcmClient_create>


    johny@johny-desktop:~$ grep RcmClient_delete omx8.log
    <---> JOHNY <---> omxrpc_gethandle_freersrc - Calling RcmClient_delete with pComponentRcmClientETBHandlePtr 0x1fbb25c
    <---> JOHNY <---> omxrpc_gethandle_freersrc - Calling RcmClient_delete with pComponentRcmClientFTBHandlePtr 0x1fbb2e0
    Module<ti.omx> @<omxrpc_gethandle_freersrc> @line<180> msg<Before RcmClient_delete>
    Module<ti.omx> @<omxrpc_gethandle_freersrc> @line<182> msg<After RcmClient_delete>
    <---> JOHNY <---> omxrpc_gethandle_freersrc - Calling RcmClient_delete with pComponentRcmClientETBHandlePtr 0x210b6b4
    <---> JOHNY <---> omxrpc_gethandle_freersrc - Calling RcmClient_delete with pComponentRcmClientFTBHandlePtr 0x210b738
    Module<ti.omx> @<omxrpc_gethandle_freersrc> @line<180> msg<Before RcmClient_delete>
    Module<ti.omx> @<omxrpc_gethandle_freersrc> @line<182> msg<After RcmClient_delete>
    <---> JOHNY <---> omxrpc_gethandle_freersrc - Calling RcmClient_delete with pComponentRcmClientETBHandlePtr 0x1dbe29c
    <---> JOHNY <---> omxrpc_gethandle_freersrc - Calling RcmClient_delete with pComponentRcmClientFTBHandlePtr 0x1dbe320
    Module<ti.omx> @<omxrpc_gethandle_freersrc> @line<180> msg<Before RcmClient_delete>
    Module<ti.omx> @<omxrpc_gethandle_freersrc> @line<182> msg<After RcmClient_delete>
    Module<ti.omx> @<omxrpc_module_free_client_rsrc> @line<194> msg<Before RcmClient_delete>
    Module<ti.omx> @<omxrpc_module_free_client_rsrc> @line<199> msg<After RcmClient_delete>

    syslink_omx.zip
  • Hi Johny:

    >   It is odd that some of the Before and After print is not in same order

      This may be due to the multi-threaded nature of the application, since there are multiple pipelines.

  This indeed makes it difficult to follow the flow of the trace, since the points where the context switches occur are not evident.

      However, I looked at this new trace log, and here is what I am able to deduce:

      From syslink8.log, the two unfreed MessageQ objects are:

    [2013-05-11 13:25:27]  knl/Linux/MessageQDrv.c MessageQDrv_resCmpFxn - handleA daeae000 - handleB daf08000 
    [2013-05-11 13:25:27]  knl/Linux/MessageQDrv.c MessageQDrv_resCmpFxn - handleA daeae000 - handleB daeff000

  From omx8.log, annotating the trace with line numbers, the MessageQ_create() calls for these objects appear to occur within the context (probably the same thread context) of the very first RcmClient_create() call:

    1916: Module<ti.omx> @<OmxRpc_rcmClientCreate> @line<1001> msg<Before RcmClient_create>
    [ ... ]
    1935:
    [ 1368300313.62735 ] MessageQ_create - calling MessageQDrv_ioctl CMD_MESSAGEQ_CREATE handle->knlObject 0xdaeff000,
    [...]
    1971:
    [ 1368300313.143238 ] MessageQ_create - calling MessageQDrv_ioctl CMD_MESSAGEQ_CREATE handle->knlObject 0xdaf08000,
    [...]
    2065:
    Module<ti.omx> @<OmxRpc_rcmClientCreate> @line<1004> msg<After RcmClient_create>

      And, before that, starting at line 1894, is the first OMX_GetHandle() call to the Video Decoder:

    In OMX_GetHandle, component OMX.TI.DUCATI.VIDDEC, omxhandle 0x1f15ba0
    <---> JOHNY <---> DomxProxy_OMX_TI_VIDDEC_ComponentInit - calling OmxProxy_commonInit
    Module<ti.omx> Entering<OmxProxy_commonInit> @line<2491>
    Module<ti.omx> @<OmxProxy_commonInit> @line<2492> msg<OMX.TI.DUCATI.VIDDEC>
    Module<ti.omx> Entering<omxproxy_map_component_name2info> @line<747>
    Module<ti.omx> Leaving<omxproxy_map_component_name2info> @line<764> with error<0:ErrorNone>
    Module<ti.omx> Entering<omxproxy_get_component_custom_config_info> @line<784>
    Module<ti.omx> Leaving<omxproxy_get_component_custom_config_info> @line<801> with error<0:ErrorNone>
    Module<ti.omx> @<OmxProxy_commonInit> @line<2565> msg<Before OmxRpc_Params_init>
    Module<ti.omx> Entering<OmxRpc_Params_init> @line<93>
    Module<ti.omx> Leaving<OmxRpc_Params_init> @line<99> with error<0:ErrorNone>
    Module<ti.omx> @<OmxProxy_commonInit> @line<2569> msg<After OmxRpc_Params_init>
    Module<ti.omx> @<OmxProxy_commonInit> @line<2579> msg<Before OmxRpc_create>
    Module<ti.omx> Entering<OmxRpc_object_create> @line<109>
    Module<ti.omx> Entering<OmxRpc_Instance_init> @line<582>
    Module<ti.omx> Entering<omxrpc_module_init_client> @line<326>
    Entered function:omxrpc_module_init_client remoteCoreId (1)
    <-- JOHNY --> omxrpc_module_init_client - Calling OmxRpc_rcmClientCreate with rcmHndlPtr 0x4998f860 name OmxRpcRcmServer_1 msgHeapId 4
    Module<ti.omx> Entering<OmxRpc_rcmClientCreate> @line<987>
    Entered function:OmxRpc_rcmClientCreate (0x4998f860, OmxRpcRcmServer_1, 4)
    Module<ti.omx> @<OmxRpc_rcmClientCreate> @line<994> msg<Before RcmClient_Params_init>
    Module<ti.omx> @<OmxRpc_rcmClientCreate> @line<996> msg<After RcmClient_Params_init>
    Module<ti.omx> @<OmxRpc_rcmClientCreate> @line<1001> msg<Before RcmClient_create>

Looking at the trace, I also note that the first two calls to omxrpc_module_init_client() are getting interleaved (contended) by two threads calling OMX_GetHandle(). I see that the third call to omxrpc_module_init_client() completes without contention, meaning the other two have completed and one of them set the GLOBAL flag OmxRpc_module->remoteCoreRcmClient[remoteCoreId].initDone to TRUE.

    Note that this flag is checked in omxrpc_module_init_client() to determine if it is necessary to create an OmxRpc instance.  

It seems that when there are multiple simultaneous pipelines, parallel threads call into this function, and the global flag is not yet set by the first call before a second call interrupts and starts to create another (unnecessary) RcmClient instance. The design seems to expect that only ONE omxrpc_module_init_client() call creates the single RcmClient instance for the OmxRpc module structure (which is a global variable in the process).

So, my guess is that if you can serialize the calls from multiple threads into this particular function, it will operate as expected and call OmxRpc_rcmClientCreate() only once. This will then match the ONE omxrpc_module_deinit_client(), which is correctly called when the number of instances drops to zero (on the last OMX_FreeHandle()).

This also explains why in the previous logs I saw two pairs of MessageQ objects not being deleted, corresponding to two extra RcmClient_create() calls; that was just a function of the random behaviour of that particular multi-threaded run.
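As a sketch of the fix (illustrative names, not the actual OmxRpc code): the test of the initDone flag and the setting of it must happen under the same lock, so that only the first caller performs the one-time client creation.

```c
#include <assert.h>
#include <pthread.h>
#include <stdbool.h>

/* Sketch of closing the race described above: without the lock, two
 * threads can both see initDone == false and both run the one-time
 * creation.  With check-and-set under one mutex, exactly one does. */

static pthread_mutex_t init_lock = PTHREAD_MUTEX_INITIALIZER;
static bool initDone = false;
static int  createCalls = 0;    /* counts RcmClient_create-style work */

static void module_init_client(void)
{
    pthread_mutex_lock(&init_lock);
    if (!initDone) {            /* checked and set atomically w.r.t. peers */
        createCalls++;          /* the expensive one-time creation goes here */
        initDone = true;
    }
    pthread_mutex_unlock(&init_lock);
}
```

With this pattern, any number of pipeline threads may call module_init_client() concurrently, but the creation runs exactly once, so there is exactly one matching deinit to perform later.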

    Regards,
    - Gil
  • Hi Johny:

      Have you managed to serialize the calls to omxrpc_module_init_client() ?

      If so, you should see in the log alternating "Entering..."/ "Leaving ..." trace like this  (with no interleaving "Entering<omxrpc_module_init_client>" trace):

    Module<ti.omx> Entering<omxrpc_module_init_client> @line<326>
    Entered function:omxrpc_module_init_client remoteCoreId (1)
    [....]
    Module<ti.omx> Leaving<omxrpc_module_init_client> @line<342> with error<0:ErrorNone>

      If so, let me know if that did resolve the issue.

    Regards,
    - Gil
We may still be missing some locking, because I still see the interleaving issue with omxrpc_module_init_client() when it fails.

When it works, this is what I see:

    <-- JOHNY --> OmxRpc_Instance_init - Calling omxrpc_module_init_client on remoteCoreId 1
    Module<ti.omx> Entering<omxrpc_module_init_client> @line<326> 
    Entered function:omxrpc_module_init_client remoteCoreId (1)
    <-- JOHNY --> omxrpc_module_init_client - Calling OmxRpc_rcmClientCreate with rcmHndlPtr 0x48b94ef0 name OmxRpcRcmServer_1 msgHeapId 4
    omxrpc_module_init_client: Located the remoteCoreRcmServer
    <--- JOHNY ---> omxrpc_module_init_client - seting pc_module->remoteCoreRcmClient[1].initDone = TRUE 
    Module<ti.omx> Leaving<omxrpc_module_init_client> @line<393> with error<0:ErrorNone>
    <-- JOHNY --> OmxRpc_Instance_init - Calling omxrpc_module_init_client on remoteCoreId 1
    Module<ti.omx> Entering<omxrpc_module_init_client> @line<326> 
    Entered function:omxrpc_module_init_client remoteCoreId (1)
    Module<ti.omx> Leaving<omxrpc_module_init_client> @line<342> with error<0:ErrorNone>
    <-- JOHNY --> OmxRpc_Instance_init - Calling omxrpc_module_init_client on remoteCoreId 1
    Module<ti.omx> Entering<omxrpc_module_init_client> @line<326> 
    Entered function:omxrpc_module_init_client remoteCoreId (1)
    Module<ti.omx> Leaving<omxrpc_module_init_client> @line<342> with error<0:ErrorNone>
    When it fails:
    <-- JOHNY --> OmxRpc_Instance_init - Calling omxrpc_module_init_client on remoteCoreId 1
    Module<ti.omx> Entering<omxrpc_module_init_client> @line<326> 
    Entered function:omxrpc_module_init_client remoteCoreId (1)
    <-- JOHNY --> omxrpc_module_init_client - Calling OmxRpc_rcmClientCreate with rcmHndlPtr 0x48b94ef0 name OmxRpcRcmServer_1 msgHeapId 4
    omxrpc_module_init_client: Located the remoteCoreRcmServer
    <-- JOHNY --> OmxRpc_Instance_init - Calling omxrpc_module_init_client on remoteCoreId 1
    Module<ti.omx> Entering<omxrpc_module_init_client> @line<326> 
    Entered function:omxrpc_module_init_client remoteCoreId (1)
    <-- JOHNY --> omxrpc_module_init_client - Calling OmxRpc_rcmClientCreate with rcmHndlPtr 0x48b94ef0 name OmxRpcRcmServer_1 msgHeapId 4
    omxrpc_module_init_client: Located the remoteCoreRcmServer
    <-- JOHNY --> OmxRpc_Instance_init - Calling omxrpc_module_init_client on remoteCoreId 1
    Module<ti.omx> Entering<omxrpc_module_init_client> @line<326> 
    Entered function:omxrpc_module_init_client remoteCoreId (1)
    <-- JOHNY --> omxrpc_module_init_client - Calling OmxRpc_rcmClientCreate with rcmHndlPtr 0x48b94ef0 name OmxRpcRcmServer_1 msgHeapId 4
    omxrpc_module_init_client: Located the remoteCoreRcmServer
    <--- JOHNY ---> omxrpc_module_init_client - seting pc_module->remoteCoreRcmClient[1].initDone = TRUE 
    Module<ti.omx> Leaving<omxrpc_module_init_client> @line<393> with error<0:ErrorNone>
    <--- JOHNY ---> omxrpc_module_init_client - seting pc_module->remoteCoreRcmClient[1].initDone = TRUE 
    Module<ti.omx> Leaving<omxrpc_module_init_client> @line<393> with error<0:ErrorNone>
    <--- JOHNY ---> omxrpc_module_init_client - seting pc_module->remoteCoreRcmClient[1].initDone = TRUE 
    Module<ti.omx> Leaving<omxrpc_module_init_client> @line<393> with error<0:ErrorNone>
    Notice there are 3 initDone flags set instead of 1, and another time it failed there were two initDone flags set (omx35 log).
    So we are still chasing it.
    Johny.
    omx.log.zip
  • Hi Gil,

    My theory on the initDone flag being set is not correct ... attached is a good log that actually has 3 initDone flags set to TRUE, and the omxrpc_module_init_client calls are out of order.

    <-- JOHNY --> OmxRpc_Instance_init - Calling omxrpc_module_init_client on remoteCoreId 1
    Module<ti.omx> Entering<omxrpc_module_init_client> @line<326> 
    Entered function:omxrpc_module_init_client remoteCoreId (1)
    <-- JOHNY --> omxrpc_module_init_client - Calling OmxRpc_rcmClientCreate with rcmHndlPtr 0x48b94ef0 name OmxRpcRcmServer_1 msgHeapId 4
    omxrpc_module_init_client: Located the remoteCoreRcmServer
    <-- JOHNY --> OmxRpc_Instance_init - Calling omxrpc_module_init_client on remoteCoreId 1
    Module<ti.omx> Entering<omxrpc_module_init_client> @line<326> 
    Entered function:omxrpc_module_init_client remoteCoreId (1)
    <-- JOHNY --> omxrpc_module_init_client - Calling OmxRpc_rcmClientCreate with rcmHndlPtr 0x48b94ef0 name OmxRpcRcmServer_1 msgHeapId 4
    omxrpc_module_init_client: Located the remoteCoreRcmServer
    <-- JOHNY --> OmxRpc_Instance_init - Calling omxrpc_module_init_client on remoteCoreId 1
    Module<ti.omx> Entering<omxrpc_module_init_client> @line<326> 
    Entered function:omxrpc_module_init_client remoteCoreId (1)
    <-- JOHNY --> omxrpc_module_init_client - Calling OmxRpc_rcmClientCreate with rcmHndlPtr 0x48b94ef0 name OmxRpcRcmServer_1 msgHeapId 4
    omxrpc_module_init_client: Located the remoteCoreRcmServer
    <--- JOHNY ---> omxrpc_module_init_client - seting pc_module->remoteCoreRcmClient[1].initDone = TRUE 
    Module<ti.omx> Leaving<omxrpc_module_init_client> @line<393> with error<0:ErrorNone>
    <--- JOHNY ---> omxrpc_module_init_client - seting pc_module->remoteCoreRcmClient[1].initDone = TRUE 
    Module<ti.omx> Leaving<omxrpc_module_init_client> @line<393> with error<0:ErrorNone>
    <--- JOHNY ---> omxrpc_module_init_client - seting pc_module->remoteCoreRcmClient[1].initDone = TRUE 
    Module<ti.omx> Leaving<omxrpc_module_init_client> @line<393> with error<0:ErrorNone>
    Johny.
Johny:

        This log indicates that:

        1. RcmClient_delete() is never called after the three RcmClient_create() calls (so this must not be a complete log); and

        2. The calls to omxrpc_module_init_client() are still not getting serialized, meaning two extra RcmClient_create() calls are still happening which are not going to get deleted by the current OmxRpc code.

        If "good log" means that the pipelines all ran and then shut down cleanly without any asserts and without any code modification, this log does not corroborate that.

       Can you please send me the updated code with the locking so I may review?

    Thanks,
    - Gil
  • Hi Gil,

    We have not added the locking in the omx code; Craig has been trying the lock on the openmax side only. We are going to try using Gate_enterModule in the omx code next.

    "Good log" means that the assert did not happen. I added print statements inside the RcmClient_create function in the framework, but I do not see the printed messages; I am not sure how it all links together.

    Thanks,

    Johny.

  • Hi Gil,

    I have moved the Gate_[enter|leave]Module calls in omxrpc_module_init_client and omxrpc_module_deinit_client to encompass the entire functions, to make the init/deinit thread-safe. So far it looks promising in my tests, and I am no longer seeing the ResTrack ASSERT. Johny still needs to test under his setup tomorrow, but my preliminary testing looks good.

    As you determined from the logs, the inits were overlapping. We should have more tomorrow once Johny tests on his setup.

    Thanks,

    Craig

  • Hi Gil,

    Based on what Gunter told us about the Gate functions (info from you, I believe), here is what we did:

    In the openmax driver:

    ===================================================================
    --- a/OmxRpc.c
    +++ b/OmxRpc.c
    @@ -328,6 +328,9 @@
       DOMX_UTL_TRACE_FUNCTION_ASSERT ((TRUE ==
                                        OmxRpc_module->localCoreRcmServer.initDone),
                                       "serverInitNotDone");
    +
    +  key = Gate_enterModule ();
    +
       if (((int) remoteCoreId < DomxTypes_coreFirst) ||
           ((int) remoteCoreId > DomxTypes_coreLast))
       {
    @@ -337,13 +340,13 @@
       {
         /* Return if init is already done */
         DOMX_UTL_TRACE_FUNCTION_EXIT_LEVEL1 (OmxRpc_errorNone);
    +    Gate_leaveModule (key);
         return OmxRpc_errorNone;
       }
       if (OmxRpc_errorNone == retVal)
       {
         RcmClient_Handle *rcmHndlPtr;
    
    -    key = Gate_enterModule ();
         OmxRpc_module->remoteCoreRcmClient[remoteCoreId].numInstances = 0;
         OmxRpc_genCoreRcmServerName ((char *) OmxRpc_module->
                                      remoteCoreRcmClient[remoteCoreId].name,
    @@ -371,7 +374,6 @@
         retVal = OmxRpc_rcmClientRemoteFxnLocate (*rcmHndlPtr,
                                                   OmxRpc_remoteFxnInit, pFxnsTbl);
       }
    -  Gate_leaveModule (key);
     }
     if (OmxRpc_errorNone != retVal)
     {
    @@ -379,11 +381,10 @@
     }
     else
     {
    -  key = Gate_enterModule ();
       OmxRpc_module->remoteCoreRcmClient[remoteCoreId].initDone = TRUE;
    -  Gate_leaveModule (key);
     }
     DOMX_UTL_TRACE_FUNCTION_EXIT_LEVEL1 (retVal);
    +  Gate_leaveModule (key);
     return retVal;
     }
    
    @@ -396,18 +397,18 @@
     static OmxRpc_errorType omxrpc_module_deinit_client (DomxTypes_coreType
                                                          remoteCoreId)
     {
    +  IArg key;
       OmxRpc_errorType retVal;
    
    +  key = Gate_enterModule ();
       retVal = omxrpc_module_free_client_rsrc (remoteCoreId,
                                                OMXRPC_RSRC_MODULE_CLIENT_INIT_ALL);
       if (OmxRpc_errorNone == retVal)
       {
    -    IArg key;
    
    -    key = Gate_enterModule ();
         OmxRpc_module->remoteCoreRcmClient[remoteCoreId].initDone = FALSE;
    -    Gate_leaveModule (key);
       }
    +  Gate_leaveModule (key);
       return retVal;
     }

    Does this look ok to you?

    Craig

  • Craig:

       I recommend putting the line:

    key = Gate_enterModule ();

       right at the beginning of omxrpc_module_init_client(), before any other code statements.

       The rest looks fine.

        The documentation of this function is here:

    http://rtsc.eclipse.org/cdoc-tip/xdc/runtime/Gate.html


    Regards,
    - Gil
  • Gil,


    OK, we will put it at the very beginning of omxrpc_module_init_client() and retest.

    We will post traces for BEFORE (without the Gate properly located) and AFTER (with Gate_enterModule at the very beginning and Gate_leaveModule at the very end). Then we should clearly see the difference in the trace as well.


    Regards,

    --Gunter