This thread has been locked.

I need help trying to understand and modify the McFW Usecase Examples on the DM8127 IPNC RDK

I have been using and experimenting with the DM8127 IPNC RDK for several months now. I am using the latest release GA Release 2.0. A number of times I have attempted simple modifications to one of the McFW Example Usecases (i.e. - Tri-Stream Full Featured, DSP, or Low Power). In almost every case the outcome is a "system crash". For instance, I often see this in the serial terminal after my modified McFW Usecase Examples / chains fail:

TimeOut occure in boot_proc.
Program exit.
TimeOut occure in boot_proc.
Program exit.
ApproDrvExit: 7
Error: WaitStreamReady Fail.
Error: SemWait: Invalid Semaphore handler

Sometimes they fail in other ways.

I have read the IPNC RDK Multi Channel FrameWork Software User Guide (Document Revision 1.04) many times. I have grepped the McFW API and usecase example source code repeatedly, and I think I've looked at almost every relevant file multiple times. The McFW documentation makes it sound like an easy process to connect links to form chains and then do some "high level configuration" to set the parameters and control it all; I'm finding it not easy at all. I'm a very experienced embedded C programmer (25 years), and I usually don't have nearly this much trouble when working with a new platform / framework API.

I feel like I must be "missing something". Is there other documentation for the McFW? Are there more fundamental and simple examples that I should be starting with? Should I even be trying to use the McFW at all? Is there an alternative framework I could / should be using? I feel like I might gain a better understanding of the platform if I worked directly with the Link APIs but I don't know of any simple tutorial examples to learn that either.

For the most part, I've been focusing my attention on modifying and understanding these files:

ti_mcfw_ipnc_main.c

multich_tristream_dsp.c

multich_tristream_fullFeature.c

Some of the things I'd like to be able to modify and understand are:

1. How to modify the resolution or frame rate of one or more of the camera inputs and have the change carry correctly all the way through the chain to an RTP/RTSP stream (e.g. make Camera(0) output 640x480 at 30 fps and Camera(1) output 1080p at 30 fps in the DSP Usecase Example, with both chains working all the way to the RTP/RTSP streams).

2. How to insert a Scaler link somewhere in one of the chains (i.e. - between the DUP and IPC_FRAMES_OUT_VPSS links in the DSP Usecase Example).

3. How to turn off or delete one of the streams if I don't need all 3.

I've tried the obvious (to me) ways to do these things, but as mentioned previously, they never seem to work correctly and completely. It really doesn't seem to me that there is truly a single "high level configuration" for the McFW. It seems like several or many different things (some hidden in places I haven't found yet) may need to be configured to get a chain to work.

Can anyone provide me with modified usecase examples that accomplish any of those goals?

Thanks for any help, advice, examples, tutorials or documentation you can give me!

  • Hi Allen.

    Maybe this example can help you a little: 4073.test.tar.gz

    Three files are changed:
    DM8127_GA_Release_2.0.0_2/ipnc_rdk/ipnc_mcfw/mcfw/src_linux/mcfw_api/usecases/multich_tristream_dsp.c
    DM8127_GA_Release_2.0.0_2/ipnc_rdk/ipnc_mcfw/mcfw/src_bios6/links_c6xdsp/swosd/osdLink_alg.c
    DM8127_GA_Release_2.0.0_2/ipnc_rdk/ipnc_mcfw/mcfw/src_linux/links/ipcBitsIn/ipcBitsInLink_tsk.c

    This is the modified multich_tristream_dsp; you can use it to test the DSP at D1 resolution.

    multich_tristream_dsp.c: the default DSP usecase is modified so that the DSP OSD link is moved from its default position between capture and encode to a new blind branch, which you can use for video analytics at D1 resolution.

    osdLink_alg.c: it is no longer used for OSD logo rendering; instead it computes a simple average of all luma (Y) bytes. If the average value is less than 25 (a very dark video frame; the full range is 0-255), it sends a message via Syslink to the ARM Cortex-A8, which prints that value in the terminal.

    ipcBitsInLink_tsk.c: receives the Syslink message from the OSD link and prints the value in the terminal.

    You have to select DSP USECASE in the web interface: Live Video -> Example -> DSP USECASE.


    Regards.

  • Hi Marko,

    Thank you very much for your help and the example. It has given me some clues. And I've had a little bit of success in building my own chains. But I am still getting these types of "failures" with most of the chains I attempt to build and run:

    TimeOut occure in boot_proc.
    Program exit.
    TimeOut occure in boot_proc.
    Program exit.
    ApproDrvExit: 7
    Error: WaitStreamReady Fail.
    Error: SemWait: Invalid Semaphore handler

    For instance, I was able to build and run this chain successfully:

                                     Capture (YUV420)
                                     RSZA        RSZB
                                  1080P_60fps   D1_30fps
                                      |            |
                           ___________|            |
                           |                       |
                          DUP(0)------|            |
                          (1)         --------------
                           |                |
                           |                |
                        SCALER(VPS)         |
                     (to VGA_60fps)         |
                           |                |
                           |                |
                     FRAMESOUT(VPS)         |
                           |                |
                           |              MERGE
                      FRAMESIN(DSP)         |
                           |                |
                           |            IPCM3OUT(VPS)
                       SWOSD(DSP)           |
                           |                |
                           |            IPCM3IN(VID)
                     FRAMESOUT(VPS)         |
                           |                |
                           |             H264ENC(VID)
                      MJPEGENC(VPS)         |
                       (SIMCOP)             |
                           |             BITSOUT(VID)
                        BITSOUT(VPS)        |
                           |                |
                           |             BITSIN(A8)
                        BITSIN(A8)

    But no matter what I tried I could not get this similar chain (with the simple addition of a second DUP and a NULLSINK link) to work:

                                     Capture (YUV420)
                                     RSZA        RSZB
                                  1080P_60fps   D1_30fps
                                      |            |
                           ___________|            |
                           |                       |
                          DUP(0)------|            |
                          (1)         --------------
                           |                |
                           |                |
                        SCALER(VPS)         |
                     (to VGA_60fps)         |
                           |                |
                           |                |
                          DUP(0)-----       |
                          (1)       |       |
                           |    NULLSINK    |
                           |                |
                     FRAMESOUT(VPS)         |
                           |                |
                           |              MERGE
                      FRAMESIN(DSP)         |
                           |                |
                           |            IPCM3OUT(VPS)
                       SWOSD(DSP)           |
                           |                |
                           |            IPCM3IN(VID)
                     FRAMESOUT(VPS)         |
                           |                |
                           |             H264ENC(VID)
                      MJPEGENC(VPS)         |
                       (SIMCOP)             |
                           |                |
                           |             BITSOUT(VID)
                        BITSOUT(VPS)        |
                           |                |
                           |             BITSIN(A8)
                        BITSIN(A8)

    Can anyone explain why this should or shouldn't work?

    I do not understand what link types are allowed to be connected to what other link types. And I also often don't understand what the correct settings should be for the parameters "notifyNextLink", "notifyPrevLink", and "noNotifyMode" for any particular link-to-link connection. I try to make intelligent guesses, but as I have mentioned, usually my chains do not run for reasons that are unknown to me. What is the best way to figure out what these parameters should be?
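    For what it's worth, the connection pattern I have been copying from the stock usecase files looks like the fragment below (a hypothetical DUP hookup; "cameraId", "scalerId", and "mergeId" are placeholder link IDs, not real RDK constants, and the right notify settings for any given connection are exactly what I am unsure of):

```c
/* Hypothetical fragment in the style of the stock multich_*.c
 * usecase files. The link IDs are placeholders and the notify
 * flags are my guesses, not verified settings. */
dupPrm.inQueParams.prevLinkId    = cameraId;   /* upstream link          */
dupPrm.inQueParams.prevLinkQueId = 0;          /* its output queue index */
dupPrm.numOutQue                 = 2;
dupPrm.outQueParams[0].nextLink  = scalerId;
dupPrm.outQueParams[1].nextLink  = mergeId;
dupPrm.notifyNextLink            = TRUE;       /* signal downstream on new frames */
```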

    Other than looking through the source code for every single type of link, is there any document which gives an overview of using the Link API?

    So that someone may be able to help me, I will be a little more specific about the usecase I am trying to construct. I am attempting to derive my usecase from the example "DSP" usecase (included with the DM8127 IPNC RDK). I would like to have one 1080P H.264 stream available for output (just as in the example "DSP" usecase). I would also like a VGA (640x480) chain which is passed into the DSP for analytics and a minor amount of graphics overlay (i.e. a few highlighted pixels here and there). I would like this VGA chain to be available as an H.264 stream (as it looks before the DSP analytics and graphics are applied) and also as a frame-rate-reduced MJPEG stream (as it looks after the DSP analytics and graphics are added).

    I don't believe I want or need the RSZB (or 2nd output queue) from the CAMERA link. But I don't know how to get rid of it or change its resolution. Can anyone tell me how?

    I believe I should be able to take the RSZA (or 1st) output queue from the CAMERA link, run it through a SCALER link to get the VGA resolution, then DUP that, with one branch going to the DSP and on to the MJPEG encoding, and the other branch merging back to the H.264 encoder for streaming.

    I had previously attempted to scale to the VGA resolution by using the "scEnable", "scOutWidth", and "scOutHeight" parameters of the CAMERA link "vipInst.outParams". But when I attempted this, the pictures seemed to be cropped rather than scaled, so I eventually abandoned that method and switched to using the SCALER link (which worked in the first chain example I showed above). Can anyone explain to me how the scaling parameters in the CAMERA link are supposed to work?
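    Concretely, the cropped-rather-than-scaled behaviour came from an attempt along these lines (field names as I found them under "vipInst.outParams"; the exact struct layout is my reading of the headers and may be wrong, which is partly why I am asking):

```c
/* My (apparently incomplete) attempt at scaling inside the CAMERA
 * link; field names as found under vipInst.outParams in the usecase
 * file. This produced cropped, not scaled, pictures for me. */
cameraPrm.vipInst[0].outParams[0].scEnable    = TRUE;
cameraPrm.vipInst[0].outParams[0].scOutWidth  = 640;
cameraPrm.vipInst[0].outParams[0].scOutHeight = 480;
```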

    I realise I have a lot of questions but in the absence of more and better documentation on the McFW / Link APIs, I don't know what else to do other than ask them here.

    Thanks very very much, Marko!!! ... I'll continue to analyse your example and I'm sure I will continue to learn things from it. I like the method you show for passing messages from the DSP to the A8 Host. That will be very useful to me (once I can get my chains built and flowing correctly   :-)   ).

    And thanks to anyone who can answer even one of my questions or contribute other understandable examples of simple working McFW / Link API chains! If you can contribute some knowledge, it will of course be available to anyone who searches the E2E forums. And if there's some reference or document I'm missing, please please let me know.

    Thanks!

    Allen

  • Hi Allen,
    7331.src.tar.gz
    I modified the example from my previous post a little: I only added a scaler link (320x192) before the DSP video analytics link.

    Regards.

  • Hi Allen,

    I can get good VGA resolution from RSZB (not cropped) by setting "scOutWidth" = 640 and "scOutHeight" = 480 in the usecase file, e.g. "multich_tristream_dsp.c".

    You also have to change the RSZB parameters in the driver, since they are hard-coded in: /8127_GA_Release_2.0.0_2/ti_tools/iss_02_00_00_00/packages/ti/psp/iss/drivers/capture/src/issdrv_captureApi.c
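    In the usecase file the change is just those two fields plus the enable flag, roughly like this (the exact struct path is from memory and may differ slightly by release, so treat it as a sketch; the hard-coded values in issdrv_captureApi.c must be changed to match):

```c
/* In multich_tristream_dsp.c: request VGA from the RSZB (2nd)
 * output of the camera link. Struct path is approximate and may
 * differ per RDK release. */
cameraPrm.vipInst[0].outParams[1].scEnable    = TRUE;
cameraPrm.vipInst[0].outParams[1].scOutWidth  = 640;
cameraPrm.vipInst[0].outParams[1].scOutHeight = 480;
```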

    Look at the attached file: 6320.issdrv_captureApi.c.tar.gz

    Regards.

  • Hi Allen,

    I am now working with the DM8127 IPNC, and I want to get the Y component, but I have no idea how to do it. Would you give me some guidance? Thank you very much.

    B&R

    Enson