This thread has been locked.

H.264 encoder/decoder performance issue and codec server configuration issue

Other Parts Discussed in Thread: DM3730, OMAP3530

Hi,

I have some questions about developing DSP codecs that run inside a codec server. My board is a custom-developed system with an OMAP37xx-series (DM3730) processor running at 1 GHz with 512 MB RAM, using OpenEmbedded as the software platform. All required software components can be built with this OpenEmbedded platform: DSPLink, DSP/BIOS, CMEM, EDMA3 LLD, the GStreamer TI plugin, the codecs, the codec server, and so on. All resulting ipk packages can be installed, and first tests show that the software is working.

1. The default recipes in OpenEmbedded build a codec server containing some streaming codecs for accelerating audio/video streams on the DSP. To test the h264enc codec, I stream to a file with a GStreamer pipeline like this:

gst-launch -v videotestsrc  num-buffers=100 ! video/x-raw-yuv, framerate=25/1, width=640, height=480 ! TIVidenc1 codecName=h264enc engineName=codecServer displayBuffer=true ! dmaiperf ! filesink location=sample.264

I'm really surprised by the resulting performance of about 4 frames per second; I expected it to be possible to achieve 25 fps. TI has published performance results for the OMAP3530 EVM here: http://processors.wiki.ti.com/index.php/OMAP3530_Performance#Results

Has anyone had a similar experience and can give me some hints to speed up the H.264 streaming?
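One thing that may be worth ruling out first: with displayBuffer=true the TIVidenc1 element prints information for every buffer, which itself can cost noticeable time on a slow console, and videotestsrc generates its frames on the ARM side. A variant of the same pipeline with per-buffer printing disabled (element and property names as in the gstreamer-ti plugin; whether this helps on your board is an assumption to verify) would look like:

```shell
# Same test pipeline, but with displayBuffer disabled; dmaiperf
# still reports the achieved frame rate on the console.
gst-launch -v videotestsrc num-buffers=100 \
  ! video/x-raw-yuv, framerate=25/1, width=640, height=480 \
  ! TIVidenc1 codecName=h264enc engineName=codecServer displayBuffer=false \
  ! dmaiperf \
  ! filesink location=sample.264
```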



2. I'm interested in developing and testing my own codecs. First I built a codec template for a simple image inverter with an IUNIVERSAL interface, just to see the workflow with the codec wizard and to build the codec with Code Composer Studio. Everything works without any problems. Then I generated a codec server using the codec server wizard and added my new codec plus some TI default video streaming codecs (h264enc/dec) to the new codec server. The memory configuration is also adjusted to my hardware (448 MB for the Linux system, 16 MB for DSPLink and 48 MB for shared memory).
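For reference, a partitioning like the one described above is typically expressed as a kernel boot argument plus CMEM/DSPLink module loads. The addresses and pool sizes below are a hypothetical sketch for 512 MB starting at 0x80000000; they must match the memory map in the generated server configuration exactly:

```shell
# Hypothetical layout for 512 MB at 0x80000000 (DM3730):
#   0x80000000 - 0x9BFFFFFF  448 MB  Linux   (bootargs: mem=448M)
#   0x9C000000 - 0x9EFFFFFF   48 MB  CMEM    (contiguous/shared buffers)
#   0x9F000000 - 0x9FFFFFFF   16 MB  DSPLink (must match the server's memory map)
modprobe cmemk phys_start=0x9C000000 phys_end=0x9F000000 \
    pools=20x4147200,8x829440,1x1048576 allowOverlap=1   # pool sizes are examples
modprobe dsplinkk
```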

The codec server wizard offers a selection that influences part of the memory management: the EDMA3 LLD option, which is enabled by default. With this server, the same H.264 encoding GStreamer pipeline no longer starts the h264enc codec. To enable verbose mode I prefixed the pipeline with CE_DEBUG=3 and found some interesting error messages, but as a newcomer I find them hard to interpret:

[DSP] @0,129,449tk: [+0 T:0x9cf4c61c S:0x9cf502a4] CE - Engine_open> Enter('local', 0x9cf502f4, 0x9c2048c0)
[DSP] @0,129,974tk: [+0 T:0x9cf4c61c S:0x9cf50284] OM - Memory_alloc> Enter(size=0x34)
[DSP] @0,130,389tk: [+0 T:0x9cf4c61c S:0x9cf50284] OM - Memory_alloc> return (0x9cf51178)
[DSP] @0,130,836tk: [+4 T:0x9cf4c61c S:0x9cf502a4] CE - Engine_open> engine->server = 0x0
[DSP] @0,131,334tk: [+0 T:0x9cf4c61c S:0x9cf502a4] CE - Engine_open> return(-1661660808)
[DSP] @0,131,878tk: [+0 T:0x9cf4c61c S:0x9cf502ec] OM - Memory_alloc> Enter(size=0x34)
[DSP] @0,132,298tk: [+0 T:0x9cf4c61c S:0x9cf502ec] OM - Memory_alloc> return (0x9cf511b0)
[DSP] @0,132,822tk: [+0 T:0x9cf4c61c S:0x9cf502c4] ti.sdo.ce.alg.Algorithm - Algorithm_create> Enter(fxns=0x9cfc5ab0, idma3Fxns=0x9cfc5adc, iresFxns=0x0, params=0x9c2048c0, attrs=0x9cf50410)
[DSP] @0,133,697tk: [+0 T:0x9cf4c61c S:0x9cf502a4] OM - Memory_alloc> Enter(size=0x10)
[DSP] @0,134,205tk: [+0 T:0x9cf4c61c S:0x9cf502a4] OM - Memory_alloc> return (0x9cf511e8)
[DSP] @0,158,326tk: [+0 T:0x9cf4c61c S:0x9cf5022c] ti.sdo.ce.osal.Sem - Sem_create> count 1
[DSP] @0,158,811tk: [+0 T:0x9cf4c61c S:0x9cf5022c] ti.sdo.ce.osal.Sem - Sem_create> sem: 0x9cf516e4
[DSP] @0,159,859tk: [+0 T:0x9cf4c61c S:0x9cf501a4] ti.sdo.ce.osal.Sem - Sem_create> count 1
[DSP] @0,160,322tk: [+0 T:0x9cf4c61c S:0x9cf501a4] ti.sdo.ce.osal.Sem - Sem_create> sem: 0x9cf51714
[DSP] @0,162,139tk: [+0 T:0x9cf4c61c S:0x9cf5016c] ti.sdo.ce.osal.Sem - Sem_delete> sem: 0x9cf516e4
[DSP] @0,162,771tk: [+0 T:0x9cf4c61c S:0x9cf501b4] ti.sdo.ce.osal.Sem - Sem_delete> sem: 0x9cf51714
[DSP] @0,163,366tk: [+7 T:0x9cf4c61c S:0x9cf502c4] ti.sdo.ce.alg.Algorithm - Algorithm_create> Granting DMA channels to algorithm through DMAN3 FAILED (0xfffffffe)
[DSP] @0,164,204tk: [+0 T:0x9cf4c61c S:0x9cf502a4] ti.sdo.ce.alg.Algorithm - Algorithm_delete> Enter(alg=0x9cf511e8)
[DSP] @0,165,168tk: [+0 T:0x9cf4c61c S:0x9cf50274] OM - Memory_free> Enter(addr=0x9cf511e8, size=16)
[DSP] @0,165,733tk: [+0 T:0x9cf4c61c S:0x9cf50274] OM - Memory_free> return (0x1)
[DSP] @0,166,125tk: [+0 T:0x9cf4c61c S:0x9cf502a4] ti.sdo.ce.alg.Algorithm - Algorithm_delete> Exit
[DSP] @0,166,624tk: [+0 T:0x9cf4c61c S:0x9cf502c4] ti.sdo.ce.alg.Algorithm - Algorithm_create> return (0x0)
[DSP] @0,167,196tk: [+6 T:0x9cf4c61c S:0x9cf5030c] CV - VISA_create2> FAILED to create local codec.
[DSP] @0,167,691tk: [+0 T:0x9cf4c61c S:0x9cf502dc] CV - VISA_delete(0x9cf511b0)
[DSP] @0,168,081tk: [+5 T:0x9cf4c61c S:0x9cf502dc] CV - VISA_delete> deleting codec (localQueue=0xffff, remoteQueue=0xffff)

I'm not familiar with this EDMA3 LLD memory management, but if I uncheck the EDMA3 LLD selection as described here http://processors.wiki.ti.com/index.php/Codec_Engine_GenServer_Wizard_FAQ#Algorithm_and_Group_Configuration my test codec server works. What is going wrong here? I have installed the ti-edma3lld ipk on my test system, without any change in behaviour.
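For anyone digging into the "Granting DMA channels to algorithm through DMAN3 FAILED (0xfffffffe)" line: that grant usually fails when the algorithm's scratch group has no PaRAM/TCC resources assigned, or when the DSP-side DMAN3 resource masks overlap with what the ARM-side EDMA3 LLD owns. A generated server.cfg typically contains DMAN3 settings like the following (module and property names are from TI Framework Components; the concrete values are illustrative, not verified for the DM3730):

```javascript
/* Sketch of DMAN3 configuration in a Codec Engine server.cfg.
 * Values are examples only; they must not overlap with the EDMA3
 * resources the ARM-side LLD claims. */
var DMAN3 = xdc.useModule('ti.sdo.fc.dman3.DMAN3');

DMAN3.paRamBaseIndex     = 78;         /* first PaRAM entry owned by the DSP */
DMAN3.numPaRamEntries    = 48;         /* PaRAMs the DSP may grant           */
DMAN3.tccAllocationMaskH = 0xffffffff; /* TCCs 32..63 reserved for the DSP   */
DMAN3.tccAllocationMaskL = 0x00000000; /* TCCs 0..31 left to the ARM side    */

/* Per scratch group: the wizard places codecs into groups, and a
 * group with zero entries makes the DMA grant fail at create time. */
DMAN3.numPaRamGroup[0] = 48;
DMAN3.numTccGroup[0]   = 32;
```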


Thanks, Uli

  • Perhaps too late for you, but others scratching their heads over this post may benefit:

    My BSP sets the DSP frequency to 260 MHz. I don't believe this is documented anywhere, but I ran into similar problems and saw an obvious performance improvement after changing the clock to 800 MHz.

    http://e2e.ti.com/support/embedded/linux/f/354/p/286591/999622.aspx#999622
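    To check whether this applies, on many OMAP3 kernels the IVA2 (DSP subsystem) clock can be inspected through debugfs. The clock name and path vary between kernel versions and BSPs, so treat this as a sketch to adapt:

    ```shell
    # Inspect the current IVA2/DSP clock rate (clock name and debugfs
    # layout differ between OMAP3 kernel versions -- adjust as needed).
    mount -t debugfs none /sys/kernel/debug 2>/dev/null
    cat /sys/kernel/debug/clock/iva2_ck/rate
    ```

    If this reports 260000000 rather than 800000000, the fix belongs in the BSP's OPP/clock setup rather than at runtime.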