Using DVSDK 4, DM368, H264
I've got my H264 streaming encoder working, but I'm having trouble with the decoder. As a test, my decoder captures the incoming RTP packets, strips the RTP headers, reconstructs the NALUs from RTP format back to the encoded stream format, and writes the result to a file. The file plays with the DVSDK decode sample app and the VideoLAN client.
The problems start when I pass the same buffer to Vdec2_process. I'm using code that came from the DVSDK sample app. The first problem is that I run out of display buffers. Each time I call Vdec2_process I give it a new picture buffer from a FIFO that the display thread uses to send buffers back to the decode thread. That FIFO is loaded with display buffers at init.
When I make the call to handleCodecBufs it appears that no buffers are being sent to the display thread. I'm assuming the decoder may not be able to decode the data, and therefore produces no picture.
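In case it helps frame the questions below, here is the relevant part of my decode thread, trimmed to the buffer handling. It is essentially the demo code; the fifo names (hDisplayFreeFifo, hDisplayFifo) are mine:

Buffer_Handle hDispBuf, hOutBuf, hFreeBuf;
Int           ret;

/* Get a free display buffer that the display thread sent back */
if (Fifo_get(hDisplayFreeFifo, &hDispBuf) < 0) {
    /* handle error / shutdown */
}

/* One complete encoded frame in, one picture buffer to decode into */
ret = Vdec2_process(hVd2, hInBuf, hDispBuf);

/* handleCodecBufs, as I understand the demo version of it: */
/* hand every buffer the codec marks displayable to the display thread */
while ((hOutBuf = Vdec2_getDisplayBuf(hVd2)) != NULL) {
    Fifo_put(hDisplayFifo, hOutBuf);
}
/* release buffers the codec no longer references
   (CODEC_FREE is the demo's use-mask bit) */
while ((hFreeBuf = Vdec2_getFreeBuf(hVd2)) != NULL) {
    Buffer_freeUseMask(hFreeBuf, CODEC_FREE);
}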
My first question is....
If Vdec2_process can't decode the data to create a display buffer, how do I get the buffer back? My decode thread needs to know that the display buffer I took from the FIFO can be reused. There seems to be no provision in the code for this situation.
Next question is....
Why do we give the decoder the BufTab with Vdec2_setBufTab? If I'm giving it display buffers on every call, why does it need the BufTab?
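For context, I create it the way the demo does, roughly (NUM_DISPLAY_BUFS is my define, and I've left out the gfxAttrs dim/lineLength setup):

BufferGfx_Attrs gfxAttrs = BufferGfx_Attrs_DEFAULT;
BufTab_Handle   hBufTab;

gfxAttrs.colorSpace = colorSpace;
/* ... dim and lineLength setup omitted ... */

hBufTab = BufTab_create(NUM_DISPLAY_BUFS, Vdec2_getOutBufSize(hVd2),
                        BufferGfx_getBufferAttrs(&gfxAttrs));
/* tell the codec which buffers it may decode into / hold as references */
Vdec2_setBufTab(hVd2, hBufTab);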
As a test I return the display buffer to the FIFO right after calling Vdec2_process. I know that will confuse things if a buffer does get sent to the display thread, but I was able to get continuous operation without stalling for a display buffer. I got some decoded picture output, although it was horribly trashed; you could almost get a clear frame on occasion. I also got an overwhelming number of the following error messages...
VICP Error: VICP_wait() ioctl failed: -1
I'm not sure why my application is having so much trouble getting a decoded picture when the same data plays fine in the sample decode app and VLC. But a better understanding of the buffer handling would certainly help.
John A
Hi John,
We often faced similar problems. I also made a decoder based on the DVSDK DMAI decode demo and ran into the same issue.
The problem you describe might arise because the decoder board's display rate (composite PAL or NTSC?) is slower than
the encoder's frame rate.
In that case the decoder blocks in the Display_get(...) DMAI function, while the RTP loader thread and the video decode thread deliver frames faster than
the display can render them. Even if the encoder frame rate is the same as the decoder display rate, occasional RTP packet congestion/bursts can cause similar problems.
I suggest you first decrease the frame rate on the encoder to a value lower than the display rate, and see if that solves the problem.
Then you can redesign the application so the display thread does not block the other threads. The loader and video decode threads should run without waiting for a display buffer.
Regards,
Marko.
I'm pretty sure the problem is not related to the frame rate. I have an app I wrote that records the RTP packets and can send them out at lower rates. My concern is that if the decoder for whatever reason doesn't fill a decoded picture buffer (DPB), then the DPB I sent will be lost. My app creates 4 DPBs at init. Unless the decoder sends a DPB to the display thread, it won't get put back into the FIFO for the decoder thread to reuse.
My RTP reader thread throws out all coded picture buffer (CPB) data until it finds the SPS/PPS in the RTP stream, so I guarantee that the first buffer the decoder gets is an SPS/PPS followed by an IDR frame. The very first call to Vdec2_process should return a filled DPB. I'm passing complete CPBs to Vdec2_process.
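The gating itself is nothing fancy; roughly this, once the start codes are restored (seenSps and okToFeed are my own names; nalu points at the first byte after a 00 00 00 01 start code):

static int seenSps = 0;

int okToFeed(const unsigned char *nalu)
{
    int nalType = nalu[0] & 0x1F;   /* H.264 nal_unit_type */

    if (!seenSps && nalType == 7)   /* 7 = SPS (8 = PPS, 5 = IDR slice) */
        seenSps = 1;

    return seenSps;                 /* drop everything before the first SPS */
}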
Also, there is no reason I can think of that the decoder shouldn't be able to handle 30 fps at SD resolution.
John A
Hi John.
I don't know if you saw this DMAI/decode demo lost-buffers workaround from Yashwant: http://e2e.ti.com/support/embedded/f/356/t/56421.aspx
I made a pretty similar application, based on the decode demo, with a modified loader that receives RTP packets. My loader also waits for the first SPS/PPS and then starts feeding the video decode thread with received buffers; I always pass one complete frame from the loader to the video thread. In the loader I check the RTP sequence numbers, and if they are not in strict order (missing packets) I drop that frame.
I also drop all frames from the broken frame until the next IDR frame, but with the mentioned workaround (or with the older ver1 h264 decoder) it should work while dropping only the current broken frame; see the sketch below. Then I do one Fifo_put per received frame to the video thread, and a matching Fifo_get there.
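The sequence check itself is trivial; roughly like this (variable and function names are mine; pkt points at the RTP fixed header):

#include <stdint.h>

static uint16_t lastSeq;
static int      haveLastSeq, frameBroken, waitForIdr;

void checkSeq(const uint8_t *pkt)
{
    /* RTP sequence number is bytes 2..3 of the fixed header, big endian */
    uint16_t seq = (uint16_t)((pkt[2] << 8) | pkt[3]);

    if (haveLastSeq && seq != (uint16_t)(lastSeq + 1)) {
        frameBroken = 1;   /* missing packet: drop this frame ... */
        waitForIdr  = 1;   /* ... and everything until the next IDR */
    }
    lastSeq     = seq;
    haveLastSeq = 1;
}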
The problem is not whether the decoder can handle 30 fps; e.g. a PAL display only runs at 25 fps.
Check the RTP sequence numbers.
Marko.
Marko, thanks for that link. I just downloaded the zip file and I will take a look at it.
I get a lot of these errors...
VICP Error: VICP_wait() ioctl failed: -1
This makes me think there is something wrong with my data. But I tested the data by writing it to a file instead of sending it to the decoder, then played it back with the decoder sample app, and it played fine. It played fine in VLC too. So apparently my parsing of the RTP stream is OK.
Hopefully the zip file will help with my buffer problem. I appreciate your help.
John A
I'm not sure if this is a clue, but I get this error first, when I send the first SPS/PPS and IDR frame...
VICP Error: VICP_register() ioctl failed: -1
After that error I get a string of the ioctl failed errors.
Perhaps this is an indication of something wrong with the parameters being sent to the decoder.
I send the first frame as: 00 00 00 01 SPSbytes... 00 00 00 01 PPSbytes... 00 00 00 01 frame bytes...
The SPS/PPS immediately precede the frame payload bytes; they all go to the decoder together in the same Vdec2_process(...) call.
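In other words, the reader puts the start code back in front of every NALU while packing one frame into a contiguous buffer; roughly (appendNalu and dst are my own names):

#include <string.h>

static const unsigned char startCode[4] = { 0x00, 0x00, 0x00, 0x01 };

/* Append one NALU, start code first, to the buffer handed to Vdec2_process() */
unsigned char *appendNalu(unsigned char *dst,
                          const unsigned char *payload, size_t len)
{
    memcpy(dst, startCode, sizeof(startCode));
    dst += sizeof(startCode);
    memcpy(dst, payload, len);   /* NALU bytes as extracted from RTP */
    return dst + len;
}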
My parameters:
....
IH264VDEC_Params extnParams;

params    = envp->params    ? envp->params    : &defaultParams;
dynParams = envp->dynParams ? envp->dynParams : &defaultDynParams;

/* Pick max frame rate and size from the video standard */
if (envp->videoStd == VideoStd_720P_60) {
    params->maxFrameRate = 30000;
    params->maxWidth     = VideoStd_720P_WIDTH;
    params->maxHeight    = VideoStd_720P_HEIGHT;
} else if (envp->videoStd == VideoStd_D1_PAL) {
    params->maxFrameRate = 25000;
    params->maxWidth     = VideoStd_D1_WIDTH;
    params->maxHeight    = VideoStd_D1_PAL_HEIGHT;
} else {
    params->maxFrameRate = 30000;
    params->maxWidth     = VideoStd_D1_WIDTH;
    params->maxHeight    = VideoStd_D1_NTSC_HEIGHT;
}

if (colorSpace == ColorSpace_YUV420PSEMI) {
    params->forceChromaFormat = XDM_YUV_420SP;
} else {
    params->forceChromaFormat = XDM_YUV_422ILE;
}

#if cDecH264Ver == cDecH264Ver_01_00_00_08
extnParams.displayDelay            = 0;  /* was 16 */
extnParams.hdvicpHandle            = (void *)NULL;
extnParams.resetHDVICPeveryFrame   = 0;
extnParams.disableHDVICPeveryFrame = 0;

if (strcmp(envp->videoDecoder, "h264dec") == 0) {
    params->size = sizeof(IH264VDEC_Params);
}
extnParams.viddecParams = *params;

hVd2 = Vdec2_create(hEngine, envp->videoDecoder,
                    (VIDDEC2_Params *)&extnParams, dynParams);
#elif cDecH264Ver == cDecH264Ver_02_00_00_10
extnParams.displayDelay            = 8;
extnParams.levelLimit              = 0;
extnParams.disableHDVICPeveryFrame = 0;
extnParams.inputDataMode           = 1;
extnParams.sliceFormat             = 1;
extnParams.frame_closedloop_flag   = 1;

if (strcmp(envp->videoDecoder, "h264dec") == 0) {
    params->size = sizeof(IH264VDEC_Params);
}
extnParams.viddecParams = *params;

hVd2 = Vdec2_create(hEngine, envp->videoDecoder,
                    (VIDDEC2_Params *)&extnParams, dynParams);
#endif
Marko, your last post was a huge help!!!!
I wasn't using the extended parameters, as I was just using the sample decode app for reference. Putting in the same extended parameters as you posted made a big difference: I am now getting clear decoded images of my stream. However, I am still getting the VICP errors. The decoding also stops when I get back from Vdec2_process and the number of bytes consumed from my input buffer is zero. I could try using that as a signal to reuse the display buffer, as that appears to be what causes the decode to stall.
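If I try that, it would look something like this (assuming, as I read the DMAI source, that Vdec2_process records the consumed byte count on the input buffer; hDisplayFreeFifo is my fifo from earlier):

ret = Vdec2_process(hVd2, hInBuf, hDispBuf);

/* If the codec consumed no input bytes it produced no picture either,
   so recycle the display buffer myself instead of waiting for the
   display thread to return it */
if (Buffer_getNumBytesUsed(hInBuf) == 0) {
    Fifo_put(hDisplayFreeFifo, hDispBuf);
}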
I need to look over those extended parameters and see if anything in there could be the cause of my VICP errors.
I don't want to mark the thread as answered until I finally get through the errors and examine what's in the zip file. But when I do mark it as answered, I will be sure your last post gets marked.
John A
You can also find the extended parameters used for the Platinum codec in Yashwant's decode sample.
Those errors from Vdec2_process disappeared for me when I skipped decoding until the next IDR frame after the first Dmai_EBITERROR.
To me this error always indicated that a frame was not properly reassembled from RTP packets. In my case that happened because the display was too slow, so the
display thread in turn blocked the loader thread.
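In the video thread it is roughly like this (the flag names are mine, and frameIsIdr is my own helper that checks for nal_unit_type 5):

ret = Vdec2_process(hVd2, hInBuf, hDstBuf);

if (ret == Dmai_EBITERROR) {
    skipTillIdr = TRUE;          /* damaged frame: resync on the next IDR */
}
if (skipTillIdr) {
    if (frameIsIdr(hInBuf)) {
        skipTillIdr = FALSE;     /* IDR reached, resume normal decoding */
    } else {
        /* recycle the buffers and skip this frame */
    }
}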
Marko.
It turns out that setting frame_closedloop_flag to 1 instead of the default 0 was causing the majority of my problems. This doesn't make much sense in the context of the documentation, so I'm going to post a question about it in the multimedia software codec forum.
The VICP errors I'm getting aren't related to Dmai_EBITERROR. I get them on the very first decoded IDR frame, with no error on the return.
And yes, the RTP packet receive path is very fragile. My encoder application takes a frame and sends it out as RTP packets that fit into a single Ethernet MTU, so it's pretty easy to lose a packet and therefore the entire frame. I also have an issue with my Ethernet interface running in half duplex at the moment, but that's another story.
Thanks for your generous help, Marko. As soon as I can figure out the VICP error issue I will be golden.
John A
Hi,
frame_closedloop_flag selects the decoding mode: the Platinum decoder has two modes, universal (frame_closedloop_flag = 0) and closed-loop (frame_closedloop_flag = 1).
The universal decoder can decode all .264 streams, but the closed-loop mode has some limitations in order to achieve better speed performance.
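So if the incoming stream is not under your control, the safe setting appears to be universal mode:

extnParams.frame_closedloop_flag = 0;  /* universal mode: any conformant stream, at some speed cost */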
Hi John,
I did not look at this post earlier. I think the issue you are seeing is because of the display buffers that the codec holds back based on the encoding level of the stream. For the closed-loop decoder, we do not block the display buffers. By default, in universal decode mode, the codec holds back up to 17 buffers before providing one for display.
I would request you to refer to the decoder user guide and the buffer usage mechanism described there.
Hope that helps.
Regards,
Anshuman
Yes, it appears that I can turn closed-loop mode off if I set displayDelay below 4, which happens to be the number of display buffers I allocate. Any idea why someone would decide to hold back buffers? It seems to make absolutely no sense, especially considering how much memory it eats up.
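For what it's worth, I'm now thinking of sizing my buffer pool from the delay instead of hard-coding 4, along these lines (NUM_PIPE_BUFS is my own guess at the pipeline margin):

#define NUM_PIPE_BUFS 3   /* buffers in flight in my decode/display pipeline */

Int numDispBufs = extnParams.displayDelay + NUM_PIPE_BUFS;
/* allocate numDispBufs display buffers at init and Fifo_put() them
   into the free fifo so the decoder never starves */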
Also, do you have any idea about the errors? This is killing me, as the API is so abstracted it's difficult to even find where the functions are called from. I get a string of errors that looks like this for every decoded frame.
VICP Error: VICP_register() ioctl failed: -1
VICP Error: VICP_wait() ioctl failed: -1
VICP Error: VICP_wait() ioctl failed: -1
VICP Error: VICP_wait() ioctl failed: -1
VICP Error: VICP_wait() ioctl failed: -1
VICP Error: VICP_wait() ioctl failed: -1
VICP Error: VICP_wait() ioctl failed: -1
VICP Error: VICP_wait() ioctl failed: -1
VICP Error: VICP_wait() ioctl failed: -1
VICP Error: VICP_wait() ioctl failed: -1
VICP Error: VICP_wait() ioctl failed: -1
John A