Is it correct to say that to decode an H.264 INTERLACED stream, previously encoded with the H.264 encoder as discussed here, I have to call the decoding function twice with the same output buffer, in this way:
ret = Vdec2_process(hVd2, hInBuf1, hDstBuf);
(here ret should be Dmai_EFIRSTFIELD)
ret = Vdec2_process(hVd2, hInBuf2, hDstBuf);
(here ret should be Dmai_EOK)
Where hInBuf1 and hInBuf2 are the two slices obtained from the two calls of the encoder function, one for each field.
In particular I wonder if the hDstBuf must be the same in the two calls.
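For reference, the two-call pattern can be sketched with a stubbed process call (the stub and the enum values are illustrative, not the real DMAI API; only the control flow matters):

```c
/* Return codes mirroring DMAI's Dmai_E* constants (values here are
 * illustrative only). */
enum { Dmai_EOK = 0, Dmai_EFIRSTFIELD = 1 };

/* Stub of Vdec2_process(): for interlaced content the real call
 * decodes one field per invocation, returning Dmai_EFIRSTFIELD after
 * the first field and Dmai_EOK when the frame is complete. */
static int stub_Vdec2_process(int call)
{
    return (call == 0) ? Dmai_EFIRSTFIELD : Dmai_EOK;
}

/* Decode one interlaced frame: two process calls, one input buffer
 * per field, the SAME output buffer for both calls.  Returns the
 * number of process calls made (2 on success), -1 on an unexpected
 * return code. */
static int decode_interlaced_frame(void)
{
    int calls = 0;
    int ret;

    /* First call: top field (hInBuf1), output into hDstBuf. */
    ret = stub_Vdec2_process(calls++);
    if (ret != Dmai_EFIRSTFIELD)
        return -1;      /* expected "first field decoded" here */

    /* Second call: bottom field (hInBuf2), SAME hDstBuf. */
    ret = stub_Vdec2_process(calls++);
    if (ret != Dmai_EOK)
        return -1;
    return calls;
}
```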
Thank you.
Thank you for the confirmation, Niclas.
I use this strategy to decode my signal. However, after a few seconds (sometimes 1-2 s, sometimes 10 s or even 20 s) my decoding routine crashes due to the condition
ret == Dmai_EBITERROR && Buffer_getNumBytesUsed(hInBuf) == 0
However, before the decoder call hInBuf is not empty. Is it possible that Vdec2_process changes the properties of hInBuf? Why?
This is part of the code:
while (true)
{
    .....

    if (Buffer_getNumBytesUsed(hInBuf) == 0)
        printf("No encoded data available.\n");

    /* Decode the video buffer */
    ret = Vdec2_process(hVd2, hInBuf, hDstBuf);
    if (ret < 0) {
        ERR("Failed to decode video buffer\n");
        cleanup(THREAD_FAILURE);
    }

    /* If no encoded data was used we cannot find the next frame */
    if (ret == Dmai_EBITERROR && Buffer_getNumBytesUsed(hInBuf) == 0) {
        ERR("Fatal bit error.\n");
        cleanup(THREAD_FAILURE);
    }

    ...
}
Hello,
Since you are stepping through an elementary bit stream, the application needs to know how many bytes were used in the previous frame to know where the next frame is.
The Vdec2_process() call sets the numBytesUsed with this information from the codec so that the application (typically through the Loader module) can step to the next frame in the bitstream and pass it to the codec in the next iteration of the loop.
In this case, there was a bit error (not a fatal error) in the stream, at which point the codec should still return the size of the corrupted frame. Otherwise the next iteration of the loop would process the same corrupt data over and over. When this situation is detected, the application bails out, as it cannot recover.
Essentially, this is a codec issue. It should always set the number of bytes consumed unless there's a fatal error.
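To make the stepping logic concrete, here is a minimal sketch of what the application (or the Loader module) does with numBytesUsed; advance_offset is a hypothetical helper, not part of DMAI:

```c
#include <stddef.h>

/* Step through an elementary stream: the codec reports how many
 * bytes it consumed for the previous frame, and the application
 * advances its read offset by that amount to find the next frame. */
static size_t advance_offset(size_t offset, size_t numBytesUsed,
                             size_t streamLen)
{
    /* numBytesUsed == 0 means the codec could not tell us where the
     * next frame starts -- the application cannot make progress and
     * must bail or resynchronize. */
    if (numBytesUsed == 0)
        return offset;

    offset += numBytesUsed;
    return (offset > streamLen) ? streamLen : offset;
}
```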
Regards, Niclas
I understand Niclas.
I have just tested this: since I am not using the Loader but receive my data from the network, and I am able to skip stream bytes until the next NAL containing the SPS and PPS, I can recover from the bit error even when the decoder sets the size of the input buffer to 0.
Thank you.
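The resynchronization I describe could be sketched like this (find_sps is an illustrative helper, not DMAI; my CheckSPS function does something similar):

```c
#include <stddef.h>

/* Scan a byte buffer for an Annex-B start code (00 00 00 01) whose
 * NAL unit type is 7 (SPS).  Returns the offset of the start code,
 * or -1 if no SPS NAL is found. */
static long find_sps(const unsigned char *buf, size_t len)
{
    size_t i;

    for (i = 0; i + 4 < len; i++) {
        if (buf[i] == 0x00 && buf[i + 1] == 0x00 &&
            buf[i + 2] == 0x00 && buf[i + 3] == 0x01 &&
            (buf[i + 4] & 0x1F) == 7)   /* nal_unit_type: low 5 bits */
            return (long)i;
    }
    return -1;
}
```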
Peregrinus, I just ignore the error and continue decoding. Like you, I don't need to know how many bytes are consumed, because I am sending it a complete field for each Vdec2_process call. I don't even bother skipping to the next SPS/PPS.
Now I should say that this is how it works when I was using non-interlaced video and sending it a whole frame at a time. I recently switched to decoding interlaced video and I'm having a few problems. Since this is network streaming I do lose a packet every now and then, and the decoder seems to handle it OK. But it's possible for me to get the top and bottom field order mixed up if I lose a packet, so I'm working on that issue now.
The biggest problem I'm having, and I would like to know if you've seen this: the decoder is having a temporal order issue with the fields. It appears to be displaying them backwards, and I get jitter on horizontal motion. The best way to see this is to use a CRT-type monitor that has true interlaced output, but it can be seen on an LCD monitor as well. I swapped the top and bottom fields during encoding as a test and found the temporal problem was fixed, but there is an obvious spatial swap where the scan lines are visibly in upside-down order.
So right now I'm focusing on the decoder because the video looks ok on VLC.
John A
Hi John,
I experienced a similar problem at the beginning of development, but if I remember correctly, at that time the image was not encoded correctly. Now I see the output on a CRT and it looks right. The only problem I see is that sometimes the image looks very "solarized", as if very few bits per pixel were used, but this does not happen frequently, and this artifact comes from the decoder, since VLC does not show it.
Another thing: I start decoding by providing the SPS in the first buffer passed to the decoder. Do you call the decoder twice? You should check the return value of the decoder: when it says that it has decoded the first field, you have to provide the second field with the same output buffer.
I call Vdec2_process with a buffer of one field, and when I get the Dmai_EFIRSTFIELD return code from Vdec2_process I give it the same output buffer on the next call. The first field of each IDR frame (1 per second @ 30 fps) has the SPS/PPS at the beginning. I also start decoding with the SPS/PPS and the IDR frame as the first buffer given to the decoder.
I'm coming to the conclusion that the field order problem is in the encoder.
John A
Here is a sample h.264 encode that demonstrates the field order problem. When you run it with the dvsdk sample decoder you can see the jitter trails on the moving cars. If I encode the bottom field first the jitter goes away but the field order is visually reversed.
http://dl.dropbox.com/u/3345452/test.264
John A
Sorry John,
at the moment I don't have the board available to try to decode what you encoded. In VLC it looks fine. But the dvsdk sample decoder does not handle interlaced video.
Yes, it looks ok in VLC but I believe the issue is being concealed by the way VLC renders the video. The dvsdk decoder that came with DVSDK 4 for the DM365EVM does play the interlaced video and shows the problem.
Also, when I play the video in VLC it plays at half speed (not exactly half, though). It's almost as if it thinks each field is a complete frame. When I stream the same video over RTP to VLC it plays at the correct speed.
John A
John, I am still working with the DVSDK 2, but I upgraded the H264 codec to the platinum release (if I remember correctly).
I noticed the same behaviour regarding the speed: when you play the file, the speed is half of the correct speed, while via RTP it is correct. However, when I send the H.264 to the TI decoder directly from a file, or via a UDP or RTP stream, the speed is right, so I suppose the half speed is a problem of VLC.
I slightly modified the sample code in this way to decode interlaced video; I hope this can help you.
while (!gblGetQuit()) {
    if (ret != Dmai_EFIRSTFIELD) {
        /* Get a displayed frame from the display thread */
        fifoRet = Fifo_get(envp->hDisplayOutFifo, &hDispBuf);
        if (fifoRet != Dmai_EOK) {
            ERR("Failed to get free display out buffer\n");
            cleanup(THREAD_FAILURE);
        }

        /* The display thread is no longer using the buffer */
        hDstBuf = BufTab_getBuf(hBufTab, Buffer_getId(hDispBuf));
        Buffer_freeUseMask(hDstBuf, DISPLAY_FREE);

        /* Keep track of the number of buffers sent to the display thread */
        numDisplayBufs--;

        /* Get a free buffer from the BufTab to give to the codec */
        hDstBuf = BufTab_getFreeBuf(hBufTab);
        if (hDstBuf == NULL) {
            ERR("Failed to get free buffer from BufTab\n");
            BufTab_print(hBufTab);
            cleanup(THREAD_FAILURE);
        }
    }

    /* Make sure the whole buffer is used for output */
    BufferGfx_resetDimensions(hDstBuf);

    if (Buffer_getNumBytesUsed(hInBuf) == 0)
        printf("No encoded data available.\n");

    /* Decode the video buffer */
    ret = Vdec2_process(hVd2, hInBuf, hDstBuf);
    if (ret < 0) {
        ERR("Failed to decode video buffer\n");
        cleanup(THREAD_FAILURE);
    }

    /* A bit error is not fatal here; log it and keep decoding */
    if (ret == Dmai_EBITERROR) {
        VDCTF_MESSAGE(DBG_DEBUG, "* bit error *.\n");
    }

    /* Increment statistics for the user interface */
    //gblIncVideoBytesProcessed(Buffer_getNumBytesUsed(hInBuf));

    /* Send frames to display thread */
    if (Dmai_EFIRSTFIELD != ret) {
        bufsSent = handleCodecBufs(hVd2, envp->hDisplayInFifo);
        if (bufsSent < 0) {
            cleanup(THREAD_FAILURE);
        }
    }
    else {
        bufsSent = 0;
    }

    /* Keep track of the number of buffers sent to the display thread */
    numDisplayBufs += bufsSent;

    if (decodeFromFile) {
        /* Load a new encoded frame from the file system */
        if (Loader_getFrame(hLoader, hInBuf) < 0) {
            ERR("Failed to get frame of encoded data\n");
            cleanup(THREAD_FAILURE);
        }
    }
    else {
        Fifo_put(envp->hNetworkInFifo, hInBuf);

        /* Get an encoded frame from the Network Rx thread */
        fifoRet = Fifo_get(envp->hNetworkOutFifo, &hInBuf);
        if (fifoRet != Dmai_EOK) {
            cleanup(THREAD_FAILURE);
        }

        if (0 == Buffer_getNumBytesUsed(hInBuf)) {
            /* empty buffer */
            printf("the decoder got an empty buffer\n");
            RXBytes = 0;
            while (RXBytes <= 0) {
                /* Send the buffer back */
                Fifo_put(envp->hNetworkInFifo, hInBuf);

                /* Get another buffer */
                fifoRet = Fifo_get(envp->hNetworkOutFifo, &hInBuf);
                if (fifoRet != Dmai_EOK) {
                    cleanup(THREAD_FAILURE);
                }
                RXBytes = Buffer_getNumBytesUsed(hInBuf);
                if (RXBytes > 0) {
                    if (!CheckSPS(Buffer_getUserPtr(hInBuf)))
                        RXBytes = 0;
                }
            }
        }
    }
    frameNbr++;
}
Hi Peregrinus,
In my other thread regarding the decoder being broken, I got confirmation that the problem is in the encoded stream, so the decoder is off the hook. It's strange, because I'm only setting oper_mode=1 and then using DMAI to capture and encode. We have a custom-made board, but I can't see anything in the schematic that's different from the EVM with respect to the TVP5146. Maybe I'll try running my app on the EVM and see if the problem exists there.
John A
Hi Peregrinus,
thanks for the code.
However, when it returns Dmai_EBITERROR, no buffers are sent to the display thread, and on the next iteration I found that BufTab_getFreeBuf() will not return the previous buffer. It seems the codec does not release hDstBuf, so when this error happens about 31 times, BufTab_getFreeBuf() returns NULL. At that moment no buffer is available and the application is dead.
So, have you met this problem? How can I make the codec release the buffer?
thank you.
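The buffer exhaustion described above can be modeled with a toy counter (the names, the pool size, and the error path are illustrative, not the DMAI API): each iteration claims a free buffer, and if the Dmai_EBITERROR path never returns it, the pool drains.

```c
#include <stdbool.h>

#define NUM_BUFS 32     /* illustrative BufTab size */

/* Toy model of the decode loop's buffer accounting.  Returns the
 * iteration at which the pool is exhausted (BufTab_getFreeBuf()
 * would return NULL), or -1 if the pool never runs out. */
static int run_iterations(int iterations, bool free_on_error)
{
    int in_use = 0;
    int i;

    for (i = 0; i < iterations; i++) {
        if (in_use >= NUM_BUFS)
            return i;       /* no free buffer left: application stuck */
        in_use++;           /* buffer handed to the codec */

        /* Simulate Dmai_EBITERROR: no display frame is produced.
         * If the error path reclaims the buffer (e.g. by draining
         * the codec's released buffers and clearing their use
         * masks), the pool stays balanced. */
        if (free_on_error)
            in_use--;
    }
    return -1;
}
```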