
DM648 VPORT timing and configuration problem

Hello,

 

I am working with analogue video capture, as in the MiniDemo among the DVSDK examples for the DM648. I have everything working: the video comes in and is output on the display.

 

I started timing the captured frames to see whether there are any frame drops. To my astonishment, the driver claims to return a new frame every 10 ms. Since I am using the PAL video format, it should be a comfortable 40 ms.

Digging deeper, I realised that the time between frames, based on the timestamp in FVID_Frame, depends heavily on the number of printf calls I use to print them. E.g. if I put just one printf in the loop, I get about 20 ms between frames; with two or more I get 10 ms, or in some cases 5 ms.

I also played around with the parameters in VPORTCAP_Params to see the difference between full-frame and field timing. However, this does not seem to change the timing of the frames either. By setting VPORTCAP_Params.mergeFlds = VPORT_FLDS_SEPARATED I expected to get the two fields separated within one frame, and I also expected the frames to arrive at 40 ms intervals. Instead, the timing did not change a bit, although I did get only one field of the frame.

I also tried setting VPORTCAP_Params.fldOp = VPORT_FLDOP_PROGRESSIVE to force it to give me a frame every 40 ms, but that had no effect either.
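
For reference, the parameter changes I tried look roughly like this (all other VPORTCAP_Params members are left as the MiniDemo sets them):

  VPORTCAP_Params capParams;                  /* otherwise configured as in the MiniDemo */

  /* attempt 1: ask for the odd and even fields in separate buffers */
  capParams.mergeFlds = VPORT_FLDS_SEPARATED;

  /* attempt 2: treat the input as progressive, hoping for one frame per 40 ms */
  capParams.fldOp = VPORT_FLDOP_PROGRESSIVE;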

 

I cannot explain the timestamps on the video frames, or the way they change with the number of printf calls. The fact that they do not match any sensible value raises the question of whether the low-res timer is working properly. Since the timestamps on the image frames match the low-res timer, I set up a loop of 10,000 timer ticks to see how long it takes. It took 10 s, so the timer itself seems to be working properly.
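
The check was essentially this (CLK_getltime() returns the DSP/BIOS low-resolution time in timer ticks):

  LgUns start = CLK_getltime();

  /* busy-wait until 10,000 low-res ticks have elapsed */
  while ((CLK_getltime() - start) < 10000)
  {
  }
  /* against a stopwatch this takes about 10 s, so the timer looks fine */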

 

If my timer works, and the display that the captured frames are copied to is OK, how can the timing be wrong?

Could the timer be stopped while printf is being executed, or for the duration of the video capture?

Shouldn't FVID_dequeue block if there is no new video frame available? It cannot really be returning a new frame every 10 ms, or sometimes even faster.

Please find the test code below.

 

Best regards,

Peter


    while ((!VIDEO_bFlagCaptDone[chanNum]) && (status == 0))
    {

      printf("aaaaaaaaaaaaaqqqqqqqqqqq  in %d\n", CLK_getltime());
      // Get an old, full buffer from driver
      if ((status = FVID_dequeue(VIDEO_hCaptChan[chanNum],&pCapFrameOld)) != IOM_COMPLETED)
      {
        LOG_printf(&trace,"FVID_dequeue from VIDEO_hCaptChan[%d] failed!",chanNum);
        LOG_printf(&trace,"FVID_dequeue returned code %d",status);
        exit(0);
      }
      printf("aaaaaaaaaaaaaqqqqqqqqqqq out %d\n", CLK_getltime());
      ATM_deci(&VIDEO_iCaptFramesInDriver[chanNum]);

      // Allocate a new FVID_Frame
      if( (status = LOCAL_allocBuffer(&pCapFrameNew,NULL)) != IOM_COMPLETED)
      {
        LOG_printf(&trace,"LOCAL_allocBuffer to VIDEO_hCaptChan[%d] failed!",chanNum);
        LOG_printf(&trace,"LOCAL_allocBuffer returned code %d",status);
        exit(0);
      }

      // Give new empty frame to driver
      if ((status = FVID_queue(VIDEO_hCaptChan[chanNum],&pCapFrameNew)) != IOM_COMPLETED)
      {
        LOG_printf(&trace,"FVID_queue to VIDEO_hCaptChan[%d] failed!",chanNum);
        LOG_printf(&trace,"FVID_queue returned code %d",status);
        exit(0);
      }
      ATM_inci(&VIDEO_iCaptFramesInDriver[chanNum]);

      // Set variable that points to actual frame data
      pFrameBufferData = pCapFrameOld->frame.iFrm.y1;

      // Invalidate the cache lines since data came from EDMA3 in driver
      BCACHE_inv(pFrameBufferData,VIDEO_BUFSIZE, TRUE);
     
      // Detach the data buffer from the FVID_Frame so that freeing the
      // frame below does not release the data posted to the copy thread
      pCapFrameOld->frame.iFrm.y1 = NULL;

     
      printf("qqqqqqqqqqq %d %d\n", CLK_getltime(), pCapFrameOld->timeStamp);


      LOCAL_freeBuffer(&pCapFrameOld);


      // Post frame data to copy thread, where it will be used and freed
      MBX_post( VIDEO_hMbxCapt[chanNum],
                (Ptr)&pFrameBufferData,
                SYS_FOREVER );
      // Yield to another equal priority task now that we've posted our data
      TSK_yield();
    }


  • OK, it seems that if I drop all the printf lines and save the timestamps in an array instead, everything is fine: I get a frame every 40 ms.

     

    I guess that printf stops the system while the message is being printed on the host computer. Of course, the external video source does not know about this and keeps going, so it merely looks as if the video frames were arriving faster than they should.
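
    For the record, the logging that produces the clean 40 ms intervals looks roughly like this (the array size is just an illustrative choice):

    #define TIME_LOG_SIZE 256                /* illustrative size */
    static Uint32 timeLog[TIME_LOG_SIZE];
    static Int    timeLogIdx = 0;

    /* inside the capture loop, instead of the printf calls: */
    if (timeLogIdx < TIME_LOG_SIZE)
    {
        timeLog[timeLogIdx++] = pCapFrameOld->timeStamp;
    }

    /* the whole array is printed in one go after the loop finishes,
       so the CPU is never stalled while frames are coming in */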

     

    The problem of separating the two fields remains. There is a frame every 40 ms now, but it is either an interlaced one or, in separated mode, a single field with the other field blanked out. Since I only get images at the frame rate, I assume that the missing field of each frame is lost.

    So how can I capture each field of the video separately?

     

    Any ideas?

    Peter

  • You are right: printf halts the CPU, so the timer ISR or the internal code does not get a chance to update the timestamp value. As a result, you see timestamps that do not match the actual elapsed time.
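
    As a side note, if you need output while the loop is running, LOG_printf (which you already use for error reporting) is the usual workaround: it only records the arguments in a buffer on the target and leaves the formatting to the host, so it does not stall the CPU the way printf does. For example:

    LOG_printf(&trace, "frame timestamp: %d", pCapFrameOld->timeStamp);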

    Regarding the field separated mode:

    When you enable field-separated mode, what you get from a dequeue operation is still two fields; the only difference is that the odd and even fields are in two separate buffers instead of one merged frame. Hence, for a PAL capture you still do 25 dequeue operations per second, not the 50 you are expecting. You can access the other field through the second set of pointers in the FVID_Frame (frame.iFrm.y2 and its chroma counterparts).
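
    In other words, a single dequeue gives you both fields of the frame. A quick sketch, assuming the iFrm pointer layout your code already uses:

    FVID_Frame *pCapFrame;
    Ptr pFieldTop, pFieldBottom;

    /* one dequeue per frame still returns both fields,
       each field in its own buffer */
    FVID_dequeue(VIDEO_hCaptChan[chanNum], &pCapFrame);

    pFieldTop    = pCapFrame->frame.iFrm.y1;   /* luma of the first field  */
    pFieldBottom = pCapFrame->frame.iFrm.y2;   /* luma of the second field */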

    Hope this clarifies.