
Where to find TI's demo associated with the article SPRP654 for DaVinci DSPs

Does anyone know where and how to download the demo code associated with this TI document: Using the Video Port Sub System (VPSS) for Inter-Processor Communication (SPRP654)? It seems that the demo includes the following three examples:

-- BT.656 (VPBE -> VPFE) transfer demo of a video stream;
-- RAW (VPBE -> VPFE) transfer demo of a video stream;
-- RAW (VPBE -> VPFE) transfer demo of arbitrary data;

Thanks in advance.

John XU


  • Searching that title, the only thing I was able to find is : http://processors.wiki.ti.com/index.php/Using_the_Video_Port_Sub_System_(VPSS)_for_Inter-Processor_Communication

    I can't find SPRP654 at all; are you sure it is the right document?

  • Hi Paul,

    Thanks for the reply. I think you found the document; it is the demo code associated with it that I am looking for. 'SPRP654' is the number that appears within that document. I am not sure if the demo is still available on TI's website.

    Regards,
    John


  • Paul is right; that wiki link is all that exists on the VPSS communication topic. No demo code was ever productized and published, and the full packetization concept was never implemented. The wiki gives the foundation of what you would need to do, however: primarily some register modifications on the master to get it to output data without any filtering, which is what took the bulk of the time in the project aside from the hardware generation. The code used in the demo was taken directly out of the old DVSDKs, as noted on the wiki; the only changes were some code to write something into the frame buffers of the transmitter, and something to seek out that data on the receiver. Unfortunately the code that was used is no longer available (at least the ARM-side code), but I would say that even if it were, it would not be of much help: what was added to the demos was somewhat of a hack, and for any real-world system it would be made custom to whatever that system's needs are anyway (not to mention the basis on long-obsolete DVSDKs).

    If you are planning on implementing this, I would be glad to discuss what you plan on doing here on the forums to help point you in the right direction. The concept does work, but it is not the most practical interface for most situations.

    SPRP654 is just the slide set in PDF form that was used with the TIDC presentation on this topic.
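
    The write/seek idea described above can be sketched in plain C. This is not the original demo code (which is no longer available); the marker value, buffer size, and payload layout below are illustrative assumptions only. The transmitter stamps a known marker word ahead of its payload in the frame buffer, and the receiver scans the captured buffer for that marker.

    ```c
    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    /* Hypothetical sketch: locate arbitrary data inside a captured video
     * frame buffer by prefixing it with a known marker word.  The marker
     * value and frame size are arbitrary choices for illustration. */

    #define FRAME_WORDS  1024u
    #define MARKER       0xA5A55A5Au

    /* Transmitter side: place marker + payload at a chosen word offset. */
    static void tx_put_payload(uint32_t *frame, size_t off,
                               const uint32_t *payload, size_t n)
    {
        frame[off] = MARKER;
        memcpy(&frame[off + 1], payload, n * sizeof(uint32_t));
    }

    /* Receiver side: scan the captured frame for the marker and return
     * the word offset of the payload, or -1 if the marker is absent. */
    static long rx_find_payload(const uint32_t *frame, size_t words)
    {
        size_t i;
        for (i = 0; i < words; i++)
            if (frame[i] == MARKER)
                return (long)(i + 1);
        return -1;
    }

    int main(void)
    {
        static uint32_t frame[FRAME_WORDS];
        uint32_t payload[4] = { 1, 2, 3, 4 };

        tx_put_payload(frame, 100, payload, 4);

        long at = rx_find_payload(frame, FRAME_WORDS);
        printf("payload found at word %ld: %u %u %u %u\n",
               at, frame[at], frame[at + 1], frame[at + 2], frame[at + 3]);
        return 0;
    }
    ```

    A real implementation would also need to deal with the marker pattern occurring in the payload itself (escaping or framing), which is part of the packetization concept that was never implemented.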

  • Hi Bernie,

    Thank you for the very informative answer. Basically, what we want to do is use two DM6437 DSPs in our system: one sits within a camera that acquires video at 60 fps, processes the video data, and then sends the processed video at 30 fps over 5 meters of cable to another DM6437 DSP for additional processing and operator display. We planned to use digital video over LVDS (or Camera Link) as the interface between the camera and the additional processing board mentioned above. We would like to transfer the image data between the two DSPs in raw sensor data format rather than as an analog video stream that has to go through an encoder and decoder (plus, as mentioned in your article, raw mode could be used to transfer arbitrary/general data).

    I am not very familiar with TI's DaVinci DSPs. If I understand correctly, the DSP within the camera needs to configure the CCDC driver of the VPFE to capture raw sensor data. My questions are: when transferring the raw CMOS sensor data from the DSP within the camera to the second processing DSP, should I bypass the encoder driver of the VPBE on the camera-side DSP, and how should I configure the CCDC driver on the second DSP?

    Again, thanks for all your time and help.


    Regards,
    John

  • I suppose what you want to do is possible, though I am not intimately familiar with LVDS transmitters; if you can find one that can accept a raw camera or video-like data stream, then this should be possible. Sending arbitrary data over is possible if you use a configuration like the one described in the wiki, with external syncs and no scaling or post-processing. However, since a lot of this hinges on your LVDS chipset, I could not say for sure how feasible this is without looking at a particular LVDS chipset.

    Do note that the LVDS interface is fairly necessary: it is unlikely that you could get the raw parallel video data running stably over a 5-meter cable (at least not at 27 MHz). I was having trouble keeping it stable on a 6-inch ribbon cable, though granted that could be improved with shielding and buffering.
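
    A quick back-of-the-envelope calculation shows why the 27 MHz parallel link is workable for this use case. The 8-bit bus width, 720x480 frame size, and 4:2:2 sampling below are illustrative assumptions (BT.656-style timing), not figures from the original project.

    ```c
    #include <stdio.h>

    /* Rough throughput check: how much of the parallel video link's
     * capacity a 30 fps 720x480 YCbCr 4:2:2 stream would consume.
     * Bus width and frame geometry are assumptions for illustration. */
    int main(void)
    {
        const double   pix_clk_hz = 27e6;  /* pixel clock from discussion */
        const unsigned bus_bits   = 8;     /* BT.656-style 8-bit bus      */
        const unsigned width      = 720, height = 480;
        const unsigned bytes_pp   = 2;     /* YCbCr 4:2:2                 */
        const double   fps        = 30.0;

        double link_bytes_s  = pix_clk_hz * bus_bits / 8.0;
        double frame_bytes   = (double)width * height * bytes_pp;
        double video_bytes_s = frame_bytes * fps;

        printf("link capacity : %.1f MB/s\n", link_bytes_s / 1e6);
        printf("30 fps video  : %.2f MB/s (%.0f%% of link)\n",
               video_bytes_s / 1e6, 100.0 * video_bytes_s / link_bytes_s);
        return 0;
    }
    ```

    Under these assumptions the video payload fills roughly three quarters of the link, which leaves some headroom for blanking intervals but none for a faster frame rate at this resolution.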

    John Xu said:
    If I understand correctly the DSP within the camera needs to configure the CCDC driver of VPFE to capture raw sensor data.

    This depends on your particular sensor and what work you want to do on the first DM6437, but chances are that if it is a raw sensor you would want to configure the CCDC to do any preprocessing necessary for your work (e.g., color space conversion) on the first DM6437 in the camera.

    John Xu said:
    My question is: when transferring the raw CMOS sensor data from the DSP within camera to the 2nd processing DSP, should I bypass the encoder driver of VPBE for the DSP within the camera and how should I configure the CCDC driver in the 2nd DSP?

    I would probably do the initial processing with the encoder driver on the first DSP so you can work with the data there as well. On the second DSP, if you want to be able to receive arbitrary data, you would need the VPFE configured in a raw data capture mode so that it does not try to do any processing during capture, which could corrupt your arbitrary data. Even if it is capturing processed YCbCr data, the VPFE on the second DSP does not have to know any better.
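
    In raw capture mode the receiving side essentially just DMAs each incoming line into SDRAM at a fixed line pitch, untouched. A small sketch of the buffer sizing that implies is below; the 736-word line, 480-line frame, and 32-byte pitch alignment are hypothetical values for illustration, not DM6437 driver defaults.

    ```c
    #include <stdint.h>
    #include <stdio.h>

    /* Sketch of sizing a raw-mode capture buffer: each line of 16-bit
     * words is written at a pitch rounded up to an alignment boundary.
     * All dimensions here are illustrative assumptions. */

    static uint32_t align_up(uint32_t x, uint32_t a)
    {
        return (x + a - 1u) & ~(a - 1u);
    }

    int main(void)
    {
        const uint32_t words_per_line = 736;  /* 16-bit words per line */
        const uint32_t lines          = 480;
        const uint32_t line_bytes     = words_per_line * 2u;
        const uint32_t pitch          = align_up(line_bytes, 32u);
        const uint32_t buf_bytes      = pitch * lines;

        printf("line bytes : %u\n", line_bytes);
        printf("pitch      : %u\n", pitch);
        printf("buffer     : %u bytes (%.1f KiB)\n",
               buf_bytes, buf_bytes / 1024.0);
        return 0;
    }
    ```

    Because the pitch can exceed the payload bytes per line, code that reads arbitrary data back out of such a buffer has to step by the pitch, not by the line length, or it will misinterpret the padding.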

  • Bernie,

    Thank you so much for taking time to answer my questions. I will look into the direction you pointed to.

    Best regards,
    John