
How to use the DLP LightCrafter to display an image buffer?

Other Parts Discussed in Thread: DLPC300

Hi everybody,

I want to use the DLP LightCrafter to project images captured by a camera that is not connected to it. I have the raw data of the images, but I don't know which mode I should use. I cannot find relevant sample code, and I find it difficult to understand pages 41 and 42 of the "DLPC300 Programmer's Guide".

Does anybody know how to solve this problem, or can anyone share sample code?

Thank you. 

  • Hello Zhenyue,

    Welcome to TI DLP E2E community!

    I would definitely be able to help you with this. Could you give me some more details like:

    • Is the camera connected to the DM365 camera port (directly on the LightCrafter EVM), or is it connected to the computer, so that you already have images captured by it?
    • Is there a particular frame rate you are trying to achieve while displaying the captured images?
    • What is the bit depth of the images (Binary, grayscale, 24-bit color)?

    DLPLightCrafter allows two kinds of pattern sequences:

    1. External Test Pattern - In this mode you stream the patterns through the 24-bit RGB interface, and those patterns are fed to the DLPC300 for display at the desired frame rate (15, 30, 45, or 60 Hz).
    2. Internal Test Pattern - In this mode, a certain number of patterns/images can be preloaded into the controller's mDDR memory and then displayed at much higher frame rates (120 Hz maximum for grayscale patterns), since the patterns are already preloaded.

    Refer to Section 4 of the User's Guide to learn more about these two modes.

    I would recommend using the GUI that comes with the EVM to play with different modes and different frame rates. After you know exactly what mode you want to use and the frame rate, you can then refer to the DLPC300 programmer's guide and the Command handler sample application to configure LightCrafter accordingly.
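    For orientation, below is a minimal sketch of what the PC-side flow with the command handler sample application typically looks like. The LCR_CMD_Open()/LCR_CMD_Close() helpers, the "lcr_cmd.h" header name, and the exact mode value are assumptions based on that sample and the programmer's guide, so please verify them against your copy of the code.

    #include <stdio.h>
    #include "lcr_cmd.h"   /* command handler sample API; header name assumed */

    int main(void)
    {
        /* Open the connection to the LightCrafter (helper assumed from the sample application) */
        if (LCR_CMD_Open() != 0)
        {
            printf("Could not connect to the LightCrafter\n");
            return -1;
        }

        /* Select a display mode; 0x02 = HDMI/RGB video, 0x04 = internal pattern sequence
           (values as used later in this thread; verify against the programmer's guide) */
        LCR_CMD_SetDisplayMode((LCR_DisplayMode_t)(0x02));

        /* ... further configuration: LED selection, frame rate, pattern sequence, etc. ... */

        LCR_CMD_Close();
        return 0;
    }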

    Hope this helps. Feel free to write back if you have more questions.

    Regards

    Manasi

     

     

  • Hi Manasi,

    Thank you so much for your response. 

    The camera I use is not connected to the DM365 camera port. Since I built the camera myself, I don't think the LightCrafter can support it directly. When the camera is connected to the laptop, it captures 30 grayscale images (8-bit) per second (i.e., 30 FPS), and I can get the raw data buffer of each image. So my question is: how do I project these images with the LightCrafter?

    The images should be displayed live, and the frame rate is not very high (below 30 FPS). So it looks like the "Internal Test Pattern" mode is not suitable, right?

    For the "External Test Pattern", how can I stream the image buffers to DLPC300? Where can I find the sample code to achieve it? Or where can I find the SDK to do it? I even don't know which function to use. 

    I would really appreciate it if you could answer my questions promptly.

    Regards,

    Zhenyue Chen 

  • Hi Manasi,

    To my understanding, I should use the "External Test Pattern" mode for display, set the number of patterns to 1, and choose an exposure time smaller than the camera exposure time. Then I would set the buffer as the pattern to be displayed. But how can I define the pattern?

    LCR_CMD_SetDisplayMode((LCR_DisplayMode_t)(0x04));

    printf("Selecting BIT_DEPTH = 8, NUM_PAT = 1, TRIGGER_TYPE = AUTO, AUTO_TRIG_PEIORD = 8700uSec, EXPOSURE = 8333, LED = BLUE\n");
    patSeqSet.BitDepth = 8;
    patSeqSet.NumPatterns = 1;
    patSeqSet.PatternType = PTN_TYPE_NORMAL;
    patSeqSet.InputTriggerType = TRIGGER_TYPE_AUTO;
    patSeqSet.InputTriggerDelay = 0;
    patSeqSet.AutoTriggerPeriod = 8700;
    patSeqSet.ExposureTime = 8333;
    patSeqSet.LEDSelect = LED_GREEN;
    patSeqSet.Repeat = 0;
    LCR_CMD_SetPatternSeqSetting(&patSeqSet);

    LCR_CMD_DefinePatternBMP(xxxxxxxxxxxxxxxxxxxxxxxxxxxx); // How to do this step with the image buffer???

    printf("Starting Pattern Sequence...\n");
    LCR_CMD_StartPatternSeq(1);
    mSleep(10000);
    LCR_CMD_StartPatternSeq(0); //Stop pattern sequence

    Best regards,

    Zhenyue Chen

  • Hello Zhenyue,

    Okay so here's what you could do:

    1. Internal Test Pattern Mode: If the number of patterns you want to display is less than 12, then you could still use the internal stored-pattern mode: preload the patterns and then display them on command or auto trigger. In this case you would use the following commands:
      1. LCR_CMD_SetDisplayMode((LCR_DisplayMode_t)(0x04));  // 0x04 is internal pattern sequence mode

        printf("Selecting BIT_DEPTH = 8, NUM_PAT = 1, TRIGGER_TYPE = AUTO, AUTO_TRIG_PEIORD = 8700uSec, EXPOSURE = 8333, LED = BLUE\n");
        patSeqSet.BitDepth = 8;
        patSeqSet.NumPatterns = 1;                                                  // or up to 12
        patSeqSet.PatternType = PTN_TYPE_NORMAL;
        patSeqSet.InputTriggerType = TRIGGER_TYPE_AUTO;
        patSeqSet.InputTriggerDelay = 0;
        patSeqSet.AutoTriggerPeriod = 8700;
        patSeqSet.ExposureTime = 8333;
        patSeqSet.LEDSelect = LED_GREEN;
        patSeqSet.Repeat = 0;
        LCR_CMD_SetPatternSeqSetting(&patSeqSet);

        LCR_CMD_DefinePatternBMP(xxxxxxxxxxxxxxxxxxxxxxxxxxxx); // How to do this step with the image buffer??? // Yes, you would need to convert the raw images to BMP format, perhaps using OpenCV routines (see the BMP-writing sketch after this list)

        printf("Starting Pattern Sequence...\n");
        LCR_CMD_StartPatternSeq(1);
        mSleep(10000);
        LCR_CMD_StartPatternSeq(0); //Stop pattern sequence

    2. External Pattern streaming mode - In this case you would stream the patterns over HDMI at up to 60 Hz. You will have to set the display mode to Video (LCR_CMD_SetDisplayMode((LCR_DisplayMode_t)(0x02))).
    3. Then select the LED, bit depth, and frame rate (0x02 0x01). Refer to Section 3.9 of the DM365 Command Interface Guide.
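    Regarding the comment above about converting the raw images to BMP: the data handed to LCR_CMD_DefinePatternBMP() has to be in BMP format, so an 8-bit grayscale camera buffer first needs a BMP header and palette wrapped around it. Below is a rough sketch of such a conversion in plain C; write_gray8_bmp() is a hypothetical helper (not part of the LightCrafter API), it assumes a little-endian PC, and the resulting file is what you would then pass on to LCR_CMD_DefinePatternBMP() in whatever form your version of the sample code expects.

    #include <stdio.h>
    #include <stdint.h>
    #include <string.h>

    /* write_gray8_bmp(): hypothetical helper that wraps a raw 8-bit grayscale buffer
     * (width x height, row-major, top row first) in a BMP file. Assumes a little-endian host. */
    static int write_gray8_bmp(const char *path, const uint8_t *pixels, int width, int height)
    {
        int      row_size  = (width + 3) & ~3;              /* BMP rows are padded to 4 bytes */
        uint32_t data_size = (uint32_t)row_size * (uint32_t)height;
        uint32_t offset    = 14 + 40 + 256 * 4;             /* file header + info header + palette */
        uint32_t file_size = offset + data_size;
        uint32_t info_size = 40, compression = 0;
        int32_t  w = width, h = height;
        uint16_t planes = 1, bpp = 8;
        uint8_t  header[14 + 40] = {0};
        FILE    *fp;
        int      x, y;

        /* BITMAPFILEHEADER */
        header[0] = 'B'; header[1] = 'M';
        memcpy(&header[2],  &file_size, 4);
        memcpy(&header[10], &offset,    4);

        /* BITMAPINFOHEADER: 8 bits per pixel, uncompressed, stored bottom-up */
        memcpy(&header[14], &info_size,   4);
        memcpy(&header[18], &w,           4);
        memcpy(&header[22], &h,           4);
        memcpy(&header[26], &planes,      2);
        memcpy(&header[28], &bpp,         2);
        memcpy(&header[30], &compression, 4);
        memcpy(&header[34], &data_size,   4);

        fp = fopen(path, "wb");
        if (!fp)
            return -1;
        fwrite(header, 1, sizeof(header), fp);

        /* 256-entry grayscale palette: each entry is B, G, R, reserved */
        for (x = 0; x < 256; x++) {
            uint8_t entry[4] = { (uint8_t)x, (uint8_t)x, (uint8_t)x, 0 };
            fwrite(entry, 1, 4, fp);
        }

        /* Pixel data: BMP stores rows bottom-up, each row padded to a 4-byte boundary */
        for (y = height - 1; y >= 0; y--) {
            uint8_t pad[3] = {0};
            fwrite(pixels + (size_t)y * (size_t)width, 1, (size_t)width, fp);
            fwrite(pad, 1, (size_t)(row_size - width), fp);
        }

        fclose(fp);
        return 0;
    }

    In a live setup you could build the same layout in a memory buffer instead of a file to avoid the disk round trip, but whether that helps depends on what the DefinePatternBMP implementation in your sample code accepts.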

    Hope this helps. Ping me back if you have more questions.

    Regards

    Manasi

  • Hi Manasi,

    When I use the "internal pattern sequence mode" to display the live images captured by a camera, there are two problems. First, I don't know how to stream the image buffer to the lightcrafter directly. I don't want to convert the raw image data to a bmp file and store it in the computer disk. That will slow down the projection speed.  Second, using the codes in your last response, the lightcrafter will blink. The projection is not continuous. It gets dark and then bright then dark then bright. How can I display real-time images?

    On the other hand, I have referred to the DM365 Command Interface Guide, but it doesn't help me understand how to use the HDMI mode for display. I wonder why there is no sample code for this commercial product. I've spent so much time trying to figure out how to make this LightCrafter display real-time images. Please help me out.

    Regards,

    Zhenyue Chen

     

  • Hello Zhenyue Chen,

    Driving images in real time:

    1. When you connect the HDMI output of your PC to the LightCrafter kit, it will be detected as a monitor device with 608x684 resolution @ 60 Hz.

    2. Your PC graphics card can then be configured in extended desktop mode, so the primary display is your PC monitor and the secondary display is the LightCrafter kit.

    3. Then choose one of the two options below:

    Option A:

    Now, in your software, the images or processed camera captures can be displayed directly in a video window (like a live feed); that video window can then be run full-screen on the secondary monitor, which is the LightCrafter kit (see the sketch after Option B).

    Option B:

    Your code can simply update the desktop background image (on Windows) with your intended display content, which automatically gets shown on the LightCrafter. This option would, however, be quite slow compared to Option A.
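
    To make Option A concrete, here is a rough sketch using OpenCV's legacy C highgui API (OpenCV routines were already suggested earlier in this thread). The 608x684 size is the LightCrafter's native resolution mentioned above; PRIMARY_WIDTH is an assumed placeholder for the width of your primary monitor, since the secondary (LightCrafter) desktop area normally starts just to its right in extended mode.

    #include <opencv2/core/core_c.h>
    #include <opencv2/highgui/highgui_c.h>

    #define PRIMARY_WIDTH 1920   /* assumed width of the primary monitor; adjust to your setup */

    int main(void)
    {
        /* Create a resizable window and move it onto the secondary (LightCrafter) desktop area */
        cvNamedWindow("lightcrafter", CV_WINDOW_NORMAL);
        cvMoveWindow("lightcrafter", PRIMARY_WIDTH, 0);
        cvSetWindowProperty("lightcrafter", CV_WND_PROP_FULLSCREEN, CV_WINDOW_FULLSCREEN);

        /* Show a mid-gray test frame; in the real application this would be the live camera image */
        IplImage *img = cvCreateImage(cvSize(608, 684), IPL_DEPTH_8U, 1);
        cvSet(img, cvScalarAll(128), NULL);
        cvShowImage("lightcrafter", img);
        cvWaitKey(0);

        cvReleaseImage(&img);
        cvDestroyWindow("lightcrafter");
        return 0;
    }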

    I hope this helps.

    Regards,
    Sanjeev

     

  • Hello Zhenyue,

    I'd like to add a note about Option B from my own experience. When using the various Windows desktop themes, such as Aero mode, the OS sometimes adds artifacts to the video stream sent to the LightCrafter. If you are projecting binary patterns, these artifacts can become very apparent.

    If it is possible, I recommend Option A for the best performance.

    Best regards,

    Blair

  • Hello Sanjeev and Blair,

    Thank you both for your responses. I'll take Option A over Option B.

    By the way, if I don't want to use an extended screen to display the real-time images, is there another way to achieve a similar effect? For instance, could I get the raw image data buffer first and then stream it to the DLP LightCrafter frame by frame? Is that possible?

    Regards,

    Zhenyue Chen

  • Hello Zhenyue,

    Option B works on the principle that the LightCrafter kit is recognized like any other monitor device.

    "For instance, could I get the raw image data buffer first and then stream it to the DLP LightCrafter frame by frame? Is that possible?"


    Is your question about writing directly into the video buffer on a PC running a Windows-like operating system? I am not familiar with Windows driver code for writing into the video buffer, so you would have to look into direct video-buffer access on your operating system yourself. A simpler way would be to draw a GUI window, update the image frames on it, and display that GUI window on the monitor port where the LightCrafter is detected; a sketch of such an update loop follows.
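
    As a sketch of that idea, the loop below wraps each raw 8-bit camera buffer in an IplImage header (no copy) and pushes it to the full-screen "lightcrafter" window from the earlier post, again using OpenCV's legacy C API. get_camera_frame() and the 608x684 frame size are hypothetical stand-ins for your own capture code.

    #include <opencv2/core/core_c.h>
    #include <opencv2/highgui/highgui_c.h>

    #define CAM_WIDTH  608
    #define CAM_HEIGHT 684

    /* Hypothetical camera hook: fills 'buf' with one 8-bit grayscale frame,
     * returns 0 on success; replace with your own capture code. */
    extern int get_camera_frame(unsigned char *buf);

    void run_projection_loop(void)
    {
        static unsigned char buf[CAM_WIDTH * CAM_HEIGHT];

        /* Wrap the raw buffer in an IplImage header without copying the pixel data */
        IplImage *frame = cvCreateImageHeader(cvSize(CAM_WIDTH, CAM_HEIGHT), IPL_DEPTH_8U, 1);

        while (get_camera_frame(buf) == 0)
        {
            cvSetData(frame, buf, CAM_WIDTH);   /* step = bytes per row */
            cvShowImage("lightcrafter", frame); /* window created and made full-screen beforehand */
            if (cvWaitKey(1) == 27)             /* let highgui refresh; ESC breaks the loop */
                break;
        }

        cvReleaseImageHeader(&frame);
    }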

    Regards,
    Sanjeev

  • Hi Sanjeev,

    I took your advice to draw a GUI window, update the image frames on it, and display the GUI window on the LightCrafter screen. It works. Thank you very much.

    Best regards,

    Zhenyue Chen