
DLP3000-C300REF: LightCrafter 3000 Source Code

Part Number: DLP3000-C300REF

Dear All,

I am currently working with the TI LightCrafter 3000 EVM to develop a pattern projector for a commercial product. To this end, we intend to use the LightCrafter as a development platform and design our own custom drive electronics (a custom solution is mandatory for us in terms of mechanical integration and life-cycle management).

We have a few questions regarding this platform:

1. 8-bit pattern sequences with more than 12 patterns:

More specifically, we are interested in projecting sequences of ~20 8-bit monochrome patterns and synchronously acquiring images. We have had success using the GUI with fewer than 12 patterns, but with more patterns (e.g. 20) the Start Pattern Sequence command fails (although the upload and configuration commands succeed). To investigate further, we connected to the DM365 over telnet and restarted the "cmdh" application to get more debugging information. The failing command takes more than 1 second and fails with the following message:

API.c:1100 >> Error = FAIL

2. Availability of FPGA and DM365 source code:

Before spinning our own board, we need to fully understand and customize the way the LightCrafter works. To this end, we need the following pieces of code:

- The Lightcrafter FPGA source code

- The DM365 API/library (liblcr.so) source code

Unfortunately, we have not been able to find these files either on TI's website or in the forum. Can they be made available for download?

Best regards,

Guillaume

  • Hello Guillaume and welcome to the E2E forums,

    1. Unfortunately, the maximum number of patterns that can be stored for streaming is 96 / bit_depth, so with 8-bit patterns the maximum is 12. I would recommend looking at the DLPLCR6500EVM, which can hold up to 30 8-bit binary patterns. You will also likely have more flexibility with that chipset, with fewer external components required. Alternatively, you could stream patterns instead of storing them in flash.
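
    For illustration, the arithmetic behind that limit (a minimal sketch; the helper name below is made up and is not part of the LightCrafter API):

    #include <stdio.h>

    /* 96 bits of stored-pattern memory, per the figure above. */
    #define STORED_PATTERN_BITS 96

    /* Hypothetical helper, for illustration only. */
    static int max_stored_patterns(int bit_depth)
    {
        return STORED_PATTERN_BITS / bit_depth;   /* e.g. 96 / 8 = 12 patterns */
    }

    int main(void)
    {
        printf("8-bit patterns: %d\n", max_stored_patterns(8));  /* 12 */
        printf("1-bit patterns: %d\n", max_stored_patterns(1));  /* 96 */
        return 0;
    }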

    2. TI has not released the source code for the FPGA or the DM365 API. We plan to keep it as a binary-only release.


    Thanks,
    Kyle
  • Dear Kyle,

    Thank you for your answer.

    Regarding point 1, we intend to keep using the DLP3000 chipset due to its low cost, small footprint and wide availability of optical engines.

    In our application, we are using multiple cameras synchronized to the LightCrafter's trigger output; we do not need a high repetition rate (in triggered mode, the cameras we are using are limited to about 32 FPS), but we do need to control the exposure time and, most importantly, the acquired images must be synchronized with the projected patterns. This is easy to do with a Stored Pattern Sequence in "Normal" trigger mode, which produces a "burst" of N patterns and therefore N camera images, each acquired image corresponding to the right pattern in the sequence. This is much harder to achieve with a video stream, e.g. over the HDMI input, since there is no side channel for triggering.

    We were thinking of modifying the LightCrafter software so that more than 12 8-bit patterns can be used: since the DLPC300 has four 24-bit buffers arranged in a circular fashion, it should be possible to upload new patterns once the first patterns of the sequence have been displayed. This should be fairly easy to implement in our case, as we do not need a high repetition rate, which leaves us enough time to refill the display buffers once the sequence has started.
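
    A rough sketch of the refill logic we have in mind is shown below (pseudocode in C; every identifier is hypothetical and merely stands in for whatever the DM365 firmware actually exposes):

    /* All names below are hypothetical placeholders, not the real liblcr.so API. */
    typedef struct { unsigned char plane[608 * 684]; } Pattern;  /* one 8-bit plane (DLP3000 is 608 x 684) */

    extern void upload_pattern(int buffer_index, const Pattern *p);
    extern void start_pattern_sequence(void);
    extern void wait_for_buffer_displayed(void);  /* e.g. keyed off the trigger output */

    enum { NUM_BUFFERS = 4, PATTERNS_PER_BUFFER = 3 };  /* 3 x 8-bit planes per 24-bit buffer */

    void run_long_sequence(const Pattern *patterns, int count)
    {
        int uploaded = 0, displayed = 0;

        /* Pre-fill the four circular buffers before starting the sequence. */
        while (uploaded < count && uploaded < NUM_BUFFERS * PATTERNS_PER_BUFFER) {
            upload_pattern(uploaded / PATTERNS_PER_BUFFER, &patterns[uploaded]);
            uploaded++;
        }

        start_pattern_sequence();

        /* Each time a buffer has been fully displayed, refill that buffer with
           the next pending patterns; our low repetition rate leaves ample time
           for the uploads before the buffer comes around again. */
        while (displayed < count) {
            wait_for_buffer_displayed();
            displayed += PATTERNS_PER_BUFFER;
            while (uploaded < count &&
                   uploaded < displayed + NUM_BUFFERS * PATTERNS_PER_BUFFER) {
                upload_pattern((uploaded / PATTERNS_PER_BUFFER) % NUM_BUFFERS,
                               &patterns[uploaded]);
                uploaded++;
            }
        }
    }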

    This is why we requested the source code in the first place; while we are able to infer the functionality and implementation of the DM365 API (using a disassembler) and of the FPGA (using the available documentation), reimplementing a complete solution would take us quite a lot of time. In our opinion, not providing the full source code defeats the purpose of the LightCrafter as a development platform.

    Best regards,

    Guillaume

  • Hello Guillaume,

    Are you able to use the streaming mode with external triggers? Or am I missing something that prevents you from doing that?

    Thanks,
    Kyle
  • Hello Guillaume,

    Let me know if you still need help. I think it may be possible for you to use external streaming mode with triggers unless I'm missing something.

    Thanks,
    Kyle
  • Hello Kyle,

    We are currently operating the LightCrafter in "External Streaming Pattern Sequence" mode with the external trigger output enabled, in order to use all of our patterns (currently 24 phase images, plus black and white screens). However, we have to work within the following limitations:

    - The trigger rate is too high for our cameras (we can reliably operate them up to a 32 Hz trigger frequency, depending on the exposure settings), hence some frames are dropped.

    - We are using two cameras at the moment, and because of the above issue they may not be synchronized with each other (e.g. one camera may trigger on "odd" frames while the other triggers on "even" frames because of the missed triggers).

    - There is no provision for changing the pattern exposure time.

    - There is no reliable way to synchronize the pattern sequence with the acquired images, since the video interface does not provide a side channel for gating the camera triggers. We would like to acquire exactly one image for each pattern; at the moment the operation sequence is as follows (a rough sketch is given after the list):

    1. Display next pattern on video output,
    2. Enable hardware trigger on camera and start camera,
    3. Wait for at least one frame,
    4. Save frame and disable camera,
    5. Go to 1 until all patterns have been displayed.
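
    In pseudocode, one iteration of that loop looks roughly like this (all function names are placeholders for our own projector and camera wrappers, shown only to illustrate the handshake):

    /* Placeholder names for our projector/camera wrappers; illustration only. */
    extern void display_pattern_on_video_output(int index);
    extern void camera_enable_hardware_trigger(void);
    extern const void *camera_wait_for_frame(void);   /* blocks until a frame arrives */
    extern void save_frame(const void *frame, int index);
    extern void camera_disable(void);

    void acquire_all_patterns(int pattern_count)
    {
        for (int i = 0; i < pattern_count; i++) {
            display_pattern_on_video_output(i);          /* step 1 */
            camera_enable_hardware_trigger();            /* step 2 */
            const void *frame = camera_wait_for_frame(); /* step 3 */
            save_frame(frame, i);                        /* step 4 */
            camera_disable();                            /* step 4 (cont.) */
        }                                                /* step 5: next pattern */
    }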

    In the "Stored Pattern Sequence" mode, we are able to grab all the desired images at once in a single burst, resulting in much shorter acquisition times. However, in this mode we are limited to 12 patterns (due to the 4*24-bit buffer size).

    Our aim is to modify the DM365 and FPGA code so as to refill the buffers once the first patterns have been projected, in order to overcome the 12-pattern limit.

    Best regards,

    Guillaume

  • Hello Guillaume,

    Even when changing the external frame rate to 15 Hz and using the output trigger to trigger the cameras, do you still miss camera triggers? Unfortunately, modifying the DM365 and FPGA code is not supported and would likely be a prohibitively difficult endeavor.

    Thanks,
    Kyle
  • Hi Kyle,

    We have tried your suggestion over the past few days. Unfortunately, this solution is not entirely satisfactory for us, for the following reasons:

    - Even though it is possible to change the source frame rate (we modified the EDID to do so), the graphics card in the computer we used (Intel HD Graphics 530) refuses to output video signals below 30 Hz. At such a low rate, the Windows desktop also becomes severely sluggish (it has to keep up with two different frame rates).

    - The exposure time is lengthened with respect to the original 60 Hz settings.

    - Even though the source frame rate is lower, the pattern rate is still multiplied by 2 (for 8-bit patterns) or 3 (for 7-bit patterns); see the quick calculation after this list.

    - Last but not least, the external video mode does not solve our matching issue (acquiring one, and only one, image for each projected pattern).
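
    For reference, a quick check of the resulting trigger rates (the ×2 and ×3 multipliers are the ones quoted above; the 30 Hz source rate is the lowest our graphics card would output):

    #include <stdio.h>

    int main(void)
    {
        double source_fps = 30.0;   /* lowest rate our graphics card accepted */
        printf("8-bit patterns: %.0f triggers/s\n", source_fps * 2);  /* 60 Hz */
        printf("7-bit patterns: %.0f triggers/s\n", source_fps * 3);  /* 90 Hz */
        /* Both exceed the ~32 Hz our cameras can follow reliably. */
        return 0;
    }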

    In the meantime, we are considering the following workarounds for triggering in external video mode:

    - Reducing the trigger rate with an external divide-by-N and gate circuit (sketched after this list)

    - Modifying the DLPC300 display sequence to skip every other frame
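
    A minimal sketch of the first idea, written as a hypothetical microcontroller interrupt handler (the ISR hook and the output helper are made up; any small MCU or CPLD could play this role):

    /* Hypothetical divide-by-N-and-gate logic; all names are placeholders. */
    #define DIVIDE_BY 2                       /* forward one trigger out of every N */

    extern void pulse_camera_trigger_output(void);   /* e.g. a short GPIO pulse */

    static volatile unsigned trigger_count = 0;
    static volatile int      gate_open     = 1;      /* cleared to suppress triggers */

    void lightcrafter_trigger_isr(void)               /* fires on each projector trigger edge */
    {
        if (gate_open && (trigger_count % DIVIDE_BY) == 0)
            pulse_camera_trigger_output();
        trigger_count++;
    }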

    Regards,

    Guillaume

  • Hello Guillaume,

    Thanks for the update. If you need further assistance with this new approach, let us know. Also, for your information, some of our customers use external hardware (such as an FPGA) specifically designed to feed in video at very low frame rates. I realize that if you are limited to a Windows PC, this approach may be impractical. Alternatively, you could simply use some external hardware to have the camera ignore certain triggers (as you alluded to doing).

    Thanks,

    Kyle