
OPT8241-CDK-EVM: OPT8241-CDK-EVM - General Application Questions

Part Number: OPT8241-CDK-EVM

Hi,

I have a few questions about the ways in which this TOF camera can fail.

1. Say one of the applications is real-time analysis of a dark object on a conveyor belt; how fast can the conveyor belt be moving before errors become large?

2. How dark of a material can be used before not enough infrared light is reflected back to the camera? Does texture play a role in this? (assume a working distance of ~2m)

3. How shiny of a material can be used before too much light is reflected back? Does texture play a role in this? (assume a working distance of ~2m)

4. Is the TOF Camera/software able to distinguish between different objects or materials? If so, how does it do this? Ex. a bin full of sand; can the bin and the sand be distinguished from one another? What if the bin and the sand are the same color?

5. Is the TOF Camera/software able to create a depth map of a pile of big rocks? Or would occlusions become too big of a problem?

6. How bright and clear of an image is possible with different combinations of working distance and Field of View?

Thanks,

Sabrina

    1. Say one of the applications is real-time analysis of a dark object on a conveyor belt; how fast can the conveyor belt be moving before errors become large?

    >> Speed of measurement is largely related to frame rate.  The OPT8241 can scan up to 120 fps at 320x240, and the OPT8320 can scan up to 500 fps.  A 3D-TOF sensor works by reflecting light off the object and measuring the phase shift of the returned signal.  If the object is too dark (at NIR wavelengths), the measurement error becomes larger.  I recommend you test your specific scenario using our System Estimator tool, available here:

    http://www.ti.com/lit/zip/sbac124
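    The phase-shift principle mentioned above can be sketched in a few lines. This is a minimal, generic continuous-wave TOF calculation, not code from the TI SDK; the 24 MHz modulation frequency is an illustrative value (check the datasheet for the frequencies your module actually supports), and the function names are mine.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def phase_to_distance(phase_rad, f_mod_hz):
    """Convert a measured phase shift to distance (continuous-wave TOF).
    Light travels to the object and back, hence the factor of 2 (i.e. 4*pi)."""
    return C * phase_rad / (4 * math.pi * f_mod_hz)

def unambiguous_range(f_mod_hz):
    """Maximum distance before the phase wraps around (range aliasing)."""
    return C / (2 * f_mod_hz)

def motion_per_frame(belt_speed_m_s, fps):
    """How far a conveyor-belt object moves during one frame period;
    a rough proxy for motion error at a given frame rate."""
    return belt_speed_m_s / fps

f = 24e6  # example modulation frequency (illustrative; see datasheet)
print(unambiguous_range(f))            # ~6.25 m before phase wrap
print(phase_to_distance(math.pi, f))   # half the unambiguous range
print(motion_per_frame(1.0, 120))      # ~8.3 mm of belt travel per frame at 120 fps
```

    At 120 fps, a 1 m/s belt moves the object only ~8 mm between frames, which is why a higher frame rate tolerates faster motion.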

    2. How dark of a material can be used before not enough infrared light is reflected back to the camera? Does texture play a role in this? (assume a working distance of ~2m)

    >> Please use the System Estimator tool for a more specific answer.  Texture does not play a role, unless it deflects light away so that it does not return to the sensor.  Texture will, however, likely show up in the amplitude image (a TOF sensor returns two types of images: phase and amplitude).

    3. How shiny of a material can be used before too much light is reflected back? Does texture play a role in this? (assume a working distance of ~2m)

    >> See my previous answer.  If the reflected light is too strong, saturation can occur, but TI 3D-TOF sensors can detect saturation (and amplitude) per pixel, and that information can be used to dial back the integration time ("exposure") or the illumination power.
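    The per-pixel saturation feedback described above could be sketched as a simple control loop. This is an assumption-laden illustration, not the actual Voxel SDK API: the 12-bit saturation level, the 1% tolerance, and the function name are all mine.

```python
import numpy as np

SATURATION_LEVEL = 4095   # example: 12-bit amplitude ADC (illustrative value)
MAX_SAT_FRACTION = 0.01   # tolerate at most 1% saturated pixels (arbitrary choice)

def adjust_integration_time(amplitude, t_int_us, step=0.8):
    """Feedback sketch: if too many pixels saturate, shorten the integration
    time; a real system would use the SDK's own exposure/illumination controls."""
    sat_fraction = np.mean(amplitude >= SATURATION_LEVEL)
    if sat_fraction > MAX_SAT_FRACTION:
        return max(int(t_int_us * step), 1)  # dial back exposure
    return t_int_us

# Synthetic 320x240 amplitude frame with the top 10% of rows saturated
frame = np.full((240, 320), 1000, dtype=np.uint16)
frame[:24, :] = 4095
print(adjust_integration_time(frame, 1000))                 # shortened to 800
print(adjust_integration_time(np.zeros((240, 320)), 1000))  # unchanged: 1000
```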

    4. Is the TOF Camera/software able to distinguish between different objects or materials? If so, how does it do this? Ex. a bin full of sand; can the bin and the sand be distinguished from one another? What if the bin and the sand are the same color?

    >> A TOF camera provides two types of images: phase and amplitude.  Phase is generally not a function of the surface material, but amplitude is, so the amplitude of the reflection can potentially be used to distinguish different types of materials.  If the TOF camera is paired with an RGB camera, color can be used as well.  Several of our partners sell combined TOF + RGB cameras.

    5. Is the TOF Camera/software able to create a depth map of a pile of big rocks? Or would occlusions become too big of a problem?

    >> Yes, but occluded portions would not be seen (they are shadowed by the objects in front of them).  One can use multiple TOF cameras to acquire depth images from different angles and merge the point clouds into a single solid model.  This technique has been used for scanning objects for 3D printing.
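    Merging point clouds from multiple cameras, as suggested above, amounts to transforming each camera's points into a shared world frame with its calibrated pose and concatenating. The poses below are illustrative (in practice they come from extrinsic calibration), and the helper name is mine.

```python
import numpy as np

def transform_points(points, R, t):
    """Apply a rigid transform (rotation R, translation t) mapping a
    camera's local point cloud into a common world frame."""
    return points @ R.T + t

# Two cameras viewing the same pile from opposite sides.
cloud_a = np.array([[0.0, 0.0, 2.0]])   # camera A at origin, facing +Z
R_b = np.array([[-1.0, 0.0, 0.0],       # camera B rotated 180 deg about Y
                [ 0.0, 1.0, 0.0],
                [ 0.0, 0.0, -1.0]])
t_b = np.array([0.0, 0.0, 4.0])         # camera B positioned 4 m away
cloud_b = np.array([[0.0, 0.0, 2.0]])   # same surface point, seen by B
merged = np.vstack([cloud_a, transform_points(cloud_b, R_b, t_b)])
print(merged)  # both observations land at [0, 0, 2] in the world frame
```

    With real data one would follow this with registration refinement (e.g. ICP) and surface reconstruction to get the "solid" mentioned above.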

    6. How bright and clear of an image is possible with different combinations of working distance and Field of View?

    >> Image quality is a complex function of illumination power, lens field of view, number of pixels, integration time, and other factors.  The System Estimator tool is designed to help you experiment with different scenarios.

    Perhaps these videos will help:

    https://training.ti.com/3d-time-flight-sensors