
Camera Link to DM8618


Our Camera Link cameras are monochromatic and have a 12-14 bit pixel depth.

I need to understand, in detail, how pixels larger than 8 bits per pixel are handled.

I must get all the pixel bits into the processor (so effectively 16 bits per pixel should be received by the processor).

Could you please verify that?

  • Hi eli,


    Are you saying 12/14 bits per pixel, or 12 bits per color component?


    If it is 12/14 bits per pixel, you can treat them as 16 bits per pixel and capture the input in embedded or discrete sync mode. You can store the pixels in the YUV422 interleaved format, so that all the bits will be available in memory. The 12/14 valid bits will sit at the LSB or MSB end depending on how the input is connected to the VIP port.


    If it is 12 bits per color component, you can treat the 24-bit YC data as 24-bit RGB data and capture the input in discrete sync mode. Again, you can store it in RGB format and get all 24 bits in packed mode.


    Thanks,

    Brijesh Jadav
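
The 12/14-bits-per-pixel path described in the reply can be sketched in C. This is a minimal illustration, assuming each monochrome pixel lands in one 16-bit word in memory (via the YUV422-interleaved storage mentioned above); the helper names and the LSB/MSB alignment handling are illustrative, not part of any TI API.

```c
#include <stdint.h>

/* Valid bits right-aligned: camera data lines wired to the VIP LSBs,
 * so mask off the unused upper bits. */
static inline uint16_t pixel_from_lsb(uint16_t word, unsigned bits)
{
    return (uint16_t)(word & ((1u << bits) - 1u));
}

/* Valid bits left-aligned: camera data lines wired to the VIP MSBs,
 * so shift them down to recover the numeric pixel value. */
static inline uint16_t pixel_from_msb(uint16_t word, unsigned bits)
{
    return (uint16_t)(word >> (16u - bits));
}
```

Which helper applies depends entirely on how the camera's data lines are wired to the VIP port, as noted in the reply.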
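For the 12-bits-per-component case, a hedged sketch of unpacking the stored data, assuming each packed 24-bit "RGB" word carries two 12-bit YC components; which component occupies which half depends on the wiring, so the hi/lo naming here is an assumption:

```c
#include <stdint.h>

/* Extract the upper 12-bit component from a packed 24-bit YC word. */
static inline uint16_t yc24_hi(uint32_t word24)
{
    return (uint16_t)((word24 >> 12) & 0x0FFFu);
}

/* Extract the lower 12-bit component. */
static inline uint16_t yc24_lo(uint32_t word24)
{
    return (uint16_t)(word24 & 0x0FFFu);
}
```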