LMH0395 Carrier Detect Signal (Specifics?)

Other Parts Discussed in Thread: LMH0395

Hello, I currently have a design utilizing the LMH0395 (SDI Equalizer), and I was hoping to get some clarification on the "Carrier Detect" functionality. Specifically, I'm trying to understand what signal/data is being "detected". I currently use the Carrier Detect output to trigger an LED indicator circuit, and this appears to work. When sending a good HD-SDI signal to the LMH0395, the Carrier Detect is LOW (indicating a good SDI signal received). Likewise, if the HD-SDI signal is disconnected from the LMH0395 circuit, the Carrier Detect goes HIGH (indicating no SDI signal received).

Unfortunately, the Carrier Detect circuitry is not operating as expected for degraded signals. There have been many times where we have pushed the cable lengths to the limit of what is recoverable, so much so that we lose the video feed completely. In these scenarios, despite the lost video feed, the Carrier Detect output remains LOW, indicating (to my understanding) that the LMH0395 should be able to recover the signal. Just prior to reaching these cable limits, the jitter characteristics observed are within SMPTE standards, so I do not believe the lost video is caused by excessive jitter.

In trying to understand why the Carrier Detect circuit would act this way, I began to wonder what was really being "detected". If it effectively detects the sampling frequency, I could understand what is going on. Or perhaps it looks for bit sequences such as SAVs & EAVs, which might remain recoverable while the rest of the signal is lost to CRC errors.

Any insight into how to better understand and utilize the "Carrier Detect" functionality here would be very much appreciated.

Thank you, in advance, for your time and assistance.

- Jay Hardesty     

  • Hi Jay,

    As part of the data sheet release, we verified the LMH0395/4 over the longest cable lengths mentioned in the data sheet, using different video patterns, to make sure it is operational over the full cable reach stated there.

    The LMH0395/4 CDbar works by measuring the energy within a certain frequency band. Given your observation that CDbar stays low despite the loss of video, there must still be low-frequency energy content keeping CDbar active. Because the detector measures low-frequency content, which the cable attenuates less than the high-frequency content, the video can become distorted or unrecoverable while CDbar stays active.

    Could you please specify the length of the cable, video rate, and video pattern at which the video is distorted but CDbar is active?

    Regards,
    Nasser
  • This explains a lot!

    (Background information) Due to our industry's environmental factors, we are actually using Ethernet cables for HD-SDI transmission. These cables are far more rugged, dependable, and cost-effective than traditional terrestrial (land) digital video transmission media (coax/fiber).

    The Ethernet cables would certainly attenuate the lower frequencies far less than the higher frequencies of the 3G/HD-SDI signals, hence the video loss while the low-frequency energy remains present. For reference, the cable length was approximately 20 meters, and the signal was a pathological HD-SDI pattern.

    Given the non-standard implementation, I'm guessing we won't be able to use this feature. I imagine the Mute function won't be of much use either, as the transmission capabilities of the various subsea Ethernet cables can vary quite a bit.

    Nonetheless, I remain open to any suggestions you may have for indicating video status, given our non-standard application.

    Thank you for your help and insight Nasser,

    - Jay
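
The energy-in-band behavior Nasser describes can be illustrated with a toy numerical model. Everything below is an illustrative assumption: the sample rate, band edges, threshold, and the `band_energy`/`cd_asserted` helpers are invented for the sketch and are not the LMH0395's actual detector parameters.

```python
import math

def band_energy(samples, fs, f_lo, f_hi):
    """Total energy in DFT bins whose frequency falls in [f_lo, f_hi].

    Direct DFT over the band only -- O(N * bins), fine for a toy model.
    """
    n = len(samples)
    energy = 0.0
    for k in range(n // 2):
        f = k * fs / n
        if f_lo <= f <= f_hi:
            re = sum(samples[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = sum(samples[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            energy += (re * re + im * im) / (n * n)
    return energy

fs = 1000.0          # sample rate of the toy model (Hz)
n = 1000
CD_THRESHOLD = 0.1   # illustrative detection threshold, not a device spec

def cd_asserted(samples):
    """Model "carrier present" (CDbar low) as band energy above a threshold."""
    return band_energy(samples, fs, 5.0, 50.0) > CD_THRESHOLD

# Healthy link: low- and high-frequency content both arrive.
good = [math.sin(2 * math.pi * 10 * t / fs) + math.sin(2 * math.pi * 400 * t / fs)
        for t in range(n)]
# Over-length cable: high frequencies heavily attenuated, low band intact --
# the video is unrecoverable, yet the low-band energy is unchanged.
degraded = [math.sin(2 * math.pi * 10 * t / fs)
            + 0.05 * math.sin(2 * math.pi * 400 * t / fs) for t in range(n)]
# Disconnected cable: no energy at all.
silent = [0.0] * n

print(cd_asserted(good), cd_asserted(degraded), cd_asserted(silent))
# -> True True False: the detector stays asserted even when the
#    high-frequency content needed to recover the video is gone.
```

The point is only qualitative: a detector keyed to low-frequency energy distinguishes "cable connected" from "cable disconnected", but cannot distinguish a recoverable signal from one whose high-frequency content has been lost, which matches the behavior observed in the question.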