
DRV5056: Accurate distance measurement considering the temperature drift

Part Number: DRV5056

Hi,

I would like to use a Hall sensor to measure the gap between a cylindrical magnet and the sensor (the gap ranges from 7 to 20 mm). For this, I plan to use the DRV5056. For my first tests I am using an N35 neodymium magnet; considering the size of the magnet and the gap range, this corresponds to a magnetic field between roughly 10 and 75 mT. Based on this, I selected the DRV5056A3. A rough sketch of how I estimated the field is shown below.
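
For reference, this is the kind of estimate I made, using the standard on-axis formula for an axially magnetized cylinder. The magnet dimensions and remanence in the example are assumptions for illustration only, not necessarily my exact magnet:

```python
import math

def b_axial_mT(gap_mm, radius_mm, length_mm, br_mT=1200.0):
    """On-axis flux density (mT) at gap_mm from the face of an axially
    magnetized cylinder of radius radius_mm, length length_mm and
    remanence br_mT, using
    B(z) = (Br/2) * [ (z+L)/sqrt((z+L)^2 + R^2) - z/sqrt(z^2 + R^2) ].
    """
    z, r, L = float(gap_mm), float(radius_mm), float(length_mm)
    return (br_mT / 2.0) * ((z + L) / math.sqrt((z + L) ** 2 + r ** 2)
                            - z / math.sqrt(z ** 2 + r ** 2))

if __name__ == "__main__":
    # Assumed example geometry: 10 mm diameter x 10 mm long N35, Br ~ 1.2 T.
    for gap in (7, 10, 15, 20):
        print(f"gap = {gap:2d} mm  ->  B ~ {b_axial_mT(gap, 5.0, 10.0):.1f} mT")
```

With these assumed dimensions the field comes out around 90 mT at 7 mm and around 10 mT at 20 mm, which is the order of magnitude I quoted above.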

The problem is that the sensor will be used outdoors at temperatures between 15 and 35 degrees, so I would like to better understand the issues related to temperature drift. There is a drift due to the influence of temperature on the magnet, which the A versions of the DRV5056 can compensate for. My question, however, is about the output voltage temperature drift (the effect of temperature on the Hall element itself).

The datasheet gives a value of 40 mV for the quiescent voltage temperature drift, but that is the absolute maximum over the full temperature range. Is it possible to make a theoretical estimate of the output voltage drift as a function of temperature when the sensor is exposed to a fixed magnetic field?

Thanks in advance. Regards,

Alex

  • Hello Alexandre,

    Thanks for considering Texas Instruments.  The output voltage depends on the sensitivity (S) and the sensitivity temperature compensation for magnets (STC) specifications in the magnetic characteristics table.  For your device you can expect the output to be approximately V_OUT = V_Q + B * S * (1 + (T - 25°C) * STC / 100%), where B is the applied magnetic field.  As for calculating the magnetic field itself, we do have some calculator tools on the product page here that you can use.
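
    As a quick sketch of that formula in code, the numbers below (V_Q, S, STC) are placeholders for illustration only; take the exact values for your device variant and supply voltage from the DRV5056 datasheet:

    ```python
    # V_OUT = V_Q + B * S * (1 + (T - 25 degC) * STC / 100%)
    # Placeholder values shown; confirm V_Q, S and STC against the datasheet.

    def drv5056_vout(b_mT, temp_C, vq_V=0.6, s_mV_per_mT=50.0, stc_pct_per_C=0.12):
        """Expected output voltage (V) for field b_mT at temperature temp_C."""
        s_comp = s_mV_per_mT * (1.0 + (temp_C - 25.0) * stc_pct_per_C / 100.0)
        return vq_V + (s_comp * b_mT) / 1000.0  # mV -> V

    if __name__ == "__main__":
        # Fixed field of 40 mT over the 15..35 degC range from the question.
        for t in (15, 25, 35):
            print(f"T = {t} degC  ->  V_OUT ~ {drv5056_vout(40.0, t):.3f} V")
    ```

    Sweeping the temperature with a fixed field like this gives you the theoretical output drift you asked about, on top of which the quiescent voltage drift limit from the datasheet still applies.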