DRV5055: About sensor linearity

Part Number: DRV5055

I have two questions regarding the statement that the sensor linearity error of the DRV5055 is approximately ±1%.
1. The datasheet states that linearity error is defined as the difference in sensitivity between any two positive B values or between any two negative B values. Will the error change if the device is used only in the positive range? Or does the error change when used only in the negative range?

2. To consider the worst case of error, suppose the two arbitrary B values are 100 mT and 101 mT, the error is 0% at 100 mT and -1% at 101 mT, and the sensor sensitivity is 7.5 mV/mT.
In that case, is the calculation correct that the output voltage for 100 mT is 750 mV with 0% error, and 749.925 mV for 101 mT with -1% error? Is it possible that the output voltage decreases even though the magnetic flux increases?
Or is it correct to think that the output voltage at 101 mT is 757.425 mV, that is, the 750 mV output at 100 mT plus the difference between the two B values times the reduced sensitivity: 1 mT * 7.5 mV/mT * 0.99 = 7.425 mV?
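
For reference, here is a minimal Python sketch contrasting the two readings of question 2 (the values and helper names are illustrative assumptions, not from the datasheet):

```python
# Two readings of "-1% error at 101 mT", assuming 7.5 mV/mT full-range sensitivity.
S_FULL = 7.5  # mV/mT, assumed full-range sensitivity

def v_out_absolute(b_mt, error_pct):
    """Reading 1: the error scales the entire output at this field."""
    return b_mt * S_FULL * (1 + error_pct / 100.0)

def v_out_incremental(b_ref_mt, v_ref_mv, b_mt, error_pct):
    """Reading 2: the error scales only the sensitivity over the
    increment from a reference point (b_ref_mt, v_ref_mv)."""
    return v_ref_mv + (b_mt - b_ref_mt) * S_FULL * (1 + error_pct / 100.0)

print(v_out_absolute(100, 0))                  # ~750.0 mV
print(v_out_absolute(101, -1))                 # ~749.925 mV (output would drop)
print(v_out_incremental(100, 750.0, 101, -1))  # ~757.425 mV (output still rises)
```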

  • Hello Ayaki,

    1 - It's possible that the linearity for the positive and negative ranges is not identical. However, the linearity error should typically be at or around ±1% in either range.

    2 - One way I have seen linearity calculated for this situation is the following:

    linearity error (%) = [ (sensitivity between 100 mT and 101 mT) / (full-range {negative or positive} sensitivity) - 1 ] * 100

    So, for a -1% linearity error from 100 mT to 101 mT, assuming a full-range sensitivity of 7.5 mV/mT, the sensitivity over that span would be (-1/100 + 1) * 7.5 mV/mT = 7.425 mV/mT.
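
    As a quick numeric check of the formula above (a sketch, using the assumed 7.5 mV/mT full-range sensitivity):

    ```python
    # Sensitivity over the 100..101 mT span implied by a -1% linearity error.
    s_full = 7.5                             # mV/mT, assumed full-range sensitivity
    error_pct = -1.0
    s_local = (error_pct / 100.0 + 1.0) * s_full
    print(s_local)                           # ~7.425 mV/mT

    # Inverting: recover the linearity error from the two sensitivities.
    print((s_local / s_full - 1.0) * 100.0)  # ~-1.0 %
    ```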

    If operating at 3.3 V and the device exhibits no offset (Vq = 3.3 V / 2 exactly), you would expect to see 1.65 V + 750 mV = 2.4 V at 100 mT, and then 2.4 V + 1 mT * 7.425 mV/mT = 2.407425 V at 101 mT (neglecting noise, or assuming sufficient averaging). So the output still increases with increasing field; it just increases slightly more slowly over that span.
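
    Putting the numbers together (a sketch assuming Vcc = 3.3 V, zero offset error, and the sensitivities above):

    ```python
    # Expected output at 100 mT and 101 mT, assuming Vcc = 3.3 V, Vq = Vcc/2
    # exactly, 7.5 mV/mT full-range sensitivity, and -1% linearity error
    # over the 100..101 mT span.
    VQ = 3.3 / 2                 # quiescent output, V
    s_full = 7.5e-3              # V/mT
    s_local = 7.425e-3           # V/mT over the 100..101 mT span

    v_100 = VQ + 100 * s_full    # ~2.4 V
    v_101 = v_100 + 1 * s_local  # ~2.407425 V
    print(v_100, v_101)
    ```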