TMAG5173-Q1: Push Button Application

Part Number: TMAG5173-Q1

Hello,

We want to make Hall-effect push buttons that are safety compliant. I have tried this sensor and my measurements are:

  • -9 mT when not pressed
  • -27 mT when pressed

Is it suitable for a push button application?

We want this product to work over –40°C to 85°C. How do I determine the LOW and HIGH states? What will the tolerance range be?

How do I calibrate the states during production at 25°C?

The magnet is an axially magnetized SmCo cylinder, 3 mm diameter, 3 mm height, with a 2.6 mm stroke.

The sensor has different ranges. Which range should I use?

Thanks

  • Mesut,

    If your assembly procedures and magnet qualities have good repeatability, then you should be able to expect fairly consistent behavior from sensor to sensor. However, the most reliable approach is typically to calibrate each unit: measure both the idle and pressed conditions and store thresholds in the controller to ensure enough margin across temperature.
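
    A minimal sketch of that calibration flow is below. The read_field_mT(), wait_for_operator(), and store_nv() helpers are hypothetical platform hooks (your own I2C driver and nonvolatile storage, not TI library calls), and the 1/3 - 2/3 threshold placement is just one reasonable choice:

    ```c
    /* One-time production calibration at 25 degC: measure both button
     * states and place the press/release thresholds between them so that
     * temperature drift and part-to-part variation still leave margin.   */

    /* Hypothetical platform hooks -- replace with your own implementations. */
    extern float read_field_mT(void);     /* one Z-axis sample from the TMAG5173-Q1, in mT */
    extern void  wait_for_operator(void); /* block until the operator presses the button   */
    extern void  store_nv(float press_thr_mT, float release_thr_mT); /* save to NV memory  */

    void calibrate_button(void)
    {
        float idle = read_field_mT();        /* e.g. about  -9 mT, button released */
        wait_for_operator();
        float pressed = read_field_mT();     /* e.g. about -27 mT, button pressed  */

        float span = pressed - idle;         /* about -18 mT in this example       */

        /* Thresholds at 1/3 and 2/3 of the travel: margin from each rest
         * state plus a hysteresis band between press and release.         */
        float release_thr = idle + span / 3.0f;        /* about -15 mT */
        float press_thr   = idle + 2.0f * span / 3.0f; /* about -21 mT */

        store_nv(press_thr, release_thr);
    }
    ```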

    Across temperature you will find that the magnet gets weaker as it heats up, but you are also able to enable temperature compensation in the device to counter this behavior. For example, Neodymium magnets typically derate at about -0.12%/°C, and if compensation is enabled in the device, the sensitivity gain can be set to increase at +0.12%/°C. This should help keep the output code measured by the sensor relatively constant.
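
    As a rough sketch, enabling that compensation is just a read-modify-write of one configuration field over I2C. Every constant below is a placeholder: take the real device address, register, MAG_TEMPCO field position, and the code for your magnet material from the TMAG5173-Q1 register map, and swap i2c_read_reg()/i2c_write_reg() for your platform's I2C driver.

    ```c
    #include <stdint.h>

    /* Platform I2C hooks (placeholders, not TI driver calls). */
    extern uint8_t i2c_read_reg(uint8_t dev_addr, uint8_t reg);
    extern void    i2c_write_reg(uint8_t dev_addr, uint8_t reg, uint8_t val);

    /* Placeholder values -- look these up in the TMAG5173-Q1 register map. */
    #define TMAG5173_I2C_ADDR   0x35u  /* placeholder device address            */
    #define REG_DEVICE_CONFIG   0x00u  /* placeholder config register           */
    #define MAG_TEMPCO_MASK     0x60u  /* placeholder field mask                */
    #define MAG_TEMPCO_SETTING  0x20u  /* placeholder code for your magnet type */

    /* Select the magnet tempco so the sensitivity gain rises as the magnet
     * weakens with temperature and the output code stays roughly constant. */
    void enable_magnet_tempco(void)
    {
        uint8_t cfg = i2c_read_reg(TMAG5173_I2C_ADDR, REG_DEVICE_CONFIG);
        cfg = (uint8_t)((cfg & (uint8_t)~MAG_TEMPCO_MASK) | MAG_TEMPCO_SETTING);
        i2c_write_reg(TMAG5173_I2C_ADDR, REG_DEVICE_CONFIG, cfg);
    }
    ```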

    To set thresholds, it is helpful to have a large change between states. From your details I would expect that you are seeing a large enough change without needing to make adjustments to the strength of the magnet. Typically it is good practice to define a press threshold and a release threshold with some hysteresis region between them. Designing in hysteresis will help prevent thrashing on the output, where the sensor switches back and forth. It is also helpful to remember that the field strength falls off nonlinearly with distance (roughly with the square or cube of the distance, depending on geometry): you will see small changes in the field at first, but the rate of change will increase as the magnet gets closer.
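
    For illustration, the press/release decision with hysteresis can be as small as the sketch below. The threshold numbers are only example values based on the readings in this thread; in practice they would come from your per-unit calibration.

    ```c
    #include <stdbool.h>

    /* Example thresholds only (the field is negative in this application):
     * roughly -9 mT released and -27 mT pressed, so the press threshold
     * sits well below the release threshold and the gap between them is
     * the hysteresis band.                                                */
    #define PRESS_THRESHOLD_MT   (-21.0f)
    #define RELEASE_THRESHOLD_MT (-15.0f)

    /* Update the reported button state from the latest field sample.  The
     * state only changes when the field crosses all the way through the
     * hysteresis band, so noise near either threshold cannot make the
     * output thrash back and forth.                                       */
    bool update_button(float field_mT, bool currently_pressed)
    {
        if (!currently_pressed && field_mT <= PRESS_THRESHOLD_MT)
            return true;                  /* crossed the press threshold   */
        if (currently_pressed && field_mT >= RELEASE_THRESHOLD_MT)
            return false;                 /* crossed the release threshold */
        return currently_pressed;         /* inside the band: hold state   */
    }
    ```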

    To select a range, I would recommend using the lowest range possible that does not cause the sensor to saturate. This will provide better resolution if measurements are needed between the threshold positions. So in your case the ±40 mT range looks appropriate.
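
    For reference, converting the device's signed conversion result to mT is just a divide by the sensitivity for the selected range; the sketch below leaves that sensitivity as a parameter to be taken from the Magnetic Characteristics table (and assumes a 16-bit signed result, which is worth confirming in the datasheet for your configuration).

    ```c
    #include <stdint.h>

    /* Convert a raw signed conversion result to mT.  sens_lsb_per_mt is the
     * LSB/mT sensitivity for the range you selected, from the datasheet's
     * Magnetic Characteristics table; a lower range gives more LSBs per mT,
     * i.e. finer resolution between the two button states.                  */
    static inline float raw_to_mT(int16_t raw, float sens_lsb_per_mt)
    {
        return (float)raw / sens_lsb_per_mt;
    }
    ```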

    Thanks,

    Scott

  • Hi Scott,

    Thank you for your answer.
    I will be using the SmCo temperature-compensation setting on the ±40 mT range variant.

    I have an additional question. I have made calculations for the minimum tolerance according to Section 6.7, Magnetic Characteristics For A1, B1, C1, D1, in the datasheet.

    Boff_TC_A1: ±7.85 µT/°C referenced to 25°C; ΔT = 25 - (-40) = 65°C, so 65 × 7.85 = 510.25 µT
    Boff_A1: 700 µT
    Boff_DR_A1: 100 µT
    SENSLDR_A1: ±3.74%, which is 336 µT at -9 mT and about 1 mT at -27 mT

    The sum of all of that gives the minimum tolerance (also written out in the quick sketch after the list):

    • -9 mT -> 1646 µT -> 1.64 mT
    • -27 mT -> 2310 µT -> 2.31 mT
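
    For reference, here is the same worst-case sum written out as a quick sketch (the input numbers are just my reading of the datasheet table above):

    ```c
    #include <stdio.h>

    /* Worst-case error budget at -40 degC for the A1 (+/-40 mT) variant. */
    int main(void)
    {
        const float boff_tc_uT_per_C = 7.85f;            /* offset drift over temperature   */
        const float delta_T_C        = 25.0f - (-40.0f); /* 25 degC calibration to -40 degC */
        const float boff_uT          = 700.0f;           /* initial offset                  */
        const float boff_dr_uT       = 100.0f;           /* lifetime offset drift           */
        const float sens_err         = 0.0374f;          /* +/-3.74% sensitivity error      */

        float common_uT = boff_tc_uT_per_C * delta_T_C + boff_uT + boff_dr_uT;

        float idle_err_uT    = common_uT + sens_err * 9000.0f;   /* at  -9 mT: roughly 1.6 mT */
        float pressed_err_uT = common_uT + sens_err * 27000.0f;  /* at -27 mT: roughly 2.3 mT */

        printf("idle: %.0f uT, pressed: %.0f uT\n", idle_err_uT, pressed_err_uT);
        return 0;
    }
    ```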

    Are my calculations correct?

  • Mesut,

    Looks like you're on the right track. While it is not likely you'll ever hit all of these maximum values simultaneously, it appears that if you set thresholds with at least 3 mT of margin on either side you would be able to differentiate the result across lifetime and temperature. Given that your input spans 18 mT between states, you could probably set even wider margins comfortably; for example, thresholds near -15 mT and -21 mT would leave about 6 mT of margin from each rest state plus a 6 mT hysteresis band, all well beyond the worst-case numbers above.

    SmCo magnets normally have a very low tempco, and the device does have a compensation setting for these as well.

    Thanks,

    Scott