Other Parts Discussed in Thread: TMAG5170
Hi all,
I have made some measurements with the TMAG5170_EVM to study a high-precision measurement system.
I captured 50 successive samples on the X axis (+/-100 mT range, 32x conversion averaging, X-axis-only acquisition, Ta = 20°C, no magnet in proximity) and imported the results into Excel.
The difference between the MAX and MIN samples is 85 uT, and Excel calculates a standard deviation of 20 uT.
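For reference, here is a minimal Python sketch of the statistics I computed in Excel (the `samples_uT` array below is a hypothetical stand-in for my 50 exported values):

```python
import numpy as np

# Hypothetical placeholder for the 50 X-axis samples exported from the EVM GUI,
# in microtesla; substitute the real exported values here.
samples_uT = np.random.normal(loc=0.0, scale=20.0, size=50)

peak_to_peak = samples_uT.max() - samples_uT.min()  # MAX - MIN, ~85 uT in my capture
sigma = samples_uT.std(ddof=1)                      # sample standard deviation, ~20 uT

print(f"peak-to-peak: {peak_to_peak:.1f} uT")
print(f"1-sigma:      {sigma:.1f} uT")
```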
1/ Does this mean that the 1-sigma "noise" is 20 uT in my case?
2/ How should I compare this standard deviation with the RMS (1-sigma) magnetic noise of +/-0.025 mT mentioned in the datasheet? For a worst-case analysis, do I multiply 0.025 by 6.6 (the usual rule of thumb multiplies the RMS value by 6.6 to estimate the peak-to-peak value)? See the first sketch after this list.
3/ It seems that some people use 3 or 5 sigma (RMS or peak?) to state the overall precision of a system. What is your opinion?
4/ To understand how the TMAG5170 works: is it correct that the internal ADC is 16-bit and that Conv_Avg averages successive samples? (See the second sketch after this list.)
5/ I noticed that the noise (difference between MAX and MIN samples) is about twice as large in the +/-25 mT range. Could this be random chance? I would have expected the higher range, not the lower one, to show more noise.
6/ I noticed that the noise seems to be the same for 1-axis and 3-axis measurements. Given how the TMAG5170 operates, is this expected?
7/ Is the RMS (1-sigma) magnetic noise mentioned in the datasheet really independent of the range? If so, a higher range gives better effective resolution (ENOB). Furthermore, the TMAG5170-A2, which has a higher range and roughly the same noise, should be that much better as well. Can you confirm? (See the second sketch after this list.)
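To make question 2 concrete, here is the arithmetic I have in mind. It assumes Gaussian noise; the 6.6 factor then corresponds to +/-3.3 sigma, i.e. roughly 99.9% of samples:

```python
rms_noise_mT = 0.025                  # datasheet RMS (1-sigma) magnetic noise
peak_to_peak_mT = 6.6 * rms_noise_mT  # rule-of-thumb peak-to-peak estimate
print(peak_to_peak_mT)                # 0.165 mT, i.e. roughly +/-0.0825 mT worst case
```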
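And for questions 4 and 7, my back-of-envelope estimates. These assume the datasheet RMS figure applies to a single conversion, that Conv_Avg averages N independent samples (so white noise shrinks by sqrt(N)), and use the simple log2(span/RMS) definition of effective resolution, which may differ from how the datasheet defines ENOB. Please correct these assumptions if the TMAG5170 behaves differently:

```python
import math

rms_noise_mT = 0.025  # datasheet RMS (1-sigma) noise, assumed range-independent

# If Conv_Avg averages 32 independent samples, white noise should drop by sqrt(32).
print(rms_noise_mT / math.sqrt(32) * 1000)  # ~4.4 uT expected RMS after 32x averaging

# If the RMS noise really is range-independent, the wider range gains effective bits.
for span_mT in (50, 200):  # +/-25 mT and +/-100 mT full-scale spans
    print(span_mT, math.log2(span_mT / rms_noise_mT))  # ~11.0 and ~13.0 effective bits
```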
Sorry for the long list of questions. I hope you can answer them; this TMAG looks very promising.
Best regards.