Hi all,
I have to design a current-sense circuit whose output is read by an ADC. I chose a delta-sigma architecture for this and had the AMC1035 in mind, but then realized it is recommended for voltage- and temperature-sensing applications. As far as I understand, the main difference between delta-sigma ADCs for voltage sensing and those for current sensing is the input impedance:
Low input impedance (4.9 kOhm for the AMC1306):
- Suited to a low signal-source impedance, which is usually the case with a shunt resistor, since the shunt must stay small to limit its power loss (numeric comparison below).
- Enables fast settling (small input time constant).
High input impedance (1.6 GOhm for the AMC1035):
- Suited to a high signal-source impedance, which is typically the case in voltage sensing with a simple resistive divider from the HV rail.
- Slow settling (large input time constant).
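To put numbers on the source-impedance point, here is a quick sketch (Python) of the gain error caused by the source resistance forming a divider with the modulator input. The input impedances are the datasheet values quoted above; the source resistances are made-up examples:

```python
# Gain error caused by the source resistance forming a divider with the
# modulator's input impedance. Input impedances are the datasheet values
# quoted above; the source resistances are hypothetical examples.

R_IN = {"AMC1306": 4.9e3, "AMC1035": 1.6e9}  # input impedance, ohms

def gain_error_pct(r_source, r_in):
    # Fraction of the signal lost to the divider r_in / (r_source + r_in).
    return 100.0 * r_source / (r_source + r_in)

for r_src in (10.0, 1e3, 100e3):  # hypothetical source impedances, ohms
    for part, r_in in R_IN.items():
        print(f"R_src = {r_src:8.0f} ohm, {part}: "
              f"{gain_error_pct(r_src, r_in):.5f} % gain error")
```

With a 1 kOhm source, the 4.9 kOhm input already swallows roughly 17 % of the signal, while the 1.6 GOhm input is essentially unaffected.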
The current I measure is already scaled down to the range -200 mA ... +200 mA, so power loss in the shunt is not a problem, and I would like the input voltage range to be as large as possible to optimize accuracy. That is why I would like to use the AMC1035 with its +-1 V input instead of, for example, a +-50 mV shunt-input device. What I cannot figure out is how to determine whether the interplay between the settling time of the analog circuitry and the modulator clock frequency is acceptable, or whether my original signal will be distorted.
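To make the question concrete, this is roughly how I picture the check. I am assuming a first-order RC input network and a settle-within-one-modulator-clock criterion; the component values and f_mod are hypothetical placeholders, and whether this criterion is even the right one is exactly what I am asking:

```python
import math

# Rough settling check for the analog front end driving the modulator.
# Assumption: the input network behaves like a first-order RC, so settling
# to within 1/2 LSB at N bits takes about N*ln(2) time constants.

r_eq = 1e3       # hypothetical equivalent source + filter resistance, ohms
c_filt = 1e-9    # hypothetical input filter capacitance, farads
tau = r_eq * c_filt                    # time constant, seconds

n_bits = 16
t_settle = n_bits * math.log(2) * tau  # ~11 tau for 16-bit settling

f_mod = 10e6                           # assumed modulator clock, Hz (TBD)
t_clk = 1.0 / f_mod

print(f"tau = {tau * 1e9:.0f} ns, {n_bits}-bit settling = {t_settle * 1e6:.2f} us")
print(f"modulator clock period = {t_clk * 1e9:.0f} ns")
print("OK" if t_settle <= t_clk else "settling spans several modulator clocks")
```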
Current range: -160 mA ... +160 mA (from the amplifier)
Output data rate: max. 50 kSPS
Modulator clock frequency: TBD
Filter/OSR: sinc3/TBD (see the bookkeeping sketch below)
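For reference, the digital-side relationships I am working from: data rate = f_mod / OSR, the usable signal bandwidth is at most half the data rate, and a sinc3 filter needs three full output periods to settle after a step. A small bookkeeping sketch with an assumed 10 MHz modulator clock (f_mod is still TBD on my side):

```python
# Sinc3 / OSR bookkeeping: output data rate and digital filter settling.
# f_mod is an assumed example value; OSR is chosen to hit the 50 kSPS target.

f_mod = 10e6                 # assumed modulator clock, Hz
osr = 200                    # oversampling ratio: 10 MHz / 200 = 50 kHz

f_data = f_mod / osr         # output data rate of the sinc3 filter
t_settle = 3.0 / f_data      # sinc3 needs 3 output periods to fully settle
f_sig_max = f_data / 2       # Nyquist limit on the signal bandwidth

print(f"data rate        = {f_data / 1e3:.1f} kSPS")
print(f"sinc3 settling   = {t_settle * 1e6:.0f} us")
print(f"signal bandwidth < {f_sig_max / 1e3:.1f} kHz")
```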
Could anyone help me understand these relationships on the analog side?
Thank you in advance!
Antoine