Hi,
I'd like my math reviewed, since a mistake is easily made.
My goal is to have a voltage DAC control the LMR33640's output between +8V and +24V, with an input voltage of 24.3V. The DAC can output between 0 and 2.048V and can sink/source 25mA max.
I'll set my default output roughly halfway, at 15.7V, and then source or sink current into the feedback node with the DAC. This default output is set by a 100k top resistor (RFBT) and a 6k8 bottom resistor (RFBB).
Adding a 10k resistor from the FB node to GND raises the output to 25.7V; the same thing happens with the DAC set to 0V driving through a 10k resistor. I'm not sure how to handle the case where the DAC sources current into the FB node (it then looks like a negative resistance in parallel with RFBB); can I just assume the response is linear?
Thus applying 2V through a 10k resistor results in a voltage of 5.7V?
(Vref = 1.0V)
15.7V = ((100000Ω / 6800Ω) + 1) * 1.0V
25.7V = ((100000Ω / (6800Ω^-1 + 10000Ω^-1)^-1) + 1) * 1.0V
5.7V = 1.0V + (1.0V / 6800Ω - (2.0V - 1.0V) / 10000Ω) * 100000Ω (by superposition: the DAC at 2V injects 0.1mA into the FB node, not simply the inverse of the above)
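As a sanity check, the feedback math can be verified numerically. This is a minimal sketch using the values above (Vref = 1.0V, RFBT = 100k, RFBB = 6k8, 10k DAC injection resistor); the DAC is modeled as an ideal voltage source, ignoring its output impedance:

```python
def vout(vdac, vref=1.0, rfbt=100e3, rfbb=6.8e3, rdac=10e3):
    """Output voltage of the regulator with a DAC injecting
    current into the FB node through rdac (superposition)."""
    i_fbb = vref / rfbb            # current from FB node into RFBB
    i_dac = (vdac - vref) / rdac   # current the DAC pushes into FB
    i_fbt = i_fbb - i_dac          # remainder must flow through RFBT
    return vref + i_fbt * rfbt

print(vout(0.0))   # ~25.7 V (DAC at 0 V, equivalent to 10k to GND)
print(vout(1.0))   # ~15.7 V (DAC at Vref: no injected current)
print(vout(2.0))   # ~5.7 V  (DAC sourcing 0.1 mA into FB)
```

Note that vout() is linear in vdac with slope -RFBT/RDAC = -10 V/V, which answers the "can I just assume it's linear" question: yes, the DAC sourcing case needs no special handling.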
A 12bit DAC (0.5mV per code over 2.048V) moves the output by 5mV per code through the 10k resistor, giving about 3200 steps between 8.0V and 24V.
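The resolution claim can be sketched the same way (assuming a 12-bit DAC with a 2.048V full scale and the 100k/10k resistor pair above):

```python
lsb = 2.048 / 4096        # DAC step size: 0.5 mV per code
gain = 100e3 / 10e3       # |dVout/dVdac| = RFBT / RDAC = 10
vout_step = lsb * gain    # output moves 5 mV per DAC code
steps = (24.0 - 8.0) / vout_step
print(vout_step, steps)   # 5 mV per step, ~3200 usable steps
```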