Hi,
we are designing a step-down supply from 5 V to 3.3 V (converter 1) and to 1.2 V (converter 2) using the TPS62420-Q1. The schematic is as follows:

[schematic image]
The question we have is this: we noticed that the input current of the DEF_1 pin is specified at a maximum of 1 µA. Since the regulator servos DEF_1 to its reference, that leakage has to be supplied through the top divider resistor, shifting the output by I_leak × R_top. Combining this maximum input current with the resistors shown above (which are within the range suggested by the datasheet) gives us a worst-case output-voltage tolerance of 10.2% for the 3.3 V rail and 16.9% for the 1.2 V rail. This is completely unacceptable for our design.
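For transparency, here is the worst-case arithmetic as a minimal Python sketch. The 0.6 V reference and the divider values are illustrative assumptions on my part, chosen only to land near the nominal outputs with datasheet-range resistors (our exact values are in the schematic above):

# Worst-case output-voltage error of a resistive feedback divider when
# the sense pin draws a leakage current. The regulator servos the pin
# to vref, so the leakage must be supplied through the top resistor:
#     vout = vref * (1 + r_top / r_bot) + i_leak * r_top
def worst_case_error(vref, r_top, r_bot, i_leak):
    vout_nom = vref * (1.0 + r_top / r_bot)  # ideal output, zero leakage
    dv = i_leak * r_top                      # output shift caused by leakage
    return vout_nom, dv / vout_nom           # nominal volts, relative error

# TPS62420-Q1 rails: 1 uA is the datasheet maximum DEF_1 input current;
# the 0.6 V reference and resistor values are assumed for illustration.
for rail, r_top, r_bot in (("3.3 V", 360e3, 80e3), ("1.2 V", 200e3, 200e3)):
    vout, err = worst_case_error(0.6, r_top, r_bot, 1e-6)
    print(f"{rail} rail: nominal {vout:.2f} V, leakage error up to {err:.1%}")

With these assumed values the sketch prints roughly 10.9% and 16.7%, the same ballpark as the figures we computed from our actual schematic.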
Now, we could lower the values of resistors R327, R329, R331 and R333 by, say, a factor of 100, which would give us acceptable output-voltage accuracy, but that would be inconsistent with the typical values shown in the datasheet (and I worry it could affect stability and/or transient performance, since none of this is documented in the datasheet).
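As a quick check with the sketch above, dividing the assumed resistor values by 100 shows the trade-off:

# Same assumed 3.3 V case with the divider scaled down by 100x.
vout, err = worst_case_error(0.6, 3.6e3, 800.0, 1e-6)
print(f"error up to {err:.2%}")  # ~0.11%
print(f"divider current ~{vout / (3.6e3 + 800.0) * 1e3:.2f} mA")  # ~0.75 mA

so the accuracy recovers, but the divider then burns roughly 0.75 mA from the output, which seems at odds with a low-quiescent-current converter.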
In comparison, using the LM43602-Q1 (which has a maximum FB leakage current of 65 nA and typical divider values of 10 kΩ to 100 kΩ), we can easily end up with tolerances of 0.1% to 0.2%...
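Running the same sketch with the LM43602-Q1 numbers illustrates the gap; 65 nA is the datasheet maximum FB current, while the 1.011 V reference and the 100 kΩ / 44.2 kΩ divider are again assumptions on my part:

# LM43602-Q1, 3.3 V rail: same formula, much lower pin leakage.
vout, err = worst_case_error(vref=1.011, r_top=100e3, r_bot=44.2e3, i_leak=65e-9)
print(f"3.3 V rail: nominal {vout:.2f} V, leakage error up to {err:.2%}")

which prints an error of roughly 0.20%, consistent with the range quoted above.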
Does this make any sense? Is the maximum input current of DEF_1 really 1 µA, or is it just a "convenient" value that was placed in the datasheet? Is there any way to be sure that the regulator will have adequate stability and accuracy?
Regards,
Marion Nourry