Hello,
I'm testing a new design based on the FDC2214 and I am tuning the drive current setting.
We use a 39 pF capacitor and a 22 µH inductor. (We want to measure roughly 0-100 pF.)
We use only one channel, with a 32 MHz external clock.
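As a sanity check on the operating range, here is a quick sketch of the LC tank's resonant frequency over the target 0-100 pF range, assuming the external capacitance simply adds in parallel to the 39 pF tank capacitor (parasitics ignored):

```python
import math

def sensor_freq(l_henry, c_farad):
    """Resonant frequency of the LC tank: f = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(l_henry * c_farad))

L = 22e-6          # 22 uH tank inductor
C_FIXED = 39e-12   # 39 pF tank capacitor

# Sweep the assumed external (measured) capacitance
for c_ext in (0e-12, 50e-12, 100e-12):
    f = sensor_freq(L, C_FIXED + c_ext)
    print(f"Cext = {c_ext * 1e12:5.0f} pF -> fSENSOR = {f / 1e6:5.2f} MHz")
```

With these values the tank runs from about 5.4 MHz (no load) down to about 2.9 MHz at 100 pF, so the oscillation amplitude (and hence the required drive current) will vary considerably across the measurement range.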
With no load attached, I can tune the amplitude well within the 1.2 V - 1.8 V range (the best setting is about 13, i.e. 108 µA).
When I connect some capacitance to the output, the amplitude increases; even at setting 1 (18 µA) the amplitude is about 2.2 V, which is higher than the datasheet recommends.
I do use a driven shield: a BUF634AIDR buffers the sensor signal, and the buffered output drives the shield of the coaxial cable that carries the signal. The driven-shield signal is a good representation of the input signal with minimal phase shift.
What could we adjust to bring the output amplitude into the correct range?
Thanks in advance.