I intend to use the TDC7200 as a time-to-digital converter in a time-tagging frequency counter. The device will be used in Measurement Mode 1, with a clock frequency of 12.375 MHz and start-to-stop times ranging from 80 ns to 160 ns. After lengthy study of the data sheet, I concluded that a calibration cycle is performed automatically after every measurement cycle. The maximum measurement rate I want to use is 100k measurements per second, and I will be using an SPI clock rate of 20 MHz. My questions are:
1/- Is it necessary to read and calculate new calibration constants for every measurement? What degradation in accuracy will result if the calibration constants are recalculated at a slower rate, say every 10 or 100 measurement cycles? Is the drift in the uncalibrated measurement mainly temperature-related?
2/- Considering that my maximum time measurement is 160 ns, is there any advantage in setting the second calibration measurement (CALIBRATION2 periods) to more than 2 clock cycles? The arithmetic I am assuming is the first sketch after this list.
3/- Could you please confirm that I can achieve a throughput of 100k measurements per second with the above parameters? My rough SPI bit budget is the second sketch after this list.
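For reference on question 2, this is the Measurement Mode 1 arithmetic I am assuming from the data sheet (calCount = (CALIBRATION2 − CALIBRATION1) / (CALIBRATION2_PERIODS − 1), normLSB = clockPeriod / calCount, TOF = TIME1 × normLSB). The register readings below are invented placeholders chosen to be roughly plausible, not measured values:

```c
#include <stdio.h>
#include <stdint.h>

/* TDC7200 Measurement Mode 1 arithmetic as I understand it from the data sheet:
 *   calCount = (CALIBRATION2 - CALIBRATION1) / (cal2Periods - 1)
 *   normLSB  = clockPeriod / calCount
 *   TOF      = TIME1 * normLSB
 * The register values below are placeholders for illustration only.
 */
int main(void)
{
    const double clockPeriod = 1.0 / 12.375e6;  /* reference period, ~80.81 ns */

    uint32_t time1        = 2911;   /* placeholder TIME1 reading (~160 ns TOF) */
    uint32_t calibration1 = 1470;   /* placeholder CALIBRATION1 (1 clock period) */
    uint32_t calibration2 = 2940;   /* placeholder CALIBRATION2 */
    int      cal2Periods  = 2;      /* CALIBRATION2 periods setting: 2, 10, 20 or 40 */

    double calCount = (double)(calibration2 - calibration1) / (cal2Periods - 1);
    double normLSB  = clockPeriod / calCount;   /* effective LSB in seconds */
    double tof      = time1 * normLSB;          /* Mode 1 time of flight */

    printf("calCount = %.1f counts per clock period\n", calCount);
    printf("normLSB  = %.3f ps\n", normLSB * 1e12);
    printf("TOF      = %.3f ns\n", tof * 1e9);
    return 0;
}
```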
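And as a sanity check on question 3, my rough per-measurement SPI bit budget at 20 MHz. I am assuming one 16-bit CONFIG1 write to start each conversion and reads of TIME1, CALIBRATION1 and CALIBRATION2 at 8-bit command plus 24-bit data each; it ignores any inter-byte gaps and, more importantly, the device's own measurement and calibration time, which is what I would like confirmed:

```c
#include <stdio.h>

/* Back-of-envelope SPI budget per measurement at 100k measurements/s.
 * Assumptions (mine, for sizing only):
 *   - one 16-bit CONFIG1 write to start each measurement
 *   - reads of TIME1, CALIBRATION1, CALIBRATION2: 8-bit command + 24-bit data each
 *   - no inter-byte gaps; the TDC7200's own measurement/calibration time is excluded
 */
int main(void)
{
    const double spiClockHz = 20e6;
    const double measRateHz = 100e3;

    int bitsPerMeasurement = 16            /* CONFIG1 write to start new measurement */
                           + 3 * (8 + 24); /* TIME1, CALIBRATION1, CALIBRATION2 reads */

    double spiTime = bitsPerMeasurement / spiClockHz;  /* bus time per measurement */
    double budget  = 1.0 / measRateHz;                 /* 10 us per measurement */

    printf("SPI bits per measurement : %d\n", bitsPerMeasurement);
    printf("SPI time per measurement : %.2f us\n", spiTime * 1e6);
    printf("Budget at 100k meas/s    : %.2f us\n", budget * 1e6);
    printf("Remaining for TOF + cal  : %.2f us\n", (budget - spiTime) * 1e6);
    return 0;
}
```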
Regards,
Cosmo Little