Hi TI,
I found that the time offset of the TDC7200 between the input time interval and the measured result is not constant in Measurement Mode 1, as shown below.
Input time interval (ns) | 12    | 50    | 100   | 200    | 500   | 1000  | 1500   |
Measured result (ns)     | 11.81 | 49.76 | 99.66 | 199.52 | 499.3 | 999.2 | 1499.2 |
Offset (ps)              | 190   | 240   | 340   | 480    | 700   | 800   | 800    |
Offset spread, max - min (ps) | 610 |
The offset varies by as much as 610 ps as the interval goes from 12 ns to 1.5 us, which can introduce a large measurement error even though the single-shot precision is only about ±50 ps.
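For what it's worth, here is my own quick least-squares check on the table above (not an official calibration procedure): fitting measured = a * input + b suggests the error looks more like a small gain error plus a fixed offset than a constant offset, although a plain linear correction still leaves residuals of well over 100 ps.

```python
# Quick sanity check on the numbers in the table above: fit a line
# measured = a * input + b and see how much of the error is a gain
# error (slope != 1) versus a fixed offset (intercept != 0).
import numpy as np

t_in   = np.array([12, 50, 100, 200, 500, 1000, 1500], dtype=float)     # ns
t_meas = np.array([11.81, 49.76, 99.66, 199.52, 499.3, 999.2, 1499.2])  # ns

a, b = np.polyfit(t_in, t_meas, 1)                 # least-squares line
residual_ps = (t_meas - (a * t_in + b)) * 1000.0   # residuals in ps

print(f"gain a = {a:.6f}  (gain error ~ {(1 - a) * 1e6:.0f} ppm)")
print(f"offset b = {b * 1000:.0f} ps")
print(f"worst residual after linear correction: {np.max(np.abs(residual_ps)):.0f} ps")
```

So even if I apply a per-board linear correction, the remaining nonlinearity is still large compared with the ±50 ps precision, which is why I would like to understand where the offset comes from.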
Why does this happen? I have measured three TDC7200 chips, and all of them show offsets of several hundred ps.
I'm sure the chips are properly configured and the input time intervals are accurate (each input interval was verified with a high-speed oscilloscope).
Looking forward to your reply.