We have been using the TLC2543 for many years with the same circuit and software, which respects all setup, hold and conversion times with margin. We now find that a number of recently manufactured units share the same fault: the TLC2543 outputs only 8 bits of precision followed by 8 zeros, rather than 12 bits of precision followed by 4 zeros, despite the 16-bit interface being selected in the command byte. The clock line has no glitches, we have examined every signal on the CRO, and we can find no fault; most units correctly output 12 bits of precision. What can cause this? We thought we might be reading a new channel too soon after the last, but padding our software to wait 12 µs from EOC high to /CS low achieved nothing. Everything looks normal except the output bits, and previous batches worked. We have also tried replacing the A/D with one from an older batch, to no avail.
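For anyone trying to reproduce this, the symptom can be checked in software as well as on the scope: OR together a run of result words captured while sweeping the analog input, and see which bits ever change. This is a minimal illustrative C sketch (significant_bits is our own hypothetical helper, not part of any driver):

```c
#include <stddef.h>
#include <stdint.h>

/* In the 16-clock MSB-first format the 12-bit result occupies the top
 * 12 bits of the word, followed by 4 pad zeros. OR together samples
 * taken while the analog input is swept: on a healthy unit bits 7..4
 * will toggle; on the faulty units the entire low byte stays zero. */
static int significant_bits(const uint16_t *samples, size_t n)
{
    uint16_t seen = 0;
    for (size_t i = 0; i < n; i++)
        seen |= samples[i];
    if (seen & 0x000F)
        return -1;                    /* pad bits set: framing problem */
    return (seen & 0x00F0) ? 12 : 8;  /* 8 = the fault described here  */
}
```

A single sample is not enough (a good unit can legitimately read 0xNN00), which is why the check accumulates over a swept input.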
Hello,
Please try the following. Take two boards, one that is working correctly and one that is not working correctly. Take the part from the good board and put it on the bad board. Take the part from the bad board and put it on the good board. Does the problem follow the part or the board?
Do you have scope wave forms of a bad part showing CS, clock, and data out that you can post? What is the value of the Input Data Byte that you are writing?
Mike
Thanks Mike. We swapped A/D chips and the problem stays with the board. The circuit is per the TLC2543 data sheet, with all the bypass caps, and is proven in previous products. On the scope the signals appear correct per data sheet page 11, Figure 13 (Timing for 16-Clock Transfer Using /CS with MSB First), with plenty of margin on setup, hold and conversion times. The SPI clock is 3.7 MHz with no sign of glitches, and its rising edge falls in the middle of each data-in bit. On the faulty units we can clearly see that the 16-bit output contains only 8 bits of precision as we vary any analog input. This isn't intermittent, and swapping in A/D chips from another batch doesn't fix it. No inputs are floating. At half scale in, the output is 0x8200 on every board with this fault. This is a weird one.
We send command 0x0C00 to read input AIN0. We discard the returned data and send 0x1C00 to read AIN1; the data returned now is the AIN0 result. It shows only 8 bits of precision, as we see on the scope. We aren't losing the other 4 bits in buggy software; the A/D really is only giving 8 bits of precision. I even tried the command 0x0C0C to see if it was somehow entering 8-bit interface mode, but the result wasn't 0x8282, it was still 0x8200.
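For reference, those command values follow from the TLC2543 input data byte layout: channel select in the top nibble, output length in bits 3..2 (11 selects 16-bit output), MSB/LSB-first in bit 1, unipolar/bipolar in bit 0. A hedged C sketch of the pipelined read sequence described above; spi_xfer16 is a hypothetical full-duplex transfer, stubbed here to return the faulty 0x8200 reading rather than talk to real hardware:

```c
#include <stdint.h>

/* TLC2543 input data byte (top byte of the 16-clock frame):
 * bits 7..4  channel select, bits 3..2 output length (11 = 16 bits),
 * bit 1 MSB/LSB first (0 = MSB), bit 0 unipolar/bipolar (0 = unipolar).
 * Channel 0 with 16-bit output is therefore 0x0C00, channel 1 0x1C00. */
#define TLC2543_LEN16    0x0Cu
#define TLC2543_CMD(ch)  ((uint16_t)((((ch) << 4) | TLC2543_LEN16) << 8))

/* Stub standing in for the real SPI driver in this sketch: assert /CS,
 * clock 16 bits MSB first, return what the A/D shifted out. Here it
 * just returns the faulty half-scale reading reported in the thread. */
static uint16_t spi_xfer16(uint16_t out)
{
    (void)out;
    return 0x8200u;
}

/* The result for a command arrives during the *next* frame, so two
 * transfers are pipelined, matching the sequence described above. */
static uint16_t tlc2543_read_ain0(void)
{
    (void)spi_xfer16(TLC2543_CMD(0));   /* start AIN0 conversion */
    /* wait for EOC high here before dropping /CS again */
    return spi_xfer16(TLC2543_CMD(1));  /* clocks out the AIN0 result */
}
```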
We can't see any difference on the scope between the signals on the units that output 16 bits containing 12 data bits and the units that output 16 bits containing only 8 data bits.
VCC is 4.960V. VREF+ is 4.995V. VREF- is 1.000V. Analog inputs have 250R to ground for current loop, then 1k protection to the pin.
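With those references, a returned word converts back to volts as VREF- + (code / 4096) * (VREF+ - VREF-), where code is the word shifted right by four to drop the pad zeros. A small illustrative C helper using the values measured above (the constants are our measurements, not data sheet figures):

```c
#include <stdint.h>

#define VREF_POS 4.995   /* measured VREF+ on these boards */
#define VREF_NEG 1.000   /* measured VREF- on these boards */

/* Drop the 4 trailing pad zeros, then scale the 12-bit code across the
 * external reference span: V = VREF- + (code / 4096) * (VREF+ - VREF-).
 * A correct half-scale word of 0x8000 (code 0x800) gives 2.9975 V. */
static double tlc2543_to_volts(uint16_t word)
{
    uint16_t code = (uint16_t)(word >> 4);
    return VREF_NEG + (code / 4096.0) * (VREF_POS - VREF_NEG);
}
```

For what it's worth, the faulty 0x8200 word decodes to about 3.03 V, near mid-scale but quantised far more coarsely than it should be.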
Tomorrow I shall try lowering the SPI clock frequency, but the high and low times are well within spec, so I'm not hopeful. Regards, Daniel.