We have been using the TLC2543 for many years with the same circuit and software, which respects all setup, hold and conversion times with margin. We now find that a number of recently manufactured units share the same fault: the TLC2543 outputs only 8 bits of precision followed by 8 zeros, instead of 12 bits of precision followed by 4 zeros, even though the 16-bit output length is selected in the command byte.

The clock line is clean, with no glitches. We have looked at every signal from every angle on the CRO and can find no fault, and most units correctly output 12 bits of precision. What can cause this?

We suspected we might be reading a new channel too soon after the last, so we padded our software to wait 12 us from EOC going high to /CS going low, but this achieved nothing. Everything looks normal except the output bits, and previous batches worked. We have also tried replacing the A/D with one from an older batch, to no avail.
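For reference, this is the frame layout we are relying on, as a minimal sketch. The helper names are ours, not TI's; the bit positions are from the TLC2543 datasheet: the command byte carries the channel in D7-D4, the output length in D3-D2 (11 = 16 clocks), MSB/LSB order in D1 and unipolar/bipolar in D0, and in 16-clock MSB-first mode a healthy part returns the 12-bit conversion followed by four zero bits.

```c
#include <stdint.h>

/* Data-length field (command-byte bits D3..D2): 11 = 16 clocks. */
#define TLC2543_LEN16 0x3u

/* Build the command byte: channel in D7..D4, length in D3..D2,
   D1 = 0 (MSB first), D0 = 0 (unipolar). */
static uint8_t tlc2543_cmd(uint8_t channel)
{
    return (uint8_t)((channel << 4) | (TLC2543_LEN16 << 2));
}

/* In 16-clock MSB-first mode the 12 data bits arrive first, padded
   with four trailing zeros, so the result is the frame shifted
   right by 4. */
static uint16_t tlc2543_result(uint16_t frame16)
{
    return (uint16_t)(frame16 >> 4);
}
```

For example, channel 3 with 16-bit output gives a command byte of 0x3C, and a good frame of 0xABC0 decodes to 0xABC. The failing parts return frames like 0xAB00, i.e. only the top 8 bits carry conversion data.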