Hello, I've built a few D/A converters around the PCM1792/1794 in the past and I am perhaps having some trouble realising the performance that these are supposed to be able to achieve. Also, as my knowledge is a little lacking when it comes to the internal workings of D/A converters, I don't know if what I am experiencing is typical of how they perform. So instead of making yet another PCB prototype I figured I'd post here in the hope that someone can help shed some new light on this.
First of all the implementation.
I have the master clock, bit clock, LR clock and the data line transmitted over a short CAT5 network cable via an LVDS implementation using TI's own SN65LVDS receivers and transmitters. The received clocks then pass through an ISO7240M chip to isolate the DAC side of things from the place the clocks are sent from. The clocks then feed into the PCM1792. These are all mounted on the same PCB to minimise EMI.
The PCB is double sided, with the copper bottom acting as a ground plane. This is completely uninterrupted, save for a couple of areas that have been removed to reduce stray capacitance in some parts of the design. All components relevant to the PCM1792 are mounted very close to the pins of the chip and low-ESR capacitors have been used throughout; the 0.1 uF decouplers are ceramic. There are no traces on the copper bottom.
The copper top is obviously used for component placement, but the area of the copper top directly beneath the PCM1792 is uninterrupted copper; that is, there is copper directly beneath the chip. The ground connections from the DAC chip are made directly to this copper on the top. To connect this copper area to the ground plane on the copper bottom I basically have something similar to a number of vias, only instead of the usual method used in automated PCB fabrication, I have drilled two 1.5mm holes directly beneath the DAC chip. Each hole carries a piece of 1.5mm solid copper rod that connects the copper bottom to the copper top beneath the DAC chip.
The I/V and difference amplifier stages are essentially identical to the datasheet implementation, except for two things. The first is the capacitor in parallel with the feedback resistor in the I/V converters: the capacitor is in series with a small resistor to decouple the output of the opamp from the capacitance. The second is that the capacitor in the difference amplifier's filter is larger, to lower the frequency at which the filter operates.
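For reference, the corner frequency set by the feedback resistor and its parallel capacitor is roughly f = 1/(2*pi*Rf*Cf). A minimal sketch of that estimate, using hypothetical component values rather than the ones on my board:

```python
from math import pi

def iv_corner_hz(r_fb_ohm, c_fb_farad):
    """-3 dB corner of an I/V stage's feedback R || C network."""
    return 1.0 / (2.0 * pi * r_fb_ohm * c_fb_farad)

# Hypothetical values, not the actual board's components:
print(round(iv_corner_hz(750.0, 2.2e-9)), "Hz")
```

The small series resistor mentioned above changes the high-frequency behaviour slightly but doesn't materially affect this first-order estimate.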
The PCM1792 is operating with a master clock of 256fs, and is controlled by a PIC24 via I2C. The system is set up so that I can alter the Delta-Sigma oversampling rate by remote control and I can also switch between the fast and slow rate digital low pass filter.
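At a fixed 256fs ratio the master clock scales directly with the sample rate, which is worth keeping in mind when comparing the 48/96/192 kHz results below. A quick sanity check:

```python
def mclk_hz(fs_hz, ratio=256):
    """Master clock frequency for a given sample rate at a fixed fs ratio."""
    return fs_hz * ratio

# 256fs master clocks for the three sample rates used here:
for fs in (48_000, 96_000, 192_000):
    print(fs, "->", mclk_hz(fs))  # 12.288, 24.576 and 49.152 MHz respectively
```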
First of all, the clocks are low jitter, as can be seen from the attached image; the noise floor is quite low too. So jitter is unlikely to be the cause of any performance issues.
Now some words on the distortion performance.
All of the following is with the DAC output set to around -1 dB.
Going from a 48 kHz sample frequency to 96 kHz results in a doubling of THD. Likewise, going from a 96 kHz signal to a 192 kHz signal also results in a doubling of THD. The oversampling filter is kept the same.
If the sampling frequency is fixed at 48 kHz, going from a 32x oversampling filter to a 64x oversampling filter also results in a doubling of THD, and going to 128x once again doubles THD over the 64x rate. The same happens at 96 kHz and 192 kHz too. Each time the sampling frequency doubles, or the oversampling rate doubles, the THD doubles too. (Sometimes it's not quite double, but it's close enough to see a trend emerging.)
Now at -1 dB and using 32x oversampling I get roughly 0.0005%, 0.001% and 0.002% for 48/96/192 kHz; both the left and the right channel show very similar performance. This isn't limited to low-order harmonics either, as the next picture demonstrates.
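To put a number on the trend: the figures just quoted double with each doubling of the sample rate. A small check over the measurements reported in this post:

```python
# THD measured at -1 dB, 32x oversampling, as quoted above.
thd_pct = {48_000: 0.0005, 96_000: 0.001, 192_000: 0.002}

# Ratio of THD between each pair of adjacent sample rates:
rates = sorted(thd_pct)
ratios = [thd_pct[hi] / thd_pct[lo] for lo, hi in zip(rates, rates[1:])]
print(ratios)  # each step is a factor of ~2
```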
The distortion seems a tad too high in level to me, but if it were just that I wouldn't be posting here; things get worse, or more confusing, depending on various factors.
For example, the above performance is achieved when using the AD8610 opamp in the I/V converter position (an OPA627 is used as the difference amplifier). However, if I change the I/V opamp to the THS4031 (what I was originally using), the performance degrades significantly, yet it shows exactly the same trends with respect to the oversampling rate and the sampling frequency used (192 kHz reaching 0.005% THD with 32x OS and going up to 0.01% with 64x).
A while ago I also built a DAC using the PCM1794. That implementation used an SSOP-to-DIL adaptor, so theoretically it loses out straight away because of the adaptor. However, in that design I used the THS4031 in exactly the same configuration and it performed fine. The peculiar thing is that the design using the SSOP-to-DIL adaptor performed far better than any of the designs where the PCM1794/2 is mounted directly to the PCB. Even stranger, the analogue stage of each design (design 1 being with the adaptor, design 2 being without) was identical, so quite why it performed well in the adaptor version and poorly in the version without I do not know. (By "performed well" I am talking 0.0003% distortion at 48 kHz at a -1 dB output level.) So I am assuming here that the I/V stage using the THS4031 isn't, or shouldn't be, the problem.
Here's where things perhaps become even stranger.
According to the datasheet, the distortion performance reaches a minimum at around -20 dB, where presumably, up until this point, all the distortion is buried in the noise floor. Then as the output level increases, the distortion also increases, but both do so at an identical rate, so the percentage distortion remains the same. This all makes sense, but it isn't what happens for me.
If I lower the output level from -1 dB to -10 dB the distortion remains roughly the same; it decreases, but only by a tiny bit, say from 0.0018% to 0.0015% for a sampling frequency of 192 kHz and an OS rate of 32x.
If I then lower the output level from -10dB to -14dB, it decreases down to 0.001%.
If I then lower the output from -14 dB down to -18 dB, something interesting happens. The distortion at all sampling frequencies and pretty much all OS rates just takes a vacation, and looks something like this.
Interested by this, I ran a sweep from 100 Hz to 20 kHz and came up with this.
Obviously this isn't just frequency related, so it probably isn't a result of any inductive or capacitive coupling. Nor, I would imagine, is it contamination of an analogue signal line by a ground current.
In trying to figure this out somewhat, I have tried altering the component values of the I/V and difference amplifiers so that the opamps see easier loads. This didn't do anything, which didn't surprise me.
What is also interesting is that when using the THS4031s in the I/V stage, exactly the same thing happens. The distortion goes from being quite high (0.005% at 192 kHz at 32x OS) to almost invisible. It's almost as if there's some threshold point inside the DAC chip, and regardless of what I/V stage it is driving, when the driven level drops below that threshold the distortion disappears.
I thought this might be due to the ADC of the sound card I am using for the measurements, so I attenuated the output of the difference amplifiers by 20 dB using a resistor divider. With the DAC set at -1 dB output the high distortion was still there, so it's a product of the output level of the DAC and not of the measurement system.
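For what it's worth, that 20 dB pad is just a resistive divider. A quick check of the arithmetic, with hypothetical 9k/1k values (not necessarily the ones I used):

```python
from math import log10

def divider_db(r_top_ohm, r_bot_ohm):
    """Attenuation in dB of a resistive divider, output taken across r_bot."""
    return 20.0 * log10(r_bot_ohm / (r_top_ohm + r_bot_ohm))

# Hypothetical 9k over 1k divider: output is 1/10 of the input, i.e. -20 dB.
print(round(divider_db(9_000, 1_000), 1))
```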
I am thinking of trying a different I/V configuration using an OPA1632 fully differential opamp as the I/V converter and then feeding that into the OPA627, but something tells me that won't solve the problem. Does this sound like an issue anyone has dealt with before? Or does anyone who understands how DACs work internally see what could be causing this? I have tried around 7 different PCBs now, with different grounding, slightly different signal routing etc., and nothing appears to work; it seems like I am missing something fundamental.
Many thanks in advance if anyone can offer some assistance.
Okay, as a test I decided to increase the value of the resistor that connects between IREF and ground; this decreases the current that the DAC outputs. This may not be optimal, but I believe it is instructive to a certain degree.
What is immediately apparent is that reducing the output current does nothing to affect the DAC's behaviour. Except for the absolute magnitude of all of the signals and the related distortion products, the details described above remain exactly the same.
With regards to the I/V and difference amplifier, I think this shows that the distortion isn't an analogue drive-level issue, as the distortion comes and goes at exactly the same point with reference to the digital drive level.
I would also expect this to alter the dynamics of the return currents around the DAC and the analogue circuitry; if these were a problem I'd expect it to have manifested itself in some way, but it didn't alter anything at all. (I am only guessing with this, mind you.)
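A rough model of the experiment above, assuming (per my reading of the datasheet) that the full-scale output current scales inversely with the IREF resistor. The nominal 7.8 mA p-p at 10 kOhm figures here are my assumption for illustration, not measured values:

```python
def fullscale_out_ma(r_iref_ohm, i_nominal_ma=7.8, r_nominal_ohm=10_000):
    """Approximate full-scale output current vs the IREF resistor value.
    Nominal figures are assumptions for illustration, not measurements."""
    return i_nominal_ma * r_nominal_ohm / r_iref_ohm

# Doubling the IREF resistor roughly halves the full-scale output current:
print(fullscale_out_ma(20_000), "mA p-p")
```

The point is only that every analogue signal scales together, which matches what I observed: the distortion behaviour tracked the digital drive level, not the absolute current.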
I thought I would attach some graphs of distortion vs drive level at different sampling frequencies, to help illustrate what I am trying to say is occurring. (The bottom left of each graph contains sampling frequency details; all were done at 32x OS apart from the last one.)
It is worth mentioning again, I believe, that if I use the THS4031 as the I/V converter the graphs look pretty much exactly the same, except that the rise in distortion at the end is far worse.
I am of course trying to figure out why I get this increase in distortion. Both the left and the right channel perform in the same way, and the graphs were done with the correct value of resistor tying IREF to ground.
In reply to Matt Storey:
Some more information.
When running idle, that is with the reset pin high, clocks present, but no data, the power consumption of the analogue rail remains constant, as you'd expect. If I then apply data to the DAC, but at -100 dB, the output produces the test signal and the power consumption remains the same as it was at idle. If I then increase the input level, the power consumption remains the same right up until the point where the distortion abruptly increases. At that point the supply current decreases slightly and keeps decreasing up to 0 dB.
This reminds me somewhat of class A biasing in power amplifiers (which is why I thought to try this), where if the output remains in class A bias it will mask various distortion mechanisms, but if you increase the drive level beyond the class A region, those distortion mechanisms will become apparent and grossly affect the performance of the amplifier. It would appear that something similar could be happening here. If I keep the drive level below the quiescent state that the DAC is biased under, the distortion remains very low, however if I go beyond this the performance suffers as a direct result.
The question is what mechanism is it that's causing this?
We are still grinding this one out. Please hang in there.
Gate Driver Applications Engineering Manager
Dallas, TX USA
In reply to Don Dapkus:
Wow, thanks D2 :)
I will add that I've tried a little more experimentation.
1) Setting the volume control internal to the PCM1792 has exactly the same effect as turning down the volume digitally before the DAC. I thought this would be the case, but I wasn't sure whether the volume control was handled on the digital side of the 1792 or whether it affected some parameters on the analogue side. It seems it's digital.
2) Both the left and the right channel of the DAC perform entirely independently of one another. That is, if the left channel is going full out, the right channel still falls to the distortion minimum at the same drive level. The absolute level of distortion on the right channel is slightly increased due to crosstalk, but nothing more than that. As the DAC shares a common power pin for both analogue output channels, I'd assume this isn't an issue with that pin and any of its possible return currents; otherwise turning one channel to max would likely influence the other in a detrimental way.
3) I have tried measuring the performance without the difference amplifier, cap-coupling the output of the I/V opamp directly to the measuring system. The point of this was to remove any ground currents flowing from the R+C network connected to the non-inverting input of the difference opamp, in case they were to blame. They weren't. Same issue, and both the positive and negative current outputs perform similarly.
4) Regarding the previously mentioned copper rods that connect the top-side ground area beneath the DAC to the bottom-side main ground plane: there were two of these mounted directly beneath the DAC chip. I wondered if using two of them could create a small loop, so I removed one. This didn't affect the performance.
5) The PCB designs prior to this one had the LVDS receiver and opto isolator mounted on a separate PCB, with wires carrying the I2S signals between boards. I wasn't happy with this as I felt EMI was causing unpredictable results. Either way, on one of the previous PCBs both of the +/- current outputs ran beneath a zero-ohm 1206 link resistor. This resistor had the +5 volt rail flowing through it. The odd thing was that if the DAC was going full out and distorting, I could press the tip of my finger firmly on top of the 1206 bridge, so that the skin pressed onto the +/- current lines too, and doing so resulted in almost all of the distortion vanishing completely: 48 and 96 kHz = 0.0003% at -1 dB; only 192 kHz at 64x OS stayed higher, while 192 kHz at 32x would also fall to 0.0003%. I ran this through a sweep from 20 Hz to 20 kHz and it was also ruler flat.

My finger didn't have to be on only the current outputs either; I could place it over other areas of the I/V and difference amplifier stages and it would have the same effect. Oddly, putting my finger on the left channel would also show an improvement in the right channel. The more skin contact there was, the better. I figured I was either acting as an antenna and channelling EMI into the circuit, with the interference somehow helping, or I was creating some capacitive coupling somewhere that managed to help. Touching the scope probe to a current output also reduced the distortion by a small amount. I had seen a similar effect on another circuit, which functioned with the scope probe connected but didn't with it off; in that instance it was an I2C line, and attaching a small cap to increase the line capacitance fixed the problem.
I tried adding some capacitance to ground on the current outputs, but this did nothing. I figured as much, considering the current outputs are supposed to see a zero-ohm input impedance from the I/V stage anyway; it's not as if anything is going to decide to flow through the cap instead. The trouble is that in the current implementation I altered the signal routing of the I/V stage slightly (the 1206 link with the current outputs beneath it is still there though), while also putting the LVDS/opto on the same PCB, and now I cannot get any improvement in performance using the sophisticated 'finger' method.

I do, however, get a small increase in performance (we are talking 0.0018% to 0.0016%, but it is repeatable) by turning on my soldering iron. This is a Metcal unit that uses a 13 MHz signal to heat a coil of wire embedded in the tip of the iron. I just tried this again and yes, one channel shows a small improvement, but the other gets worse by the same amount. I don't figure this is really relevant, but I thought I'd mention it anyway.

As for the scope probe on the current implementation: touching it to the left channel's negative current output worsens the performance from 0.0018% to 0.002%, and touching it to the left channel's positive current output worsens it by a similar amount. Touching the probe to the right channel's negative current output decreases the distortion from 0.0018% to 0.0016%, and touching it to the right channel's positive current output increases the distortion by a similar amount. The effect of the scope probe changes a little depending on whether it's used in x1 or x10 mode, having less of an effect in x10 mode, where I would imagine the probe capacitance is lower.
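The x1 vs x10 observation is at least consistent with probe loading, since a x10 probe presents far less capacitance to the node. A minimal sketch, where the probe capacitances are typical assumed figures rather than my probe's spec:

```python
from math import pi

def cap_impedance_ohm(c_farad, f_hz):
    """Magnitude of a capacitor's impedance at a given frequency."""
    return 1.0 / (2.0 * pi * f_hz * c_farad)

# Typical probe input capacitances (assumed): ~100 pF in x1, ~15 pF in x10.
for mode, c in (("x1", 100e-12), ("x10", 15e-12)):
    print(mode, round(cap_impedance_ohm(c, 20_000)), "ohms at 20 kHz")
```

So in x1 mode the probe loads the node nearly an order of magnitude more heavily, which would fit the stronger effect seen in that mode.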
6) This might be important, although I am not sure. The PCB with the DAC on board doesn't contain any of the power regulators; these are high quality independent units on separate PCBs (super regulators similar to the type designed by Walt Jung, but with a pre-regulator). The entire setup is wired following star grounding principles, with 20cm wires carrying the +5, +3.3 and +/-9 rails to the PCB; another separate wire solders to the ground plane and returns to the star earth point. I have tried braiding the +5, +3.3 and ground wires together; this did nothing. Twisting the +/-9 volt rails didn't do anything either. And I have also tried soldering the ground return wire to various different parts of the PCB, which has absolutely zero effect on the performance. The regulators aren't oscillating either.
I should have some NE5534s arriving tomorrow. They're cheap, so I figured I'd give them a go in the I/V converter and see what I find.
I tried the NE5534s and they deliver roughly the same performance that the AD8610s provide, with the NE5534s being slightly worse.
Some suggestions from our side for you to consider:
Put the decoupling capacitors for pins 27-28 close to the part.
The decoupling capacitors for pins 21 and 22 should connect directly to pin 23.
The 10 kOhm resistor on pin 20 should connect directly to pin 19, without going through the ground plane.
I've already got the 0.1 uF decouplers as close as they can get to the chip; we're talking 0.5mm of trace. The 47 uF electrolytics are somewhat further away due to their size, but I could reduce this somewhat if it is extremely important.
What you said about the 10k resistor is interesting, however. Are you saying that the resistor should connect between pin 20 and pin 19 without connecting pin 19 to the ground plane? Or rather, instead of having separate connections from pin 19 to the ground plane and from the resistor to the ground plane, should the ground side of the resistor be routed through pin 19 instead? If the latter is the case, I currently have the resistor connected like that.
You've got it.
Will the detailed requirements for implementation, as above, be written into an updated datasheet?
If the datasheet does not detail how the IC works internally, then I assume the only guidance we can request is from TI?
Thanks and regards,
In reply to shadders:
If I end up figuring out how to solve this and if the issue is simple enough and widely applicable, it would be nice for some design hints to be included in a data sheet or evaluation module. As it stands I am waiting for Don to get back to me on a couple of questions :)