I have a DAC1220 EVM wired up and was hoping to command some nice voltages, but unfortunately, such is not the case for me. As soon as I power up the board I get a 2.5-2.6 volt output which ripples some when I connect my powered-up microprocessor. My settings are:
S1 (sys ref): internal (to the right of the analog connector)
S2 (sys clock source): onboard 2.4 MHz crystal (all the way up from the power connector)
J3 both jumpers in place: do not measure AVDD or DVDD currents
J5 in place (analog and digital grounds connected)
Power Connector: pins 3 (AVDD) and 10 (DVDD) connected to +5v, pins 5 (AGND) and 6 (DGND) connected to ground
Analog Connector: pins 1,3,9,11,13,17 and 19 connected together to ground, pin 2 (AN0+) and pin 4 (AN1+) hooked to oscilloscope, pin 20 (external SYSREF) not used
Serial Connector: pins 4, 10, 18 connected to ground, pins 16 and 20 not used, pin 7 (/CS) connected to ground, SDIEN connected to ground (only write to device), SCLK connected to PC5 of the microprocessor, SDIO connected to PC7 of the microprocessor
I planned to start at the most rudimentary level and work my way forward. Thus, I am bit-banging the device. I wait after power up, then pull the clock line high, set my data, and then pull the clock line low. I send out the following data pattern: 0 1 0 0 0 0 0 0 (the command byte to write to the three Data Input Registers). I then hold the clock low for 16 clock cycles, after which I transmit my DIR values (i.e. 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 for max voltage). I try to change this, but see no cause and effect. Attached is my code snippet for review (sorry if it is not so readable due to the loss of indentation). If you could offer any advice on what I might be doing wrong I would be quite grateful. Additionally, any further pointers would be appreciated. Then maybe someday I can start asking more challenging questions like running multiple DACs and optimizing op amps. Thanks.
// set up a timer to determine the data rate for the D to A converter
// the clock is toggled every third time this is true
if (ulCount2++ >= 100)
{
    // restart the counter for next time
    ulCount2 = 0;

    // hold the voltage: set after all the bits have been sent to the D/A converter
    if (ulHoldTheVoltage == 0)
    {
        // the transmission sequence consists of first bringing the clock high,
        // then loading the data,
        // then bringing the clock low (data is latched on the falling edge)
        if (ulClockHighDataInThenClockLow == 0)
        {
            ulClockHighDataInThenClockLow = 1;

            // bring the clock high except during the pause between
            // the command byte (bits 0-7) and the data bits (24-47)
            if ((ulBitCount < 8) || (ulBitCount >= 24))
            {
                GPIOPinWrite(GPIO_PORTC_BASE, GPIO_PIN_5, GPIO_PIN_5);
            }
            //RIT128x96x4StringDraw("This is not the time", 6, 40, 15);
        }
        else if (ulClockHighDataInThenClockLow == 1)
        {
            ulClockHighDataInThenClockLow = 2;

            // the default output value is 0
            ulPinValue = 0;

            // only command bit 1 is high (bits 0 and 2-7 are low)
            if (ulBitCount == 1)
            {
                ulPinValue = GPIO_PIN_7;
            }

            // output a little less than 5 V on alternate transfers
            if (ulBitCount == 26)
            {
                if (ulOutput2p5Volts == 1)
                {
                    ulOutput2p5Volts = 0;
                }
                else
                {
                    ulPinValue = GPIO_PIN_7;
                    ulOutput2p5Volts = 1;
                }
            }

            // set the port value for the output data
            GPIOPinWrite(GPIO_PORTC_BASE, GPIO_PIN_7, ulPinValue);
            //RIT128x96x4StringDraw(" ", 6, 40, 15);
        }
        else
        {
            ulClockHighDataInThenClockLow = 0;

            // bring the clock low (latch data)
            GPIOPinWrite(GPIO_PORTC_BASE, GPIO_PIN_5, 0);

            // move up to the next bit position, repeat after 48 bits
            if (ulBitCount++ >= 48)
            {
                ulBitCount = 0;
                ulHoldTheVoltage = 1;
            }
        }
    }
    // end of the hold-the-voltage == 0 case
    else if (ulHoldTheVoltage++ > 50)
    {
        ulHoldTheVoltage = 0;
    }
}
// end of the count2 if statement
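For reference, the same transfer can also be sketched as a simple blocking routine, independent of any timer state machine. This is only a sketch: `set_sclk`, `set_sdio`, and `delay_half_bit` are hypothetical stand-ins for the PC5/PC7 GPIO writes and a bit-period delay (the SDIO stub records the bit stream so the routine can be desk-checked without the EVM), and the 16-clock pause between command and data is left out.

```c
#include <stdint.h>
#include <stddef.h>

/* Stand-ins for hardware access (PC5 = SCLK, PC7 = SDIO on the poster's
   part); set_sdio() also records the bit stream for desk-checking. */
static uint8_t sent_bits[64];
static size_t  sent_count = 0;

static void set_sclk(int level) { (void)level; /* would drive PC5 */ }
static void set_sdio(int level)
{
    if (sent_count < sizeof sent_bits)
        sent_bits[sent_count++] = (uint8_t)(level != 0);
    /* would drive PC7 */
}
static void delay_half_bit(void) { /* busy-wait for half a bit period */ }

/* Shift out 'nbits' bits of 'value', MSB first: data is presented while
   SCLK is high and latched by the DAC on the falling edge. */
static void shift_out_msb_first(uint32_t value, int nbits)
{
    for (int i = nbits - 1; i >= 0; i--) {
        set_sclk(1);
        set_sdio((int)((value >> i) & 1u));
        delay_half_bit();
        set_sclk(0);               /* falling edge latches this bit */
        delay_half_bit();
    }
}

/* Write a 24-bit value to the Data Input Register: command byte
   0100 0000 (write, three bytes, starting at the DIR), then the data. */
static void dac1220_write_dir(uint32_t data24)
{
    shift_out_msb_first(0x40u, 8);
    shift_out_msb_first(data24 & 0xffffffu, 24);
}
```

A call like `dac1220_write_dir(0x800000u)` then produces the 8-bit command followed immediately by the 24 data bits.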
I am sorry to hear that you are having difficulties with the DAC1220 - from your description of the setup above, you seem to be on the right path. Is it possible for you to capture your bit-banged interface with an o-scope and send that detail along?
Here are a few screen shots of the data and clock lines. I think the phenomenon of the data line dropping out is new as of today. Does this indicate a damaged circuit or a miswiring? Thanks for your input.
data and clock lines floating off microprocessor, 8 bit command, pause, 24 bit register
data and clock lines when micro port is connected to the unpowered DAC (clock and data signals reduced)
data and clock lines after DAC is powered up (clock value back to normal, data amplitude negligible).
In my first post I bolded the following: SDIEN: connected to ground (only write to device), in hopes of calling attention to it. Is this statement correct? Seeing my output as it is, and understanding that this is a tri-stated line meant to accommodate both data input and data output, I wonder if my present problem is related to this. What do you think?
To my previous point, attached is a trace of a strange waveform which I think must be generated as an output from the DAC. Does this look at all familiar? This event (or something similar) occurs sporadically when I let the clock run free.
As a final scope capture, the following was taken using the SDI (with SDIEN low) as the input to the DAC.
The bit rate was slowed considerably, and the output changed to request 5v. The DAC output, however, remains at 2.5 volts.
My bad :( .... On reviewing the circuit I see that SDIEN is an active-low enable for the tri-state device, used to control the passage of the SDI signal. Using SDI through the tri-state buffer instead of SDIO boosts the signal to the 3.3 volts it should be. Why is SDIO attenuated by the DAC circuitry? I dunno ...
Sorry for the delay in response. A couple of things. In regards to connections, you were detailed about how you made your connections to the EVM connector except for the SCLK and SDIO settings. SCLK should connect to serial connector pin 3. SDIO should connect to serial connector pin 11. You will notice that the default setting for the tri-state buffer is enabled, which is why the input to the micro would be pin 13 and the output pin from the micro would be pin 11, thus keeping the two outputs from being connected together when the DAC1220 is in output mode.
Second issue is you are giving the wrong command for writing. The DAC1220 is expecting 3 bytes of data and you are only giving it 2. So you either need to send the correct command (0010 0000) for 16 bit mode, or you need to send another dummy byte of data as the third byte. What is happening is the command times out waiting for the last byte to be sent, which cancels the command.
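The two command bytes in play here (0100 0000 versus 0010 0000) differ only in the byte-count field. As a desk-check helper, the byte can be built as below; this reflects my reading of the command-byte layout (MSB = R/W, next two bits = number of bytes minus one, low five bits = starting register address), so verify it against the datasheet before relying on it:

```c
#include <stdint.h>

/* Build a DAC1220 command byte.  Assumed layout (verify in datasheet):
   bit 7    = R/W (0 = write, 1 = read)
   bits 6:5 = number of bytes minus one (00 = 1, 01 = 2, 10 = 3)
   bits 4:0 = starting register address */
static uint8_t dac1220_cmd(int read, int nbytes, int addr)
{
    return (uint8_t)(((read & 1) << 7) |
                     (((nbytes - 1) & 3) << 5) |
                     (addr & 0x1f));
}
```

With this encoding, a 3-byte write starting at address 0 gives 0x40 (the thread's 0100 0000) and a 2-byte write gives 0x20 (0010 0000), matching the two commands discussed above.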
Third, you need to remember that you are in binary two's complement mode as power up default. This means that the device will start at mid-scale of 2.5V with 0x0000 and 0xffff being very close to the same value. In this mode 0x7fff is full scale and 0x8000 is minimum.
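Under those power-up defaults, picking the 16-bit code for a target voltage is a small calculation. A sketch, assuming the 0-5 V span centered on 2.5 V described above (the helper name is mine, not from the datasheet):

```c
#include <stdint.h>

/* Map a target output voltage to a 16-bit binary two's complement code,
   assuming the power-up default range described above: 0x0000 -> 2.5 V
   (mid-scale), 0x7fff -> full scale (~5 V), 0x8000 -> minimum (~0 V). */
static uint16_t dac1220_code_btc(double volts)
{
    double scaled = (volts - 2.5) / 2.5 * 32768.0;
    long code = (long)(scaled >= 0.0 ? scaled + 0.5 : scaled - 0.5);
    if (code > 32767L)  code = 32767L;    /* clamp at full scale */
    if (code < -32768L) code = -32768L;   /* clamp at minimum */
    return (uint16_t)code;                /* two's complement wrap */
}
```

So asking for 5 V yields 0x7fff and asking for 0 V yields 0x8000, which is why writing 0x0000 or 0xffff both leave the output sitting right around 2.5 V.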
Thanks for your quick response. I realize you were out of the office last week and I was going to cut you some slack and not start beating on your door until tomorrow.
In regards to the SDI/SDIO and SDIEN conundrum, I see that for transmitting, using SDI works and SDIO doesn't; I don't completely know why. Page 3-4 recommends pulling SDIEN high and using SDIO for both transmit and receive when bit-banging. Anyway, I recognize that I should not expect to receive any data into the micro from SDI. I don't know whether SDIEN would default to enabled if I left it floating, as you say, but I have it tied low to be safe.
On the second issue, my apologies for a typo. I am transmitting 24 bits (3 bytes) of data. I'm afraid I didn't count my zeroes carefully enough, but if you are courageous enough to look at my code, or if you look at the scope traces, you will see that there is an 8-bit command followed by the 24 bits of data. I also made another mistake: S2 is in the "all the way down" position, not "all the way up".
Getting back to the data bits... I have sent the DAC several variations of output voltage requests: 8 0 0, 7 0 0, 7 F F, etc., but have not been able to change the value from 2.5 V (good catch, though, since my posted scope traces show outputs of 0 0 0 and F F F only). I have also tried to read several of the registers, but have not received any data back. There is a required pause between command and data, and I have varied this considerably. I have also varied the clock rate quite a bit. I will go back and try things again in light of what you have said. I will update the post with the results.
Sorry, I should have counted the clocks on the scope shots. Are you using the powerup default settings for the CMR register? That's my assumption. If that is not correct, let me know. The default is 16 bit mode.
The SDIO pin is rather confusing. The buffer is enabled by default with R3 that pulls the enable pin low. The data output from the micro should go to pin 11 (SDI) and the input to the micro should be pin 13 (SDIO). Pin 9 (SDIEN) is the enable pin for the buffer. If you want to read from the DAC1220, you must pull this pin (SDIEN) high first or you will have two outputs driving the SDIO pin.
One other thing, have you verified that the crystal oscillator has truly started oscillation? You will not be able to communicate with the DAC1220 if the clock isn't running.
A couple of things I should have mentioned. If you are just powering up the device and writing to the DIR, you will not see a change in the output, as the power-up/reset mode is SLEEP, which tri-states the output. You might be seeing this condition, where the output may be floating to mid-supply. When I have used the DAC1220 in projects, I send a reset pattern and then run the calibration routine. I have always had good success with communication following this start-up procedure. Following the calibration, the output automatically goes to NORMAL mode.
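That start-up recipe (reset the serial interface, then self-calibrate, after which the part leaves SLEEP for NORMAL mode) can be sketched as below. Everything hardware-specific is a placeholder: the pin/delay hooks are stubs, the three reset high-window widths must come from the datasheet's reset timing diagram (they have both minimum and maximum tXIN limits), and the actual command-register write for self-calibration is only indicated by a comment.

```c
#include <stdint.h>

/* No-op stand-ins for hardware access; a log records the call order so
   the sequence can be checked off-target. */
enum { OP_RESET = 1, OP_SELF_CAL = 2 };
static int op_log[4];
static int op_count = 0;

static void set_sclk(int level) { (void)level; }
static void hold_xin_cycles(unsigned n) { (void)n; }

/* Serial-interface reset: SCLK is held high for three successively
   longer windows with SCLK low in between.  The window widths (in XIN
   cycles) are passed in rather than hard-coded, because the exact
   minimum/maximum counts must be taken from the datasheet. */
static void dac1220_reset(const unsigned t_hi_xin[3])
{
    for (int i = 0; i < 3; i++) {
        set_sclk(1);
        hold_xin_cycles(t_hi_xin[i]);
        set_sclk(0);
        hold_xin_cycles(t_hi_xin[0]);   /* low gap between windows */
    }
    op_log[op_count++] = OP_RESET;
}

/* Kick off self-calibration: in a real driver, write the command
   register's mode bits here (address and bit values per datasheet). */
static void dac1220_self_calibrate(void)
{
    op_log[op_count++] = OP_SELF_CAL;
}

/* Power-up recipe from the reply above: reset first, then calibrate;
   the output enters NORMAL mode when calibration completes. */
static void dac1220_startup(const unsigned t_hi_xin[3])
{
    dac1220_reset(t_hi_xin);
    dac1220_self_calibrate();
}
```

The point of the sketch is only the ordering: reset before any register traffic, calibration before expecting the output to leave its floating mid-supply state.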
You are a genius. You pointed me straight to my problem (well, maybe not ruler straight, but you got me there). I was lazy, and anything that said optional I figured I'd skip until after I had the thing working. Implementing the reset did the trick, but indirectly. It forced me to bump up my clock rate, because the reset specifications contain maximum values (other specifications list minimums only). Prior to this, I thought that as long as I provided activity on the clock line at least every 100 ms (p. 11 of the spec) I would be cool. So I was ultra conservative and clocked every 1500 us (666 baud, nice number, huh). In some of my testing I bumped this up quite a bit, but my recent investigations seem to indicate that the device will not respond unless the entire command is completed within 100 ms. I increased the clock rate by a factor of 10 and the thing jumped up and started to work :). I will keep the reset in the code (although the device seems to work OK without it) and now move on to the calibration part. Thanks for your help.