Hi,
I want to interface my TMS320F28335 with my DAC8568. It uses a 3-wire SPI serial connection, without a CS line.
My program has an infinite loop that sends data packets out of the SPI pins. I have run it, and with the scope I see a good square wave on SPICLK and the communication bits toggling on SPISIMOA.
I have 2 questions:
1- Why, if I increase the delay by changing the argument of delay_loop from 100 to 1000, do the signals become completely distorted (so it no longer works)?
2- My DAC accepts 32-bit packets; is it correct to send two 16-bit packets back to back as I do?
Thank you for helping me!
Here is the source code:
#include "DSP2833x_Device.h" // DSP2833x Headerfile Include File
void InitSPI(void); // This function initializes the SPI
void InitSpiaGpio(void); // This function initializes the SPI Gpio
void delay_loop(long);
// external function prototypes
extern void InitSysCtrl(void);
extern void InitPieCtrl(void);
extern void InitPieVectTable(void);
void main(void)
{
    InitSysCtrl();                   // Basic Core Init from DSP2833x_SysCtrl.c
    EALLOW;
    SysCtrlRegs.WDCR = 0x00AF;       // Re-enable the watchdog
    EDIS;                            // 0x00AF to NOT disable the Watchdog, Prescaler = 64
    DINT;                            // Disable all interrupts
    InitPieCtrl();                   // basic setup of PIE table; from DSP2833x_PieCtrl.c
    InitPieVectTable();              // default ISR's in PIE
    InitSpiaGpio();
    InitSPI();
    EINT;                            // Enable Interrupt (all)
    ERTM;                            // enable realtime debug mask (DBGM)
    while(1)
    {
        SpiaRegs.SPITXBUF = 0x3F;    // command
        SpiaRegs.SPITXBUF = 0x3E80;  // data
        delay_loop(100);
    }
}
void InitSPI(void)
{
    SpiaRegs.SPICCR.all = 0x000F;        // Hold SPI in reset, rising edge, 16-bit characters
    SpiaRegs.SPICTL.all = 0x000E;        // Master mode, delayed shift phase,
                                         // talk enabled, SPI interrupt disabled
    SysCtrlRegs.LOSPCP.bit.LSPCLK = 5;   // LSPCLK = SYSCLKOUT / 10 = 15 MHz
    SpiaRegs.SPIBRR = 99;                // SPICLK = LSPCLK / (99 + 1) = 150 kHz
    SpiaRegs.SPICCR.all = 0x008F;        // Relinquish SPI from reset, keep 16-bit characters
                                         // (0x0087 would switch back to 8-bit characters)
    SpiaRegs.SPIPRI.bit.FREE = 1;        // Set so breakpoints don't disturb xmission
    SpiaRegs.SPICCR.bit.SPISWRESET = 1;  // Release the SPI from reset
}
void InitSpiaGpio(void)
{
    EALLOW;
    /* Configure SPI-A pins using GPIO regs */
    // This specifies which of the possible GPIO pins will be SPI functional pins.
    GpioCtrlRegs.GPAMUX2.bit.GPIO16 = 1;   // Configure GPIO16 as SPISIMOA
    GpioCtrlRegs.GPAMUX2.bit.GPIO17 = 1;   // Configure GPIO17 as SPISOMIA
    GpioCtrlRegs.GPAMUX2.bit.GPIO18 = 1;   // Configure GPIO18 as SPICLKA
    GpioCtrlRegs.GPAMUX2.bit.GPIO19 = 1;   // Configure GPIO19 as SPISTEA
    EDIS;
}
void delay_loop(long end)
{
    long i;
    for (i = 0; i < end; i++)
    {
        asm(" NOP");
        EALLOW;
        SysCtrlRegs.WDKEY = 0x55;   // Service the watchdog on every iteration
        SysCtrlRegs.WDKEY = 0xAA;
        EDIS;
    }
}
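Regarding question 2, a minimal sketch of how a 32-bit frame could be sent as two back-to-back 16-bit SPITXBUF writes, assuming the 16-bit character length configured above. The helper name is hypothetical; the wait relies on the SPI INT_FLAG status bit, which is cleared by reading SPIRXBUF.

// Hypothetical helper: sends one 32-bit frame as two 16-bit SPI characters,
// upper half-word first, and waits until both have shifted out.
static void spi_write32(Uint32 frame)
{
    Uint16 dummy;
    SpiaRegs.SPITXBUF = (Uint16)(frame >> 16);       // upper 16 bits first (MSB first)
    SpiaRegs.SPITXBUF = (Uint16)(frame & 0xFFFF);    // lower 16 bits, held in the buffer
                                                     // until the first character shifts out
    while (SpiaRegs.SPISTS.bit.INT_FLAG == 0) { }    // first character finished
    dummy = SpiaRegs.SPIRXBUF;                       // reading SPIRXBUF clears INT_FLAG
    while (SpiaRegs.SPISTS.bit.INT_FLAG == 0) { }    // second character finished
    dummy = SpiaRegs.SPIRXBUF;
}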
Hi Alessandro,
Can you give a little more detail on what you mean when you say that you 'see good square waves'? Are you talking about the SCLK and SDO from the TMS320F28335 or the output of the DAC8568? I suspect you are referring to the SPI lines, since it looks like you are sending 0x3F as the command word to the DAC8568 - that would set bits DB27-DB24, which are reserved. Can you post a screen shot of the timing for us?
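For reference, a small sketch of how a frame could be packed so the control field (DB27-DB24) carries a defined command rather than a reserved code. The field layout (DB31-DB28 prefix, DB27-DB24 control, DB23-DB20 address, DB19-DB4 data, DB3-DB0 feature) and the 0x3 "write to and update" control code are taken from my reading of the DAC8568 datasheet; please verify them there before relying on this. The helper name and the example values are illustrative only.

// Hypothetical helper: pack one DAC8568 frame from command, channel and 16-bit code.
static Uint32 dac8568_frame(Uint16 cmd, Uint16 chan, Uint16 code)
{
    return ((Uint32)(cmd  & 0xF) << 24) |   // control bits DB27-DB24
           ((Uint32)(chan & 0xF) << 20) |   // address bits DB23-DB20
           ((Uint32)code << 4);             // data DB19-DB4; prefix and feature bits left at 0
}

// Example: write to and update channel A with code 0x3E80 (illustrative values):
// Uint32 frame = dac8568_frame(0x3, 0x0, 0x3E80);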
Thank you for your answer!
At the moment I cannot post a screenshot, but I will do it as soon as possible! In the meantime I will try to explain better what I have done:
I have not yet connected the F28335 to my DAC EVM. Before doing that, I thought it would be better to test the SPI communication on its own.
I attached my scope probes to the SPICLK and SPISIMOA pins and ran my program. I see a stable square wave for the clock (I will post the screenshot as soon as I am back in the office) and a changing square wave for SPISIMOA, so I think the program is running properly.
I did a test, changing the value of the delay function's argument in real time (with the watch window): I changed it from 100 to 1000, and the scope shows that the waveform becomes completely distorted. 1) I don't understand why changing the delay makes the SPI stop working.
2) How can I avoid setting the bits that are reserved?
Thank you very much, and sorry for my English!
Hi Alessandro,
The TMS320F28335 DSP is going to tri-state its SDO after the transmission completes, so you may see the value of the LSB trail off like some sort of exponential decay; perhaps that's what you are capturing. Take your time with the screen shots and don't worry about the English - we understand what you are trying to work through. We're here to help...
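If the floating line is a nuisance on the scope, one option (my own suggestion, not something from this thread) is to explicitly enable the internal pull-up on the SPISIMOA pin so it sits at a defined level while the SPI output is tri-stated:

// Enable the internal pull-up on GPIO16 / SPISIMOA (0 in GPAPUD = pull-up enabled)
EALLOW;
GpioCtrlRegs.GPAPUD.bit.GPIO16 = 0;
EDIS;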
This is the screenshot using:
while(1)
{
    SpiaRegs.SPITXBUF = 0xCCCC; // binary --> 1100110011001100
    delay_loop(1000);
}
Zoomed:
And this one is the screenshot using:
while(1)
{
    SpiaRegs.SPITXBUF = 0xCCCC; // binary --> 1100110011001100
    delay_loop(100);
}
Zoomed:
UPDATE -------> I have tried the code connected to the DAC (after fixing a configuration error) and it almost works correctly!!
SpiaRegs.SPITXBUF = 0x7010; // binary --> MSB_0'111'0000'0001'0000 WRITE CHANNEL A
SpiaRegs.SPITXBUF = 0xFFF0; // binary --> 111111111111'0000_LSB
I measured the output on channel A while changing the 16 bits of data, and now it appears to be working! I want to do some more tests before being sure!
Hi Alessandro,
ALESSANDRO GUERRESCHI said: UPDATE -------> I have tried the code connected to the DAC (after fixing a configuration error) and it almost works correctly!!
It sounds like you are making progress here!
Eheh, yes, but the work isn't finished: after fully testing the DAC I also have to use an external ADC (ADS8556) over SPI. On this DAC, does the SYNC pin have the same function as the Chip Select?
To use the other SPI device, would it be a good approach to mirror the SPISTE onto two GPIOs of the DSP, connected to the SYNC of the DAC and the CS of the ADC?
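For what it's worth, here is a minimal sketch of a software-controlled alternative to mirroring SPISTE: two ordinary GPIOs driven as the DAC SYNC and the ADC CS, each pulled low only for its own transfer. The pin numbers (GPIO26/GPIO27), the function name, and the frame values in the usage note are placeholders, not taken from this thread.

// Hypothetical helper: configure two GPIOs as software-driven select lines.
void InitCsGpio(void)
{
    EALLOW;
    GpioCtrlRegs.GPAMUX2.bit.GPIO26 = 0;   // GPIO26 as plain GPIO -> DAC8568 SYNC
    GpioCtrlRegs.GPADIR.bit.GPIO26  = 1;   // output
    GpioCtrlRegs.GPAMUX2.bit.GPIO27 = 0;   // GPIO27 as plain GPIO -> ADS8556 CS
    GpioCtrlRegs.GPADIR.bit.GPIO27  = 1;   // output
    EDIS;
    GpioDataRegs.GPASET.bit.GPIO26 = 1;    // both select lines idle high
    GpioDataRegs.GPASET.bit.GPIO27 = 1;
}

// Usage idea for the DAC: pull SYNC low, send the two 16-bit half-words,
// wait for the transfer to finish, then raise SYNC again:
// GpioDataRegs.GPACLEAR.bit.GPIO26 = 1;
// SpiaRegs.SPITXBUF = 0x7010;   // upper half-word (example)
// SpiaRegs.SPITXBUF = 0xFFF0;   // lower half-word (example)
// ...wait for completion...
// GpioDataRegs.GPASET.bit.GPIO26 = 1;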
Hi Alessandro,
Yes, I believe that would work. Are you also controlling the LDAC input for the DAC8568?
No, I am not.
I have seen that this program is working fine!
Now I will create a new post, because I have some questions referred to the ADS8556.