
Help! SPI Bit error

Hello all,

    I am trying to use the SPI port of the DM365, configured as master, to write data to a slave device. The program reports "SPI Bit error". There is only one slave device, and I configure the master in "SPI_OPMODE_3PIN" mode.

    I looked up the DM365 datasheet. It gives some explanation of the BITERRFLG, which says: "Possible reasons for a bit error are a high bit rate/capacitive load or another master/slave trying to transmit at the same time."

    I do not understand what "high bit rate" and "capacitive load" mean here, and there is only a single master/slave pair in my setup.

    Can anyone give me some advice?

    Thank you for your attention; any reply will be appreciated.

  • I'd try decreasing the SPI clock to a very low value. If that helps, your hardware most likely needs to be looked at.

    It may be helpful to look at the SPI settings in Spectrum Digital's EVMDM365 code, specifically in /tests/spirom. Those settings worked for me.
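    To decrease the SPI clock, the usual approach on the DM365 is to raise the prescaler in the SPIFMTn register. As a minimal sketch, assuming PRESCALE occupies bits 15:8 and CHARLEN bits 4:0, with bit rate = module clock / (PRESCALE + 1) (verify these fields against the DM36x SPI reference guide before use):

    ```c
    #include <stdint.h>

    /* Hypothetical helper: pack a SPIFMT0 value for a target bit rate.
     * Assumes PRESCALE is bits 15:8, CHARLEN bits 4:0, and
     * bit rate = module_clk_hz / (PRESCALE + 1). */
    static uint32_t spifmt_for_rate(uint32_t module_clk_hz,
                                    uint32_t bit_rate_hz,
                                    uint32_t charlen)
    {
        uint32_t prescale = module_clk_hz / bit_rate_hz - 1;
        if (prescale > 0xFF)
            prescale = 0xFF;      /* slowest rate the 8-bit divider allows */
        return (prescale << 8) | (charlen & 0x1F);
    }
    ```

    For debugging, picking something very slow (e.g. 100 kHz) rules out bit-rate and capacitive-load effects on the board.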

  • Thanks very much for your reply. How do I decrease the SPI clock? The clock is only driven while data is being transferred. What SPI clock frequency are you using?

    I looked through the test programs in my installation directory. Is that code only included in the EVM code package?

  • When the SPI transfers the last piece of data, the operation spi_SPI_DATA1 = (spidata1 & 0x0FFCFFFF) | buf[i] clears CSHOLD and sets CSNR to 0, which means both chip selects SPI_EN0 and SPI_EN1 are selected. So why does it select SPI_EN0 and SPI_EN1 after the transfer has finished, rather than at the beginning?

  • Looks like you've found the SD code.

    The line of code you are referring to actually deselects both chip selects during the last cycle. The necessary chip select is activated using a global variable spidat1, which is set to select the proper chip select in spirom_init().
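    To make the deselection explicit, here is a sketch of how that last-word write could be structured, assuming the SPIDAT1 layout discussed above with CSHOLD at bit 28 and CSNR at bits 17:16 (the original 0x0FFCFFFF mask also clears the reserved bits 31:29; check the field positions against the DM36x SPI reference guide):

    ```c
    #include <stdint.h>

    /* Hypothetical field definitions for the SPIDAT1 word. */
    #define SPIDAT1_CSHOLD   (1u << 28)
    #define SPIDAT1_CSNR(n)  ((uint32_t)(n) << 16)
    #define SPIDAT1_CS_MASK  (SPIDAT1_CSHOLD | SPIDAT1_CSNR(3))

    /* Build the SPIDAT1 word for one byte; on the last byte, drop
     * CSHOLD and CSNR so the chip select is released after the word
     * instead of being held active. */
    static uint32_t spidat1_word(uint32_t base, uint8_t data, int last)
    {
        uint32_t w = (base & ~0xFFu) | data;
        if (last)
            w &= ~SPIDAT1_CS_MASK;
        return w;
    }
    ```

    The design point is the one made above: `base` (the global spidat1 set up in spirom_init()) carries the active chip-select configuration for every word, and only the final word strips it off.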

    I had to fix the SD code in a couple of places; most noticeably, in the // Wait for transmit ready loop the bitmask should be 0x20000000, not 0x10000000. Overall, though, it was a huge help compared to writing it from scratch.
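    For reference, here is a sketch of that corrected readiness test, assuming the TXFULL flag sits at bit 29 of SPIBUF (i.e. 0x20000000); with 0x10000000 the loop would be polling a different bit entirely:

    ```c
    #include <stdint.h>

    /* Hypothetical flag definition: TXFULL assumed at bit 29 of SPIBUF. */
    #define SPIBUF_TXFULL  0x20000000u

    /* Transmit is ready when the TXFULL flag is clear. */
    static int tx_ready(uint32_t spibuf)
    {
        return (spibuf & SPIBUF_TXFULL) == 0;
    }

    /* In the driver the poll would then look like:
     *   while (!tx_ready(*SPIBUF_REG))
     *       ;  // spin until the transmit buffer drains
     */
    ```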

    If you still have problems, the best thing to do is monitor the SPI signals with an oscilloscope - that will give you better information on what's going on.

    Good luck!