
ADC/DMA/USB MSP430F5522

Other Parts Discussed in Thread: MSP430F5522

I am using the MSP430F5522 for a simple digital oscilloscope for my senior design project. I've written a LabVIEW program that displays all the incoming data, so that part of the project is finished. The trouble I am having is taking samples fast enough and, more importantly, at regular intervals. I know the ADC can be configured to take samples automatically without any timer interrupt, but I haven't been able to make that work in my project.

The general program flow is to take continuous ADC samples and load them into a 2,046-byte buffer. Once this buffer is filled, the samples are transferred in one packet over USB to the computer. However, when the USB transfer is initiated, all ADC sampling seems to stop - at least the ADC interrupt does, where I toggle a pin to see when it is called.

Ideally, I would have some kind of mechanism that transfers the ADC samples to the buffer automatically, while the ADC is still running and sampling while the buffer is sent over USB. In my program, I use double buffering so that the ADC can add new samples, while the USB transfers the completed buffer from previous samples.

 

My question is: Can I use the DMA to initiate ADC transfers to a large buffer? I've found that the USB code actually disables interrupts while it does the transfer, so a "background" DMA approach seems like the best option. If so, how do I do this?

Thanks!

  • Philip McCorkle said:
    Can I use the DMA to initiate ADC transfers to a large buffer?

    Not to initiate, but to perform. The transfer is initiated by the ADC signaling that a new conversion result is ready.
    The DMA can be configured to do 1024 transfers into your 2048-byte buffer. The DMA controller will then trigger an interrupt, in which you'll have to reprogram the DMA for the second buffer (before the next sample is ready) and flag the main code to start the transfer of the filled buffer.

    The ADC itself can be triggered for equidistant sampling by a timer CCR module: the CCR output opens the sample-and-hold (S&H) gate, then closes it and starts the conversion - without any software involvement.

    There are several threads discussing this.
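    This scheme might look roughly like the following (an untested sketch, assuming input channel A0, DMA channel 0, and the register names from TI's msp430.h; the ADC12IFGx DMA trigger number and the __SFR_FARPTR casts follow TI's F55xx examples and should be verified against your device's datasheet):

```c
#include <msp430.h>

#define SAMPLES 1024
static unsigned int bufA[SAMPLES], bufB[SAMPLES];
static volatile unsigned int *readyBuf = 0;   // completed buffer, handed to main()
static volatile int fillingA = 1;

void sampler_init(void)
{
    // Timer A0 CCR1 drives the ADC12 sample-and-hold: equidistant sampling, no software
    TA0CCR0  = 1000 - 1;                      // sample period in SMCLK ticks (example value)
    TA0CCR1  = 500;
    TA0CCTL1 = OUTMOD_3;                      // set/reset output mode
    TA0CTL   = TASSEL_2 | MC_1 | TACLR;       // SMCLK, up mode

    ADC12CTL0  = ADC12ON;
    ADC12CTL1  = ADC12SHS_1 | ADC12CONSEQ_2;  // trigger = TA0.1, repeat-single-channel
    ADC12MCTL0 = ADC12INCH_0;                 // channel A0
    ADC12CTL0 |= ADC12ENC;

    // DMA0: one word from ADC12MEM0 per conversion; interrupt after SAMPLES transfers
    DMACTL0 = DMA0TSEL_24;                    // trigger 24 = ADC12IFGx on the F55xx
    DMA0SA  = (__SFR_FARPTR)(unsigned long)&ADC12MEM0;
    DMA0DA  = (__SFR_FARPTR)(unsigned long)bufA;
    DMA0SZ  = SAMPLES;
    DMA0CTL = DMADT_0 | DMADSTINCR_3 | DMAIE | DMAEN;  // single transfers, increment dest
}

#pragma vector = DMA_VECTOR
__interrupt void DMA_ISR(void)
{
    if (DMAIV == DMAIV_DMA0IFG) {             // reading DMAIV clears the flag
        readyBuf = fillingA ? bufA : bufB;    // flag the full buffer for the USB code
        fillingA = !fillingA;
        DMA0DA = (__SFR_FARPTR)(unsigned long)(fillingA ? bufA : bufB);
        DMA0SZ = SAMPLES;                     // re-arm before the next conversion completes
        DMA0CTL |= DMAEN;
    }
}
```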

     

  • Is there an example of this? I'm using an adapted version of the F552x example code, but to no avail. I've looked and looked through the forum for something addressing this topic, and I haven't found anything relevant. That's not to say the topics aren't there, but the forum software's search capability is more than lacking...

    I'm having trouble adapting the sample code to my application because USB is involved. I would think the solution would be straightforward, but it doesn't seem to be. Any help is greatly appreciated.

  • Separate your project into two parts - one sampling into a buffer using DMA, and one doing the USB transfers - then merge the two. Don't do everything at once.

    You won't find code that does exactly what you want. After all, it's an engineer's job to find what's there and fuse it into a solution that fits the needs. If there were a finished solution for everything, there would be no need for engineers.

  • Yes, I've tried modularizing the code into separate tasks. I rewrote a USB example to repeatedly transfer a buffer in which each element is just an incremented counter value. On my computer, I get what is expected - a sawtooth waveform - as each USB packet comes in. That part seems to work OK, just a little slower than I'd like; I'm fine with that for now.

    I think I have the DMA working correctly; however, the combination of DMA and USB seems to cause problems. I toggle a pin every time the DMA interrupt is called (when it finishes transferring 1023 ADC samples). I am able to change the timing of the ADC, and I have verified this by checking the frequency of the pin toggle.

    However, when I look at the USB data coming into my LabVIEW program, it is junk and extremely slow (1 packet every ~5 seconds) when I use the DMA code. If I just use a timer and move the ADC samples manually, the data comes in quickly. Any idea why this would slow the USB transfer so horribly? I thought the DMA was supposed to operate in the background while the CPU does its thing?

  • > I've found that the USB code actually disables interrupts while it does the transfer, so a "background" DMA approach seems like the best option.

    Surely, this USB stack (v3.0.0.0) abuses __disable_interrupt() unnecessarily.
    You may replace __disable_interrupt() around transfers with clearing just the target endpoint's interrupt enable bit in USBIEPIE / USBOEPIE.


    I can't find a single mask (enable) bit for the unified USB interrupt vector (USB_UBM). If such a mask existed, ALL of the __disable_interrupt() calls in the stack could be replaced with that one. Hardware design flaw?

    Tsuneo
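    For instance (a sketch only; the endpoint number and where in the stack this would go are assumptions - USBIEPIE is the F5xx USB module's input-endpoint interrupt enable register):

```c
// Instead of a global __disable_interrupt() around the endpoint copy,
// mask only the endpoint being touched (here: a hypothetical EP1 IN).
unsigned char saved = USBIEPIE;   // save the current endpoint interrupt enables
USBIEPIE &= ~BIT1;                // mask EP1 IN only; other USB events still fire
/* ... copy the packet into the endpoint buffer ... */
USBIEPIE = saved;                 // restore the enables
```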

  • Philip McCorkle said:
    I thought the DMA was supposed to operate in the background, while the CPU does its thing?

    Not exactly. DMA only means that the CPU does not need to react to an interrupt, save registers, execute the interrupt code, etc., just to move a single data word.

    The transfer itself, however, requires access to the address and data bus just as if the CPU were doing the transfer. So each DMA transfer takes 4 CPU cycles, during which the CPU is 'frozen'. If it is a block transfer, it is a bit faster, since there is no need to synchronize to the CPU clock for each individual transfer. On burst transfers, DMA and CPU work interleaved, so the CPU continues at approximately 25% of its normal speed.

    So no, during a DMA transfer the CPU is halted. It is just that DMA transfers have no software overhead. Even if the CPU makes a memory copy in a loop, there's decrementing the loop counter, comparing for loop end, a jump, and the instruction for the actual transfer, so DMA is much faster. And if you have to wait for a trigger, the CPU would either have to busy-wait, unable to do anything else, or you would have to use interrupts, which cause stack accesses and much more. Not to mention that normally an ISR cannot be interrupted by another ISR, but a DMA transfer easily can, and it is done after 4 cycles, which is ~5 times faster than the absolute minimum ISR solution.

    So I rather think it's your program flow logic that is causing the trouble.
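    The block-transfer case mentioned above might look like this (a sketch on a second, hypothetical DMA channel; register names from msp430.h, with src, dst, and nWords as placeholders):

```c
// Software-triggered block transfer: the CPU stalls while the whole block
// moves, but with no loop-counter, compare, or branch overhead per word.
DMA1SA  = (__SFR_FARPTR)(unsigned long)src;
DMA1DA  = (__SFR_FARPTR)(unsigned long)dst;
DMA1SZ  = nWords;
DMA1CTL = DMADT_1 | DMASRCINCR_3 | DMADSTINCR_3 | DMAEN;  // block mode, inc both
DMA1CTL |= DMAREQ;                // software trigger: the entire block is copied now
```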

     

  • Ok, I've reduced the code down to a simple DMA interrupt that ONLY re-enables the DMA, and a USB loop that sends a packet when the DMA interrupt has set a PacketReady flag. The DMA is only modifying a single byte, (DMA0SZ = 1;). The USB data being sent isn't even remotely related to the DMA memory address - it's an entirely different buffer. However, when I view the data from the USB, it's garbage. I'm getting a rectangular waveform output where each change in data value is separated by 32 samples, and each value is a discrete multiple of 64. Any thoughts as to what is going on? Code file is attached.

    On another note, I have tried what you suggested, Jens. I broke my program up into two separate parts - DMA and USB - and tried each by itself. When the DMA was alone, it would trigger every ~2ms. When I had the USB alone, it would return from the packet transfer function every ~3ms. However, when I combine the two, the DMA now triggers every ~18ms and I get garbage on the USB.

    1348.main.zip.

  • Just an idea: does the USB transfer function also use DMA to copy data from RAM to the USB controller? If so, maybe the USB transfer reprograms your data-transfer DMA channel, so both transfers do something weird (e.g. 18 ms is what it takes to transfer the number of bytes set for the USB transfer, but done at the pace of your ADC sampling rate, or something similar). And the addresses would be mixed up too...

    You should check this.

  • > does the USB transfer function use DMA too to copy data from ram to the USB controller?

    Yes. The USB Descriptor Tool gives you these options:
    - Enable/disable DMA for memcpy in the USB stack
    - DMA channel number

    The tool generates this macro in descriptors.h:

    #define USB_DMA_CHAN             0x00        // Set to 0xFF if no DMA channel will be used 0..7 for selected DMA channel

    Tsuneo
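    If the stack's memcpy DMA is on the same channel as the ADC transfer (channel 0 in the macro above), one plausible fix is to move the stack to a free channel or disable its DMA entirely, e.g.:

```c
/* descriptors.h - keep the stack off the channel used for ADC sampling */
#define USB_DMA_CHAN             0xFF       // software memcpy, no DMA conflict
/* or pick an unused channel instead: */
/* #define USB_DMA_CHAN          0x02 */
```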
