
ADC-DMA CC2530 timing

Other Parts Discussed in Thread: CC2530

Hello,

I am developing an application with the CC2530 and I would like to discuss some questions related to its ADC and DMA features. At the moment I am studying how the ADC can convert for 20 ms, one period of my signal, and how the DMA can then write the conversions to memory. I have chosen a decimation rate of 512.

According to the CC2530 user's guide, the sampling frequency of the ADC is 4 MHz, i.e. a sampling period of 0.25 µs. That would give 80000 samples in 20 ms, the period of my signal.

I also read that the conversion time of one sample is given by the following expression: Tconv = (decimation rate + 16) * Tsampling. In my case this gives a conversion time of (512 + 16) * 0.25 µs = 132 µs, so I suppose the total conversion time for all 80000 samples is 80000 * 132 µs = 10560 ms.
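
For reference, these numbers can be reproduced with a short calculation (just an illustrative sketch; the constant names are made up and are not CC2530 register names):

    #include <stdio.h>

    int main(void)
    {
        const double        t_sample_us = 0.25;      /* 1 / 4 MHz internal ADC clock  */
        const unsigned      dec_rate    = 512;       /* chosen decimation rate        */
        const unsigned long n_samples   = 80000UL;   /* 20 ms / 0.25 us               */

        double t_conv_us  = (dec_rate + 16) * t_sample_us;    /* per-conversion time  */
        double t_total_ms = n_samples * t_conv_us / 1000.0;   /* time for all samples */

        printf("Tconv  = %.1f us\n", t_conv_us);     /* prints 132.0 us               */
        printf("Ttotal = %.1f ms\n", t_total_ms);    /* prints 10560.0 ms             */
        return 0;
    }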

My first surprise is how large this time is. Is this right? More than 10 seconds to convert a 20 ms period? I hope I am wrong... Could anybody tell me where I have made a mistake?

My second question is related to the DMA channel transfer count, which is configured with the LEN parameter. This field is at most 13 bits wide, which means it can count up to 8192 transfers. That is not enough for the 80000 transfers I need. So, what do you recommend? What could be the solution? Would disarming and rearming the channel 10 times work?

I would appreciate it very much if you could help me with these questions,

Lucía

  • Hi Lucia,

    I think you have misunderstood the concept of sampling rate for the ADC. The 4 MHz sampling rate is the internal sampling rate (or operating frequency) of the ADC, but it is not possible to produce ADC conversion results at that rate. In order to produce a reliable result, the measurements must be decimated by averaging, which is taken care of by the hardware. This is controlled by the decimation rate, which gives you a trade-off between conversion time and accuracy.

    Note that the conversion time will give you the maximum rate that you can use for the ADC conversions. Normally, you will use a lower rate than that, depending on the need in your application. This can be controlled by the trigger to the ADC, for instance using Timer 1.

    You are correct that getting 80000 samples will take at least 10.56 s when you use the maximum decimation rate of 512. But what are you planning to do with 80000 samples? The RAM size of the CC2530 is 8 KB, so there is no way you can store that amount of data. This also addresses your issue with the maximum DMA transfer length.

    You said that your signal has a period of 20 ms. I am not sure what you mean by this; whether it is (1) a periodic signal with a 20 ms period that you want to monitor continuously, (2) that you want to do a conversion every 20 ms, or (3) that you want to monitor your signal for 20 ms at a time. In case of (1), I would expect that you want to use some oversampling compared to the 20 ms period (this will depend a lot on the nature of the signal and the application; we would need more details in order to give more specific advice). For instance, if you need to oversample your signal by a factor of 8, you should set up the ADC to do a conversion every 2.5 ms using Timer 1. In case of (2), you should just set up the ADC to do a conversion every 20 ms using Timer 1. In case of (3), we would need more information on your application in order to advise you.
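
    For reference, selecting Timer 1 as the conversion trigger might look roughly like this (a minimal sketch with the SFR names from TI's ioCC2530.h header; AIN0/P0.0, the AVDD5 reference and the 512 decimation rate are only example choices and should be adapted to your application):

        #include <ioCC2530.h>

        /* Example: let Timer 1 channel 0 compare events trigger ADC conversions
           of a single channel (AIN0), decimation rate 512.                      */
        static void adc_init(void)
        {
            APCFG   = 0x01;           /* only P0.0 enabled as an analog input            */
            ADCCON2 = (0x2 << 6)      /* SREF = 10: AVDD5 as reference (example choice)  */
                    | (0x3 << 4)      /* SDIV = 11: decimation rate 512                  */
                    | 0x0;            /* SCH  = 0 : conversion sequence ends at AIN0     */
            ADCCON1 = (ADCCON1 & ~0x30) | 0x20;  /* STSEL = 10: Timer 1 ch 0 compare     */
        }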

    I hope this helps.

  • Hi Hec,

    Thank you very much for answering me. Your post has been very useful.

    I think my application fits your third option. I am interested in getting the RMS value of a periodic signal whose period is 20 ms. I would like to sample the signal continuously for 20 ms, but it is enough to monitor just one period, then stop, and start the same procedure again one second later.

    That is why I was interested in knowing how many samples I can get during 20 ms in order to process them. Now, thanks to your help, I know that with a decimation rate of 512 the conversion time is 132 µs, which means that in 20 ms I will get at most 20 ms / 132 µs ≈ 151 conversions.

    So, if I am not wrong, I can set the DMA LEN parameter to 151 to transfer a sequence of 151 conversions. After all the transfers are done, I am thinking of disarming the channel by writing a 1 to the DMAARM.ABORT register bit and also stopping the ADC to save power. Do you think this would be a good solution?

    Thanks for helping me,

    Lucía

  • I am glad that my post was helpful. You could do the operation mostly as you suggest; that would mean setting ADCCON1.STSEL to 01 (binary). However, I would suggest using Timer 1 for timing the conversions, provided that Timer 1 is available in your system (not used for other purposes). In that case, I would suggest making 128 conversions over one 20 ms period. There are two reasons for this:

    1. I am not sure whether the timing of repeated conversions is completely accurate to the cycle. If not, you will not get exactly one 20 ms period when using 151 samples (and there is already a rounding error in the number 151).
    2. With 128 conversions, calculations on the result may be easier, since dividing by 128 can be done with a shift operation, which is faster than a division (see the sketch after this list).
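
    For example, with 128 samples the averaging step in an RMS calculation can use a 7-bit right shift (a rough sketch; adc_buf and compute_rms are made-up names, and any DC offset removal is left out):

        #include <stdint.h>
        #include <math.h>

        #define NUM_SAMPLES 128u                 /* power of two, so the divide is a shift */

        /* RMS of one 20 ms window. adc_buf[] is assumed to hold the raw ADCH:ADCL
           words written by the DMA; with decimation rate 512 the 12 significant
           bits are MSB-aligned, hence the shift by 4.                             */
        static uint16_t compute_rms(const int16_t adc_buf[NUM_SAMPLES])
        {
            uint32_t sum_sq = 0;
            uint8_t  i;

            for (i = 0; i < NUM_SAMPLES; i++) {
                int16_t s = adc_buf[i] >> 4;              /* 12-bit signed sample           */
                sum_sq += (uint32_t)((int32_t)s * s);     /* 128 * 2048^2 fits in 32 bits   */
            }

            /* Dividing by 128 is a 7-bit right shift instead of a division. */
            return (uint16_t)sqrt((double)(sum_sq >> 7));
        }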

    If you are to use Timer 1, you should set Timer 1 to run in modulo mode without prescaling (T1CTL = 0x02), and you should set T1CC0H:T1CC0L to 4999 (not 5000, as the timer counts from 0 to T1CC0H:T1CC0L inclusive), i.e. T1CC0L = 0x87, T1CC0H = 0x13. You should set T1CCTL0 to 0x04 to select compare mode with the interrupt disabled (you don't need the Timer 1 interrupt in this case).
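
    In code, that Timer 1 setup could look roughly like this (a minimal sketch with the SFR names from TI's ioCC2530.h header; it assumes the timer tick is 32 MHz so that 5000 ticks correspond to 156.25 µs, i.e. 128 compare events per 20 ms):

        #include <ioCC2530.h>

        /* Timer 1: modulo mode, no prescaling, period = 5000 ticks.
           With a 32 MHz timer tick this gives 156.25 us per compare event,
           i.e. 128 ADC triggers over one 20 ms signal period.               */
        static void timer1_init(void)
        {
            T1CC0L  = 0x87;        /* T1CC0 = 4999 (timer counts 0..T1CC0 inclusive) */
            T1CC0H  = 0x13;
            T1CCTL0 = 0x04;        /* channel 0 compare mode, interrupt disabled     */
            T1CTL   = 0x02;        /* tick/1 prescaler, modulo mode -> timer runs    */
        }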

    For the DMA, you do not need to use DMAARM.ABORT. You should set up the DMA to single mode and arm it before you start the ADC. The count would be set to 151 or 128, depending on what ADC trigger method you choose. The DMA will stop and disarm automatically when this number of conversions has been made. You will then get a DMA interrupt, and you can use this interrupt to stop the ADC and Timer 1 (if used).
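
    A rough outline of that flow is sketched below (untested illustration only: the 8-byte descriptor layout, the ADC DMA trigger number and the IAR-style interrupt declaration should be checked against the CC2530 user's guide and your compiler; adc_buf, dma_cfg and the function names are made up):

        #include <ioCC2530.h>
        #include <stdint.h>

        #define NUM_SAMPLES 128u

        static int16_t adc_buf[NUM_SAMPLES];   /* one ADCH:ADCL word per conversion      */
        static uint8_t dma_cfg[8];             /* 8-byte DMA configuration, channel 0    */

        static void adc_dma_arm(void)
        {
            uint16_t src = 0x70BA;                     /* ADCL/ADCH as XDATA-mapped SFRs  */
            uint16_t dst = (uint16_t)adc_buf;

            dma_cfg[0] = (uint8_t)(src >> 8);          /* SRCADDR                         */
            dma_cfg[1] = (uint8_t)(src & 0xFF);
            dma_cfg[2] = (uint8_t)(dst >> 8);          /* DESTADDR                        */
            dma_cfg[3] = (uint8_t)(dst & 0xFF);
            dma_cfg[4] = 0x00;                         /* VLEN = 000 (use LEN), LEN[12:8] */
            dma_cfg[5] = (uint8_t)NUM_SAMPLES;         /* LEN = 128 transfers             */
            dma_cfg[6] = 0x80 | 14;                    /* 16-bit words, single mode,
                                                          trigger 14 = ADC result ready   */
            dma_cfg[7] = 0x19;                         /* SRC fixed, DEST +1 word,
                                                          DMA interrupt enabled           */

            DMA0CFGH = (uint8_t)((uint16_t)dma_cfg >> 8);   /* point channel 0 at dma_cfg */
            DMA0CFGL = (uint8_t)((uint16_t)dma_cfg & 0xFF);

            DMAIRQ &= ~0x01;                           /* clear any old channel 0 flag    */
            DMAARM |= 0x01;                            /* arm channel 0 before the ADC    */
            IEN1   |= 0x01;                            /* enable the DMA interrupt        */
            EA      = 1;
        }

        /* After 128 transfers the channel disarms itself and this ISR fires;
           stop the trigger source so no further conversions are started.     */
        #pragma vector = DMA_VECTOR
        __interrupt void dma_isr(void)
        {
            DMAIRQ &= ~0x01;     /* clear the channel 0 interrupt flag                    */
            T1CTL  &= ~0x03;     /* stop Timer 1 -> the ADC gets no more triggers         */
            /* adc_buf[] now holds one 20 ms window, ready for the RMS calculation */
        }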

    Good luck with your implementation!

  • Hi Hec,

    I am working on something similar to Lucia: I am trying to measure a voltage over a 20 ms period with a sampling frequency of 4 kHz. But I don't know how to trigger a conversion using Timer 1 without an interrupt routine; I have probably missed something. Can you suggest where to find that?

    Thanks,

    Vukadin.

  • I found the answer... ADC conversion sequences.