Hello,
I am developing an application using the CC2530 and I would like to discuss some doubts about its ADC and DMA features. At the moment I am studying how the ADC can convert continuously during 20 ms, one period of my signal, and how the DMA can then write the conversions to memory. I have chosen a decimation rate of 512.
As I read in the CC2530 user's guide, the sampling frequency of the ADC is fixed at 4 MHz, that is, a sampling period of 0.25 µs. Then I would get 80000 samples in 20 ms, the period of my signal.
Besides, I read that the conversion time of one sample is given by the following expression: Tconv = (decimation rate + 16) × Tsampling. In my case that gives Tconv = (512 + 16) × 0.25 µs = 132 µs, so I suppose the total conversion time for all 80000 samples is 80000 × 132 µs = 10560 ms.
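Just to make my arithmetic explicit, this is exactly the calculation I am doing (plain C, nothing CC2530-specific, same numbers as above):

#include <stdio.h>

int main(void)
{
    const double t_sampling_us  = 0.25;     /* 1 / 4 MHz                */
    const unsigned decimation   = 512;      /* chosen decimation rate   */
    const unsigned long samples = 80000UL;  /* 20 ms / 0.25 us          */

    /* Tconv = (decimation rate + 16) * Tsampling */
    double t_conv_us  = (decimation + 16) * t_sampling_us;  /* 132 us   */
    double t_total_ms = samples * t_conv_us / 1000.0;       /* 10560 ms */

    printf("Tconv  = %.2f us\n", t_conv_us);
    printf("Ttotal = %.2f ms\n", t_total_ms);
    return 0;
}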
My first surprise is how large this time is. Can this be right? More than 10 seconds to convert 20 milliseconds of signal? I hope I am wrong... Could anybody tell me where I have made a mistake?
Secondly, my other doubt is related to the DMA channel transfer count, which is configured with the LEN field. That field is 13 bits at most, so it can count at most 8191 transfers, which is not enough for the 80000 transfers I need. What do you recommend? Could disarming and rearming the channel 10 times work, as in the sketch below?
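To be concrete, this is more or less what I had in mind for the rearming approach. It is only a sketch: the descriptor layout follows the DMA configuration section of the user's guide, but the struct and field names are my own, and I have left out the ADC trigger setup and the handling of the destination data.

#include <ioCC2530.h>   /* CC2530 SFR definitions (IAR/SDCC device header) */

/* One 8-byte DMA configuration descriptor, packed as in the user's
   guide; the names here are mine. */
typedef struct {
    unsigned char srcAddrH, srcAddrL;   /* source address, high byte first */
    unsigned char dstAddrH, dstAddrL;   /* destination address             */
    unsigned char vlenLenH;             /* VLEN[2:0] : LEN[12:8]           */
    unsigned char lenL;                 /* LEN[7:0]                        */
    unsigned char wordTmodeTrig;        /* WORDSIZE : TMODE : TRIG[4:0]    */
    unsigned char incIrqM8Prio;         /* SRCINC:DESTINC:IRQMASK:M8:PRIO  */
} dmaDesc_t;

#define BLOCK_LEN   8000u   /* per-block count, below the 13-bit maximum */
#define NUM_BLOCKS  10u     /* 10 blocks * 8000 = 80000 transfers        */

static __xdata dmaDesc_t dmaCh0;   /* descriptor for DMA channel 0 */

void dmaCaptureAll(void)
{
    unsigned char i;

    /* Tell the DMA controller where channel 0's descriptor lives. */
    DMA0CFGH = (unsigned char)((unsigned short)&dmaCh0 >> 8);
    DMA0CFGL = (unsigned char)((unsigned short)&dmaCh0 & 0xFF);

    for (i = 0; i < NUM_BLOCKS; i++) {
        /* Reload LEN for this block and rearm; in non-repeated mode
           the channel disarms itself when the block completes. */
        dmaCh0.vlenLenH = (unsigned char)((BLOCK_LEN >> 8) & 0x1F);
        dmaCh0.lenL     = (unsigned char)(BLOCK_LEN & 0xFF);

        DMAARM |= 0x01;               /* arm channel 0                      */
        while (!(DMAIRQ & 0x01)) ;    /* wait for the done flag (assuming   */
                                      /* IRQMASK is set in the descriptor)  */
        DMAIRQ &= ~0x01;              /* clear the channel-0 flag           */

        /* ...advance dstAddrH/dstAddrL or process the block here...        */
    }
}

With BLOCK_LEN = 8000, ten rearms would cover exactly the 80000 transfers I need, but I do not know whether samples could be lost during the rearming between blocks.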
I would very much appreciate it if you could help me with these doubts.
Lucía