
What is the highest-bandwidth sensor signal that can be input to the MSP430?

Other Parts Discussed in Thread: MSP430F1611

Hi, I have a question about the MSP430 in sensor applications.

What is the highest bandwidth sensor signal that can be input into MSP430?

I know the MSP430 is an 8 MHz microcontroller. Is this related to the sampling rate?

Also, how does this change with the number of sensor signals?

What will change if the number of sensors increases?

 

Can anyone give me some guidance or hints on these questions?

Thanks a lot!

 

  • Hi Charly,

    Please provide information on the type of sensors you intend to connect to the MSP430.

    There are parts available with a main clock of up to 25 MHz. Since the MSP430 has a 16-bit architecture, this gives you very good performance, even in DSP-like applications (i.e., multiply-accumulate).
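
    To give a feel for the DSP-like use case, here is a rough multiply-accumulate sketch (my own illustration, not a TI example), assuming a part like the F16x that has the hardware multiplier; register names are from the standard device header:

        /* Sketch: 16-bit signed multiply-accumulate using the MSP430 hardware
         * multiplier (MACS/OP2/RESLO/RESHI). Overflow into SUMEXT is ignored
         * here for brevity. */
        #include <msp430.h>

        long mac16(const int *a, const int *b, unsigned int n)
        {
            unsigned int i;

            RESLO = 0;                   /* clear the 32-bit accumulator     */
            RESHI = 0;
            for (i = 0; i < n; i++) {
                MACS = a[i];             /* load operand 1, signed MAC mode  */
                OP2  = b[i];             /* writing operand 2 starts the MAC */
            }
            return ((long)RESHI << 16) | RESLO;
        }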

    Rgds
    aBUGSworstnightmare 

  • Hello aBUGSworstnightmare,

    Actually, let's suppose that we have a TelosB that runs an MSP430. The question, if I understand Charly correctly, is how the bandwidth changes with the number of sensor signals. So, aBUGSworstnightmare, you are saying that it also depends on the clock? Can you give an example to show how the different clocks (e.g. 8, 16, 25 MHz) influence the bandwidth, and how the number of sensors influences it as well?

    Cheers,

    James

  • Hi James,

    I have to quote myself and ask again: what type of sensor do you need to connect?

    Rgds
    aBUGSworstnightmare 

  • If you are using an external converter, the maximum rate heavily depends on the type of interface being used. Is it a serial or parallel interface using 8-bit, 16-bit, or 24-bit samples? Does the MSP you are using support direct memory access (DMA), and are you using it?

    If you are using the MSP's integrated ADC, the maximum sample rate is usually given in the ADC section of the data sheet or user's guide, and the maximum bandwidth is half the maximum sample rate. For MSPs with a SAR converter, the maximum sample rate is normally stated outright as a number; for MSPs with a delta-sigma converter, the maximum rate is a function of the modulator rate and the oversampling ratio, and is usually given as a formula in the user's guide. A rough numeric illustration is sketched below.
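
    As a rough numeric illustration (plain C, just arithmetic; the 200 ksps, modulator clock, and oversampling numbers are examples, not values for any particular device):

        /* Nyquist: usable bandwidth is at most half the sample rate.
         * Delta-sigma: output data rate = modulator clock / oversampling ratio. */
        #include <stdio.h>

        int main(void)
        {
            double f_sample  = 200e3;            /* example SAR rate: 200 ksps    */
            double bandwidth = f_sample / 2.0;   /* -> 100 kHz usable bandwidth   */

            double f_mod  = 1.0e6;               /* example delta-sigma modulator */
            double osr    = 256.0;               /* example oversampling ratio    */
            double f_data = f_mod / osr;         /* -> ~3.9 ksps output data rate */

            printf("SAR bandwidth: %.0f Hz\n", bandwidth);
            printf("Delta-sigma data rate: %.1f sps\n", f_data);
            return 0;
        }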

  • Hi aBUGSworstnightmare ,

    Suppose we just use the humidity/temperature sensors on the Telos mote.

    How does the bandwidth change with the number of sensors?

    Thanks!

     

    Charly

  • For the Telos's MSP430F1611, an 8 MHz processor with a 12-bit SAR ADC:

    I found some literature saying that the max sample rate is 200 ksps.

    According to the user's guide, is it possible to achieve:

    sample time (with R_S = 0) ≈ 2 kΩ × 9.011 × 40 pF + 800 ns ≈ 1520 ns ≈ 13 ADC clocks

    8 MHz / (13 + 13) ≈ 307 ksps sample rate (≈153 kHz bandwidth)?
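
    Spelled out as a quick sanity check (plain C, just the arithmetic above; the 8 MHz ADC12CLK and zero source resistance are assumptions on my part, and the 200 ksps datasheet limit would still apply):

        /* Back-of-the-envelope check of the sample-time and rate numbers above.
         * Assumes R_source = 0 ohm, C_sample = 40 pF, the 800 ns term from the
         * ADC12 user's guide formula, and an 8 MHz ADC12CLK. */
        #include <stdio.h>
        #include <math.h>

        int main(void)
        {
            double f_adc12clk  = 8e6;                              /* assumed ADC12 clock  */
            double t_sample    = 2000.0 * 9.011 * 40e-12 + 800e-9; /* ~1.52 us             */
            double sample_clks = ceil(t_sample * f_adc12clk);      /* -> 13 ADC12CLK ticks */
            double conv_clks   = 13.0;                             /* 12-bit conversion    */
            double f_rate      = f_adc12clk / (sample_clks + conv_clks);

            printf("t_sample ~ %.0f ns (%.0f clocks)\n", t_sample * 1e9, sample_clks);
            printf("rate ~ %.0f ksps, bandwidth ~ %.0f kHz\n", f_rate / 1e3, f_rate / 2e3);
            return 0;
        }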

    Is it correct?

    What will happen if the number of sensors increases?

    Does the conversion time increase and the sample rate decrease?

    What other aspects need to be considered?

     

    Thanks!

  • Charly, I think this applies to the 1xx series (MSP430F1xx), which has an 8 MHz processor. For the 5xx series I think you have 25 MHz, so the sample rate is something like 1 Msps, I think. If I'm not right, please, an expert, correct me.

    James

  • Charly,

    You are correct that with zero-ohm source resistance into the port pin you can achieve a sample time of about 13 clocks, plus another 13 clocks for the conversion. One note: a 13-clock sample-and-hold is not an option; only 4, 8, 16, ..., 1024 clocks are available (ADC12CTL0 register), so you would have a minimum clock count of 29 ticks (16 + 13).

    If you increase the number of sensors, even though there are multiple inputs there is only one ADC (see the appropriate ADC block diagram for your device), so you can only do one conversion at a time. For 2 sensors, each sensor gets 29 clocks, for a total of 58 clocks to sample both channels (3 sensors is 3 × 29 ticks, and so on). On top of this there is the hardware interrupt and software latency, which further reduces the maximum bandwidth.

    If you are looking for maximum speed over accuracy, you can choose the 16-tick sample-and-hold time, but if accuracy is more important, I would suggest the 32-tick sample-and-hold. A configuration along these lines is sketched below.
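
    A minimal configuration sketch along these lines (my own illustration; the two analog channels A0/A1 are chosen arbitrarily, register and bit names are from the standard MSP430F1611 header, and the ISR syntax is the IAR/CCS style):

        /* ADC12 sequence-of-channels on two inputs with a 16-clock sample-and-hold.
         * Per channel: 16 S&H + 13 conversion = 29 ADC12CLK ticks, so a two-channel
         * sequence takes about 58 ticks (plus interrupt/software latency). Use
         * SHT0_3 instead of SHT0_2 for a 32-clock sample time if accuracy matters
         * more than speed. */
        #include <msp430.h>

        volatile unsigned int sensor[2];

        void adc_init(void)
        {
            P6SEL |= BIT0 | BIT1;                 /* A0/A1 as analog inputs            */
            ADC12CTL0 = SHT0_2 | MSC | ADC12ON;   /* 16-clock S&H, auto-sequence, on   */
            ADC12CTL1 = SHP | CONSEQ_1;           /* sampling timer, sequence of chans */
            ADC12MCTL0 = INCH_0;                  /* MEM0 <- A0                        */
            ADC12MCTL1 = INCH_1 | EOS;            /* MEM1 <- A1, end of sequence       */
            ADC12IE = BIT1;                       /* interrupt when MEM1 (last) ready  */
            ADC12CTL0 |= ENC;                     /* enable conversions                */
        }

        void adc_trigger(void)                    /* call with GIE enabled elsewhere   */
        {
            ADC12CTL0 |= ADC12SC;                 /* software start of one sequence    */
        }

        #pragma vector = ADC12_VECTOR
        __interrupt void ADC12_ISR(void)
        {
            sensor[0] = ADC12MEM0;                /* reading MEMx clears its IFG bit   */
            sensor[1] = ADC12MEM1;
        }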

    Sparkchaser
