
CC3200: CC3200 Testing to match spec

Part Number: CC3200
Other Parts Discussed in Thread: CC3100

Hello -

I am trying to test the transmit output power of the CC3200 to see if I can match the results in the spec. My setup is a CC3200 cabled directly to a spectrum analyzer, which I am using to measure the power of a 54 Mbps OFDM signal from the CC3200. Integrating the power over the signal's 20 MHz bandwidth on the spectrum analyzer, I measure -4 dBm, but the spec says I should be seeing +14.5 dBm. Any reason for this huge difference?
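For reference, integrating channel power on a spectrum analyzer means summing the per-bin powers in linear units (mW), not adding dBm values directly. A minimal sketch of that integration; the per-bin level and bin count here are illustrative, not taken from this measurement:

```python
import math

def channel_power_dbm(bin_powers_dbm):
    """Total channel power from per-bin SA readings.

    dBm values cannot be summed directly: convert each bin to mW,
    add, and convert the total back to dBm.
    """
    total_mw = sum(10 ** (p / 10.0) for p in bin_powers_dbm)
    return 10 * math.log10(total_mw)

# Hypothetical flat signal: 200 bins (20 MHz span / 100 kHz RBW) at -27 dBm each
# integrates to -27 + 10*log10(200), i.e. about -4 dBm total
print(round(channel_power_dbm([-27.0] * 200), 1))  # -4.0
```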

Thanks

  • Hi,

    How are you generating the 54 Mbps OFDM signal? Are you using the radio tool?

    What are the power settings?

    Please provide more information about your setup.

    Regards,

    Charles O

  • I am generating the 54 Mbps OFDM signal using the radio tool, version 1.1 fix; the service pack on the CC3200 is 1.0.0.10.0.
    The power setting is 0. I have attached a 500 MHz to 2500 MHz filter between the CC3200 and the spectrum analyzer, so the chain is CC3200 -> filter -> spectrum analyzer.
  • Daniel -
    First, please use the latest firmware SDK and SP (found here ==> www.ti.com/.../cc3200sdk).
    Second, check the impedance of the filter, and also sweep the filter and the cables to see what their combined loss is; then check whether it adds up to the loss you are seeing.

    Also, have you tried connecting directly, to eliminate the extra component (while still sweeping your cable)?
  • Hello Josh,

    I will try the latest SDK and SP, but I am skeptical that it accounts for this difference, as these low power outputs are seen across all rates.

    The filter only accounts for around 1.4 dB of attenuation, and the cable is probably 1 dB at most. That still does not explain the ~18.5 dB shortfall from spec (since my integrated power output over 20 MHz at 54 Mbps is ~-4 dBm).

    Thanks,

    Daniel Cho

  • Also, since I am configuring and running the CC3200 using the radio tool GUI, do I need the SDK?
    Don't I just need a radio tool GUI version that is compatible with the service pack I have flashed onto my CC3200?
  • Daniel -
    You need to load both the service pack and the radio tool application binary. Based on your description above, I doubt the radio is turning on correctly without the application binary loaded. Also, guessing at the loss is not OK if you want exact knowledge of what you are doing; you are probably close, but I would still recommend checking so you know for sure. What you are describing (I think) is simply either an open, or the transmitter not turning on.

     
    C:\ti\CC3100_CC3200_RadioTool_1.2\CC3200BoardApplication_Binary (this is the MCU image)
    C:\ti\CC3100_CC3200_ServicePack_1.0.1.11-2.9.0.0 (service pack binary to load, too)

    steps and further details are here:
     

  • Hi Daniel,

    - Can you compare results with CC3200 LaunchPad? Maybe you have some other systematic error inside your measurement chain.
    - Are you sure that your impedance matching is correct?

    Jan
  • I did load the service pack and the radio tool application binary to the CC3200s. The versions for both are given in the comments above.

    The settings on my spectrum analyzer are:

    center freq - 2442 MHz

    Integrated BW span - 20 MHz

    Res BW - 100 kHz

    sweep - 5s

    I had been getting those low power output values because I was integrating over the average of the signal; if I integrate over the peak values of the signal, I get ~14.3 dBm for 54 Mbps, which I am happy with, as the spec states 14.5 dBm as the typical transmit power output. So are the peak values of the signal what I am supposed to be integrating over 20 MHz, or the average values? Also, is the 14.5 dBm transmit power output on the spec sheet for a 100 byte packet? If I transmit a 1400 byte packet at 54 Mbps, I get a power output reading of ~18 dBm (peak values of the signal).

    I am trying to compare the results with the CC3200 LaunchPad spec sheet and am getting values below the spec sheet data. There is no documentation on the test setup, so it is difficult for me to reproduce the results TI has on the spec sheet for the CC3200.

    I am not an RF guy so I am a bit confused by what you mean by "impedance matching."
    If the CC3200 is directly cabled to the spectrum analyzer input, isn't the impedance matched at the input of the spectrum analyzer?

    My entire test is mainly to check how closely the CC3200s that we have match the RX sensitivity values in the spec. I am trying to match the TX power output as a first step, because if I can't get the transmit power output given in the spec, I won't be able to get the RX sensitivity value.

    For 54 Mbps, the spec states a transmit power output of 14.5 dBm and an RX sensitivity of -74 dBm (for a 10% PER). So with 88.5 dB of attenuation, I should be seeing a 10% packet error rate; instead, I see 10% PER at around 81 dB of attenuation.
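    The expectation in the previous paragraph is just a link-budget subtraction; a sketch using the datasheet numbers quoted above:

```python
def expected_rx_dbm(tx_power_dbm, attenuation_db):
    """Received level = transmitted level minus total path loss (in dB)."""
    return tx_power_dbm - attenuation_db

# Datasheet figures for 54 Mbps: +14.5 dBm TX, -74 dBm sensitivity (10% PER).
# The attenuation that should just reach the 10% PER threshold:
threshold_db = 14.5 - (-74.0)        # 88.5 dB
assert threshold_db == 88.5
assert expected_rx_dbm(14.5, threshold_db) == -74.0
```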

    Should the RSSI and the 20 MHz integrated BW power measurement be the same? If my spectrum analyzer measures -20 dBm over 20 MHz (which is the entire signal), should the RSSI on the CC3200 receiver read -20 dBm as well?

    I have the CC3200 (receiver) and the spectrum analyzer connected together through a power splitter (all attenuation is accounted for in the calculations).
  • Hi Daniel,

    I hope all is well. The data sheet power measurements were done with a burst mode power meter, not a spectrum analyzer. The reason is that even using the Radio Tool in continuous mode we cannot get a 100% duty cycle, so you need to measure the power in burst mode. With the spectrum analyzer in your setup, what you see (as you stated) is the peak power, and the SA must be put in max hold. If you look at any of our certification reports you will see the peak powers as well, for comparison.

    Note that, as with our DS, the certification house will use a burst power meter.
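    The duty-cycle point can be made concrete: an analyzer that averages over time folds the off-time between bursts into the reading, lowering it by 10*log10(duty cycle) relative to the in-burst power. A sketch; the 50% duty cycle is purely illustrative, not a CC3200 figure:

```python
import math

def average_reading_dbm(burst_power_dbm, duty_cycle):
    """Time-averaged power reading of a bursty signal.

    Averaging over the off-time reduces the reading by
    10*log10(duty_cycle) relative to the in-burst power.
    """
    return burst_power_dbm + 10 * math.log10(duty_cycle)

# A +14.5 dBm burst at an assumed 50% duty cycle averages to ~+11.5 dBm
print(round(average_reading_dbm(14.5, 0.5), 1))  # 11.5
```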

    I hope this helps.

    Thanks,
    Riz
  • Hi Daniel,

    I will try to answer your questions below:

    1) I am not an RF guy so I am a bit confused by what you mean by "impedance matching."
    If the CC3200 is directly cabled to the spectrum analyzer input, isn't the impedance matched at the input of the spectrum analyzer?

    Yes, you are correct. If you are connecting at the U.FL connector, then the output of the CC3200 / BPF is 50 ohms at 2.45 GHz, so if you use an RF cable to connect to the test equipment there is nothing you need to do. When you connect to an antenna, the parasitics of the PCB, etc., can affect the overall impedance, and thus you use a matching network to get back to the optimal 50 ohm impedance of the antenna.

    2) For the 54 Mbps, the spec states that its transmit power output is 14.5 dBm and its RX sensitivity is -74 dBm (for a 10% PER). So if there is an 88.5 dB of attenuation, I should be seeing a 10% packet error rate. I am seeing a 10% PER at around 81 dB of attenuation.

    The TX and RX performance numbers are not related; that is, the TX of 14.5 dBm has nothing to do with the -74 dBm receiver characteristic. The CC3200 goes through a power amplifier to generate the 14.5 dBm output power; on the other hand, there is an LNA within the part that receives the signal, so the two are not related. What we mean by -74 dBm sensitivity is the level at the CC3200 input at which the worst-case PER will be 10%. So, as an example, if you provide a calibrated signal of -74 dBm to the input of the CC3200 and measure PER, it should be no higher than 10%. You may get 5%, for example; in that case you can further reduce the input power to the CC3200 until 10% PER is reached. That may be -76 dBm, for example.
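    The procedure described above (lower the calibrated input until the error-rate limit is hit) can be sketched as a simple search; `measure_per` is a hypothetical stand-in for driving the signal source and reading PER back from the Radio Tool:

```python
def find_sensitivity_dbm(measure_per, start_dbm=-70.0, step_db=0.5, limit=10.0):
    """Step the input level down until the PER limit (%) is reached."""
    level = start_dbm
    while measure_per(level) < limit:
        level -= step_db
    return level

# Toy PER model crossing 10% at -76 dBm, echoing the example in the text:
toy_per = lambda dbm: 5.0 if dbm > -76.0 else 10.0
assert find_sensitivity_dbm(toy_per) == -76.0
```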

    3) Should the RSSI and the 20 MHz integrated BW power measurement be the same? If my spectrum analyzer measures -20 dBm over 20 MHz (which is the entire signal), should the RSSI on the CC3200 receiver read -20 dBm as well?

    RSSI is a receiver measurement. If you use our Radio Tool for the SimpleLink family, you can generate a Wi-Fi signal with a signal source, then use the GUI of the Radio Tool to set the CC3200 in receiver mode and measure RSSI. It cannot be done with the spectrum analyzer.

    I hope this helps.

    Thanks,
    Riz
  • Thank you for the insight that a burst mode power meter was used; it was very helpful. I am now getting consistent measurements between the RSSI value reported and the spectrum analyzer's burst mode power measurement (about a 1 dB difference between the two), but they are both still off the spec. With attenuation accounted for, both the CC3200 receiver and the spectrum analyzer's burst power function show the CC3200 transmitter transmitting at +12 dBm, compared to the +14.5 dBm stated on the spec sheet.

    The RX sensitivity is also off. Both the CC3200 receiver (RSSI value) and the spectrum analyzer burst power function show a 10% packet error rate at -71 dBm, when the spec sheet states the 10% packet error rate should be seen at -74 dBm.

    Any insight as to why there is this offset of approximately 3 dB?
  • Here is the setup that I currently have for testing:
    CC3200 transmitter -> variable attenuator -> power splitter -> CC3200 receiver and spectrum analyzer
    With that setup, I am measuring the RSSI value of the CC3200 receiver and the power burst measurement on the spectrum analyzer.

    For 2), I do not understand how the TX power and the RX sensitivity are not related. If I am sending 1000 packets from one CC3200 to another at 54 Mbps, shouldn't 100 of the packets be dropped when there is 88.5 dB of attenuation between the two CC3200s? With the two CC3200s connected in the setup above, I get the 10% packet error rate at -71 dBm instead of -74 dBm, meaning that the RSSI shown on the CC3200 receiver is -71 dBm at 10% packet error. Shouldn't the RSSI read -74 dBm at 10% packet error?
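    For what it's worth, the PER arithmetic in the 1000-packet example is straightforward; a trivial sketch:

```python
def per_percent(packets_sent, packets_received):
    """Packet error rate as a percentage of packets sent."""
    return 100.0 * (packets_sent - packets_received) / packets_sent

# 100 of 1000 packets dropped -> 10% PER, the datasheet threshold condition
assert per_percent(1000, 900) == 10.0
```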

    For 3), I am using the spectrum analyzer purely for burst power measurement of the transmitter. My spectrum analyzer has burst power measurement capability, and another TI employee stated that the spec data was measured using a burst power meter, so I do not understand why my spectrum analyzer cannot be used to measure the power output of the CC3200 transmitter. Should my spectrum analyzer burst power measurement be the same as the RSSI value? If I take only the peak values of the signal, I'm off by 1 dB from the RSSI, but if I take the average values, I'm off by about 7 dB. Were only the peak values of the signal taken during testing?
  • What test setup was used for the RX sensitivity and the TX transmit power? I am unable to find an instrument that can reach values lower than -70 dBm. How were you able to measure the sensitivity for all the rates, given that 1 Mbps DSSS goes as low as -94.7 dBm?

    The test setup, as well as the configuration of the instrument / radio tool (CC3200) used for testing, would be greatly appreciated, as I cannot replicate what is on the datasheet.
  • Hello Riz,

    I was able to get measurements very close to the RSSI values reported by the CC3200 receiver and to the RX sensitivity (for 10% PER) listed on the datasheet, using the burst power measurement function on my spectrum analyzer. An interesting thing I found is that the peak burst power measurements from the spectrum analyzer match the RSSI values reported by the CC3200 receiver very closely, while the average burst power measurements match the RX sensitivity values on the datasheet very closely. There is approximately a 6 dB difference between the peak and average burst power measurements.

    e.g., for 36 Mbps, the datasheet states that 10% PER (RX sensitivity) will occur at -80.5 dBm.

    Experimentally, the 10% PER happens when the average burst power measurement from my spectrum analyzer is approximately -80 dBm, while the peak burst power measurement is -74 dBm. So the 10% PER happens when the RSSI on the CC3200 receiver reads -74 dBm, since the RSSI on the CC3200 receiver matches the peak burst power measurement from my spectrum analyzer.

    Why is it that the peak measurement matches the RSSI reported by the CC3200, while the average measurement matches the RX sensitivity?

    Thanks,

    Daniel Cho

  • Hi Daniel,

    I will try and address your questions summarized here:

    1) I am measuring 12dBm when the DS states +14.5dBm

    Are you using packetized or continuous mode? What channel are you measuring? Note that there is around a 1 dB difference between packetized and continuous mode (packetized will be higher). Also, there is ~2 dB variation in output power from channel to channel; the one shown in the DS will be the highest. There is also a +/-1.5 dB variation from part to part.

    2) The RX sensitivity is off:

    The U.FL connector is not a very good connector for measuring low powers. I suggest you use an RF coax connection to do this sensitive measurement; that usually explains the 3-4 dB difference.

    3) I do not understand why TX power and RX sensitivity are not related.

    To be clear, the TX value of the CC3200 is not related to the RX sensitivity value. As mentioned in my previous post, you are not transmitting and receiving at the same time, so the TX and RX of the part are not related; they are independent paths in the radio. Of course, the level you are receiving (TX from some other device or signal generator) is required on the receive side, so in that sense they are related, but the TX and RX specs of the CC3200 are not. In your setup you are adding isolation or loss to the receive side; that is fine, but don't mix up the CC3200 in RX mode with the TX of the same part.

    4) I am using the spectrum analyzer for burst power measurements not a power meter.

    As our measurements were done with a burst power meter, I cannot comment on why the spectrum analyzer cannot be used. If you are concerned with matching exactly, I suggest you use a burst power meter.

    5) Should my spectrum analyzer burst power measurements be the same as the RSSI value?

    I don't think this comment is relevant. From your setup it looks like you have two radios in TX mode and one in RX mode. Just because you set the TX on one with attenuation, and measure the TX on the spectrum analyzer and the RX on the other, does not mean they are the same; the measurements should be done independently.

    6) Peak versus average:

    OFDM signals have a peak-to-average ratio (PAR), so the peak and average measured values will be different. The PAR varies from 6 to 9 dB depending on the modulation.
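    The PAR effect can be illustrated with a toy baseband simulation. This is not the CC3200 waveform, just a random-QPSK OFDM stream showing that independent subcarriers add up to peaks several dB above the mean power:

```python
import cmath
import math
import random

random.seed(0)
N_SC = 64        # subcarriers per OFDM symbol
N_SYM = 100      # symbols to simulate

def ifft(bins):
    """Naive inverse DFT (fine for a 64-point toy example)."""
    n = len(bins)
    return [sum(x * cmath.exp(2j * cmath.pi * k * t / n)
                for k, x in enumerate(bins)) / n
            for t in range(n)]

samples = []
for _ in range(N_SYM):
    # Unit-power QPSK symbol on each subcarrier
    qpsk = [complex(random.choice((-1, 1)), random.choice((-1, 1))) / math.sqrt(2)
            for _ in range(N_SC)]
    samples.extend(ifft(qpsk))

powers = [abs(s) ** 2 for s in samples]
papr_db = 10 * math.log10(max(powers) / (sum(powers) / len(powers)))
print(f"PAPR over {N_SYM} symbols: {papr_db:.1f} dB")  # typically high single digits
```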

    7) What is the RX sensitivity measurement set-up?

    We use the Radio Tool with the CC3200 in RX mode. We then use an IQview, which can generate the modulated signal, and adjust the level from the IQview until the Radio Tool RX gives us the appropriate RX signal and PER.

    8) What is the TX setup?

    We use the Radio Tool to transmit the signal, and the IQview in receive mode to get the statistics of the signal.

    9) Why does the peak measurement match the RSSI output by the CC3200, while the average measurement matches the RX sensitivity?

    Again, see 6 and 7 above; the measurements are independent.

    I hope this helps.

    Thanks,

    Riz

  • Hello Riz,

    Thank you for the reply! Although there seem to be some misunderstandings.

    1) I am measuring on channel 7 in continuous mode. I guess with all those variations, and the fact that continuous mode is 1 dB lower than packetized, it makes sense.

    2) From your response, I gather that the U.FL connector is okay for high power measurements. But the values I receive at the receiver end are consistent with the attenuation set between transmitter and receiver at all power levels, so the 3 dB difference I see between the RSSI value on the CC3200 receiver and the datasheet is there at all power levels. Can the U.FL connector really cause as much as a 3 dB loss at all power levels? And even if there is a 3 dB loss, that does not affect the RX sensitivity of the receiving CC3200, so why is the 10% PER at -71 dBm and not at -74 dBm?

    3) There was a huge misunderstanding here. I have one CC3200 transmitting at all times and a DIFFERENT CC3200 receiving at all times (see number 5 for the complete setup). My question was: shouldn't the RSSI value at the receiving CC3200 match the datasheet's RX sensitivity? And shouldn't my spectrum analyzer's average burst power measurement be the same as the RSSI on the receiving CC3200?
    e.g., for 54 Mbps, the datasheet states there is a 10% PER at -74 dBm.
    On the receiving CC3200, the RSSI value is -71 dBm when there is a 10% PER. Shouldn't the 10% PER occur when the RSSI reads -74 dBm?

    4) Burst power is used for all measurements

    5) Huge misunderstanding here. This is my setup:
    CC3200 (transmitter, transmitting at all times) cabled to a variable attenuator, cabled to a power splitter (1-to-2: one input, two outputs), cabled to my spectrum analyzer (receiving at all times) and a CC3200 (receiver, receiving at all times). In my test, I set the variable attenuator to 0 and increment it by 10 dB to sweep through all power levels.

    So 2 receivers (spectrum analyzer and always receiving CC3200) and one transmitter (always transmitting CC3200).

    Since the spectrum analyzer and the (always receiving) CC3200 are receiving the same signal through the power splitter, shouldn't the RSSI value of the receiving CC3200 be the same as the spectrum analyzer's received measurements? I take two different measurements from the spectrum analyzer: peak values and average values. The peak values measured on my spectrum analyzer from my transmitting CC3200 match up nicely with the RSSI value from the receiving CC3200; the average values match nicely with the RX sensitivity stated in the datasheet.
    e.g., for 54 Mbps, the datasheet states there is a 10% PER at -74 dBm.
    At 10% PER, the RSSI value of the receiving CC3200 reads -71 dBm, the peak value on my spectrum analyzer reads -71 dBm, but the average value on my spectrum analyzer reads -74 dBm. This is the part that does not make sense. Why does the RSSI read -71 dBm at 10% PER when it's supposed to read -74 dBm for a 10% PER? The RSSI should match the average values on my spectrum analyzer, because at 10% PER my spectrum analyzer measures -74 dBm, which is what the datasheet states; instead, it matches the peak values, which shouldn't even be measured in the first place.
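    In dB terms, the relationship between the two SA readings is a simple subtraction; a sketch using the ~3 dB peak/average gap reported in the measurements above:

```python
def average_from_peak_dbm(peak_dbm, peak_to_average_db):
    """Average power is the peak reading minus the peak-to-average ratio (dB)."""
    return peak_dbm - peak_to_average_db

# With the ~3 dB gap observed at 54 Mbps: a -71 dBm peak reading
# corresponds to a -74 dBm average reading
assert average_from_peak_dbm(-71.0, 3.0) == -74.0
```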

    6) See number 5. I am not asking about the peak-to-average ratio of OFDM signals; I am wondering why the RSSI matches the peak values and not the average values, when the average values match the datasheet's expected 10% PER. Were the peak values measured for the datasheet? It looks like the peak values were used for the reported RSSI values, while the average values were used to measure the expected 10% PER.

    7) Okay

    8) Okay

    9) I know that the peak and average measurements are independent. I am asking why the RSSI output of the receiving CC3200 follows the peak values while the RX sensitivity (expected 10% PER) is found using the average values. See numbers 5 and 6.

    In conclusion, numbers 2, 3, 5, and 6 all basically ask the same question.

    If it helps, here is my test data for 36 Mbps: 36Mbps_test.xlsx

    As you can see, the RSSI values and the peak values are pretty much the same, BUT the 10% PER happens at the average values AND NOT AT THE RSSI/PEAK VALUE. Why does the RSSI value match up with the peak value and not the average value?

    Thanks,
    Daniel Cho

  • Hi Daniel,

    So the main issue you are seeing now is why there is a 3 dB difference between the SA measurements and the RSSI measurements, correct?

    Let's start with the differences in the setup. As I mentioned, on the RX CC3200 you are receiving through a U.FL connector. This connector is not good for sensitive measurements and could be the reason for the difference. It is not that the connector's loss is 3 dB; rather, the PER can increase, so what is supposed to be -74 dBm @ 10% you get at -71 dBm instead. Try swapping the RX CC3200's U.FL connection for a soldered-down cable. As an FYI, we have seen this in the past, and for our measurements we do not use the U.FL connector, but a soldered-down RF pigtail.

    If everything is calibrated so that the power received after the splitter is the same on both outputs, then the received power should be the same, +/-1 dB or so. I would not put much merit in the 3 dB difference between the SA and the RX CC3200 until you solder down a cable.

    Also, the SA is being used in burst mode, correct? Can the SA accurately measure at the lower power levels? Did you try matching the powers at, say, -40 or -50 dBm to see whether the SA and RSSI numbers match up?

    If we need a quick call to discuss, rather than over e-mail, you can send me a private note.

    Thanks,
    Riz
  • Hello Riz,

    The PER increasing due to the U.FL connection makes sense to me, but what does not make sense is that the SA and the RSSI match up fairly well at all power levels, but only for the peak measurements.

    The SA is being used in burst mode and can accurately measure down to around -75 dBm. The SA and the RSSI do match up, but the RSSI matches the PEAK values of the SA, not the AVERAGE values.

    I have just sent you a private note with my phone number. Feel free to give me a call at any time; I don't think the call will be long.

    Thanks,

    Daniel Cho