This thread has been locked.


CC110L: Moving ATA5760 functionality to CC110L-based solution - some questions

Part Number: CC110L
Other Parts Discussed in Thread: CC1101, CC115L

Hello,

I recently revived a project where I want to duplicate the functionality of an (obsolete) ATA5760 device on a CC110L chip (also see https://e2e.ti.com/support/wireless-connectivity/sub-1-ghz-group/sub-1-ghz/f/sub-1-ghz-forum/1036447/trying-to-get-synchronous-serial-mode-w-manchester-encoding-to-work-so-far-getting-only-noise ).

At the moment, I do have a setup with the CC110L that works with some transmitters, but not all. I think I'm running into some issues concerning IF and bandwidth that I can't readily find an answer to.

The old ATA5760 uses an IF of 950 kHz and a bandwidth of 300 kHz or even 600 kHz, but when I set those parameters in the CC110L, things don't work at all -- the carrier output (CS) does not respond, and the serial signal output just shows noise all the time. I get the best performance so far with an IF setting of ~395 kHz (FSCTRL1 = 0x0F) and a bandwidth of ~60 kHz (MDMCFG4:MDMCFG3 = 0xF637).

Unfortunately, especially older transmitters are still not picked up correctly by the CC110L, probably because their transmitter frequency is slightly different from newer products, and the 60 kHz bandwidth is too narrow. One additional problem is that I only designed the newer products, and do not have the proper RF specifications for the older ones.

As I am still not very knowledgeable about the finer details concerning digital receivers, I have some questions:

  • How does the absolute receiver bandwidth (frequency range) relate to the base frequency (869.200 MHz)? Is this bandwidth centered around the Fbase -- so with the absolute frequency range of Fbase-1/2BW ... Fbase+1/2BW? Or is Fbase the lowest frequency, with the frequency range from Fbase ... Fbase+BW?
  • Even transmitter products that work reliably with a 60 kHz receiver bandwidth have trouble getting through at anything over 80 kHz, even at very close range (1 meter), which I do not understand. How can this happen? A greater bandwidth should make it easier to receive various signals, not more difficult, and at this range, noise should not mess things up. Also, the datasheet mentions bandwidths of 300 kHz and more as being quite normal for this type of application.
  • Is the actual choice of IF important? When working with analog receivers, I know about the importance of proper IF choice, and tuning and filtering of the IF circuits, but things like the CC110L are more like a black box to me in this respect. I do have SmartRFStudio 7 working in Wine under Linux now, and I noticed that the IF frequency is apparently chosen automatically.

Thanks for any information,

Regards,

Richard

  • Hi Richard,

    • How does the absolute receiver bandwidth (frequency range) relate to the base frequency (869.200 MHz)? Is this bandwidth centered around the Fbase -- so with the absolute frequency range of Fbase-1/2BW ... Fbase+1/2BW? Or is Fbase the lowest frequency, with the frequency range from Fbase ... Fbase+BW?

    Please see the App Note SWRA122 (CC11xx Sensitivity Versus Frequency Offset and Crystal Accuracy): https://www.ti.com/lit/swra122

    Section 2 gives an explanation of how to choose the RX BW setting (it is also applicable to the CC110L); it is centered around Fbase. The other sections are also worth looking at for an idea of recommended settings for characterised PHYs.
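    Sketched numerically, the Section 2 reasoning amounts to the following (a paraphrase, not TI's exact formula; the helper names are mine): the RX filter must span the modulated signal bandwidth plus the worst-case transmitter/receiver frequency offset on both sides.

    ```c
    /* Minimal sketch of the SWRA122 Section 2 budget (helper names are mine):
     * the filter is centered on Fbase, so a frequency offset eats into the
     * margin on one side and must be budgeted twice. */
    static double signal_bw_hz(double fdev_hz, double data_rate_baud)
    {
        /* Carson's rule approximation for 2-FSK */
        return 2.0 * fdev_hz + data_rate_baud;
    }

    static double min_rx_bw_hz(double fdev_hz, double data_rate_baud,
                               double tx_ppm, double rx_ppm, double carrier_hz)
    {
        double offset_hz = (tx_ppm + rx_ppm) * 1e-6 * carrier_hz;
        return signal_bw_hz(fdev_hz, data_rate_baud) + 2.0 * offset_hz;
    }
    ```

    With roughly the numbers from this thread (~5.2 kHz deviation, 2 kBaud, a 20 ppm transmitter and a 10 ppm receiver crystal at 869.2 MHz) this lands near 65 kHz, which fits the observation that a ~60 kHz filter mostly works for the newer transmitters.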

    • Even transmitter products that work reliably with a 60 kHz receiver bandwidth have trouble getting through at anything over 80 kHz, even at very close range (1 meter), which I do not understand. How can this happen? A greater bandwidth should make it easier to receive various signals, not more difficult, and at this range, noise should not mess things up. Also, the datasheet mentions bandwidths of 300 kHz and more as being quite normal for this type of application.

    It is worth double-checking the tolerance of the crystal you are using for the CC110L - as SWRA122 explains in more detail (see above for the link to the document), you need to account for this in your RX BW setting.

If the older transmitters have a slightly different transmitter frequency from the newer products, are you able to measure them and see what this difference is? I.e. what is the difference between the working and non-working transmitters? Knowing more about the older transmitters would help greatly here.

    • Is the actual choice of IF important? When working with analog receivers, I know about the importance of proper IF choice, and tuning and filtering of the IF circuits, but things like the CC110L are more like a black box to me in this respect. I do have SmartRFStudio 7 working in Wine under Linux now, and I noticed that the IF frequency is apparently chosen automatically.

    Frequency programming is detailed in Section 5.21 of the CC110L Datasheet: https://www.ti.com/lit/swrs109 (it also links to Table 5-34 which gives the register description that controls FREQ_IF). In short, yes it is important.

    In SmartRF Studio, you can see the FSCTRL1.FREQ_IF register is adjusted according to the PHY, so you can look at these to gain further understanding with regards to the CC110L.

    From SWRA122: If there is an error in the transmitter carrier frequency and the receiver LO frequency, there will also be an error in the IF frequency. For simplicity assume the frequency error in the transmitter and receiver is equal (same type of crystal). If the receiver has an error of –X ppm and the transmitter has an error of +X ppm the IF frequency will have an error of +2 × X ppm (CC11xx uses low side LO injection). Conversely, if the receiver has an error of +X ppm and the transmitter an error of –X ppm the IF frequency will have an error of –2 × X ppm.
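    As a worked example of the quoted paragraph (the helper name is mine, not from SWRA122): the IF error is roughly 2 × X ppm of the RF carrier, which shows up as an absolute shift of the much smaller IF.

    ```c
    /* Worst-case IF shift when TX and RX crystals err in opposite
     * directions by xtal_ppm each (sketch; name is mine). */
    static double if_error_hz(double carrier_hz, double xtal_ppm)
    {
        return 2.0 * xtal_ppm * 1e-6 * carrier_hz;
    }
    ```

    At 869.2 MHz with 10 ppm crystals on both sides this is about ±17.4 kHz -- small against a ~395 kHz IF, but very noticeable against a 60 kHz channel filter.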

    There is also the following thread which could help with understanding: https://e2e.ti.com/support/wireless-connectivity/other-wireless-group/other-wireless/f/other-wireless-technologies-forum/241448/cc1101 - refer to the first answer by Sverre in particular for CC1101/CC110L-specific information.

    Regards,

    Zack

  • Hi Zack,

Thank you for your elaborate reply! I see that I still have quite a bit more studying to do.

About those working and failing transmitter products: the working ones produce 2-FSK on 869.195 and 869.205 MHz (with ~15 kHz ≈ 20 ppm tolerance), the older ones show on average 869.245 and 869.285 MHz respectively, apparently with somewhat more tolerance (the worst one shows almost 30 kHz higher frequencies). The crystal used with the CC110L is 27 MHz, 10 ppm.
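    A quick sanity check on these measured numbers (helper names are mine):

    ```c
    /* Center and span of the measured FSK tones quoted above, in Hz. */
    static double tone_center_hz(double fmin_hz, double fmax_hz)
    {
        return (fmin_hz + fmax_hz) / 2.0;
    }

    static double tone_span_hz(double fmin_hz, double fmax_hz)
    {
        return fmax_hz - fmin_hz;
    }
    ```

    With the lowest tone at 869.195 MHz and the highest average old tone at 869.285 MHz, the center is 869.240 MHz and the span is already 90 kHz before any crystal tolerance is added -- so a 60 kHz filter centered on 869.200 MHz cannot cover both product generations, and the worst-case unit (almost 30 kHz higher still) makes it worse.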

    And indeed results get slightly better if I slightly increase the base frequency.

    Things are complicated further because I use the 'raw' serial synchronous outputs (CS, serial clock, SSDATA) instead of the internal packet handling because of the old packet format (basically 1kbaud Manchester encoding but with just one sync bit instead of a sync word). This is what one such packet looks like:

    So it's basically just 24 preamble bits, followed by one sync bit, followed by three or four payload bytes (fourth byte not shown here). So far, I have not succeeded in configuring the CC110L to process this internally as Manchester-encoded data and get the payload in the RX FIFO.

    I suspect that my handling of this clock and data stream in the external controller is not very fault-tolerant, although I can't see any difference in the packet format between the older and newer products.

    So all suggestions for improvement or things to try are welcome - and I certainly have some more reading to do!

    Thanks again, regards,

    Richard

  • If I understand correctly, the following are sent on the air:

The CC110L can look for a minimum of 2 bytes of sync, so the sync bits must form a new sync word together with part of the preamble. Also note that the CC110L will interpret 01 as 0, and 10 as 1:

    You can therefore try to configure the CC110L for a 2 byte sync word (16/16) that is 0x00 0x01, and then look for a 3 byte long packet (use fixed packet length).

    The packet will then be received as 0xFE, 0xAC, 0x5B
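    In other words, with this trick the packet engine hands back the bitwise complement of the original payload. Using the 3-byte payload 0x01 0x53 0xA4 that appears later in this thread:

    ```c
    #include <stdint.h>

    /* The 01->0 / 10->1 interpretation above means the received bytes are
     * the bitwise complement of the transmitted payload (sketch). */
    static uint8_t manchester_inverted(uint8_t payload_byte)
    {
        return (uint8_t)~payload_byte;
    }
    ```

    So 0x01, 0x53, 0xA4 comes out of the FIFO as 0xFE, 0xAC, 0x5B -- undoing it on the MCU is a single complement per byte.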

    Siri

  • Thank you, this looks promising!

    I already decided to start out with a clean slate with regard to CC110L RF parameters, carefully examining and calculating everything from scratch (e.g. I noticed that the Deviation parameter was way too high), taking care to use SmartRF Studio's suggested register values.

    If this works with the old synchronous serial detection via the external controller, I shall try implementing your suggestion next, as it is much simpler and most likely more robust.

    I'll get back to this as soon as I have results and/or more questions.

    Regards,

    Richard

  • OK, I've been trying to get this to work, but so far without success. Here's the general approach:

    1. Packet length (PKTLEN) = 4 (see point 6)
    2. Set up GDO2 for Carrier Sense (CS) out (GDO2 Signal Selection = 0x0E). GDO2 is connected to a polling/interrupt input on the controller.
    3. Wait for CS to go high.
    4. When CS goes low, we could have received a packet.
    5. Check CC110L Status register using SNOP read command strobe (0xBD). The lower nibble should give us the number of bytes in the RX FIFO.
    6. When not zero, read this number of bytes from the RX FIFO (the reason for this procedure is that the transmitter can send either 3 or 4 bytes).
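    The polling part of the steps above could be sketched like this (the SPI helper is a hypothetical stub standing in for the real driver, returning a canned status byte so the parsing in step 5 can be exercised):

    ```c
    #include <stdint.h>

    #define SNOP_READ 0xBD  /* SNOP strobe (0x3D) with the read bit (0x80) set */

    /* Hypothetical SPI transfer, stubbed: pretend 3 bytes are waiting.
     * 0x13 = CHIP_RDYn 0, STATE = RX (001), FIFO_BYTES = 3. */
    static uint8_t spi_xfer(uint8_t cmd)
    {
        (void)cmd;
        return 0x13;
    }

    /* Step 5: the lower nibble of the status byte is the number of bytes
     * available in the RX FIFO. */
    static int rx_fifo_count(void)
    {
        uint8_t status = spi_xfer(SNOP_READ);
        return status & 0x0F;
    }
    ```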

    Steps 1-4 work as long as I'm still using the Synchronous Serial mode - the Status byte returned is always 0x10 (RX mode active, 0 bytes in the FIFO, as expected).

But when I change PKTCTRL0 bits 5:4 from 01 (SS mode) to 00 (normal FIFO mode), things go completely wrong. First off, I get no CS signal on GDO2 any more, regardless of whether I have a transmitter active or not. Also, when I read the Status register at this point, I always get 0x04 the first time after initialization, apparently indicating that there are 4 bytes in the FIFO and that the device is in idle mode (which is not what I would expect).

    After reading those 4 bytes (looks like random noise), the Status register now shows 0x00 as expected. RX turns off automatically after receiving the specified number of bytes, and must be explicitly switched on again (SRX) for the next packet (MCSM1 = 0x30).

    But when I turn on the receiver, the Status register immediately reports a FIFO overflow (0x6F), and reading the RX FIFO produces just an endless stream of random bytes (the first byte on every line is the Status register, the four subsequent bytes are what I read from the RX FIFO):

    0x6F 0x43  0x71  0x2B  0xE9 
    0x6F 0xBA  0x48  0x41  0xD5 
    0x6F 0x8E  0x51  0xD4  0x2A 
    0x6F 0x3A  0xCC  0xB0  0x7D 
    0x6F 0xC2  0x90  0x20  0x3E 
    0x6F 0x52  0xBA  0x20  0x62 
    0x6F 0xEE  0xC4  0x43  0xED 
    0x6F 0xE0  0x9B  0x09  0x02 
    0x6F 0x71  0xA1  0x57  0xD3
    0x6F ...

    So I'm basically at a loss here. The receiver no longer responds to my transmitters at all, but instead fills the RX FIFO with an endless stream of noise.
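    For reference, the repeating 0x6F status byte decodes as the RXFIFO_OVERFLOW state with a full FIFO. A minimal decode of the status-byte fields from the CC110L datasheet (bit 7 CHIP_RDYn, bits 6:4 STATE, bits 3:0 FIFO bytes; helper names are mine):

    ```c
    #include <stdint.h>

    #define STATE_RX              0x1
    #define STATE_RXFIFO_OVERFLOW 0x6

    static int chip_state(uint8_t status) { return (status >> 4) & 0x07; }
    static int fifo_bytes(uint8_t status) { return status & 0x0F; }
    ```

    Note that per the datasheet's state diagram, the way out of the overflow state is the SFRX strobe (0x3A) to flush the RX FIFO; strobing SRX without flushing first leaves the stale bytes in place.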

    For completeness, here are my current register settings; I hope that there's something simple I overlooked:

    0x06			    ; [0x00 - GDO2_CFG] Carrier Sense out
    0x0E			    ; [0x01 - GDO1_CFG] Carrier Sense out / SPI-DO
    0x0C			    ; [0x02 - GDO0_CFG] Serial Synchronous Data Out
    0x47			    ; [0x03 - FIFOTHR] RX FIFO and TX FIFO thresholds
    0x00			    ; [0x04 - SYNC1] Sync word, high byte
    0x01			    ; [0x05 - SYNC0] Sync word, low byte
    0x04			    ; [0x06 - PKTLEN] Packet length
    0x40			    ; [0x07 - PKTCTRL1] Packet automation control
    0x00			    ; [0x08 - PKTCTRL0] Packet automation control - set normal FIFO mode, fixed packet length
    0x00			    ; [0x09 - ADDR] Device address
    0x00			    ; [0x0A - CHANNR] Channel # 0
    0x0F			    ; [0x0B - FSCTRL1] Frequency synthesizer control => IF = 395.50781 kHz
    0x00			    ; [0x0C - FSCTRL0] Frequency synthesizer control
0x20,0x31,0x4D	    ; [0x0D-0x0F - FREQ2:1:0] Frequency setting: 0x20 31 4D = 2109773 => 869.199692 MHz
0x96			    ; [0x10 - MDMCFG4] Modem configuration: Channel bandwidth = 168.7500 kHz
0x37			    ; [0x11 - MDMCFG3] Modem configuration: Symbol rate = 2.002000809 kBaud
    0x00			    ; [0x12 - MDMCFG2] Modem configuration: 2-FSK, 16/16 sync word bits
    0x02			    ; [0x13 - MDMCFG1] Modem configuration: 2 preamble bytes minimum
    0xE5			    ; [0x14 - MDMCFG0] Modem configuration: Channel spacing = 199.813843 kHz
0x24			    ; [0x15 - DEVIATN] Modem deviation setting
    0x07			    ; [0x16 - MCSM2]
    0x30			    ; [0x17 - MCSM1] Main Radio Control State Machine configuration: IDLE after RX
    0x18			    ; [0x18 - MCSM0] Main Radio Control State Machine configuration: Calibration from IDLE to RX
    0x16			    ; [0x19 - FOCCFG] Frequency Offset Compensation configuration
    0x6C			    ; [0x1A - BSCFG] Bit Synchronization configuration
    0x03			    ; [0x1B - AGCCTRL2] AGC control
    0x43			    ; [0x1C - AGCCTRL1] AGC control
    0x91			    ; [0x1D - AGCCTRL0] AGC control
    0xF8			    ; [0x20 - RESERVED]
    0x56			    ; [0x21 - FREND1] Front end RX configuration
    0x10			    ; [0x22 - FREND0] Front end TX configuration
    0xE9			    ; [0x23 - FSCAL3] Frequency synthesizer calibration
    0x2A			    ; [0x24 - FSCAL2] Frequency synthesizer calibration
    0x00			    ; [0x25 - FSCAL1] Frequency synthesizer calibration
    0x1F			    ; [0x26 - FSCAL0] Frequency synthesizer calibration
    0x59			    ; [0x29 - RESERVED]
    0x7F			    ; [0x2A - RESERVED]
    0x3F			    ; [0x2B - RESERVED]
    0x81			    ; [0x2C - TEST2]
    0x35			    ; [0x2D - TEST1]
    0x09			    ; [0x2E - TEST0]

    Do you have any idea what I'm doing wrong here?

  • Not sure I understand your code or why you do what you do.

    To generate settings, please use SmartRF Studio.

    Set the necessary RF parameters, as frequency, BW, Deviation and data rate.

    Then set the necessary packet configurations.

Assume you are using the 38.4 kbps settings, and that all RF parameters are OK by default.

    Then you need to change the following:

    2 bytes sync word (0x00, 0x01)

    Fixed packet length

Packet length 3 (not sure why you want to set it to 4 for the example you showed above, when the payload is 0xFE, 0xAC, 0x5B).

With the settings from SmartRF Studio, IOCFG0 = 0x06 (sync received on rising edge, packet received on falling edge).

You should not use CS as an interrupt to the MCU when you use the packet engine; there are dedicated signals for when a packet is received.

Your code should simply do the following:

    Init MCU

    Reset Radio

    Init Radio

    Strobe SRX

    Wait for falling edge interrupt on GDO0 (packet received)

You can now read the RX FIFO.

With the default settings in Studio, APPEND_STATUS = 1, and there will now be 5 bytes in the FIFO that must be read:

    3 payload bytes and 2 status bytes.
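    A sketch of unpacking such a 5-byte read on the MCU side (the struct and names are mine; the appended byte order -- RSSI first, then CRC_OK/LQI -- is per the CC110L datasheet):

    ```c
    #include <stdint.h>

    typedef struct {
        uint8_t payload[3];
        uint8_t rssi_raw;  /* first appended status byte */
        int     crc_ok;    /* bit 7 of the second appended status byte */
    } rx_packet_t;

    static rx_packet_t unpack_rx(const uint8_t fifo[5])
    {
        rx_packet_t p;
        p.payload[0] = fifo[0];
        p.payload[1] = fifo[1];
        p.payload[2] = fifo[2];
        p.rssi_raw   = fifo[3];
        p.crc_ok     = (fifo[4] & 0x80) != 0;
        return p;
    }
    ```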

    Siri

I did a quick test with Studio where I used a CC115L as transmitter and a CC110L as receiver. Not sure I got all your RF parameters correct, but I wanted to show you the packet handling feature. I just realized that you said the packet can also be 4 bytes (though you did not use that in the figure), so I have used 4 bytes.

    For the transmitter I did not use Manchester, and wrote the complete packet to the TX FIFO:

    Register settings from Studio:

    TX:

    static const registerSetting_t preferredSettings[]= 
    {
      {CC115L_IOCFG2,           0x2E},
      {CC115L_IOCFG0,           0x06},
      {CC115L_FIFOTHR,          0x47},
      {CC115L_SYNC1,            0x55},
      {CC115L_SYNC0,            0x56},
      {CC115L_PKTCTRL0,         0x05},
      {CC115L_FREQ2,            0x21},
      {CC115L_FREQ1,            0x6E},
      {CC115L_FREQ0,            0x3A},
      {CC115L_MDMCFG4,          0xF6},
      {CC115L_MDMCFG3,          0x43},
      {CC115L_MDMCFG2,          0x00},
      {CC115L_DEVIATN,          0x15},
      {CC115L_MCSM0,            0x18},
      {CC115L_RESERVED_0X20,    0xFB},
      {CC115L_FSCAL3,           0xE9},
      {CC115L_FSCAL2,           0x2A},
      {CC115L_FSCAL1,           0x00},
      {CC115L_FSCAL0,           0x1F},
      {CC115L_TEST2,            0x81},
      {CC115L_TEST1,            0x35},
      {CC115L_TEST0,            0x09},
      {CC115L_MARCSTATE,        0x01},
    };

    RX:

    // Address Config = No address check 
    // Base Frequency = 869.199646 
    // CRC Autoflush = false 
    // CRC Enable = false 
    // Carrier Frequency = 869.199646 
    // Channel Spacing = 199.951172 
    // Data Format = Normal mode 
    // Data Rate = 2.00224 
    // Deviation = 5.157471 
    // Device Address = 0 
    // Manchester Enable = true 
    // Modulated = true 
    // Modulation Format = 2-FSK 
    // Packet Length = 4 
    // Packet Length Mode = Fixed packet length mode. Length configured in PKTLEN register 
    // Preamble Count = 4 
    // RX Filter BW = 58.035714 
    // Sync Word Qualifier Mode = 16/16 sync word bits detected 
    // TX Power = 0 
    // PA table 
    #define PA_TABLE {0x50,0x00}
    
    static const registerSetting_t preferredSettings[]= 
    {
      {CC110L_IOCFG0,           0x06},
      {CC110L_FIFOTHR,          0x47},
      {CC110L_SYNC1,            0x00},
      {CC110L_SYNC0,            0x01},
      {CC110L_PKTLEN,           0x04},
      {CC110L_PKTCTRL1,         0x00},
      {CC110L_PKTCTRL0,         0x00},
      {CC110L_FSCTRL1,          0x06},
      {CC110L_FREQ2,            0x21},
      {CC110L_FREQ1,            0x6E},
      {CC110L_FREQ0,            0x46},
      {CC110L_MDMCFG4,          0xF6},
      {CC110L_MDMCFG3,          0x43},
      {CC110L_MDMCFG2,          0x0A},
      {CC110L_DEVIATN,          0x15},
      {CC110L_MCSM0,            0x18},
      {CC110L_FOCCFG,           0x16},
      {CC110L_RESERVED_0X20,    0xFB},
      {CC110L_FSCAL3,           0xE9},
      {CC110L_FSCAL2,           0x2A},
      {CC110L_FSCAL1,           0x00},
      {CC110L_FSCAL0,           0x1F},
      {CC110L_TEST2,            0x81},
      {CC110L_TEST1,            0x35},
      {CC110L_TEST0,            0x09},
      {CC110L_CRC_REG,          0x21},
      {CC110L_RSSI,             0x80},
      {CC110L_MARCSTATE,        0x01},
    };

    As you can see, the packet format is correct, and I am able to receive the packets in a correct format.

However, if I send 20 packets, I only receive 8 of them.

    This might be due to wrong RF parameters, or the fact that the sync word used is not a very good one.

    I am not an RF expert, so I cannot say why so many packets are lost, but hopefully you will get the format correct.

    Siri

  • OK, after careful comparison of your settings (and accounting for your 26 MHz crystal vs. my 27 MHz), I managed to get the same results as you.

    It turns out that I had somehow entered 0x40 for PKTCTRL1 instead of 0x00 -- probably because of SmartRF Studio's rather unwieldy user interface (switching between Easy Mode and Advanced Mode erases all carefully tuned settings, and when a config is saved in Easy Mode, it can't be loaded in Advanced Mode and vice versa).

    However, like in your setup, the receiver still fails to detect a lot of packets (whereas the original ATA5760-based receiver missed not a single one). I'll try and tackle that together with an RF expert, who also has all the necessary (and expensive) measuring equipment. I really hope I can solve this reliability problem, because this ~50% packet loss is totally unacceptable in the end-user application. I shall post a follow-up as soon as I have results in this area, although this may take a couple of days. I will also check if my old Synchronous Serial solution performs any better than this internal packet-based solution.

    For now, many thanks for your assistance and time! The main issue (migrating the ATA5760 functionality while retaining backward compatibility) is not yet fully resolved, but at least I now have a setup that basically works - as well as a lot more experience and knowledge about these digital receiver devices.

    Thanks again, all the best,

    Richard

  • Glad you at least have been able to get the packet format set up correctly. 

    I hope you will be able to tune the RF parameters as well so that the CC110L will be able to receive all packets.

    BR

    Siri

  • The packet format seems somewhat functional, but I still lose a lot of packets (some 35%) - and so far, the problem does not appear to be related to the RF: all transmitters are located at just 2 meters from the receiver, the RSSI shows a very strong signal, and the chosen RF parameters appear quite optimal.

    My suspicion is that the CC110L somehow fails to properly decode the packets, for reasons that I don't understand so far.

In order to collect more information, I hooked up one channel of my oscilloscope to the transmitter modulator signal (blue trace at the bottom) and the other channel to the receiver CS output on GDO2 (yellow trace at the top) - and this is where I notice something strange.

    This is what the CS signal looks like when a packet is missed:

    And this is the same CS signal when a packet is successfully received:

    It turns out that when CS goes low for 1 millisecond after receiving the last payload byte, then the packet is received correctly. Otherwise, it is not recognized. In all instances, the exact same signal is transmitted at the exact same distance, under identical conditions.

The CC110L datasheet tells me that the CS signal is controlled by the RSSI level: if the RSSI is above a certain threshold, CS is high, otherwise it is low - but it appears that it does not work quite as simply as that. Also, the RSSI level is more than high enough, and the 'cleanest' CS signal is the one that always drops packets.

    I don't know what to make of this, but maybe this tells you something. Now I first thought that the CC110L can't reliably recognize these packets with just 8 preamble bits and a 16-bit sync word -- but I reprogrammed a transmitter to send a 16-bit preamble and a 16-bit sync word, with the exact same result (again 35% dropped packets), so it's not that the preamble is too short; something else must be wrong.

    Do you have any suggestion that I could investigate or try out? I'm still consulting with my RF guru later this week to see if RF parameters are indeed optimal, but the RSSI values at short distance (mostly in the 0x18 - 0x20 range) suggest that the RF signal is more than strong enough, so noise or drop-outs can't be the problem. Yet somehow, packets are very often not reliably detected. I'm a bit at a loss here -- and I'm even contemplating tests with a completely different device (this RF expert has an SX1276 evaluation board that we can set up for this application).

    I shall report back if I have more data.

    Thanks again, regards,

    Richard

  • Hi Richard

    I am not sure I fully understand what you are showing. 

    Can you please output the sync found/sync received (IOCFGx = 0x06) on a GDO pin in addition to the CS signal?

    Also, it would be useful if you could do the following exercise:

    Always send the exact same packet from your transmitter, and then print out all your received packet. Is there a pattern as to where the packets are failing? Is there just a single bit error in the packets or is multiple bits wrong? Do the error always occur in the same place?

The recommended settings for the CC110L for best RF performance are 4 bytes of preamble and 4 bytes of sync, so your RF performance will be degraded; but I am not sure if 35% PER is within what to expect.

    BR

    Siri

I did a quick test again and turned off Manchester on the CC110L. I still used a 16-bit sync, but had the sync word set to 0x55, 0x56. I increased the packet length to 8.

    In this case I received all packets.

Not sure why, but it is worth giving it a try. Doing the Manchester decoding in SW after the packet is received should not add much overhead to your code.
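    A software Manchester codec along these lines could look as follows -- a sketch assuming the chip convention inferred earlier in this thread (the transmitter sends bit 0 as the chip pair 10 and bit 1 as 01, which is also why the hardware decoder returned complemented bytes):

    ```c
    #include <stdint.h>

    /* Encode one payload byte into two chip bytes: 1 -> 01, 0 -> 10
     * (assumed transmitter convention, MSB first). */
    static void manchester_encode(uint8_t in, uint8_t out[2])
    {
        uint16_t chips = 0;
        for (int i = 7; i >= 0; i--) {
            chips <<= 2;
            chips |= ((in >> i) & 1) ? 0x1 : 0x2;
        }
        out[0] = (uint8_t)(chips >> 8);
        out[1] = (uint8_t)chips;
    }

    /* Decode two chip bytes back into one payload byte.
     * Returns -1 on an invalid chip pair (00 or 11), which doubles
     * as a cheap per-bit error check. */
    static int manchester_decode(const uint8_t in[2], uint8_t *out)
    {
        uint16_t chips = ((uint16_t)in[0] << 8) | in[1];
        uint8_t b = 0;
        for (int i = 7; i >= 0; i--) {
            int pair = (chips >> (2 * i)) & 0x3;
            if (pair != 0x1 && pair != 0x2)
                return -1;
            b = (uint8_t)((b << 1) | (pair == 0x1));
        }
        *out = b;
        return 0;
    }
    ```

    Note how payload byte 0x01 encodes to the chip bytes 0xAA 0xA9, which is exactly why the 0x55 0x56 sync word (the same chip stream, shifted by one chip) fits this scheme.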

Siri

  • Sorry if I was a bit unclear -- it was getting late, and I was trying to make sense of what I was observing myself. I just repeated the measurements, this time also including sync_found on GDO2.

    I have one transmitter, sending the same 3-byte packet (the one above that you already analysed) over and over again, every 20 seconds, at a fixed distance of 75 cm (about 3 feet) from the receiver.

    The transmitted signal (purple oscilloscope trace):

    1. Carrier on (10 ms)
    2. Preamble (1 byte)
    3. Sync word (2 bytes)
    4. Payload (3 bytes, 0x0153A4)

The receiver has CS hooked up to the oscilloscope (yellow trace, GDO0=0x0E), as well as sync_found (blue trace, GDO2=0x06).

    As the payload bits are received inverted, the correct payload received should be 0xFEAD5B, as you already indicated. In addition, I have configured the receiver to add RSSI and CRC to the payload bytes.

    This is how these signals look when a packet is received correctly (about 65% of the time):

    The carrier (purple) is switched on, and in response, CS (yellow) goes high. After the sync bits have been received correctly, sync_found (blue line) goes high.

    After receiving the last payload byte, sync_found goes low again. Also, CS goes low for 1 millisecond and then high again (as the carrier is still present).

    In these cases, the payload is received correctly most of the time:

    0xFE 0xAD 0x5B 0x48 0x80

    sometimes, the last payload bit is off by one:

    0xFE 0xAC 0x5C 0x40 0x81

The RSSI value varies a bit, but is generally around 0x48 in this configuration, which, if I interpret it correctly, boils down to 0x48/2 = 36 dB minus the RSSI offset of 74 dB, giving -38 dBm. This is more than high enough for good reception.
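    That RSSI conversion, written out as a small helper (the raw byte is a two's-complement value in 0.5 dB steps; the 74 dB offset is the datasheet value as read in this thread):

    ```c
    /* Convert the raw CC110L RSSI byte to dBm (sketch). */
    static double rssi_dbm(unsigned char rssi_raw)
    {
        int dec = (rssi_raw >= 128) ? (int)rssi_raw - 256 : (int)rssi_raw;
        return dec / 2.0 - 74.0;
    }
    ```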

    When things go wrong, this is what the signals look like:

    So the exact same transmission (of course), an uninterrupted CS signal, but no sync_found signal. And of course no packet is received.

    I will try to see what happens when I change the receiver's sync word by one bit (0x55 0x56 or maybe 0xAA 0xA9) -- ah, I see you just came up with the same thought!

    OK, so I will try this, and yes, handling the Manchester encoding in software is no problem at all -- it is in fact what I started out with when using the Synchronous Serial mode.

    I have to be off now, but I hope to implement this new approach later today.

    And oh, thank you very much for taking all this time and trouble to help me! I really appreciate it, and it also forces me to carefully check what I'm doing all the time.

    I'll let you know how things go later today -- hopefully with 100% success

  • OK, I rearranged the code for MCU-based Manchester decoding, and I see no more dropped or malformed packets. I think the problems are finally solved!

    Thank you once again!

  • Glad to hear you were able to get it up and running :-)

    Siri

Sorry to bother you once more, but I stumbled upon one new problem: I want to use GDO0 as sync_found (IOCFG0 = 0x06) and hook it up to a controller interrupt, but when I initialize the CC110L, GDO0 remains in the default configuration (i.e. a test signal of CLK_XOSC/192 = 141 kHz) until the first packet is received. This is what GDO0 looks like after initialization (yellow trace):

As you can imagine, this 141 kHz signal on the controller's primary interrupt line is a problem. The CC110L datasheet even explicitly mentions on several occasions that this XOSC-derived test signal should be turned off before RX is switched on ("To optimize RF performance, these signals should not be used while the radio is in RX mode"), but how do I do this? The state diagram (Figure 5-11) doesn't say when these configuration changes actually take effect.

    Here's my initialization, with line 8 marked, showing that the initialization value is correct:

     

    Reset:
    single byte 0x30
    
    Main init:
    address 0x00 + burst = 0x40
    0x0E			    ; C [0x00 - GDO2_CFG] Carrier Sense out
    0x2E			    ; C [0x01 - GDO1_CFG] High impedance
    * 0x06			    ; C [0x02 - GDO0_CFG] Sync Received out
    0x47			    ; C [0x03 - FIFOTHR] RX FIFO and TX FIFO thresholds
    0x55			    ; C [0x04 - SYNC1] Sync word, high byte
    0x56			    ; C [0x05 - SYNC0] Sync word, low byte
    0x08			    ; D [0x06 - PKTLEN] Packet length
    0x04			    ; D [0x07 - PKTCTRL1] Packet automation control => append status
    0x00			    ; C [0x08 - PKTCTRL0] Packet automation control - set normal FIFO mode, fixed packet length
    0x00			    ; D [0x09 - ADDR] Device address
    0x00			    ; D [0x0A - CHANNR] Channel # 0
    0x0F			    ; C [0x0B - FSCTRL1] Frequency synthesizer control => IF = 395.50781 kHz
    0x00			    ; D [0x0C - FSCTRL0] Frequency synthesizer control
0x20,0x31,0x4D	    ; C [0x0D-0x0F - FREQ2:1:0] Frequency setting: 0x20 31 4D = 2109773 => 869.199692 MHz
0x96			    ; C [0x10 - MDMCFG4] Modem configuration: Channel bandwidth = 168.7500 kHz
0x37			    ; D [0x11 - MDMCFG3] Modem configuration: Symbol rate = 2.002000809 kBaud
    0x02			    ; C [0x12 - MDMCFG2] Modem configuration: 2-FSK, 16/16 sync word bits, no Manchester decoding
    0x02			    ; C [0x13 - MDMCFG1] Modem configuration: 2 preamble bytes minimum
    0xE5			    ; C [0x14 - MDMCFG0] Modem configuration: Channel spacing = 199.813843 kHz
0x24			    ; C [0x15 - DEVIATN] Modem deviation setting
    0x07			    ; D [0x16 - MCSM2]
0x3C			    ; D [0x17 - MCSM1] Main Radio Control State Machine configuration: stay in RX after packet received
    0x18			    ; C [0x18 - MCSM0] Main Radio Control State Machine configuration: Calibration from IDLE to RX
    0x16			    ; C [0x19 - FOCCFG] Frequency Offset Compensation configuration
    0x6C			    ; D [0x1A - BSCFG] Bit Synchronization configuration
    0x03			    ; D [0x1B - AGCCTRL2] AGC control
    0x43			    ; C [0x1C - AGCCTRL1] AGC control
    0x91			    ; D [0x1D - AGCCTRL0] AGC control
    
    Continued init:
    address 0x20 + burst = 0x60
    0xFB			    ; C [0x20 - RESERVED]
    0x56			    ; D [0x21 - FREND1] Front end RX configuration
    0x10			    ; D [0x22 - FREND0] Front end TX configuration
    0xE9			    ; C [0x23 - FSCAL3] Frequency synthesizer calibration
    0x2A			    ; C [0x24 - FSCAL2] Frequency synthesizer calibration
    0x00			    ; C [0x25 - FSCAL1] Frequency synthesizer calibration
    0x1F			    ; C [0x26 - FSCAL0] Frequency synthesizer calibration
    
    Continued init:
    address 0x29 + burst = 0x69
    0x59			    ; D [0x29 - RESERVED]
    0x7F			    ; D [0x2A - RESERVED]
    0x3F			    ; D [0x2B - RESERVED]
    0x81			    ; C [0x2C - TEST2]
    0x35			    ; C [0x2D - TEST1]
    0x09			    ; C [0x2E - TEST0]

Is there a way to force the configuration change for GDO0, so that it switches from the 141 kHz block wave signal to the newly configured state without first receiving a packet? I have not found any command that actually does this.

    Or does this problem occur because I use burst mode to initialize the device? (I haven't tried changing this sequence to single instructions yet, as it is quite a bit of extra work.)

    Thanks again,

  • Hi

    The register will be updated when you configure it, not after a packet is received, so the only thing you need to make sure of is that you are writing 0x06 to the IOCFG0 register, BEFORE you strobe SRX.

Unfortunately there are some problems with the license server, so I am not able to show you a logic analyzer plot of how this looks when doing burst access as you do, but for a single access (which I can do with SmartRF Studio) you will see that the GDO0 line changes right away.

If you monitor the GDO0 line when doing the init you describe above, you should see the same thing. GDO0 will stop toggling as soon as you configure it.

On your MCU, the interrupt on GDO0 must be disabled until you have configured IOCFG0 properly.

It might also be necessary to clear any pending interrupt generated while the clock was toggling the line, BEFORE you enable the interrupt:

    Pseudo code:

    Init MCU (interrupt on GDO0 disabled)

    Init Radio (IOCFG0 = 0x06)

    Clear interrupt on GDO0

    Enable interrupt on GDO0

    Start RX
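    The pseudo code above, sketched in C with hypothetical HAL stubs that just record the call order, so the ordering constraint (configure IOCFG0 and clear the stale interrupt BEFORE enabling it) stays explicit:

    ```c
    #include <string.h>

    /* Call-order log, so the startup sequence can be checked. */
    static char call_log[8][16];
    static int  call_count = 0;

    static void record(const char *name)
    {
        strncpy(call_log[call_count++], name, 15);
    }

    /* Hypothetical HAL stubs standing in for the real drivers. */
    static void mcu_init(void)        { record("mcu_init"); }
    static void radio_init(void)      { record("radio_init"); } /* writes IOCFG0 = 0x06 */
    static void gdo0_irq_clear(void)  { record("irq_clear"); }
    static void gdo0_irq_enable(void) { record("irq_enable"); }
    static void strobe_srx(void)      { record("srx"); }

    static void startup(void)
    {
        mcu_init();        /* interrupt on GDO0 still disabled here */
        radio_init();      /* GDO0 stops toggling once IOCFG0 is written */
        gdo0_irq_clear();  /* drop any edge latched while the clock was out */
        gdo0_irq_enable();
        strobe_srx();
    }
    ```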

    BR

    Siri

  • Hi Siri,

    Thanks again for your reply!

This was indeed a problem with GDO0 interrupts being registered before initialization of the CC110L. After noticing and fixing this issue, I misinterpreted the single trace for GDO0 as still producing 141 kHz, whereas in reality GDO0 had stopped outputting CLK_XOSC/192 after a 100 ms initialization delay I built in; that was simply outside the visible (single-trigger) portion of the screen. So basically, I messed up twice at the same point ... my apologies!

    So far, it appears that everything is now working exactly as intended.

    Regards,

    Richard