SysCtlClockSet Questions



Been working on setting my clock using the SysCtlClockSet macro and having some problems understanding the documentation on it. I'm trying to reach a specific clock frequency using the PLL at 400MHz and dividing by a specific number. I can't entirely understand the parameters of the macro, because it always divides the PLL by 2 and then always uses the RCC register to set the clock frequency. Inside the function (looking at sysctl.c) you can see what changes it makes to which registers, and it always changes both the RCC and RCC2 registers - but I don't know what parameter to send so that the RCC2 register is used to select my desired clock. The documentation keeps referring to different classes of Stellaris - Firestorm, Fury, etc. What class is the LM4F120H5QR?

Thanks,

  • Your LX4F is "Blizzard" class - the others you've listed are all "ill-fated" Cortex M3 devices.

    Suggest some caution - and avoidance of direct interaction w/ the RCC and RCC2 registers.  A mistake here can "lock you out" of your MCU.

    StellarisWare "SysCtlClockSet" - as you note - provides everything you need.  Use of the PLL and proper "SysDiv" get you reliably to 50MHz - there have been some issues w/operation @ higher System Clocks.  (check your part no. and Rev no. - to be safe/sure)

    Here's code which consistently/robustly works w/our group's LX4F:  (sets System Clock to 50MHz)

          ROM_SysCtlClockSet(SYSCTL_SYSDIV_4 | SYSCTL_USE_PLL | SYSCTL_XTAL_25MHZ | SYSCTL_OSC_MAIN);  // 400MHz PLL / 2 / 4 = 50MHz

    Vital that "SYSCTL_XTAL_25MHZ" match the xtal value on your board!   Users suffer "lock out" when "copy/pasting" from Apps which use a different xtal value than the one affixed to their board.

    Restrictions may apply - based upon MCU number and Rev - and use of ROM vs. "standard StellarisWare" functions.  IIRC - certain frequency "bands" may have issues - always best to review latest/greatest errata and MCU manual for best guidance...
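    A quick run-time sanity check (just a sketch - SysCtlClockGet() reports the frequency the driver library believes it configured):

          // Confirm the System Clock the driver library computed from your settings.
          unsigned long ulSysClk = ROM_SysCtlClockGet();   // expect 50000000 after the call above
          if (ulSysClk != 50000000)
          {
              while (1) { }   // trap here - clock is not what we intended
          }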

    I posted a code listing w/in the past week which reliably outputs a "divide Sys Clock by 1000" signal (thus 50MHz yields 50kHz) and causes no disturbance to the (usually sensitive) xtal input pins (when scope probed) on most MCUs.
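    Not that exact listing - but the general shape is a 16-bit timer in PWM mode (PB6/T0CCP0 assumed here - check your own pin mux):

          // Drive SysClk/1000 out a timer CCP pin - 50MHz System Clock yields 50kHz.
          SysCtlPeripheralEnable(SYSCTL_PERIPH_TIMER0);
          SysCtlPeripheralEnable(SYSCTL_PERIPH_GPIOB);
          GPIOPinConfigure(GPIO_PB6_T0CCP0);
          GPIOPinTypeTimer(GPIO_PORTB_BASE, GPIO_PIN_6);
          TimerConfigure(TIMER0_BASE, TIMER_CFG_SPLIT_PAIR | TIMER_CFG_A_PWM);
          TimerLoadSet(TIMER0_BASE, TIMER_A, 999);    // 1000 system clocks per period
          TimerMatchSet(TIMER0_BASE, TIMER_A, 499);   // ~50% duty cycle
          TimerEnable(TIMER0_BASE, TIMER_A);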

  • Hey there again cb1, thanks for the quick response. I'll use the 50MHz setup and go from there to set the UART clock. The reason, as I've said before, for wanting to affect RCC2 is to get the least error on my baud rate. 50MHz doesn't give me the least amount of error, but I imagine it should function accordingly.

    UARTConfigSetExpClk is basically in charge of calculating the divisor values for the baud-rate generator. Is it possible to verify the baud rate that was set on a UART peripheral with an oscilloscope?
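    For reference, the setup I have in mind looks roughly like this (UART1 on PB0/PB1 is an assumption - adjust to your wiring):

          // Enable UART1 and its GPIO port, then configure 115200 8-N-1 from the System Clock.
          SysCtlPeripheralEnable(SYSCTL_PERIPH_UART1);
          SysCtlPeripheralEnable(SYSCTL_PERIPH_GPIOB);
          GPIOPinConfigure(GPIO_PB0_U1RX);
          GPIOPinConfigure(GPIO_PB1_U1TX);
          GPIOPinTypeUART(GPIO_PORTB_BASE, GPIO_PIN_0 | GPIO_PIN_1);
          UARTConfigSetExpClk(UART1_BASE, SysCtlClockGet(), 115200,
                              UART_CONFIG_WLEN_8 | UART_CONFIG_STOP_ONE | UART_CONFIG_PAR_NONE);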

    Thanks,

  • Hey back Hector - remembered your name - not your topic - but now recall.  (I gave you tech detail behind slandrum's 2.5% Uart guidance)

    Both he/I feel you may be a bit over-emphasizing the accuracy requirement for the Uart.  Can't imagine that you'll even approach the 2.5% figure - using any of the standard xtal values listed w/in the MCU manual.  Again - be sure to properly "match" your board-installed xtal to the parameter required by SysCtlClockSet().

    Trick - way popular in the past - was to xmit 0x55 or 0xAA - both yield an "up-down" bit pattern - which easily enables your scope to confirm individual bit times.  At 9600 baud the bit time is 104.16µs; @ 38.4K baud the bit time is 26.04µs.  If you can cause the remote device to transmit a string of 0x55 or 0xAA - you can easily measure the bit time - and determine the baud rate.
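    Minimal sketch of such a transmit loop (blocking UARTCharPut - UART1 assumed):

          while (1)
          {
              UARTCharPut(UART1_BASE, 0x55);   // alternating-bit pattern - easy for the scope to time
          }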

  • @Hector - Thank you - my friend - much appreciated.

    I screwed up the bit time @ 38.4KBaud (only by 10x) - that post is now corrected...

    You can ease your xtal accuracy requirements by "binning" various xtals (say those "high" and those "low") and then installing only the high/fast xtal when your remote device is also "high/fast" frequency-wise.  (or the low/slow xtal - when vice-versa)  This method defeats the case where one side is fast - the other slow - and the total error is cumulative.  (the ~2.5% Uart budget applies to the mismatch between the two ends - errors of the same sign largely cancel)  If the frequency error can be constrained to "both remote/local too fast" or "both remote/local too slow" - your frequency "delta" from nominal can "balloon" to 5% - w/out impacting your commo...

  • Hello,

    Currently trying to get the oscilloscope to read the bits I'm sending from my UART port. No success at the moment. I'll keep you updated.

    Thanks again for your help, 

  • Do keep in mind the tip of sending: 0x55 or 0xAA.  The "up-down" alternating bit pattern greatly eases your measurement.

    Suggest that you start @ 9600 baud - only after that's working reliably would I speed up.  (recall ea. bit @ 9600 baud is ~104µs wide)

    And - always keep in mind that "true" RS232 levels can (and will) quickly "do in" any MCU input.  You must use an RS232 level-shifter IC should your remote device transmit @ these standard levels.  (real RS232 swings between +12 to 15V and -12 to 15V - the signal must be both inverted and level shifted for any MCU)

  • Hello,

    Been keeping at it.  Had success sending a 0x55 in a while loop, which helped me calculate the baud rate I'm actually configuring on the UART1 pin. The oscilloscope reads 8.7µs per bit, or 114,942 Hz. This is indeed a pretty neat trick, because we know we're actually sending something.

    Two problems now, though. I'm using a Logic Level Converter (LLC) to lower the "5V" signal from the LaunchPad to 3.3V, which is the operational voltage of my GSM module. But the oscilloscope is also measuring the amplitude of the signal, and the voltage is only 360mV. Why is this happening? Aren't I supposed to be getting a 5V-amplitude signal that then passes through the LLC and comes out as an exact 3.3V signal on the other side? Or am I understanding this wrong?

    Also, I haven't been able to get more than just trash bits on the UART1Rx that's receiving from the GSM after it "receives" something from my UART1Tx. Any explanations would be helpful.

    Thanks,

  • Hector Tosado said:
    Logic Level Converter (LLC) to lower the "5V" signal from the LaunchPad to 3.3V

    Why would you think this? (5V from the LaunchPad)  No modern MCU operates @ 5V levels - long passé.  Stellaris MCUs (and most other ARM MCUs) all comfortably input & output @ 3V3 levels.  Using your scope you can easily test/confirm that the Stellaris MCU outputs 3V3 (when high) and 0V (when low).

    Diagnosing your non Stellaris device leads us to uncharted waters - bit outside the constraints of this forum...

    We have likely exhausted your initial topic - risk the wrath of officials if topic wanders further...

  • Hello,

    I'm assuming this because I'm connecting it to a 5V source, and from what fellow students have explained, the outputs of the MCU are going to be high at the same operational voltage that you're powering the MCU with. That was my knowledge thus far. So what you mean is that even though the LaunchPad is receiving 5V from the USB power, the output pins still "operate" at 3.3V, so no real LLC is needed. I should be able to communicate successfully with my GSM without any problems, other than connecting a cable between them.

    My doubt about measuring this high and low voltage with the scope is, as I mentioned before, that the scope is measuring a high (1) and a low (0) with a difference of 360mV between the two. In other words, _|¯|_|¯|_|¯|_, where ΔY is 360mV. This is not even close to the 3.3V I should be seeing.

    I'll keep my posts on a more focused topic so as to not anger the wandering officials :D Should I create a new post to continue with this topic, then?

    EDIT: I was able to understand what you meant by measuring the output. Didn't use the scope, but did use a Digital Multimeter. When sending 0x55 from the UART1Tx, the voltage measured was about 1.63V, which is roughly half of 3.3V. This should be because I'm sending a high half the time and a low the other half, which averages it out. Now my problem should be in assuring the other device is set up correctly, as I can see my MCU is sending correctly.

    Thanks,

  • Suggest you let this thread "die" here, now. 

    Note - your MCU is not being powered by 5V - there is a very noticeable 3V3 Regulator interposed between the incoming 5V and the MCU.  Devil really is in the detail with these devices - a review of your board's schematic will explain much.

    Your 360mV - if I had to guess - may be the result of a 10:1 voltage probe.  Quick, easy test would be to measure across a "known" 5V - suspect that would yield 0.5V!

    Even 9600 may be too fast for a DVM measure - dial down as far as Stellaris allows - maybe to 2400 or 1200 baud.  Bit levels @ these speeds may persist long enough that a humble DVM can capture & display correctly...  Or - as past suggested - you could master GPIO Output (no Uart) and simply set one GPIO high...
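    Something of this sort suffices (just a sketch - PF1 chosen arbitrarily):

          // Drive a single GPIO pin high - DVM across the pin should read ~3.3V.
          SysCtlPeripheralEnable(SYSCTL_PERIPH_GPIOF);
          GPIOPinTypeGPIOOutput(GPIO_PORTF_BASE, GPIO_PIN_1);
          GPIOPinWrite(GPIO_PORTF_BASE, GPIO_PIN_1, GPIO_PIN_1);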