
MSPM0L1305: After setting the Optimization Functionality is not working

Part Number: MSPM0L1305


Hi team,

We have two states: active and sleep. Before setting any optimization level, the device was waking up from sleep mode to active mode as expected. After enabling optimization, however, I am unable to wake up from sleep mode, regardless of the optimization level. Could you please help me resolve this issue?

  • Hello Sowmya, 

    Which IDE do you use? If you use CCS, you can refer to this document for specific information on each compiler optimization level. The default level is 2; does your project work at level 2? At higher optimization levels the compiler may even reorder instructions, so the best approach is to debug your code and inspect the generated assembly in the "Disassembly" window.

    https://software-dl.ti.com/codegen/docs/tiarmclang/rel2_1_0_LTS/compiler_manual/using_compiler/compiler_options/optimization_options.html

    Best Regards,

    Janz Bai

  • If this is indeed the case, there is a compiler bug. I suggest you ask this on the CCSSTUDIO forum. There they will ask you to send in the working and non-working version.

    e2e.ti.com/.../code-composer-studio-forum

  • Hi Team

    In sleep mode, we are configuring the 32 kHz clock. We are using both the ADC and a timer. The timer operates on LFCLK (the low-frequency clock), but the ADC requires a high-frequency clock, so I am using the 16 MHz clock. It works without optimization, but with optimization level 2 (-O2), the ADC is not functioning. Can anyone please help me with this issue?

  • Hello Sowmya,

    In theory, changing the compiler optimization level should not affect peripheral usage, because our example projects in the SDK use level 2 by default. The most likely cause is that some variables are being optimized incorrectly, so I recommend debugging your code and inspecting the generated assembly in the "Disassembly" window. By the way, do you use an empty busy-wait loop such as "while (some_condition) { ; }" (with nothing in the loop body)? If the condition variable is not declared volatile, the optimizer may assume it never changes.
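    A minimal host-runnable sketch of that pitfall (the names here are illustrative, not from the SDK): a flag polled in an empty while loop must be declared volatile, or the optimizer is free to assume it never changes and compile the loop as infinite.

    ```c
    #include <assert.h>

    /* Without 'volatile', the compiler may read 'wakeup_flag' once, see
     * that nothing in the loop body changes it, and turn
     * "while (!wakeup_flag) { ; }" into an infinite loop at -O2.
     * Declaring it volatile forces a fresh load on every iteration.
     * Host-runnable sketch: on the MCU the flag would be set by an ISR. */
    volatile int wakeup_flag = 0;

    /* Stand-in for the wake-up interrupt handler. */
    static void fake_wakeup_isr(void) { wakeup_flag = 1; }

    int main(void)
    {
        fake_wakeup_isr();            /* on hardware this fires asynchronously */
        while (!wakeup_flag) { ; }    /* safe only because the flag is volatile */
        assert(wakeup_flag == 1);
        return 0;
    }
    ```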

    Best Regards,

    Janz Bai

  • Hi Janz,

    Thank you for the reply

    We have found that the reason the ADC is not working after optimization is that the FET is not enabled during the process. Optimization affects the timing for enabling the FET.

     

    delay_cycles(800); -> if I use this function for the delay, how many microseconds will it take, and how do I calculate it if my clock is 32 MHz?

  • For the library function, "delay_cycles (800); " at 32MHz will delay for at least (800/32)=25usec. Being pre-compiled, it isn't subject to optimization.

    If you wrote your own, it would be subject to optimization, and might even disappear entirely. Also, this is the minimum delay, and e.g. interrupts could extend it.

    Do you believe the delay is too short or too long?
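    The arithmetic above can be checked with a small helper (`cycles_to_us` is a hypothetical name for illustration, not a TI API):

    ```c
    #include <assert.h>
    #include <stdint.h>

    /* Hypothetical helper: convert a cycle count into microseconds at a
     * given CPU clock in MHz, rounding down. delay_cycles(n) delays for
     * at least n cycles, so this is a lower bound on the delay. */
    static uint32_t cycles_to_us(uint32_t cycles, uint32_t mhz)
    {
        return cycles / mhz;
    }

    int main(void)
    {
        /* delay_cycles(800) at 32 MHz -> at least 800/32 = 25 us */
        assert(cycles_to_us(800, 32) == 25);
        /* the same 800 cycles at 16 MHz would take at least 50 us */
        assert(cycles_to_us(800, 16) == 50);
        return 0;
    }
    ```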

  • Hi Bruce,

    Thank you so much for the reply,

    I added delay_cycles(800); to my source code and the ADC is now working. However, I have 5 switches on the same channel, and the ADC value for one of the switches does not match the expected value; I am not sure what is wrong. I want to achieve this with optimization level -Os.

    Could you please help me with this?

  • After "volatile" questions, the next optimization hazard is races in the program flow -- optimization doesn't create these, but it can make them worse.

    I'm supposing that the switches turn on some kind of excitation circuit(s), then when they're ready you read from the ADC, and the code in between just happened (without -O) to take the right amount of time (something like 25usec), but now (with -O) it takes less time. The delay_cycles() then fills in the gap. Is this more-or-less what is happening?

    a) Do you have any indicator you can use to tell when the excitation is ready? Or is it purely an elapsed-time thing?

    b) Where did the number 25usec (800 clocks) come from? Was that the minimum time that had an effect, or did you try a few settings? Is it possible that one excitation source just takes a little bit longer?

    I have seen code that used a Timer output to (1) trigger an excitation switch [1st edge], (2) trigger the ADC some (known/tunable) time later [2nd edge]; as an added bonus it (3) provided a fixed rate for the ADC sampling. This is probably more than you want to take on right now, but it's the kind of design that is not affected by optimization.
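    That timer-paced design can be sketched roughly as follows. None of these function names are real TI driverlib calls (the MSPM0 SDK has its own timer/ADC API); they are stubs that only record the configuration, to show the idea: a hardware timer, not instruction timing, fixes the gap between enabling the FET and sampling, so the compiler's optimization level cannot change it.

    ```c
    #include <assert.h>
    #include <stdint.h>

    /* Hypothetical stubs standing in for real timer configuration calls. */
    enum { CH0, CH1 };

    static uint32_t compare[2];   /* capture/compare values (stub state)   */
    static uint32_t period;       /* timer period -> sample rate (stub)    */

    static void timer_set_compare(int ch, uint32_t ticks) { compare[ch] = ticks; }
    static void timer_set_period(uint32_t ticks)          { period = ticks; }

    static void configure_sampling(void)
    {
        /* Event 1 (CC0): drive the pin that enables the FET. */
        timer_set_compare(CH0, 0);

        /* Event 2 (CC1): hardware-trigger the ADC a fixed, tunable number
         * of ticks later -- the ~25 us settling time, now guaranteed in
         * hardware regardless of the -O level. */
        timer_set_compare(CH1, 800);   /* 800 ticks at 32 MHz = 25 us */

        /* The timer period also fixes the overall ADC sample rate. */
        timer_set_period(32000);       /* e.g. 1 ms -> 1 kHz sampling */
    }

    int main(void)
    {
        configure_sampling();
        /* The settling gap is CC1 - CC0 ticks, independent of compiled code. */
        assert(compare[CH1] - compare[CH0] == 800);
        assert(period == 32000);
        return 0;
    }
    ```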