
TMS320 F28335 - Simulink - CCSv5: Sample Time changes during program execution.

Other Parts Discussed in Thread: TMS320F28335

Hello,

I am working with a TMS320F28335 on the Experimenter Kit: Delfino Control Card with Docking Station USB-EMU [R3].

I built an algorithm in Simulink and I run it on the hardware by loading the generated ".out" file with CCSv5 in debug mode.

I set the Sample Time in Simulink to 20 µs and created a routine that sets a GPIO to "ON" for one sample time (20 µs), then "OFF" for another nine sample times.

When I run the program, I can measure the duration of each program loop with an oscilloscope connected to that GPIO. In this case, the oscilloscope shows a rectangular wave with a period of 200 µs: "ON" for 20 µs and "OFF" for 180 µs.
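(For reference, the routine is conceptually equivalent to the minimal sketch below. This is not the Simulink-generated code; it assumes the standard DSP2833x header files and a periodic interrupt that fires once per 20 µs sample time, with GPIO0 as the measurement pin.)

    /* Minimal sketch (hypothetical, not the generated code): a periodic ISR
     * that runs every 20 us sample time and drives GPIO0 high for one period,
     * then low for the next nine, giving the 200 us waveform described above.
     * Assumes the standard DSP2833x header files (GpioDataRegs, PieCtrlRegs). */
    #include "DSP2833x_Device.h"

    static Uint16 tickCount = 0;

    __interrupt void sampleTimeIsr(void)
    {
        if (tickCount == 0)
        {
            GpioDataRegs.GPASET.bit.GPIO0 = 1;    /* "ON" for one sample time  */
        }
        else if (tickCount == 1)
        {
            GpioDataRegs.GPACLEAR.bit.GPIO0 = 1;  /* "OFF" for the remaining 9 */
        }

        tickCount = (tickCount + 1) % 10;         /* 10 x 20 us = 200 us period */

        /* Acknowledge the PIE group so the next interrupt can fire
         * (the group number depends on which timer/ePWM triggers the ISR). */
        PieCtrlRegs.PIEACK.all = PIEACK_GROUP1;
    }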

As I expand my algorithm in Simulink, the simulation still looks OK, but when I run the new program on the board, the sample time grows to several times its original value of 20 µs.

My question is: did I add too many operations for such a small sample time that the board cannot execute them within the 20 µs I defined in Simulink?
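One way I thought of checking this is to time one step of the algorithm directly on the board. A rough sketch is below; it assumes a 150 MHz SYSCLKOUT (so a 20 µs sample time is a 3000-cycle budget), CPU Timer 0 left free-running, and a hypothetical model_step() standing in for the Simulink-generated step function.

    /* Rough sketch: measure how many CPU cycles one step takes and keep the
     * worst case, so it can be compared against the 20 us (3000-cycle) budget.
     * model_step() is a hypothetical name for the generated step function. */
    #include "DSP2833x_Device.h"

    #define CYCLES_PER_SAMPLE  3000UL   /* 20 us * 150 MHz */

    extern void model_step(void);       /* hypothetical generated step function */

    volatile Uint32 worstCaseCycles = 0;

    void timedStep(void)
    {
        Uint32 start;
        Uint32 used;

        start = CpuTimer0Regs.TIM.all;  /* CPU Timer 0 counts down */
        model_step();
        used  = start - CpuTimer0Regs.TIM.all;

        if (used > worstCaseCycles)
        {
            worstCaseCycles = used;     /* watch this variable in the CCS debugger */
        }
        /* If worstCaseCycles exceeds CYCLES_PER_SAMPLE, the 20 us rate cannot
         * be met and the effective period stretches, as observed on the scope. */
    }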

I noticed that my algorithm runs in a 20 µs loop again when I disconnect the output wire from the hardware output block (I am using a GPIO or a PWM block as the output signal at the end of my algorithm).

Can this be solved with a hardware configuration in CCS or Simulink?

Kind Regards

Leo

  • Hi Leo,

    There seems to be some kind of delay initially. Too many operations is quite possible, but until I have a look at the code I won't be able to comment further. Secondly, this being a Simulink project... I have no experience with the tool.

    Regards,
    Gautam