Hello,
I have been developing with sensorless commutation for a while now and am currently experimenting with sensored speed control again. I am going back to lab 12b to start simple. I should mention that I am controlling a PMSM whose output is connected to a gearbox. I have run this configuration sensorless with few problems, but I am after the smooth startups of sensored control.
I have left the speed reference at the default 0.1 krpm and have the bandwidth (BW) set to 5. After setting the Run_Identify and enableSys flags and letting Rs be recalculated, one of two things happens:
1) the motor spins up to 0.1 krpm and runs smoothly
2) the motor accelerates very quickly (definitely faster than max_accel = 0.2 krpm/s) to either -2.5 or +2.5 krpm and stays there.
The maximum speed of my motor is defined as 2.25 krpm in user.h. The strangest part is that I can't find any consistent reason for one case happening rather than the other; it seems completely random.
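For reference, here is roughly how things are set up. The #define names and gMotorVars fields below are from my memory of user.h and the lab 12b watch window, so treat them as approximate rather than exact:

    // user.h entries (names/values approximate)
    #define USER_MOTOR_MAX_SPEED_KRPM  (2.25)   // maximum motor speed
    #define USER_MAX_ACCEL_KRPMPS      (0.2)    // the max_accel referred to above

    // Set from the CCS watch window (I don't recall the exact name of the
    // SpinTAC bandwidth field, so it is omitted; I set it to 5)
    gMotorVars.SpeedRef_krpm     = _IQ(0.1);    // default speed reference
    gMotorVars.Flag_enableSys    = true;
    gMotorVars.Flag_Run_Identify = true;        // Rs is recalculated, then the motor starts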
I dug a bit deeper to make sure all the control signals going into the SpinTAC Vel CTL unit were correct.
When the motor is running at the desired 0.1 krpm, st_obj.vel.ctl.VelLpf = 0.004 as expected (multiplying by the full-scale constant of 24 gives ~0.1 krpm). The .Out value is 0.036.
When the motor is running away, I noted st_obj.vel.ctl.VelRef = 0.00417 (times 24 gives 0.1 krpm, so my velocity reference is the correct, desired value). The .VelLpf = -0.1059 (times 24 gives roughly -2.5 krpm, so the runaway speed is getting back to the Vel CTL unit correctly). The .Out value is 0.3636.
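For anyone checking my arithmetic, the constant 24 is just the full-scale speed in krpm that falls out of user.h. The quick sanity check below uses an illustrative 800 Hz full-scale frequency and 2 pole pairs (one combination that yields 24; substitute your own values), and it is plain desktop C rather than anything from the project:

    #include <stdio.h>

    // Illustrative user.h-style values; one combination that gives a 24 krpm full scale
    #define USER_IQ_FULL_SCALE_FREQ_Hz   (800.0)
    #define USER_MOTOR_NUM_POLE_PAIRS    (2)

    int main(void)
    {
        // full-scale speed [krpm] = f[Hz] * 60 [s/min] / (pole pairs * 1000 [rpm/krpm])
        double krpm_per_pu = USER_IQ_FULL_SCALE_FREQ_Hz * 60.0
                             / (USER_MOTOR_NUM_POLE_PAIRS * 1000.0);   // = 24.0

        printf("scale  = %.1f krpm per unit\n", krpm_per_pu);
        printf("VelRef  0.00417 pu -> %+.3f krpm\n",  0.00417 * krpm_per_pu);  // ~ +0.100
        printf("VelLpf -0.1059  pu -> %+.3f krpm\n", -0.1059  * krpm_per_pu);  // ~ -2.542
        printf("VelLpf  0.004   pu -> %+.3f krpm\n",  0.004   * krpm_per_pu);  // ~ +0.096

        return 0;
    }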
In this case the runaway speed is a large negative value, and the system is trying to apply a large positive torque to correct it, yet the motor keeps spinning in the wrong direction. My next thought would be a problem with the CTRL object, but I don't see any error codes there. I will investigate the settings and input signals to the CTRL object tomorrow.
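In case someone spots something I'm missing, these are the watch expressions I plan to check first (field names from my memory of the lab 12b project, so they may be slightly off):

    // CTRL / estimator state -- both should report OnLine while the motor is running
    gMotorVars.CtrlState              // expect CTRL_State_OnLine
    gMotorVars.EstState               // expect EST_State_OnLine
    gMotorVars.Flag_MotorIdentified   // should be true after identification/recalibration

    // SpinTAC velocity controller error code -- nonzero would indicate a fault
    gMotorVars.SpinTAC.VelCtlErrorID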
Any ideas on this problem? It's got me pretty stumped...