I am using a Tamagawa encoder with IDDK 2.2. I noticed that in cl_f2838x_tmdxiddk_cpu1.c, GPIO32 is never configured, so I decided to drive GPIO32 high myself.
I added the following code to configureGPIO(). Is this simply a bug, or is the omission intentional?
// GPIO32->TamagawaAbsEnc
GPIO_setMasterCore(32, GPIO_CORE_CPU1);
GPIO_setPadConfig(32, GPIO_PIN_TYPE_STD);
GPIO_setPinConfig(GPIO_32_GPIO32);
GPIO_setDirectionMode(32, GPIO_DIR_MODE_OUT);
GPIO_writePin(32,1);
I think the code you're looking for is in fcl_tformat_f2838x_config.c. There's a function in there called tformat_setupGPIO() that does configure GPIO32 as a GPIO output. Note that the comments in the code are wrong, but looking at the GPIO number #defines in fcl_tformat_f2838x_config.h, you'll see the actual pins.
Whitney
Thank you very much for your quick answer.
In my copy, the comment in tformat_setupGPIO() says the power-enable pin is GPIO139, but ENC_PWREN_PIN is defined as 32.
void tformat_setupGPIO(void)
{
    //
    // GPIO7 is SPI Clk slave
    //
    GPIO_setMasterCore(ENC_CLK_PWM_PIN, GPIO_CORE_CPU1);
    GPIO_setPinConfig(ENC_CLK_PWM_CFG);

    //
    // GPIO63 is the SPISIMOB
    //
    GPIO_setMasterCore(ENC_SPI_SIMO_PIN, GPIO_CORE_CPU1);
    GPIO_setPinConfig(ENC_SPI_SIMO_CFG);
    GPIO_setQualificationMode(ENC_SPI_SIMO_PIN, GPIO_QUAL_ASYNC);

    //
    // GPIO64 is the SPISOMIB
    //
    GPIO_setMasterCore(ENC_SPI_SOMI_PIN, GPIO_CORE_CPU1);
    GPIO_setPinConfig(ENC_SPI_SOMI_CFG);
    GPIO_setQualificationMode(ENC_SPI_SOMI_PIN, GPIO_QUAL_ASYNC);

    //
    // GPIO65 is the SPICLKB
    //
    GPIO_setMasterCore(ENC_SPI_CLK_PIN, GPIO_CORE_CPU1);
    GPIO_setPinConfig(ENC_SPI_CLK_CFG);
    GPIO_setQualificationMode(ENC_SPI_CLK_PIN, GPIO_QUAL_ASYNC);

    //
    // GPIO66 is the SPISTEB
    //
    GPIO_setMasterCore(ENC_SPI_STE_PIN, GPIO_CORE_CPU1);
    GPIO_setPinConfig(ENC_SPI_STE_CFG);
    GPIO_setQualificationMode(ENC_SPI_STE_PIN, GPIO_QUAL_ASYNC);

    //
    // GPIO9 is tformat TxEN
    //
    GPIO_setMasterCore(ENC_TXEN_PIN, GPIO_CORE_CPU1);
    GPIO_setPinConfig(ENC_TXEN_CFG); // out x bar

    //
    // GPIO139 is PwrEN
    //
    GPIO_setMasterCore(ENC_PWREN_PIN, GPIO_CORE_CPU1);
    GPIO_setDirectionMode(ENC_PWREN_PIN, GPIO_DIR_MODE_OUT);
}
There seems to be another reason why the encoder does not power on: the function configures ENC_PWREN_PIN as an output, but it never writes a high level to it (there is no GPIO_writePin() call).
Thank you very much.
I'll forward your thread to one of our encoder experts to help you debug. Unfortunately they're out of the office for the holidays and likely won't be able to get back to you until early January. Thank you for your patience.
Whitney
Thank you for your kind answer.
I have connected a Tamagawa TS5700N8501, and as soon as I set GPIO32 high, the power goes down. When I connected a regulated power supply directly to the encoder, the current consumption was 120 mA. According to the encoder's data sheet it draws 125 mA typical, 150 mA max, so there is nothing wrong with the encoder. It may be that the power-supply modules M3 and M9 cannot supply enough current for this encoder. I will try supplying 5 V to the encoder directly from an external regulated supply, sharing only GND as a common. Is there any better way?
Hello - yes, we have seen that these encoders may need to be powered directly, with a common ground, instead of being powered through the BoosterPack or IDDK. Your solution seems fine.
Thank you for your answer, Lori Heustess. The Tamagawa TS5700N8501 has a large input capacitance on its power-supply line.
Therefore, when its power is switched on while the program is running, a large inrush current flows momentarily, causing the supply voltage to drop and the CPU to reset.
In the long run, though, I think the encoder's power line should be switched with a jumper instead of being switched on and off by software.
If the encoder is powered from the beginning, it follows the normal sequence of power-on and reset release. Power should be controlled by a hardware jumper, not by software.
There is another serious question.
The EnDat module does not receive the clock signal that is supposed to be generated by PWM4; the actual code only sets up a PWM4 trip event.
At least in the IDDK2.2 that I have in front of me, the ENDAT module is not receiving any clock pulses.
My jumper settings are J10 ON and J18 1-2 ON. How are the trip events related to communication with the Tamagawa Seiki encoder?
I don't think the trip event is needed to communicate with the encoder. What I think needs to be configured on PWM4 is generating PWM pulses, not associating it with a trip event. Please explain in detail if I'm wrong.
Hello,
The EnDat module does not receive the clock signal that is supposed to be generated by PWM4; the actual code only sets up a PWM4 trip event.
One clarification, the CLB output overrides the ePWM4 output. We use the trip functionality of the PWM only to set the pin level before the CLB takes over the pin. The PWM peripheral itself does not toggle the pin. Let me know if that helps clarify.
Regards
Lori
Thank you for your reply.
So why is my IDDK2.2 board not able to communicate with the encoder at all? It certainly seems that the first control field is sending something. But I am not getting any response back.
Currently, the CLK pin of ENDAT is still L level on the oscilloscope.
Also, SPI is MSB-first, but does the McBSP convert it to LSB-first?
Currently, the CLK pin of ENDAT is still L level on the oscilloscope.
I know EnDat works fine on the TIDM-1008 hardware. (https://www.ti.com/tool/TIDM-1008). It is possible there is a pinout difference on the IDDK. Was anything changed from the released version of the software?
Also, SPI is MSB-first, but does the McBSP convert it to LSB-first?
The T-format and the EnDat use the SPI. I'm not sure where the McBSP question comes in?
Hello, here is a follow-up question.
I tried removing and reinstalling MotorControl SDK 3.03, but something seems wrong. I found that all the CLB XBAR calls in pm_tformat_source.c of PM_tformat_lib are commented out. One XBAR input and one output are defined, but according to TIDUE74C there should be two XBAR outputs. What is the purpose of this change?
Has debugging not been completed?
void tformat_initCLBXBAR()
{
    // XBAR_setCLBMuxConfig(XBAR_AUXSIG0, XBAR_CLB_MUX01_INPUTXBAR1);
    // XBAR_enableCLBMux(XBAR_AUXSIG0, XBAR_MUX01);
    // XBAR_setOutputMuxConfig(XBAR_OUTPUT6, XBAR_OUT_MUX13_CLB4_OUT4);
    // XBAR_enableOutputMux(XBAR_OUTPUT6, XBAR_MUX13);
}
I found that all the CLB XBAR calls in pm_tformat_source.c of PM_tformat_lib are commented out. One XBAR input and one output are defined, but according to TIDUE74C there should be two XBAR outputs.
This function is left over from when the library was provided only as a binary, not as source. The XBAR configuration was kept outside the library because it couldn't be modified while the library shipped only as a binary. The code was commented out, but it should have been removed completely.
Since then, the XBAR configuration has been moved into the system example code, as shown below.

In tformat.c, tformat_init():

//
// XBAR configuration for tformat operation
//
tformat_configXBAR();

....

void tformat_configXBAR(void)
{
    //
    // Connect InputXbar-INPUT1 to GPIO63 - SPISIMO
    //
    XBAR_setInputPin(XBAR_INPUT1, 63);
    XBAR_setCLBMuxConfig(XBAR_AUXSIG0, XBAR_CLB_MUX01_INPUTXBAR1);
    XBAR_enableCLBMux(XBAR_AUXSIG0, XBAR_MUX01);
    XBAR_setOutputMuxConfig(XBAR_OUTPUT6, XBAR_OUT_MUX13_CLB4_OUT4);
    XBAR_enableOutputMux(XBAR_OUTPUT6, XBAR_MUX13);
}
I have another question.
I have confirmed that the T-format encoder interface works with the TIDM-1011. Based on this, I did a thorough comparison against IDDK 2.2. There was a change in the pin assignments in software, and I confirmed that the change is correct. Comparing the schematics, I found that on the TIDM-1011 SPISTE is pulled to GND through 150 kΩ, while on IDDK 2.2 it is tied to GND through 0 Ω. What do you think: should I leave J13 open, or add 150 kΩ to GND?
With a 150 kΩ resistor it is a weak pull-down and can be susceptible to noise or ground bounce if the GND is bad, but it leaves the option of using the pin as an output if ever needed. A 0 Ω pull-down is obviously strong. As a compromise, I would consider a 2 kΩ pull-down instead.
My question is not whether SPISTE should be shorted to GND, but whether the level should be left free. I am not asking about a pull-down, but rather whether to use a 150 kΩ termination resistor or to eliminate the termination resistor entirely.
Please review the TRM for such possible configurations. The signal is needed for selecting the SPI peripheral.
Adding to Ramesh's comment: the SPI for the encoder solution is configured as a slave and is controlled (clocked) by the CLB. Its transmit is always enabled (thus SPISTEn is low).
The RS-485 driver IC's hardware does not accept any reception while transmission is enabled. The signal cannot be "L" at all times: as soon as transmission finishes, it must be set to "H", or nothing will be received.
SPISTE is the enable for the internal SPI peripheral. The RS485 line drivers are connected to a signal called TxEn in the TIDM1011 documentation.
I replaced the F28379D board with the bundled one, set J13 to 150 kΩ, and recreated the project. Please let me know if there is any difference between the TMS320F28388 board and this part of IDDK 2.2. Or is my TMS320F28388 board broken?
Hello,
Unfortunately I'm not sure what is working and what is not working for you. Please check my understanding -
t-format on the IDDK project built for the F28379D - works?
t-format on the IDDK project built for the F28388x - doesn't work?
Using an oscilloscope - what activity on the data lines do you see?
Hello.
The F28379D board communicates with the encoder.
The F28388D board cannot communicate with the encoder.
That is the current situation.
I will try to get a new F28388D board and see if the same thing happens.
The clock on my F28388D board is 2.5 MHz. There also seems to be a 20 MHz clock setting, but I left it at the default. I will work through the clock division ratios to check whether they are correct.
I will also check the daughter card schematic to see if it is correct.
Hello.
I got a new TMS320F28388D daughter card.
After replacing it, the system communicates with the encoder correctly. J13 is set to 150 kΩ.
I do not know why the old card was broken.
Now I can continue my development. Thank you very much.