EtherCAT Slave Stack Configuration



When looking through the full slave stack code, as compiled and executed on the ICE board, I see in the 'ecat_def.h' file that the macro specifying a 16-bit host architecture is set to 1. Why is this the case for the 32-bit AM335x processor?

 

/* CONTROLLER_16BIT: Shall be set if the host controller is a 16Bit architecture */

#define                CONTROLLER_16BIT                          1
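 

For a 32-bit target I would have expected something along these lines; note that CONTROLLER_32BIT is only my guess at the companion macro and the exact names may differ between SSC versions:

/* Sketch only -- CONTROLLER_32BIT is an assumed companion macro and may
   be named differently in your copy of ecat_def.h */
#define CONTROLLER_16BIT    0
#define CONTROLLER_32BIT    1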

  • Scott,

    I strongly suggest you move to the latest IA-SDK and SSC sources; otherwise we will keep going in circles over issues that are already known and fixed.

    In any case, we removed the 16-bit setting from this file in the latest SDK; it does not make sense for the ARM Cortex-A8. In general the SSC is designed to be portable across 8-, 16-, and 32-bit architectures. We currently use the 8-bit (byte-wise) format mainly to avoid word-alignment issues on the 32-bit architecture, which in some cases leaves headroom for optimizing the data accesses (see the sketch below). Right now we are focusing on stability and ease of use. Internal data accesses are usually not the limiting factor for overall communication performance on our architecture, which has a fast internal bus anyway.
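
    To illustrate the alignment trade-off, here is a generic sketch (not actual SSC code) of byte-wise versus 16-bit access to a process data buffer; the function names are made up for the example:

    /* Generic illustration only, not SSC code. */
    #include <stdint.h>

    /* Byte-wise (8-bit format) read: works at any buffer offset on any core. */
    static uint16_t read_u16_bytewise(const uint8_t *buf, unsigned offset)
    {
        return (uint16_t)(buf[offset] | ((uint16_t)buf[offset + 1u] << 8)); /* little-endian */
    }

    /* Direct 16-bit read: a single load instead of two, but only valid when
     * buf + offset is 2-byte aligned; an unaligned access can fault or be
     * slow on ARM, which is why the byte-wise form is the safe default. */
    static uint16_t read_u16_aligned(const uint8_t *buf, unsigned offset)
    {
        return *(const uint16_t *)(const void *)(buf + offset);
    }

    The byte-wise form is what I mean by the 8-bit format; moving hot paths to aligned 16- or 32-bit accesses is the kind of optimization left to the application.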

    In the end, customers may want to adapt the SSC, or any other stack, to their application needs and also work on performance optimization to differentiate themselves from the competition. There are a few more areas in our examples where we left room for improvement...

    Regards.