I have read that sizeof(char) returns 1 and sizeof(int) also returns 1, since the F28335 can only address 16-bit values.
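To make sure of that part, I checked the char size directly (limits.h is the standard header for CHAR_BIT, which I assume the TI compiler provides):

#include <limits.h>

Uint16 lCharSize = sizeof(char); // 1
Uint16 lCharBits = CHAR_BIT;     // I believe this reports 16 on the C28x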
I have trouble understanding how the sizeof operator behaves in CCS. For example, I run the following code:
typedef struct
{
    int16   id;     // Var Id
    float32 value;  // Var value
} sVarsStorage;
Uint16 lIntSize = sizeof(int16);
Uint16 lFloatSize = sizeof(float32);
Uint16 lBufEntrySize = sizeof(sVarsStorage);
I obtain the following results:
lIntSize = 1 as expected.
lFloatSize = 2 as expected.
lBufEntrySize = 4 ??? (I would have expected 3!)
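To see where the extra word goes, I was thinking of checking the member offsets with offsetof (just a sketch; the values in the comments are my guesses, not something I have measured):

#include <stddef.h> // offsetof

Uint16 lIdOffset    = offsetof(sVarsStorage, id);    // presumably 0
Uint16 lValueOffset = offsetof(sVarsStorage, value); // 1 or 2? a 2 here would explain the extra word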
What will happen if I use a char* to compute the CRC of a table of n elements of type sVarsStorage?
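For context, this is roughly what I have in mind. crc16_update is just a placeholder for my actual CRC step and N_ENTRIES is an example count, so please read it as a sketch rather than my real code:

#define N_ENTRIES 8

extern Uint16 crc16_update(Uint16 crc, Uint16 data);  // placeholder for my actual CRC step

sVarsStorage gTable[N_ENTRIES];

Uint16 ComputeTableCrc(void)
{
    const char *p = (const char *)gTable;             // char is 16 bits wide on the C28x
    Uint16 nUnits = N_ENTRIES * sizeof(sVarsStorage); // 8 * 4 = 32 "chars", not 8 * 3 = 24
    Uint16 crc = 0xFFFF;
    Uint16 i;

    for (i = 0U; i < nUnits; i++)
    {
        // Each p[i] is a 16-bit unit here, and one unit per entry seems to be padding:
        // is this what the CRC will actually see?
        crc = crc16_update(crc, (Uint16)p[i]);
    }
    return crc;
}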