TMS320F28335: How to distinguish between 16-bit and 32-bit instructions?

Part Number: TMS320F28335

Hi Team,

1) When the program is built in CCS and the hex file is generated, all of the code is packed together as one continuous stream (see the hex file below). How does the processor distinguish which of these words are 16-bit instructions and which are 32-bit instructions?

2) When single-stepping in CCS, the PC advances one instruction at a time. How does the processor know the size of the instruction being executed? Is there a flag bit?

Could you help check this case? Thanks.

Best Regards,

Cherry

  • To understand the format of that hex file, please search the C28x assembly tools manual for the sub-chapter titled Intel MCS-86 Object Format.  At that point in processing, everything is handled as a stream of 16-bit words that is associated with some starting address and has some length; nothing in the file marks a word as a 16-bit instruction or as half of a 32-bit one (a small parsing sketch follows the quote below).

    Once things get to the point where the PC has an address in it and it is time to start processing an instruction, then what?  Search the C28x instruction set manual for the sub-chapter titled Pipelining of Instructions.  Quote:

    The decode 1 (D1) hardware identifies instruction boundaries in the instruction-fetch queue and determines the size of the next instruction to be executed.
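
    Purely as an illustration of the first point, here is a minimal C sketch that parses one Intel hex data record into its byte count, load address, and 16-bit words. The record shown is made up, and the low-byte-first pairing is an assumption (check the hex utility options for the actual byte order of your build). The point is that the record carries only an address and raw data words, with nothing that says where instructions begin or end.

        #include <stdio.h>
        #include <stdint.h>

        /* Convert one hex character to its value. */
        static int hexval(char c)
        {
            if (c >= '0' && c <= '9') return c - '0';
            if (c >= 'A' && c <= 'F') return c - 'A' + 10;
            if (c >= 'a' && c <= 'f') return c - 'a' + 10;
            return -1;
        }

        /* Two hex characters at s[i], s[i+1] -> one byte. */
        static int byteat(const char *s, int i)
        {
            return (hexval(s[i]) << 4) | hexval(s[i + 1]);
        }

        int main(void)
        {
            /* Made-up data record: 4 data bytes loaded at address 0x0010. */
            const char *rec = ":0400100000A9008801BA";
            if (rec[0] != ':') return 1;

            int count = byteat(rec, 1);                         /* data byte count */
            int addr  = (byteat(rec, 3) << 8) | byteat(rec, 5); /* load address    */
            int type  = byteat(rec, 7);                         /* 00 = data       */

            printf("type %02X, %d bytes at address %04X\n", type, count, addr);

            /* Pair the data bytes into 16-bit words (low byte first is an
             * assumption). Nothing here distinguishes a stand-alone 16-bit
             * instruction from one half of a 32-bit instruction. */
            for (int i = 0; i + 1 < count; i += 2) {
                int lo = byteat(rec, 9 + 2 * i);
                int hi = byteat(rec, 9 + 2 * (i + 1));
                printf("word: %04X\n", (hi << 8) | lo);
            }
            return 0;
        }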
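
    And for the second point: conceptually, the D1 hardware looks at the first 16-bit word of the next instruction and, from the opcode encoding alone, knows whether a second word follows. There is no flag bit stored alongside the program; the size is implied by the opcode itself. The sketch below is only a toy model of that idea: the is_two_word_opcode() rule is invented for illustration, and the real instruction boundaries come from the C28x opcode map in the instruction set manual.

        #include <stdio.h>
        #include <stdint.h>

        /* Invented rule, for illustration only: pretend that opcodes with
         * the top two bits set carry a 16-bit extension word. */
        static int is_two_word_opcode(uint16_t first_word)
        {
            return (first_word & 0xC000) == 0xC000;
        }

        int main(void)
        {
            /* Made-up stream of program words, as fetched into the
             * instruction-fetch queue. */
            uint16_t queue[] = { 0x1234, 0xC001, 0xBEEF, 0x5678 };
            size_t n = sizeof queue / sizeof queue[0];

            for (size_t pc = 0; pc < n; ) {
                if (is_two_word_opcode(queue[pc]) && pc + 1 < n) {
                    printf("offset %zu: 32-bit instruction %04X %04X\n",
                           pc, queue[pc], queue[pc + 1]);
                    pc += 2;   /* instruction boundary: skip both words */
                } else {
                    printf("offset %zu: 16-bit instruction %04X\n",
                           pc, queue[pc]);
                    pc += 1;
                }
            }
            return 0;
        }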

    Thanks and regards,

    -George