I would like to know the differences in the code generated by the MSP430 compiler for division versus right-shifting.
For instance, with a signed 16-bit number holding a 12-bit left-adjusted value, is there any advantage to dividing by 16 rather than shifting right by 4 bits? I'd like to compare the clock cycles and resource requirements of both, and see how the compiler generates the resulting assembly. Does the MSP430 have a native right-shift opcode? I didn't see one in the instruction set.
Is there a way for me to benchmark both approaches in code and measure the clock cycles empirically? I've never had to do this before.
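One common way to do this on-chip is to bracket the operation with reads of a free-running timer. The sketch below is an assumption-laden outline, not a drop-in answer: it assumes a part whose Timer_A registers are named `TA0CTL`/`TA0R` (older families use `TACTL`/`TAR`) and that SMCLK runs at the same frequency as MCLK, so one timer tick equals one CPU cycle.

```c
#include <msp430.h>
#include <stdint.h>

volatile int16_t result;               /* volatile so the work isn't optimized away */

int main(void)
{
    WDTCTL = WDTPW | WDTHOLD;          /* stop the watchdog */

    /* Run Timer_A continuously from SMCLK (TASSEL_2, MC_2), cleared first.
       Assumes SMCLK == MCLK; adjust the source/dividers for your clock setup. */
    TA0CTL = TASSEL_2 | MC_2 | TACLR;

    int16_t x = -111;                  /* arbitrary test value */

    uint16_t t0 = TA0R;
    result = x / 16;                   /* swap in: result = x >> 4; for the other case */
    uint16_t t1 = TA0R;

    uint16_t cycles = t1 - t0;         /* includes the overhead of the two TA0R reads */
    (void)cycles;                      /* inspect in the debugger or log it out */

    for (;;);                          /* halt */
}
```

To isolate the operation's cost, measure an empty back-to-back pair of `TA0R` reads first and subtract that overhead. Alternatively, since MSP430 instruction timings are deterministic, you can count cycles by hand from the compiler's generated listing (`.lst`) file instead of measuring at all.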