How different are 8-bit microcontrollers from 32-bit microcontrollers when it comes to programming them?

Right, so we have 8-bit, 16-bit and 32-bit microcontrollers in this world at the moment.

All of them are often used. How different is it to program 8-bit and 16-bit microcontrollers?

I mean, does it require different techniques or skills? Let's take Microchip for example.

What new things does a person need to learn if they want to transition from 8-bit microcontrollers to 32-bit microcontrollers?

  • Let's take Microchip for example.

    One of the worst examples possible, at least with regard to 8-bit MCUs. And when it comes to strategic decisions...

    The 8- and 16-bit controllers are a product of their time, or rather, of the then-available technology. With just 5k, 10k, or 20k transistors per die, you can't create fancy cores and peripherals, so designers packed as much into the silicon as they could. They are less efficient than 32-bit MCUs for a simple reason: the encoding of an instruction is independent of the (structural) bus width. You need bits to encode the instruction itself, the operand(s), and the addressing mode. When fetching a 16-bit-wide instruction, a (real) 8-bit MCU needs two cycles, while a 32-bit MCU fetches two such instructions at once. The greater bus width and operand length also play a role (e.g. in arithmetic instructions), but are often less significant. And as a second great advantage, 32-bit MCUs can cover their whole address range with one register - 8-bit MCUs cannot, and 16-bit MCUs can only when staying within 64 KByte. (The 20-bit extended MSP430 loses about 10..20% performance when switching to extended address mode.)
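    As a rough, host-runnable illustration of the operand-length point (a sketch, not vendor code): an 8-bit ALU must build a 32-bit addition out of four byte-wide adds with carry propagation - the ADD/ADC sequence an 8-bit compiler emits - where a 32-bit core does it in one instruction.

    ```c
    #include <stdint.h>
    #include <stdio.h>

    /* Sketch: how an 8-bit ALU adds two 32-bit values byte by byte,
       propagating the carry - four ADD/ADC steps instead of one ADD. */
    static uint32_t add32_on_8bit(uint32_t a, uint32_t b) {
        uint32_t result = 0;
        unsigned carry = 0;
        for (int i = 0; i < 4; i++) {
            unsigned sum = ((a >> (8 * i)) & 0xFFu)
                         + ((b >> (8 * i)) & 0xFFu)
                         + carry;
            carry = sum >> 8;                       /* carry into next byte */
            result |= (uint32_t)(sum & 0xFFu) << (8 * i);
        }
        return result;
    }

    int main(void) {
        /* Carry has to ripple through three byte boundaries here. */
        printf("0x%08lX\n", (unsigned long)add32_on_8bit(0x01FFFFFFu, 1u));
        return 0;
    }
    ```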

    So, why do 8-bit and 16-bit MCUs still "live"? Two simple reasons.

    The first is backward compatibility. Companies are often happy to replace an old, "expired" controller with as little effort as possible. Often the replacement not only runs the same instruction set but is even pin-compatible (this is Microchip's business model ...).

    Second, for certain applications, core performance is not really important - peripherals play a much greater role. If core performance suffices to feed the peripherals (and the MCU is cheaper than the competitors'), project managers will even settle for terrible core designs without a (real) stack, with a single interrupt vector, and with a minimum of 4 clock cycles per instruction (for those not in the know, I'm talking about the Microchip PIC16/PIC18).
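    To make the single-interrupt-vector point concrete, here is a minimal host-runnable sketch (the flag and counter names are invented for illustration, not any vendor's registers): with only one vector, the lone ISR has to poll every possible source to find out which peripheral fired.

    ```c
    #include <stdint.h>
    #include <stdio.h>

    /* Stand-ins for peripheral interrupt-flag bits (names invented). */
    static volatile uint8_t timer_flag = 0, uart_flag = 0;
    static int timer_events = 0, uart_events = 0;

    /* The one ISR a single-vector core gets: check every source,
       clear its flag, and dispatch - all in software. */
    static void isr(void) {
        if (timer_flag) { timer_flag = 0; timer_events++; }
        if (uart_flag)  { uart_flag  = 0; uart_events++;  }
    }

    int main(void) {
        timer_flag = 1;   /* pretend both peripherals fired at once */
        uart_flag  = 1;
        isr();            /* one entry point must serve them both */
        printf("timer=%d uart=%d\n", timer_events, uart_events);
        return 0;
    }
    ```

    On a typical 32-bit core, by contrast, each source gets its own vector and the hardware does this dispatch for you.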

    In the end, it often boils down to "the cheapest part that can deal with the task".

    I mean, does it require different techniques or skills? Let's take Microchip for example.

    What new things does a person need to learn if they want to transition from 8-bit microcontrollers to 32-bit microcontrollers?

    They are not fundamentally different to program; however, 32-bit MCUs tend to have more complex peripherals, and more interconnections between them. DMA, for example, is rarely supported on 8-bit MCUs.
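    For readers who haven't met DMA yet, a minimal sketch of the idea (the register layout below is invented for illustration and simulated on the host - real controllers differ per vendor): you program source, destination, and count, set a start bit, and the hardware moves the data while the CPU does other work, instead of the CPU shuttling every byte in an interrupt handler.

    ```c
    #include <stdint.h>
    #include <string.h>
    #include <stdio.h>

    /* Hypothetical DMA channel register layout (names invented). */
    typedef struct {
        const uint8_t *src;  /* source address               */
        uint8_t       *dst;  /* destination address          */
        uint16_t       len;  /* transfer count               */
        uint8_t        ctrl; /* bit 0: start/busy            */
    } DmaChannel;

    /* Host-side stand-in for what the DMA engine does once triggered. */
    static void dma_run(DmaChannel *ch) {
        if (ch->ctrl & 1u) {
            memcpy(ch->dst, ch->src, ch->len);  /* moves data without CPU loops */
            ch->ctrl &= (uint8_t)~1u;           /* hardware clears start bit */
        }
    }

    int main(void) {
        const uint8_t tx[4] = {1, 2, 3, 4};
        uint8_t rx[4] = {0};
        DmaChannel ch = { tx, rx, sizeof tx, 1u };  /* program and start */
        dma_run(&ch);                               /* CPU would be free meanwhile */
        printf("%d %d %d %d\n", rx[0], rx[1], rx[2], rx[3]);
        return 0;
    }
    ```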

    One difference I have experienced is the approach of customer companies - 8-bit MCUs tend to end up in cheap mass products, where fractions of a cent (in BOM cost) count. The SW design usually ignores "good practices" for the sake of price. Structured design, modularity, portability? If it adds to code size, forget it. One needs to like this style of development, so take care which road you take now ...

  • user4734723 said:
    Right, so we have 8-bit, 16-bit and 32-bit microcontrollers in this world at the moment.

    We also have 4-bit controllers (although I've not used them). There may even be some smaller ones around.

    Currently, embedded computing stretches from 4-bit, single-mask-ROM processors all the way to 64-bit (or higher) multi-core/multi-CPU assemblies.

    user4734723 said:
    All of them are often used. How different is it to program 8-bit and 16-bit microcontrollers?

    Simultaneously not at all different and completely different.

    There actually tends to be a lot of similarity between 16-bit and 32-bit micros: relatively uniform architectures, larger sets of peripherals and timers. The 32-bit micros tend to grow to larger memory sizes and more sophisticated peripherals, but there's a lot of overlap. As a result, 16-bit architectures are being squeezed by the 32-bit ones; you may now find 32-bit chips that are cheaper and more capable than a given 16-bit chip.

    8-bit micros tend to have simpler and/or specialized peripherals, and they tend to be cheap. One reason they continue to appear in new designs is not only cost but installed base. If you are building an upgraded version of an existing product with an 8051 in it, you will look for a variant of that 8051 to fit the new product. Likewise, a company that has invested a lot in a specific architecture for other projects will favour that architecture until the pain of an architecture change is less than the advantage of switching (the pain comes in tool cost, retraining, loss of pre-existing working examples, etc.).

    8-bit micros are more likely to involve significant assembly and other non-portable constructs (such as compiler language extensions).
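    As a sketch of such an extension: Keil C51's `sbit` keyword maps a name onto one bit of a bit-addressable SFR - convenient, but it compiles only with that toolchain. The host build below falls back to a portable shadow variable instead (the LED mapping is invented for illustration).

    ```c
    #include <stdint.h>
    #include <stdio.h>

    #ifdef __C51__
    /* Keil C51 only: direct single-bit access to port pin P1.0. */
    sbit LED = P1^0;
    #define led_toggle() (LED = !LED)
    #else
    /* Portable fallback: host-side stand-in for the P1 port register. */
    static volatile uint8_t P1_shadow = 0;
    #define led_toggle() (P1_shadow ^= 1u)   /* read-modify-write bit 0 */
    #endif

    int main(void) {
        led_toggle();
        led_toggle();
        led_toggle();
    #ifndef __C51__
        printf("%u\n", (unsigned)(P1_shadow & 1u));  /* 1 after an odd count */
    #endif
        return 0;
    }
    ```

    Code written in the `sbit` style is smaller and faster on the 8051, but it is exactly the kind of construct that does not move to a 32-bit part without rework.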

    user4734723 said:
    I mean, does it require different techniques or skills?

    Require? No. Are those who have programmed only 8-bit micros going to have to unlearn some habits? Probably.

    user4734723 said:
    Let's take Microchip for example.

    I'm with f.m. Let's not.

    user4734723 said:
    What new things does a person need to learn if they want to transition from 8-bit microcontrollers to 32-bit microcontrollers?

    In either case, learn proper programming techniques and at least some electronics.

    Most important, always look to learn to do it better. Respect the history of embedded computing - there is a lot of wisdom there - but don't be slavish to it: there is also a lot of folly (some of it from working around resource constraints that no longer exist).

    Robert

  • May I applaud the efforts of both of these forum "Giants"? Great job, f.m. & Robert!

    I cannot add to either - but as more of a "biz guy" - I found the "Programming Focus" not to be your proper (i.e. most vital) target.
    Both f.m. & Robert steered (again properly - IMO) toward the key MCU implementation & design issues (rather than just programming), which far more dominate device selection, utilization, and choice...