This thread has been locked.

If you have a related question, please click the "Ask a related question" button in the top right corner. The newly created question will be automatically linked to this question.

What's the point of 16/32-bit to 128-bit UUID conversion?

I understand that 128-bit values are awkward to work with, but why does the Bluetooth standard allow you to make a full UUID *out of* a value with fewer bits? Doesn't this only allow for 2^32 values rather than the 2^128 that is understandably "considered unique over all space and time"?

  • No, there is just an additional representation for the 16- and 32-bit UUIDs. Only a very small range of the full 128-bit value space is used to mirror the short UUIDs into the 128-bit realm.
  • Hi,

    In theory, all UUIDs are 128-bit, but for the sake of efficiency the Bluetooth SIG has defined a base UUID (00000000-0000-1000-8000-00805F9B34FB) and a scheme whereby a portion of it is replaced by an abbreviated 16-bit or 32-bit UUID. In reality, then, each 16-bit UUID is also a 128-bit UUID within the base-UUID namespace.


    It should be noted that only adopted profiles can use this short form (a range of 16-bit and 32-bit values is also available for purchase from the SIG). Therefore, all proprietary attribute UUIDs must use the long 128-bit form over the air. It is up to you how to generate that value. Most vendors use the same concept as the SIG but with their own base UUID, so that 16-bit values can still be used internally, but it is also possible to give completely unrelated UUIDs to all your proprietary services and characteristics.

    Best regards,
    Aslak
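
  • The base-UUID expansion described above can be sketched in code. This is a minimal Python illustration, not from the thread: the short value replaces bytes 12-15 of the Base UUID (the top 32 bits), and 0x180D (the adopted Heart Rate service UUID) is used here as an example input.

    ```python
    # Bluetooth Base UUID: 00000000-0000-1000-8000-00805F9B34FB, as one 128-bit integer
    BASE_UUID = 0x00000000_0000_1000_8000_00805F9B34FB

    def to_128bit(short_uuid: int) -> int:
        """Expand a 16- or 32-bit Bluetooth UUID to its full 128-bit form.

        The short value occupies the top 32 bits of the Base UUID,
        i.e. full = base + (short << 96).
        """
        return BASE_UUID | (short_uuid << 96)

    def fmt(uuid128: int) -> str:
        """Render a 128-bit integer in the standard 8-4-4-4-12 hex layout."""
        s = f"{uuid128:032X}"
        return f"{s[0:8]}-{s[8:12]}-{s[12:16]}-{s[16:20]}-{s[20:32]}"

    # Example: 0x180D is the adopted 16-bit UUID for the Heart Rate service
    print(fmt(to_128bit(0x180D)))  # 0000180D-0000-1000-8000-00805F9B34FB
    ```

    Note that only a sliver of the 128-bit space (the top 32 bits of values built on this one base) is reserved for short UUIDs, which is why the scheme does not meaningfully reduce the uniqueness of the rest of the space.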