Hello,
To check the whiteness of the internal noise, I plotted histograms of one frame of AWR1243 ADC raw data for several input conditions. The ADC was set to 16-bit mode.
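For reference, a minimal Python sketch of the kind of check described above. The file name, the little-endian signed 16-bit sample format, and the single-channel layout are all assumptions here, not the actual capture format (a real AWR1243 capture may interleave I/Q and multiple Rx channels and would need de-interleaving first):

```python
import numpy as np
import matplotlib.pyplot as plt

# Assumption: one frame of raw ADC data stored as little-endian signed
# 16-bit samples in "adc_frame.bin" (hypothetical file name).
samples = np.fromfile("adc_frame.bin", dtype="<i2")

# Histogram of the lower 8 bits (LSB byte) of each 16-bit sample.
low_byte = samples.astype(np.uint16) & 0xFF
counts = np.bincount(low_byte, minlength=256)

plt.bar(np.arange(256), counts, width=1.0)
plt.xlabel("Lower 8 bits of ADC code")
plt.ylabel("Occurrences in one frame")
plt.title("LSB-byte histogram")
plt.show()
```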
With a low-level Rx input, the lower 8 bits (LSB byte) took all values from 0 to 255 with almost constant probability. I took this to be consistent with normally distributed (Gaussian) noise.
With a high-level input, the distribution of the lower 8 bits was strongly biased: the ADC output values fell only on eight evenly spaced levels (0, 32, 64, ...). The probability of each of those eight levels was still almost constant. It looks as if only 3 of the 8 LSB bits are actually toggling.
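A rough way to quantify this observation, under the same assumed capture format as the sketch above: look at which LSB-byte codes actually occur and at their common spacing (a step of 32 would mean the lowest 5 bits never change).

```python
import numpy as np

# Assumption: same hypothetical one-frame capture as in the sketch above.
samples = np.fromfile("adc_frame.bin", dtype="<i2")
low_byte = samples.astype(np.uint16) & 0xFF

# Codes that actually occur in the frame, and their common spacing.
occupied = np.unique(low_byte)
step = int(np.gcd.reduce(np.diff(occupied))) if occupied.size > 1 else 0
print("occupied LSB-byte codes:", occupied)
print("common step between codes:", step)

# A step of 32 (2**5) would mean the lowest 5 bits never toggle, i.e.
# only 3 of the 8 LSB bits are active, matching the observation above.
if step:
    print("active bits in the LSB byte:", 8 - int(np.log2(step)))
```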
The noise itself behaved as expected, but the dependence of the LSB distribution on the input level described above is now clear. I am new to sigma-delta ADCs: is it normal for the number of active LSB bits to move up and down as the input level goes up and down?