Hello, everyone. My confusion is as follows. If we use a 3.0 V reference as the full-scale voltage, then 1 LSB is about 3 V / (2^16 − 1) ≈ 46 µV. To guarantee true 16-bit resolution, the total output peak-to-peak noise should be well below 1/2 LSB, i.e. 23 µVpp. Taking peak-to-peak ≈ 6 × rms (Gaussian noise) and an extra factor-of-3 margin for "well below", the total rms noise should be smaller than (1/3) × (1/6) × (1/2) × LSB ≈ 1.3 µVrms.

The DAC datasheet gives a broadband noise density of 55 nV/√Hz at 10 kHz. Rounding that down to 50 nV/√Hz and ignoring the 1/f noise, the allowed noise bandwidth is BW = (1.3 µV / 50 nV/√Hz)² ≈ 676 Hz, which seems to say the signal bandwidth must stay below roughly 676 Hz to preserve the resolution.

That conclusion seems surprisingly restrictive to me, and I'm not sure the analysis above is correct. Can anyone help me understand it?
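For reference, here is a small Python sketch of the budget described above. The 16 bits, 3.0 V reference, and 50 nV/√Hz density come from the question; the 6× peak-to-peak-to-rms crest factor and the extra 3× margin are the assumptions I made, so adjust them if your criterion differs.

```python
# Noise-budget sketch for a 16-bit DAC with a 3.0 V full-scale reference.
# Crest factor (pp ~ 6 x rms) and the 3x "much smaller" margin are
# assumptions of this analysis, not datasheet values.

VREF = 3.0                      # full-scale reference voltage [V]
BITS = 16
LSB = VREF / (2**BITS - 1)      # one code step, ~45.8 uV

vpp_max = LSB / 2               # allowed peak-to-peak noise, ~22.9 uVpp
vrms_unmargined = vpp_max / 6   # Gaussian pp ~ 6 x rms -> ~3.8 uVrms
vrms_max = vrms_unmargined / 3  # extra 3x margin -> ~1.3 uVrms

density = 50e-9                 # broadband noise density [V/sqrt(Hz)]
bw = (vrms_max / density) ** 2  # equivalent noise bandwidth [Hz]

print(f"1 LSB              = {LSB * 1e6:.1f} uV")
print(f"rms noise target   = {vrms_max * 1e6:.2f} uV")
print(f"noise bandwidth    = {bw:.0f} Hz")
```

Note that carrying the exact numbers through (rather than rounding the rms target to 1.3 µV first) gives a bandwidth closer to ~650 Hz, the same order of magnitude as the 676 Hz above.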