Hi experts,
In the file C:\ti\mmwave_industrial_toolbox_4_8_0\labs\level_sensing\68xx_high_accuracy\src\dss\dss_data_path.c, we can see
frequencySlopeMHzMicoSec = (float)profileCfg.freqSlopeConst * 3600.f * 900.f / ((float)(1 << 26)); (line 587)
But in the file rl_sensor.h, the comment for profileCfg.freqSlopeConst says:
For 60GHz devices (57GHz to 64GHz):
1 LSB = (2.7e6 * 900) / 2^26 = 36.21 kHz/uS for 60GHz devices , (line 711)
Since the 6843 is a 60GHz device, I think 3600 should be replaced with 2700 when computing frequencySlopeMHzMicoSec.
Can you please explain why 3600 is used in the code?
The customer is concerned that this may cause the actual chirp slope in the demo to differ from what is computed, i.e. 33.71 MHz/us versus 44.9 MHz/us.