I would like to understand the relationship between the chirp parameters and the required data rate on the CSI interface.
The average data rate per frame is: (NUM_SAMPLES * NUM_BITS * 2 * NUM_RX * NUM_CHIRPS_PER_FRAME) / (FRAME_TIME)
The peak data rate per chirp is: (NUM_SAMPLES * NUM_BITS * 2 * NUM_RX) / (IDLE_TIME + RAMP_END_TIME)
An example chirp profile:
NUM_SAMPLES = 1024
NUM_BITS = 16
NUM_RX = 4
IDLE_TIME = 35us
RAMP_END_TIME = 65us
NUM_CHIRPS_PER_FRAME = 10
FRAME_TIME = 10ms
Based on these numbers, I calculate an average (per-frame) data rate of 131 Mbit/s and a peak (per-chirp) data rate of 1.31 Gbit/s.
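For reference, here is a minimal Python sketch of the arithmetic above. The parameter names simply mirror the chirp profile listed; the factor of 2 is my reading that each ADC sample is complex (I and Q):

NUM_SAMPLES = 1024          # ADC samples per chirp
NUM_BITS = 16               # bits per I or Q value
NUM_RX = 4                  # receive channels
IDLE_TIME = 35e-6           # seconds
RAMP_END_TIME = 65e-6       # seconds
NUM_CHIRPS_PER_FRAME = 10
FRAME_TIME = 10e-3          # seconds

# Factor of 2 assumed to account for complex (I + Q) samples
bits_per_chirp = NUM_SAMPLES * NUM_BITS * 2 * NUM_RX

avg_rate = bits_per_chirp * NUM_CHIRPS_PER_FRAME / FRAME_TIME  # bit/s, averaged over a frame
peak_rate = bits_per_chirp / (IDLE_TIME + RAMP_END_TIME)       # bit/s, during one chirp period

print(f"average: {avg_rate / 1e6:.0f} Mbit/s")   # -> 131 Mbit/s
print(f"peak:    {peak_rate / 1e9:.2f} Gbit/s")  # -> 1.31 Gbit/s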
My experimentation suggests that the CSI data rate needs to be set to accommodate the peak (per-chirp) rate rather than the frame average. For example, I observe that a 1.2 Gbit/s CSI configuration returns no data, while a 1.8 Gbit/s configuration does. Can you confirm this is the expected behavior?
Thanks for your assistance,
Antonio