
AWR1243: AWR2243: Impact of Chirp Idle Time on Peak MIPI CSI Throughput / Device Performance

Part Number: AWR1243
Other Parts Discussed in Thread: AWR2243

Hello,

We are working with the AWR2243 device for an automotive radar sensor.

We would like to reduce the peak MIPI CSI data rate to below 500 Mbps per lane while keeping the chirp configuration's ramp time unchanged.

Based on the formula we were given for peak data rate: Peak MIPI CSI Data Rate = ( Number of Samples * Bits Per Sample * Number of Channels ) / ( Ramp Time + Idle Time ), with the result divided by the number of CSI lanes (4 in our case) to get the per-lane rate,

we think that we can just increase the Chirp Idle Time to reduce the peak rate.

What are the concerns or possible system issues of increasing the Chirp Idle Time?

A typical chirp profile looks like this for us today:

Number of Samples: 465
Bits Per Sample: 32
Number of Channels: 4
Ramp Time: 25.27 us
Chirp Idle Time: 2.3 us

Based on the numbers above and using the formula, our peak MIPI CSI data rate should be approximately 540 Mbps per lane. We would like to reduce the peak data rate to approximately 480 Mbps per lane.

To do this while retaining our ramp time, we would need to increase the chirp idle time to approximately 5.7 us.
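The arithmetic above can be sketched as follows. This is a minimal illustration, assuming the aggregate rate from the formula is split evenly across 4 CSI-2 lanes (the lane count is an assumption based on our per-lane numbers, not stated in the formula itself):

```python
NUM_SAMPLES = 465        # ADC samples per chirp
BITS_PER_SAMPLE = 32     # bits per sample
NUM_CHANNELS = 4         # RX channels
NUM_LANES = 4            # CSI-2 lanes (assumed)
RAMP_TIME_US = 25.27     # ramp time in microseconds

def peak_rate_mbps_per_lane(idle_time_us):
    """Peak CSI data rate per lane (Mbps) for a given chirp idle time (us)."""
    bits_per_chirp = NUM_SAMPLES * BITS_PER_SAMPLE * NUM_CHANNELS
    chirp_period_us = RAMP_TIME_US + idle_time_us
    # bits / us == Mbps; divide the aggregate across the lanes
    return bits_per_chirp / chirp_period_us / NUM_LANES

def idle_time_for_rate(target_mbps_per_lane):
    """Idle time (us) needed to hit a target per-lane rate, ramp time fixed."""
    bits_per_chirp = NUM_SAMPLES * BITS_PER_SAMPLE * NUM_CHANNELS
    return bits_per_chirp / (target_mbps_per_lane * NUM_LANES) - RAMP_TIME_US

print(round(peak_rate_mbps_per_lane(2.3)))   # ~540 Mbps per lane today
print(round(idle_time_for_rate(480.0), 1))   # ~5.7 us idle for 480 Mbps per lane
```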

How does this impact the system? Will there be any buffer size limitations, etc.? Are there any other concerns?

Thank you!

  • Hello Rich Ripple,

    You are right that increasing the chirp idle time provides more time for the CSI data transfer, so a lower CSI rate can be used. The CSI rates that can be configured are 600/450/400/300/250/150 Mbps.

    As you are aware, increasing the chirp idle time increases the chirp-to-chirp time and hence reduces the maximum unambiguous velocity that can be estimated. Other than that, there is no other impact of using a larger idle time and a lower CSI rate.
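    To put a number on the velocity impact: for an FMCW radar, the maximum unambiguous velocity is v_max = lambda / (4 * T_chirp), where T_chirp is the chirp-to-chirp time (ramp time + idle time). The sketch below assumes a 77 GHz carrier for illustration; your actual start frequency may differ:

```python
C = 3e8                  # speed of light, m/s
F_CARRIER = 77e9         # assumed carrier frequency, Hz
WAVELENGTH = C / F_CARRIER
RAMP_TIME_US = 25.27     # ramp time from the chirp profile above

def v_max_mps(idle_time_us):
    """Max unambiguous velocity (m/s): lambda / (4 * T_chirp)."""
    t_chirp_s = (RAMP_TIME_US + idle_time_us) * 1e-6
    return WAVELENGTH / (4.0 * t_chirp_s)

print(round(v_max_mps(2.3)))  # ~35 m/s with 2.3 us idle
print(round(v_max_mps(5.7)))  # ~31 m/s with 5.7 us idle
```

    So increasing the idle time from 2.3 us to 5.7 us costs roughly 4 m/s of unambiguous velocity under these assumptions.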

    Regards,

    Vivek