AWR1642BOOST: Regarding calculation of Chirp time

Part Number: AWR1642BOOST

Hello,

I was wondering if there is a standard formula for calculating the chirp time. I have gone through the document titled "Programming Chirp Parameters in TI Radar Devices," but I am still unsure how to calculate it. I used the parameters below to set up the AWR1642BOOST EVM for data capture. Could you please provide some guidance on how to calculate the chirp time from these parameters?

Number of frames (n_frame) = 2326
Chirps per frame (chirp_per_frame) = 128
Frame periodicity (frame_periodicity) = 25.8 milliseconds
Wait before copy (wait_before_copy) = 70000
Additional parameters:

Starting frequency (start_freq) = 77 GHz
Idle time (idle_time) = 150 microseconds
ADC start time (adc_start_time) = 6 microseconds
Ramp end time (ramp_end_time) = 50 microseconds
Frequency slope (freq_slope) = 80 MHz/us
ADC samples (adc_samples) = 256
ADC sample rate (adc_sample_rate) = 6000 ksps
RX gain (RX_gain) = 30 dB


Thank you,
Deepu

  • Hi Deepu,

    You can calculate your chirp time as shown below:

    Total Chirp time (for 1 chirp) = Active Chirp time + Idle time

    Active Chirp time = Ramp end time - ADC start time

    Regards,

    Kaushik

  • Hi Kaushik,

    Thanks for your suggestion. I am facing measurement accuracy issues, particularly with parameter calculations, which affect our data quality. We recently acquired eight radar models with DCA1000 for vital signs and other studies, but incorrect calculations are hampering our progress. Your help would be greatly appreciated.

    I am getting the wrong maximum range and total time for collected data based on the suggested formula to calculate chirp_time. I use this formula: max_range = Fs_adc / 2.0 * chirp_time / bandwidth * C , where C=3e8 (second), Fs_adc = 6000,000(sample per second), chirp_time = 0.000196(second), and bandwidth = 3.433e9 Hz. I used the sensing estimator to get the parameter and bandwidth values(see values).

    I think I am getting the wrong maximum range and total data collection time. As previously stated, my findings reveal a span of 51.38 meters, utilizing 256 samples, with Fs_adc set at 6e6, slope 80, and a chirp time of 0.000196 seconds over a data collection period of 50 seconds. In reality, I collected data for 1 minute (see the below screenshot of mmwave studio), and the subject was sitting at a 40cm distance. Kindly help me verify the calculated chirp_time, bandwidth, maximum range, Fs_adc, and subject distance.
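
    A minimal Python sketch of this calculation, using the values quoted above (the last line is the conventional FMCW maximum-range relation, assuming complex I/Q sampling):

        # Reproduce the max-range number from the formula above
        C = 3e8                # speed of light, m/s
        fs_adc = 6e6           # ADC sample rate, samples/s
        bandwidth = 3.433e9    # valid bandwidth from the sensing estimator, Hz
        slope = 80e12          # 80 MHz/us expressed in Hz/s

        chirp_time = 196e-6    # s, idle time included
        print(fs_adc / 2.0 * chirp_time / bandwidth * C)   # ~51.38 m

        # Conventional FMCW maximum range, for comparison
        print(fs_adc * C / (2 * slope))                    # 11.25 m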

    Best,

    Deepu

  • Hi Kaushik,

    Could you please respond to my above query?

    Best,

    Deepu

  • Hi Deepu,

    • If you populate the sensing estimator tool with the right values, you should be able to see the right set of configured parameters such as max range, range res etc. 
    • Deepu Kumar said:
      As previously stated, my findings reveal a span of 51.38 meters
    • I'm not able to find any previous mention of this. Can you please provide more info here? What do you mean by span here?
    • What observation of yours does not correlate with the 40cm measurement?

    Regards,

    Kaushik

  • Hi Kaushik,

    I calculated the values suggested by you and used the sensing estimator with the parameter values provided above. I have been doing a project on vital signs detection by collecting data from subjects sitting at a 40cm distance. Kindly answer the below questions.

    1. The calculated values by your suggested formula are 196 microseconds, and the sensing estimator is 42.667 microseconds. May I know why they are different?
    2. Additionally, the bandwidth shown in mmWave Studio is 4000 MHz, and the bandwidth shown in the sensing estimator is 3433 MHz. Why are they different?
    3. Finally, increasing the frame periodicity from 25.8 ms to 40.0 ms increases the data collection time from 62 seconds to 96 seconds. I was wondering whether the periodicity affects the quality of the data.

    Best

    Deepu

  • Hi Deepu,

    Let me forward this thread to our vital-signs expert to comment on this further. Please allow us two days to get back to you on this. Your patience is greatly appreciated!

    Regards,

    Kaushik

  • Hi there,

    Please see my answers below.

    1. The calculated values by your suggested formula are 196 microseconds, and the sensing estimator is 42.667 microseconds. May I know why they are different?

    [ZY] It should be the ramp time, which is 42.667 us. Idle time should not be included.

    2. Additionally, the bandwidth shown in mmWave Studio is 4000 MHz, and the bandwidth shown in the sensing estimator is 3433 MHz. Why are they different?

    [ZY] The bandwidth shown in mmWave Studio is the total bandwidth, while the bandwidth shown in the sensing estimator is the valid bandwidth (the one used for the range resolution calculation). The total bandwidth also includes the bandwidth swept during the ADC start time and the extra ramp time, which are needed for the ramp to settle.

    3. Finally, increasing the frame periodicity from 25.8 ms to 40.0 ms increases the data collection time from 62 seconds to 96 seconds. I was wondering whether the periodicity affects the quality of the data.

    [ZY] Increasing the frame periodicity should not affect your system performance. The range resolution, max range, Doppler resolution, and max Doppler are all determined by the active chirping within the frame, not by the frame duration.
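
    A minimal Python sketch of the two bandwidth figures, using the slope and timing values posted earlier in this thread:

        # Total vs. valid bandwidth for the posted configuration
        slope_mhz_per_us = 80.0
        ramp_end_time_us = 50.0
        adc_samples = 256
        adc_sample_rate_msps = 6.0

        adc_sampling_time_us = adc_samples / adc_sample_rate_msps  # ~42.667 us
        total_bw_mhz = slope_mhz_per_us * ramp_end_time_us         # 4000 MHz, as shown in mmWave Studio
        valid_bw_mhz = slope_mhz_per_us * adc_sampling_time_us     # ~3413 MHz

        print(total_bw_mhz, valid_bw_mhz)
        # The sensing estimator reports 3433 MHz; the small margin it adds on top of
        # the pure ADC sampling window is not spelled out in this thread.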

    Best,

    Zigang

  • Hi Zigang,

    Please find the attached image for the calculation and chirp time. The sensing estimator shows that 42.667 microseconds is a chirp time, not a ramp time. I still do not understand the calculation of chirp time.

    1. [Arjun] The formula suggested by the TI engineer (Kaushik) to calculate the chirp time was:

    Total Chirp time (for 1 chirp) = Active Chirp time + Idle time

    Active Chirp time = Ramp end time - ADC start time

    My selected parameter values in microseconds are ramp_end_time = 50, idle_time = 150, and adc_start_time = 6. Using the suggested formula, the active chirp time would be 50 - 6 = 44, and the total chirp time for one chirp would be 44 + 150 = 194 microseconds.

    I have a total of 297,728 chirps from 2326 frames with 128 chirps per frame. So the total chirp time for all the chirps would be 297,728 * 194 = 57,759,232 microseconds, i.e., 57.759232 seconds. May I know which value of the chirp time should be used to calculate the range, please?
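
    A small Python sketch of these totals, using the frame count and periodicity from the original post:

        # Per-chirp time from the formula quoted above: (ramp_end - adc_start) + idle
        chirp_time_us = (50 - 6) + 150          # 194 us
        n_chirps = 2326 * 128                   # 297,728 chirps
        print(n_chirps * chirp_time_us / 1e6)   # ~57.76 s of summed chirp time

        # The wall-clock capture time is set by the frame periodicity, which also
        # includes the inter-frame idle time
        print(2326 * 25.8e-3)                   # ~60.0 s, roughly the 1-minute capture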

    Best,

    Deepu

  • Hi, Deepu:

    Please refer to Figure 1 in the following app note: https://www.ti.com/lit/an/swra553a/swra553a.pdf. Chirp time = idle time + total ramp time.

    Total ramp time = ADC start time + ADC sampling time + extra ramp time.

    The following equation, which you mentioned earlier, is not correct. Please use the app note as your golden reference.

    Total Chirp time (for 1 chirp) = Active Chirp time + Idle time

    Active Chirp time = Ramp end time - ADC start time

    You can also find the equations for range resolution and maximum range in the same app note (Section 2.1).
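
    A minimal Python sketch of these relations, using the configuration posted earlier in this thread (the max-range line assumes complex I/Q ADC output):

        C = 3e8                 # speed of light, m/s
        idle_time = 150e-6      # s
        ramp_end_time = 50e-6   # s
        adc_start_time = 6e-6   # s
        adc_samples = 256
        fs_adc = 6e6            # samples/s
        slope = 80e12           # Hz/s (80 MHz/us)

        adc_sampling_time = adc_samples / fs_adc                    # ~42.667e-6 s
        extra_ramp_time = ramp_end_time - adc_start_time - adc_sampling_time
        total_ramp_time = adc_start_time + adc_sampling_time + extra_ramp_time  # = ramp end time
        chirp_time = idle_time + total_ramp_time                    # 200e-6 s

        valid_bw = slope * adc_sampling_time                        # ~3.41e9 Hz
        range_resolution = C / (2 * valid_bw)                       # ~0.044 m
        max_range = fs_adc * C / (2 * slope)                        # 11.25 m
        print(chirp_time, valid_bw, range_resolution, max_range)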

    Hope it helps. 

    Best,

    Zigang