
IWR6843ISK-ODS: 3D People Counting (wall-mount) Steering vector

Part Number: IWR6843ISK-ODS
Other Parts Discussed in Thread: DCA1000EVM, IWR6843ISK

Hello TI team,

I am using IWR6843ISK-ODS and DCA1000EVM. 

Meta Image: C:\ti\mmwave_platform_1_2_1\tools\studio_cli\src\pre-built-Binaries\mmwave_Studio_cli_xwr68xx.bin

# mmWave Config file

% ***************************************************************
dfeDataOutputMode 1
channelCfg 15 7 0
adcCfg 2 1
adcbufCfg -1 0 1 1 1
profileCfg 0 60.75 30.00 25.00 59.10 0 0 54.71 1 128 4000.00 2 1 36
chirpCfg 0 0 0 0 0 0 0 1
chirpCfg 1 1 0 0 0 0 0 2
chirpCfg 2 2 0 0 0 0 0 4
frameCfg 0 2 96 0 55.00 1 0
lowPower 0 0
lvdsStreamCfg -1 0 1 0

The adc_data_Raw_*.bin files collected by studio_cli are processed with the MatlabExamples scripts to get the raw frame data 'radarCube.mat'. I'm trying to get point clouds through the following process (a rough sketch of these steps is shown after the list):

1. Range processing

2. Static clutter removal

3. Doppler processing

4. CFAR detection

5. Get indices of detected peaks

6. peakVals and SNR calculation

7. Peak Grouping

8. Get azimuthInput

9. Create steeringVec

10. AoA estimation
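
For reference, here is a very rough MATLAB sketch of steps 1-3 (random data stands in for radarCube.mat; the sizes are taken from my profileCfg/frameCfg, and the CFAR, grouping, and AoA steps are omitted):

% Rough sketch of steps 1-3 on one frame (stand-in data, not the real capture)
numAdcSamples = 128;                 % ADC samples per chirp, from profileCfg
numChirps     = 96;                  % loops per TX, from frameCfg
numVirtAnt    = 12;                  % 3 TX x 4 RX
frame = complex(randn(numAdcSamples, numChirps, numVirtAnt), ...
                randn(numAdcSamples, numChirps, numVirtAnt));

% 1. Range processing: windowed FFT along the ADC-sample dimension
win      = 0.54 - 0.46*cos(2*pi*(0:numAdcSamples-1).'/(numAdcSamples-1));  % Hamming window
rangeFFT = fft(frame .* win, [], 1);

% 2. Static clutter removal: subtract the per-range-bin mean across chirps
rangeFFT = rangeFFT - mean(rangeFFT, 2);

% 3. Doppler processing: FFT along the chirp dimension
dopplerFFT = fftshift(fft(rangeFFT, [], 2), 2);

% Steps 4-10 (CFAR, peak grouping, AoA estimation) then run on dopplerFFT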

I know these steps are slightly different from the signal processing chain in the "3D People Counting Demo Software Implementation Guide", but my process should work, right?

My question is mainly about how to correctly design the steering vector, which is needed in step 9. The Guide already gives us the method for the wall-mounted IWR6843ISK and the ceiling-mounted IWR6843ISK-ODS. However, I'm trying to use a wall-mounted ODS with a signal processing chain similar to the wall-mount processing chain.

I compared the differences between these two virtual antenna coordinate layouts and can understand m_ind, n_ind, and the phase rotation. We first get the "element spatial locations" from m_ind and n_ind, and then we can create the steering vector.

But what I am confused about is the Guide saying "only azimuth-antennas, 8 antennas of the pre-calculated azimuth steering vectors are used."

Q1: For the ODS pattern, how many (or which) antennas should I use when doing the azimuth angle estimation?

And, when estimating the elevation angle, the Guide says "all 12 antennas are used. The elevation steering vectors are calculated by multiplying pre-calculated elevation steering vectors with the azimuth steering vector corresponding to the detected point." I followed the steps in '6.1.1.1.1.1 Azimuth steering vectors calculation' and '6.1.1.1.1.2 Elevation steering vectors calculation' and got steeringVecAzim with shape (187, 12) and steeringVecElev with shape (187, 12), respectively.

(The azimuth steering vectors include the phase rotation coefficients, but the elevation steering vectors do not.)

using these parameters:

ANGLE_RES = 0.75

ANGLE_RANGE = 70

ANGLE_BINS = int((ANGLE_RANGE * 2) // ANGLE_RES) + 1
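
For reference, this is roughly how I build the azimuth steering vectors from the element spatial locations (just a sketch: the m_ind values below are placeholders rather than the actual ODS layout, I assume half-wavelength element spacing, and the sign convention may differ from the demo):

% Sketch of azimuth steering vector generation -> ANGLE_BINS x 12
ANGLE_RES   = 0.75;                                  % deg
ANGLE_RANGE = 70;                                    % deg (+/- field of view)
angles      = -ANGLE_RANGE:ANGLE_RES:ANGLE_RANGE;    % 187 angle bins
m_ind       = [0 1 2 3 0 1 2 3 0 1 2 3];             % placeholder azimuth element locations (lambda/2 units)

% v(theta, k) = exp(-1j*pi*sin(theta)*m_ind(k)), assuming lambda/2 spacing
steeringVecAzim = exp(-1j * pi * sind(angles).' * m_ind);   % 187 x 12
% (the phase rotation coefficients would still be applied on top of this)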

Q2: My question here is: what is the meaning of multiplying the two steering vectors together? Why not just select the elevation-direction antennas?

It would be of great help if TI could provide example code for the signal processing chain, or point me to where I can find more information on this. Any help is greatly appreciated.

Thank you,

Shuting

  • Hi Shuting

    I know these steps are slightly different from the signal processing chain in the "3D People Counting Demo Software Implementation Guide", but my process should work, right?

    You will have to test for yourself to see. I don't see anything glaringly wrong with it though.

    But what I am confused about is the Guide saying "only azimuth-antennas, 8 antennas of the pre-calculated azimuth steering vectors are used."

    This is saying that we're only using the data from the bottom 8 antennas on the ISK antenna layout to estimate azimuth angle. If you're trying to estimate using the ODS antennas, then you'll have to either use just one of the two sets of 4 antennas that are all at the same elevation angle, or you'll have to find a way to combine them as separate estimations.

    Q2: My question here is what's the meaning of multiplying two steeringVec together? Why not just select the elevation direction antennas?

    Multiplying the two steering vectors together is equivalent to applying the first steering vector (azimuth) to the measurement and then applying the second (elevation). So you can combine the azimuth and elevation steering vectors first and then apply the result to the measurement, and you get the same answer.
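
    As a toy illustration (not the actual demo code; the measurement and steering vectors below are just placeholders):

    % Toy example: combining azimuth and elevation steering vectors for one detected point
    x     = complex(randn(12,1), randn(12,1));   % stand-in for the 12 virtual-antenna samples of one detection
    pos   = (0:11).';                            % placeholder element positions (lambda/2 units)
    vAzim = exp(-1j*pi*sind(20)*pos);            % placeholder azimuth steering vector at 20 deg
    vElev = exp(-1j*pi*sind(-5)*pos);            % placeholder elevation steering vector at -5 deg

    % Correlating with the combined (element-wise product) steering vector ...
    y1 = sum(conj(vAzim .* vElev) .* x);
    % ... gives the same result as applying the azimuth vector first, then the elevation vector
    y2 = sum(conj(vElev) .* (conj(vAzim) .* x));
    % y1 == y2 (up to floating-point error)

    This is consistent with the Guide's statement that all 12 antennas are used for elevation: the azimuth steering vector of the detected point is simply folded into the pre-calculated elevation steering vectors rather than selecting a subset of antennas.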

    Best,

    Nate

  • Hi, Nate

    Thanks for your reply. However, the result I got is still problematic, so I would like to ask for further advice.

    I tried to find the azimuth angle by doing an angle FFT over the 4 antennas [0, 3, 4, 7] in the horizontal direction after the range FFT (rangeFFTdata).
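
    Roughly, what I am doing looks like this (a toy sketch with placeholder data, not my actual script):

    % Toy sketch of the azimuth angle FFT over 4 virtual channels (placeholder data)
    numRangeBins  = 128;
    rangeFFTdata  = complex(randn(numRangeBins, 12), randn(numRangeBins, 12));  % one chirp, 12 virtual channels
    azimChans     = [1 4 5 8];                            % 1-based version of my [0, 3, 4, 7]
    azimData      = rangeFFTdata(:, azimChans);           % numRangeBins x 4
    angleFFT      = fftshift(fft(azimData, 64, 2), 2);    % zero-padded angle FFT across the 4 channels
    rangeAngleMap = abs(angleFFT);                        % the range-angle map I inspect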

    I have three corner reflectors in my scene.

    On the range-Doppler map, the three targets can be distinguished from two peaks (targets 1 and 2 are at the same range, plus target 3).


    But after the range-angle processing, although I get three peaks, the angle values are wrong in my opinion.

    Then I checked the data-organization code and found that the rawDataReader script converts the data to the ImRe format.

    % checking iqSwap setting
    if(Params.adcDataParams.iqSwap == 1)
        % Data is in ReIm format, convert to ImRe format to be used in radarCube
        frameData(:,[1,2]) = frameData(:,[2,1]);
    end

    % Convert data to complex: column 1 - Imag, 2 - Real
    frameCplx = frameData(:,1) + 1i*frameData(:,2);

    I don't understand why the imaginary part should be placed in the real position of the complex data. But when I try to keep the real and imaginary parts of the data where they should be, the range-Doppler gives me results that are mirror images of the original. Should I use the ImRe format or the Real + j*Imag format?

    Besides, I'm wondering how much impact "compRangeBiasAndRxChanPhase" makes. I see that this variable is not adjusted in ODS_6m_default.cfg. Do I need to compensate for this bias when processing the collected raw data?
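
    If I do apply it, my understanding (which may be wrong) is that it would look roughly like this:

    % My (possibly wrong) understanding of applying compRangeBiasAndRxChanPhase
    % cfg line: compRangeBiasAndRxChanPhase <rangeBias_m> <12 Re,Im pairs, one per virtual channel>
    rangeBias  = 0.0;                                    % meters (default value)
    coeffPairs = repmat([1 0], 1, 12);                   % default "no compensation" coefficients
    rxChanComp = coeffPairs(1:2:end) + 1j*coeffPairs(2:2:end);   % 1 x 12 complex coefficients

    x     = complex(randn(1,12), randn(1,12));           % one detected point, 12 virtual channels
    xComp = x .* rxChanComp;                             % per-channel amplitude/phase compensation
    % ... and the estimated range would then be corrected by the range bias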

    Thanks for your patience in replying.

    Best,

    Shuting

  • Hi Nate,

    I found this thread,

    https://e2e.ti.com/support/sensors-group/sensors/f/sensors-forum/846402/iwr6843isk-ods-my-elevation-angle-fft-result-is-upside-down

    which shows that the virtual antenna pattern is:

    % ch-1 ch-4 ch-9  ch-12
    % ch-2 ch-3 ch-10 ch-11
    %           ch-5  ch-8
    %           ch-6  ch-7

    But shouldn't it be the following?

    % RX1-TX1  RX4-TX1  RX1-TX2  RX4-TX2
    % (ch1)    (ch4)    (ch5)    (ch8)
    % RX2-TX1  RX3-TX1  RX2-TX2  RX3-TX2
    % (ch2)    (ch3)    (ch6)    (ch7)
    %                   RX1-TX3  RX4-TX3
    %                   (ch9)    (ch12)
    %                   RX2-TX3  RX3-TX3
    %                   (ch10)   (ch11)

    Looking forward to your reply.

    Thank you,

    Shuting

  • Hi Shuting,

    Zigang's information in that thread is correct. Please make that change and let us know if it resolves your issue. I don't think the compRangeBiasAndRxChanPhase will make much of a difference in this scenario either.

    % checking iqSwap setting
    if(Params.adcDataParams.iqSwap == 1)
        % Data is in ReIm format, convert to ImRe format to be used in radarCube
        frameData(:,[1,2]) = frameData(:,[2,1]);
    end

    % Convert data to complex: column 1 - Imag, 2 - Real
    frameCplx = frameData(:,1) + 1i*frameData(:,2);

    Finally, can you share the location of this code with me so I can investigate?

    Best,

    Nate

  • Good morning Nate,

    The path of the "rawDataReader" file is: C:\ti\mmwave_studio_02_01_01_00\mmWaveStudio\MatlabExamples\singlechip_raw_data_reader_example.

    line 416

    Thank you, 

    Shuting

  • Thank you Shuting.

  • Hi Nate,

    I'm looking for an azimuth angle FFT result that looks like the first video above, where the angle of the peak from the moving object (corner reflector) changes continuously. However, most of the time I get a result like the second video, in which one peak disappears while another peak increases. I've been stuck on this problem for two weeks.

    Besides, about the antenna pattern, I think it is related to the chirpCfg configuration, i.e., the order of the TXs.
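
    For example, the way I read my chirpCfg (I may be wrong about the TX numbering):

    % chirpCfg 0/1/2 use TX enable masks 1, 2, 4, so I assume the firing order is TX1, TX2, TX3.
    % With 4 RX per chirp, virtual channel (1-based) = chirpIdx*4 + rxIdx + 1.
    txOrder = [1 2 3];
    for chirpIdx = 0:2
        for rxIdx = 0:3
            vch = chirpIdx*4 + rxIdx + 1;
            fprintf('ch%-2d = RX%d-TX%d\n', vch, rxIdx + 1, txOrder(chirpIdx + 1));
        end
    end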

    In terms of the MatlabExamples code, I think the comments are wrong. Originally, the saved file has a non-interleaved format, beginning with the real parts of every two samples followed by the imaginary parts of those two samples ("Mmwave Radar Device ADC Raw Data Capture", p. 10).

    If iqSwap == 1, it means the order of the real and imaginary parts is reversed, so the MATLAB code comments should read:

    % checking iqSwap setting
    if(Params.adcDataParams.iqSwap == 1)
        % Data is in ImRe format, convert to ReIm format to be used in radarCube
        frameData(:,[1,2]) = frameData(:,[2,1]);
    end

    % Convert data to complex: column 1 - Real, 2 - Imag
    frameCplx = frameData(:,1) + 1i*frameData(:,2);
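
    As a quick sanity check of my understanding (a toy example, not the radar data), swapping I and Q is just j times the conjugate of the signal, which mirrors the spectrum, and that matches the mirror-image range-Doppler I mentioned earlier:

    % Toy check: swapping I and Q conjugates the signal (times 1j), so the spectrum is mirrored
    n  = (0:127).';
    x  = exp(1j*2*pi*0.2*n);               % complex tone at +0.2 cycles/sample
    xs = imag(x) + 1j*real(x);             % same samples with I and Q swapped (= 1j*conj(x))
    [~, k1] = max(abs(fft(x)));            % peak bin of the original
    [~, k2] = max(abs(fft(xs)));           % peak bin of the swapped version: mirrored location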

    Lastly, I bought a laser boresighter (https://www.amazon.com/dp/B09F6G4LVF?psc=1&ref=ppx_yo2ov_dt_b_product_details).

    I will try to update compRangeBiasAndRxChanPhase to see what happens, and I will let you know.

    Thank you for your continued help.

    Best,

    Shuting