IWR6843ISK: Source of range, azimuthAngle, and elevAngle values in DPU_TrackerProc_process after DPU_AoAProcDSP_process

Part Number: IWR6843ISK
Other Parts Discussed in Thread: MMWAVE-SDK

Hello TI Team,

I am currently analyzing the Long Range People Detection project using the IWR6843ISK board.

After executing the following function:

uint32_t DPU_AoAProcDSP_process(
    DPU_AoAProcDSP_Handle     handle,
    uint32_t                  numObjsIn,
    DPU_AoAProcDSP_OutParams *outParams);

the point cloud data (X, Y, Z, velocity, noise, and SNR) are generated and stored for each detected object.

Subsequently, the following function is called:
int32_t DPU_TrackerProc_process(
    DPU_TrackerProc_Handle     handle,
    uint32_t                   numObjsIn,
    DPIF_PointCloudSpherical  *detObjIn,
    DPIF_PointCloudSideInfo   *detObjInSideInfo,
    DPU_TrackerProc_OutParams *outParams);

Before the gtrack_step() function is executed, the following code assigns data to the tracker input structure:
trackerProcObj->pointCloud[n].range     = pDpuCfg->res.detObjIn[n].range;
trackerProcObj->pointCloud[n].azimuth   = pDpuCfg->res.detObjIn[n].azimuthAngle + pDpuCfg->staticCfg.sensorAzimuthTilt;
trackerProcObj->pointCloud[n].elevation = pDpuCfg->res.detObjIn[n].elevAngle;
trackerProcObj->pointCloud[n].doppler   = pDpuCfg->res.detObjIn[n].velocity;
trackerProcObj->pointCloud[n].snr       = pDpuCfg->res.detObjInSideInfo[n].snr;

My question is about where the following values come from:

  • pDpuCfg->res.detObjIn[n].range

  • pDpuCfg->res.detObjIn[n].azimuthAngle

  • pDpuCfg->res.detObjIn[n].elevAngle

It seems that these values are derived by converting the Cartesian coordinates to spherical coordinates, but I would like to confirm exactly which part of the AoA processing code performs this conversion.

Could you please clarify which function or section of the DPU_AoAProcDSP_process() or DPU_TrackerProc_process() (or related modules) is responsible for calculating and assigning these range, azimuthAngle, and elevAngle values before they are passed to the tracker?

Thank you for your support and clarification.

Best regards,
Jaehoon Kim

  • Hi Jaehoon,
    I am looking into your query. Please allow me some time to respond!
    Thanks

  • Hello Jaehoon,

    1. For range estimation, the Hardware Accelerator (HWA) on the device performs the FFT computation. The "DPC_ObjDetRangeHwa_preStartConfig()" function, which in turn calls "DPU_RangeProcHWA_config()", configures the param sets for fast computation of the Range FFT on the HWA.

    The DPC_ObjDetRangeHwa_preStartConfig() function is called from the DPC_ObjectDetection_ioctl() function, which is located in this file in your CCS project:
    "long_range_people_det_6843_mss\objdetrangehwa.c"

    2. The DPU_RangeProcHWA_process() function, called inside DPC_ObjectDetection_execute() in the same file, triggers the HWA every frame to compute the Range FFT.

    3. You are correct: the angle estimation is done on the DSP using the DPU_AoAProcDSP_process() function.

    4. I suspect you are unable to find the definitions of functions such as DPU_RangeProcHWA_process() and DPU_AoAProcDSP_process() because you are searching only inside the CCS project for this application. Note that the CCS project also directly includes files from the TI mmWave SDK, which are used to build the project.

    Searching for these functions in the MMWAVE-SDK will give you their definitions. For example, the definition of DPU_AoAProcDSP_process() is at:
    mmwave_sdk_03_05_00_04\packages\ti\datapath\dpc\dpu\aoaproc\src\aoaprocdsp.c

    You can find the logic for angle estimation here.

    Thanks,
    Saransh

  • Hello Saransh,

    Thank you for your previous response.
    I have reorganized my questions into two specific points as follows:

    Question 1 – Conversion from Cartesian to Spherical

    In aoaprocdsp.c, the final output data appears to be in Cartesian coordinates, as shown below:

    objOut[objOutIdx].x = x; // Unit: meter
    objOut[objOutIdx].y = y; // Unit: meter
    objOut[objOutIdx].z = z; // Unit: meter
    objOut[objOutIdx].velocity = params->dopplerStep * dopplerSignIdx;
    objOutSideInfo[objOutIdx].noise = objIn[objInCfarIdx].noise;
    objOutSideInfo[objOutIdx].snr = objIn[objInCfarIdx].snr;
    res->detObj2dAzimIdx[objOutIdx] = maxIdx;
    objOutIdx++;

    All Cartesian coordinate components (x, y, z) are expressed in meters.

    However, in trackerproc_3d.c, the corresponding input fields are spherical coordinates with the following units:

    Variable                                  Unit
    pDpuCfg->res.detObjIn[n].range            meter
    pDpuCfg->res.detObjIn[n].azimuthAngle     degree
    pDpuCfg->staticCfg.sensorAzimuthTilt      radian
    pDpuCfg->res.detObjIn[n].elevAngle        degree

    After copying, the local tracker structure uses:

    Variable                                  Unit
    trackerProcObj->pointCloud[n].range       meter
    trackerProcObj->pointCloud[n].azimuth     radian
    trackerProcObj->pointCloud[n].elevation   radian

    From this, it seems that the Cartesian data from aoaprocdsp.c is converted to spherical form before being passed to the tracker process.
    Could you please clarify where this Cartesian-to-spherical conversion is implemented in the code?


    Question 2 – Unit mismatch in assignment

    In the following section from trackerproc_3d.c, the right-hand side variables (azimuthAngle, elevAngle) are expressed in degrees, while the left-hand side variables (azimuth, elevation) are in radians:

    for (n = 0; n < mNum; n++)
    {
        trackerProcObj->pointCloud[n].range     = pDpuCfg->res.detObjIn[n].range;          /* meter  <- meter  */
        trackerProcObj->pointCloud[n].azimuth   = pDpuCfg->res.detObjIn[n].azimuthAngle    /* radian <- degree */
                                                  + pDpuCfg->staticCfg.sensorAzimuthTilt;  /*        + radian  */
        trackerProcObj->pointCloud[n].elevation = pDpuCfg->res.detObjIn[n].elevAngle;      /* radian <- degree */
        trackerProcObj->pointCloud[n].doppler   = pDpuCfg->res.detObjIn[n].velocity;       /* m/s    <- m/s    */
        trackerProcObj->pointCloud[n].snr       = pDpuCfg->res.detObjInSideInfo[n].snr;
    }

    This appears to mix degrees and radians in the same expression.
    Could you please explain whether there is an implicit conversion step elsewhere in the code,
    or whether this is intentional (e.g., the degrees are converted to radians later, inside the tracking DPU)?


    Thank you for your help in clarifying these details.

    Best regards,
    Jaehoon Kim