This thread has been locked.


AWR1843BOOST: Data Interpretation Issues

Part Number: AWR1843BOOST
Other Parts Discussed in Thread: AWR1642BOOST-ODS, AWR1843

Hello all,

I am currently using the AWR1843BOOST with the obstacle detection (ODS) example, and I can see the data in the ods_3d_visualizer demo GUI. I would now like to read the data using an external microcontroller.

To do that, I shorted resistors R166 and R169.

I can see the objects in the GUI. I am trying to read the data (RealTerm application at a baud rate of 921600) from the MSS LOGGER port. I did not actually short R26; instead I am reading the data from R26 itself. I can see the data as hex numbers.

I am using the GUI Monitor configuration 1 1 0 0; with this configuration I should get Frame Header + TLV (Detected Objects) + TLV (Clusters).

But when I try to analyse it, it does not look the same. Please find the attached file.

Is MSS LOGGER the correct port to read the data from?

How can I actually use this data? I want the exact position of an object; how can I extract that position from the data above?

BR

Venkatesh.

  • Hi,

    As mentioned in the ODS demo documentation, the ODS demo was designed to work with AWR1642BOOST-ODS EVM.

    If you run the demo with AWR1843BOOST, the angle information is not correct.

    Regarding the format of the output data, please see the User Guide. Once you detect the magic word, you should be able to understand the content of the data.

    The (x, y, z) values are provided in Q9 format, so you will have to scale the integer data by 2^9 = 512.
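    As a rough host-side illustration of that Q9 scaling (a sketch, not code from the demo; the helper name `q9_to_meters` is made up here):

```python
def q9_to_meters(raw_u16):
    # Reinterpret the unsigned 16-bit word as a signed value
    if raw_u16 > 32767:
        raw_u16 -= 65536
    # Q9 format: 9 fractional bits, so divide by 2^9 = 512
    return raw_u16 / 512.0

print(q9_to_meters(0x0200))  # 1.0 (512 / 512)
print(q9_to_meters(0xFF00))  # -0.5 (65280 -> -256 -> -256/512)
```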

    Please also see the attached snapshot, which may help.

    Here is the MATLAB code that performs this operation in the GUI:

     

    function [detObj, idx] = getDetObj(bytevec, idx, tlvLen, rangeIdxToMeters, dopplerResolutionMps, numDopplerBins)
        global OBJ_STRUCT_SIZE_BYTES;
        detObj = [];
        detObj.numObj = 0;
        if tlvLen > 0
            % Get detected object descriptor
            word = [1 256]';
            detObj.numObj = sum(bytevec(idx+(1:2)) .* word);
            idx = idx + 2;
            xyzQFormat = 2^sum(bytevec(idx+(1:2)) .* word);
            idx = idx + 2;

            % Get array of detected objects
            bytes = bytevec(idx+(1:detObj.numObj*OBJ_STRUCT_SIZE_BYTES));
            idx = idx + detObj.numObj*OBJ_STRUCT_SIZE_BYTES;

            bytes = reshape(bytes, OBJ_STRUCT_SIZE_BYTES, detObj.numObj);
            detObj.speedIdx = (bytes(1,:) + bytes(2,:)*256);
            detObj.x = bytes(3,:) + bytes(4,:)*256;
            detObj.y = bytes(5,:) + bytes(6,:)*256;
            detObj.z = bytes(7,:) + bytes(8,:)*256;
            % Reinterpret unsigned 16-bit values as signed
            detObj.x(detObj.x > 32767) = detObj.x(detObj.x > 32767) - 65536;
            detObj.y(detObj.y > 32767) = detObj.y(detObj.y > 32767) - 65536;
            detObj.z(detObj.z > 32767) = detObj.z(detObj.z > 32767) - 65536;
            % Scale Q-format integers to meters
            detObj.x = detObj.x / xyzQFormat;
            detObj.y = detObj.y / xyzQFormat;
            detObj.z = detObj.z / xyzQFormat;
        end
    end

  • Hey Cesar,

    Thanks for the quick reply.

    Is there any way I can correct the angle information, or do I have to use another demo that is designed for the AWR1843BOOST?

    I tried working with the automated_parking demo, but the object detection with the ods_demo is far better than with the automated_parking demo (my observation). I just need to detect objects within a 2-3 meter range.

    What configuration changes do I need to make to the automated_parking example to make it work like the ods_demo?

    As far as I understand, the chirp configuration for the automated_parking example is hardcoded. Can you please point me to where I can change these parameters?

    Thanks in advance.

    BR

    Venkatesh

     

     

  • Hi,

    The ODS demo was optimized to detect static objects. So, for static object detection the ODS processing chain is better than the Automated Parking one.

    For moving objects the Automated Parking demo processing chain is better.

    Changing the configuration will not make the Automated Parking demo better for static objects.

    Have you tried the AWR1843 mmWave SDK demo?

    This is a generic processing chain that may be better for your project.

    Thank you

    Cesar

  • Hi Cesar,

    Thank you, I will try that demo and get back to you if I have any doubts.

    I think there is a typo in the data packet format of the ODS_DEMO (just for info); that's why I was a bit lost. Here is the text of the frame header packet from the ODS_DEMO:

    frameHeaderStructType = struct(...
        'sync',           {'uint16', 8}, ... % syncPattern in hex is: '02 01 04 03 06 05 08 07'
        'version',        {'uint32', 4}, ... % 0xA1642 or 0xA1443
        'totalPacketLen', {'uint32', 4}, ... % In bytes, including the header
        'platform',       {'uint32', 4}, ... % See description below
        'frameNumber',    {'uint32', 4}, ... % Starting from 1
        'timeCpuCycles',  {'uint32', 4}, ... % Time in DSP cycles when the message was created
        'numDetectedObj', {'uint32', 4}, ... % Number of detected objects
        'numTLVs',        {'uint32', 4}, ... % Number of TLVs in this message
        'subFrameIndex',  {'uint32', 4});    % Always zero

    The platform value should be 0xA1642 or 0xA1443, not the version.
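    For reference, the header above is 8 sync bytes followed by eight little-endian uint32 fields (40 bytes in total). A host-side sketch of parsing it (Python, illustrative only; the helper name is made up):

```python
import struct

MAGIC = bytes([0x02, 0x01, 0x04, 0x03, 0x06, 0x05, 0x08, 0x07])

def parse_frame_header(buf):
    # 8-byte sync pattern followed by eight little-endian uint32 fields
    if buf[:8] != MAGIC:
        raise ValueError("buffer does not start at a frame boundary")
    fields = struct.unpack('<8I', buf[8:40])
    names = ('version', 'totalPacketLen', 'platform', 'frameNumber',
             'timeCpuCycles', 'numDetectedObj', 'numTLVs', 'subFrameIndex')
    return dict(zip(names, fields))
```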

    Thank you.

    BR

    Venkatesh.

  • Thank you

    Cesar

  • Hello Cesar,

    I tried working in debug mode. I loaded the DSS and MSS programs using Code Composer Studio and ran the group as specified in the user guide. I opened the GUI and I can see the data. But when I set breakpoints in the code (the send-TLVs part in mss_main.c), execution does not stop there, even though I know that this code runs every cycle. Is there a reason for that?

    I would like to send only the x, y and z coordinates via UART from the radar; I do not want to send all the other information such as the magic word and platform. How can I send only the x, y and z coordinates?

    Thanks in advance.

    BR

    venkatesh.

  • Hi

    In order to set a breakpoint you need to disable optimization for the specific file and rebuild the code.

    Changing the optimization level in a CCS project is done by right-clicking the specific file in the CCS project and then selecting specific compile options.

    You will need to keep the magic word because it marks the boundary of the frame.

    You will need to modify the code to select the buffers you want to send to the host.
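    Since the magic word marks each frame boundary, a host can resynchronize on it. A minimal sketch (Python, hypothetical `split_frames` helper) of slicing a raw byte stream into frames:

```python
MAGIC = bytes([0x02, 0x01, 0x04, 0x03, 0x06, 0x05, 0x08, 0x07])

def split_frames(stream):
    # Return one bytes object per frame, each starting with the magic word
    frames = []
    start = stream.find(MAGIC)
    while start != -1:
        nxt = stream.find(MAGIC, start + len(MAGIC))
        frames.append(stream[start:nxt] if nxt != -1 else stream[start:])
        start = nxt
    return frames
```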

    thank you

    Cesar

    Hello Cesar,

    Previously I worked with the ODS_DEMO example on the AWR1843BOOST board.

    Currently I am using the xxr1843Demo on the same board, and its output data format (detected object information) seems different from the ODS_Demo output. The current output data is:

    0201 0403 0605 0807 0300 0303 A000 0000     Magic Word

    4318 0A00 4203 0000 E3C7 C5AA 0500 0000   5 Objects Detected

    0200 0000 0000 0000 0100 0000 5000 0000      numTLVs = 2, subFrameIndex; TLV type 1 (detected points), length 0x50

    337B AEBD 76B2 B23C 0000 0000 0000 0000    {Object information is different from the above example}

    DA64 6C3D 8DE6 873D 0000 0000 0000 0000

    9EBA 2A3E C701 C03E 0000 0000 0000 0000

    A2C1 573E 2F7C AA3F 0000 0000 0000 0000

    E89A ACBE CA1F A73F 0000 0000 0000 0000

    0700 0000 1400 0000 B400 4D03 B400 4D03

    A000 C602 A801 7002 A801 7002 0003 1300

    The message format (object information) is not the same as in the ODS_Demo example. I looked into the index file in the Doxygen folder, but there is no information there about how to get the x, y and z coordinates of the detected objects. Where can I find documentation on the specific data formats?

    How can I get the x, y and z coordinates of the detected objects? Where can I find a parser or GUI for the xxr1843Demo?
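    One possible reading of the dump above: in the mmWave SDK out-of-box demo, the detected-points TLV (type 1) typically carries four little-endian float32 values per point, x, y, z in meters and radial velocity in m/s (16 bytes per point), which would match an 80-byte TLV for 5 objects. A host-side sketch (Python, illustrative; please verify against your SDK version's documentation):

```python
import struct

POINT_SIZE = 16  # assumed layout: four float32 per point (x, y, z, velocity)

def parse_detected_points(payload, num_points):
    # Unpack num_points consecutive (x, y, z, v) float tuples from the TLV payload
    points = []
    for i in range(num_points):
        x, y, z, v = struct.unpack_from('<4f', payload, i * POINT_SIZE)
        points.append((x, y, z, v))
    return points
```

    Applied to the first 16 bytes of the dump above (33 7B AE BD 76 B2 B2 3C followed by eight zero bytes), this reading would give x ≈ -0.085 m, y ≈ 0.022 m, z = 0, v = 0.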

    Thank you.

    BR

    Venkatesh.

  • Hi

    Sorry for the late reply.

    In addition to the documentation provided, please review the source code in order to understand the format.

    Thank you
    Cesar