DCA1000EVM: Setting up LVDS Stream for processed radar data with MMWAVEICBOOST and AWR6843AOPEVM

Part Number: DCA1000EVM
Other Parts Discussed in Thread: AWR6843, MMWAVEICBOOST, AWR6843AOPEVM, UNIFLASH

Hi everyone,

As the title indicates, I am trying to set up the combination of AWR6843AOPEVM + MMWAVEICBOOST + DCA1000EVM in order to transmit the processed radar data of the AWR6843 via Ethernet (UDP).

So far I have been able to get most of the toolchain running on my Linux host machine.
That covers the mmWave SDK v3.6, UniFlash, the DCA1000 CLI (built with the makefile, as mmWave Studio is only available for Windows hosts), the mmWave Demo Visualizer for SDK v3.6, ...

I flashed the out-of-box demo of the SDK v3.6 and was able to configure the device via the mmWave Demo Visualizer.
The device is in functional mode.

I have connected the sensor EVM to the MMWAVEICBOOST, and both boards are connected to the DCA1000 via the 60-pin Samtec cable.

I have a power supply connected to the DC input jack of the MMWAVEICBOOST and have set the switch on the DCA1000 accordingly.

The Ethernet port of the DCA1000 is connected to my Linux host (static IPv4 192.168.33.30, subnet mask 255.255.255.0), and the XDS110 port of the MMWAVEICBOOST is also connected.

When powering up the whole setup, all related LEDs light up.

I then tried to load a configFile.json onto the FPGA with DCA1000EVM_CLI_Control.
To keep things simple, I started with the example config file from the DCA1000 user manual.

As described in the developer guide, I ran the command "export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$pwd" so that the CLI tool is able to find the .so, and the tool seems to work fine.
When using the command "./DCA1000EVM_CLI_Control cli_version", I get the following response:
"Control CLI Version : 1.0"

Now if I try to load the config file with "./DCA1000EVM_CLI_Control fpga configFile.json", I receive the following error message:
ConfigureRFDCCard_Fpga(): UDP recvfrom failed : 11
FPGA Configuration :
OS error - -2
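
For reference, errno 11 on Linux is EAGAIN, i.e. recvfrom() returned without a response ever reaching the application socket. Below is a minimal Python sketch of the exchange I assume the CLI performs; the addresses, ports and frame bytes are taken from my Wireshark capture further down, the rest is guesswork on my side and not the actual CLI source:

    # Sketch of the config exchange I assume the CLI performs.
    # In C, recvfrom() on a socket with SO_RCVTIMEO set fails with EAGAIN (errno 11)
    # when it times out; Python raises socket.timeout instead.
    import socket

    CMD = bytes.fromhex("5aa50300060001020102031eaaee")  # FPGA config frame seen on the wire

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("192.168.33.30", 4096))             # host-side config port
    sock.settimeout(5.0)
    sock.sendto(CMD, ("192.168.33.180", 4096))     # DCA1000 config port
    try:
        resp, addr = sock.recvfrom(2048)
        print("response from", addr, ":", resp.hex())
    except socket.timeout:
        print("no response reached the socket - matches the 'UDP recvfrom failed : 11' error")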

In parallel, I checked with Wireshark what is actually being sent, and on the wire everything looks fine to me.
I can see the Command Frame from source 192.168.33.30 using UDP, Port 4096, Length = 14, to destination 192.168.33.180.
Data: 5aa50300060001020102031eaaee

There also seems to be a positive response from the target, with source 192.168.33.180 to destination 192.168.33.30, from port 1024 to 4096, Length = 8.
Data: 5aa503000000aaee

I have checked the developer guide and tried to manually decode the request and the response.
Both seem fine to me, but I still cannot establish a connection or record any data.
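
For completeness, this is how I decoded them, assuming the frame layout I read out of the developer guide (2-byte header 0xA55A, 2-byte command code, 2-byte data size, payload, 2-byte footer 0xEEAA for the request; header, command code, 2-byte status, footer for the response; all little-endian). The interpretation is mine, not official code:

    # My attempt at decoding the two captured frames
    # (field layout as I understand it from the developer guide, all little-endian).
    import struct

    def decode_request(hexstr):
        b = bytes.fromhex(hexstr)
        header, cmd, size = struct.unpack_from("<HHH", b, 0)
        payload = b[6:6 + size]
        footer, = struct.unpack_from("<H", b, 6 + size)
        return dict(header=hex(header), cmd=hex(cmd), size=size,
                    payload=payload.hex(), footer=hex(footer))

    def decode_response(hexstr):
        b = bytes.fromhex(hexstr)
        header, cmd, status, footer = struct.unpack("<HHHH", b)
        return dict(header=hex(header), cmd=hex(cmd), status=status, footer=hex(footer))

    print(decode_request("5aa50300060001020102031eaaee"))
    # -> header 0xa55a, cmd 0x3, size 6, payload 01020102031e, footer 0xeeaa
    print(decode_response("5aa503000000aaee"))
    # -> header 0xa55a, cmd 0x3, status 0 (which I read as success), footer 0xeeaa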

Any help would be highly appreciated.
Thanks in advance and best regards!

  • Hi,

    One of our experts will get back to you with an answer in a day or two.

  • Thanks for the first feedback.
    Sadly, no one has replied yet.

    In the meantime, I was able to figure out most of the toolchain-related issues I was facing.

    First of all, I had to open the corresponding ports in ufw on my Linux machine. The default UDP ports for this communication setup (4096 and 4098) were blocked, and I had to allow access to them.

    The second issue was that, even though I was able to send a configuration to the FPGA of the DCA1000 and seemed to be able to start a recording, nothing was actually transmitted.
    I was not aware of the fact that I need to send the profile configuration after each power-on; the CLI of the OOB demo does not save the profile in NVM.
    After sending a chirp/radar profile and then triggering the start of a recording, I now actually receive the corresponding UDP packets (a rough sketch of that sequence is at the end of this post).

    To keep things simple at first, I have set up the transmission of the raw ADC values only, as this is probably the main purpose of this setup anyway.

    Nonetheless, while looking at the source code of the OOB demo, it seemed to me that the transmission of the radar point cloud via LVDS should be possible.
    Is this a misconception, or am I on the right track?

    Again, thanks in advance.
    BR
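
    P.S. for anyone finding this later: below is a rough sketch of how I resend the chirp/profile configuration over the XDS110 CLI UART after each power-on, before triggering the DCA1000 recording. The device name, the cfg file name and the timing are placeholders specific to my setup, not reference code; the 115200 baud rate is what the OOB demo CLI port uses for me.

        # Sketch: push the OOB demo .cfg over the CLI UART after every power-on
        # (the demo does not keep the profile in NVM).
        # Port name and file name are placeholders for my setup.
        import time
        import serial  # pyserial

        CLI_PORT = "/dev/ttyACM0"   # XDS110 application/user UART on my host
        CFG_FILE = "profile.cfg"    # chirp/profile configuration, ends with sensorStart

        with serial.Serial(CLI_PORT, 115200, timeout=1) as cli:
            for line in open(CFG_FILE):
                line = line.strip()
                if not line or line.startswith("%"):   # skip comments and empty lines
                    continue
                cli.write((line + "\n").encode())
                time.sleep(0.05)                       # give the demo time to answer
                print(line, "->", cli.read(cli.in_waiting or 1).decode(errors="ignore").strip())
        # After this, I trigger the recording with the DCA1000 CLI tools as before.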

  • Hi,

    The page below lists our documentation on the methods for raw data capture for our devices. Let us know if what you are trying to do does not match one of these use cases.

    https://dev.ti.com/tirex/content/radar_toolbox_2_20_00_05/docs/hardware_guides/raw_data_capture_tools_summary.html

    Thanks,

    Clinton

  • Hi Clinton,

    thanks for the reply.

    As all methods described under the link you've shared are related to raw data capture (i.e. raw ADC values), none of these use cases matches mine.

    I want to transmit the calculated point cloud data (cartesian), not the raw ADC data.

    This is probably not the focus of the OOB demo, but since it does transmit this data via UART, I assume it should also be possible to stream it via LVDS and have the DCA1000 send it out in UDP frames.

    BR

  • It may be possible, but the OOB demo does not natively support it.

    Thanks,

    Clinton

  • Hi Clinton,

    could you explain what the difference between the HW session and the SW session is, and what each of them is used for?
    From reading the User Guide, I could not find a proper explanation.

    In the source code of the OOB demo, in mmw_lvds_stream.c/h, I found the function "MmwDemo_LVDSStreamSwConfig", which seems to configure the LVDS stream to transmit the detected objects' point cloud.
    However, when I check for references to this function, it looks like it is not called anywhere.

    In mss_main.c, I have identified the spot where the processed data is sent out via UART ("MmwDemo_transmitProcessedOutput").
    Just a couple of lines before that, there is an if-statement checking whether a SW session was enabled for the current sub-frame.
    If so, "MmwDemo_transferLVDSUserData" is called, which seems to transmit the DPC results via LVDS.

    Am I getting this wrong, or is only a small modification necessary in order to achieve my goal?

    Do you have any additional information on the concept of HW / SW session?

    Even the Doxygen documentation for the AWR68xx OOB demo, in the chapter "Streaming data over LVDS", says:
    "The LVDS streaming feature enables the streaming of HW data (a combination of ADC/CP/CQ data) and/or user specific SW data through LVDS interface."
    A couple of lines below that, there is a picture of the LVDS SW data format that shows DPIF_PointCloudCartesian_t (see the sketch at the end of this post for how I currently picture that payload).

    To me, this sounds like the demo does natively support the transmission of the processed point cloud; it just does not seem to be clearly described how to enable it.
    Or maybe I am just too blind to find the right information. ;-)

    Thanks again and BR
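
    P.S.: to make the question more concrete, this is how I currently picture the SW session payload, assuming each detected point comes out as a DPIF_PointCloudCartesian_t with four 32-bit floats (x, y, z, velocity). That layout is only my reading of the Doxygen figure, so please correct me if it is wrong:

        # Sketch only: unpack a buffer of DPIF_PointCloudCartesian_t elements,
        # assuming each point is four little-endian float32 values (x, y, z, velocity).
        import struct

        POINT_FMT = "<4f"                        # assumed element layout
        POINT_SIZE = struct.calcsize(POINT_FMT)  # 16 bytes

        def unpack_point_cloud(payload: bytes):
            """Yield (x, y, z, velocity) tuples from a SW-session user-data payload."""
            for off in range(0, len(payload) - POINT_SIZE + 1, POINT_SIZE):
                yield struct.unpack_from(POINT_FMT, payload, off)

        # Quick self-test with two made-up points:
        demo = struct.pack("<4f", 1.0, 2.0, 0.5, -0.3) + struct.pack("<4f", 0.1, 3.2, 0.0, 1.1)
        for pt in unpack_point_cloud(demo):
            print(pt)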

  • Hi MaWa,

    Apologies that this thread was left open for some time. Are you still facing these issues and needing support?

    Is your goal to output the point cloud (cartesian) over LVDS, or over UART?

    Thanks,

    Angie

  • Hi Angie,

    we are currently checking whether the UART transmission and the data it carries are sufficient for us.

    I have now set up the demo as intended to transmit the processed radar data via UART and used the provided Python scripts to decode the transmitted data (a stripped-down illustration of the frame parsing is at the end of this post).

    I will update this thread once we have further questions.

    BR
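
    P.S.: a stripped-down illustration of what the decoding boils down to on my side: sync on the magic word of the demo output frame and read the frame header before the TLVs. The header layout below (eight uint32 fields after the 8-byte magic word) is what I derived for the SDK 3.6 OOB demo on my setup; it is meant as an illustration, not as the TI script.

        # Simplified illustration of the UART output parsing (not the TI script).
        # Assumed SDK 3.x OOB demo frame: 8-byte magic word, then eight uint32 LE fields
        # (version, totalPacketLen, platform, frameNumber, timeCpuCycles,
        #  numDetectedObj, numTLVs, subFrameNumber), followed by the TLVs.
        import struct

        MAGIC = bytes((2, 1, 4, 3, 6, 5, 8, 7))   # magic word at the start of each frame

        def parse_frame_header(buf: bytes):
            start = buf.find(MAGIC)
            if start < 0 or len(buf) < start + 40:
                return None                        # no complete header in the buffer yet
            (version, total_len, platform, frame_num,
             cpu_cycles, num_obj, num_tlvs, sub_frame) = struct.unpack_from("<8I", buf, start + 8)
            return dict(frame=frame_num, total_len=total_len,
                        detected_objects=num_obj, num_tlvs=num_tlvs, sub_frame=sub_frame)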

  • Hello,

    Thank you for the update! Please feel free to let us know if you have any other questions.

    Regards,

    Kristien

  • Hi everyone,

    my colleagues have now taken a closer look at the radar data transmitted via UART and rated it as usable.

    For now, there is no further need for support on transmitting the data via Ethernet.

    Thanks to everyone for the responses.

    BR