DCA1000EVM: minimum frame period / receiving real time raw data

Part Number: DCA1000EVM
Other Parts Discussed in Thread: IWR1443, IWR1642, AWR1243, AWR1642

Hi,

I have two questions:

1. What is the minimum frame period value?

I'd like to capture real-time data from equally spaced chirps with at most a 1 ms interval. Since continuous chirping mode is not available for the IWR1443 (value 2 for dfeDataOutputMode corresponds to that option according to the mmWave SDK user guide), I need to use the frame structure. However, I see some restriction on the frame period related to "duty cycle", and I don't understand what it exactly means. Can I make the frame period shorter than 1 ms, or put in many equally spaced chirps so that every chirp (even across different frames) has the same interval to the next one?

2. Is there any guide on receiving real time raw data?

Through the DataCaptureDemo in the Quick Start Guide, I see the demo Lua file records data and calls MATLAB to post-process it. However, I want to receive data continuously and process it in real time. Do you have any information about this? Also, I would really appreciate it if you let me know where I can edit the post-processing MATLAB code.

  • Hello,
    The radar chirps are sent in the form of frames. You can send N chirps (typically spaced a few microseconds apart) to form a frame. You can find some basics of this in the following app note: www.ti.com/.../swra553
    The data analyses, i.e. range, velocity etc., are typically done at frame level. Based on the chirp configuration set, i.e. the ramp end time, idle time etc., you can find the time taken to complete one chirp (idle time + ramp end time) and hence compute the time needed for N chirps; this gives the "active frame time".
    Now there is a frame periodicity, which decides the gap from one frame to the next. This is typically chosen based on the measurement update rate required (frames per second). If you need 20 fps then the frame periodicity is set to 50 msec. Active frame time / frame periodicity gives the duty cycle of the frames. Typically a 50% duty cycle is common, but this is not mandatory.
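    The arithmetic above can be sketched in a few lines. All of the numbers below are hypothetical; substitute your own chirp configuration.

```python
# Worked example of the frame-timing arithmetic described above.
# The chirp values are hypothetical placeholders, not a recommended config.
idle_time_us = 7.0        # chirp idle time
ramp_end_time_us = 58.0   # chirp ramp end time
n_chirps = 128            # chirps per frame
frame_period_ms = 50.0    # frame periodicity (20 fps update rate)

chirp_time_us = idle_time_us + ramp_end_time_us           # one chirp
active_frame_time_ms = n_chirps * chirp_time_us / 1000.0  # "active frame time"
duty_cycle = active_frame_time_ms / frame_period_ms

print("active frame time: %.2f ms" % active_frame_time_ms)  # 8.32 ms
print("duty cycle: %.1f%%" % (duty_cycle * 100))            # 16.6%
```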

    regards,
    Vivek
  • Hello,
    Regarding real-time capture and real-time processing: the DCA1000 can capture the data at the rate at which it is sent out from the AWR device (as long as it is within the data rate supported by the Ethernet port). But real-time processing cannot be done, because a PC tool cannot process the data at this rate (we are talking about 500 Mbps or so). Hence it is stored in a file and processed after it is saved.

    Regards,
    Vivek
  • Hi Vivek,

    I'm very interested in your comment above about a PC tool not being able to process data in (near) real time. Why can't it, when surely a high-spec laptop can match the processor built into the IWR1642 (or match the FFT accelerator of the 1443)? I'm currently in the process of developing my PC tool, so your comments have me a little concerned. Currently I get 14 to 20 FPS real-time streaming (1TX 4RX) doing full RDM processing and display to screen. CPU load is low (<25% on a quad core) and I could easily parallelise a few things; e.g. I currently discard frames if I can't keep up, whereas they could go into a producer/consumer type scenario.

    Am I going to face a catastrophic problem when I go to 2TX 4RX?

    Regards

  • Hello John,
    Presently how are you reading the data over Ethernet? The mmWave Studio tool that we provide to configure the mmWave sensor and get the raw ADC data from the DCA1000 stores the data in a file, and the file is then read back for processing. The reason is that there is a lot of data handling to be done before the data can be processed. Since the data is sent as UDP packets, there is no guarantee of the order of arrival of packets, no guarantee of the reception of each packet on the PC end, etc. Hence, once a large chunk of data is received, the packets may need to be reordered; if there are any missing packets there might be a need to stuff zeros to mark the missed data, etc. So it's not easy to do all this in real time as the data is coming in.

    Could you clarify why you want to get the raw data in real time? Instead you could get the processed data from the 1642, which would be the final object data.

    Regards,
    vivek
  • Hi Vivek,

    I check the order of the UDP packets on the fly using the supplied header, then do the signal processing on the fly. My data output is good.

    However, I can only get this to work by first setting the unit up in mmWave Studio. For some reason I just don't decode the OOB demo data stream correctly. It's just about OK for my purpose.

    There are lots of reasons why I want to do real-time signal processing: advanced demonstrators that don't require porting code to C (the 1642 baseline code is comprehensive, but there are other radar signal processing techniques not implemented); building in synchronised camera streams (mmWave Studio doesn't allow this); studying complex environments that are difficult to replicate, where I want to see range-Doppler/angle maps faster than serial can provide, or where it is easier to see them live or with well-synchronised video; and integrating into current test harnesses without extra software, etc.

  • Hello John,
    Great to hear that you are able to extract the data directly from the Ethernet port and process it in real time! It would be a great help to the forum if you could share details about how exactly you implemented it and how you extracted the valid ADC data from the UDP packet.

    By the way, if you use mmWave Studio you can also enable continuous CW transmission instead of chirps/frames. This relates to the first query posted above.

    Regarding the OOB demo data, that adds some additional headers along with the ADC data on the device. But if you are able to use mmWave Studio I would recommend you continue using that, since you get more flexibility there.

    Regards,
    Vivek
  • Hi Vivek,

    I can't release my code to do this, but it is fairly simple using mmWave Studio. Basically all you have to do is run mmWave Studio (or, for simplicity, the data_capture Lua script) and make sure you have connected a UDP socket to the appropriate connection. My advice would be:

    • Have one process to capture the data over UDP. Each UDP packet has a header of sequence number and byte count. This header can be used to find missing or out-of-sequence UDP packets. In my experience the first few frames captured can have a couple of packets missing. After that, though, the stream is steady enough that it isn't really a problem. I've cheated a little here: when I detect the sequence number is out of order I just ignore that entire frame. Apart from the first frame, or when running at greater than 20 frames per second with heavy processing, that gives me a good frame rate (a human can't observe a problem).
      • You need to collect the data from multiple UDP packets. Note that the end of a frame will most likely be in the middle of a UDP packet. To determine how much data to collect you need to know the size in bytes of a radar frame. If you can decode an adc.bin file in MATLAB then it should be easy to work out how big a frame of data is.
    • Have another process to 'get' the data from the data capture process queue and do any radar signal processing required - RDM, DOA, CFAR, tracking etc.
    • Finally, have another process to do any plotting to screen in real time (pyqtgraph isn't a bad place to start for real-time plotting in Python, as I find matplotlib can't quite hack it).

    If anybody wants to try it, I would suggest they first connect to the UDP data stream, apply sequence-number consistency checks, and then write the raw data (i.e. minus the header of sequence number and byte count) to a file. That file should be the same as one dumped by mmWave Studio. Then try to decipher that static file in either MATLAB or Python. If that can be done then it is an easy step to do it all online.
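    As a rough sketch of that first experiment (function names are mine; the IP/port follow this thread, and the 4-byte sequence number + 6-byte byte count layout follows the DCA1000 user guide):

```python
import socket
import struct

def split_packet(packet):
    """Split a raw-mode DCA1000 UDP packet into (seq, byte_count, payload).

    Header layout per the DCA1000 user guide: 4-byte little-endian sequence
    number, then a 6-byte little-endian byte count, then raw ADC data.
    """
    seq = struct.unpack('<I', packet[:4])[0]
    # Pad the 6-byte count to 8 bytes so struct can read it as a uint64.
    byte_count = struct.unpack('<Q', packet[4:10] + b'\x00\x00')[0]
    return seq, byte_count, packet[10:]

def capture_to_file(path, n_packets):
    """Receive n_packets from the data port, report sequence gaps, and write
    the raw ADC payloads (headers stripped) to a file."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(('192.168.33.30', 4098))  # data port used in this thread
    expected_seq = 1
    with open(path, 'wb') as f:
        for _ in range(n_packets):
            packet, _ = sock.recvfrom(4096)
            seq, _, payload = split_packet(packet)
            if seq != expected_seq:
                print('sequence gap: expected %d, got %d' % (expected_seq, seq))
            expected_seq = seq + 1
            f.write(payload)
```

    With no packet loss, the file written this way should match the reordered output of mmWave Studio, which makes it a useful sanity check before attempting anything online.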

    Sorry I can't be more help and give the code.

    Best regards.

  • Hi John,

    I have a question on establishing UDP connection.

    It seems like the commands in the DataCaptureDemo Lua script, such as ar1.CaptureCardConfig_EthInit, deal with sending and receiving simultaneously.

    Since the data port is already bound by mmWave Studio, I cannot use my own C program to bind to the port.

    How did you set up only the sending side, so that you can connect to the receiving port from another platform like C?

    Best,

    Youngjae

  • Hi Youngjae,

    I normally run my program before the mmWave Studio Lua script, so that my program binds the port and then just waits until some data is received. I can't recall ever having this problem, but next time I've got my kit running I will try it the other way round. Just for reference, the simple code I've got is:

    # Create UDP socket to the data port.
    import socket
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    server_address = ('192.168.33.30', 4098)
    sock.bind(server_address)

    I believe mmWave Studio is also able to connect to this port after I have. I know this because, annoyingly, it is able to download the raw data files, as I found out when my hard disk was getting rather full the other day. Whether something else is going on here, like broadcast/multicast, I don't know; I was surprised that mmWave Studio had also connected.

    I've also commented out the last bit of the Lua script that was to do with their post-processing, but I don't think that would have any impact here.

    --[===[

    --Packet reorder utility processing the Raw_ADC_data
    WriteToLog("Please wait for a few seconds for Packet reorder utility processing .....!!!! \n", "green")
    ar1.PacketReorderZeroFill(Raw_data_path, adc_data_path, pkt_log_path)
    RSTD.Sleep(10000)
    WriteToLog("Packet reorder utility processing done.....!!!! \n", "green")

    --Post process the Capture RAW ADC data
    ar1.StartMatlabPostProc(adc_data_path)
    WriteToLog("Please wait for a few seconds for matlab post processing .....!!!! \n", "green")
    RSTD.Sleep(10000)

    --]===]


    Hope the above helps
  • Hi John,

    Thank you for sharing your experience.

    I think the main problem preventing the simultaneous connection was using INADDR_ANY for my PC's address. (I still don't understand the mechanism behind multiple bindings to the same port.)

    Now I can have a socket listening on the data port and run the capture_demo script at the same time, but neither of them receives any data.

    Do you have any idea on this?

    Below is the python code I am using.

    from socket import socket, AF_INET, SOCK_DGRAM

    BUFSIZE = 4096
    s = socket(AF_INET, SOCK_DGRAM)
    s.bind(('192.168.33.30', 4098))
    print 'udp echo server ready'
    while 1:
        data, addr = s.recvfrom(BUFSIZE)
        print 'server received from %r' % (addr,)


    Best,
    Youngjae
  • Hi,

    I can't see anything wrong with what you are doing, or at least anything different from what I do. Is there any indication of an error in the mmWave Studio "Output" window? It can be accessed via the View toolbar option if not already present.

    The only other thing I can think of is: have you changed ar1.FrameConfig so that it does more than 8 frames and actually streams data? I altered the frame config to 128 chirps, a 50 ms frame period (i.e. 20 FPS), and continuous streaming. I can't remember what the default was, but to set continuous streaming one of the parameters below needed to go from 8 -> 0 (hopefully it is obvious if you compare with your line):

    if (ar1.FrameConfig(0, 0, 0, 128, 50, 0, 1) == 0) then
        WriteToLog("FrameConfig Success\n", "green")
    else
        WriteToLog("FrameConfig failure\n", "red")
    end

    Best regards

  • Well, it doesn't work with the same settings as yours.

    Through Wireshark, I could see there was packet transmission (though neither Python nor mmWave Studio received the data).
    However, during the transmission (in continuous mode), when I stop the Python program, mmWave Studio saves data to the raw file.

    Best
  • How big is your BUFSIZE in s.recvfrom(BUFSIZE)? I've set mine to 4096. I can't see why that should make any difference though.
  • Mine is also 4096. Probably the multiple binding behaves differently depending on the environment.
  • Just FYI, this seems to be the opposite for me: I can bind to the port through a Python socket and receive data, but then nothing is recorded by mmWave Studio...

    I have a question about how you synchronise the data to a frame. If your frame is 1000 bytes long, do you just consider the first 1000 bytes to belong to frame 1, the next 1000 to frame 2, etc.? Is there no way to resynchronise if there's a UDP error or lost packets?

  • Hi,

    The sequence number and byte count can be used to maintain synchronisation and detect lost packets. See section 5.2 of the DCA1000 user guide; for raw data mode the packet format is a 4-byte sequence number, a 6-byte byte count, and then the data.

    So in your case, if your UDP packet had 1000 bytes, the first 4 correspond to the sequence number and the next 6 to the byte count. The remainder is radar ADC data.

    Say, hypothetically, 3 UDP packets are transmitted but the second is lost. You would see at your end a sequence number of 1 and then 3. By detecting that the sequence number hasn't incremented by one, you've detected the missed packet (don't forget it could also arrive out of order). In a similar way, the byte count keeps incrementing by the total amount of radar ADC data transmitted. These two things can be used to judge where frames start and stop.
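    One hedged way to act on this (the function and variable names are mine, not from any TI tool): use the running byte count to zero-fill the span of a lost packet, so chirp/frame boundaries stay aligned.

```python
import struct

def payload_with_zero_fill(prev_end, packet):
    """Given the stream offset where the previous accepted packet ended,
    return (new_end, data), where data is this packet's ADC payload preceded
    by zeros covering any bytes lost in between.

    Header layout per the DCA1000 user guide: 4-byte sequence number,
    6-byte byte count (bytes sent before this packet), both little-endian.
    """
    byte_count = struct.unpack('<Q', packet[4:10] + b'\x00\x00')[0]
    payload = packet[10:]
    missing = byte_count - prev_end  # bytes carried by lost packets
    return byte_count + len(payload), b'\x00' * missing + payload
```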

    I'm not sure whether the sequence number and byte count are put in by the DCA1000 or the 1642. However, I have always found that they correspond to where the data is, and it is this that I use to stay in synchronisation. I'm pretty sure this is also what the mmWave Studio script uses to post-process.

    Regards

  • Thanks, that makes sense.

    It was not clear to me whether byte count means "number of bytes in the current packet" or "number of accumulated bytes".

    I was banging my head all afternoon trying to understand mismatches between the recorded adc_data_Raw_0.bin and what I see coming across Wireshark. If anyone else is interested: the mmWave Studio recorder adds 4 additional bytes (the per-packet length), so both of the above are included in the header. That makes the header a total of 14 bytes.

    01000000b0050000000000000000
    02000000b0050000b00500000000
    03000000b0050000600b00000000
    04000000b0050000101100000000
    05000000b0050000c01600000000
    06000000b0050000701c00000000
    ...
    38000000b0050000d03801000000
    3900000080010000803e01000000

    4 bytes - packet seq num
    4 bytes - packet length
    6 bytes - accumulated length of recording
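    The hex lines above decode cleanly as a 14-byte header: 4-byte sequence number, 4-byte packet length, 6-byte accumulated count, all little-endian. A small sketch (the function name is mine):

```python
import struct

def parse_file_header(hdr):
    """Decode the 14-byte per-packet header observed in adc_data_Raw_0.bin:
    4-byte sequence number, 4-byte packet length, 6-byte accumulated byte
    count, all little-endian."""
    seq, length = struct.unpack('<II', hdr[:8])
    # Pad the 6-byte accumulated count to 8 bytes to read it as a uint64.
    accumulated = struct.unpack('<Q', hdr[8:14] + b'\x00\x00')[0]
    return seq, length, accumulated

# Last hex line above: packet 57, a short 384-byte packet, after
# 56 full packets of 1456 bytes (56 * 1456 = 81536).
hdr = bytes.fromhex('3900000080010000803e01000000')
print(parse_file_header(hdr))  # (57, 384, 81536)
```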

    I'm also not 100% sure I'll be able to keep up with real-time processing... would you mind sharing the specs of your workstation along with your chirp/frame parameters? Are you performing all your radar processing in Python?

    Thanks!

  • Hi,

    Processor Intel(R) Core(TM) i7-7700HQ CPU @ 2.80GHz, 2801 Mhz, 4 Core(s), 8 Logical Processor(s)
    Installed Physical Memory (RAM) 16.0 GB

    At the moment I'm just doing 1TX and 4RX, but I have also played with 2TX and 4RX with minimal impact on speed. Chirp parameters:
    nADCSamples = 256
    nChirps = 128
    framePeriod = 50ms (i.e. 20 FPS).
    complex 1 mode (sample rate 5000e3)

    All signal processing is done in Python (plus capture of images from a webcam). For my current testing I typically get 15 FPS. If I can't keep up I just discard frames; I also discard any frame where I miss a packet or have one out of sequence. I think if I bothered to optimise my processing scheme I could achieve the actual radar frame rate of 20 FPS.

    Not sure what would happen if I really tried to push the chirp parameters or put in an AWR1243 at full whack. I suspect it would slow down quite a bit. I don't know how easy it would be, but my plan when I hit that wall is to try offloading some processing to my GPU, which is currently sitting idle.

    Regards
  • Hello John,
    That is right: the sequence number is used to detect any UDP packet drop, and the byte count can be used to know the size (in bytes) of what was dropped. With this information the application could "pad" those missing bytes with zeros and hence maintain the chirp/frame boundary by just maintaining a byte count. These are added by the DCA1000, not by the AWR1642.

    Regards,
    vivek
  • Hello Jeffrey,

    If you are looking at the "adc_data_Raw_0.bin" file, then the packet format is as follows: 4 additional bytes are added to indicate the data length in that specific packet, and the 6 bytes of byte count are the running count of the bytes transferred from the FPGA up to the previous packet.

    Regards,

    Vivek