Hi,
I have two questions:
1. What is the minimum frame period value?
I'd like to capture real-time data from equally spaced chirps with at most a 1 ms interval. Since continuous chirping mode is not available for the IWR1443 (value 2 for dfeDataOutputMode corresponds to that option, according to the mmWave SDK user guide), I need to use the frame structure. However, I see a restriction on the frame period related to "duty cycle", and I don't understand exactly what it means. Can I make the frame period shorter than 1 ms, or place many equally spaced chirps so that every chirp (even across frame boundaries) has the same interval to the next one?
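To make the "equally spaced across frames" condition concrete, it reduces to simple arithmetic: the gap from the last chirp of one frame to the first chirp of the next must equal the intra-frame chirp interval. A quick sketch with illustrative values (the function and numbers are mine, not from any TI document):

```python
def uniform_across_frames(frame_period_us, chirps_per_frame, chirp_interval_us):
    """True when the gap from the last chirp of one frame to the first
    chirp of the next frame equals the intra-frame chirp interval,
    i.e. frame_period == chirps_per_frame * chirp_interval."""
    # Chirps in a frame fire at 0, Tc, ..., (N-1)*Tc after the frame start,
    # so the inter-frame gap is Tf - (N-1)*Tc.
    gap_us = frame_period_us - (chirps_per_frame - 1) * chirp_interval_us
    return gap_us == chirp_interval_us

# Example: 10 chirps at 100 us spacing need exactly a 1000 us frame period.
print(uniform_across_frames(1000, 10, 100))  # True
print(uniform_across_frames(1200, 10, 100))  # False: 300 us gap at the boundary
```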
2. Is there any guide on receiving real time raw data?
Through the DataCaptureDemo in the Quick Start Guide, I see that the demo Lua file records data and calls MATLAB to post-process it. However, I want to receive data continuously and process it in real time. Do you have any information about this? Also, I would really appreciate it if you could let me know where I can edit the post-processing MATLAB code.
Hi Vivek,
I'm very interested in your comment above about a PC tool not being able to process data in (near) real time. Why not? Surely a high-spec laptop can match the processor built into the IWR1642 (or the FFT accelerator of the 1443). I'm currently developing my own PC tool, so your comments have me a little concerned. At present I get 14 to 20 FPS real-time streaming (1TX 4RX) doing full RDM processing and display to screen. CPU load is low (<25% on a quad core) and I could easily parallelise a few things, e.g. I currently discard frames if I can't keep up, whereas they could go into a producer/consumer type scenario.
Am I going to face a catastrophic problem when I go to 2TX 4RX?
Regards
Hi Vivek,
I check the order of the UDP packets on the fly using the supplied header, then do signal processing on the fly. My data output is good.
However, I can only get this to work by first setting the unit up in mmWave Studio. For some reason I just don't parse the OOB demo data stream correctly; it's just about OK for my purpose.
There are lots of reasons to want real-time signal processing: advanced demonstrators that don't require porting code to C (the 1642 baseline code is comprehensive, but there are other radar signal processing techniques not implemented); building in synchronised camera streams (which mmWave Studio doesn't allow); studying complex environments that are difficult to replicate, where you want to see range-Doppler/angle maps faster than the serial port can provide, or where it is easier to see them live or with well-synchronised video; and integrating into existing test harnesses without extra software, etc.
Hi Vivek,
I can't release my code for this, but it is fairly simple using mmWave Studio. Basically all you have to do is run mmWave Studio (or, for simplicity, the data_capture Lua script) and make sure you have a UDP socket bound to the appropriate port. My advice would be:
If anybody wants to try it, I would suggest they first connect to the UDP data stream, apply sequence-number consistency checks, and then write the raw data (i.e. minus the header of sequence number and byte count) to a file. That file should be identical to one dumped by mmWave Studio itself. Then try to decipher that static file in MATLAB/Python. Once that works, it is an easy step to do it all online.
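The steps above can be sketched in Python roughly as follows. This is my illustration, not John's code: it assumes the default DCA1000 data address/port quoted later in the thread (192.168.33.30:4098) and the 10-byte raw-mode header (4-byte sequence number followed by a 6-byte byte count, little-endian); `capture` is a hypothetical helper name.

```python
import socket
import struct

DATA_ADDR = ('192.168.33.30', 4098)  # default DCA1000 data port (assumed)
HEADER_LEN = 10                      # 4-byte seq num + 6-byte byte count

def parse_header(packet):
    """Split a raw-mode UDP packet into (seq, byte_count, payload)."""
    seq = struct.unpack('<I', packet[:4])[0]
    # 6-byte little-endian count: zero-pad to 8 bytes for struct.
    byte_count = struct.unpack('<Q', packet[4:10] + b'\x00\x00')[0]
    return seq, byte_count, packet[HEADER_LEN:]

def capture(outfile, expected_packets):
    """Receive packets, flag sequence gaps, write header-stripped ADC data."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(DATA_ADDR)
    last_seq = 0
    with open(outfile, 'wb') as f:
        for _ in range(expected_packets):
            packet, _ = sock.recvfrom(2048)
            seq, byte_count, payload = parse_header(packet)
            if seq != last_seq + 1:
                print(f'sequence gap: expected {last_seq + 1}, got {seq}')
            last_seq = seq
            f.write(payload)  # raw ADC bytes only, as John suggests
```

The resulting file should then match an mmWave Studio dump, which makes it easy to validate offline before going fully online.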
Sorry I can't be of more help and share the code.
Best regards.
Hi John,
I have a question on establishing UDP connection.
It seems like the commands in the DataCaptureDemo Lua script, such as ar1.CaptureCardConfig_EthInit, handle sending and receiving simultaneously.
Since the data port is already bound by mmWave Studio, I cannot use my own C program to bind to it.
How did you set up only the sending side, so that another program (for example in C) can bind to the receiving port?
Best,
Youngjae
Hi Youngjae,
I normally run my program before the mmWave Studio Lua script, so that my program binds the port first and then just waits until some data is received. I can't recall ever hitting this problem, but next time I've got my kit running I will try it the other way round. For reference, the simple code I've got is:
import socket

# Create UDP socket bound to the DCA1000 data port.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server_address = ('192.168.33.30', 4098)
sock.bind(server_address)
I believe mmWave Studio is also able to connect to this port after I have bound it. I know this because, annoyingly, it is still able to dump the raw data files, as I found out when my hard disk was getting rather full the other day. Whether something else is going on here (broadcast/multicast?) I don't know; I was surprised that mmWave Studio had also connected.
I've also commented out the last part of the Lua script, which was to do with their post-processing, but I don't think that has any impact here.
--[===[
--Packet reorder utility processing the Raw_ADC_data
WriteToLog("Please wait for a few seconds for Packet reorder utility processing .....!!!! \n", "green")
ar1.PacketReorderZeroFill(Raw_data_path, adc_data_path, pkt_log_path)
RSTD.Sleep(10000)
WriteToLog("Packet reorder utility processing done.....!!!! \n", "green")
--Post process the Capture RAW ADC data
ar1.StartMatlabPostProc(adc_data_path)
WriteToLog("Please wait for a few seconds for matlab post processing .....!!!! \n", "green")
RSTD.Sleep(10000)
--]===]
Hope the above helps
Hi,
I can't see anything wrong with what you are doing, or at least anything different from my setup. Is there any indication of an error in the mmWave Studio "Output" window? It can be opened via the View toolbar option if it isn't already visible.
The only other thing I can think of is whether you have changed ar1.FrameConfig so that it does more than 8 frames and actually streams data. I altered the frame config to 128 chirps, a 50 ms frame period, and continuous streaming. I can't remember what the default was, but to enable continuous streaming one of the parameters below needed to go from 8 to 0 (hopefully it is obvious if you compare with your line):
if (ar1.FrameConfig(0, 0, 0, 128, 50, 0, 1) == 0) then
WriteToLog("FrameConfig Success\n", "green")
else
WriteToLog("FrameConfig failure\n", "red")
end
Best regards
Just FYI, this seems to be the opposite for me: I can bind to the port with a Python socket and receive data, but then nothing is recorded by mmWave Studio...
I have a question about how you synchronise the data to a frame. If your frame is 1000 bytes long, do you just consider the first 1000 bytes to belong to frame 1, the next 1000 to frame 2, and so on? Is there no way to resynchronise after a UDP error or lost packets?
Hi,
The sequence number and byte count can be used to maintain synchronisation and detect lost packets. See section 5.2 of the DCA1000 user guide, which gives the packet-format table for raw data mode.
So in your case, if your UDP packet had 1000 bytes, the first 4 correspond to the sequence number and the next 6 to the byte count; the remainder is radar ADC data.
Say, hypothetically, 3 UDP packets are transmitted but the second is lost. At your end you would see sequence numbers 1 and 3. By detecting that the sequence number hasn't incremented by one, you've detected the missed packet (don't forget it could also arrive out of order). Similarly, the byte count keeps incrementing by the total amount of radar ADC data transmitted. These two things can be used to judge where frames start and stop.
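As a small illustration of that sequence-number check (my sketch, not from any TI tool), a helper that sorts the received numbers first, to tolerate out-of-order delivery, and reports the missing ones:

```python
def missing_packets(received_seqs):
    """Return the sequence numbers absent from a run of received packets.
    Sorting first tolerates out-of-order delivery, as noted above."""
    seqs = sorted(received_seqs)
    expected = set(range(seqs[0], seqs[-1] + 1))
    return sorted(expected - set(seqs))

print(missing_packets([1, 3]))     # [2]  -- the lost middle packet
print(missing_packets([3, 1, 2]))  # []   -- reordered but complete
```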
I'm not sure whether the sequence number and byte count are inserted by the DCA1000 or the 1642. However, I have always found that they correspond to where the data is, and it is this I use to stay in synchronisation. I'm pretty sure it is also what the mmWave Studio script uses to post-process.
Regards
Thanks, that makes sense.
It was not clear to me whether byte count means "number of bytes in the current packet" or "number of accumulated bytes".
I was banging my head all afternoon trying to understand mismatches between the recorded adc_data_Raw_0.bin and what I see in Wireshark. If anyone else is interested: the mmWave Studio recorder adds 4 extra bytes per packet, so BOTH of the above are included in the file's header, making it 14 bytes in total (which matches the 14-byte lines in the dump below).
01000000b0050000000000000000
02000000b0050000b00500000000
03000000b0050000600b00000000
04000000b0050000101100000000
05000000b0050000c01600000000
06000000b0050000701c00000000
...
38000000b0050000d03801000000
3900000080010000803e01000000
4 bytes - packet sequence number
4 bytes - packet length
6 bytes - accumulated length of recording
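The hex dump above can be decoded directly under that reading (4-byte sequence number, 4-byte per-packet length, 6-byte accumulated byte count, all little-endian). A sketch of my own, useful for checking any recorded file against Wireshark:

```python
import struct

def parse_file_header(raw14):
    """Decode one 14-byte per-packet header from adc_data_Raw_0.bin:
    4-byte seq num, 4-byte packet length, 6-byte accumulated byte count,
    all little-endian (layout inferred from the hex dump above)."""
    seq, length = struct.unpack('<II', raw14[:8])
    accumulated = int.from_bytes(raw14[8:14], 'little')
    return seq, length, accumulated

# First two lines of the dump: 0x5b0 = 1456 payload bytes per packet.
print(parse_file_header(bytes.fromhex('01000000b0050000000000000000')))
# (1, 1456, 0)
print(parse_file_header(bytes.fromhex('02000000b0050000b00500000000')))
# (2, 1456, 1456) -- accumulated count is the total up to the previous packet
```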
I'm also not 100% sure I'll be able to keep up with real-time processing. Would you mind sharing the specs of your workstation along with your chirp/frame parameters? Are you performing all your radar processing in Python?
Thanks!
Hello Jeffrey,
If you are looking at the "adc_data_Raw_0.bin" file, then the packet format is as below. 4 additional bytes are added to indicate the data length in that specific packet. The 6 bytes of byte count are the running count of the bytes transferred from the FPGA up to the previous packet.
Regards,
Vivek