What is the practical way to measure the latency of H.264 encoding?
Is it the time from when the video is read out of the sensor to the time the H.264 encoded stream is available?
From the system/user point of view, I believe the latency of the "whole process" is what matters to the user. In other words, it is the time from when the video frame enters the sensor, through the encoder and the decoder, to the display medium. It includes all system communication delays as well as the decoder (software or hardware) processing.
One can argue it depends on how latency is defined. For the user's application, IMHO, the only latency that matters to the user is the time until a visible image can be recognized on the display medium.
I have a way to insert a timecode into the frame before it enters the encoder, and then insert another timecode on the frame as it exits the decoder, just before the display medium; the delta is measured with a precision of 1/10 ms. Using this method, we have not found any HD H.264 encoder/decoder pair, large or small, that can produce a latency below 700 ms (on both ASI (wire) and IP (TS over IP) connections). So far we have tested about twenty (20) HD encoders.
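To make the timing side of the method concrete, here is a rough sketch in C. Only clock_gettime() is a real API call; how the timecode actually gets stamped into the frame and read back out (burned-in pixels, SEI message, etc.) depends on the test setup and is not shown, and both stamps are assumed to come from the same clock.

#include <stdio.h>
#include <time.h>

/* Monotonic clock in milliseconds, with resolution well under 1/10 ms. */
static double now_ms(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec * 1000.0 + ts.tv_nsec / 1e6;
}

int main(void)
{
    /* Timecode taken as the frame enters the encoder. */
    double t_in = now_ms();

    /* ... frame travels: encoder -> ASI or TS-over-IP link -> decoder ... */

    /* Timecode taken as the same frame exits the decoder, just before display. */
    double t_out = now_ms();

    /* End-to-end latency for that frame, reported to 1/10 ms. */
    printf("latency = %.1f ms\n", t_out - t_in);
    return 0;
}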
I have not performed this type of experiment with the TI DM368IPNC-MT5 camera yet, because the only way to output its video is with the VLC program on a PC monitor, and the software (VLC) performance may vary depending on many factors (processor, memory, running tasks, etc.). Once I complete the MPEG-2 Transport Stream wrapper, I will be able to send the encoded stream to an external decoder and then apply our method.
Has anyone practically measured (not scientifically calculated) the DM368IPNC-MT5 latency, under any "definition" of latency, and would like to share the result?
BTW, what is the "calculated latency" number for 1280x720p30 H.264 encoding?
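In case it helps frame the question, my own back-of-the-envelope arithmetic looks like the following; the four-frame pipeline depth is purely my assumption, not a TI figure, and it ignores network/TS buffering entirely.

#include <stdio.h>

int main(void)
{
    const double fps = 30.0;
    const double frame_period_ms = 1000.0 / fps;   /* 33.33 ms per frame at 720p30 */

    /* Assumed pipeline depth: capture + encode + decode + display,
     * one frame time each -- an assumption, not a measured or specified value. */
    const int frames_in_flight = 4;

    printf("frame period       : %.2f ms\n", frame_period_ms);
    printf("assumed lower bound: %.1f ms\n", frames_in_flight * frame_period_ms);
    return 0;
}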
Regards,
Kien