Hi all,
In the DM365 codec FAQs it is written:
For the H.264 encoder (STD quality mode), the DDR bandwidth is 230 Mbytes/sec for 720p@30fps.
For the H.264 HP progressive decoder, the DDR bandwidth is 350 Mbytes/sec for 720p@30fps.
I am not able to work out how these figures were arrived at.
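To show where I get stuck, here is my own back-of-the-envelope attempt. The number of frame passes below is purely my guess and not from any TI document:

    #include <stdio.h>

    int main(void)
    {
        /* Raw 720p frame in YUV 4:2:0: 1280 x 720 luma + half that again for chroma */
        const double bytes_per_frame = 1280.0 * 720.0 * 1.5;  /* ~1.38 MB */
        const double fps             = 30.0;

        /* One full read or write of every frame costs this much DDR traffic */
        const double one_pass = bytes_per_frame * fps;        /* ~41.5 MB/s */

        /* GUESS: the encoder touches each frame several times (input read,
           reference reads for motion estimation, reconstructed frame write,
           loop filter read/write, ...), so the quoted figure would be
           some multiple of one_pass. */
        printf("one full-frame pass : %.1f MB/s\n", one_pass / 1e6);
        printf("passes implied by 230 MB/s: %.1f\n", 230e6 / one_pass);
        return 0;
    }

By this reasoning the quoted 230 Mbytes/sec corresponds to roughly 5-6 full-frame passes over DDR, but I cannot find which accesses TI actually counted. Could someone confirm or correct this?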
Also, could someone tell me how to calculate the best/worst-case frames per second if my DSP runs at 300 MHz and the DDR bandwidth is 230 Mbytes/s, for 720x480 resolution? And do these figures also depend on the target bit rate and the quantization parameter value? My attempt is sketched below.
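The only bound I can think of is the DDR one. A minimal sketch, assuming (again my assumption) that the encoder needs about the same number of frame passes at 720x480 as at 720p:

    #include <stdio.h>

    int main(void)
    {
        const double ddr_bw = 230e6;                /* bytes/s available    */
        const double frame  = 720.0 * 480.0 * 1.5;  /* D1 YUV 4:2:0 bytes   */
        const double passes = 5.5;                  /* GUESS, carried over
                                                       from the 720p figure */

        /* fps is capped by how many (frame * passes) transfers fit in bw */
        printf("max fps (DDR bound): %.1f\n", ddr_bw / (frame * passes));
        return 0;
    }

This ignores the 300 MHz cycle budget entirely, which is presumably the other limit (cycles per macroblock), and I do not know how bit rate and QP feed in. My guess is they mainly change the bitstream read/write traffic, which is small next to the pixel traffic, but I would like this confirmed.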
Also, regarding latency, it is written:
All the codecs operate at frame level and hence the latency is one frame encode or decode time. For 720p@30fps on a 300 MHz device, the latency will be approx 33 ms.
How is the latency calculated for the above scenario?
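My reading is that this is just the frame period: at 30 fps, one frame time is 1/30 s = 33.3 ms, and since the codec only returns once a whole frame has been encoded or decoded, the minimum latency is one frame time. Is that all there is to it, or does the 300 MHz clock enter the calculation somewhere?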
In later versions of the H.264 codec (ver 2.00.00 and beyond), there is a provision for a slice-level callback API which enables the application to give/take data to/from the codec at sub-frame level. For the same 720p@30fps on a 300 MHz device, the latency decreases to 5 ms compared to 33 ms.
But end-to-end system latency will be the sum of capture + encode + transmit buffering <-Network-> receive buffering + decode + display. So there are many factors other than the codecs which come into play here.
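Regarding the 5 ms figure: if my 33 ms reading above is right, I would guess it is simply the frame time divided by the number of slices between callbacks (33 ms / 5 ms suggests roughly 6-7 slices per frame), but that slice count is my guess and not something I found documented.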
It is written that the system latency will be the sum total of all the above latency figures. What is the typical latency of each of these stages? I have sketched how I picture the budget below.
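To make the question concrete, every number here is a placeholder I made up, except the encode figure from the FAQ; these placeholders are exactly the values I am asking about:

    #include <stdio.h>

    int main(void)
    {
        /* All figures in ms; PLACEHOLDERS except encode (FAQ: ~33 ms,
           or ~5 ms with the slice-level callback) */
        const double capture   = 33.0; /* guess: one frame time?      */
        const double encode    = 33.0; /* from the FAQ                */
        const double tx_buffer = 0.0;  /* unknown                     */
        const double network   = 0.0;  /* deployment dependent        */
        const double rx_buffer = 0.0;  /* unknown (jitter buffer?)    */
        const double decode    = 33.0; /* guess: also one frame time? */
        const double display   = 33.0; /* guess: one frame time?      */

        printf("end-to-end: %.0f ms + network/buffering\n",
               capture + encode + tx_buffer + network +
               rx_buffer + decode + display);
        return 0;
    }

In particular, are capture and display really one frame time each on the DM365, and how much receive buffering does a typical setup need before the decoder can start?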
Regards,
Mayank