Tool/software:
Hi experts,
Problem background: our project encodes four video streams simultaneously, and each stream has a different resolution. Our testing showed that the resolution affects the encoding time. Last week the customer introduced a new resolution specification, with lower resolutions than the old specification. After switching to the new resolutions, we found that the total four-way encoding time increased instead of decreasing.
Old specification:    New specification:
832*1120          ->  640*864
1568*1120         ->  1216*864
744*696           ->  448*608
1920*1080         ->  1920*1080 (unchanged)
Tests done so far:
With the old specification resolutions, the total four-way encoding time is 33 ms.
With the new specification resolutions, the total four-way encoding time is 48 ms.
With 448*608 changed back to 744*696 (the other three streams keeping the new specification), the total four-way encoding time is 28 ms.
With 448*608 changed to 1920*1080 (the other three streams keeping the new specification), the total four-way encoding time is 34 ms.
With 448*608 changed to 384*384 (the other three streams keeping the new specification), the total four-way encoding time is 45 ms.
The tests above push all four frames at the same time (parallel). The tests below push serially: a frame is pushed to the first encoding pipeline, and only after it finishes is the next frame pushed to the second pipeline, and so on.
With the new specification resolutions, the four per-stream times are 9 ms, 15 ms, 5 ms, 27 ms.
With 448*608 changed back to 744*696 (the other three streams keeping the new specification), the four per-stream times are 6 ms, 9 ms, 6 ms, 17 ms.
With 448*608 changed to 1920*1080 (the other three streams keeping the new specification), the four per-stream times are 6 ms, 10 ms, 17 ms, 17 ms.
With 448*608 changed to 384*384 (the other three streams keeping the new specification), the four per-stream times are 9 ms, 16 ms, 4 ms, 28 ms.
With the other three pipelines disabled and only the 1920*1080 stream pushed, single-channel encoding takes 28 ms.
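For clarity, the two push modes used in the tests above can be sketched as follows. This is a hypothetical reproduction of the measurement harness, not our real code: encode_frame is a stand-in stub whose cost is simply modeled as proportional to the pixel count, and the resolution list is the new specification from the table above.

```python
import threading
import time

# New-specification resolutions (width, height) from the table above.
RESOLUTIONS = [(640, 864), (1216, 864), (448, 608), (1920, 1080)]

def encode_frame(width, height):
    # Stand-in for one pipeline encoding one frame; the real encoder
    # call is replaced by a sleep proportional to the pixel count.
    time.sleep(width * height / 200e6)

def serial_push():
    # Serial mode: push a frame to the first pipeline, wait for it to
    # finish, then push to the second pipeline, and so on.
    start = time.perf_counter()
    for w, h in RESOLUTIONS:
        encode_frame(w, h)
    return time.perf_counter() - start

def parallel_push():
    # Parallel mode: push one frame to all four pipelines at once and
    # wait for all of them to complete.
    threads = [threading.Thread(target=encode_frame, args=r)
               for r in RESOLUTIONS]
    start = time.perf_counter()
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return time.perf_counter() - start
```

With this stub model the parallel total is bounded by the slowest stream, while the serial total is the sum of all four, which matches the general shape of the measurements (48 ms parallel vs. 9+15+5+27 = 56 ms serial).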
From the above phenomena:
1. Changing the resolution of one stream affects the encoding time of the other streams.
2. When a resolution is reduced, the encoding time does not decrease; it increases.
3. When a resolution is increased, the encoding time does not increase as expected; it decreases.
4. Pushing four frames into the encoding pipelines in parallel takes less total time than feeding the four frames serially.
5. In theory, reducing the resolution should reduce the encoding time, which contradicts the test results. Does the encoder perform some special operation at low resolutions that affects the encoding time of the other three streams?
Best regards,