From the thread above, I understand that the amount of time spent collecting the EOM error count for a single point can be calculated as
(ADRS 0x2A) x 4096 x 32 / DataRate
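
For reference, here is a short Python sketch of how I am interpreting this formula. The register value used below is only a placeholder, since the actual default of ADRS 0x2A is part of what I am asking about; the 10 Gbps data rate is taken from my example further down.

    # Sketch of my interpretation of the formula (assumptions noted below).
    REG_2A = 0x01          # PLACEHOLDER: not the actual default of ADRS 0x2A
    DATA_RATE = 10e9       # 10 Gbps, from the example below

    bits_checked = REG_2A * 4096 * 32           # total bits examined per point
    time_per_point = bits_checked / DATA_RATE   # seconds spent on one point

    print(f"bits checked per point: {bits_checked}")
    print(f"time per point: {time_per_point * 1e6:.3f} us")

With REG_2A = 0x01 this gives 131072 bits and about 13.1 us per point; please correct me if this interpretation is wrong.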

Our customer and I would like to check whether the count read back and calculated from Registers 0x25/0x26 is correct.
*The largest count value I have read is around 6100 (decimal).

For example, if the data rate is 10 Gbps and Register 0x2A is at its default value, how many times does the device check the error count for each point? (i.e., what is the sampling rate?)
Could you please tell us how to calculate this?

Best Regards,
Kawai