For the Delay time between output signals, both minimum and maximum values are specified, and the range between them is very wide. For example, for VOUT1 they are defined as follows:
Why is the range between the minimum and maximum times so wide?
Does this mean that the Delay time is calibrated to somewhere within this minimum-to-maximum range by the IO Delay Recalibration?
Best regards,
Daisuke