I have spent some time trying to understand the details of the calibration done with the DDR3_slave_ratio_search_auto.out binary. There are just two things that I don't fully understand:
1) On our board the DATA_PHY_FIFO_WE_SLAVE_RATIO search finds a minimum value of 0 and a maximum value of about 0x154, resulting in a final value of 0xAA. Why is a minimum of 0 considered valid when the maximum is found to be more than one full clock cycle away? As far as I know, any minimum less than the maximum minus tRPRE is invalid. In my case it still works, because the average of minimum and maximum is less than tRPRE away from the maximum, but it is suboptimal since the final value is not centered within the tRPRE window.
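To make the arithmetic concrete, here is a small sketch of the numbers above. The conversion is an assumption on my part: one full clock is 0x100 slave-ratio units, and the DDR3 read preamble tRPRE is taken as 0.9 tCK.

```python
# Illustration of the FIFO WE slave-ratio search result on our board.
# Assumptions (not from the tool itself): one clock = 0x100 ratio units,
# DDR3 read preamble tRPRE = 0.9 tCK.
CLOCK_UNITS = 0x100
TRPRE_UNITS = int(0.9 * CLOCK_UNITS)  # ~0xE6

ratio_min = 0x000   # minimum found by the search
ratio_max = 0x154   # maximum found by the search

# Final value reported by the tool: the plain average of min and max.
final = (ratio_min + ratio_max) // 2   # 0xAA

# The search window is wider than tRPRE, so the minimum cannot be real:
window = ratio_max - ratio_min
assert window > TRPRE_UNITS

# It still works here only because the average happens to land
# within tRPRE of the maximum:
assert ratio_max - final < TRPRE_UNITS

# A minimum consistent with tRPRE would be at least ratio_max - TRPRE_UNITS,
# and a value centered in that window would sit tRPRE/2 below the maximum.
plausible_min = ratio_max - TRPRE_UNITS
centered = ratio_max - TRPRE_UNITS // 2
print(hex(final), hex(plausible_min), hex(centered))
```

This is why the averaged result works on this board yet is not centered: the invalid minimum of 0 drags the midpoint away from the middle of the true tRPRE window.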
2) The Ratio Seed spreadsheet calculates a seed value for DATA_PHY_FIFO_WE_SLAVE_RATIO that is twice as sensitive to the DQS trace length as to the CK trace length. Why? The FIFO WE slave ratio has to compensate for the time it takes the read command to travel from the CPU to the RAM chip plus the time it takes the result to travel back to the CPU. There is no second transfer on the DQ(S) lines, starting after the result has arrived, that would need to be accounted for in the formula. The equivalent spreadsheet for Keystone I processors (SPRABL2) uses a different formula in which the DQS length and the CK length have equal weight. Is the formula in the AM335x spreadsheet wrong?
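To spell out the discrepancy I mean, here is a sketch contrasting the two weightings as I read the spreadsheets. The trace delays are made-up example values, and the spreadsheets' offset constants and unit conversions are deliberately omitted; only the relative weighting of DQS vs. CK is the point.

```python
# Hypothetical contrast of the two seed weightings described above.
# t_ck / t_dqs are illustrative flight times in ps (example values);
# real seeds also include fixed offsets and unit conversion, omitted here.
t_ck  = 100   # command path: CPU -> RAM on CK/address lines
t_dqs = 120   # return path:  RAM -> CPU on DQ/DQS lines

# AM335x Ratio Seed spreadsheet, as I read it: DQS counted twice.
seed_am335x = t_ck + 2 * t_dqs

# Keystone I spreadsheet (SPRABL2): equal weight, matching the physical
# round trip of one command out plus one data transfer back.
seed_keystone = t_ck + t_dqs

print(seed_am335x, seed_keystone)  # the two formulas clearly disagree
```

The round-trip picture (one transfer out on CK, one transfer back on DQS) matches the equal-weight Keystone formula, which is why the doubled DQS term in the AM335x version looks wrong to me.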