
How to round digital signal timing?

Hello,

I have pasted an oscilloscope capture of a digital signal below. The blue trace is a signal that has been cleaned up from the distorted yellow trace. The problem is that the timing of the signal's HIGHs and LOWs is slightly off. The signal runs at 115,200 bps, so one bit (one HIGH or LOW) works out to roughly 8.6 µs. Some of my HIGHs and LOWs are a microsecond or two too short or too long. I have drawn in white what the corrected signal might look like.

Is there any literature on TI's website on how to correct a digital signal along the time axis? Are there semiconductors that can "round" each pulse to the nearest proper length? For example, if a blue pulse is 24 µs long, it is probably three bits and should be 3 × 8.6 µs = 25.8 µs, not 24 µs. Hopefully that all makes sense. Logically this seems possible to me; I just need a bit of guidance. Thanks for the help.
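In case it helps to make the idea concrete, here is a minimal sketch in Python of the "round to the nearest bit time" logic I have in mind. It assumes a 115,200 bps signal, so the exact bit time is 1/115,200 s ≈ 8.68 µs; the function name is just illustrative:

```python
# Quantize a measured pulse width to the nearest whole number of bit
# periods, assuming 115,200 bps (bit time = 1/115,200 s ~= 8.68 us).
BIT_TIME_US = 1e6 / 115_200  # microseconds per bit, ~8.68 us

def quantize_pulse(width_us: float) -> float:
    """Round a measured HIGH/LOW duration to the nearest multiple of the bit time."""
    bits = max(1, round(width_us / BIT_TIME_US))  # a pulse is at least one bit long
    return bits * BIT_TIME_US

# Example: a 24 us pulse is closest to 3 bit times, i.e. ~26.04 us.
print(quantize_pulse(24.0))
```

(Using the exact 8.68 µs bit time, a 24 µs pulse rounds to about 26.04 µs rather than the 25.8 µs I get with the 8.6 µs approximation, but the principle is the same.)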