Hi,
We are using the Spirit UMTG algorithm to generate call progress tones on a TMS320VC5502 DSP.
The algorithm generates 16-bit linear data, and if we play that 16-bit linear data to the terminal, it plays clearly.
But our system architecture is 8-bit, so for communication within our system we compress the data to A-law or u-law format using the McBSP and pass it to the terminal (analog/digital phones).
On the receiving side we decompress the received data using the McBSP, which converts it back to 16-bit.
When we compress and decompress the data, noise is added to the call progress tones.
We ran another test to confirm that the compression and decompression are responsible for the noise, as follows:
We wrote some test code on a Linux PC that generates a 440 Hz sine wave using the standard sine function and writes the generated data to a "sin.pcm" file.
When we analyze that file in the Cool Edit Pro software, it is crystal clear.
After that, we compress the data using a companding function (the same one used internally by the McBSP) and then decompress it (again, the same as the McBSP).
From the decompressed data we generate a "sine.pcm" file and analyze that file in Cool Edit Pro. In that file, noise is present.
This proves that the compression and decompression are responsible for the noise.
So, is there any solution for removing this noise?
Regards,
Rahul Shah