
C6455 TCP2: De-interleaving interleaved parities

Hello,

I am trying to program TCP2, but there are always some errors in the output. Furthermore, the greater the number of iterations, the greater the number of wrong bits in the output. The best result I have obtained is around 20 error bits out of 2000 bits, with only one iteration.

Maybe, this problem is related to the following sentence:

"Also note that interleaved parities must be de-interleaved prior to being sent to TCP2." on page 22, SPRU973

I do not understand the meaning of that sentence and, therefore, I do not know how to implement an algorithm for it.

Where could I find additional information or any explanation?

Thank you in advance,

David.

  • I found the solution to the problem. Indeed, the problem was related to the interleaved parities: I was not de-interleaving them before sending them to TCP2.

    I found out how to do it and now I get the expected bits in the output.

    Here is the algorithm that de-interleaves the interleaved parities.

    Data in softBitsVector array: X0 A0 A0' X1 A1 A1' ... and the 12 tail soft bits at the end.

    Where,

        X:  systematic

        A:  parity

        A': interleaved parity

    Each entry in the softBitsVector array is a signed char.

     

    /* 1. Extract every third soft bit: the interleaved parities A'. */
    for(i = 0; i < ((softBitsVectorLength-12)/3); i++)
    {
           interleavedParities[i] = softBitsVector[3*i + 2];
    }

    /* 2. Scatter through the interleaver index LUT to restore natural order. */
    for(i = 0; i < ((softBitsVectorLength-12)/3); i++)
    {
           deinterleavedParities[intrlvIndexLut[i]] = interleavedParities[i];
    }

    /* 3. Write the de-interleaved parities back into the A' positions. */
    for(i = 0; i < ((softBitsVectorLength-12)/3); i++)
    {
           softBitsVector[3*i + 2] = deinterleavedParities[i];
    }