
GC5016 synchronization problem


Hello all,

I'm using four GC5016s to implement several tuned filters with a cutoff frequency of 25 kHz. Two of them are configured in 4-channel DDC mode with interleaved IQ output, and the others are configured in 4-channel DUC mode with real output. One of the DUCs is connected to the other through the SUM input port.

Each DDC GC5016 is connected directly to one DUC GC5016, without any kind of FPGA or CPLD.

I'm experiencing problems synchronizing the DDC-DUC interface. I tried using the SOB output of one GC5016 to synchronize the others (in the same way as explained in the datasheet), but it doesn't work. I also tried using two external signals connected to the SIB of each GC5016 and adding a delay between them, in the same way as explained in the TSW4100 user guide. That doesn't work either...

I'm thinking about redesigning the board to add an FPGA between the DDC and DUC, but before doing that I would like to know whether there's any other method to synchronize both ICs.

Can you help me in any way?

Thanks in advance

  • Hello,

    The key to transceiver design for the GC5016 is to look at the AFS, BFS, CFS, DFS signals.  You must also have common divided clock outputs.

    The GC5016 assumes it is the master, talking to baseband (BB) logic.  The DDC outputs the Frame Strobe with the divided clock and the data; see the GC5016 IO Application Note on the TI website.  The GC5016 DUC inputs are likewise based on the output Frame Strobe and divided clock, receiving the data from the BB logic.

    You have several options

        a) the simplest solution is to fully implement the baseband interface to and from a logic device, so that the Frame Strobe and GC5016 clock are used to transfer the Cout and Dout to Ain and Bin.  This requires a small FPGA/CPLD.  It can also require simple tuning of the tinf_fs_dly parameter in the configuration file.

       b) the manual process, just feeding the DDC baseband outputs to the DUC baseband inputs (as in the TSW4100 schematic), needs the CPLD/FPGA to generate delayed synchronization signals.  You monitor the CFS, DFS DDC outputs after initial synchronization, then use logic to delay the Sync Output (which outputs the sync signal) and feed it back to SIB.  The DUC is re-synchronized so that the delayed sync causes the AFS and BFS signals to be one clock cycle later than the CFS, DFS signals, so you can transfer the DDC output to the DUC input.  This delay has to be tuned for each decimation and interpolation ratio.
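
    As a rough illustration only: below is a minimal behavioral sketch in Python (not HDL) of how that delay could be computed.  The frame period, the DDC strobe phase, and the sync-to-strobe latency are placeholder values, not GC5016 specifications; they would have to be taken from the timing diagrams for your actual decimation and interpolation ratios.

```python
# Behavioral sketch (not HDL) of option (b): choose how many clock cycles to
# delay the DDC Sync Out before feeding it back to SIB, so that the DUC frame
# strobes (AFS/BFS) land one cycle after the DDC frame strobes (CFS/DFS).
# frame_period, ddc_strobe_phase and duc_sync_to_strobe are placeholders; the
# real numbers come from the GC5016 timing diagrams for the chosen
# decimation and interpolation ratios.

def required_sync_delay(frame_period, ddc_strobe_phase, duc_sync_to_strobe):
    """Delay (in GC5016 clock cycles) to apply to Sync Out before SIB.

    ddc_strobe_phase is measured in clock cycles after the Sync Out pulse.
    """
    target_duc_phase = (ddc_strobe_phase + 1) % frame_period  # one cycle after CFS/DFS
    # The DUC strobe appears duc_sync_to_strobe cycles after the sync arrives,
    # so back the sync off by that latency (modulo one frame period).
    return (target_duc_phase - duc_sync_to_strobe) % frame_period

if __name__ == "__main__":
    # e.g. a frame strobe every 16 divided clocks (placeholder numbers)
    print(required_sync_delay(frame_period=16, ddc_strobe_phase=5, duc_sync_to_strobe=3))
```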

    I have included the TSW4100 FPGA diagram; the (b) synchronization procedure is described in the TSW4100 User Guide.

    Regards

    Radio Joe

     

    TSW4100 FPGA_description.pdf
  • Hi Radio Joe,

    Thanks a lot for your answer.

    If I understand correctly, all the diagrams you have sent are for implementing the second option, like the TSW4100. So they don't apply to the first option, do they?

    One thing that I don't understand well: why, in the TSW4100 schematics (rev. B1 from February 2007), are channels C&D of the DDC routed to channels A&B of the DUC through the FPGA, while the others are routed directly? Are you using the TDM mode in baseband?

    And one more question about the first option: would it be enough to use some kind of FIFO, clocking each side of the memory with the xFS and xCK signals of the DDC and DUC? In that case, the DDC would write the data when it has them, and the DUC would read them when it needs them... Am I correct?

    Thanks again

    Regards,

    Andrés

  • Hello,

    1) the attachment in the previous post relates to FPGA work; some of it was for the TSW4100.  In the first option, you must control the SIA and SIB pins, and the sync select for the DDC and DUC has to be separated into two groups.  So some FPGA/CPLD is needed to delay the Sync Out pin (used to sync the DUC internally) to the Sync In pin selected for the DDC sync.

    2) The C/D DDC Out pins are used for split IQ, interleaved IQ, or TDM IQ output.  The A/B DUC In pins are used for split IQ, interleaved IQ, or TDM IQ input.  Each IO mode has a specific Frame Sync and data interleaving format that is described in both the datasheet and the GC5016 IO Application Note.  The TSW4100 software and FPGA only use the split IQ or interleaved IQ modes.  A modified FPGA could use the TDM mode.

         - this depends on the number of IO pins used for the data, and on whether the channels share a common IQ rate

    3) If a CPLD or FPGA implementation is used, you can have a FIFO or memory that is written by a state machine that responds to the DDC Frame Strobe and output clock.  Another state machine responds to the DUC Frame Strobe and output clock to read the FIFO or memory.  The tinf_fs_dly has to be adjusted in the software, or the state machine can predict with a modulo counter when the next DUC Frame Strobe occurs.  The DUC expects the data to be sent across with the falling edge of the DUC Frame Strobe; please follow the timing discussions in the GC5016 IO Application Note.
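
    To make that concrete, here is a minimal behavioral model in Python (not HDL) of the two state machines and the FIFO between them.  The class name, the frame period, and the way the modulo counter predicts the strobe are assumptions for illustration; in hardware this would be FPGA/CPLD logic clocked by the divided output clocks.

```python
from collections import deque

# Behavioral sketch of item (3): one process writes the FIFO on the DDC frame
# strobe, another reads it on a predicted DUC frame strobe produced by a
# modulo counter.  Names and the frame period are illustrative placeholders.

class FrameStrobeFifo:
    def __init__(self, duc_frame_period):
        self.fifo = deque()
        self.duc_frame_period = duc_frame_period  # divided clocks between DUC strobes
        self.duc_clock_count = 0

    def ddc_clock_edge(self, ddc_frame_strobe, ddc_data_word):
        # Write side: capture one output word on the DDC frame strobe.
        if ddc_frame_strobe:
            self.fifo.append(ddc_data_word)

    def duc_clock_edge(self):
        # Read side: the modulo counter predicts the next DUC frame strobe so
        # the word can be driven onto the DUC input port in time for the
        # falling edge of the strobe.
        self.duc_clock_count = (self.duc_clock_count + 1) % self.duc_frame_period
        if self.duc_clock_count == 0 and self.fifo:
            return self.fifo.popleft()  # present this word to Ain/Bin
        return None
```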

    Regards,

    Radio Joe

  • "    - this depends on the number of IO pins used for the data, and if the channels are a common IQ rate

    3) If a CPLD or FPGA implementation is used, you can have a FIFO or memory that is written from a statemachine that responds to the DDC Frame strobe, and Output clock.  Another statemachine responds to the DUC Frame Strobe and Output clock to read the FIFO or memory.  The tinf_fs_dly has to be adjusted in the software, or the state machine can predict with a modulo counter when the next DUC Frame strobe occurs.  The DUC expects the data to be sent across with the falling edge of the DUC Frame strobe, please follow the timing discussions in the GC5016 IO Application note."


    What I want is to use the four baseband channels of the DDC and DUC in interleaved IQ mode, at the same rate. So, is it possible to implement this solution? And what is the problem with implementing this solution at different channel rates?

    Thanks again for your answers, you're helping me a lot

  • Hello,

    If you have (2) GC5016s, or a single GC5016 AB-Tx,CD-Rx, you can have different IO options:

        a) split IQ mode only supports two channels when using (2) devices, or one channel in each direction when using one device

        b) the differences between interleaved IQ and TDM IQ are:

             - in TDM IQ ALL channels must be the same IQ rate

            - in TDM, the DDC uses the DOut and DFS signals; the DUC uses the AIn and AFS signals

            - in TDM mode, the direct linkage of the DOutput to AInput port without an FPGA is not functional due to timing delays, and the channel order (see timing diagrams)

           - there is a divided clock (ACK, BCK, CCK, DCK); this has divider limitations in that the number of divided clocks must be an integer for the decimation and interpolation selected

           - if you are trying to use fewer IO pins, with medium to high decimation, and you are using an FPGA, the 8-pin interface mode can be used; this reduces the IO pin count for each channel.  You would need the xFS, xCK, and top 8 data pins from each In and Out port.  (Note: you can just buffer the CK to the (2) GC5016s, ADC, FPGA, and DAC, so the FPGA can share the exact input clock; then the divided clock is not required.  Most ADCs can't drive the GC5016 capacitive load at the clock input.)

             FPGA master signals: GC5016 buffered clock, Sync Output (as FPGA input from the GC5016 DDC), (2) Sync Outputs to the SIA signal on both GC5016s

            (using the interleaved IQ mode)  16 bits, 8 pins

            GC5016 DDC Output: Aout[15..8], AFS, ACK,  Bout[15..8], BFS, BCK,  Cout[15..8], CFS, CCK,  Dout[15..8], DFS, DCK  (these are all FPGA inputs)

            GC5016 DUC Input: Ain[15..8], Bin[15..8], Cin[15..8], Din[15..8]; DUC outputs AFS, ACK, BFS, BCK, CFS, CCK, DFS, DCK

            (using the TDM mode)  16 bits, 16 pins

            GC5016 DDC Output: Dout[15..0], DFS, DCK  (these are all FPGA inputs)

            GC5016 DUC Input: Ain[15..0]; DUC outputs AFS, ACK

    Regards,

    Radio Joe 

  • Thank you Radio-Joe.

    Could you please tell me the value of the sync_mode command used in the TSW4100, in both the DDC and DUC config files?

    Regards, Andrés

  • Hello,

    The TSW4100 software has multiple program sections for the sync select signals.  The initial configuration is set to sync_mode 9, which does not use sync select signals.

    Then the programming is changed to resync the cic_sync, fir_sync, and output_sync for the specific DDC or DUC section from the GUI code.  Please contact kenchan@ti.com for more TSW4100 GUI code details.  Obtaining the GUI code (other than the compiled version) may require an NDA.

    Regards,

    Radio Joe