Other Parts Discussed in Thread: CC3200
Hello,
I am having some difficulty understanding the output format of the DOUT pin on the DDC118.
Here is my setup. I am using a TI CC3200 LaunchPad as the master microcontroller, which generates the following signals.
- Output Pins
- CLK speed = 100 kHz
- DCLK speed = 50 kHz
- CONV period = 500 µs
- The TEST pin is held high and pulsed twice (5 µs pulse width each) to generate 22 pC inside the DDC118.
- Input Pins
- DVALID' (active low). I poll this pin in a while loop to detect when the DDC118 pulls it low.
- MISO. I use an SPI peripheral, connecting the SPI clock to DCLK and MISO to DOUT, to read the output data from the DDC118.
I am attaching two screenshots to demonstrate my power-up sequence and the output format of DOUT after the 10th CONV cycle.
Here is my power-up sequence with the following initial values.
- FORMAT = '0b' (16-bit output)
- RANGE = '111b'
- HISPD/LOWPR = '1b'
- CLK_4x = '0b'
- TEST = '1b'
- CONV = 500 µs
- CLK = 100 kHz
- DVALID = 'active low' (when data is ready)
- DOUT = 'output format'
- The screenshot starts as soon as I pull RESET low to restart the DDC118; I then wait 30 ms before applying the pin states listed above.
Here is a screenshot of the 10th CONV cycle.
- This screenshot shows more clearly how the RESET, TEST, CONV, CLK, DVALID, DCLK, and DOUT pins behave, but I do not understand the output on DOUT.
- From my understanding of the datasheet:
- A 22 pC input charge should translate to a code of 0x1017, but instead I am reading 0x0106 from DOUT.
- I need to clock DCLK with 128 falling edges (16 bits x 8 channels, since FORMAT = '0b' implies 16-bit output per channel) to shift out every bit, starting with the MSB of Channel 8 down to its LSB, then repeating for Channels 7, 6, 5, 4, 3, 2, and finally Channel 1.
Can someone help me understand the output format of DOUT, or tell me whether my timing is incorrect?
Thank you.