We are reissuing a percussion sound module and updating the trigger inputs. We were using a TLC2543 but need an ADC with all the same specs and a minimum of 14 channels.
Looking for the following:
- SAR analog-to-digital converter
- multi-channel muxed input
- 13+ channels
- single-ended conversion
- up to 5V operation
- 8-bit resolution
- SPI interface
- must be able to read in 8 SPI/serial clock cycles
Is there a workaround for either the ADS7961SRHBT or ADS7961QDBTRQ1 that would allow us to reduce the serial clock count to 8?
Hi Tom,
Thanks for your response.
I have a product that uses an older TI ADC which can grab an 8-bit sample in 8 serial clocks. I am using a relatively slow processor (8051 @ 18 MHz) which bit-bangs the SPI interface and this costs 68 machine cycles as the code is today. I need to move to the newer TI ADC (ADS7961) to get 16 channels and I am just trying to minimize the time it takes to acquire an 8-bit sample.
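For concreteness, the bit-banged read described above can be sketched in C. This is a host-testable sketch, not the actual 8051 code: the `spi_pins` callbacks and `mock_*` helpers are hypothetical stand-ins for the real port-bit accesses, each of which costs on the order of one machine cycle on the 8051.

```c
#include <stdint.h>

/* Hypothetical stand-ins for the real 8051 port-bit accesses. */
typedef struct {
    void (*set_sclk)(int level);  /* drive SCLK high (1) or low (0) */
    void (*set_cs)(int level);    /* drive chip-select              */
    int  (*get_sdo)(void);        /* sample the ADC's SDO pin       */
} spi_pins;

/* Bit-bang one 8-clock frame, MSB first, grabbing each data bit
 * during the clock-low phase that follows the falling edge. */
static uint8_t spi_read8(const spi_pins *p)
{
    uint8_t value = 0;
    int bit;
    p->set_cs(0);                                /* assert CS (active low) */
    for (bit = 7; bit >= 0; bit--) {
        p->set_sclk(1);                          /* rising edge            */
        p->set_sclk(0);                          /* falling edge           */
        value |= (uint8_t)(p->get_sdo() << bit); /* pick off the bit       */
    }
    p->set_cs(1);                                /* deassert CS            */
    return value;
}

/* ---- host-side mock so the sketch can be exercised off-target ---- */
static uint8_t mock_word;
static int     mock_bit;
static void mock_load(uint8_t v)     { mock_word = v; mock_bit = 7; }
static void mock_set_sclk(int level) { (void)level; }
static void mock_set_cs(int level)   { (void)level; }
static int  mock_get_sdo(void)
{
    int b = (mock_word >> mock_bit) & 1;
    if (mock_bit > 0) mock_bit--;
    return b;
}
```

The real inner loop would be unrolled 8051 instructions rather than a loop through function pointers, which is where the 68-cycle figure comes from.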
My engineer has made significant progress on this and it looks like 16 serial clocks won't cost us much, if anything, in performance. However, to follow through on your reply: Yes, clearly after 12 clocks the LS-bit of the 8-bit sample data would have been acquired. But, the TI documentation suggests that the full 16 serial clocks may still be required in order to
- get the SDO line to tri-state (we don't care about that)
- complete the conversion so that its value can be clocked out in the following frame
It is unclear whether or not an early-terminated serial frame will leave the SDI bits in their proper destination positions. Given the way most serial interfaces work, I would guess that these bits would not land in their proper positions when a frame is prematurely terminated. But we'd need insight into the TI Rx state machine to know this for sure.
Note: We intend to use Auto-2 mode, so whenever we clock-in a conversion result, we will clock-out 0000b for the first 4 bits (bits 15-12), i.e., no mode change. That means that none of the bits clocked into the ADC after that (bits 11-0) have any significance. Still, if we terminate the frame early, it is possible that these top 4 bits would not shift up into proper position before termination.
Final question: it does not seem like the ADS7961 has an internal oscillator used to clock the conversion. In other words, it looks like the SAR conversion itself is driven by the serial clock. If so, does this clock need to have a consistent frequency/duty-cycle? As I said, we are bit-banging this interface. Clock timing will be inconsistent, but it will easily meet the minimum clock-hi/lo duration requirements.
PS (To be more specific, let's say I initialize the ADS7961 using full 16-clock frames, and I set it to Auto-2 mode. Then, I 'prime' the command shift register by writing 0000_0000_0000_0000b. Could I then continuously read the A/D with 12 clocks, writing 0000_0000_0000b? The answer to this may depend on what actually clocks the conversion itself, internally. Is there an eval board out there that we could mess with?)
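To capture the Auto-2 framing being discussed, here are two small helpers. This is a sketch based on the framing described in this thread (channel address on DO15-12, 8-bit result on DO11-4, so the LSB is out after 12 clocks); verify the bit positions against the ADS7961 datasheet before relying on them.

```c
#include <stdint.h>

/* In Auto-2 mode, clocking in DI15-12 = 0000b leaves the mode
 * unchanged, so the steady-state command word is all zeros. */
#define ADS7961_AUTO2_CONTINUE 0x0000u

/* Channel address, assumed on DO15-12 of the captured frame. */
static uint8_t ads7961_channel(uint16_t frame)
{
    return (uint8_t)(frame >> 12);
}

/* 8-bit conversion result, assumed MSB-justified on DO11-4. */
static uint8_t ads7961_sample(uint16_t frame)
{
    return (uint8_t)((frame >> 4) & 0xFF);
}
```

With this layout, a frame of 0x5A70 would decode as channel 5, sample 0xA7.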
best
Al
Tom,
I think we will give up on trying to do this in 12 clocks. It seems risky, it's not normal usage, and the gains are not big enough for us to worry about. I do have 2 follow-ups, though:
1) I am confused about REF+/- vs VA+/AGND and the external 'nominal' 2.5V reference. I want the conversion range to be 3.0V. I have a signal that is 0-to-3V and we need 3V to be full-scale digital. Can you tell me how to do this and how the nominal 2.5V VREF figures in here?
2) Regarding the consistency of the SPI serial clock, and more specifically the exact way we drive this interface, please see the explanation and questions in the attached document below. (I couldn't find a way to attach the PDF, so I had to cut and paste.)
Best
Al
---------------------------------------------------------------------------------------------------------------------------------------------------------------------
The A/D SPI interface is being driven by microcontroller instructions, not a
SPI peripheral interface. We assume this is not uncommon.

Each short SCLK phase illustrated below is 500 ns, which is one machine cycle
on our microcontroller. It takes us one cycle to set the clock and one cycle to
clear it, so the maximum clock rate is 1 MHz. The longer clock-low
phases illustrated below are 1000 ns.

It takes us 1 cycle to grab a conversion bit from SDO, so 8 of the clock-low
phases are elongated by one cycle. So, for starters, we have an SPI interface
with an inconsistent clock frequency and a non-50% duty cycle.

The numbers shown for SDO indicate where we pick off the sample bits.
Also, after init, SDI is always low.

Current code (42 machine cycles):

CS    -_________________________________________
SCLK  __-_-_-_-__-__-__-__-__-__-__-__-_-_-_-_-__
SDO            7  6  5  4  3  2  1  0
SDI   ___________________________________________

We can make the clock frequency consistent by padding the other clock-low
phases.

Proposed mod 1 (49 machine cycles):

CS    -________________________________________________
SCLK  __-__-__-__-__-__-__-__-__-__-__-__-__-__-__-__-__
SDO               7  6  5  4  3  2  1  0
SDI   __________________________________________________

And, we can pad the clock-high times to make the clock duty cycle 50%.

Proposed mod 2 (67 machine cycles):

CS    -________________________________________________________________
SCLK  __--__--__--__--__--__--__--__--__--__--__--__--__--__--__--__--__
SDO                  7   6   5   4   3   2   1   0
SDI   __________________________________________________________________

So, the question is: since the SPI clock is actually what clocks the
conversions, is there a benefit to mod 1 in terms of quality of
conversion? If so, in what way would it improve the conversion? As for
mod 2, I seriously doubt it has any benefit, but it is included for
completeness.

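To make the mod 1 padding concrete, here is a host-side sketch that models each 500 ns machine cycle as a counter tick. The `cycle()`, `sclk_hi()`, etc. helpers are hypothetical stand-ins for real instructions, and the count covers only the 16 clock periods themselves; CS handling and setup account for the extra cycle in the 49 quoted above.

```c
/* Model of the "mod 1" frame: every clock-low phase is padded to the
 * same length, so SCLK has a consistent period (3 cycles each) even
 * though the duty cycle is not 50%.  cycle() stands in for one 500 ns
 * machine cycle (a NOP on the 8051); here it just counts ticks. */
static unsigned cycles;
static void cycle(void)      { cycles++; }
static void sclk_hi(void)    { cycle(); }  /* set SCLK:   1 cycle */
static void sclk_lo(void)    { cycle(); }  /* clear SCLK: 1 cycle */
static void sample_sdo(void) { cycle(); }  /* read a bit: 1 cycle */

/* One 16-clock frame: 8 command clocks plus 8 data clocks.  The 8
 * non-data lows get one NOP pad so all 16 periods are equal. */
static unsigned frame_mod1(void)
{
    int i;
    cycles = 0;
    for (i = 0; i < 16; i++) {
        sclk_hi();
        sclk_lo();
        if (i < 8) cycle();       /* pad: no SDO bit to grab yet */
        else       sample_sdo();  /* data clocks: grab the bit   */
    }
    return cycles;  /* 16 periods x 3 cycles = 48 for the clocking */
}
```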
OK Tom, thanks - I think I get it. To be clear, I do not have a 3.3V rail.
I have a 5V digital voltage rail and GND, I have a 5VA and GNDA, I
have a 3.0V reference voltage, and I have a signal that is hard-constrained
to 0-3.0V.
(This is because my source signal op-amp has a 5V supply and its max V-out
is limited to 3V.)
So, I want to use my 5VA for the VA source, I want to use my 3.0V reference
for Vref, and I want to operate in '0-Vref' mode. The spec indicates that a
3.0V Vref is legitimate, even though 2.5V is nominal. Will this work, and if so,
do you have any other suggestions?
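Assuming the 0-to-Vref range with Vref = 3.0 V works out, scaling a result code back to millivolts is a small integer computation. A sketch, with `code_to_mv` a hypothetical helper, assuming the usual SAR transfer function of 1 LSB = Vref/256 for an 8-bit result:

```c
#include <stdint.h>

#define VREF_MV 3000u  /* external 3.0 V reference, in millivolts */

/* Convert an 8-bit result code to millivolts without floating
 * point (cheap enough for an 8051): mV = code * Vref / 256.
 * Full scale (0xFF) lands one LSB below Vref, as expected. */
static uint16_t code_to_mv(uint8_t code)
{
    return (uint16_t)(((uint32_t)code * VREF_MV) / 256u);
}
```

So a mid-scale code of 0x80 reads back as 1500 mV with this scaling.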
Also, any comment on the attachment I sent last time regarding the SPI
interface clocking?
Best
Al
Tom,
Thank you for all the info. It proved to be extremely helpful, and your assistance is very much appreciated.
Best
Al