Lab #15:

Analog-to-Digital Conversion


This lab explores the operation of a successive-approximation analog-to-digital converter (ADC). ADCs convert an analog input voltage into a binary number, delivered as a digital output. This particular flavor of ADC accomplishes the conversion by rapidly comparing the input voltage to a series of internal reference voltages generated by an onboard DAC. When the DAC's output matches the input voltage, the digital code that produced the match is the output of the ADC.


The internal search logic of the ADC works as follows:

  1. The most significant bit (MSB) of the internal DAC is set high. If the input signal is higher than the DAC's output (as judged by an on-board comparator), the MSB is left high; otherwise it is set low.

  2. The next-most-significant bit is set high, the DAC's output is compared to the input, and the bit is left high if the input is still larger than the DAC's output.

  3. Wash, rinse, repeat for all remaining bits, with the LSB done last.
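The search described above can be sketched in a few lines of Python. This is an idealized model, not code for the chip itself; the 8-bit width, 5 V reference, and function names are assumptions for illustration:

```python
# Idealized model of the SAR search: try each bit from MSB to LSB,
# keep it high only if the input is still at or above the trial DAC output.

def sar_convert(v_in, n_bits=8, v_ref=5.0):
    """Return the n-bit code an ideal successive-approximation ADC produces."""
    code = 0
    for bit in range(n_bits - 1, -1, -1):      # MSB first, LSB last
        trial = code | (1 << bit)              # tentatively set this bit
        v_dac = v_ref * trial / (1 << n_bits)  # ideal internal DAC output
        if v_in >= v_dac:                      # comparator: input >= DAC?
            code = trial                       # yes: leave the bit high
    return code

print(sar_convert(2.5))   # mid-scale input -> code 128
```

Note that the search takes exactly one comparison per bit, which is why the chip's conversion time scales with its resolution rather than with 2^N.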


The logic which carries out this sequence, comparing the resulting DAC output to the input and deciding whether to leave each bit high, is called a successive approximation register (SAR). The chip implementing this for today's lab is the ADC0804, an 8-bit converter. The data sheet is available (and long!).


1) Begin by examining the data sheet for the ADC0804. The pinout is on the first page. Note the analog inputs, the digital outputs, and the control lines. Specs and limits are on the first and second pages, and the timing diagram is on the third page. Check the maximum voltage ratings for the input and control lines. Assemble the basic ADC system with Vcc = 5V and leave Vref/2 disconnected, thereby using the power supply as the reference voltage. Skip past the many pages of sample circuits for now, and proceed to the Functional Description. Determine the functions and proper connections for the CS* (chip select), WR* (write, which starts a conversion), RD* (read), and INTR* (interrupt) lines. You will also need to provide an external RC network to run the chip's internal clock - consult the data sheet (§2.6) and select R and C so that the internal clock runs at 200kHz. Note that the chip needs about 70 clock cycles to complete a digitization.
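As a sanity check on your R and C choices, the datasheet gives an approximation for the self-clocked frequency, f_CLK ≈ 1/(1.1·R·C). The specific component values below are one plausible pair, not a prescribed choice; verify against the datasheet's recommended ranges:

```python
# Sanity check of RC values for the ADC0804's self-clock, assuming the
# datasheet approximation f_CLK ~ 1/(1.1*R*C).

def clock_freq(r_ohms, c_farads):
    return 1.0 / (1.1 * r_ohms * c_farads)

# Example (assumed) values: R = 10 kOhm, C = 455 pF gives roughly 200 kHz
f = clock_freq(10e3, 455e-12)
print(f"f_CLK = {f/1e3:.0f} kHz")

# At ~70 clocks per conversion, the conversion time is about:
t_conv = 70 / f
print(f"t_conv = {t_conv*1e6:.0f} us")   # a few hundred microseconds
```

This conversion time sets an upper bound on how fast the free-running mode of part 3 can update, which is worth keeping in mind for part 4's frequency-limit question.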


2) Provide an input voltage within the allowed range, and manually manipulate the input control signals CS*, WR*, and RD* to make the system perform a single A/D conversion. Watch INTR* as this happens. Once the conversion is complete, examine the outputs to see what the answer is. A test circuit using LEDs to do this is shown in the data sheet.


3) Once the basic operation is confirmed and the LED display is operational, wire the control lines so that the chip is put into continuous, or free-running, mode. This mode sets the ADC to performing continuous conversions, updating the output latch as frequently as it is able. You will have noticed the instructions for doing this while reading the functional description of the inputs in the data sheet. With the system in free-running mode and the LEDs hooked up, determine the smallest voltage change this system can detect by slowly varying the input voltage. Note the input voltage at which the LEDs change, and continue to vary the input until the LEDs change again - note this voltage as well. Check this ΔV for input voltages near 0V, 2.5V, and 5V. Compare these results with the expected resolution of the ADC, given a 5V reference and 8 bits of resolution. Make a plot of the binary number represented by the output as a function of input voltage, measured with a DVM, as the input is slowly varied over a narrow range (say, 0.0 to 0.2V). You should get a staircase-like result - are the step sizes what you expect? Is the output linear (within the expected resolution) over this range? Finally, explore modifying the input range by altering the reference voltage. Use a trimpot to provide 1.25V to the Vref/2 pin. This sets the converter's range to 0 to 2.5V. See if the smallest ΔV is indeed ½ of what it was with Vref = 5V.
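Before measuring, it is worth computing what you expect to see. The sketch below models an ideal 8-bit transfer function (illustrative only; it ignores the real chip's offset and gain errors):

```python
# Expected resolution: for an n-bit converter the LSB step is Vref / 2^n.

def lsb(v_ref, n_bits=8):
    return v_ref / (1 << n_bits)

print(lsb(5.0))    # ~0.0195 V with the 5 V supply as reference
print(lsb(2.5))    # ~0.0098 V with Vref/2 = 1.25 V (i.e. Vref = 2.5 V)

def ideal_code(v_in, v_ref=5.0, n_bits=8):
    """Ideal ADC transfer function: the code a given input lands on."""
    code = int(v_in / lsb(v_ref, n_bits))
    return min(code, (1 << n_bits) - 1)    # clamp at full scale

# The staircase over the suggested 0.0 to 0.2 V sweep:
for mv in (0, 20, 40, 60, 80, 100, 200):
    v = mv / 1000.0
    print(f"{v:.3f} V -> code {ideal_code(v)}")
```

So with Vref = 5V each LED step should correspond to roughly 20 mV of input change, and halving the reference should halve the step size.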


4) Design a “front end” for your ADC using an op-amp or two to allow inputs in the range -10 to +10V to be safely converted to the 0 to 5V range expected by the ADC. Use the op-amps to compress and shift the larger input range to the one tolerated by the ADC. Test the output of this “signal conditioning” circuit before feeding it to your ADC. The digital outputs of the ADC are now “offset binary” representations of the analog signal being input to the op-amps. Test this circuit by using the ADC output lines to drive the DAC0808 from the last lab, and feed the op-amps a sine wave of the appropriate amplitude. Compare the analog output of the DAC with the original sine wave and sketch the signals. How high a frequency can this system accept before the output becomes too distorted? What part of your system dominates this frequency limit? A square wave input might make seeing where the output deviates from the input easier here.
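The compress-and-shift arithmetic can be checked numerically before you build it. A sketch, assuming a gain of 1/4 and a +2.5 V offset (one of several valid op-amp realizations of the -10..+10 V to 0..5 V mapping):

```python
# Level-shift arithmetic the op-amp "front end" must realize:
# compress -10..+10 V into 0..5 V (assumed gain 1/4, offset +2.5 V).
# The resulting ADC code is then an offset-binary representation
# of the bipolar input.

def front_end(v_in):
    return v_in / 4.0 + 2.5       # attenuate by 4, shift up 2.5 V

def adc_code(v, v_ref=5.0, n_bits=8):
    code = int(v / (v_ref / (1 << n_bits)))
    return max(0, min(code, (1 << n_bits) - 1))

for v in (-10.0, 0.0, 10.0):
    print(f"{v:+.0f} V -> {front_end(v):.2f} V -> code {adc_code(front_end(v))}")
# -10 V maps to 0 V (code 0), 0 V to 2.5 V (code 128), +10 V to 5 V (code 255)
```

In offset binary, code 128 marks zero volts at the op-amp input, which is what makes the DAC0808 playback test in this part easy to interpret.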