Digital-to-analog converters (DACs) and analog-to-digital converters (ADCs) are used to interface microprocessors and computers with the analog world. A DAC produces a unique analog voltage for each digital input; the output voltage is the weighted sum of the input bits, with every bit in the straight binary word weighted according to its position. For example, if a 5-bit DAC produces 0.2 V for the digital input 00010 (decimal 2), then one step corresponds to 0.1 V, and the input 11111 (decimal 31) gives 31 × 0.1 V = 3.1 V. Resolution is defined as the smallest change in the analog output resulting from a one-bit (LSB) change in the binary input; it is also known as the
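The weighted-sum behaviour is easy to model. Here is a minimal Python sketch of an ideal binary-weighted DAC, using the step size worked out in the example above (the function name and variables are illustrative, not from any particular device):

def dac_output(bits, lsb_volts):
    """Ideal DAC: output is the sum of the binary weights of the set bits,
    scaled by the resolution (1 LSB) in volts."""
    n = len(bits)
    code = sum(int(b) * 2**(n - 1 - i) for i, b in enumerate(bits))
    return code * lsb_volts

# Example from the text: 0.2 V for input 00010 implies 1 LSB = 0.1 V.
lsb = 0.2 / 2                        # code 00010 = decimal 2
print(dac_output("00010", lsb))      # 0.2 V
print(dac_output("11111", lsb))      # 31 * 0.1 V = 3.1 V (full scale)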
step size. The parameters that influence DAC performance are linearity, resolution, settling time, monotonicity, accuracy, temperature coefficient and conversion rate. An ideal DAC is perfectly linear; a real DAC exhibits four types of non-linearity: integral non-linearity, differential non-linearity, scale (gain factor) non-linearity and offset non-linearity.
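To make the first two error types concrete, the sketch below shows the usual way differential non-linearity (DNL) and integral non-linearity (INL) are computed from a table of measured DAC output voltages. The measured values are invented for illustration, and the simple formulas assume offset and gain error have already been removed:

# DNL(i) = (V[i+1] - V[i]) / LSB - 1   (deviation of each step from 1 LSB)
# INL(i) = (V[i] - i * LSB) / LSB      (deviation from the ideal straight line)

def dnl_inl(measured, lsb_volts):
    dnl = [(measured[i + 1] - measured[i]) / lsb_volts - 1
           for i in range(len(measured) - 1)]
    inl = [(v - i * lsb_volts) / lsb_volts for i, v in enumerate(measured)]
    return dnl, inl

# Hypothetical measurements for a 3-bit DAC with a 0.1 V ideal step:
measured = [0.00, 0.10, 0.21, 0.29, 0.40, 0.51, 0.60, 0.70]
dnl, inl = dnl_inl(measured, 0.1)
print(["%.2f" % d for d in dnl])
print(["%.2f" % v for v in inl])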
An analog-to-digital converter (ADC) performs the complementary tasks of sampling, quantization and encoding into a digital format. The main types of ADC are the successive approximation ADC, the dual-slope integrating ADC, the parallel (flash) ADC, the counter ADC and the voltage-to-frequency converter ADC.
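As a rough illustration of the conversion process, the sketch below models a successive approximation ADC in Python: it trials one bit at a time from the MSB down, keeping a bit whenever the trial DAC voltage does not exceed the input. The reference voltage and bit width are assumptions chosen to match the earlier 0.1 V step example:

def sar_adc(v_in, n_bits, v_ref):
    """Successive approximation: decide bits from MSB to LSB by
    comparing the input against an internal trial DAC voltage."""
    code = 0
    for bit in range(n_bits - 1, -1, -1):
        trial = code | (1 << bit)
        v_trial = trial * v_ref / (2**n_bits)  # trial DAC output
        if v_in >= v_trial:
            code = trial                       # keep this bit
    return code

# 5-bit conversion of 2.0 V with a 3.2 V reference (1 LSB = 0.1 V):
code = sar_adc(2.0, 5, 3.2)
print(format(code, "05b"))   # 10100, i.e. 20 steps * 0.1 V = 2.0 V

An n-bit conversion finishes in exactly n comparisons, which is why the successive approximation ADC is a popular compromise between the speed of a flash ADC and the simplicity of a counter ADC.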