Abstract:
In some embodiments, an analog-to-digital converter (ADC) comprises a loop filter configured to produce an error signal based on a difference between an analog input signal and a feedback signal. The ADC also comprises a main comparator set comprising one or more main comparators, the main comparator set configured to digitize the error signal and further configured to drive a main digital-to-analog converter (DAC). The ADC further comprises an auxiliary comparator set comprising a plurality of auxiliary comparators, the auxiliary comparator set configured to digitize the error signal when the ADC is in a runaway state and further configured to drive an auxiliary DAC to bring the error signal into a predetermined range.
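The following is a minimal behavioral sketch (in Python, not taken from the patent) of the main/auxiliary comparator arrangement, assuming a first-order loop filter, a 1-bit main quantizer, and illustrative constants (RUNAWAY_LIMIT, AUX_STEP) standing in for the predetermined range and the auxiliary DAC step:

```python
RUNAWAY_LIMIT = 2.0   # error magnitude that marks the runaway state (illustrative)
AUX_STEP = 1.0        # auxiliary DAC step size (illustrative)

def convert(samples):
    integrator = 0.0  # loop-filter state (first-order model)
    bits = []
    for x in samples:
        # Main comparator set: 1-bit quantizer; its decision drives the main DAC.
        main = 1.0 if integrator >= 0 else -1.0
        feedback = main
        # Auxiliary comparator set: engaged only when the error has run away;
        # its decisions drive an auxiliary DAC that pulls the error signal
        # back into the predetermined range.
        if abs(integrator) > RUNAWAY_LIMIT:
            feedback += AUX_STEP if integrator > 0 else -AUX_STEP
        # Loop filter: accumulate the difference between input and feedback.
        integrator += x - feedback
        bits.append(1 if main > 0 else 0)
    return bits

print(convert([0.3] * 8))
```

Because the auxiliary path only fires while the integrator state is outside the predetermined range, the modulator behaves as a conventional single-bit loop in normal operation.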
Abstract:
Described is an apparatus which comprises: a digital-to-analog converter (DAC) having a DAC cell with p-type and n-type current sources and an adjustable strength current source which is operable to correct non-linearity of the DAC cell caused by both the p-type and n-type current sources; and measurement logic, coupled to the DAC, having a reference DAC cell with p-type and n-type current sources, wherein the measurement logic is to monitor an integrated error contributed by both the p-type and n-type current sources of the DAC cell, and wherein the measurement logic is to adjust the strength of the adjustable strength current source according to the integrated error and currents of the p-type and n-type current sources of the reference DAC cell.
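A rough sketch of the calibration loop as described, assuming simple integral control; the function name, loop gain, and step count below are illustrative assumptions, not details from the abstract:

```python
def calibrate_cell(i_p, i_n, i_p_ref, i_n_ref, steps=32, gain=0.25):
    """Adjust a trim current so the DAC cell's net error approaches zero.

    i_p, i_n         -- p-type / n-type source currents of the DAC cell
    i_p_ref, i_n_ref -- currents of the reference DAC cell
    """
    trim = 0.0               # adjustable-strength current source
    integrated_error = 0.0
    for _ in range(steps):
        # Measurement logic: error of the cell (plus trim) relative to the
        # reference cell, contributed by both p- and n-type sources.
        error = (i_p - i_n + trim) - (i_p_ref - i_n_ref)
        integrated_error += error
        # Adjust the trim strength according to the integrated error.
        trim -= gain * integrated_error
    return trim

print(calibrate_cell(1.02, 0.99, 1.00, 1.00))  # converges toward trim = -0.03
```

Because the loop integrates the error before trimming, a static mismatch between the p-type and n-type sources is driven toward zero regardless of its sign.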
Abstract:
An analog-to-digital converter (ADC) and method of operation thereof are provided for converting an analog signal to a digital signal. The ADC utilizes Correlated Electron Material (CEM) devices that may contain a transition metal oxide (TMO), such as Nickel Oxide (NiO). The ADC may include an interconnect circuit that is operable to couple a power supply to the CEM devices. The power supply is controlled to program the resistance of the CEM devices and thereby control performance characteristics of the ADC.
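As an illustration only, here is a toy model of supply-controlled resistance programming; the SET/RESET voltages and resistance states are hypothetical, and real CEM programming conditions differ:

```python
SET_V, RESET_V = 1.2, 0.6     # hypothetical programming voltages
R_LOW, R_HIGH = 1e3, 1e6      # hypothetical CEM resistance states (ohms)

class CemDevice:
    def __init__(self):
        self.resistance = R_HIGH
    def apply_supply(self, volts):
        # The interconnect circuit couples the supply to the device; the
        # applied voltage programs a low- or high-resistance state.
        if volts >= SET_V:
            self.resistance = R_LOW
        elif volts <= RESET_V:
            self.resistance = R_HIGH

dev = CemDevice()
dev.apply_supply(1.2)          # program the low-resistance state
c = 10e-9                      # fixed capacitance (farads)
tau = dev.resistance * c       # RC time constant sets ADC timing behavior
print(f"R = {dev.resistance:.0f} ohm, tau = {tau:.2e} s")
```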
Abstract:
A method of operating a time-interleaved analog-to-digital converter for conversion of an analog input signal to a digital output signal having a sample rate R comprises, for each of at least some activations of an array of constituent analog-to-digital converters: defining first and second sets of the constituent analog-to-digital converters; feeding the analog input of each analog-to-digital converter of the first set with a reference value for imperfection measurements and clocking each analog-to-digital converter of the first set with one of the timing signals; and feeding the analog input of each analog-to-digital converter of the second set with the analog input signal for generation of an intermediate constituent digital output signal at the digital output and clocking each analog-to-digital converter of the second set with one of the timing signals, wherein no timing signal is used to clock two or more analog-to-digital converters of the second set.
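A hedged sketch of one scheduling policy consistent with the abstract, assuming a simple round-robin rotation in which one constituent ADC per activation is diverted to the reference while the rest convert the input, each on its own timing signal; the names are illustrative:

```python
def schedule(num_adcs, num_activations):
    """Yield (first_set, second_set, clock_map) per activation."""
    for k in range(num_activations):
        first = {k % num_adcs}                    # fed the reference value
        second = [i for i in range(num_adcs) if i not in first]
        # Each ADC of the second set gets its own timing signal: no timing
        # signal clocks two or more ADCs of the second set.
        clock_map = {adc: phase for phase, adc in enumerate(second)}
        yield sorted(first), second, clock_map

for first, second, clocks in schedule(num_adcs=4, num_activations=4):
    print("calibrate:", first, " convert:", second, " clocks:", clocks)
```

Rotating which ADC is in the first set lets every constituent be measured over time while the remaining ADCs sustain the sample rate.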
Abstract:
A circuit includes an input dispatch unit for receiving an input signal and a calibration signal and outputting N dispatched signals in accordance with a selection signal. The circuit also includes N analog-to-digital converter (ADC) units for receiving the N dispatched signals, N control signals, and N mapping tables and outputting N raw data and N refined data, respectively. An output dispatch unit receives the N refined data and outputs output data in accordance with the selection signal, and a calibration controller receives the N raw data and outputs the selection signal, the N control signals, the N mapping tables, and a digital code. A digital-to-analog converter (DAC) receives the digital code and outputs the calibration signal, wherein the one of the dispatched signals specified by the selection signal is derived from the calibration signal while the other dispatched signals are derived from the input signal.
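A behavioral sketch of the dispatch path, assuming toy software models for the ADC units and initially empty mapping tables; the 8-bit quantizer and the sample values are illustrative assumptions:

```python
N = 4

def input_dispatch(input_sample, cal_sample, selection):
    # The selected channel receives the calibration signal; the other
    # dispatched signals come from the input signal.
    return [cal_sample if ch == selection else input_sample for ch in range(N)]

def adc_unit(dispatched, mapping_table):
    raw = int(round(dispatched * 255))     # toy quantizer -> raw data
    refined = mapping_table.get(raw, raw)  # mapping table -> refined data
    return raw, refined

def output_dispatch(refined_list, selection):
    # Pass through only the channels that carried the input signal.
    return [r for ch, r in enumerate(refined_list) if ch != selection]

mapping_tables = [dict() for _ in range(N)]  # identity tables to start
selection = 2                                # chosen by the calibration controller
signals = input_dispatch(input_sample=0.5, cal_sample=0.25, selection=selection)
results = [adc_unit(s, mapping_tables[ch]) for ch, s in enumerate(signals)]
print(output_dispatch([refined for _, refined in results], selection))
```

In a full implementation the calibration controller would compare the selected channel's raw data against the known DAC code and refresh that channel's mapping table.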
Abstract:
A method for testing linearity of an ADC comprises: receiving a trigger signal indicating an ADC input voltage step adjustment; reading an ADC output sample upon receiving the trigger signal, wherein the ADC output sample has a value range of N integer values that correspond to N discrete ADC output codes; computing a histogram of code occurrences for M consecutive ADC output codes, wherein the histogram comprises M bins corresponding to the M consecutive ADC output codes and M is less than N; updating a DNL value and an INL value according to the histogram at an interval of K ADC output sample readings; and shifting the histogram by one ADC output code after updating the DNL and INL values.
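The sliding-histogram update might look like the sketch below, which assumes a linear ramp stimulus so that each code ideally receives the same number of hits; the per-update rule (score the oldest bin, then slide the window) is one plausible reading of the abstract, not its literal specification:

```python
M = 8     # histogram width in codes (M < N)
K = 1024  # samples between DNL/INL updates

def test_linearity(samples, ideal_count):
    hist = [0] * M      # bins for M consecutive output codes
    base_code = 0       # lowest code currently covered by the histogram
    dnl, inl = [], []
    running_inl = 0.0
    for i, code in enumerate(samples, start=1):
        if base_code <= code < base_code + M:
            hist[code - base_code] += 1
        if i % K == 0:
            # Update DNL/INL from the oldest bin, then shift the histogram
            # window by one ADC output code.
            d = hist[0] / ideal_count - 1.0
            running_inl += d
            dnl.append(d)
            inl.append(running_inl)
            hist.pop(0)
            hist.append(0)
            base_code += 1
    return dnl, inl

ramp = [c for c in range(M + 4) for _ in range(K)]  # ideal ramp: K hits per code
print(test_linearity(ramp, ideal_count=K))          # all-zero DNL/INL for an ideal ADC
```

Keeping only M of the N bins in memory is the point of the method: the full code range is covered incrementally as the window slides.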
Abstract:
Provided is a semiconductor device that is capable of performing background calibration during a reception operation without adversely affecting reception characteristics. During a reception operation, the semiconductor device detects a timing at which an invalid received signal occurs upon a gain change or a reception channel change and performs background calibration at the detected timing. Because the received signal is invalid at that timing, performing the calibration does not further decrease the effective reception accuracy. Moreover, the unnecessary signal component that would arise if the background calibration were performed at fixed intervals is not generated, as long as the background calibration is performed at random timings.
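An event-driven sketch of the idea, assuming the receiver exposes gain and channel changes as discrete events; the event names and the random delay window are placeholders, not from the abstract:

```python
import random

def run_receiver(events):
    for event in events:
        if event in ("gain_change", "channel_change"):
            # The received signal is invalid during these transitions, so
            # calibrating now does not degrade effective reception accuracy.
            # A random offset inside the invalid window avoids the spurious
            # tone that fixed-interval calibration would inject.
            delay = random.uniform(0.0, 1.0)
            print(f"calibrate at t+{delay:.2f} within invalid window ({event})")
        else:
            print(f"receive: {event}")

run_receiver(["data", "gain_change", "data", "channel_change", "data"])
```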
Abstract:
An analog-to-digital converter (ADC) includes an analog input stage including an output configured to generate an analog output signal and a digital stage coupled to the output of the analog input stage. The digital stage is configured to classify the analog output signal into one of a plurality of consecutive voltage ranges. Responsive to the analog output signal being classified in a first enumerated voltage range of the plurality of voltage ranges during a rotation of a sample, a voltage for a subsequent rotation is determined as if the analog output signal is classified into a non-enumerated voltage range selected according to a state of a random number signal.
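A sketch of the randomized classification, assuming a 1.5-bit-style cyclic stage with three consecutive voltage ranges and treating a narrow band straddling each range boundary as the "enumerated" range; these choices, and all the constants, are illustrative assumptions rather than details from the patent:

```python
import random

T = 0.25     # nominal boundary between consecutive voltage ranges
BAND = 0.05  # width of the narrow "enumerated" range straddling a boundary

def next_rotation_voltage(v, vref=1.0):
    # Classify into one of the three consecutive ranges.
    if v < -T:
        r = 0
    elif v < T:
        r = 1
    else:
        r = 2
    # If the sample falls in the enumerated range straddling a boundary,
    # determine the next rotation's voltage as if it were classified into
    # one of the neighboring (non-enumerated) ranges, picked by a random
    # bit.  Near the boundary, either choice leaves a residue the later
    # rotations can absorb, spreading comparator errors into noise.
    if abs(v - T) < BAND:
        r = 1 if random.getrandbits(1) else 2
    elif abs(v + T) < BAND:
        r = 0 if random.getrandbits(1) else 1
    # Standard residue amplification for the chosen range.
    return 2 * v - (r - 1) * vref

print(next_rotation_voltage(0.26))  # near +T: residue computed for range 1 or 2
```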
Abstract:
An arrangement for reading out an analog voltage input signal includes an input for applying the input signal thereto, and a reference unit generating an analog reference voltage. To perform online self-calibration, the arrangement includes a superposition unit generating a combined analog signal by superimposing the analog reference voltage onto the input signal, a converting unit converting the combined analog signal into a one-bit serial data stream at a conversion sampling rate, and a decomposition unit, which includes at least two digital filters configured to generate from the serial data stream two corresponding digital signals at different data rates, which can be less than the conversion sampling rate. Two data processing units calculate, from the corresponding digital signals, a digital input voltage representing the input signal and either a digital reference voltage representing the analog reference voltage or a disturbance voltage signal representing parasitic voltage components introduced by the superposition unit, respectively.
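A toy numerical sketch of the arrangement, assuming the reference is a known pilot tone, a first-order single-bit modulator as the converting unit, and moving-average filters at two decimation rates as the decomposition unit; all rates, frequencies, and amplitudes are illustrative assumptions:

```python
import math

FS = 1_000_000   # conversion sampling rate (1-bit stream rate)
F_REF = 50_000   # pilot/reference frequency

def one_bit_stream(n):
    acc, out = 0.0, []
    for i in range(n):
        t = i / FS
        # Superposition unit: input (DC here) plus the reference tone.
        combined = 0.3 + 0.1 * math.sin(2 * math.pi * F_REF * t)
        bit = 1 if acc >= 0 else 0            # 1-bit quantizer
        acc += combined - (1.0 if bit else -1.0)  # first-order modulator loop
        out.append(bit)
    return out

def decimate(stream, factor):
    # Moving-average filter plus decimation: output rate = FS / factor.
    return [sum(stream[i:i + factor]) / factor
            for i in range(0, len(stream) - factor + 1, factor)]

bits = one_bit_stream(10_000)
input_path = decimate(bits, 1000)  # low-rate path -> digital input voltage
ref_path = decimate(bits, 10)      # higher-rate path retains the reference band
print(len(input_path), len(ref_path))
```

The low-rate filter averages the reference tone away and yields the input estimate, while the higher-rate path preserves the reference band for the calibration computation, matching the two-data-rate decomposition described above.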