Abstract:
Techniques are provided for adaptive read threshold voltage tracking with gap estimation between default read threshold voltages. A read threshold voltage for a memory is adjusted by estimating a gap between two adjacent default read threshold voltages using binary data from the memory, wherein the gap is estimated using statistical characteristics of at least one of two adjacent memory levels of the memory; computing an adjusted read threshold voltage associated with the two adjacent memory levels by using the statistical characteristics of the two adjacent memory levels and the gap; and updating the read threshold voltage with the adjusted read threshold voltage. Pages of the memory are optionally read at multiple read threshold offset locations to obtain disparity statistics, which can be used to estimate mean and/or standard deviation values for a given memory level. The gap is optionally estimated using the mean and/or standard deviation values.
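A rough sketch of the gap-estimation idea (not the claimed implementation): assuming each memory level is approximately Gaussian, the disparity statistics gathered at a few read threshold offsets can be inverted through the normal CDF to recover a mean and standard deviation per level, and the adjusted read threshold can then be placed at a spread-weighted point in the gap between the two adjacent levels. The function names and the spread-weighted split rule below are illustrative assumptions.

from statistics import NormalDist

def estimate_level_stats(offsets, disparities):
    # offsets: read threshold offsets (volts) at which a page was re-read;
    # disparities: fraction of this level's cells that read below each offset.
    # Assuming a Gaussian level, inv_cdf(disparity) is linear in the offset,
    # so a least-squares line gives sigma (slope) and mu (intercept).
    inv = NormalDist().inv_cdf
    z = [inv(d) for d in disparities]
    n = len(offsets)
    zbar, vbar = sum(z) / n, sum(offsets) / n
    sigma = (sum((zi - zbar) * (vi - vbar) for zi, vi in zip(z, offsets))
             / sum((zi - zbar) ** 2 for zi in z))
    mu = vbar - sigma * zbar
    return mu, sigma

def adjusted_threshold(mu_lo, sig_lo, mu_hi, sig_hi):
    # Split the estimated gap in proportion to the level spreads, which
    # approximates the equal-error crossing of the two level distributions.
    return (mu_lo * sig_hi + mu_hi * sig_lo) / (sig_lo + sig_hi)

With three or more read offsets per level, the same fit also gives a quick check on how well the Gaussian assumption holds.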
Abstract:
Methods and apparatus are provided for adaptive read threshold voltage tracking with separate characterization on each side of a voltage distribution about a distribution mean. A read threshold voltage for a memory is adjusted by determining statistical characteristics of two adjacent memory levels based at least in part on a type of statistical distribution of the memory levels and a distribution of data values read from cells using a plurality of read threshold voltages, wherein the statistical characteristics of the two adjacent memory levels are characterized independently on two sides about at least one mean of the statistical distribution; computing an adjusted read threshold voltage associated with the two adjacent memory levels by using the statistical characteristics of the two adjacent memory levels; and updating the read threshold voltage based on the adjusted read threshold voltage. The adjustment is optionally performed responsive to one or more read errors.
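A minimal sketch of the two-sided characterization, assuming each side of a level behaves like half of a Gaussian with its own spread (the function names and the split rule are hypothetical): the side of each level that faces the read threshold is characterized from the tail fraction observed beyond a read offset on that side, and only the two facing half-distributions are used to place the threshold.

from statistics import NormalDist

def one_sided_sigma(level_mean, offset, tail_fraction):
    # Characterize one side of a level independently of the other: given the
    # fraction of the level's cells observed beyond `offset` (which lies on
    # that side of the mean), invert the half-Gaussian tail for its spread.
    z = NormalDist().inv_cdf(1.0 - tail_fraction)
    return abs(offset - level_mean) / z

def adjusted_threshold(mu_lo, sigma_lo_upper, mu_hi, sigma_hi_lower):
    # Only the upper side of the lower level and the lower side of the upper
    # level face the read threshold, so only those two spreads split the gap.
    return ((mu_lo * sigma_hi_lower + mu_hi * sigma_lo_upper)
            / (sigma_lo_upper + sigma_hi_lower))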
Abstract:
Data compression techniques are provided that remove redundancy across the boundaries of compression search engines. An illustrative method comprises splitting a data frame into a plurality of sub-chunks; comparing at least two of the plurality of sub-chunks to one another to remove at least one sub-chunk from the plurality of sub-chunks that substantially matches at least one other sub-chunk to generate a remaining plurality of sub-chunks; generating matching sub-chunk information for data reconstruction identifying the at least one removed sub-chunk and the corresponding substantially matched at least one other sub-chunk; grouping the remaining plurality of sub-chunks into sub-units; removing substantially repeated patterns within the sub-units to generate corresponding compressed sub-units; and combining the compressed sub-units with the matching sub-chunk information to generate a compressed data frame. The data frame optionally comprises one or more host pages compressed substantially simultaneously, and the compressed data frame for a plurality of host pages compressed substantially simultaneously comprises a host page address for each host page.
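A compact sketch of the flow, using Python's zlib as a stand-in for the per-sub-unit pattern-removal engine and a hash table for cross-sub-chunk matching; the chunk and unit sizes and the container layout are illustrative assumptions, not the claimed format.

import zlib

def compress_frame(frame, sub_chunk_size=512, chunks_per_unit=4):
    chunks = [frame[i:i + sub_chunk_size] for i in range(0, len(frame), sub_chunk_size)]
    first_seen = {}      # sub-chunk bytes -> index of the first matching sub-chunk
    remaining, match_info = [], []
    for idx, chunk in enumerate(chunks):
        if chunk in first_seen:
            match_info.append((idx, first_seen[chunk]))   # (removed, matched)
        else:
            first_seen[chunk] = idx
            remaining.append((idx, chunk))
    units = []
    for i in range(0, len(remaining), chunks_per_unit):
        unit = b"".join(c for _, c in remaining[i:i + chunks_per_unit])
        units.append(zlib.compress(unit))   # remove repeated patterns within the sub-unit
    return {"units": units, "match_info": match_info,
            "kept": [i for i, _ in remaining], "n_chunks": len(chunks)}

def decompress_frame(packed, sub_chunk_size=512):
    data = b"".join(zlib.decompress(u) for u in packed["units"])
    out = {idx: data[j * sub_chunk_size:(j + 1) * sub_chunk_size]
           for j, idx in enumerate(packed["kept"])}
    for removed, matched in packed["match_info"]:
        out[removed] = out[matched]         # restore the de-duplicated sub-chunks
    return b"".join(out[i] for i in range(packed["n_chunks"]))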
Abstract:
Method and apparatus for decoding data. In some embodiments, an LDPC decoder has a variable node circuit (VNC) with a plurality of variable nodes configured to store bit reliability values of m-bit code bits. A check node circuit (CNC) has a plurality of check nodes configured to perform parity check operations upon n-bit messages from the VNC. Each n-bit message is formed from a combination of the bit reliability values and stored messages from the check nodes. A pre-saturation compensation circuit is configured to maintain a magnitude of each n-bit message received by the CNC below a saturation limit comprising the maximum value that can be expressed using p bits, with p less than n and each of the n-bit messages received by the CNC having a different magnitude. The pre-saturation compensation circuit may apply different scaling and/or bias factors to the n-bit messages over different decoding iterations.
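A small sketch of the pre-saturation idea (the scale/offset schedule and function name are assumptions, not the decoder's actual parameters): before the check-node update, each variable-to-check message is scaled, optionally offset, and clipped so its magnitude stays below the p-bit saturation limit, with a scale factor that can change from iteration to iteration.

def pre_saturate(messages, p_bits, iteration, scales=(1.0, 0.75, 0.5), bias=0):
    # Largest magnitude expressible in p bits (two's-complement style limit).
    limit = (1 << (p_bits - 1)) - 1
    scale = scales[min(iteration, len(scales) - 1)]   # per-iteration scaling
    out = []
    for m in messages:
        mag = max(0.0, abs(m) * scale - bias)         # scale and (optionally) bias
        mag = min(int(round(mag)), limit)             # keep below the saturation limit
        out.append(mag if m >= 0 else -mag)
    return out

For example, pre_saturate([9, -14, 3], p_bits=4, iteration=1) scales by 0.75 and clips to the 4-bit limit of 7, yielding [7, -7, 2].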
Abstract:
Apparatus and method for reducing read disturbed data in a non-volatile memory (NVM). Read operations applied to a first location in the NVM are counted to accumulate a read disturb count (RDC) value. Once the RDC value reaches a predetermined threshold, a flag bit is set and a first bit error statistic (BES) value is evaluated. If acceptable, the RDC value is reduced and additional read operations are applied until the RDC value reaches the predetermined threshold a second time. A second BES value is evaluated and data stored at the first location are relocated if an unacceptable number of read errors are detected by the second BES value. Different thresholds are applied to the first and second BES values so that fewer read errors are acceptable during evaluation of the second BES value as compared to the first BES value.
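One way to picture the two-pass check is as a per-location tracker, sketched below; the threshold values, the halving of the count, the class interface, and the handling of a failed first check are illustrative assumptions.

class ReadDisturbTracker:
    # Hypothetical per-location tracker for the two-pass read disturb check.
    def __init__(self, rdc_threshold=10_000, first_bes_limit=50, second_bes_limit=20):
        self.rdc_threshold = rdc_threshold
        self.first_bes_limit = first_bes_limit    # more errors tolerated on the first pass
        self.second_bes_limit = second_bes_limit  # stricter limit on the second pass
        self.rdc = 0
        self.flag = False

    def record_read(self, bit_errors):
        # Returns "relocate" when the stored data should be moved, else "keep".
        self.rdc += 1
        if self.rdc < self.rdc_threshold:
            return "keep"
        if not self.flag:
            self.flag = True                       # first time the threshold is reached
            if bit_errors <= self.first_bes_limit:
                self.rdc //= 2                     # reduce the count and keep counting reads
                return "keep"
            return "relocate"                      # assumption: relocate on a failed first check
        # The (reduced) count has reached the threshold a second time.
        return "relocate" if bit_errors > self.second_bes_limit else "keep"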
Abstract:
Adaptive read threshold voltage tracking techniques are provided that employ bit error rate estimation based on a non-linear syndrome weight mapping. An exemplary device comprises a controller configured to determine a bit error rate for at least one of a plurality of read threshold voltages in a memory using a non-linear mapping of a syndrome weight to the bit error rate for the at least one of the plurality of read threshold voltages.
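As an example of such a non-linear mapping (an assumption for illustration, not necessarily the mapping used by the device): for a regular LDPC code with check-node degree d_c, the expected fraction of unsatisfied checks at raw bit error rate p is s = (1 - (1 - 2p)^d_c) / 2, and inverting this relation turns a measured syndrome weight into a bit error rate estimate for each candidate read threshold voltage.

def ber_from_syndrome_weight(syndrome_weight, num_checks, check_degree):
    # Invert s = (1 - (1 - 2p)**d_c) / 2 to estimate the raw bit error rate p
    # from the measured fraction of unsatisfied checks s.
    s = syndrome_weight / num_checks
    s = min(max(s, 0.0), 0.5 - 1e-12)     # keep the inversion well defined
    return (1.0 - (1.0 - 2.0 * s) ** (1.0 / check_degree)) / 2.0

# The read threshold with the lowest estimated bit error rate can then be
# selected, e.g.:
# best = min(candidates, key=lambda v: ber_from_syndrome_weight(sw[v], m, d_c))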
Abstract:
An apparatus for reading a non-volatile memory includes a tracking module, a likelihood generator, and a read controller. The tracking module is operable to calculate means and variances of voltage level distributions in the non-volatile memory and to calculate, based on the means and variances, at least one reference voltage to be used when reading the non-volatile memory. The likelihood generator is operable to calculate at least one other reference voltage to be used when reading the non-volatile memory, wherein the at least one other reference voltage is based at least in part on a predetermined likelihood value constellation, and to map read patterns from the non-volatile memory to likelihood values. The read controller is operable to read the non-volatile memory using the at least one reference voltage and the at least one other reference voltage to yield the read patterns.
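A loose sketch of how the three pieces could fit together, assuming Gaussian level distributions and a hypothetical likelihood-value table standing in for the predetermined constellation; the offsets and table values are placeholders.

def tracked_reference(mu0, var0, mu1, var1):
    # Tracking module: place a reference voltage between two adjacent level
    # distributions using their means and variances (spread-weighted split).
    s0, s1 = var0 ** 0.5, var1 ** 0.5
    return (mu0 * s1 + mu1 * s0) / (s0 + s1)

LLR_TABLE = (-7, -4, -1, 1, 4, 7)   # hypothetical likelihood value constellation

def all_references(v_ref, offsets=(-0.3, -0.1, 0.1, 0.3)):
    # Likelihood generator: additional reference voltages placed around the
    # tracked one, spaced according to the likelihood constellation.
    return sorted([v_ref] + [v_ref + o for o in offsets])

def pattern_to_likelihood(cell_voltage, references):
    # Map the read pattern (which references the cell reads above) to a
    # likelihood value from the table.
    region = sum(cell_voltage > r for r in references)
    return LLR_TABLE[region]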
Abstract:
An apparatus having a circuit and a decoder is disclosed. The circuit is configured to adjust an initial one of a plurality of reference voltages in a read channel of a memory by shifting the initial reference voltage an amount toward a center of a window, and to read a codeword from the memory a number of times. The window bounds a sweep of the reference voltages. Each of the repeated reads uses a respective reference voltage from a pattern of the reference voltages. The pattern is symmetrically spaced about the initial reference voltage. The pattern fits in the window. The decoder is configured to generate read data by performing an iterative decoding procedure on the codeword based on the reads.
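A small sketch of the retry-pattern construction (the step size, read count, and clamp-style shift toward the window center are assumptions): the initial reference voltage is shifted just far enough toward the center of the window that a symmetric pattern of retry voltages fits entirely inside it.

def retry_pattern(v_initial, step, num_reads, window):
    # num_reads is odd so the pattern stays symmetric about the shifted
    # initial reference voltage.
    assert num_reads % 2 == 1
    lo, hi = window
    half_span = (num_reads // 2) * step
    # Shift the initial reference toward the window center only as far as
    # needed for the whole symmetric pattern to fit within the window bounds.
    v0 = min(max(v_initial, lo + half_span), hi - half_span)
    return [v0 + k * step for k in range(-(num_reads // 2), num_reads // 2 + 1)]

For example, retry_pattern(0.95, 0.05, 5, (0.0, 1.0)) shifts the center from 0.95 to 0.90 so that all five reads land inside the window; each read would then use one of these voltages, and the resulting codeword reads feed the iterative decoder.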
Abstract:
Apparatus and method for recovering data from a multi-channel input signal, such as but not limited to a readback signal from a bit patterned medium (BPM) having a plurality of subtracks. In accordance with some embodiments, a single input single output (SISO) equalizer is adapted to generate equalized outputs responsive to alternating subchannels of the multi-channel input signal. A detector is adapted to generate estimates of data symbols represented by the input signal responsive to the equalized outputs. A switching circuit is adapted to switch in different equalizer coefficients for use by the SISO equalizer for each of the alternating subchannels in the input signal.
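A toy sketch of the coefficient switching, assuming two interleaved subchannels and simple FIR equalization; the tap sets and the parity-based switch are placeholders, not the claimed circuit.

def equalize_alternating(samples, coeff_sets):
    # samples: interleaved readback samples alternating between subchannels;
    # coeff_sets: one FIR tap set per subchannel. The switching circuit is
    # modelled by selecting the tap set from the sample's subchannel index.
    n_taps = len(coeff_sets[0])
    history = [0.0] * n_taps
    equalized = []
    for i, x in enumerate(samples):
        history = history[1:] + [x]                # shift in the new sample
        taps = coeff_sets[i % len(coeff_sets)]     # switch coefficients per subchannel
        equalized.append(sum(c * h for c, h in zip(taps, history)))
    return equalized

# The equalized outputs would then feed the detector that estimates the
# recorded data symbols.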
Abstract:
Adaptive read reference voltage tracking techniques are provided that employ charge leakage mitigation. An exemplary device for use with multi-level memory cells comprises a controller configured to: after a predefined time interval that approximates a settling time after a programming of the multi-level memory cells until a charge leakage of one or more of the multi-level memory cells has settled, determine a plurality of read reference voltages for the multi-level memory cells using a post-programming adaptive tracking algorithm; and employ the plurality of read reference voltages to read data from the multi-level memory cells. Read reference voltage offsets are optionally determined based on a shift in the read reference voltages after the predefined time interval since the programming of the multi-level memory cells.
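A minimal sketch of the timing gate and offset bookkeeping; the interval value, the track() callable, and the return convention are assumptions standing in for the post-programming adaptive tracking algorithm.

def references_after_settling(default_refs, age_seconds, settle_interval, track):
    # Before the predefined settling interval has elapsed, charge leakage is
    # still settling, so keep the default read reference voltages.
    if age_seconds < settle_interval:
        return default_refs, None
    # Post-programming adaptive tracking (placeholder callable) returns the
    # tracked read reference voltages for the aged cells.
    tracked = track()
    # Offsets: the shift of each reference after the settling interval; these
    # can be reused for other blocks programmed around the same time.
    offsets = [t - d for t, d in zip(tracked, default_refs)]
    return tracked, offsets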