Trading off lithography line edge roughness and error-correcting code requirements for NAND Flash memories
|Article code||Publication year||English article||Persian translation||Word count|
|24640||2012||5-page PDF||Available to order||3336 words|
Publisher : Elsevier - Science Direct
Journal : Microelectronics Reliability, Volume 52, Issue 3, March 2012, Pages 525–529
The only way to keep pace with Moore’s Law is to use probabilistic computing for memory design. Probabilistic computing is ‘unavoidable’, especially when scaled memory dimensions shrink to levels where variability takes over. In order to print features below 20 nm, novel lithographies such as Extreme Ultra Violet (EUV) are required. However, transistor structures and memory arrays are strongly affected by pattern roughness caused by the randomness of such lithography, leading to variability-induced data errors in the memory read-out. This paper demonstrates a probabilistic, holistic look at how to handle bit errors in NAND Flash memory and at the trade-off between lithography processes and error-correcting codes to ensure data integrity.
NAND Flash memory is a widespread storage medium, owing to its fast, Non-Volatile (NV), high-capacity storage. Solid State Drives (SSDs) based on this memory type are the main replacement candidate for conventional Hard Disk Drives (HDDs). The evolution of NAND Flash is supported by two main trends: memory cells are pushed into smaller geometries, and Flash cell architectures are moving to store more bits per cell. However, there are obstacles to NAND Flash scaling. Vertical Floating Gate (FG) transistor scaling is limited by insulator shrinking. Furthermore, lateral dimension scaling leads to cell-to-cell (or cross-cell) interference. The cell-to-cell interference, which is caused by parasitic capacitive coupling between neighboring FGs, is recognized as a major hurdle for memory cell scaling. Moreover, the effect of parasitic interference is data dependent in highly scaled NV memories, and thus requires complex statistical simulation for modeling. Advanced lithography such as EUV and Double Patterning (DP) processes are required to keep Flash densities doubling every year. Line edge roughness (LER) is one of the main effects of process uncertainty in EUV lithography. All of the lithographic elements, such as the source, mask, optical system and resist, contribute to LER. LER is the outcome of the stochastic behavior of photons, photogenerated electrons, polymer resist erosion and acid–base annihilation. By itself, LER induces random deformations in the size and spacing of transistors. Moreover, it causes structural bending of a memory array. The cross-cell interference between memory cells is strongly affected by gate-spacing variations of the array. The pattern roughness of FG transistor structures is significant compared to the transistor sizes, and is sufficient to cause variations of the parasitic cross-cell interference and to degrade the memory functionality.
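The stochastic edge deviation described above is commonly modeled as a zero-mean Gaussian random process characterized by an rms amplitude and a correlation length, with LER quoted as three times the rms deviation. A minimal sketch of generating one such rough edge is given below; the exponential-autocorrelation model and all parameter values are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def generate_ler_edge(n_points, dy_nm, sigma_nm, corr_len_nm, rng):
    """Generate one rough line edge as a zero-mean Gaussian process with an
    exponential autocorrelation R(d) = sigma^2 * exp(-|d|/xi).
    Illustrative model only; returns edge deviations in nm."""
    y = np.arange(n_points) * dy_nm
    # Covariance matrix between all pairs of positions along the edge
    R = sigma_nm**2 * np.exp(-np.abs(y[:, None] - y[None, :]) / corr_len_nm)
    # Small jitter keeps the Cholesky factorization numerically stable
    L = np.linalg.cholesky(R + 1e-12 * np.eye(n_points))
    return L @ rng.standard_normal(n_points)

rng = np.random.default_rng(0)
# Hypothetical numbers: 256 samples at 1 nm spacing, 1.5 nm rms, 20 nm
# correlation length (roughly the scale discussed for sub-20 nm features)
edge = generate_ler_edge(n_points=256, dy_nm=1.0, sigma_nm=1.5,
                         corr_len_nm=20.0, rng=rng)
print(f"3-sigma LER of this sample edge: {3 * edge.std():.2f} nm")
```

Because the correlation length is large relative to the sampling step, a single finite edge has far fewer effective independent samples than points, so the sample 3σ can deviate noticeably from the nominal value, which is itself one source of device-to-device variability.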
These variations lead to stronger overlap and broadening of the programmed threshold-voltage (Vth) windows than would be expected in a memory array without LER. The overlap of Vth windows, due to the variable cross-cell interference, leads to error generation during memory readout. Various post-lithographic techniques exist for LER mitigation: in-track chemical processes, ion beam sputtering, and thermal and plasma treatments. These smoothing processes can reduce LER by up to 35%, at the expense of increased manufacturing cost. To keep the error rates caused by variability under an acceptable threshold, advanced error-correcting codes (ECC) are used. The more data errors variability triggers, the stronger and more complex the required error-control techniques become. The complexity of the ECC running on-chip or on the memory controller imposes extra requirements on memories, such as chip area and power dissipation, as well as an impact on access time. In this case study, a 16 nm half-pitch (hp) NAND Flash memory was treated as a communication channel. As the fault-tolerance technique, Bose–Chaudhuri–Hocquenghem (BCH) and Low-Density Parity-Check (LDPC) codes were applied to a channel model that includes LER-induced variability. In this paper, based on our previous research work, a holistic framework is proposed to trade off the lithographic processes against the ECCs in highly scaled NAND Flash memories. The presented framework statistically percolates the variations caused by LER of advanced lithographies and post-lithography techniques all the way up from the material level to the system level. The presented probabilistic tool helps to evaluate the reliability improvement achievable via LER smoothing processes or by boosting the error-correcting complexity, i.e., adding more parity bits and/or using more advanced codes.
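The trade-off chain described above (Vth window overlap → raw bit error rate → required ECC strength) can be sketched with textbook models: two adjacent Gaussian Vth windows give a per-bit error probability via the Q-function, and a t-error-correcting block code over an n-bit codeword fails when more than t bits flip. This is a simplified illustration under an i.i.d. bit-error assumption, not the paper's statistical framework, and all numbers are hypothetical:

```python
import math

def raw_ber_from_vth(delta_mv, sigma_mv):
    """Per-bit read error rate for two adjacent Gaussian Vth windows whose
    means are delta_mv apart with common spread sigma_mv, read threshold
    midway: Q(delta / (2*sigma)), via erfc. Illustrative model only."""
    return 0.5 * math.erfc(delta_mv / (2.0 * sigma_mv * math.sqrt(2.0)))

def uncorrectable_prob(n_bits, t, p):
    """P(more than t errors among n_bits i.i.d. bits with error rate p),
    i.e. the codeword failure rate of a t-error-correcting code.
    Binomial tail computed in log space to avoid overflow."""
    log_p, log_q = math.log(p), math.log1p(-p)
    total = 0.0
    for k in range(t + 1, n_bits + 1):
        log_term = (math.lgamma(n_bits + 1) - math.lgamma(k + 1)
                    - math.lgamma(n_bits - k + 1)
                    + k * log_p + (n_bits - k) * log_q)
        term = math.exp(log_term)
        total += term
        if 0.0 < term < total * 1e-18:  # remaining tail is negligible
            break
    return total

def min_t_for_target(n_bits, p, target):
    """Smallest correction strength t meeting the target failure rate."""
    t = 0
    while uncorrectable_prob(n_bits, t, p) > target:
        t += 1
    return t

# Hypothetical numbers: 1 KiB codeword, 600 mV window separation, target
# codeword failure rate 1e-12. LER smoothing narrows the Vth spread sigma,
# which relaxes the required correction strength t (fewer parity bits).
for sigma_mv in (100.0, 80.0):
    p = raw_ber_from_vth(600.0, sigma_mv)
    t = min_t_for_target(8192, p, 1e-12)
    print(f"sigma={sigma_mv:5.1f} mV  raw BER={p:.2e}  required t={t}")
```

Running the sketch shows the lever the paper trades on: a modest reduction in Vth spread (the effect of LER smoothing) cuts the raw BER by an order of magnitude and sharply reduces the ECC strength needed to hit the same data-integrity target.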
The rest of the paper is organized as follows: Section 2 describes LER modeling techniques; Sections 3 and 4 present simulation results, followed by the conclusions.
Conclusion
It was shown that EUV lithography suffers from pattern roughness that makes it challenging to keep pace with Moore’s Law at advanced node generations. The proposed probabilistic methodology, based on variability-aware modeling of LER, enables benchmarking of different cell architecture choices, processing techniques and ECCs. The simulation results link process technologies to memory-system-level techniques and their trade-offs. Overall, the paper demonstrates a probabilistic, holistic approach to handling bit errors in NAND Flash memory and to trading off lithography processes against error-correcting codes to ensure data integrity.