Author_Institution:
Intel Corp. (USA), Hillsboro, OR, USA
Abstract:
As the gap between the lithography wavelength and the critical feature size has continued to widen, the semiconductor industry has had to adjust. Previously, scaling along Moore's Law had relied on improvements in lithography equipment, occasionally by reducing the wavelength and frequently by improving the effective numerical aperture. The wavelength used to define critical dimensions during much of the nineties was 248nm, with the industry switching to 193nm in the following decade. After 157nm technology failed to materialize due to technological challenges, the focus of next-generation lithography (NGL) research shifted to EUV. Meanwhile, 193nm continued to be the workhorse for the continuation of Moore's Law. The next big equipment advancement came with immersion steppers, which “increased” the effective numerical aperture of the lens, thereby capturing more of the diffraction orders in the imaging process. Moore's Law, however, requires a significant innovation every two years! With the optical lithography wavelength and the effective lens aperture stretched to their limits, double patterning came to the rescue. By splitting the pattern in two, it somewhat reduced the burden on each mask, allowing a “stitched” pattern to continue scaling. Throughout these changes, continued scaling also required tighter co-optimization between process and design. As image quality generally weakened, greater restrictions were placed on the diversity of features that could be robustly patterned, leading to the proverbial increase in the heft of design rule manuals. Throughout these heroic efforts to continue Moore's Law, and to reap the resulting benefits for the electronics industry, a relatively new field called Computational Lithography has been providing a helping hand. Computational Lithography comprises a broad set of techniques that use physics-based calculations to eke out greater lithographic performance from a given generation of steppers. This fertile field has recently introduced two advanced techniques: inverse lithography and source-mask optimization. Such techniques have helped extend the life of optical lithography beyond previously forecast durations. At the same time, as other lithography technologies mature, much of the computational infrastructure developed to extend optical lithography will likely be used to optimize the newer technologies. This paper will provide some examples of how Computational Lithography is creating novel and affordable solutions to sustain the scaling trend.
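A minimal sketch of the resolution scaling at work here uses the standard Rayleigh criterion; the specific refractive index and NA values are common illustrative figures for 193nm water-immersion tools rather than figures taken from this work:

% Rayleigh criterion for the minimum printable dimension (illustrative)
\[
  \mathrm{CD}_{\min} \;=\; k_1 \,\frac{\lambda}{\mathrm{NA}},
  \qquad
  \mathrm{NA} \;=\; n \sin\theta .
\]
% Replacing the air gap (n = 1.0) with water (n ~ 1.44 at 193nm) lets the
% effective NA exceed unity, so more diffraction orders are captured and
% CD_min shrinks at a fixed wavelength.

Under these assumptions, double patterning can then be read as effectively relaxing the per-exposure k_1 by splitting the pattern across two masks, while computational lithography techniques work to lower the achievable k_1 for a given stepper generation.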