## Abstract

We apply a phase retrieval algorithm to the intensity pattern of a Hartmann wavefront sensor to measure the phase structure at the Hartmann hole array with enhanced accuracy. It is shown that the rms wavefront error achieved by phase reconstruction is one order of magnitude smaller than that obtained from a typical centroid algorithm. The experimental results are consistent with a phase measurement performed independently with a Shack-Hartmann wavefront sensor.

© 2012 Optical Society of America

## 1. Introduction

The need for more powerful computers has motivated the semiconductor industry to develop, year after year, new lithographic techniques that can meet the expectations of Moore's law. In recent years Deep Ultraviolet Lithography (DUVL) has been the main patterning technology [1]. The current 32 nm node of semiconductor chips is achieved using 193 nm immersion lithography and a double-patterning manufacturing process. However, for ambitious future goals such as the 16 nm node, a completely new technology is needed. For an optical system with numerical aperture NA and wavelength *λ*, the critical dimension (CD) of the printed feature is defined by:

$$\mathrm{CD} = k_1 \frac{\lambda}{\mathrm{NA}}, \tag{1}$$

where *k*_{1} is an experimentally determined factor with a minimum value around 0.25. In current DUV immersion lithography, NA has already reached the high value of 1.44. Decreasing the wavelength is therefore unavoidable to reduce the CD. For this reason, the interest in Extreme Ultraviolet Lithography (EUVL), which is expected to become the next generation of patterning technology, has grown significantly in recent years [2]. EUVL uses radiation of 13.5 nm wavelength from a high-power plasma source and reflective optical systems consisting of mirrors coated with highly reflective Mo/Si multilayers (ML). The maximum reflectance of such mirrors is around 70% at normal incidence. Although the mirror substrate has optimal mechanical and thermal stability, the remaining 30% of the incident energy, which is absorbed by the mirror, causes deformations and hence phase errors (aberrations) on the wafer [3].
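As a quick sanity check on these numbers, the scaling can be evaluated directly. A sketch: the DUV values (193 nm, NA 1.44, *k*₁ ≈ 0.25) come from the text, while the EUV numerical aperture of 0.33 is an illustrative assumption of mine, not a value from the paper.

```python
# Critical dimension CD = k1 * lambda / NA.
def critical_dimension(k1, wavelength_nm, na):
    """Smallest printable feature for a given process factor, wavelength, NA."""
    return k1 * wavelength_nm / na

# DUV immersion: lambda = 193 nm, NA = 1.44 (values from the text),
# with the minimum k1 of about 0.25.
cd_duv = critical_dimension(0.25, 193.0, 1.44)   # ~33.5 nm

# EUV: lambda = 13.5 nm; NA = 0.33 is an assumed, illustrative value.
cd_euv = critical_dimension(0.25, 13.5, 0.33)    # ~10.2 nm
```

The first figure is consistent with the 32 nm node quoted above; reaching nodes well below it at fixed *k*₁ clearly requires the shorter wavelength.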

To improve the image on the wafer, it would be attractive to use an adaptive optical system in the lithography machine to correct for the phase distortion induced by thermal stress. In this case one needs to measure and monitor the wavefront aberrations continuously at the 13.5 nm wavelength. This task is challenging because of the physical properties of the radiation that is used. EUV radiation is strongly absorbed by all materials and gases, hence also by refractive optical elements such as lenses and beamsplitters that are typically used in an interferometric setup. For this reason, common-path interferometers working at 13.5 nm wavelength have been developed in recent years [4]. They reach a very high accuracy in the wavefront measurement (of the order of 100 pm rms), but require an almost perfect reference wavefront and considerable alignment effort. Moreover, EUV plasma sources have limited temporal and spatial coherence, making interferometry more elaborate. As an alternative to interferometry, the phase of the wavefront at the 13.5 nm wavelength can be measured using a Hartmann Wavefront Sensor (HWS). However, as explained in Section 2, the straightforward application of the HWS [5] leads to significant inaccuracies.

In this paper we show how the accuracy of the phase reconstruction using such a device can be substantially improved. We apply a phase retrieval algorithm to the intensities diffracted by the HWS. The unknown phase in the plane of the HWS is expanded in the Zernike basis and the coefficients are obtained by solving a best-fit problem. The simulations are confirmed by the experimental results.

The outline of the paper is as follows. In Section 2, the working principle of a Hartmann wavefront sensor is recalled. We explain in particular the limitation of the method. In Section 3 we describe the application of diffraction theory that we use to simulate the experimental setup and the phase retrieval procedure. In Section 4 the design of the Hartmann sensor and two computational methods to retrieve the phase are discussed. In particular we compare the phase reconstruction results achieved by our phase retrieval method with the phase reconstruction obtained with a standard algorithm. In Section 5 the experimental setup is discussed and the phase retrieval results are presented for two different aberration types. Finally, the conclusions are presented in Section 6.

## 2. Hartmann wavefront sensor and its limits

The Hartmann Wavefront Sensor (Fig. 1(a)) is widely used for beam characterization since it gives the intensity and phase distribution in real time. It consists of an opaque screen containing an array of holes placed in the beam path. The beam is sampled by the hole array and the transmitted beamlets are collected by a detector placed at a certain distance from the screen. Aberrations in the beam cause a measurable shift of the beamlets at the detector plane as compared to the aberration-free case. By measuring the shift, the local slope of the wavefront at the sampling points (i.e. in the hole array) can be determined. In this way, a discrete map of the wavefront gradient components in the *x* and *y* directions results from

$$\left.\frac{\partial W}{\partial x}\right|_{i,j} = \frac{\Delta x_{i,j}}{L}, \qquad \left.\frac{\partial W}{\partial y}\right|_{i,j} = \frac{\Delta y_{i,j}}{L}, \tag{2}$$

where Δ*x*_{i,j} and Δ*y*_{i,j} are the measured shifts along the *x* and *y* directions of the spot of hole (*i, j*), and *L* is the distance between the sampling and detection planes. The displacement of the beamlets is usually calculated with a fast and efficient centroid algorithm [6, 7]. From the obtained gradient map, the original wavefront can be recovered by zonal or modal integration as described in [8].
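The centroid-based slope estimate described above can be sketched as follows. This is a minimal illustration under assumptions of mine: the function names and the list-of-subimages interface are invented for the example and are not part of any particular sensor's software.

```python
import numpy as np

def centroid_slopes(spots, ref_spots, L):
    """Estimate local wavefront slopes from beamlet centroid shifts.

    spots, ref_spots: sequences of 2-D sub-images (aberrated and reference),
    one per Hartmann hole. L: screen-to-detector distance, in the same unit
    as the pixel coordinates. Returns arrays dW/dx, dW/dy (small angles).
    """
    def centroid(img):
        img = np.asarray(img, dtype=float)
        y, x = np.indices(img.shape)
        total = img.sum()
        return np.array([(x * img).sum() / total, (y * img).sum() / total])

    # Spot displacement relative to the aberration-free reference position.
    shifts = np.array([centroid(s) - centroid(r)
                       for s, r in zip(spots, ref_spots)])
    # Linearized slope: shift divided by propagation distance (Eq. (2)).
    return shifts[:, 0] / L, shifts[:, 1] / L
```

Note that the division by *L* is exactly the small-angle linearization whose limitations are discussed below.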

The accuracy of this method is given by the smallest beamlet deviation that can be distinguished. The device was improved by R. Shack in 1960 (the Shack-Hartmann Wavefront Sensor) by replacing the hole array with a microlens array, in order to achieve a small beamlet spot on the detector and thus increase the accuracy of the wavefront reconstruction [9, 10]. However, because refractive materials absorb EUV radiation, a microlens array is not suitable for this wavelength range, and thus a Hartmann Wavefront Sensor is chosen for this application.

In the characterization of synchrotron beams with *λ* = 13.5 nm, it has already been demonstrated that Hartmann sensors can reach an accuracy of *λ*/120 [11, 12]. Nevertheless, approximations are made with this approach. First, the phase distribution in the sampling points is approximated by a tilted plane with average *x* and *y* tilt components (Fig. 1(b)). Therefore, the information about the local curvature inside the sub-aperture of the Hartmann screen is lost. Secondly, Eq. (2) is only valid under the assumption of small deflection angles (Fig. 1(a)), i.e. sin(*α*_{x}), sin(*α*_{y}) ≃ tan(*α*_{x}), tan(*α*_{y}) ≃ *α*_{x}, *α*_{y}. This means that in the centroid algorithm the computation of the slope is linearized [13].

To overcome these limitations we propose to apply a phase retrieval algorithm to the intensity data collected by the detector, in order to recover the phase structure before transmission through the Hartmann hole array. Thus, we do not only use the information about the center of mass of each beamlet, but use the complete diffraction pattern to obtain information about the local phase distribution.

## 3. Mathematical description of the phase retrieval algorithm for a Hartmann wavefront sensor

Phase retrieval is a nonlinear method for recovering the phase distribution in the aperture plane of an optical system from intensity measurements in the detection plane.

For a HWS, let *A*(*x,y*) be the amplitude of the complex field distribution after transmission through the Hartmann hole array and let Φ(*x,y*) be its phase distribution. Hence the field in the aperture plane *z* = 0 is given by:

$$U(x,y,0) = A(x,y)\, e^{i\Phi(x,y)}.$$

The field *U*(*x,y,L*) at the detection plane is then given by the angular spectrum propagation:

$$U(x,y,L) = \mathcal{F}^{-1}\!\left\{ \mathcal{F}\left[U(x,y,0)\right] \exp\!\left( i\, 2\pi L \sqrt{\frac{1}{\lambda^{2}} - f_x^{2} - f_y^{2}} \right) \right\},$$

where ℱ denotes the two-dimensional Fourier transform, ℱ[*U*(*x,y,*0)] is the angular spectrum of plane waves, and *f*_{x} and *f*_{y} are the spatial frequencies in the Fourier domain [14]. The quantity *L* represents the propagation distance and *λ* is the wavelength. The phase retrieval procedure recovers the phase distribution Φ from the measured intensity |*U*(*x,y,L*)|^{2} and the intensity |*U*(*x,y,*0)|^{2} in the aperture plane, which is assumed to be uniform and known.
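The angular spectrum propagation above translates directly into a short numerical routine. This is a minimal sketch under assumptions of mine: a square sampling grid, and evanescent components (spatial frequencies beyond 1/*λ*) simply suppressed rather than modeled.

```python
import numpy as np

def angular_spectrum_propagate(U0, wavelength, L, dx):
    """Propagate a sampled complex field U0 over a distance L using the
    angular spectrum of plane waves.

    U0: 2-D complex field at z = 0, sampled on a square grid of pitch dx.
    All lengths share one unit (e.g. meters).
    """
    n = U0.shape[0]
    fx = np.fft.fftfreq(n, d=dx)          # spatial frequencies f_x, f_y
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    kz = 2.0 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * L * kz) * (arg > 0)   # transfer function, evanescent cut
    return np.fft.ifft2(np.fft.fft2(U0) * H)
```

For the parameters used later (638 nm light, 16.5 μm pixels), every sampled frequency propagates, so the evanescent cutoff is never reached and the routine conserves energy.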

The first phase retrieval routines were the Gerchberg-Saxton algorithm [15] and the input-output algorithm proposed by Fienup [16], but the first suffers from slow convergence and the second from sensitivity to noisy data. Another phase retrieval approach is based on nonlinear optimization of a suitable object function [17].

We first write the phase as a linear combination of Zernike polynomials:

$$\Phi(x,y) = \sum_{k} \alpha_k Z_k(x,y),$$

where *Z*_{k}(*x, y*) is the *k*^{th} Zernike polynomial. The coefficients *α*_{k} are the optimization variables of the problem. The object function chosen for minimization is the sum of the squared differences between the calculated and measured pixel amplitudes on the CCD:

$$E(\alpha) = \sum_{i,j} \Big( \left| U(x_i, y_j, L) \right| - F(x_i, y_j) \Big)^{2},$$

where *F*(*x*_{i}*, y*_{j}) is the square root of the measured intensity at pixel (*i, j*) (i.e., the amplitude at pixel (*i, j*)). The sum runs over all discrete values of the 2D distribution at the detector (i.e., over the number of CCD pixels). A gradient search method is usually chosen to perform the minimization. In this paper we use, alternatively, a trust-region reflective algorithm [18] to find the solution of the problem. We chose this method for its fast convergence and the recent availability of reliable and efficient software implementing it [19].
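The minimization step can be sketched with SciPy, whose `least_squares` routine implements a trust-region reflective method (`method='trf'`). Everything here is an illustrative assumption of mine rather than the paper's actual implementation (which uses Matlab's *lsqnonlin*): a tiny four-mode Zernike basis, invented function names, and a generic `propagate` callable standing in for the angular spectrum operator.

```python
import numpy as np
from scipy.optimize import least_squares

def zernike_modes(n):
    """Four low-order modes (tilt x, tilt y, defocus, astigmatism) sampled
    on an n x n unit-disk grid. A sketch, not the full ordered basis."""
    y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
    r2 = x**2 + y**2
    disk = (r2 <= 1.0).astype(float)
    modes = [x, y, 2.0 * r2 - 1.0, x**2 - y**2]
    return [m * disk for m in modes], disk

def detector_amplitude(alpha, modes, disk, propagate):
    """|U(x, y, L)| computed from the Zernike coefficients alpha."""
    phase = sum(a * z for a, z in zip(alpha, modes))
    return np.abs(propagate(disk * np.exp(1j * phase)))

def retrieve_phase(measured_amp, modes, disk, propagate):
    """Trust-region reflective least-squares fit of the coefficients,
    starting from zero as in the paper."""
    residual = lambda a: (detector_amplitude(a, modes, disk, propagate)
                          - measured_amp).ravel()
    return least_squares(residual, x0=np.zeros(len(modes)),
                         method='trf').x
```

A far-field FFT stand-in such as `propagate = lambda U: np.fft.fft2(U) / U.shape[0]` is enough to exercise the fit; in practice the angular spectrum propagator of this section would be used.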

## 4. Simulation results

Using simulated intensity data, we compare the reconstructions obtained using the phase retrieval algorithm and the wavefront slope method. The simulations are consistent with the experimental setup we will describe in the next section. To facilitate the experiment, the problem parameters (i.e., the width 2*d* of the holes in the Hartmann array and the distance *L* between the array and detector) have been scaled to visible light of wavelength *λ* = 638 nm, such that the Fresnel number of the system is preserved. The scaled hole width is equivalent to 80 *μ*m at the EUV wavelength, and the spacing between two adjacent holes is 562.5 *μ*m. The distance between the array and the detector is chosen to be 50 mm (400 mm at the EUV wavelength) to obtain a well-defined spot size on the detector. The detector is assumed to have 1024 × 1024 pixels with a pixel size of 16.5 *μ*m. The spot due to one hole on the detector is about 150 *μ*m wide and covers about 15 × 15 pixels of the CCD camera, as shown in Fig. 2(a).

In our simulation we added Gaussian noise with the distribution measured during the characterization of the dark current of the CCD camera used in our experimental setup. The noise distribution (normalized to 1) shown in Fig. 2(b) is characterized by a mean value *μ* = 0.0076 and a standard deviation *σ* = 4 × 10^{−4}.
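This noise model is straightforward to reproduce; a sketch follows, in which the function name and the fixed random seed are my own choices.

```python
import numpy as np

rng = np.random.default_rng(seed=1)  # fixed seed for reproducibility

def add_dark_noise(intensity, mu=0.0076, sigma=4e-4):
    """Add the Gaussian dark-current model quoted in the text
    (mean 0.0076, std 4e-4 in normalized counts) and clip negatives,
    since a CCD cannot report negative counts."""
    noisy = intensity + rng.normal(mu, sigma, size=intensity.shape)
    return np.clip(noisy, 0.0, None)
```

Applied to a simulated 1024 × 1024 frame, this reproduces the measured dark-frame statistics before the diffraction pattern is added.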

The chosen phase distributions are derived from two real measurements of a deformable mirror with defocus and astigmatism, respectively, as characterized by a Zygo interferometer. Figure 3 shows the two phase distributions. The first phase distribution has the 5^{th} Zernike coefficient as its predominant term (the Malacara ordering [21] is used here to describe the Zernike spectrum) and will be called the “defocus distribution” (Figs. 3(a)–3(b)). The second distribution has a dominant 4^{th} coefficient and is called the “astigmatism distribution” (Figs. 3(c)–3(d)). The amplitude of the complex field is constant.

In Fig. 4, the reconstructed wavefront and the corresponding residual wavefront error obtained with the wavefront slope measurement are depicted for both the defocus and the astigmatism case. The rms wavefront error is 0.0024*λ* in the first case and 0.0023*λ* in the second case. These values have the same order of magnitude as those reported in Ref. [11] for the EUV range. The phase retrieval algorithm is implemented in Matlab^{®} using the built-in *lsqnonlin* routine [18, 19]. We optimize over the first 5 Zernike orders (15 modes). At the starting point all coefficients are zero.

Figure 5 shows the reconstructed wavefront and the residual wavefront error obtained by the phase retrieval algorithm, again for the defocus and astigmatism simulations. The reconstructed wavefront is clearly improved, as can also be seen from the comparison of the Zernike spectra for each case in Fig. 6. The smaller rms wavefront errors of 5.46 × 10^{−4}*λ* and 2.24 × 10^{−4}*λ*, respectively, correspond to an improvement of almost one order of magnitude in the rms wavefront error.

These results show that, with the simulated phase distributions, the simple evaluation of the wavefront slope is not sufficient to get an acceptable wavefront reconstruction (i.e., rms wavefront error of the order *λ*/100). In contrast, with the phase retrieval procedure we can get more information from the intensity pattern and the phase distribution of the field under test is correctly recovered.

## 5. Experiments

A sketch of our optical test bench is shown in Fig. 7. We use an intensity-stabilized semiconductor laser (Newport TLB-6304) with wavelength *λ* = 638 nm coupled into a single-mode fiber. The beam coming from the fiber is first collimated by a collimator lens and then clipped by an iris of 10 mm diameter. The iris is conjugated with a Spatial Light Modulator (SLM) (Holoeye LC-R 2500) that is used to generate a chosen phase distribution in the beam. The SLM consists of an LCOS display with a resolution of 1024 × 768 pixels over an image array of 19.5 × 14.6 mm^{2} and supports a digital video interface (DVI) signal. The phase shift induced by the SLM was characterized in situ on the same optical bench using a Shack-Hartmann wavefront sensor (Thorlabs WFS150-5C, 1.3 Mpixel, *λ*/50 rms accuracy). During the characterization, a strong astigmatism component was measured by the SH sensor at the SLM plane due to the non-perfect flatness of its surface (Fig. 8(a)). This measured phase distortion was then used to correct for the systematic error of the SLM [22] and to achieve a flatter wavefront, as shown in Fig. 8(b). Next, we measured the complex response (amplitude and phase) of the reflected beam as a function of the integer value between 0 and 255 (the gray level) applied to the pixels of the modulator (Fig. 8(c)).

The phase-modulated beam is then expanded by a factor of 2.5 by a beam expander (lenses L1 and L2) to illuminate uniformly the Hartmann grid, which is conjugated with the SLM. The Hartmann grid has the dimensions described in Section 4 and consists of a 1.5 mm thick glass substrate coated with a 200 *μ*m thick layer of chromium. The chromium mask was manufactured with an Electron Beam Pattern Generator (EBPG). The diffraction pattern at a distance of 50 mm is collected by a CCD camera (1280 × 960 pixels) through a camera lens (L3). Between the beam expander and the Hartmann grid, a beamsplitter is placed to direct half of the beam through a 4× beam reducer (with a clear aperture of 4.8 mm) to a Shack-Hartmann wavefront sensor (S-H WFS), which monitors the wavefront distribution in real time. The measurements taken with the latter device serve as a reference for comparison with the phase measured by the phase retrieval procedure.

We generate two phase distributions with the SLM, namely a tilt in the *x*-direction and defocus. The phase is measured with the Shack-Hartmann wavefront sensor and the results are shown in the Fig. 9.

These plots reveal that the phase distribution suffers from aberrations besides tilt or defocus, such as astigmatism and spherical aberration. These are due to the other optical components (i.e., beam splitters and spherical lenses) present in the beam path. Apart from these aberrations, the tilt and defocus introduced in the optical system, as measured by the Shack-Hartmann sensor, have P–V values of 0.225*λ* and 0.258*λ*, respectively. The phase retrieval results are presented in Figs. 10 and 11. An image of the intensity pattern to which the phase retrieval algorithm is applied is also shown, as well as the convergence of the optimization.

The results show other aberrations in addition to the aberration induced by the SLM. These are likely caused by the optics between the SLM and the wavefront sensor. Referring only to the retrieved Zernike coefficients for tilt and defocus, we measured in this case P–V values of 0.248*λ* and 0.285*λ*, in reasonable agreement with the measurement using the Shack-Hartmann sensor.

The convergence plots show that the algorithm reaches the minimum in only a few iterations (fewer than 10). The search reaches the optimum value for the first 5 Zernike orders (i.e., 15 different Zernike modes) in less than 5 minutes on a personal computer.

## 6. Conclusion

We have shown the feasibility of applying a phase retrieval algorithm to a Hartmann Wavefront Sensor as an instrument to measure the aberrations in an EUV lithography machine with sub-nanometer accuracy.

The limitations of the wavefront slope evaluation for the phase reconstruction by the usual centroid algorithm have been discussed and simulations were conducted to compare the centroid based phase reconstruction with the phase retrieval algorithm in the presence of realistic noise. Simulations show that the rms wavefront error decreases by an order of magnitude with respect to the traditional wavefront reconstruction based on the wavefront slope measurement.

An experimental setup with visible light has been built to demonstrate the phase retrieval method. The experimental results validate the simulations and show reasonable agreement with the phase measurement performed with a Shack-Hartmann sensor.

The results of this research can be applied to monitoring the wavefront in an EUV lithography machine, and eventually this information can be used to drive an adaptive optics system to achieve diffraction-limited image quality on the semiconductor wafer.

## Acknowledgments

This research is supported by the Dutch Ministry of Economic Affairs and the Provinces of Noord-Brabant and Limburg in the framework of the “Pieken in de Delta” program. We also thank Anja van Langen of the Kavli Nanolab for manufacturing the Hartmann mask.

## References and links

**1. **C. A. Mack, *Fundamental Principles of Optical Lithography: The Science of Microfabrication* (Wiley-Interscience, 2007). [CrossRef]

**2. **V. Bakshi, *EUV Lithography* (SPIE Press, 2009).

**3. **R. Saathof, Precision and Microsystem Engineering Dept., Delft University of Technology (private communication).

**4. **G. E. Sommargren, D. W. Phillion, M. A. Johnson, N. Q. Nguyen, A. Barty, F. J. Snell, D. R. Dillon, and L. S. Bradsher, “100-picometer interferometry for EUVL,” Proc. SPIE **4688**, 316–328 (2002). [CrossRef]

**5. **A. Polo, F. Bociort, S. F. Pereira, and H. P. Urbach, “Wavefront measurement for EUV lithography system through Hartmann sensor,” Proc. SPIE **7971**, 79712R (2011). [CrossRef]

**6. **L. A. Carvalho, “A simple and effective algorithm for detection of arbitrary Hartmann-Shack patterns.” J. Biomed. Inf. **37**, 1–9 (2004). [CrossRef]

**7. **C. Leroux and C. Dainty, “Estimation of centroid positions with a matched-filter algorithm: relevance for aberrometry of the eye,” Opt. Express **18**, 1197–206 (2010). [CrossRef] [PubMed]

**8. **W. H. Southwell, “Wave-front estimation from wave-front slope measurements,” J. Opt. Soc. Am. **70**, 998–1006 (1980). [CrossRef]

**9. **R. Shannon, R. Shack, J. Harvey, and R. Hooker, *Robert Shannon and Roland Shack: legends in applied optics*, Press Monograph (SPIE Press, 2005).

**10. **D. R. Neal, “Shack-Hartmann wavefront sensor precision and accuracy,” Proc. SPIE **4779**, 148–160 (2002). [CrossRef]

**11. **P. Mercère, P. Zeitoun, M. Idir, S. L. Pape, D. Douillet, X. Levecq, G. Dovillaire, S. Bucourt, K. A. Goldberg, P. P. Naulleau, and S. Rekawa, “Hartmann wave-front measurement at 13.4 nm with *λ*_{EUV}/120 accuracy,” Opt. Lett. **28**, 1534–1536 (2003). [CrossRef] [PubMed]

**12. **P. Mercère, M. Idir, J. Floriot, X. Levecq, A. Erko, T. Krist, and A. Michette, *Modern Developments in X-Ray and Neutron Optics* (Springer, Berlin, Heidelberg, 2008).

**13. **H. H. Barrett, C. Dainty, and D. Lara, “Maximum-likelihood methods in wavefront sensing: stochastic models and likelihood functions,” J. Opt. Soc. Am. A. **24**, 391–414 (2007). [CrossRef]

**14. **J. W. Goodman, *Introduction to Fourier optics* (Roberts and Company Publishers, 2005).

**15. **R. W. Gerchberg and W. O. Saxton, “A practical algorithm for the determination of the phase from image and diffraction plane pictures,” Optik **35**, 237–246 (1972).

**16. **J. R. Fienup, “Phase retrieval algorithms: a comparison,” Appl. Opt. **21**, 2758–69 (1982). [CrossRef] [PubMed]

**17. **J. R. Fienup, “Phase-retrieval algorithms for a complicated optical system,” Appl. Opt. **32**, 1737–1746 (1993). [CrossRef] [PubMed]

**18. **T. F. Coleman and Y. Li, “An interior trust region approach for nonlinear minimization subject to bounds,” SIAM J. Optim. **6**, 418–445 (1996). [CrossRef]

**19. **T. F. Coleman and Y. Li, “On the convergence of reflective Newton methods for large-scale nonlinear minimization subject to bounds,” Math. Program. **67**, 189–224 (1994). [CrossRef]

**20. **With the authorization of Imagine Optic, patent no. Eur 1415133 - US 7,255,442 - Jap 4212472.

**21. **D. Malacara, *Optical Shop Testing* (Wiley-Interscience, 2007). [CrossRef]

**22. **C. López-Quesada, J. Andilla, and E. Martín-Badosa, “Correction of aberration in holographic optical tweezers using a Shack-Hartmann sensor,” Appl. Opt. **48**, 1084–1090 (2009). [CrossRef]