
High-Speed Variable-Focus Optical System for Extended Depth of Field


IEEE International Symposium on Industrial Electronics (ISIE 2009), Seoul Olympic Parktel, Seoul, Korea, July 5-8, 2009

N. Mizoguchi, H. Oku and M. Ishikawa
Department of Information Physics and Computing, University of Tokyo, Tokyo, Japan
Email: [email protected]

Abstract— A high-speed camera is essential for observing dynamic phenomena. Such a camera needs a bright lens with a large aperture because of its short exposure time. Since the depth of field (DOF) becomes shallow when a bright lens is used, there is a demand for a high-speed camera with an extended depth of field. One commonly used technique for extending the DOF is to synthesize images captured at various focus positions. However, this method cannot be applied to a high-speed camera because no conventional mechanism can shift the focus position quickly enough. In this paper, we propose a new high-speed variable-focus optical system for extending the DOF. The proposed optical system is based on a high-speed variable-focus liquid lens with millisecond-order response time that was developed by our group. We show that our proposed system results in better light intensity than the method involving decreasing the diameter of the aperture stop. Furthermore, our experiment demonstrated that a 1000-fps movie with extended DOF was successfully acquired using the high-speed variable-focus optical system in combination with the image synthesizing technique.

I. INTRODUCTION

Observation of dynamic phenomena with a high-speed camera is a task that is frequently performed in engineering, biology, and medical science. When using a high-speed camera, the aperture stop in the optical system must have a sufficiently large diameter, because this kind of camera needs to capture a large amount of light during its short exposure time. However, increasing the diameter of the aperture stop reduces the depth of field (DOF). Consequently, in the fields mentioned above, there is a demand for a high-speed camera that can acquire video images with an extended depth of field.

Focus stacking is an image synthesizing technique for extending the depth of field of photographs. In focus stacking, for each pixel position, the most in-focus pixel is identified among images captured at various focus positions, and these pixels are combined into a single image with extended DOF. This image fusion method was studied by R. J. Pieper et al. [1] and P. Burt et al. [2], and a patent [3] based on the so-called Burt pyramid image analysis has been obtained. A freeware tool using focus stacking has also been developed [4], and it is commonly used by photographers.

Acquiring images that can be used for focus stacking requires an optical system that can shift the object-side focal point. For example, a microscope that uses a special actuator to move the objective lens and thereby shift the object-side focal point has been developed, but its operation frequency is low, and the frame rate of the extended-DOF movies that can be acquired is limited to 30 fps. Another microscope that shifts the object-side focal point using a variable-focus lens has been developed by Ohba et al. [5], but it cannot attain a frame rate high enough for a high-speed camera because the operation frequency of the variable-focus lens is limited to 150 Hz. Thus, our goal was to realize a method for extending the DOF that is suitable even when using a high-speed camera. Our target frame rate for the synthesized movie with extended DOF is 1000 fps.

In this paper, we propose a new high-speed variable-focus optical system for extending the DOF. The proposed optical system is based on a high-speed variable-focus liquid lens with millisecond-order response time that was developed by our group. We numerically show that our proposed system results in better light intensity than the method involving decreasing the diameter of the aperture stop. Furthermore, our experiment demonstrated that a 1000-fps movie with extended DOF was successfully acquired using the high-speed variable-focus optical system in combination with focus stacking.
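The per-pixel selection performed by focus stacking can be sketched as follows. This is a minimal illustration of the general technique only; the experiments in this paper use the Combine ZM tool [4], not this code, and the Laplacian sharpness measure, the smoothing window, and the use of NumPy/OpenCV are our own assumptions.

```python
import numpy as np
import cv2  # assumed available; any Laplacian/blur implementation would do


def focus_stack(frames):
    """Fuse a focal stack into one extended-DOF image.

    frames: list of grayscale images (arrays of identical shape), each
    focused at a different object distance.
    """
    stack = np.stack([img.astype(np.float32) for img in frames])

    # Local sharpness: absolute Laplacian response, smoothed so that the
    # per-pixel decision reflects a small neighbourhood, not a single pixel.
    sharpness = np.stack([
        cv2.GaussianBlur(np.abs(cv2.Laplacian(img, cv2.CV_32F)), (9, 9), 0)
        for img in stack
    ])

    # For every pixel, keep the value from the frame that is sharpest there.
    best = np.argmax(sharpness, axis=0)
    rows, cols = np.indices(best.shape)
    fused = stack[best, rows, cols]
    return np.clip(fused, 0, 255).astype(np.uint8)
```

A per-pixel argmax like this is the simplest variant; the pyramid-based fusion of [2] instead blends across scales rather than picking a single source frame for each pixel.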
II. DISCUSSION OF METHODS

As mentioned in the previous section, the method involving decreasing the diameter of the aperture stop is unsuitable for our purpose; we therefore considered the method of shifting the object-side focal point together with focus stacking. To synthesize a 1000-fps movie with this method, an optical system that can shift the object-side focal point back and forth periodically at 500 Hz is needed.

One way of moving the object-side focal point of an optical system is to move a lens with an actuator; a piezo actuator or a voice coil motor is often used for this purpose. The stroke of a piezo actuator, however, is very short, typically about ten micrometers, so a mechanical amplifier is needed to amplify the stroke. A piezo actuator combined with such a mechanical amplifier is called a piezo positioner, and its operation frequency is limited to about 20 Hz by the eigenfrequency of the amplifier. The operation frequency of a voice coil motor moving the mass of a lens is likewise limited to about 80 Hz [6].

Another possibility is to use a variable-focus lens. A variable-focus lens is an optical device whose focal length can be changed by deforming the physical lens or by exploiting an electro-optical property of a material, and such lenses have been extensively researched in recent years [7][8][9]. Since one variable-focus lens, called the Dynamorph Lens, can control its focal length on a millisecond time scale [9], such a lens is suitable for our goal.

Wavefront coding [10] is another method for extending the DOF. It uses a special phase filter as part of the optical system rather than focus stacking. However, this approach has some problems: the amount of DOF extension is fixed, deconvolution noise is inevitable, and the image projected by the optical system cannot be observed until the digital system has processed it.

Another method that does not involve focus stacking is to use a microlens array. With this method, however, a high-spatial-resolution imager is required to obtain an image of sufficient quality and spatial resolution. Because such an imager requires much time to transfer the image data, this method is not suitable for a high-speed camera.

For these reasons, to extend the DOF we propose a method that uses a variable-focus optical system, built around a variable-focus lens that shifts the object-side focal point, in combination with focus stacking.
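The 500-Hz figure follows directly from the target frame rate. The short sketch below makes this timing arithmetic explicit and also computes how many raw frames an 8000-fps camera (the capture rate used later in Section IV) contributes to each synthesized frame; the function names are our own.

```python
def sweep_frequency(target_fps: float) -> float:
    """Drive frequency of the focal sweep.

    One synthesized frame is built from the images captured during one
    one-way sweep of the object-side focal point, and a periodic drive
    (e.g. a triangular wave) produces two one-way sweeps per cycle, so
    the drive frequency is half of the target synthesized frame rate.
    """
    return target_fps / 2.0


def frames_per_sweep(camera_fps: float, drive_hz: float) -> float:
    """Raw camera frames captured during one one-way sweep."""
    return camera_fps / (2.0 * drive_hz)


if __name__ == "__main__":
    target_fps = 1000.0   # target synthesized frame rate (the paper's goal)
    camera_fps = 8000.0   # high-speed camera setting used in Section IV
    drive_hz = sweep_frequency(target_fps)                    # -> 500 Hz
    print(drive_hz, frames_per_sweep(camera_fps, drive_hz))   # -> 500.0, 8.0
```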
III. THEORY

In this section, we describe the principle of our proposed system using paraxial optics and a thin-lens model, and show that it is superior in light intensity to the method involving decreasing the diameter of the aperture stop. First, we treat the DOF of the optical system theoretically to derive the conditions required for image recording. Then, we derive the aperture-stop diameter required to reach the same DOF as the proposed system. Finally, we compare the light intensity of the proposed system with that of the method involving decreasing the diameter of the aperture stop.

A. Depth of Field

Fig. 1. The principle of DOF.

In this subsection, a quantitative treatment of the DOF is given. In Fig. 1, f is the focal length, s is the distance between the object and the lens, v is the distance between the imager and the lens, d is the diameter of the entrance pupil, N is the F-number of the lens, c is the diameter of the acceptable circle of confusion, D_N and D_F are the distances to the nearest and farthest object points, respectively, whose rays are concentrated within the acceptable circle of confusion, and v_N and v_F are the distances between the respective focus points and the lens. D_N and D_F are given by

$$D_N = \frac{s f^2}{f^2 + N c (s - f)} \qquad (1)$$

and

$$D_F = \frac{s f^2}{f^2 - N c (s - f)}. \qquad (2)$$

(For a detailed derivation, see Appendix A.) The difference D_F − D_N is the DOF. When s is the hyperfocal distance H = f^2/(Nc) + f, D_F and the DOF become infinite, but D_N still follows (1).

B. Principle of Our Proposed System

Fig. 2. The principle of our proposed system.

In this subsection, the principle of our proposed system and of the method involving decreasing the diameter of the aperture stop are explained. We call the latter method "the direct method" below, and we call the case where images are acquired under the initial conditions, before extending the DOF, "the original case".

First, we explain the principle of our proposed system. Here, d is the diameter of the entrance pupil, f(0) is the initial focal length, s(0) is the initial object distance, v is the initial image distance, and D_N(0) and D_F(0) are the initial values of D_N and D_F, respectively; d and v are fixed. First, the imager is exposed for a time t with the focus at s(0). Then, f is changed to satisfy D_F(1) = D_N(0), which shifts the focus from s(0) to s(1); we assume that the time needed to change f is negligible. The imager is then exposed for a time t again. Next, f is changed to satisfy D_F(2) = D_N(1), and so on. Repeating this operation n times gives (n + 1) images, and applying focus stacking to them extends the DOF to cover the range from D_N(n) to D_F(0). This is the principle of our proposed system.

Second, we explain the direct method. In this method, d is changed to d' and f to f' so as to obtain the same DOF as that of our proposed system. The imager is then exposed for a time (n + 1)t, and one image is acquired. d' is given by

$$d' = \frac{d}{n + 1}. \qquad (3)$$

(For a detailed derivation, see Appendix B.) Finally, in the original case, the imager is exposed for a time (n + 1)t while keeping f and d fixed at the initial conditions.

C. Illumination Intensity

In this subsection, we show that our proposed system attains a higher light intensity than the direct method. The light intensity E is inversely proportional to the square of the working F-number N_w = (1 − m) f / d, where m = −v/s is the linear magnification. Eliminating s in m by using the thin-lens formula ((14) in Appendix A) and substituting m into N_w gives

$$N_w = \frac{v}{d}. \qquad (4)$$

In our proposed system, v and d are fixed, so N_w is also fixed. Because the exposure time per image in our proposed system is 1/(n + 1) of that of the original case, the light intensity of our system, E_f, is given by

$$E_f = \frac{E_0}{n + 1}, \qquad (5)$$

where E_0 is the light intensity of the original case. According to (3), the working F-number of the direct method, N_d, is given by

$$N_d = \frac{v}{d'} = (n + 1)\frac{v}{d} = (n + 1) N_w. \qquad (6)$$

Accordingly, the light intensity of the direct method, E_d, is given by

$$E_d = \frac{E_0}{(n + 1)^2}. \qquad (7)$$

Therefore,

$$E_f = (n + 1) E_d. \qquad (8)$$

Namely, the light intensity of our proposed system is always (n + 1) times that of the direct method.

In this section, we described the principle of our proposed system using paraxial optics and a thin-lens model. A real lens, however, has a finite thickness, and rays that deviate from the optical axis do not strictly follow paraxial optics. The effect of the lens thickness can easily be compensated for in most cases, and most practical lenses satisfy the sine condition [12], so the paraxial approximation is adequate.
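To make the focus-stepping rule and the light-intensity comparison concrete, here is a small numerical sketch. The focal length, object distance, pupil diameter, and circle-of-confusion values are illustrative assumptions, not the parameters of the system built in Section IV. It steps the focus so that D_F(i) = D_N(i−1), using (18) and (19) from Appendix B, and reports the intensity ratio given by (5) and (7).

```python
from dataclasses import dataclass


@dataclass
class FocusState:
    f: float   # focal length [mm]
    s: float   # object distance [mm]


def dof_limits(f, s, d, c):
    """Near and far DOF limits D_N, D_F from Eqs. (1) and (2), with N = f/d."""
    n_number = f / d
    d_near = s * f**2 / (f**2 + n_number * c * (s - f))
    d_far = s * f**2 / (f**2 - n_number * c * (s - f))
    return d_near, d_far


def next_state(d_far_target, d, v, c):
    """Focal length and object distance whose far DOF limit equals d_far_target.

    Uses Eqs. (18) and (19) of Appendix B:
    s = d*v*D_F / (d*v + c*D_F) and f = s*v / (s + v).
    """
    s = d * v * d_far_target / (d * v + c * d_far_target)
    f = s * v / (s + v)
    return FocusState(f=f, s=s)


if __name__ == "__main__":
    # Illustrative values only (not the paper's design parameters).
    d, c = 10.0, 0.02          # entrance pupil and circle of confusion [mm]
    f0, s0 = 50.0, 500.0       # initial focal length and object distance [mm]
    v = s0 * f0 / (s0 - f0)    # image distance from the thin-lens formula, kept fixed
    n = 7                      # number of focus steps -> n + 1 images per frame

    state = FocusState(f=f0, s=s0)
    d_near, d_far0 = dof_limits(state.f, state.s, d, c)
    for _ in range(n):
        state = next_state(d_near, d, v, c)     # enforce D_F(i) = D_N(i-1)
        d_near, _ = dof_limits(state.f, state.s, d, c)

    print(f"stacked DOF covers {d_near:.1f} mm to {d_far0:.1f} mm")
    # Light-intensity comparison, Eqs. (5) and (7): E_f/E_d = n + 1.
    print(f"E_f / E_d = {n + 1:d}")
```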
IV. EXPERIMENTS AND RESULTS

To confirm the superiority of our proposed method shown above, we developed an actual optical system designed on the basis of the proposed method. In this section, we describe the principle of the DML, the configuration of the developed optical system, the experimental method, and the experimental results.

A. Dynamorph Lens (DML)

Fig. 3. Photograph of the DML.

Fig. 4. Schematic cross-sectional view of the ideal structure of the DML.

The DML is a high-speed variable-focus liquid lens with a step response of 2 ms and a high resolution in image space of 64 cycles/mm. A photograph of the DML is shown in Fig. 3, and its operating principle is illustrated in Fig. 4. The focal length of the DML can be changed by supplying a voltage to a piezo actuator, so an optical system in which the DML is installed can shift the object-side focal point periodically by supplying a suitable voltage waveform. The DML uses an interface between two immiscible liquids as a refractive surface, and the radius of curvature of this interface is controlled by the liquid pressure. As shown in Fig. 4, we used polydimethylsiloxane (PDMS; refractive index 1.40) and ultrapure water (refractive index 1.33) as the two liquids in this research.

B. Optical System

Fig. 5. Optical system.

Fig. 6. Object.

When designing the optical system, the values of its parameters, such as the focal length, were not restricted, provided that the object distance did not exceed the hyperfocal distance. We designed the optical system illustrated in Fig. 5. The object consisted of six pieces of string and a plastic background, as illustrated in Fig. 6.

C. Performance

Fig. 7. The input wave to the piezo actuator (broken line) and the position of the tip of the piezo actuator (solid line). The latter was measured by the position sensor in the piezo actuator.

We examined whether the constructed optical system could achieve the target frame rate of 1000 fps. To acquire a 1000-fps movie with this optical system, it is necessary to shift the object-side focal point from one end of the object to the other in 1 ms. To achieve this, a 500-Hz triangular wave was input to the piezo actuator. The input triangular wave and the corresponding position of the tip of the piezo actuator are shown in Fig. 7. The position of the tip of the piezo actuator followed a curved trajectory delayed by about 400 µs from the input triangular wave, and the interface in the DML then changed with a delay on the order of microseconds. This delay could only be confirmed by observing the images acquired by the high-speed camera, because there is no sensor in the DML, and it determines the frame rate limit of the images synthesized by focus stacking.
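The paper does not give the waveform-generation details; as a hedged illustration of the 500-Hz triangular drive described above, the sketch below samples such a signal as it might be sent to a DAC driving the piezo amplifier. The sample rate and the ±5 V span are placeholder assumptions.

```python
import numpy as np


def triangular_wave(freq_hz: float, amplitude_v: float,
                    sample_rate_hz: float, duration_s: float) -> np.ndarray:
    """Sampled symmetric triangular wave centred on 0 V.

    freq_hz: sweep frequency of the focal scan (500 Hz in the experiment).
    amplitude_v: peak voltage of the drive signal (placeholder value below).
    """
    t = np.arange(0.0, duration_s, 1.0 / sample_rate_hz)
    phase = (t * freq_hz) % 1.0                # position within one period
    tri = 4.0 * np.abs(phase - 0.5) - 1.0      # triangle spanning -1 .. +1
    return amplitude_v * tri


if __name__ == "__main__":
    # 10 ms of a 500-Hz triangle sampled at 100 kHz; ±5 V is an assumed span.
    wave = triangular_wave(500.0, 5.0, 100_000.0, 0.01)
    print(wave.shape)   # (1000,) samples, i.e. 5 full sweep periods
```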
The 8000-fps images acquired by the high-speed camera (Phantom V4.3, Vision Research) are shown in Figs. 8 and 9. In Fig. 8, the object-side focal point was being shifted away from the camera, and in Fig. 9, towards the camera. The piezo actuator performed this operation at 500 Hz, so applying focus stacking to each of the two sets of images produced a 1000-fps movie with extended DOF. The two images synthesized by focus stacking are shown in Fig. 10; Combine ZM [4] was used to apply focus stacking to the images. As shown in Fig. 10, it was confirmed visually that the DOF was extended to cover the entire range of the object.

Fig. 8. Images acquired while the object-side focal point was being shifted away from the camera.

Fig. 9. Images acquired while the object-side focal point was being shifted towards the camera.

Fig. 10. The two images synthesized by focus stacking (left: image synthesized from the images in Fig. 8; right: image synthesized from those in Fig. 9).

D. Acquired Movie with Extended DOF

Fig. 11. 1000-fps movie of a moving object (top: images synthesized by focus stacking; bottom: unsynthesized images).

Finally, a 1000-fps movie of a moving object with extended DOF was acquired with the optical system shown in Fig. 5. The settings, such as the input to the piezo actuator and the frame rate, were the same as those used in the previous subsection. Fig. 11 shows the images synthesized by focus stacking together with unsynthesized images for comparison; Combine ZM was again used for the focus stacking. As shown in Fig. 11, it was confirmed visually that the DOF was extended to cover the entire range of the object. Thus, our experiments demonstrated that the high-speed variable-focus optical system can acquire a 1000-fps movie with extended DOF by employing focus stacking.

V. CONCLUSION

In this paper, we have proposed a new high-speed variable-focus optical system for extending the depth of field (DOF). The proposed optical system is based on a high-speed variable-focus liquid lens with millisecond-order response time developed by our group. From a paraxial-optics analysis, we showed that the proposed system results in better light intensity than the method involving decreasing the diameter of the aperture stop. Furthermore, our experiments demonstrated that a 1000-fps movie with extended DOF was successfully acquired using the high-speed variable-focus optical system in combination with focus stacking.

APPENDIX A

First, a derivation of the DOF expressions (1) and (2) is given. From the similarity of triangles in the image space in Fig. 1,

$$\frac{c}{d} = \frac{v_N - v}{v_N} \qquad (9)$$

and

$$\frac{c}{d} = \frac{v - v_F}{v_F}. \qquad (10)$$

Substituting the definition of the F-number,

$$N = \frac{f}{d}, \qquad (11)$$

into (9) and (10) gives

$$v_N = \frac{f v}{f - N c} \qquad (12)$$

and

$$v_F = \frac{f v}{f + N c}. \qquad (13)$$

The thin-lens formula [11] is

$$\frac{1}{s} + \frac{1}{v} = \frac{1}{f}. \qquad (14)$$

According to this formula, v_N and v_F are also given by

$$v_N = \frac{D_N f}{D_N - f} \qquad (15)$$

and

$$v_F = \frac{D_F f}{D_F - f}. \qquad (16)$$

Substituting these two equations into (12) and (13), respectively, gives (1) and (2).
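As an independent cross-check of the Appendix A algebra (our own, not part of the paper), the following SymPy sketch verifies symbolically that combining (12)-(16) with the thin-lens formula (14) reproduces the closed forms (1) and (2).

```python
import sympy as sp

f, s, c, d = sp.symbols("f s c d", positive=True)

N = f / d                       # Eq. (11): F-number
v = s * f / (s - f)             # image distance from the thin-lens formula, Eq. (14)

# Image-side focus positions of the near/far limits, Eqs. (12) and (13).
v_N = f * v / (f - N * c)
v_F = f * v / (f + N * c)

# Map them back to object distances with the thin-lens formula (Eqs. (15)/(16) inverted).
D_N = v_N * f / (v_N - f)
D_F = v_F * f / (v_F - f)

# Expected closed forms, Eqs. (1) and (2).
D_N_expected = s * f**2 / (f**2 + N * c * (s - f))
D_F_expected = s * f**2 / (f**2 - N * c * (s - f))

# cancel() reduces each difference of rational functions to canonical form.
assert sp.cancel(D_N - D_N_expected) == 0
assert sp.cancel(D_F - D_F_expected) == 0
print("Eqs. (1) and (2) reproduced")
```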
APPENDIX B

Next, a derivation of (3) is given. For the i-th change of focal length, D_F(i) = D_N(i − 1). According to (2) and (11),

$$D_F(i) = \frac{s(i)\, f(i)^2}{f(i)^2 - \dfrac{f(i)}{d}\, c\, \bigl(s(i) - f(i)\bigr)}. \qquad (17)$$

Eliminating f(i) by using this equation and (14) gives

$$s(i) = \frac{d v D_F(i)}{d v + c D_F(i)}. \qquad (18)$$

According to (14),

$$f(i) = \frac{s(i)\, v}{s(i) + v}. \qquad (19)$$

First, we consider the case where

$$s(0) < H = \frac{f^2}{N c} + f. \qquad (20)$$

Then s' is derived from D_F(0) and D_N(n), because D_F(0) is not infinite. According to (1) and (2),

$$\frac{1}{D_N} + \frac{1}{D_F} = \frac{2}{s}. \qquad (21)$$

Thus,

$$s = \frac{2 D_N D_F}{D_N + D_F}. \qquad (22)$$

The case s(0) > H can be neglected here because it is inconsistent with extending the DOF. When s(0) = H, s' is also at the hyperfocal distance, so s' = H' = 2 D_N(n). Eliminating f in (1) by using (14) gives

$$d = \frac{c s D_N}{v (s - D_N)}. \qquad (31)$$

Substituting s = 2 D_N into this equation gives

$$d = \frac{2 c D_N}{v}. \qquad (32)$$

Applying this equation to d' gives

$$d' = \frac{2 c D_N(n)}{v}. \qquad (33)$$

Then, substituting s(0) = 2 D_N(0) into (28) gives

$$D_N(i) = \frac{2 d v D_N(0)}{d v + 2 c (2 i + 1) D_N(0)}. \qquad (34)$$

Substituting this equation into (33) and rearranging gives

$$\bigl(d v + 2 c (2 n + 1) D_N(0)\bigr)\, v\, d' = 4 c v D_N(0)\, d. \qquad (35)$$

Substituting (32) for d on the left-hand side of this equation, rearranging, and replacing i by n gives (3).

REFERENCES

[1] R. J. Pieper and A. Korpel, "Image processing for extended depth of field," Appl. Opt., vol. 22, pp. 1449-1453, 1983.
[2] P. Burt and R. J. Kolczynski, "Enhanced image capture through fusion," in Proc. International Conference on Computer Vision (ICCV), pp. 173-182, 1993.
[3] E. H. Adelson, "Depth-of-focus imaging process method," U.S. Patent 4,661,986, 1987.
[4] http://www.hadleyweb.pwp.blueyonder.co.uk/CZM/combinezm.htm
[5] K. Ohba, J. Carlos, P. Ortega, K. Tanie, M. Tsuji, et al., "Microscopic vision system with all-in-focus and depth images," Machine Vision and Applications, vol. 15, pp. 55-62, Dec. 2003.
[6] J. Wals, J. Dovic, A. J. Niessen, M. Rieck, and R. M. G. Rijs, "Fast-access optical drive," Japanese J. Appl. Phys. Part 1, vol. 39, pp. 862-866, 2000.
[7] B. Berge, "Liquid lens technology: principle of electrowetting based lenses and applications to imaging," in Proc. MEMS 2005 Conference, 2002.
[8] K. Tanie, G. Rin, R. Dangi, Y. Takei, T. Kaneko, et al., "Real-time micro VR camera system," in Proc. Fourth Asian Conference on Computer Vision, pp. 503-513, 2000.
[9] H. Oku and M. Ishikawa, "High-speed liquid lens with 2 ms response and 80.3 nm root-mean-square wavefront error," Applied Physics Letters, 2009 (to appear).
[10] E. R. Dowski, Jr., and W. T. Cathey, "Extended depth of field through wave-front coding," Appl. Opt., vol. 34, pp. 1859-1866, 1995.
[11] W. J. Smith, Modern Optical Engineering, 3rd ed. McGraw-Hill, 2000.
[12] T. Tsuruta, Oyo-kogaku (Applied Optics) I. Baifukan, Tokyo, 1990 (in Japanese).