Transcript

US 20040208346 A1

(19) United States
(12) Patent Application Publication — Baharav et al.
(10) Pub. No.: US 2004/0208346 A1
(43) Pub. Date: Oct. 21, 2004

(54) SYSTEM AND METHOD FOR MULTIPLEXING ILLUMINATION IN COMBINED FINGER RECOGNITION AND FINGER NAVIGATION MODULE

(76) Inventors: Izhak Baharav, San Jose, CA (US); Travis N. Blalock, Charlottesville, VA (US); Akihiro Machida, Cupertino, CA (US); George E. Smith, Cupertino, CA (US); Jin Kiong Ang, Bayan Lepas (MY)

Correspondence Address: AGILENT TECHNOLOGIES, INC., Legal Department, DL429, Intellectual Property Administration, P.O. Box 7599, Loveland, CO 80537-0599 (US)

(21) Appl. No.: 10/418,968
(22) Filed: Apr. 18, 2003

Publication Classification

(51) Int. Cl.7: G06K 9/00
(52) U.S. Cl.: 382/124

(57) ABSTRACT

An apparatus for imaging a fingerprint operates in a selected mode to provide a finger recognition and/or a finger navigation application. At least one light source illuminates at least one partition of a finger interface upon which a user places a finger (or thumb). Light reflected from the finger is captured by at least one image sensing region aligned to receive reflected light emitted from the illuminating light source(s). The illuminating light sources are selected depending on the selected mode of operation to illuminate only those partitions of the finger interface necessary for operation of the selected mode.

[Front-page representative drawing: swipe module 100 — finger 20 on swipe surface 115 of swipe interface 110, surface coupling lens 150, prism 155 (surface 158), optics 130, prism 135 (surface 138), sensor 140.]
[Sheet 1 of 13 — FIG. 1: block diagram of imaging system 10: finger 20 on swipe interface 110, light source 120, reflected light 128, optics 130, sensor 140, image data 50 passed to processor 210 and mode switch 220 of image processing system 200. FIG. 2: side view of swipe module 100: swipe surface 115 on swipe interface 110, surface coupling lens 150, prism 155 (surface 158), optics 130, prism 135 (surface 138), sensor 140.]
[Sheet 2 of 13 — FIG. 3A: cross-section of the swipe module: fingerprint swipe interface 110 (thickness 112, top surface 115), light 125 from LED 120, microlens array 170, coupling lenses 150, imaging sensor 140 with photo detectors 145 on substrate 180. FIG. 3B: fields of view 190a-190c of microlenses 175a-175c imaging fingerprint ridge 25a and valley 25b of finger 20 onto photo detectors 145a-145c.]
[Sheet 3 of 13 — FIG. 4: photo detector array 148 of the image sensor, with finger recognition photo detectors 145c and finger navigation photo detectors 145d. FIG. 8: flow chart — mode selection (600), image acquisition (620), determine Δx, Δy; if in finger recognition mode, provide Δx, Δy and the image to the fingerprint matching application; otherwise provide Δx, Δy to the other application.]
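The FIG. 8 flow can be sketched as a small dispatcher. This is a minimal illustration, not the patent's implementation: the function and callback names (`acquire_image`, `track`, and so on) are hypothetical, and the downstream applications are supplied by the caller.

```python
# Sketch of the FIG. 8 multi-mode flow: acquire an image, determine the
# displacement (dx, dy), then route the results according to the mode.
# All names here are illustrative, not from the patent.

FINGER_NAVIGATION = "navigation"
FINGER_RECOGNITION = "recognition"

def run_selected_mode(mode, acquire_image, track, match_fingerprint, navigate):
    """One pass of the FIG. 8 flow.

    acquire_image() -> image; track(image) -> (dx, dy);
    match_fingerprint(dx, dy, image) and navigate(dx, dy) are the
    downstream applications the flow chart hands its results to.
    """
    image = acquire_image()          # image acquisition (620)
    dx, dy = track(image)            # determine dx, dy
    if mode == FINGER_RECOGNITION:   # branch on finger recognition mode
        return match_fingerprint(dx, dy, image)  # dx, dy and image to matcher
    return navigate(dx, dy)          # dx, dy to other application
```

In use, `match_fingerprint` would be the stitching/matching pipeline of FIGS. 13-15 and `navigate` a cursor-control routine.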
[Sheet 4 of 13 — FIGS. 5A-5C: swipe module with multiplexed illumination of the swipe interface (drawing only; no text labels captured).]
[Sheet 5 of 13 — FIGS. 6A and 6B: size reduction of the imaging system using multiplexed illumination — swipe interface 110 (9 mm) over optics 130 and sensor(s) 140, with 3 mm and 1 mm dimensions marked.]
[Sheet 6 of 13 — FIG. 7A: block diagram of hardware and processing components of the imaging system (text labels unrecoverable).]
[Sheet 7 of 13 — FIG. 7B: block diagram of the multiplexed imaging system: on/off switch driving LEDs 120a-120c, sensors 140a-140c with image timing control, and mode switch 220. FIG. 9: flow chart — mode selection (600); if in finger navigation mode, select the center light source and illuminate the center of the sensing area; otherwise select all light sources and illuminate the entire sensing area; then image acquisition (610).]
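The FIG. 9 multiplexing decision can be sketched as follows. The `LightSource` class and the convention that the middle element of the list is the center partition are illustrative assumptions, not details from the patent.

```python
# Sketch of the FIG. 9 multiplexing flow: in finger navigation mode only the
# center light source is lit (center partition of the sensing area); in any
# other mode all light sources are lit before image acquisition.
# The class and names are illustrative, not from the patent.

class LightSource:
    def __init__(self, name):
        self.name = name
        self.on = False

def select_illumination(mode, sources):
    """Return the light sources to energize for the given mode.

    sources is an ordered sequence of LightSource objects covering the
    partitions of the swipe interface; the middle element is taken as
    the center partition.
    """
    if mode == "navigation":
        selected = [sources[len(sources) // 2]]   # center light source only
    else:
        selected = list(sources)                  # all light sources
    for src in sources:
        src.on = src in selected                  # illuminate selected partitions
    return selected
```

Lighting only the center partition in navigation mode is what yields the power savings the specification attributes to multiplexed illumination.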
[Sheet 8 of 13 — FIG. 10: flow chart 700 — acquire reference frame (710), acquire current frame (720), shift frame in one direction (730), compute correlation in shifted direction, repeat while more shift directions remain, determine direction with highest correlation (750), determine Δx, Δy (760).]
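The FIG. 10 loop can be sketched as a toy correlation search. Pure Python over nested lists; the frame format, the one-pixel search window, and the sum-of-products correlation are illustrative assumptions, not the patent's actual tracking algorithm.

```python
# Toy version of the FIG. 10 tracking loop: shift the current frame by each
# candidate (dx, dy), correlate the overlap with the reference frame, and
# keep the shift with the highest correlation.

def correlation(ref, cur, dx, dy):
    """Sum of products over the region where cur shifted by (dx, dy) overlaps ref."""
    rows, cols = len(ref), len(ref[0])
    total = 0
    for y in range(rows):
        for x in range(cols):
            sy, sx = y + dy, x + dx
            if 0 <= sy < rows and 0 <= sx < cols:
                total += ref[y][x] * cur[sy][sx]
    return total

def track(ref, cur, max_shift=1):
    """Return the (dx, dy) displacement maximizing the correlation (steps 750/760)."""
    best, best_score = (0, 0), float("-inf")
    for dy in range(-max_shift, max_shift + 1):   # steps 730/740: try each shift
        for dx in range(-max_shift, max_shift + 1):
            score = correlation(ref, cur, dx, dy)
            if score > best_score:
                best, best_score = (dx, dy), score
    return best
```

A production tracker would normalize the correlation and search a larger window at the high frame rates the specification mentions for navigation mode.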
[Sheet 9 of 13 — FIG. 11: schematic view of selected steps of FIG. 10 (position labels; text otherwise unrecoverable).]
[Sheet 10 of 13 — FIG. 12: portion of a typical fingerprint (30). FIG. 13: flow chart 800 — image acquisition (810), minutiae extraction (820), alignment (830), minutiae matching (840), output match results.]
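The FIG. 13 pipeline can be sketched with a toy matcher. Real systems extract ridge endings and bifurcations from the stitched image; here minutiae are bare (x, y) points, alignment is a single centroid translation, and the names are invented for illustration — the patent describes only the flow, not this code.

```python
# Toy sketch of the FIG. 13 finger recognition flow (800):
# minutiae extraction -> alignment (830) -> minutiae matching (840).

def align(template, candidate):
    """Translate candidate minutiae so their centroid matches the template's."""
    tx = sum(x for x, _ in template) / len(template)
    ty = sum(y for _, y in template) / len(template)
    cx = sum(x for x, _ in candidate) / len(candidate)
    cy = sum(y for _, y in candidate) / len(candidate)
    return [(x + tx - cx, y + ty - cy) for x, y in candidate]

def match_score(template, candidate, tol=1.0):
    """Fraction of template minutiae with an aligned candidate point within tol."""
    aligned = align(template, candidate)
    hits = 0
    for x, y in template:
        if any(abs(x - ax) <= tol and abs(y - ay) <= tol for ax, ay in aligned):
            hits += 1
    return hits / len(template)   # match result to output
```

A threshold on this score would produce the "output match results" step; real matchers also use minutia orientation and handle rotation, not just translation.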
[Sheet 11 of 13 — FIG. 14: successive images A (400) and C (410) with overlap regions. FIG. 15: flow chart 850 — acquire initial image, store image, acquire later image, determine overlap area, store remaining portion of later image overwriting overlap area, repeat while more images remain, output final stitched image (880).]
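The FIG. 15 stitching flow can be sketched in a few lines. Images are modeled as lists of rows and the tracked displacement is taken as a whole number of rows — both simplifying assumptions for illustration, not the patent's implementation.

```python
# Toy sketch of the FIG. 15 stitching flow (850): keep the initial image,
# then for each later image use the tracked displacement dy (rows moved)
# to find the overlap with what is already stored and append only the new
# rows, overwriting nothing but the notional overlap area.

def stitch(frames, displacements):
    """frames[0] is the initial image; displacements[i] is the dy (in rows)
    between frames[i] and frames[i + 1] reported by the tracking algorithm."""
    stitched = list(frames[0])                     # acquire and store initial image
    for frame, dy in zip(frames[1:], displacements):
        overlap = len(frame) - dy                  # determine overlap area
        stitched.extend(frame[overlap:])           # store remaining portion
    return stitched                                # output final stitched image (880)
```

This is why, as the specification notes, recognition mode tolerates minimal overlap: each later frame only needs enough shared rows for the tracker to measure dy reliably.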
[Sheet 12 of 13 — FIG. 16A: finger strokes 90a, 90b on swipe interface 110 against time references A and B. FIG. 17: flow chart 900 — acquire reference frame (910), acquire current frame (920), determine Δx, Δy values (930), determine character using Δx, Δy values (960), display character.]
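One way to realize the "determine character" step of FIG. 17 is to quantize the per-frame Δx, Δy values into stroke directions and look the direction sequence up in a table. The quantization scheme and the character table are entirely invented here; the patent specifies only the flow, not a recognition method.

```python
# Toy sketch of the FIG. 17 stylus flow (900): per-frame (dx, dy) values from
# the tracking algorithm are quantized into stroke directions, and the
# direction sequence is looked up in a (hypothetical) character table.

def quantize(dx, dy):
    """Reduce one displacement to a 4-way stroke direction (dominant axis)."""
    if abs(dx) >= abs(dy):
        return "right" if dx >= 0 else "left"
    return "down" if dy >= 0 else "up"

def recognise_character(motions, table):
    """Collapse per-frame motions into a direction sequence and look it up."""
    strokes = []
    for dx, dy in motions:                     # steps 910-930: dx, dy per frame
        d = quantize(dx, dy)
        if not strokes or strokes[-1] != d:    # merge repeated directions
            strokes.append(d)
    return table.get(tuple(strokes), "?")      # step 960: determine character
```

For example, a down stroke followed by a right stroke could map to "L"; a practical system would need a far richer stroke alphabet.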
[Sheet 13 of 13 — FIG. 18: blinking-mode block diagram — notification 245 into message indicator logic 240, driving on/off switch 230 and LED 120 under swipe interface 110. FIGS. 19A and 19B: perspective views of the swipe module package — swipe interface 110 in top surface 560 of swipe housing 550; FIG. 19B adds groove 565.]
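The FIG. 18 blinking mode reduces to a small state machine: while a notification is pending, the indicator logic toggles the same LED that normally illuminates the swipe interface. The class below is an illustrative sketch of that behavior, not hardware-accurate code.

```python
# Sketch of the FIG. 18 blinking mode: message indicator logic (240) drives
# the on/off switch (230) so the illumination LED (120) doubles as a
# message indicator light.

class BlinkingIndicator:
    def __init__(self):
        self.led_on = False
        self.pending_messages = 0

    def notify(self):
        """A notification (245) arrives: start indicating."""
        self.pending_messages += 1

    def clear(self):
        """Messages have been read: stop blinking and switch the LED off."""
        self.pending_messages = 0
        self.led_on = False

    def tick(self):
        """Periodic timer: toggle the LED while messages are pending."""
        if self.pending_messages > 0:
            self.led_on = not self.led_on   # on/off switch toggles the LED
        return self.led_on
```

Reusing the illumination LED this way fits the specification's goal of minimizing part count and power on small portable devices.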
SYSTEM AND METHOD FOR MULTIPLEXING ILLUMINATION IN COMBINED FINGER RECOGNITION AND FINGER NAVIGATION MODULE

CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This U.S. Nonprovisional application for patent is related by subject matter to copending and commonly assigned U.S. Nonprovisional applications for patent Ser. No. (Attorney Docket No. 10020683-1) and Ser. No. (Attorney Docket No. 10020685-1) filed on even date herewith. U.S. Nonprovisional applications for patent Ser. No. (Attorney Docket No. 10020683-1) and Ser. No. (Attorney Docket No. 10020685-1) are hereby incorporated by reference in their entirety.

BACKGROUND OF THE INVENTION

[0002] 1. Technical Field of the Invention

[0003] The present invention relates generally to the field of image acquisition. More specifically, the present invention relates to systems and methods for obtaining and processing images of fingerprints for navigation and recognition purposes.

[0004] 2. Description of Related Art

[0005] The use of fingerprints for identification purposes can be dated back for centuries. For example, in 14th century Persia, various official government papers included fingerprints (impressions), and one government official observed that no two fingerprints were exactly alike. In recent times, fingerprints play an important role in what is known as biometrics, which refers to the identification of an individual based on his or her physiological or behavioral characteristics. Biometrics enables automatic personal identification for a number of applications, such as criminal investigations, physical access to facilities and electronic access to computers and/or data on computer systems.

[0006] Fingerprints can now be acquired directly in the form of a digital image, without the need for an intermediate step of obtaining an impression of the fingerprint on paper, as was traditionally done. Digital fingerprints can be stored and processed in subsequent fingerprint enhancing and fingerprint matching applications. In order to capture a fingerprint image with enough features for recognition, a certain resolution and finger tip area are required. For example, the Federal Bureau of Investigation (FBI) recommends a 12x16 mm finger tip area, with 400 dpi resolution. In other applications where size and cost are important factors, smaller finger tip areas, with the same or lower resolutions, can be used. For example, a 9x12 mm finger tip area, with a resolution as low as 300 dpi, has been used in many applications. However, in smaller area and/or lower resolution fingerprint imaging applications, the finger-recognition algorithmic performance is usually inferior due to the reduction in the number of captured features.

[0007] For some applications, dedicating an area of even 9x12 mm to capture a fingerprint image is undesirable. For example, in the design of cell phones, laptop computers, personal digital assistants, electronic mice and other electronic devices, there is a trend towards miniaturization of the device itself, while at the same time offering more features. Therefore, in order to provide finger recognition capabilities on such electronic devices, the sensing area needs to be as small as possible. However, there is not currently a fingerprint scanner available that has a small enough sensing area for implementation on such electronic devices, while still enabling adequate finger-recognition algorithmic performance.

[0008] Another feature many electronic devices also provide is a navigation mechanism for controlling a cursor or pointer on a screen. By far, the most popular navigation mechanism in PC usage is the mouse. Recently, optical mice have been developed that are able to navigate on nearly any arbitrary surface using a tracking algorithm that correlates sequential images in the direction of movement. For example, U.S. Pat. No. 6,281,882, entitled PROXIMITY DETECTOR FOR A SEEING EYE MOUSE, which is hereby incorporated by reference in its entirety, describes an optical mouse that images the spatial features of a surface below the mouse and compares successive images to ascertain the direction and amount of movement. In addition, as described in U.S. Pat. No. 6,057,540, entitled MOUSELESS OPTICAL AND POSITION TRANSLATION TYPE SCREEN POINTER CONTROL FOR A COMPUTER SYSTEM, which is hereby incorporated by reference in its entirety, an optical finger navigation device has also been developed that detects motion of the finger and translates the finger motion into corresponding motion of the cursor or pointer on the screen.

[0009] Placing separate sensing areas for finger recognition and finger navigation on the same electronic device is inefficient from both a cost perspective and a space perspective. However, there is not currently a fingerprint scanner available capable of performing both finger recognition and finger navigation. Therefore, what is needed is an optical mechanism of combining finger recognition and finger navigation using a single sensing area of a size sufficiently small for integration with electronic devices. In addition, what is needed is a cost-effective imaging system capable of performing both finger recognition and finger navigation, while minimizing power consumption on small and/or portable electronic devices.

SUMMARY OF THE INVENTION

[0010] Embodiments of the present invention provide an apparatus for imaging a fingerprint in at least two different modes of operation to provide both finger recognition and finger navigation applications. A sensing area of a finger interface upon which a user places a finger (thumb or toe) is divided into two or more partitions. Each partition is separately illuminated by a respective light source. Light reflected from the finger is captured by optical image sensing region(s) aligned to receive reflected light emitted from the illuminating light source(s). The illuminating light sources are selected depending on the selected mode of operation to illuminate only those partitions of the finger interface necessary for operation in the selected mode. The light is captured by the optical image sensing regions as image data corresponding to one or more partition images. The captured image data is output by the image sensing regions for processing of the data in one of the at least two different modes.
[0011] Further embodiments provide an imaging system having a mode switch that selects between the at least two different modes for processing the image data. The mode switch further selects one or more of the light sources for illumination of the desired partitions during operation of the image processing system in one of the at least two modes. The image data received from the image sensing region(s) corresponding to the selected light source(s) is provided to a processor programmed to process the image data in one of the at least two modes.

[0012] In finger navigation mode, the image data is processed using a tracking algorithm capable of correlating sequential images to ascertain navigation information indicating the magnitude and direction of movement of the finger. The images are correlated using the micro texture features (e.g., ridges and valleys) in the respective portions of the fingerprint captured by the image sensor.

[0013] In finger recognition mode, the image data is processed using both the tracking algorithm and a stitching algorithm that combines sequential images in order to form one continuous image of the fingerprint having a sufficient number of micro texture features for fingerprint matching. The stitching algorithm uses the navigation information determined by the tracking algorithm to determine overlap between successive images.

[0014] In further embodiments, the imaging system is capable of operating in additional modes for other types of optical navigation applications, such as a stylus mode. In stylus mode, the navigation information extracted from the image data representing the sequence of images of the finger is used to determine the finger strokes made by the user that are associated with a desired letter, number or punctuation mark. Another mode of operation of the imaging system is a blinking mode, in which the light source is used as a message indicator light.

[0015] Multiplexing illumination of the sensing areas of the finger interface improves performance of the imaging system by minimizing power consumption in small and/or portable electronic devices. Furthermore, the invention provides embodiments with other features and advantages in addition to or in lieu of those discussed above. Many of these features and advantages are apparent from the description below with reference to the following drawings.

BRIEF DESCRIPTION OF THE DRAWINGS

[0016] The disclosed invention will be described with reference to the accompanying drawings, which show important sample embodiments of the invention and which are incorporated in the specification hereof by reference, wherein:

[0017] FIG. 1 is a block diagram illustrating an imaging system having a swipe module and an image processing system capable of operating in at least two different modes to provide both finger navigation and finger recognition;

[0018] FIG. 2 is a simplified and magnified pictorial side view of the main components of the swipe module in accordance with one embodiment of the invention;

[0019] FIGS. 3A and 3B are magnified cross-sectional views of the main components of the swipe module in accordance with another embodiment of the invention;

[0020] FIG. 4 illustrates an exemplary photo detector array of the image sensor of the present invention;

[0021] FIG. 5A is a simplified pictorial side view of a swipe module having multiplexed illumination of the swipe interface;

[0022] FIG. 5B is a perspective view of the swipe module of FIG. 5A;

[0023] FIG. 5C schematically illustrates multiplexed illumination of an area module;

[0024] FIGS. 6A and 6B schematically illustrate size reduction of the imaging system using multiplexed illumination of the swipe interface;

[0025] FIG. 7A is a block diagram illustrating exemplary hardware and processing components of the imaging system of the present invention;

[0026] FIG. 7B is a block diagram illustrating exemplary hardware and processing components of a multiplexed imaging system of the present invention;

[0027] FIG. 8 is a flow chart illustrating an exemplary process for operating in multiple modes;

[0028] FIG. 9 is a flow chart illustrating an exemplary process for multiplexing illumination in multiple modes;

[0029] FIG. 10 is a flow chart illustrating an exemplary process for operating in finger navigation mode;

[0030] FIG. 11 is a schematic view of selected steps of FIG. 10;

[0031] FIG. 12 illustrates a portion of a typical fingerprint;

[0032] FIG. 13 is a flowchart illustrating an exemplary process for operating in finger recognition mode;

[0033] FIG. 14 illustrates successive images having overlap therebetween;

[0034] FIG. 15 is a flow chart illustrating an exemplary process for stitching successive images together to form one continuous fingerprint image;

[0035] FIG. 16A is a pictorial representation of finger strokes made by a user on the swipe module in stylus mode;

[0036] FIG. 16B is a front view of a cell phone having a display indicating a letter associated with the finger strokes made in FIG. 16A;

[0037] FIG. 17 is a flowchart illustrating an exemplary process for operating in stylus mode;

[0038] FIG. 18 is a block diagram illustrating the imaging system operating in a blinking mode; and

[0039] FIGS. 19A and 19B are perspective views of the swipe module package.

DETAILED DESCRIPTION OF EXEMPLARY EMBODIMENTS
[0040] The numerous innovative teachings of the present application will be described with particular reference to exemplary embodiments. However, it should be understood that these embodiments provide only a few examples of the many advantageous uses of the innovative teachings herein. In general, statements made in the specification do not necessarily delimit any of the various claimed inventions. Moreover, some statements may apply to some inventive features, but not to others.

[0041] FIG. 1 illustrates an imaging system 10 capable of operating in at least two different modes, one of which is a finger navigation mode and another of which is a finger recognition mode. Other modes are possible, depending on the application of the imaging system 10. For example, the imaging system 10 can be programmed to operate in a stylus mode, in which finger strokes are used to represent written letters, numbers, punctuation marks and other written forms of communication in a convenient and efficient manner. The imaging system 10 includes an apparatus, hereinafter referred to as a swipe module 100, for capturing image data 50 representing an image of a human finger 20 and an image processing system 200 for processing the image data 50 in one of the at least two different modes. As used below, the term "finger" includes any digit (finger, thumb or toe) of a human hand or foot.

[0042] The swipe module 100 includes a swipe interface 110 having a top surface 115 upon which a user can press and move a finger 20. The top surface 115 may be flat, or preferably, have a slight curvature. For example, a convex curvature aids in enlarging the sensing area of the top surface 115. The sensing area is the portion of the top surface 115 that is in focus for capturing an image of the finger 20 pressed against the top surface 115. In preferred embodiments, the sensing area includes the entire top surface 115 of the swipe interface 110. The swipe interface 110 may be formed of glass or other wear resistant material that is transparent to light emitted from a light source 120 illuminating the top surface 115 of the swipe interface 110.

[0043] In one embodiment, the swipe interface 110 is elongated in shape to provide a sensing area less than the area of the tip of a human finger in at least one dimension. The area of the tip of a human finger is defined as the conventional 9x12 mm sensing area. Using the axes shown in FIG. 1, in one embodiment, the swipe interface 110 is elongated in the y-direction. As an example, the swipe interface 110 can be approximately 7-9 mm in the y-direction and 1 mm in the x-direction. In other embodiments, the swipe interface 110 may be larger in the x-direction, up to 2 mm in some areas, to enable better performance at the cost of larger area. As shown in FIG. 1, the user's finger 20 is positioned on the swipe interface 110 with the length of the finger 20 from the finger tip to the base of the finger 20 in the x-direction and the width of the finger 20 across the sides of the finger 20 in the y-direction. Therefore, the length of the finger 20 is shown orthogonal to the elongated direction of the swipe interface 110 to capture images of a portion of the tip of the finger 20 across the width of the finger 20. However, it should be understood that in other embodiments, the length of the finger 20 may be axially rotated to any position with respect to the elongated direction of the swipe interface 110. For example, the length of the finger 20 may be parallel to the elongated direction of the swipe interface 110.

[0044] The light source 120 can be any suitable source of electromagnetic radiation (light 125). By way of example, but not limitation, the light source 120 can be a single light emitting diode (LED), multiple LEDs arranged to illuminate different portions of the swipe interface surface 115 or an array of LEDs designed to emit light 125 at a desired average intensity. The wavelength of light 125 emitted from the light source 120 is selected to maximize reflection of the light 125 from human skin and enable the reflected light 128 to be distinguished from unwanted light signals. In addition, the wavelength of the light 125 can be chosen based on user or manufacturer preferences. For example, some manufacturers may prefer blue light to red light in certain applications. The light source 120 can be in an "on state" in a continuous mode with either a steady or variable amount of illumination or in a duty-cycle mode, where the light source 120 is pulsed on and off to control the exposure by servoing the average amount of light. The intensity of illumination can be controlled using any known technique.

[0045] Illumination optics (not shown) can also be used to direct the light 125 towards the swipe interface 110 at the desired angle of incidence. For example, illumination optics could consist of LED dome lenses or a light pipe that channels the light 125 towards the swipe interface 110 with a minimal amount of light loss. It is known in the art that the preferred angle of incidence for illuminating opaque material is a grazing angle within the range of five to twenty degrees. An angle of incidence in this range provides a high signal-to-noise ratio of image data representing inherent structural features of the object being imaged. However, due to the transparency of skin, such oblique angles are not necessary for adequately capturing image data 50 representing an image of the user's finger 20. Therefore, the selection of the angle of incidence is largely dependent upon the design of the swipe module 100, such as the number and type of LEDs used, the thickness of the swipe module 100 in the z-direction and the optics (illumination and image transfer) employed.

[0046] When the tip of a finger 20 is pressed against the top surface 115 of the swipe interface 110, ridges and valleys in the skin, referred to collectively as micro texture features, are visible in the plane of the top surface 115. Image transfer optics 130 directs light 128 reflected from those micro texture features onto an array of photo detectors that is part of an optical image sensor 140, which can be a CCD (Charge Coupled Device), a CMOS-APS (Complementary Metal Oxide Semiconductor - Active Pixel Sensor) or any other type of optical sensor known in the art. Optical image sensors 140 are preferred over thermal or capacitive image sensors due to the magnification/demagnification mechanisms that can be used with optical image sensors to reduce the silicon area. Thermal and capacitive image sensors typically require the silicon area to be equivalent in size to the sensing area. In addition, capacitive image sensors might be susceptible to electrostatic discharge, which can decrease the signal-to-noise ratio, and thus degrade the image.

[0047] The optical image sensor 140 acquires an image of the micro texture features in the form of image data 50 representing the intensity of the reflected light 128 measured at each photo detector. Each photo detector captures a picture element (pixel) of the image, and all pixels are combined to form the complete image. The photo detectors can be, for example, photodiodes or phototransistors arranged in an elongated array parallel to the elongated direction of the swipe interface. For example, as shown in FIG. 1, both the swipe interface 110 and the sensor 140 are elongated in the y-direction. The size of the elongated array is dependent upon the magnification of the optics. For
example, in one embodiment, the magnification of the optics is less than unity in a 1:3 ratio. Therefore, if the size of the sensing area (top surface 115) is 9 mm x 1 mm, the size of the sensor 140 need only be 3 mm x 0.3 mm. Since the features on the fingertip 20 are large enough to view unaided with the human eye, the sensor 140 area can be reduced using a magnification less than unity to reduce the cost of the sensor 140 and also to reduce the size of the swipe module 100. However, it should be understood that other magnifications less than unity, near unity or above unity can also be used, depending on the size constraints of the swipe module 100 and the manufacturer's preferences.

[0048] Each photo detector has a photo sensitive region between 5 and 60 µm square, with the spacing between the photo detectors designed to achieve the desired spatial resolution of the sensor 140. For example, on a 3 mm x 0.3 mm pixel area, to achieve a resolution of 400 dpi in the finger sensing area of 9 mm x 1 mm requires 144x16 photo detectors of a size of 21 µm by 21 µm. Regardless of the desired resolution, the size of the photo detectors and the spacing between the photo detectors is constructed to have at least one (preferably more than one) photo detector per image micro texture feature, and the overall size of the photo detector array is large enough to receive an image having several micro texture features.

[0049] The image sensor 140 provides image data 50 (e.g., raw pixel values) to a processor 210 within the image processing system 200 capable of processing the image data 50 in at least one of the at least two different modes. Separate processors 210 may be used for each mode, or one processor 210 may be programmed to operate in all modes. The processor 210 can be any microprocessor, microcontroller or other processing device capable of processing the image data 50 in the selected mode and can also be embedded on the same chip as the image sensor 140. A mode switch 220 selects between the different modes and controls the exposure time of the sensor 140, the frame rate and the intensity of illumination of the light source 120, as will be discussed in more detail below. The mode switch 220 can be toggled by a user depending on the application desired by the user and/or can be preset to toggle upon the completion of a task. For example, in one embodiment, the mode switch 220 can be initialized in finger recognition mode, and upon a positive fingerprint identification, automatically toggle to finger navigation mode.

[0050] As discussed above, the selection of one mode or another by the mode switch 220 determines how the image data 50 is processed by the processor 210. In addition, the exposure time of the sensor 140 can vary depending on the selected mode. For example, in finger navigation mode, the user may move the finger more rapidly and erratically over the swipe interface 110 than in finger recognition mode, allowing more stray light into the image. In this case, the mode switch 220 can reduce the exposure time of the sensor 140 to reduce the amount of stray light detected, and thus, the amount of noise in the image.

[0051] In addition to or instead of controlling the exposure time, the mode switch 220 can control the intensity of light 125 emitted from the light source 120 depending on the selected mode. For example, in finger recognition mode, the mode switch 220 can increase the intensity of illumination compared with the illumination intensity in finger navigation mode to increase the signal-to-noise ratio and improve the accuracy of fingerprint matching. In other embodiments, the swipe module 110 can perform a "hold" function that suspends production of image data 50 and reduces the intensity of light 125 emitted by the light source 120 when the user's finger is not engaged with the swipe module 110. The swipe module 100 can initiate the "hold" function when the reflected light 128 no longer reaches the photo detectors with the same intensity, if at all, due to the reflecting surface (i.e., the finger 20) being too far away or simply not in view. Even in an intensely lit environment, the "hold" function can be initiated in response to the outputs of the photo detectors becoming largely uniform.

[0052] Furthermore, the mode switch 220 can control the frame rate at which the sensor generates sets of image data representing successive images depending on the selected mode. For example, in finger recognition mode, the user typically swipes the finger 20 at a slower rate than in finger navigation mode. In addition, in finger recognition mode, successive images are stitched together to form a complete image, whereas in finger navigation mode, successive images are compared to determine movement. Therefore, the overlap between successive images in finger recognition mode need only be minimal compared to that in finger navigation mode. As an example, in finger recognition mode, if the user moves the finger 25 mm per second on a sensing area of width 1 mm, a frame rate of 26 frames per second is sufficient to capture a complete image of the fingerprint. Frame rates up to 500 frames per second may be needed in finger navigation mode.

[0053] The imaging system 10 can be included within a single electronic device or within multiple electronic devices. For example, the swipe module 100 can be implemented in a remote electronic device, such as a mouse, while the image processing system 200 can be implemented on a personal computer having an interface to the mouse. As another example, the swipe module 100 and image processing system 200 can both be implemented in small and/or portable electronic devices, such as a cell phone, laptop computer or PDA. It should be understood that if the imaging system 10 is implemented entirely in a single electronic device, the image processing system 200 can be included within the swipe module 100 or connected to the swipe module 100.

[0054] The swipe module 100 has a thickness in the z-direction dependent upon the requirements of the electronic device. For example, many electronic devices dictate a thickness of less than 5 mm. In order to build a swipe module 100 within the thickness specifications of the electronic device, various techniques for folding the optical path of the light or reducing the size of the optics can be used.
[0055] One example of folded optics is shown in FIG. 2. FIG. 2 illustrates a simplified and magnified pictorial side view of an exemplary swipe module 100. Light 125 emitted from the LED 120 is coupled by a coupling lens 150 towards a prism 155 that directs the light 125 at a desired angle of incidence to the swipe interface 110. Depending on the shape of the prism 155 and the angle of incidence, the light 125 may be directed using a total internal reflection (TIR) mechanism. In other embodiments, the light 125 may be directed using a reflected light mechanism. In FIG. 2, the light 125 passes through a first surface 158 of the prism 155
and is refracted towards the top surface 115 of the swipe interface 110 at the desired angle of incidence. Light 128 reflected back from the finger 20 pressed against the surface 115 of the swipe interface 110 is internally reflected off the first surface 158 of the prism 155 and passes through a second surface 159 of the prism 155.
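The TIR mechanism mentioned above (and relied on again in paragraph [0058]) requires the angle of incidence to exceed the critical angle of the material. A short sketch of that standard calculation follows; the example refractive indices (glass about 1.5 against air at 1.0) are assumptions for illustration, not values given in the application.

```python
# Illustrative calculation of the critical angle for total internal
# reflection (TIR), the mechanism by which the prism and the swipe
# interface can redirect light. Example indices are assumptions.
import math

def critical_angle_deg(n_inside: float, n_outside: float) -> float:
    """Angle of incidence (from the surface normal) above which light
    traveling in the denser medium is totally internally reflected."""
    if n_outside >= n_inside:
        raise ValueError("TIR requires the denser medium on the inside")
    return math.degrees(math.asin(n_outside / n_inside))

theta_c = critical_angle_deg(1.5, 1.0)  # roughly 41.8 degrees for glass/air
```

Any ray striking the top surface 115 at an angle of incidence greater than this critical angle is totally internally reflected, which is the condition paragraph [0058] imposes on the illumination geometry.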
[0056] The reflected light 128 exiting from the prism 155 travels in the x-direction, orthogonal to the elongated direction of the swipe interface 110, and passes through magnification optics 130 that directs the reflected light 128 towards another prism 135. The prism 135 internally reflects the light 128 off of surface 138 to redirect the light 128 in the Z-direction to the sensor 140. By utilizing folded optics in the x-direction, instead of traditional optics in the Z-direction, the thickness of the swipe module 100 in the Z-direction can be reduced. It should be understood that additional optical components, such as apertures and lenses, can also be used in the illumination optics 150 and/or image transfer optics 130. In addition, other optical arrangements can also be used to fold the optical path of the light instead of the optical arrangement shown in FIG. 2.

[0057] FIGS. 3A and 3B illustrate one example of reducing the size of the optics to fit within the thickness tolerances of the swipe module 100. FIGS. 3A and 3B are magnified side views of an exemplary swipe module 100. Light 125 emitted from the LED 120 is directed by coupling lenses 150 to a prism 155. The coupling lenses 150 include opposing convex surfaces capable of collimating the light 125 diverging at a narrow angle from the LED 120. The shape and position of the prism 155 within the swipe module 100 is designed either to direct the light 125 at a desired angle of incidence to the swipe interface 110 for reflection of the light or to direct the light 125 to the swipe interface 110 for multiple total internal reflections within the swipe interface 110, the latter being illustrated.

[0058] To perform multiple TIRs within the swipe interface 110, the side internal surfaces 112 orthogonal to the top surface 115 of the swipe interface 110 are preferably coated with a light-absorbing material to absorb reflected light at the sides. In other embodiments, the side internal surfaces 112 can be mirror-finished. Light 125 is directed to the top surface 115 at an angle of incidence greater than the critical angle of the swipe interface 110 material to create a total internal reflection of the light 125. The totally internally reflected light 125 from the top surface 115 is directed to a bottom surface 118 parallel to the top surface 115 at an angle of incidence greater than the critical angle to create another total internal reflection of the light 125. The thickness 119 of the swipe interface 110 is approximately 0.5-1 mm to enable a thinner swipe module 100 (e.g., 2 …

[0059] Light 128 reflected from the finger 20 is passed through the bottom surface 118 of the swipe interface 110 and focused by a microlens array 170 onto an imaging sensor 140 having an array of photo detectors 145 thereon. The LED 120, dome lens 150 and imaging sensor 140 are formed over a substrate 180 of the swipe module 100.

[0060] As shown in FIG. 3B, each microlens 175a, 175b, 175c in the microlens array 170 is coupled with one photo detector 145a, 145b, 145c, respectively, in a one-to-one manner. For example, microlens 175a focuses light 128 reflected from a portion of a fingerprint 25 (e.g., either a fingerprint ridge 25a or a fingerprint valley 25b) onto the corresponding photo detector 145a. The size of each photo detector 145 is selected to limit the field-of-view (FOV) 190a, 190b, 190c of each photo detector 145a, 145b, 145c, respectively, so that there are no overlapping FOVs 190 with adjacent microlenses 175.

[0061] Depending on the processing mode, the image data can be acquired from all of the photo detectors 145 or only a portion of the photo detectors 145 in the photo detector array. For example, in finger recognition mode, the entire 9 mm×1 mm sensing area is needed to capture a complete image of the fingerprint 25. However, in finger navigation mode, only a few fingerprint 25 features are required to determine the direction of movement, and in many cases, the finger 20 is not positioned over the entire 9 mm×1 mm sensing area. Therefore, in finger navigation mode, image data may only need to be acquired from the portion of the photo detectors 145 that detect light from the area of the swipe interface 110 in contact with the finger 20.

[0062] As shown in FIG. 4, to reduce the number of photo detectors 145 required for finger navigation while capturing sufficient image data to determine motion of the finger, the photo detector array 148 can be modified to accommodate a wider sensing area in the center of the array 148 for finger navigation. In FIG. 4, an assumption is made that the placement of the finger on the swipe interface in finger navigation mode is primarily at the center of the elongated direction (y-direction) of the sensing area. An assumption is also made that in finger recognition mode, a portion of the finger is at the center of the elongated direction of the sensing area, and the center portion can be used to obtain finger navigation information to enable stitching of successive images, as described in more detail below in connection with FIGS. 14 and 15. Therefore, in the arrangement of photo detectors 145 shown in FIG. 4, the corresponding swipe interface (not shown) has a general size of 9 mm×1 mm, but the central region of the swipe interface is widened in the x-direction to provide a 2 mm×2 mm central region. The central region is used for finger navigation. Correspondingly, the central region (shown by dotted lines) of the sensor 140 is widened in the x-direction to detect light from the widened central region of the swipe module. A resolution of 400 dpi corresponds to approximately 16 photo detectors per mm, which corresponds to 144×16 finger recognition photo detectors 145e (shown within the solid black line) and 32×32 finger navigation photo detectors 145d in the central region of the photo detector array 148 (shown within the dotted line).

[0063] To facilitate using different regions of the sensor and different portions of the sensing area for different modes (e.g., finger recognition and finger navigation), multiple light sources and/or multiple sensors can be used to illuminate and image different finger areas of interest depending on the selected mode of operation. FIGS. 5A and 5B schematically illustrate one example of the imaging system 10 having a swipe module 100 capable of multiplexing illumination of the swipe interface 110 based on the selected mode. The swipe interface 110 is shown separated into three partitions 110a, 110b and 110c in the y-direction. The swipe interface 110 is further shown illuminated by three LEDs 120a, 120b and 120c. The illumination is aligned such that LED 120a illuminates only the first partition 110a of the swipe interface, LED 120b illuminates only the second partition 110b, and LED 120c illuminates the third partition
110c. The illumination optics are not shown for simplicity purposes. However, it should be understood that separate illumination optics for each LED 120a-120c may be utilized to direct the light from each LED 120a-120c towards the respective partition 110a-110c of the swipe interface 110 associated with that LED 120a-120c.

[0064] The illuminated finger area is imaged using image transfer optics 130a, 130b and 130c. The image transfer optics 130a, 130b and 130c are also aligned in the y-direction and in the Z-direction such that each separately illuminated partition 110a, 110b and 110c of the swipe interface 110 is imaged onto corresponding sensing regions 140a, 140b and 140c, respectively. A single sensor can include sensing regions 140a, 140b and 140c, or sensing regions 140a, 140b and 140c may be three sensors separated by circuit-design constraints, as is shown in FIG. 5B. The center sensing region 140b and the center partition 110b can be widened in the x-direction, as described above in connection with FIG. 4.

[0065] When using separate sensors, the spacing between the different sensors is designed to simplify the optics design and ensure a complete image in finger recognition mode. For example, with the swipe interface 110 having an area of 9×1 mm and each partition 110a, 110b and 110c of the swipe interface 110 being 3×1 mm, each sensing region 140a, 140b and 140c can have an area of 1×0.3 mm, with a 0.25-1 mm spacing between the sensing regions 140a, 140b and 140c. In some embodiments, gaps in the acquired image due to the spacing between the sensing regions 140a, 140b and 140c may be acceptable, as long as the geometrical relations are known. In other embodiments, overlap between the acquired images may be desirable to facilitate stitching in finger recognition mode.

[0066] Although three separate partitions 110a, 110b and 110c of the swipe interface 110 and three separate sensing regions 140a, 140b and 140c are shown, it should be understood that the swipe interface 110 can be divided into any number of partitions, each of which can be imaged by a separate region of a single sensor or by separate sensors. The greater the number of partitions used, the less power is required, because the intensity of light necessary to illuminate one partition decreases proportionately as the partition size decreases. Reducing the power consumption is an important design parameter in small and/or portable electronic devices. For example, in cell phones, PDAs and laptop computers, where power is supplied by a rechargeable battery, reducing the power consumption increases the battery life.

[0067] However, increasing the number of partitions can also increase the complexity of the illumination and image transfer optics, of reading the pixel values off of the sensor(s), and of processing the image data. Furthermore, the illumination design becomes more difficult and costly as the number of partitions increases, because the sensing area illuminated by each LED is smaller and the number of LEDs increases.

[0068] In further embodiments, the illumination can be designed to provide sequential acquisition of the image of the entire finger area over the swipe interface. For example, LED 120a can be turned "on", while LEDs 120b and 120c are in an off condition, to illuminate only the first partition 110a, which is imaged onto sensing region 140a. The pixel values can be read from sensing region 140a and stored. At a later time, LED 120a is turned "off", and LED 120b is turned "on" to illuminate the second partition 110b, which is imaged onto sensing region 140b. The pixel values can be read off sensing region 140b and stitched with the previously captured image from sensing region 140a, and the resulting stitched image can be stored. Finally, LED 120b is turned "off" and LED 120c is turned "on" to capture the rest of the image in a similar manner. The timing of the sequential partition illuminations is set such that little to no noticeable movement of the finger occurs between the partition images. Sequential acquisition of the image reduces the power consumption by using only one LED 120 at a time.

[0069] Multiplexing illumination is a viable option not only for elongated swipe modules that capture an image of only a portion of a fingerprint and require movement of the finger to form one continuous image of the fingerprint, but also for area modules that capture an image of the entire fingerprint without requiring movement of the finger. For example, FIG. 5C shows an area module implementing multiplexed illumination of an area interface 455.

[0070] The sensing area of the area interface 455 is shown divided into four partitions in the x- and y-directions. Four LEDs 120a-120d illuminate the area interface 455, with each LED 120a-120d illuminating one partition 455a-455d, respectively, of the area interface 455. The illumination is aligned such that LED 120a illuminates only a first partition 455a of the area interface 455, LED 120b illuminates only a second partition 455b, LED 120c illuminates only a third partition 455c and LED 120d illuminates only a fourth partition 455d. The illumination optics are not shown for simplicity purposes. However, it should be understood that separate illumination optics for each LED 120a-120d may be utilized to direct the light from each LED 120a-120d towards the respective partition 455a-455d of the area interface 455 associated with that LED 120a-120d.

[0071] The illuminated finger area is imaged using image transfer optics 130a, 130b, 130c and 130d. The image transfer optics 130a, 130b, 130c and 130d are also aligned in the x- and y-directions and in the Z-direction, such that each separately illuminated partition 455a, 455b, 455c and 455d of the area interface 455 is imaged onto sensing regions 140a, 140b, 140c and 140d, respectively. Sensing regions 140a, 140b, 140c and 140d can be separate sensors or a single sensor having separate regions to capture each partition image. As discussed above in connection with the swipe module, one or more partitions 455a, 455b, 455c or 455d can be illuminated at a time, depending on the selected mode.
[0072] Referring now to FIGS. 6A and 6B, multiplexing illumination onto separate sensors or separate regions of a single sensor further reduces the size of the swipe module 100. As shown in FIG. 6A, in order to focus the light from the elongated sensing area in the y-direction (e.g., shown as a 9 mm-wide sensing area) of the swipe interface onto a single sensor 3 mm wide in the y-direction, the focal length from the optics to the sensor requires a spacing of "X" mm between the optics and the sensor in the Z-direction. As shown in FIG. 6B, by dividing the 9 mm sensing area into three separate 3 mm-wide partitions, each having separate optics for directing the light from the respective partition towards a respective 1 mm-wide sensor, the spacing between