A Standoff System for Noncooperative Ocular Biometrics

Plamen Doynov and Reza Derakhshani, Ph.D.
Department of Computer Science and Electrical Engineering

University of Missouri at Kansas City Kansas City, MO 64110, USA

Abstract—The iris and, more recently, the vascular patterns seen on the white of the eye have been considered for ocular biometrics. The non-contact nature, uniqueness, and permanence of ocular features make them promising. Among the new challenges is the development of commercial systems for less constrained environments and extended distances. Such systems need to place a minimal burden on the user and be robust for non-cooperative users.

We present the design and development of a standoff system for noncooperative ocular biometrics using a system integration approach. A review of existing commercial and experimental long-range biometric systems is presented. The process of selecting sensors and illumination techniques is described. The development of user interfaces and algorithms for a working prototype is explained. The performance is evaluated with images of 28 subjects, acquired at distances of up to 9 meters. The conflicting requirements for the design of this standoff biometric system, and the resulting performance limitations with impact on image quality, are discussed.

Keywords—Ocular biometrics; Standoff biometric system; Noncooperative biometrics.

I. INTRODUCTION

Biometric technologies have been implemented in many application areas and are replacing traditional authentication methods [1-3]. Ocular biometrics refers to the imaging and use of characteristic features of the eyes for personal recognition. The proliferation of ocular biometrics is based on its inherent advantages [4, 5], and it is made possible by recent progress in related technologies and processing algorithms [6-8]. Traditional face and fingerprint recognition may also be augmented by additional biometric traits, such as ocular traits, for greater accuracy and security [9]. However, many challenges remain, especially for iris image acquisition in unconstrained conditions and without the necessary degree of user cooperation [10-12]. Research teams and commercial developers have responded by creating uni- and multi-modal systems for real-world conditions [13, 14]. The goal is robust performance under variable lighting conditions and subject-to-camera distances for moving subjects, off-angle images, and other factors that generally diminish captured image quality.

In the next section, we review some notable long-range ocular biometric systems and describe their important parameters and limiting factors. Section three outlines the requirements of the front-end imaging system of a standoff ocular biometric system. We describe key components that affect image quality and overall system performance. Section four describes the development of a standoff ocular biometric system using system integration of commercially available, off-the-shelf components. In section five, we report the corresponding results for ocular images of 28 volunteers at distances of up to 9 meters. In conclusion, we outline the key attributes of the imaging system for standoff ocular biometrics, the challenges we faced, and future work based on the lessons learned.

II. ACQUISITION OF OCULAR BIOMETRIC TRAITS

A. Standoff Systems for Ocular Image Acquisition

At long distances, capturing the eye with sufficient resolution and quality is a challenging proposition [15-17]. The challenges are elevated even further when the imaging system has to work well without cooperation from the user.

Proenca et al. address important issues and trends in non-cooperative iris recognition, and have created the UBIRISv2 database of iris images captured “on the move” and “at a distance” [18-20]. The authors use the visible spectrum for imaging as an alternative to the customary near infrared (NIR). Wheeler et al. [21] describe a stand-off iris capture system designed to work at up to 1.5 m, using a pair of wide-field-of-view cameras for face localization and an iris camera to capture the iris. Dong et al. [22] discuss the design of a system to image the iris at a distance of 3 meters. The “Iris on the Move” system of Sarnoff Corporation also has a reported standoff distance of 3 meters. It is a portal-style system with a 210 mm, F/5.6 lens [23]. Du et al. [24] describe and use the IUPUI multi-wavelength database, acquired at 10.3 feet from camera to subject using a MicroVista NIR camera with a Fujinon zoom lens.

AOptix Technologies [25] uses adaptive optics with wavefront sensing and closed-loop control for a standoff system with a work volume at 2 to 3 meters from the camera. Retica reports that their multi-biometric system achieves 77% true match rates at 4.5 meters on the first attempt and 92% after three attempts [26].

B. Augmented Standoff Acquisition Systems

Most current ocular biometric systems are based on the unique iris patterns of the human eye. Their performance depends directly on the iris image quality, which is adversely affected by distance. Recently, improvements using additional ocular modalities have been investigated [9, 12, 34]. Simultaneous acquisition of iris, vascular patterns on the white of the eye, and periocular patterns may also reduce user constraints or requirements for compulsory user cooperation.

978-1-4673-2709-1/12/$31.00 ©2012 IEEE 144

Periocular features may be useful for long-distance recognition; however, they are not as specific as those of the iris or the vasculature seen on the white of the eye. The latter is especially amenable to capture at longer distances in the visible spectrum and with an off-angle iris.

In an effort to extend the depth of field, another challenge in standoff ocular biometrics, Boddeti and Kumar [27] investigate the use of wavefront-coded imagery for iris recognition. They conclude that wavefront coding could help increase the depth of field of an iris recognition system by a factor of four. McCloskey et al. [28] explore a “flutter shutter” technique to acquire focused iris images from moving subjects while eliminating motion blur. Researchers have also explored “structured” light, the visible spectrum, and imaging under different wavelengths of illumination, as opposed to the NIR range (700 to 900 nm) typically used in commercial systems. Ross et al. [29] investigate imaging with illumination in the 950nm to 1650nm range at short distances. Grabowski et al. [30] describe iris imaging that allows characterization of structures in the iris tissue. He et al. [31] designed their own camera for iris capture using a CCD sensor with 0.48 Mpixel resolution and a custom lens with 250mm fixed focus. They use an LED light source at 800nm and an NIR band-pass filter to minimize specular reflections from the cornea of the eye.

III. PARAMETERS AND REQUIREMENTS OF STANDOFF OCULAR BIOMETRIC SYSTEMS

The acquisition of quality images is the most important step in standoff ocular biometrics. There are specific requirements and performance challenges for the image capturing equipment. Proximity to the subject, illumination, and viewing angle are among the confounding variables. Moving subjects have a limited residence time in the field of view and within the imaging depth of field. Even with some degree of cooperation, the orientation of the face and eyes is not always perfect. Image resolution decreases with increased distance. The light collected by the lens aperture decreases in inverse proportion to the square of the distance. Imaging with a higher f-number increases the depth of field but limits the amount of collected photons and consequently requires longer exposure times (or increased illumination intensity). Imaging with long exposure times is prone to motion blur.
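To make these trade-offs concrete, the following back-of-the-envelope sketch shows how stopping down the aperture and increasing standoff compound the exposure penalty. The f-numbers, distances, and baseline exposure below are hypothetical illustration values, not parameters of any system reviewed here.

```python
# Illustrative radiometric trade-offs for a standoff imaging system
# (hypothetical values, not parameters of any system discussed in the text).

def relative_light(f_number: float, distance_m: float) -> float:
    """Relative irradiance at the sensor: proportional to 1/N^2 (aperture)
    and, for a point-like active illuminator, to 1/d^2 (inverse-square law)."""
    return 1.0 / (f_number ** 2 * distance_m ** 2)

def required_exposure(f_number: float, distance_m: float,
                      base_exposure_s: float = 0.001) -> float:
    """Exposure time needed to keep collected energy constant, relative to a
    1 ms baseline at f/1 and 1 m."""
    return base_exposure_s / relative_light(f_number, distance_m)

# Stopping down from f/2.8 to f/5.6 while moving from 3 m to 9 m:
t1 = required_exposure(2.8, 3.0)
t2 = required_exposure(5.6, 9.0)
print(t2 / t1)  # 36x longer exposure, hence the motion-blur risk
```

The 4x penalty from the aperture and the 9x penalty from distance multiply, which is exactly why short exposures must be bought back with active illumination.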

In their comprehensive tutorial [32], Matey and Kennell examine the requirements for iris capture at distances greater than one meter. The authors describe many relevant parameters, including wavelength, type of light source and eye safety, required characteristics of the lens and signal-to-noise ratio, capture volume, and the subject’s residence time. Describing the “Iris on the Move” system, Matey summarizes the requirements for a standoff iris imaging system [33]. He indicates the need for approximately 100 microns resolution at the eye (100 pixels across the iris); 40 dB signal-to-noise ratio (SNR); and 90 and 50 levels of intensity separation between the iris-sclera and iris-pupil boundaries, respectively, for an 8-bit imaging sensor. To cover a double-door passing area, Matey calculates that a system needs 150 mega-pixel sensor(s) to achieve a 100-pixel resolution across the imaged iris, given an average iris size of 10mm. Because of this, one customary approach is to use a wide field-of-view camera to locate the eyes in tandem with a second, narrow field-of-view camera for imaging.
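The order of magnitude of that pixel budget can be reproduced with a short calculation. The 1.5 m x 1.0 m capture area below is an assumption chosen for illustration; the text quotes only the resulting ~150 megapixel figure from [33].

```python
# Sensor pixel budget for covering a capture area at a fixed spatial
# resolution. The 1.5 m x 1.0 m double-door area is an illustrative
# assumption, not a dimension stated in the paper.

def pixel_budget(width_m: float, height_m: float, resolution_um: float) -> float:
    """Pixels needed to cover width x height at the given resolution."""
    px_per_m = 1e6 / resolution_um          # 100 um/px -> 10,000 px/m
    return (width_m * px_per_m) * (height_m * px_per_m)

megapixels = pixel_budget(1.5, 1.0, 100.0) / 1e6
print(round(megapixels))  # ~150 megapixels
```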

Localization of the eyes for subjects on the move is not a trivial task either. Again, many systems use a camera with a wide field of view to locate the face and subsequently the eyes, and a high-resolution, high-magnification camera for iris capture. To cover a wider field of view, the high-resolution camera may be mounted on a pan-and-tilt stand and use a lens with optical zoom (PTZ). In this case, the mechanical stability, speed, and pointing accuracy of the PTZ system become crucial. Even at only 1 meter standoff, 100 microns resolution for iris capture requires 100 micro-radians (0.006 degrees) pointing stability [23].
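The pointing-stability figure quoted from [23] follows directly from small-angle geometry, as this short check shows:

```python
import math

# Angular pointing stability needed to hold 100 microns at the eye from a
# 1 m standoff (the figure quoted from [23]).
resolution_m = 100e-6   # 100 microns at the target
standoff_m = 1.0

angle_rad = math.atan2(resolution_m, standoff_m)   # ~1e-4 rad (small angle)
angle_deg = math.degrees(angle_rad)

print(round(angle_rad * 1e6))   # ~100 micro-radians
print(round(angle_deg, 4))      # ~0.0057 degrees
```

At longer standoffs the requirement tightens proportionally: the same 100 microns at 9 m corresponds to roughly 11 micro-radians.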

Extreme standoff distances are limited by the governing laws of light propagation and by the capability and price of current technology and components. In the next section, we describe the design and construction of a standoff imaging system using system integration of commercially available components at low to medium cost.

IV. AN ACQUISITION PLATFORM FOR NONCOOPERATIVE, LONG-RANGE OCULAR BIOMETRICS

The following describes our novel acquisition platform for long-range ocular biometrics, designed to image the iris in NIR from standoff distances of up to 10 meters, possibly without the necessary cooperation from the subjects.

A. Design Approach and Considerations

The original idea was to focus on the development of the front-end optics and image sensor for long-range acquisition. Furthermore, to relax the requirements for subject cooperation, we investigated the electronic and electromechanical components required to locate a subject’s eyes, detect optimal gaze, and acquire an image (or images) of sufficient quality. Our design concept addressed the following three functional aspects:

1. Eye and iris localization with gaze detection and alignment: using a PTZ mechanism, the camera is to make eye contact with the subjects (rather than the opposite, which requires cooperative subjects):

• Customized Eyebox2 (XUUK™ Inc., Kingston, ON, Canada) system for long-range person/face/eyes and gaze detection;

• Ability to control simultaneously multiple iris cameras for crowd scanning.

2. Use advanced imaging techniques, especially lucky imaging, for long-range acquisition of the iris:

  • High-magnification precision optics;
  • NIR-enhanced high-speed image sensor in burst mode (high frame rate);
  • Real-time algorithms to evaluate image quality and to select the best image from a sequence;
  • Synchronized active illumination; and

3. Subject/eye tracking until a good-quality image is obtained.
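The three functional aspects above can be combined into a simple acquisition loop. The sketch below is only an illustration of the control flow; all component objects and the quality threshold are hypothetical placeholders, not our implementation.

```python
# High-level control loop combining the three functional aspects above.
# All component interfaces (gaze_detector, ptz, camera) and the threshold
# are hypothetical placeholders for illustration.

def acquisition_loop(gaze_detector, ptz, camera, quality_score,
                     threshold=0.8, max_bursts=10):
    """Track a subject and keep capturing bursts until a frame passes the
    quality threshold (lucky imaging) or the burst budget is exhausted."""
    for _ in range(max_bursts):
        eyes = gaze_detector.locate_eyes()          # 1. eye/gaze localization
        if eyes is None:
            continue                                # keep scanning the scene
        ptz.point_at(eyes)                          # align narrow-FOV camera
        burst = camera.capture_burst(n_frames=5)    # 2. high-speed burst
        scored = [(quality_score(frame), frame) for frame in burst]
        best_score, best_frame = max(scored, key=lambda pair: pair[0])
        if best_score >= threshold:                 # 3. track until good frame
            return best_frame
    return None
```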


The performance requirements for resolution are based on the SNR and optical resolution of the acquired signal. The quality of images at a distance is related to the lens aperture, residence time (frame exposure duration), and light intensity at the imaging sensor. Practical usability and the need to avoid blurring because of subject movements are additional constraints. Maximum illumination is limited by eye safety factors. Thus, compliance with ANSI/IESNA Standard RP-27.1-96 and its testing methodology is critical. In late 2008, a newer standard, referred to as IEC 62471-2006, was adopted. It addresses the photo-biological safety of lamps and lamp systems, and specifically the safety of LED sources, as applicable to ocular imaging illumination.

Design considerations have to include the diffraction limit of a lens and its dependence on the wavelength and the aperture diameter (optical resolution covariates). These requirements on the optical system and image sensor are more stringent in the case of non-cooperative subjects.

B. Component Selection and Design Implementation

The focus areas, while implementing the standoff system design, were: a) optics; b) illumination; c) imaging sensor(s); d) eye/gaze detection and tracking; and e) near real-time ocular image quality metrics.

The optical front end is the most important part of the imaging system, with three main components: a long focal length lens; an eyepiece for additional magnification; and an image sensor (camera) attached to the optics. After considering different telephoto lenses, we first used Infinity’s K2/S remote microscope (Infinity, Boulder, CO, USA) coupled with a digital camera (Fig. 1.a). K2/S is an excellent optical system with many advanced features. However, with its small aperture and limited field of view, K2/S requires intense illumination and pointing accuracy, even at short distances (e.g. less than 3m). To obtain the necessary resolution at the targeted standoff distances, we resorted to digiscoping as one of the simplest, yet most effective methods for high magnification imaging. Just as with camera lenses, a high performance glass objective is needed to produce the sharpest images with the best color reproduction and optical resolution. For cost considerations, we first explored the performance of Meade’s LX90-ACF (Meade Instruments, Irvine, CA, USA) Schmidt-Cassegrain telescope (Fig. 1.b), because of its large aperture and the intrinsic high photon collection efficiency of a reflective telescope.

Figure 1. Telescopes coupled with cameras for standoff ocular imaging testing: a) Infinity K2/S; b) Meade LX90; and c) Swarovski STS80 HD.

In parallel, we evaluated the high-definition STS80 HD spotting scope (Swarovski Optik North America, RI, USA) with an 80mm objective lens and an available selection of magnifying and optical-zoom eyepieces (Fig. 1.c). The objective lens focal length is 460mm and the weight is 1330g. Besides the typical limitations of refractive telescopes, STS80 HD

has advantages such as a short minimum focusing distance (5m), ease of coupling with a variety of digital cameras, and a flexible camera adapter with direct access for imaging target location and focus adjustment. In the final design, we selected an eyepiece with a fixed power, which makes placement of the image sensor behind the eyepiece less critical and, more importantly, produces less vignetting (darkening around the edges of an image) compared to a typical zoom eyepiece.

Because of the need for high magnification and to reduce the effect of subject-camera movements, a standoff system requires short exposure times using “fast” lenses and sensitive image sensors with larger pixels, plus proper active illumination. Historically, NIR illumination is used to facilitate the imaging of irises with dark pigmentation. For multimodal ocular biometrics, the visible spectrum can be used for periocular imaging and to image the vasculature seen on the white of the eye (wavelengths in the green bandwidth). NIR illumination, however, does not evoke the protective pupil constriction reflex, and thus the use of extreme light intensity in a standoff system is limited. IR-A (near IR, 780 to 1400nm) reaches the sensitive cells of the retina, and thus for high irradiance sources the retina is at risk from acute exposures from localized light sources. IR-B (mid IR, 1400 to 3000nm) and IR-C (far IR, 3000nm to 1mm) present risk to both the skin and the cornea from “flash burns.” The heat deposited in the cornea may be conducted to the eye’s lens and cause clouding (albeit with very long exposure times). The IR illumination intensity for short exposures should be limited to

    ΣEλ·Δλ < 1.8·t^(-3/4)                    (1)

where Eλ is the spectral irradiance in W/cm², Δλ is the spectral bandwidth in nm, and t is the exposure time in seconds (International Commission on Non-Ionizing Radiation Protection, ICNIRP 1996, 2000). We used a digital hand-held multispectral intensity meter, PM100 with S120B sensor (ThorLabs, Newton, NJ, USA), to measure and monitor illumination intensity levels, and also to evaluate the light capturing capability of the STS80 HD.

After testing different commercial light sources, we selected the LIR850-25 (LDP LLC - MaxMax.com, Carlstadt, NJ, USA) with an effective range of up to 150 m (Fig. 2). Each source has 147 IR LEDs and can operate in a synchronous flash mode with the camera.

Figure 2. The system for standoff ocular imaging with the Swarovski STS80 HD telescope and two LIR850-25 light sources.
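For a sense of scale, the exposure limit of Eq. (1) can be evaluated for a single short exposure. The 5 ms exposure below is a hypothetical value for illustration, not a reported operating point of our system.

```python
# Illustrative evaluation of the IR exposure limit of Eq. (1): the summed
# spectral irradiance must stay below 1.8 * t^(-3/4) W/cm^2 for an exposure
# of t seconds. The 5 ms exposure time is a hypothetical example value.

def ir_irradiance_limit_w_cm2(t_s: float) -> float:
    """Maximum permissible IR irradiance (W/cm^2) for exposure time t_s."""
    return 1.8 * t_s ** (-0.75)

t = 0.005                      # a 5 ms electronic NIR "flash"
limit = ir_irradiance_limit_w_cm2(t)
print(limit)  # short exposures permit much higher instantaneous irradiance
```

This is why a synchronized flash is attractive: the shorter the exposure, the higher the momentary irradiance the safety limit allows, which in turn suppresses motion blur.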


We tested several cameras with enhanced NIR sensitivity. The SMX-150M camera was selected because of the enhanced NIR spectral characteristics of its IBIS5-AE-1300 image sensor (Sumix, Oceanside, CA, USA). It has a 1.3-megapixel (1280 x 1024) sensor with high sensitivity in the 400 to 1000nm range, increased light collection quantum efficiency due to a larger pixel size of 6.7 x 6.7μm, a monochromatic design, and extensive real-time control and visualization using its PC interface. The upper curve in Fig. 3 represents the enhanced NIR efficiency of the sensor, and the vertical dashed line marks the 850nm wavelength of the LED light source. The USB6009 hardware module from National Instruments (Austin, TX, USA) was used to trigger the light sources and the camera (USB-based digital and analog control and acquisition module).

Figure 3. The SMX-150M camera (left) and the spectral characteristics of its enhanced IBIS5-AE-1300 sensor (upper red curve).

For eye localization and gaze detection, we used the XUUK™ camera/light source system (Fig. 4.a), which was originally designed to detect and count the human eyes looking at a target such as a billboard. The application detects gaze based on the red-eye effect. We used the provided development kit to write an application and report the relative coordinates of detected faces/eyes to a pan-and-tilt module to point the mounted imaging system. We used the PTU-D46-70 pan-and-tilt unit (Fig. 4.b) from Directed Perception (Burlingame, CA, USA). We selected this particular model for its high-speed and accurate positioning of payloads up to 9 lbs, at speeds up to 60 degrees/second and with a position resolution of 0.013 degrees. An application developed in LabVIEW (National Instruments) distributes the detected eye coordinates to the networked pan-and-tilt controller using TCP/IP.

Figure 4. The Eyebox2 XUUK™ camera (a) and PTU-D46-70 module (b).

Fig. 5 displays the functionality of the XUUK™ system locating a person, his eyes, and the presence of direct gaze. In Fig. 5(a), the person is not detected. In 5(b), the face is detected and given an associated sequence number. Fig. 5(c) indicates one person detected and “looking” (gaze directed axially at the XUUK camera). Fig. 5(d) demonstrates detection of a person looking at the camera from a distance of about 7.5 meters.

V. IMAGE ACQUISITION AND SYSTEM PERFORMANCE EVALUATION

The first image acquisition trial included 13 volunteers, under the auspices of the Institutional Research Board. The acquisition included multiple bursts of five image sequences, which were obtained from distances of 0.75, 6 (Fig. 6), 7, 8, and 9 meters (Fig. 7). Background NIR light was used for manual focus, and a synchronized electronic NIR “flash” was used for burst captures. In a second trial, the eye images from an additional group of 15 volunteers were acquired.

Figure 5. Performance of the XUUK™ camera: no detection (a); detection of a person/face (b); direct gaze detection at about 1m distance (c); and detection of direct axial gaze at a distance of approximately 7.5m (d).

Figure 6. Images acquired at 6 meters without (left) and with glasses (right).

Figure 7. Images acquired at 9 meters standoff distance.

The number of pixels per iris diameter decreases proportionally with the distance between the camera and subject. We noted that at 6m distance, the combined magnification of the


lens-eyepiece optical system results in a field of view sufficient to image a single eye with more than 320 pixels per iris diameter. At 9m distance, the average iris image diameter decreases to about 210 pixels.

All images were stored for computational quality assessment and feature extraction. A number of quality factors were implemented and used to select the best frames from a series of burst shots taken at distances of 1 to 9 meters. The images were first segmented, and the lower part of the iris region was used to compute the local quality measures for best frame selection. Only correctly segmented frames were selected for computation of quality factors. Different quality measures were tested, including: gradient-based metrics (Tenengrad, adaptive, separable and non-separable Tenengrad, Laplacian, adaptive Laplacian); correlation-based measures (autocorrelation function - single sample, area and height of the central peak of the correlation function); statistics-based measures (absolute central moment, grey level variance, Chebytchev moments/ratios, entropy, histogram); transform-based measures (Fourier transform: coefficients & magnitude, cosine transform, multivariate kurtosis, wavelets); and edge-based measures (step edge characteristics, transition width, local kurtosis) [34]. Focus quality assessment was used for best frame selection and to illustrate the quality variation with distance. Fig. 8 displays the best achieved quality scores at different distances [35].

Figure 8. Best combined quality score for different standoff distances using our long-range acquisition platform.

VI. DISCUSSION AND CONCLUSIONS

In this paper we reported the design considerations and implementation of a standoff image acquisition system for ocular biometrics. We used system integration of commercially available, off-the-shelf components. We described the important and often conflicting requirements for the front-end optical system.

Regarding hardware, we concluded that red-eye detection (e.g. by using the XUUK™ system) can be successfully used to detect gaze direction from a localized distant face and eyes. Eye coordinates can be used to control a PTZ system to reduce subject cooperation requirements. We conclude that:

• The embedded computing power is critical for the real-time performance of the eye discovery and tracking algorithm;

• The main camera lens has to have a large aperture for high photon collection, and the image sensor needs high quantum collection efficiency at the wavelength of illumination (e.g. large pixel size);

• The camera needs to have a high frame rate and electronically controlled settings;

• Ambient light introduces degradation of image quality when an NIR band-pass filter is not used. This is due to the different focal planes of different wavelengths and/or glare and specular reflection;

• Overall, the optical front end has to have high magnification and quality (e.g. low geometric distortion, fast, large aperture, high photon transfer, and sufficient depth of field). It needs to be adapted for electronic focus, preferably including a fast and accurate assessment of subject distance;

• The sensor has to be synchronized with the NIR illumination sources, considering intensity levels within the exposure safety requirements;

• The pan-and-tilt system has to have the ability to respond to reference coordinates with fast vector movements and motion stabilization. Alternative and non-traditional pan-and-tilt mechanisms (e.g. arc mounted or disk-based) could be better suited for this application, to accommodate the lens and camera assembly and to operate with less vibration;

• A constellation of multiple distributed networked imaging cameras may cover a larger work volume and acquire better images based on control from one or more gaze detection systems.

A number of quality factors were calculated and used to select the best frames from a series of burst shots of each of the 28 subjects, taken from distances of 1, 6, 7, 8, and 9 meters. The results demonstrate the degradation of image quality with increased standoff distances. Inter-subject comparison at the same distance suggests that iris color assessment and selective spectral illumination/imaging may increase the image quality. This option is applicable to a design with multiple networked cameras and illumination sources.

The illumination intensity level and its bandwidth are critical. The light source’s spatial location and direction are also important. For example, the relative difference in the light source location is a probable cause for the image quality differences in Fig. 8 for the left vs. right eyes. Axial alignment with the camera produces specular reflections from the cornea which may degrade performance, especially if located on the pupil. On the other hand, the specular reflection size could be used for focus adjustment and even PSF calculations.

Future work will involve implementing a robust segmentation algorithm that would perform better on images acquired from different distances, using iris images at different scales, resolutions, locations, and various degradation levels. Implementation of fast voting criteria for the selection of the best frame before and after segmentation is critical for the performance of the system, and thus further development of near real-time quality assessment is needed. Implementation of an integrated camera and illumination source feedback control in an embedded computational unit will make the system more robust and easier to use. Future work also needs to address illumination techniques (active, passive, and structured) for noncooperative standoff biometric systems.
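As a concrete illustration of the gradient-based frame selection described in Section V, a minimal Tenengrad-style focus score is sketched below. This is a simplification of the quality battery of [34], not our implementation; the synthetic frames are for demonstration only.

```python
import numpy as np

# Minimal Tenengrad-style focus measure and best-frame selection for a burst.
# A simplified sketch of the gradient-based metrics of Section V, not the
# authors' implementation (which combined many quality factors).

def tenengrad(image: np.ndarray) -> float:
    """Sum of squared gradient magnitudes; higher means sharper focus."""
    gy, gx = np.gradient(image.astype(float))
    return float(np.sum(gx ** 2 + gy ** 2))

def best_frame(burst) -> int:
    """Return the index of the sharpest frame in a burst sequence."""
    return int(np.argmax([tenengrad(frame) for frame in burst]))

# Demo: a textured frame vs. the same frame blurred by local averaging.
rng = np.random.default_rng(0)
sharp = rng.random((64, 64))
blurred = (sharp + np.roll(sharp, 1, 0) + np.roll(sharp, -1, 0)
           + np.roll(sharp, 1, 1) + np.roll(sharp, -1, 1)) / 5.0
print(best_frame([blurred, sharp]))  # index of the sharper frame
```

In a fielded system this score would be computed only on the correctly segmented iris region, as noted above, rather than on the whole frame.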

VII. ACKNOWLEDGEMENTS

This work was supported in part by a grant from the Center for Identification Technologies Research (CITeR). Research was performed in cooperation with Dr. Besma Abidi, University of Tennessee, Knoxville, TN. The authors thank her for many useful discussions and the quality metrics evaluation of the acquired images.

REFERENCES

[1] A. K. Jain and A. Kumar, "Biometrics of Next Generation: An Overview," in Second Generation Biometrics, Springer, 2010.

[2] D. Bhattacharyya, R. Ranjan, P. Das, T. Kim, and S. K. Bandyopadhyay, "Biometric Authentication Techniques and its Future Possibilities," Second International Conference on Computer and Electrical Engineering (ICCEE '09), vol. 2, pp. 652–655, 2009.

[3] Q. Xiao, "Technology review - Biometrics-Technology, Application, Challenge, and Computational Intelligence Solutions," IEEE Computational Intelligence Magazine, vol. 2, pp. 5–25, 2007.

[4] K. Bowyer, K. Hollingsworth, and P. Flynn, "Image understanding for iris biometrics: A survey," Computer Vision and Image Understanding, vol. 110, no. 2, pp. 281–307, 2008.

[5] A. K. Jain, A. Ross, and S. Prabhakar, "An introduction to biometric recognition," IEEE Transactions on Circuits and Systems for Video Technology, vol. 14, no. 1, pp. 4–20, 2004.

[6] K. Bowyer, K. Hollingsworth, and P. Flynn, "Image understanding for iris biometrics: A survey," Computer Vision and Image Understanding, vol. 110, no. 2, pp. 281–307, 2008.

[7] J. G. Daugman, “New methods in iris recognition.”, IEEE Transactions on Systems, Man, and Cybernetics - Part B: Cybernetics, vol. 37, no. 5, pp. 1167–1175, 2007.

[8] R. Derakhshani and A. Ross, “A Texture-Based Neural Network Classifier for Biometric Identification using Ocular Surface Vasculature”, International Joint Conference on Neural Networks, pp. 2982 – 2987, 2007.

[9] L. Nadel and T. Cushing, "Eval-Ware: Biometrics Resources [Best of the Web]," IEEE Signal Processing Magazine, vol. 24, no. 6, pp. 136–139, 2007.

[10] C. Fancourt, L. Bogoni, K. Hanna, Y. Guo, R. Wildes, N. Takahashi, and U. Jain. “Iris recognition at a distance.”, In Proceedings of the 2005 IAPR Conference on Audio and Video Based Biometric Person Authentication, pp. 1–13, U.S.A., July 2005.

[11] K. Ricanek, M. Savvides, D. L. Woodard, and G. Dozier, "Unconstrained Biometric Identification: Emerging Technologies," Computer, vol. 43, no. 2, pp. 56–62, 2010.

[12] A. Ross, “Recent Progress in Ocular and Face Biometrics: A CITeR Perspective”, 2010, http://www.csee.wvu.edu/~ross.

[13] H. Proenca and L.A. Alexandre, “Iris Segmentation Methodology for Non-cooperative Recognition.”, IEEE Proc.—Vision, Image and Signal Processing, vol. 153, pp. 199-205, 2006.

[14] J. Ortega-Garcia et al., "The Multi-Scenario Multi-Environment BioSecure Multimodal Database (BMDB)," IEEE Trans. on Pattern Analysis and Machine Intelligence, vol. 32, no. 6, pp. 1097–1111, 2010.

[15] J. Choi, G. H. Soehnel, B. E. Bagwell, K. R. Dixon, and D. V. Wick, "Optical requirements with turbulence correction for long-range biometrics," in Optics and Photonics in Global Homeland Security V and Biometric Technology for Human Identification VI, Proceedings of the SPIE, vol. 7306, 730622, 2009.

[16] http://www.dodsbir.net/sitis/archives_display_topic.asp, SBIR Program A10-100, “Standoff-Biometric for Non-Cooperative Moving Subjects.”

[17] T. Boult and W. Scheirer, “Long-Range Facial Image Acquisition and Quality” in Handbook of Remote Biometrics for Surveillance and Security, Edited by Massimo Tistarelli, Stan Z. Li and Rama Chellappa, Springer, pp. 169-192, 2009.

[18] H. Proenca, "Non-cooperative iris recognition: Issues and Trends," 19th European Signal Processing Conference (EUSIPCO 2011), Barcelona, Spain, August 29 – September 2, 2011.

[19] H. Proenca, S. Filipe, R. Santos, J. Oliveira, and L. A. Alexandre, “The UBIRIS.v2: A Database of Visible Wavelength Iris Images Captured On-The-Move and At-A-Distance.”, IEEE Trans. on Pattern Analysis and Machine Intelligence, in press.

[20] H. Proenca, “On the feasibility of the visible wavelength, at-a-distance and on-the-move iris recognition.” IEEE Workshop on Computational Intelligence in Biometrics: Theory, Algorithms, and Applications (CIB 2009) , 9-15, March 2009.

[21] F. W. Wheeler, A. G. Amitha Perera, G. Abramovich, B. Yu, and P. H. Tu, “Stand-off Iris Recognition System”. IEEE 2nd International Conference on Biometrics: Theory, Applications, and Systems (BTAS 08), Sept. 2008.

[22] W. Dong, Z. Sun, and T. Tan, “A Design of Iris Recognition System at a Distance”. Chinese Conference on Pattern Recognition (CCPR 2009), 1-5, Nov. 2009.

[23] J. R. Matey, "Iris Recognition: On the Move, At a Distance, and Related Technologies," Sarnoff Corporation, Princeton, NJ 08543.

[24] Y. Du, N. L. Thomas, and E. Arslanturk, "Multi-level iris video image thresholding," IEEE Workshop on Computational Intelligence in Biometrics: Theory, Algorithms, and Applications, pp. 38–45, 2009.

[25] "Comprehensive Evaluation of Stand-Off Biometrics Techniques for Enhanced Surveillance during Major Events," DRDC CSS CR-2011-08, http://pubs.drdc.gc.ca/inbasket/mmgreene.110426_0911.DRDC_CSS_CR-2011-08.pdf

[26] F. Bashir, P. Casaverde, D. Usher, and M. Friedman, "Eagle-Eye: a system for iris recognition at a distance," 2008 IEEE Conference on Technologies for Homeland Security, 12-13 May 2008, pp. 426–431.

[27] V. N. Boddeti and B.V.K. Kumar, "Extended-Depth-of-Field Iris Recognition Using Unrestored Wavefront-Coded Imagery," IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans, vol. 40, no. 3, pp. 495–508, May 2010.

[28] S. McCloskey, A.W. Au, and J. Jelinek, “Iris capture from moving subjects using a fluttering shutter”. Fourth IEEE International Conference on Biometrics: Theory Applications and Systems (BTAS 10), Sept. 2010.

[29] A. Ross, R. Pasula, and L. Hornak, “Exploring multispectral iris recognition beyond 900nm”. IEEE 3rd International Conference on Biometrics: Theory, Applications, and Systems (BTAS 09), Sept. 2009.

[30] K. Grabowski, W. Sankowski, M. Zubert, and M. Napieralska, "Iris structure acquisition method," 16th International Conference Mixed Design of Integrated Circuits and Systems (MIXDES '09), pp. 640–643, June 2009.

[31] X. He, J. Yan, G. Chen, and P. Shi, "Contactless Autofeedback Iris Capture Design," IEEE Transactions on Instrumentation and Measurement, vol. 57, no. 7, pp. 1369–1375, 2008.

[32] J. R. Matey and L. R. Kennell, "Iris Recognition - Beyond One Meter," in Handbook of Remote Biometrics, Springer, 2009.

[33] J. R. Matey, D. Ackerman, J. Bergen, and M. Tinker, "Iris recognition in less constrained environments," in Springer Advances in Biometrics: Sensors, Algorithms and Systems, pp. 107–131, October 2007.

[34] H. Bharadwaj, H.S. Bhatt, M. Vatsa, and R. Singh, “Periocular biometrics: When iris recognition fails”. Fourth IEEE International Conference on Biometrics: Theory Applications and Systems (BTAS 10), Sept. 2010.

[35] R. Derakhshani, P. Doynov, and B. Abidi, “An Acquisition Platform for Non-cooperative, Long Range Ocular Biometrics”, Project report, CITeR 2008.
