Camera Characterization for Face Recognition

Camera Characterization for Face Recognition under Active Near-Infrared Illumination
Thorsten Gernoth and Rolf-Rainer Grigat
Hamburg University of Technology, Vision Systems, E-2
Harburger Schloßstr. 20, 21079 Hamburg, Germany
Tel: +49 40 42878-3125, Fax: +49 40 42878-2911
http://www.ti1.tu-harburg.de
in: Proceedings of SPIE. See also BibTeX entry below.

BibTeX:
@inproceedings{gernoth10a,
  author    = {Thorsten Gernoth and Rolf-Rainer Grigat},
  title     = {Camera characterization for face recognition under active near-infrared illumination},
  booktitle = {Proceedings of SPIE},
  volume    = {7529},
  publisher = {SPIE},
  year      = {2010},
  pages     = {75290Z},
  url       = {http://link.aip.org/link/?PSI/7529/75290Z/1},
  doi       = {10.1117/12.839001}
}

© 2010 Society of Photo-Optical Instrumentation Engineers. This paper was published in Proceedings of SPIE and is made available as an electronic reprint with permission of SPIE. One print or electronic copy may be made for personal use only. Systematic or multiple reproduction, distribution to multiple locations via electronic or other means, duplication of any material in this paper for a fee or for commercial purposes, or modification of the content of the paper are prohibited.

Camera Characterization for Face Recognition under Active Near-Infrared Illumination

Thorsten Gernoth and Rolf-Rainer Grigat

Hamburg University of Technology, Vision Systems E-2, Harburger Schloßstr. 20, 21079 Hamburg, Germany

ABSTRACT

Active near-infrared illumination may be used in a face recognition system to achieve invariance to changes of the visible illumination. Another benefit of active near-infrared illumination is the bright pupil effect, which can be used to assist eye detection. However, long-term exposure to near-infrared radiation is hazardous to the eyes, so the level of illumination is limited by potentially harmful effects to the eyes. Image sensors for face recognition under active near-infrared illumination therefore have to be carefully selected to provide optimal image quality in the desired field of application. A model of the active illumination source is introduced. Safety issues with regard to near-infrared illumination are addressed using this model and a radiometric analysis. From the illumination model, requirements on suitable imaging sensors are formulated. Standard image quality metrics are used to assess the imaging device performance under application-typical conditions. The characterization of image quality is based on measurements of the Opto-Electronic Conversion Function, the Modulation Transfer Function and noise. A methodology to select an image sensor for the desired field of application is given. Two cameras with low-cost image sensors are characterized using the key parameters that influence the image quality for face recognition.

Keywords: Infrared imaging, active illumination, face recognition, camera characterization

1. INTRODUCTION
In modern society there is a high demand to automatically and reliably determine or verify the identity of a person, for example to control entry to restricted access areas. Using biometric data to identify a target person has some well-known conceptual advantages, such as the identification procedure being immutably bound to the person who is to be identified. Using face images as a biometric characteristic has gained much attention, and commercially available face recognition systems exist.1, 2 However, unconstrained environments with variable ambient illumination are still challenging for many face recognition systems. The appearance of a face can vary dramatically if the intensity or the direction of the light source changes. One way to address the problem of illumination is to use an active illumination source and imagery outside the visible band. Face recognition using active near-infrared illumination has therefore gained popularity.3–5 Another benefit of using active near-infrared illumination for face recognition is the bright pupil effect, which may be utilized to assist eye detection.6–8 When a near-infrared lighting source is placed close to the optical axis of a camera, the pupils look unnaturally bright because of reflections on the fundus of the eyes (Fig. 1). The pupillary light reflex is very weak under direct near-infrared illumination. This is beneficial for eye detection based on the bright pupil effect but raises safety concerns. Long-term exposure to near-infrared radiation is hazardous to the eyes.9 The level of illumination is therefore limited by potentially harmful effects to the eyes. Using low illumination levels is contradictory to the requirements on image quality for a face recognition system. Generally, high illumination levels allow faster shutter speeds, which reduce motion artefacts, and smaller apertures, which give a larger depth of field.
Also, the signal-to-noise ratio of an imaging sensor generally increases at higher illumination levels. Image sensors for face recognition under active near-infrared illumination therefore have to be carefully selected to provide optimal image quality in the desired field of application. This paper explores the applicability of objective image quality metrics for an access control system using face recognition and near-infrared illumination. Especially important are stable and reproducible conditions in both the enrollment and the verification/identification phase. Image quality depends on the overall capture system, including the process environment, illumination conditions and imaging sensor. A model of the active illumination source is introduced in Sec. 2. Safety issues with regard to near-infrared illumination are addressed using this model and a radiometric analysis. Potential hazards to the eyes are outlined using current safety guidelines. The key parameters of image sensors that influence the image quality for face recognition under near-infrared illumination are identified in Sec. 3, and a methodology to select an image sensor for the desired field of application is given. The characterization of image quality is based on standard measurements such as the Opto-Electronic Conversion Function, the Modulation Transfer Function and noise. Two cameras with low-cost image sensors are characterized in Sec. 4 using the identified parameters under application-typical conditions.

Further author information: E-mail: [email protected], Telephone: +49 (0)40 42878 2277

Image Quality and System Performance VII, edited by Susan P. Farnand, Frans Gaykema, Proc. of SPIE-IS&T Electronic Imaging, SPIE Vol. 7529, 75290Z · © 2010 SPIE-IS&T · CCC code: 0277-786X/10/$18 · doi: 10.1117/12.839001

SPIE-IS&T/ Vol. 7529 75290Z-1

Figure 1: Bright pupil effect under active near-infrared illumination.

2. ACTIVE NEAR-INFRARED ILLUMINATION
As stated in Sec. 1, illumination is one of the challenges of face recognition. One way to address the illumination problem is to use an active illumination source and imagery outside the visible band. Ionizing radiation and optical radiation below the visible spectrum can pose a considerable hazard to the human body. Therefore, infrared illumination in particular is considered a viable alternative for face recognition. A survey on infrared face recognition with special focus on thermal infrared is given in Ref. 10. As another example, the Equinox infrared face database contains images captured mostly using long-wave infrared spectra.11 However, using thermal infrared generally has the disadvantages of costly thermal sensors and poor image quality. We are interested in using near-infrared (NIR) radiation with wavelengths from 780 nm to 1000 nm. Common digital cameras with silicon sensors can be used to capture images in this range. Besides being comparably inexpensive, their image quality is generally higher. Another advantage of this range of the spectrum is that it is still reflective, so an active illumination source can be used. An advantage over the visible range is its invisibility to the human eye, which permits unobtrusive active illumination.3 Possible surrounding light in the visible spectrum is filtered out in our system12 to be insensitive to visible ambient illumination changes. The bright pupil effect is employed to assist eye detection.6, 7 We use image processing to detect bright spots in the images and can thus reliably detect the eyes.8

2.1 Active Illumination Source
Since the bright pupil effect is employed to detect the eyes, the existence of the bright pupil effect must be ensured. This requires the infrared illumination source to be as close as possible to the optical axis of the camera.13 Already at an illumination angle of 3° between the optical axis of the camera and the radiation source, the bright pupil effect is barely visible (Fig. 2a). Infrared-emitting diodes (IREDs) are very compact and are therefore used as radiating sources. With IREDs, no additional parts such as mirrors are required. To attain the necessary irradiance levels for the desired field of application, several diodes are distributed on a circle around the optical axis of the camera (Fig. 2b). The radiant intensity distribution of a single diode is in many cases rotationally symmetric with a strong angular dependence. In this case, the intensity distribution can be modeled as a function of the angle φ between the viewing direction and the normal of the diode:14


Figure 2: Geometric model of the active near-infrared illumination source. (a) Illumination angle β; (b) distribution of diodes on a ring of radius ρ in the xy-plane.

I(φ) = I₀ cos^m φ.   (1)

I₀ is the radiant intensity normal to the diode (φ = 0°). The coefficient m describes the drop in intensity with viewing angle. It can be estimated from the angle Θ₁/₂, generally given by the diode manufacturer, at which the diode's intensity is half of I₀:15

m = −ln 2 / ln(cos Θ₁/₂).   (2)
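Equation (2) amounts to a one-line computation; the following is a minimal Python sketch (the 22° half-angle below is the example value used for the diodes later in the paper):

```python
import math

def lambertian_exponent(theta_half_deg):
    """Estimate the exponent m of the cos^m radiation model (Eq. 2)
    from the half-intensity angle stated in the diode datasheet."""
    return -math.log(2) / math.log(math.cos(math.radians(theta_half_deg)))

m = lambertian_exponent(22)  # diode with half-intensity angle of 22 degrees
```

For Θ₁/₂ = 60°, cos Θ₁/₂ = 1/2 and m = 1, i.e. an ideal Lambertian source.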

The irradiance of a single diode on a small patch at distance r and angle φ, perpendicular to the normal of the diode, is given as:

E(φ, r) = I(φ) cos φ / r² = I₀ cos^(m+1) φ / r².   (3)

N sources are placed on a planar circular ring with radius ρ around the optical axis of the camera. Assuming a Cartesian coordinate system with origin at the center of the camera, the z-axis along the optical axis of the camera and the xy-plane perpendicular to the optical axis of the camera (Fig. 2), the origin of the n-th diode is (ρ cos(2πn/N), ρ sin(2πn/N), 0). The irradiance on a small patch of a parallel plate at position (x, y, z) is denoted as E_e(x, y, z). It can be expressed as the sum of the individual irradiances of all N diodes:

E_e(x, y, z) = I₀ z^(m+1) Σ_{n=1}^{N} [ (x − ρ cos(2πn/N))² + (y − ρ sin(2πn/N))² + z² ]^(−(m+3)/2),   (4)

with the change of coordinates φ = cos⁻¹(z / √(x² + y² + z²)) and r = √(x² + y² + z²).

The uniformity of the irradiance distribution is largely dependent on the angle Θ₁/₂. As can be seen from Fig. 3, the irradiance of typical diodes with Θ₁/₂ = 22° on a plate perpendicular to the optical axis of the camera is not uniform; there is a significant fall-off from the center to the side. Uniform irradiance can be achieved by either using additional IREDs, e.g. a second ring, and/or tilting the diodes.16 Since the fall-off is monotonically decreasing, it is acceptable for the desired field of application. From the model of the irradiance distribution in Eq. (4) and the fundus reflectance of the eye,17 the radiance of the pupils can be calculated. The pupils then become a radiating source for the camera, and the image irradiance can be calculated. Due to the bright pupil effect, the pupils are generally the brightest spots in the captured images.
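Equation (4) can be evaluated numerically; the following sketch uses the example parameters from Fig. 3 (11 diodes, ρ = 1.05 cm, I₀ = 70 mW sr⁻¹, Θ₁/₂ = 22°) as defaults:

```python
import math

def ring_irradiance(x, y, z, N=11, rho=0.0105, I0=0.070, theta_half_deg=22.0):
    """Irradiance E_e(x, y, z) in W/m^2 on a plane parallel to the image
    plane, from N diodes on a ring of radius rho around the optical axis
    (Eq. 4). I0 is the peak radiant intensity in W/sr."""
    m = -math.log(2) / math.log(math.cos(math.radians(theta_half_deg)))
    total = 0.0
    for n in range(1, N + 1):
        dx = x - rho * math.cos(2 * math.pi * n / N)
        dy = y - rho * math.sin(2 * math.pi * n / N)
        total += (dx * dx + dy * dy + z * z) ** (-(m + 3) / 2)
    return I0 * z ** (m + 1) * total
```

Evaluating this on a grid of (x, y) positions reproduces the center-to-side fall-off shown in Fig. 3.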


Figure 3: Irradiance distribution E_e of 11 diodes on a plane parallel to the image plane, at (a) z = 0.66 m and (b) z = 1.2 m. The diodes are at distance ρ = 1.05 cm, have a peak radiant intensity of I₀ = 70 mW sr⁻¹ and an angle of half intensity of Θ₁/₂ = 22°.

2.2 Safety Issues
Although infrared radiation cannot create photochemical damage like ultraviolet radiation, high infrared irradiance levels can damage the human eye. As described in Sec. 2.1, multiple diodes are arranged in a ring to create a more powerful source. To ensure the existence of the bright pupil effect, the diodes are also arranged close to the optical axis of the camera, and the user will look directly into the radiation source. Near-infrared irradiance is particularly dangerous since the eye's protective mechanisms against bright light, such as the aversion response, are virtually nonexistent. The International Commission on Non-Ionizing Radiation Protection (ICNIRP) published threshold guidelines for the assessment of potential ocular hazards of infrared-emitting diodes.18, 19 The thresholds were also adopted by the European Union in its requirements for the protection of workers against the risks arising from exposure to artificial optical radiation.20 The biological effect of infrared radiance is merely thermal: infrared radiation has a heating effect on any surface it irradiates. Radiation in the relevant portion of the spectrum of 780 nm to 1400 nm (IR-A) can reach the fundus of the eye. Of the different potential hazards of incoherent optical radiation, the following should therefore be considered:18

• Thermal injury to the retina
• Near-infrared thermal hazards to the cornea and the lens

2.2.1 Retinal Hazards
The retinal irradiance can be determined from the radiance of the radiating source. The spectral radiance of the source L_λ is weighted in the relevant portion of the spectrum (780 nm to 1400 nm) with the retinal thermal hazard function R(λ):18

L_IR = Σ_{λ=780 nm}^{1400 nm} L_λ R(λ) Δλ,   (5)

with the retinal thermal hazard function in the IR-A portion of the spectrum:

R(λ) = 10^(0.002 (700 nm − λ)).   (6)

Although a diode generally has its peak spectral intensity at one wavelength, the whole spectral width should be considered. Assuming continuous emission operation of the diodes, the relevant threshold for thermal injury to the retina without a strong visual stimulus is, for t > 10 s:

L_IR ≤ (0.6 / α) W cm⁻² sr⁻¹,   (7)

with α the angle subtended by the source.

2.2.2 Cornea and Lens Hazards
To avoid thermal injury to the cornea and lens, the total irradiance of infrared radiation in the range from 780 nm to 3000 nm should be limited, for viewing times longer than 1000 s (≈ 16 min), to 10 mW cm⁻².18 For shorter durations, higher irradiance levels are permissible.
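The spectral weighting of Eqs. (5) and (6) can be sketched as follows (a minimal helper; the example spectrum below is hypothetical, not a measured diode spectrum):

```python
def retinal_hazard_radiance(spectrum, dlam=10.0):
    """Weighted radiance L_IR (Eq. 5): sum of spectral radiance samples
    L_lambda (in W cm^-2 sr^-1 nm^-1), taken every dlam nm between 780 nm
    and 1400 nm, weighted by the IR-A hazard function R(lambda) of Eq. 6."""
    def R(lam):
        return 10 ** (0.002 * (700.0 - lam))
    return sum(L * R(lam) * dlam
               for lam, L in spectrum.items() if 780 <= lam <= 1400)

# Hypothetical narrow-band source centered at 850 nm:
L_IR = retinal_hazard_radiance({840: 0.01, 850: 0.02, 860: 0.01})
```

The result would then be compared against the threshold of Eq. (7) for the source's subtended angle α.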

3. OBJECTIVE IMAGE QUALITY METRICS
For face recognition and other machine vision tasks, the quality of an image depends on the information it contains. For the desired application, the face of a person is illuminated by the near-infrared source described in Sec. 2. The radiation is reflected by the face and reaches the lens of the camera. Finally, the irradiance is converted by the sensor into digital values which can be interpreted by the face recognition system. All parts of the overall capture system influence the image quality. Here, we address the most important factors for the selection of suitable image sensors for the desired field of application.

The camera needs to be sensitive in the desired range of the spectrum. Standard silicon sensors are also sensitive in the near-infrared range, generally up to 1000 nm.21, 22 Commonly, however, an infrared cut-off filter is integrated into the camera to limit the sensitivity of the sensor to the visible range. The spectral sensitivity of sensors is generally reported by the manufacturer for the visible range, but is quite frequently missing for the infrared range. The exact information on how the measurements were performed is also often missing. This makes it difficult to rely on the data published by the manufacturers for the comparison of the spectral sensitivity of cameras. Therefore, measurements of the spectral sensitivity are generally required.23

The ability of the camera system to relate sensor irradiance to digital output values can be described using the Opto-Electronic Conversion Function (OECF). The OECF can be determined for the whole camera system, including the lens, using a test chart with patches of different densities.24 For the application, the performance of different cameras under given illumination conditions is of interest. The cameras need to be able to differentiate different levels of radiance reflected from the face of the person in front of the camera. ISO 15739 proposes the measurement of the signal-to-noise ratio (SNR) from the OECF test chart.25 Noise at application-typical irradiance levels is an important quality measure to consider when selecting a suitable camera for a given application. High sensitivity in the relevant range of the spectrum alone does not ensure good imagery. The signal-to-noise ratio refers to noise or uncertainty in the pixel intensity values at given irradiance conditions:

SNR = (mean pixel value) / (standard deviation of pixel values).   (8)

The dynamic range of a camera system indicates the range of irradiance levels that the system can differentiate. It is generally defined as the ratio of the highest to the smallest recognizable irradiance level for which the SNR is at least one.25

An image sensor cannot be selected without discussing the lens of the camera. The properties of the lens for a face recognition system under NIR illumination are for the most part determined by the operating conditions. The focal length and aperture size of the lens determine the field of view which can be captured sharply by the camera. A small aperture seems advantageous since the depth of field is large, but at small apertures the resolution becomes limited by diffraction. Diffraction is also wavelength dependent,26 and in the infrared the diffraction spot is larger than in the visible range. The most common performance measure to characterize image sharpness and resolution is the Modulation Transfer Function (MTF).27, 28 The MTF of a camera system is a measure of the contrast or the modulation of the image as a function of spatial frequency.
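A minimal sketch of the patch-based SNR measurement in Eq. (8), computed per uniform test patch (the patch values below are illustrative):

```python
import statistics

def patch_snr(pixel_values):
    """SNR of a uniform test patch (Eq. 8): mean pixel value divided by
    the standard deviation of the pixel values."""
    mean = statistics.fmean(pixel_values)
    sigma = statistics.pstdev(pixel_values)  # population standard deviation
    return mean / sigma

snr = patch_snr([10, 12, 8, 10])  # small illustrative patch
```

In practice each of the 20 chart patches yields one (irradiance level, SNR) pair, from which curves like Fig. 5b are plotted.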


Figure 4: Spectral sensitivity of the evaluated cameras: relative responsivity R_rel(λ) of Camera A and Camera B over the range 400 nm to 900 nm.

4. CAMERA CHARACTERIZATION
Two different cameras are compared under application-typical conditions using the quality metrics of Sec. 3. Both low-cost cameras have monochrome 1/3″ sensors and are sensitive in the relevant range of the spectrum. Camera A has a Sony ICX424AL CCD sensor and a resolution of 640 pixel × 480 pixel, while camera B has an MT9V032 CMOS sensor from Aptina and a resolution of 752 pixel × 480 pixel. Each camera has a NIR-coated F3.5/6 mm lens.

4.1 Spectral Sensitivity
Measurements were taken by projecting monochromatic light of different wavelengths λ onto the cameras. A stabilized light source with a known spectral distribution and a set of narrow-band interference filters was used to generate monochromatic light at different wavelengths.23 The interference filters cover the range of 380 nm to 720 nm in 10 nm steps, plus 750 nm, 800 nm, 850 nm and 905 nm. The camera parameters were kept identical during a series of measurements for one camera. All gains were set to their lowest value. The pixel intensities were averaged at the center of the illuminated area of each image. Instead of the absolute numbers, the relative spectral responsivity R_rel(λ) of the cameras is presented in Fig. 4, with the peak of the spectral responsivity scaled to 1. The peak sensitivity of camera A in Fig. 4 is around 540 nm, while the peak of camera B is at 750 nm. This is typical behavior for CCD and CMOS sensors: generally, the spectral sensitivity of CMOS sensors is shifted slightly towards the infrared in comparison to CCD sensors.
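The normalization step can be sketched as follows (a hypothetical helper, not the authors' actual processing code; `mean_dn` holds the averaged pixel values per wavelength and `source_power` the known relative spectral power of the light source behind each filter):

```python
def relative_responsivity(mean_dn, source_power):
    """Relative spectral responsivity R_rel(lambda): divide the mean
    digital value at each wavelength by the source's spectral power
    there, then scale the result so the peak equals 1."""
    resp = {lam: mean_dn[lam] / source_power[lam] for lam in mean_dn}
    peak = max(resp.values())
    return {lam: r / peak for lam, r in resp.items()}

r = relative_responsivity({540: 120.0, 850: 60.0}, {540: 1.0, 850: 1.0})
```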

4.2 Opto-Electronic Conversion Function
The Opto-Electronic Conversion Function is measured based on ISO 14524,24 with the lens mounted and using application-typical lighting. The near-infrared irradiance source described in Sec. 2 was mounted to an integrating sphere. A transmissive OECF and noise test chart29 (TE241) with 20 test patches of different densities was illuminated by the integrating sphere. The infrared transmissibility of the density patches was verified with a spectrometer, relative to the reference patch corresponding to a density value of 0. The Opto-Electronic Conversion Functions in Fig. 5a are each calculated from six analyzed images. To obtain comparable results under the illumination conditions, similar exposure settings were used for both cameras. Both cameras show very similar performance. The only significant difference is the black level: camera B has a higher black level, which can easily be compensated.


Figure 5: OECF and SNR at different irradiance levels, measured using the TE241 test chart: (a) OECF, digital values versus patch density, for cameras A and B; (b) SNR versus patch density (logarithmic scale).

Figure 6: MTF of two different lenses at application-typical operating conditions, shown as contrast versus resolution [LP/PH] with the Nyquist frequency marked: (a) F3.5/6 mm lens (cameras A and B); (b) F2.8/6 mm lens (camera B).

4.3 Noise
The signal-to-noise ratio (Fig. 5b) at application-typical illumination conditions is the most significant quality measure. The SNR at different irradiance levels and the OECF were measured simultaneously using the same test chart.25 At low irradiance levels both cameras perform similarly; the three patches with the highest densities are indistinguishable. The dynamic range of the two cameras, measured as the ratio of the irradiation levels where the cameras reach their maximum unclipped output signal and where the SNR passes the value of one,25 is therefore similar. Camera B generally shows a better signal-to-noise ratio at medium irradiation levels, and its mean SNR is therefore better.
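The dynamic range estimate described above can be sketched as follows (a simplified helper following the ISO 15739 style definition; the sample values are illustrative, not measured data):

```python
def dynamic_range(levels, snrs, max_unclipped):
    """Ratio of the highest unclipped irradiance level to the lowest
    irradiance level at which the SNR reaches 1 (ISO 15739 style)."""
    lowest = min(e for e, s in zip(levels, snrs) if s >= 1.0)
    return max_unclipped / lowest

dr = dynamic_range([1, 2, 4, 8, 16], [0.5, 1.0, 5.0, 20.0, 40.0], 16)
```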

4.4 Modulation Transfer Function
The interaction of camera and lens under the illumination conditions of the application is evaluated using the Modulation Transfer Function. The MTF is measured using a modulated Siemens star.28, 29 The Siemens star is illuminated using the infrared illumination source of Sec. 2. Only the resolution in the center, at an application-typical distance of 80 cm, is determined; the desired field of application is face recognition, and it can be assumed that the face of the person to be recognized is in the center. The limiting resolution is defined as the frequency where the MTF of the camera drops to a 10 % contrast value. For both cameras (Fig. 6a), the limiting resolution is below the Nyquist frequency. This is more a statement about the quality and the aperture diameter of the lens: in the case of the F3.5/6 mm M12-mount lens, in combination with both cameras, the resolution is limited by diffraction. A lens with a slightly larger aperture seems more suitable in order to use the full resolution of the cameras.


Figure 6b shows the experiment repeated for camera B, equipped with an NIR-coated lens with a larger aperture (F2.8/6 mm). With this lens and its limiting resolution of 205 LP/PH, almost the full resolution of the camera can be used.
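The 10 % limiting-resolution criterion can be sketched as follows (a minimal helper that linearly interpolates between measured MTF points; the sample curve is illustrative):

```python
def limiting_resolution(freqs, contrasts, threshold=0.10):
    """Frequency [LP/PH] at which a measured MTF curve first drops to the
    given contrast threshold, found by linear interpolation between
    neighboring measurement points. Returns None if it never drops."""
    for (f0, c0), (f1, c1) in zip(zip(freqs, contrasts),
                                  zip(freqs[1:], contrasts[1:])):
        if c0 >= threshold > c1:
            return f0 + (c0 - threshold) * (f1 - f0) / (c0 - c1)
    return None

res = limiting_resolution([100, 150, 200, 250], [0.35, 0.22, 0.12, 0.06])
```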

5. CONCLUSION
Image quality is critical for face recognition. It depends on the overall capture system, including the process environment, illumination conditions and imaging sensor. An illumination model for a face recognition system under active near-infrared illumination was developed. Potential hazards of the infrared illumination to the eyes were outlined using current safety guidelines. The key parameters of the capture system that influence the image quality for the desired field of application were discussed. Two cameras with low-cost image sensors were characterized using the identified parameters under application-typical conditions.

ACKNOWLEDGMENTS
This work is part of the project KabTec – Modulares integriertes Sicherheitssystem, funded by the German Federal Ministry of Economics and Technology.

REFERENCES
[1] Zhao, W., Chellappa, R., Phillips, P. J., and Rosenfeld, A., "Face recognition: A literature survey," ACM Computing Surveys 35(4), 399–458 (2003).
[2] Phillips, P. J., Scruggs, W. T., O'Toole, A. J., Flynn, P. J., Bowyer, K. W., Schott, C. L., and Sharpe, M., "FRVT 2006 and ICE 2006 Large-Scale Results," Tech. Rep. NISTIR 7408, National Institute of Standards and Technology (2007).
[3] Dowdall, J., Pavlidis, I., and Bebis, G., "Face detection in the near-IR spectrum," Image and Vision Computing 21(7), 565–578 (2003).
[4] Zou, X., Kittler, J., and Messer, K., "Ambient illumination variation removal by active near-IR imaging," in [Proc. International Conference on Biometrics], 19–25 (January 2006).
[5] Li, S. Z., Chu, R., Liao, S., and Zhang, L., "Illumination invariant face recognition using near-infrared images," IEEE Transactions on Pattern Analysis and Machine Intelligence 29(4), 627–639 (2007).
[6] Morimoto, C. H., Koons, D., Amir, A., and Flickner, M., "Pupil detection and tracking using multiple light sources," Image and Vision Computing 18(4), 331–335 (2000).
[7] Haro, A., Flickner, M., and Essa, I., "Detecting and tracking eyes by using their physiological properties, dynamics, and appearance," in [Proc. IEEE Conference on Computer Vision and Pattern Recognition], 1, 163–168 (June 2000).
[8] Zhao, S. and Grigat, R.-R., "Robust eye detection under active infrared illumination," in [Proc. 18th International Conference on Pattern Recognition (ICPR 2006)], 481–484 (August 2006).
[9] Devereux, H. and Smalley, M., "Are infra red illuminators eye safe?," in [Proc. IEEE 29th Annual 1995 International Carnahan Conference on Security Technology], 480–481 (October 1995).
[10] Kong, S. G., Heo, J., Abidi, B. R., Paik, J., and Abidi, M. A., "Recent advances in visual and infrared face recognition: a review," Computer Vision and Image Understanding 97(1), 103–135 (2005).
[11] Wolff, L. B., Socolinsky, D. A., and Eveland, C. K., [Computer Vision Beyond the Visible Spectrum], ch. Face Recognition in the Thermal Infrared, 167–191, Springer, London (2005).
[12] Zhao, S. and Grigat, R.-R., "An automatic face recognition system in the near infrared spectrum," in [Proc. 4th International Conference on Machine Learning and Data Mining in Pattern Recognition (MLDM 2005)], 437–444 (July 2005).
[13] Watanabe, J., Ando, H., Sekiguchi, D., Maeda, T., and Tachi, S., "The study of remote saccade sensing system based on retroreflective feature of the retina," in [Proc. 13th International Conference on Artificial Reality and Telexistence], 77–82 (December 2003).
[14] Wood, D., [Optoelectronic Semiconductor Devices], Prentice Hall, London (1994).
[15] Moreno, I. and Sun, C.-C., "Modeling the radiation pattern of LEDs," Optics Express 16(3), 1808–1819 (2008).
[16] Moreno, I., Muñoz, J., and Ivanov, R., "Uniform illumination of distant targets using a spherical light-emitting diode array," Optical Engineering 46(3), 033001 (2007).
[17] Atchison, D. and Smith, G., [Optics of the Human Eye], Butterworth-Heinemann, Oxford (2000).
[18] International Commission on Non-Ionizing Radiation Protection (ICNIRP), "Guidelines on Limits of Exposure to Broad-Band Incoherent Optical Radiation (0.38 to 3 μm)," Health Physics 73(3), 539–554 (1997).
[19] International Commission on Non-Ionizing Radiation Protection (ICNIRP), "Light-Emitting Diodes (LEDs) and Laser Diodes: Implications for Hazard Assessment," Health Physics 77(2), 744–752 (2000).
[20] European Union, Directive 2006/25/EC of the European Parliament and of the Council of 5 April 2006 on the minimum health and safety requirements regarding the exposure of workers to risks arising from physical agents (artificial optical radiation), Official Journal 49, 38–59 (April 2006).
[21] Janesick, J. R., [Scientific Charge-Coupled Devices], SPIE Press, Bellingham, WA (2001).
[22] Holst, G. C. and Lomheim, T. S., [CMOS/CCD Sensors and Camera Systems], SPIE Press, Bellingham, WA (2007).
[23] Image Engineering Dietmar Wueller, Frechen, Germany, camSPECS – Measurement of Spectral Response.
[24] ISO 14524, Photography – Electronic still-picture cameras – Methods for measuring opto-electronic conversion functions (OECFs). International Organization for Standardization (ISO) (1999).
[25] ISO 15739, Photography – Electronic still-picture cameras – Noise measurements. International Organization for Standardization (ISO) (2003).
[26] Daniels, A., [Field Guide to Infrared Systems], SPIE Press, Bellingham, WA (2006).
[27] ISO 12233, Photography – Electronic still-picture cameras – Resolution measurements. International Organization for Standardization (ISO) (2000).
[28] Loebich, C., Wueller, D., Klingen, B., and Jaeger, A., "Digital camera resolution measurement using sinusoidal Siemens stars," in [Proc. SPIE, Digital Photography III], 6502, 65020N (January 2007).
[29] Wueller, D., "Evaluating digital cameras," in [Proc. SPIE, Digital Photography II], 6069, 60690K (January 2006).
