International Journal of Remote Sensing

ISSN: 0143-1161 (Print) 1366-5901 (Online) Journal homepage: http://www.tandfonline.com/loi/tres20

Semi professional digital camera calibration techniques for Vis/NIR spectral data acquisition from an unmanned aerial vehicle

Luís Guilherme Teixeira Crusiol, Marcos Rafael Nanni, Guilherme Fernando Capristo Silva, Renato Herrig Furlanetto, Anderson Antonio da Silva Gualberto, Aline de Carvalho Gasparotto & Mariana Nunes De Paula

To cite this article: Luís Guilherme Teixeira Crusiol, Marcos Rafael Nanni, Guilherme Fernando Capristo Silva, Renato Herrig Furlanetto, Anderson Antonio da Silva Gualberto, Aline de Carvalho Gasparotto & Mariana Nunes De Paula (2017) Semi professional digital camera calibration techniques for Vis/NIR spectral data acquisition from an unmanned aerial vehicle, International Journal of Remote Sensing, 38:8-10, 2717-2736

To link to this article: http://dx.doi.org/10.1080/01431161.2016.1264032

Published online: 08 Dec 2016.


INTERNATIONAL JOURNAL OF REMOTE SENSING, 2017 VOL. 38, NOS. 8–10, 2717–2736 http://dx.doi.org/10.1080/01431161.2016.1264032

Semi professional digital camera calibration techniques for Vis/NIR spectral data acquisition from an unmanned aerial vehicle

Luís Guilherme Teixeira Crusiol, Marcos Rafael Nanni, Guilherme Fernando Capristo Silva, Renato Herrig Furlanetto, Anderson Antonio da Silva Gualberto, Aline de Carvalho Gasparotto and Mariana Nunes De Paula

Remote Sensing and Geoprocessing Laboratory, Department of Agronomy, Maringá State University, Maringá, Brazil

ABSTRACT

Unmanned aerial vehicles (UAVs) equipped with multispectral digital cameras are very effective for acquiring spectral information. To check the radiometric response of a multispectral digital camera and allow a better investigation of target spectral dynamics, the camera must be carefully calibrated under laboratory and field conditions. This article aimed to perform the radiometric calibration of a multispectral Vis/NIR camera attached to a UAV, using a hyperspectral sensor as reference. First, a laboratory calibration was performed under controlled conditions. Then, a field calibration under uncontrolled conditions was carried out. Finally, a cross calibration between laboratory and field data was developed. A multispectral Fujifilm S200-EXR digital camera, sensitive to infrared radiation and with 8 bits of radiometric resolution (256 digital numbers (DNs)), and a Fieldspec 3 Jr spectroradiometer (ASD Inc.), with spectral resolution of 3 nm between 350 and 1400 nm and 30 nm between 1400 and 2500 nm, were used for spectral data acquisition. Six tarpaulins were used as reference targets. In the laboratory, reference values were collected with the Fieldspec 3 Jr, and photographs of the reference targets and of the Spectralon panel were taken with no optical filter and with seven optical filters that block visible (Vis) and near-infrared (NIR) radiation at different intensities. In the field, reflectance values were collected by the hyperspectral sensor at ground level, and Vis/NIR images were taken with two identical Fujifilm S200-EXR cameras coupled to a Tarot Iron Man 1000 octocopter (UAV) at 200 and 600 m flight altitude on 22 January 2016 under clear weather conditions. All models and calibration equations obtained by correlating DNs and reflectance values were significant according to Student's t-test (p ≤ 0.05). Both in the laboratory and in the field, strong correlations (r > 0.90) were obtained for the red, green, blue, and NIR bands, except for the red band (r = 0.88). The models obtained by cross calibration (laboratory versus field) for the red, green, blue, and NIR bands also presented high Pearson coefficients (r > 0.90). Under these circumstances, the calibration models for the red, green, blue, and NIR bands point out the potential of cross calibration, in such a way that the reference targets' reflectance values can be acquired in the laboratory while DNs are collected under field conditions from digital images obtained by cameras attached to UAVs, enabling reliable spectral information to be obtained easily.

ARTICLE HISTORY
Received 27 July 2016; Accepted 4 November 2016

CONTACT Luís Guilherme Teixeira Crusiol, [email protected], Maringá State University, Colombo Avenue, 5790, J-45, 87020-900 Maringá, Brazil

© 2016 Informa UK Limited, trading as Taylor & Francis Group

1. Introduction

Remote sensing, based on the interaction between electromagnetic radiation, target, and sensor, provides spectroradiometric information at different levels of acquisition. Currently, unmanned aerial vehicles (UAVs) have been used to collect information in different knowledge areas, such as forest monitoring (Getzin, Nuske, and Wiegand 2014; Aicardi et al. 2016; Merino et al. 2012; Zhang et al. 2016), geology (Bemis et al. 2014; Vasuki et al. 2014), geomorphology (Oleire-Oltmanns et al. 2012), marine monitoring (Ventura et al. 2016), glacier dynamics (Immerzeel et al. 2014), and agriculture (Gonzalez-Dugo et al. 2013; Johnson 2014; Lofton et al. 2012).

UAVs equipped with digital cameras are very effective for acquiring spectral information for several reasons: the low weight of the sensors (Lelong et al. 2008), the possibility of obtaining high spatial resolution (Berni et al. 2009), independence from orbital images, which have limited temporal resolution and may be affected by cloud cover (Xiang and Tian 2011), and the low cost of the equipment compared with some sensors available on the market. Digital cameras are generally equipped with a charge-coupled device (CCD) detector, sensitive to electromagnetic radiation between 350 and 1100 nm (Lebourgeois et al. 2008). To select the wavelengths incident on the sensor, interchangeable optical filters are widely used (e.g. Hunt et al. 2010; Lelong et al. 2008; Nebiker et al. 2008).

The spectral response of a remote sensor is the key to understanding the data it produces and to processing them properly (Barsi et al. 2014). In digital cameras, commonly equipped with a CCD detector, the incident radiation is distributed over the pixels of the acquired image and then converted into an electric signal, with the output provided as digital numbers (DNs) (Mangold, Shaw, and Vollmer 2013). However, particular attention is needed to the relationship between DN and radiance (Kuusk and Paas 2007), especially for the near-infrared (NIR) band, which may have some limitations (Laliberte et al. 2011). DNs may change according to the atmospheric and illumination conditions at the moment of the flight mission and image acquisition (Honkavaara et al. 2009), which prevents them from being treated as a quantitative unit; only an understanding of the radiometric behaviour of each pixel makes it possible to obtain quantitative information about specific targets (Kelcey and Lucieer 2012; Del Pozo et al. 2014). Thus, Schaepman-Strub et al. (2006) affirm that the prerequisite for the acquisition of quantitative values is the radiometric calibration of the sensor used. The ratio between a specific target's radiance and the radiance of a standard surface results in reflectance values; this conversion allows the integration of spectral data from different systems and remote sensors (Schaepman-Strub et al. 2006) and a better spectral characterization of targets, since atmospheric and illumination interferences are eliminated by the conversion (Honkavaara et al. 2009).
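For reference, the radiance-to-reflectance conversion described above is commonly written as below. This is a standard formulation added here for clarity; the subscripts and the panel reflectance term are not given explicitly in the article:

\[
\rho_{\lambda} \;=\; \frac{L_{\mathrm{target},\lambda}}{L_{\mathrm{ref},\lambda}}\;\rho_{\mathrm{ref},\lambda},
\]

where \(L_{\mathrm{target},\lambda}\) and \(L_{\mathrm{ref},\lambda}\) are the radiances of the target and of the reference surface (e.g. a Spectralon panel) and \(\rho_{\mathrm{ref},\lambda}\) is the calibrated reflectance of that surface.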

INTERNATIONAL JOURNAL OF REMOTE SENSING

2719

Digital camera calibration has been studied in the laboratory for over 30 years (Clark and Fryer 1998; Fraser 1997, 1998; Healey 1994). However, considering the difference between field and laboratory operating conditions, calibration should not be conducted only in laboratories (Honkavaara et al. 2009). Field collections contribute positively to the empirical radiance/reflectance relationship, and a larger number of calibration targets improves the quality of the results (Smith and Milton 1999). Calibration models must be adjusted specifically to each need, so that each camera has its own model, avoiding the use of the same model for distinct cameras (Withagen, Groen, and Schutte 2007). Thus, the calibration of remote sensors at different levels of data acquisition is of great interest to researchers, e.g. at the orbital level (Brook and Ben Dor 2011; Ponzoni, Zullo Junior, and Lamparelli 2008; Barsi et al. 2014; Mishra et al. 2014), at the aerial level with airborne sensors (Zhang 2000; Quemada, Gabriel, and Zarco-Tejada 2014), and at the aerial level with sensors attached to UAVs (Berni et al. 2009; Lebourgeois et al. 2008; Sanseemoung et al. 2012; Honkavaara et al. 2009, 2013; Laliberte et al. 2011; Zarco-Tejada, Gonzáles-Dugo, and Berni 2012).

In Brazil, because of the country's large territorial extension (millions of square kilometres), the use of UAVs is commonly associated with the monitoring of crops and the environment. Natural resources and agricultural crops have spectral behaviour that changes over time and therefore requires multitemporal imagery. What has been observed is that many researchers and companies involved in crop and natural-resource monitoring carry out their flights and apply different methods to process their images, obtaining divergent and sometimes anomalous results and assuming those results to be true information. Because atmospheric and illumination conditions interfere with the acquired images, radiometric calibration of the sensor attached to the UAV becomes extremely important and should be performed on every mission flight. Thus, a protocol or statement on image acquisition by low-cost cameras attached to UAVs and on processing methodologies would be very useful to improve the reliability of spectral data acquisition and to support the development of this growing technology in such a promising area of study.

Based on current progress, this article aimed to perform the radiometric calibration of a multispectral visible (Vis)/NIR semi-professional camera attached to a UAV, using as reference a hyperspectral Fieldspec 3 Jr spectroradiometer (ASD Inc.), an instrument with a well-established record of scientific and methodological use. The subject of this article has been widely discussed in many journals because of the improvements in methods and techniques needed to provide better aerial imagery and image processing using UAVs, following the fast development of this technology. However, this article proposes the use of a semi-professional digital camera that is easy to operate and low cost compared with other sensors. Furthermore, a cross-calibration method between laboratory and field data is proposed, reducing the time and the amount of spectral data to be collected. In addition, this article explores the nonlinear relationship between the digital camera and the spectroradiometer, which has so far been discussed by only a few authors.

2. Material and methods

This study was carried out at the Remote Sensing and Geoprocessing Laboratory of the Agronomy Department of Maringá State University, Maringá, Paraná State, Brazil.


The calibration process of sensors such as the digital camera used here can be divided into three steps: in the laboratory, to understand the spectral behaviour of the sensor before taking it to field conditions; on board, through aerial images; and in situ, which enables the sensor's absolute calibration (Dinguirard and Slater 1999). In addition to those calibration steps, this article proposes a cross-calibration method between reflectance collected in the laboratory and DNs collected under field conditions, which reduces the calibration time and the amount of collected data, allowing the calibration process to be performed easily on every single mission flight. First, to evaluate the behaviour of the sensors used (digital camera and spectroradiometer), a laboratory calibration was performed under controlled conditions. Then, a field calibration under uncontrolled conditions was carried out. Finally, a cross calibration between laboratory and field data was developed.
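As an illustration of how the three calibrations reduce to the same fitting step, the sketch below pairs per-target mean DNs with mean reflectance factors and fits a band model with NumPy. It is a minimal sketch: the function names and the numeric values are hypothetical, and the second-degree polynomial form anticipates the model choice justified in Section 3.

```python
import numpy as np

def fit_band_calibration(dn, reflectance, degree=2):
    """Fit reflectance = f(DN) for one spectral band.

    dn, reflectance: 1-D arrays with one per-target mean value per
    calibration tarpaulin. Returns polynomial coefficients (highest
    order first); the same routine serves the laboratory, field, and
    cross calibrations, only the input pairs change.
    """
    return np.polyfit(dn, reflectance, degree)

def apply_calibration(coeffs, dn):
    """Convert DNs to reflectance factors with a fitted model."""
    return np.polyval(coeffs, dn)

# Cross calibration: laboratory reflectance vs. field DNs (hypothetical values).
lab_reflectance = np.array([0.05, 0.12, 0.21, 0.34, 0.48, 0.63])  # six tarpaulins
field_dn = np.array([38.0, 71.0, 102.0, 141.0, 183.0, 221.0])
model = fit_band_calibration(field_dn, lab_reflectance)
print(apply_calibration(model, 150.0))  # predicted reflectance for DN = 150
```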

2.1. Sensors used for spectral data acquisition

Two sensors were used for spectral data acquisition:

(a) Two identical semi-professional Fujifilm S200-EXR digital cameras, with an internal charge-coupled device (CCD) detector, sensitive to infrared radiation and with 8 bits of radiometric resolution (256 DNs). The cameras used were originally manufactured without the internal NIR-blocking filter, and the DNs are provided in three spectral bands (red, green, and blue). One camera was used for Vis spectral data acquisition and the other for NIR spectral data acquisition (as described below). To check their radiometric response, both cameras were evaluated using an optical integrating sphere (UMKK-280, Gigahertz Optik Inc.), as presented in Figure 1.

Figure 1. Digital camera positioned in front of the optical integrating sphere for radiometric response assessment.


The two cameras showed no significant difference (Student's t-test; p ≤ 0.05) between their radiometric responses in the red, green, and blue bands. The images collected by each camera using the optical integrating sphere were evaluated statistically in the Envi 5.3 software, using the 'ROI tool' with the entire image selected, to check for interference from the vignetting effect, which is caused by internal errors of the sensor and may lead to a progressive decrease of the sensitivity of the CCD detector from the centre to the boundaries of the image (Kelcey and Lucieer 2012); a sketch of this whole-image check is given after this list.

(b) A Fieldspec 3 Jr spectroradiometer (ASD Inc.), with spectral resolution of 3 nm between 350 and 1400 nm and 30 nm between 1400 and 2500 nm. The ASD Pistol accessory, which directs the incident radiation flux towards the reference target, and the ASD 8° lens accessory, which allows radiation values to be acquired over a very restricted area, avoiding spectral interference from adjacent targets, were used.
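The whole-image check mentioned above (Envi 5.3 'ROI tool' statistics over the integrating-sphere photographs) can be reproduced with any array library. The sketch below, assuming an RGB image loaded as a NumPy array, computes the per-band mean, standard deviation, and coefficient of variation reported in Table 1; the synthetic flat-field values are illustrative only.

```python
import numpy as np

def band_uniformity_stats(image):
    """Per-band mean, standard deviation and coefficient of variation (%) of
    a flat-field photograph of the integrating sphere (H x W x 3 RGB array).
    A low coefficient of variation indicates negligible vignetting."""
    stats = {}
    for i, band in enumerate(("red", "green", "blue")):
        values = image[:, :, i].astype(float).ravel()
        mean, std = values.mean(), values.std()
        stats[band] = {"mean": mean, "std": std, "cv_percent": 100.0 * std / mean}
    return stats

# Synthetic flat-field frame with per-band levels close to those in Table 1.
rng = np.random.default_rng(0)
flat = rng.normal(loc=[209.5, 161.9, 111.5], scale=1.5, size=(300, 400, 3))
print(band_uniformity_stats(flat))
```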

2.2. Calibration reference targets

The number of calibration targets directly affects the quality of the calibration model (Smith and Milton 1999); however, there is no established statement about the minimum number of calibration targets. Accordingly, six tarpaulins in different shades were used: red, green, blue, yellow, dark grey, and light grey. The tarpaulins used in the laboratory to capture spectral values with the Fieldspec 3 Jr measured 30 × 30 cm; those used to capture the multispectral images measured 5 × 5 cm (the six tarpaulins were mounted on the same plate to avoid illumination differences among the targets). In the field, the targets measured 5 × 5 m for the acquisition of spectral values by the two sensors. The tarpaulins used in the laboratory and in the field were identical and in the same state of conservation.

2.3. Laboratory and field spectral data collection

In the laboratory, both sensors were operated under controlled conditions, with complete isolation from external light sources; the ceiling, floor, and walls were dark coloured. During spectral data acquisition, the laboratory was maintained at 17°C. The geometry between the light source (600 W halogen lamp) and the calibration target was set at 30°. For photograph acquisition, the camera lens was positioned 25 cm from the reference targets, and the 8° lens accessory of the Fieldspec 3 Jr was positioned 8 cm from them (Figure 2). Seven optical filters were used to select the wavelengths incident on the multispectral camera's CCD: one that transmits only visible radiation (ultraviolet-infrared cut-off, uv-ir-cut-off) and six that transmit only infrared radiation, with different cut-off wavelengths: 630, 680, 720, 760, 850, and 950 nm. To evaluate the filters' efficiency, as displayed in Figure 3, the Fieldspec 3 Jr was used with the Spectralon panel (Labsphere©) as the reference for reflectance acquisition. Each filter was then placed in front of the 8° lens, so that the reflectance values transmitted through it could be captured.



Figure 2. Scheme of reflectance and digital number (DN) acquisition in the laboratory.

Figure 3. Spectral features of the seven optical filters evaluated, acquired with the Spectralon panel and the spectroradiometer.

Two photographs were taken with each filter used: one containing the 5 × 5 cm calibration targets and the other containing the Spectralon panel. In addition, a photograph of the reference targets was taken without any optical filter. To avoid external interference with the reference targets' spectral behaviour, the lens focus was locked and a white balance was performed for each illumination condition of photograph acquisition. The photographs were taken in RAW format, large size 4:3 with 4000 × 3000 pixels, at ISO 800. This ISO was chosen because of the quality of the resulting images. Laliberte et al. (2011) state that DNs may vary according to the ISO configuration; however, this article did not aim to evaluate the influence of ISO on the obtained images but to propose and develop calibration models, which proved satisfactory at ISO 800. Thus, ISO 800 was standardized for laboratory and field spectral data collection.


The Fujifilm S200-EXR was set to EXR mode, which controls the aperture and the shutter opening time according to the amount of incident radiation needed to sensitize the CCD detector. For acquisition of the dark frame and offset values, a photograph was taken in complete absence of illumination and with the lens covered by a dark body. From each 30 × 30 cm reference target, five reflectance values were collected with the Fieldspec 3 Jr, using the Spectralon panel as the reflectance reference.

For field spectral data acquisition (Figure 4), the reference targets were placed on the soccer field of Maringá State University. The site was chosen because of the absence of buildings and large vegetation, minimizing external interference with the obtained data. Spectral data collection took place on 22 January 2016 (summer season in Brazil) under clear weather conditions, without cloud cover, between 11 am and 12 pm. The flight altitude was 200 m relative to the reference targets. For this purpose, two identical Fujifilm S200-EXR cameras, with the same angle of view, were coupled to a UAV. The cameras were equipped with the uv-ir-cut-off and 760 nm filters, respectively, thus allowing the simultaneous acquisition of red (R), green (G), blue (B), and NIR band DNs. Both cameras were set following the configuration used for the laboratory spectral data acquisition. The cameras were triggered simultaneously, which avoids radiometric differences between the R, G, B, and NIR spectral bands (Gehrke and Greiwe 2014). After the 200 m flight, a 600 m flight was performed on the same date and time as the calibration flight, using only the camera equipped with the uv-ir-cut-off filter. The purpose of the 600 m flight was to check whether the atmospheric and illumination interferences were accounted for by the calibration models even at a greater flight altitude.


Figure 4. Scheme of reflectance and DN acquisition under field conditions.


Reflectance values (10 from each reference target) were collected by the hyperspectral sensor, with the ASD Pistol and 8° lens positioned 0.80 m from the reference targets. Once more, the Spectralon panel was used as the reflectance reference.

2.4. Unmanned aerial vehicle

The assessment used an octocopter UAV (Figure 5) manufactured by Tarot, model Iron Man 1000, with a 3k carbon-fibre frame, equipped with T-Motor MN5212 brushless motors (340 kV, 600 W of power per axis) and 17 × 5.5 in 3k carbon-fibre propellers. The UAV platform was developed for transporting the multispectral and hyperspectral sensors, with a payload capacity of 4 kg and a flight autonomy of about 20 min using a lithium-polymer (LiPo) battery of 16,000 mAh, 6S, 22.8 V, with a discharge rate of 10 C. A camera stabilization system was built aboard the UAV platform to avoid vibrations during image capture in flight. The UAV is controlled by a 3DR Robotics Pixhawk autopilot, consisting of two 32 bit Cortex M4 processors at 128 MHz and a stabilization system with a gyroscope (ST Micro L3GD20H, 16 bit), a three-axis (x–y–z) accelerometer (Invensense MPU 6000), and a barometer (MEAS MS5611), with a z-axis stabilization accuracy of 10 cm. Absolute positioning of the UAV was obtained with a high-precision Ublox LEA-6H GPS. The flights for image acquisition were planned in the Mission Planner software using waypoints at the altitudes set out for image acquisition in this study. At the ground station, a Futaba T14SG transmitter operating at 2.4 GHz with the FASSTest transmission protocol was used for communication with the UAV; position, altitude, and flight-time data and captured images were obtained by telemetry operating at 900 MHz, connected to a notebook. A first-person-view (FPV) transmission system at 5.8 GHz was used for live monitoring of the flight, allowing corrections to the positioning of the UAV over the calibration targets when necessary.

Figure 5. Unmanned aerial vehicle operation system.


2.5. Multispectral image and hyperspectral data processing

All multispectral images acquired in the field and in the laboratory were converted from RAW to TIFF format with the FinePix Viewer software provided by the camera manufacturer (Fujifilm). In digital cameras the red, green, and blue spectral bands may be sensitive to NIR radiation (Bobbe, Mclean, and Zigadlo 1995; Zigadlo et al. 2001), the red band being the one with the highest sensitivity to NIR radiation (Hunt et al. 2010). Therefore, the images obtained using the infrared optical filters were processed with two different approaches: (a) assuming the DN values of the red band to be infrared values, and (b) equalizing the DN values of the red, green, and blue bands, converting the image into grey scale, and assuming these values to be infrared values. As stated by Honkavaara et al. (2009), Del Pozo et al. (2014), and Smith and Milton (1999), no atmospheric correction was performed on the obtained images (either for the 200 m or for the 600 m flight); these authors propose that the calibration into reflectance values eliminates atmospheric and illumination interferences. Then, using the Envi 5.3 software, the DN values of each reference target were extracted from every obtained image (red, green, blue, and IR bands, under both processing approaches). Finally, the offset value of each band was subtracted from the DNs, resulting in the values used in the calibration process. Reflectance data obtained with the Fieldspec 3 Jr spectroradiometer were exported and converted from .asd to .txt format with Viewspec Pro (ASD 2008). The reflectance values were separated into red, green, blue, and NIR bands (according to the optical filter used), as described by Nanni and Demattê (2006), providing data normalization and better suitability of the data.
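The two NIR-extraction approaches and the offset subtraction described above can be summarized as in the sketch below. It assumes the photograph has already been converted to a NumPy array; the grey-scale equalization in approach (b) is implemented here as a simple average of the three bands, since the article does not specify the exact formula, and all numeric values are hypothetical.

```python
import numpy as np

def nir_from_image(rgb, offset, approach="b"):
    """Derive a NIR digital-number layer from an RGB photograph taken
    through an infrared optical filter (Section 2.5 approaches).

    rgb    : H x W x 3 array of DNs (red, green, blue).
    offset : per-band dark-frame offsets, e.g. np.array([o_r, o_g, o_b]).
    (a) use the red band directly as the NIR layer;
    (b) equalize R, G and B into a grey-scale layer and use it as NIR.
    """
    corrected = rgb.astype(float) - offset  # subtract the dark-frame offset per band
    if approach == "a":
        return corrected[:, :, 0]
    return corrected.mean(axis=2)  # grey-scale average of R, G, B

# Illustrative call on a synthetic 4 x 4 photograph (offsets are hypothetical).
photo = np.ones((4, 4, 3)) * np.array([180.0, 175.0, 170.0])
nir_b = nir_from_image(photo, offset=np.array([2.0, 2.0, 3.0]), approach="b")
print(nir_b[0, 0])  # about 172.67 for these made-up numbers
```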

2.6. Development of the laboratory and field calibrations

For the laboratory calibration, the correlation between the mean DNs and the mean reflectance values of the calibration targets was established for the R, G, and B bands, with and without the uv-ir-cut-off optical filter, and for the NIR band with the six optical filters of different cut-off wavelengths, under the two processing approaches. For the field calibration, the correlation between mean DNs and mean reflectance was established for the R, G, and B bands with the uv-ir-cut-off optical filter and for the NIR band with the 760 nm optical filter. For the cross calibration (field versus laboratory), the correlation was established between the DNs of the calibration targets under field conditions and their reflectance values in the laboratory, for the R, G, B, and NIR bands.

2.7. Pseudoreflectance factor obtained from the DNs of the Spectralon panel

The DN values of the Spectralon panel in the R, G, B, and NIR bands were extracted, and the ratio between the calibration targets' DNs and the Spectralon panel's DNs was then computed, resulting in a pseudoreflectance factor. The resulting pseudoreflectance factors of the calibration targets were correlated with the reflectance of the calibration targets acquired by the spectroradiometer. This procedure was carried out in the laboratory and in the field, allowing the development of a model for obtaining the pseudoreflectance factor solely from digital photographs of the calibration targets and the Spectralon panel.
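A minimal sketch of the pseudoreflectance computation follows, assuming the per-target and Spectralon DNs have already been extracted from the same photograph; the numbers are hypothetical.

```python
import numpy as np

def pseudo_reflectance(target_dn, spectralon_dn):
    """Pseudoreflectance factor: ratio of the calibration target's DNs to
    the Spectralon panel's DNs (Section 2.7). Values above 1 can occur when
    the target DN approaches saturation (255) while the panel DN falls
    slightly below it, as discussed in Section 3.5.1."""
    target = np.asarray(target_dn, dtype=float)
    panel = np.asarray(spectralon_dn, dtype=float)
    return target / panel

print(pseudo_reflectance([120, 200, 253], [240, 240, 250]))  # [0.5, 0.833, 1.012]
```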


2.8. Statistical analysis

The obtained models were evaluated by their coefficient of determination (R²), Pearson coefficient (r), and root-mean-square error (RMSE). The values used to develop the calibration models were submitted to Student's t-test (p ≤ 0.05) using the SAS/STAT® software.
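The article does not give the formulas behind these statistics; the sketch below uses their standard definitions to evaluate a fitted calibration model (observed versus predicted reflectance, with hypothetical values).

```python
import numpy as np

def model_metrics(observed, predicted):
    """Coefficient of determination (R^2), Pearson r, and RMSE, as used to
    evaluate the calibration models in Section 2.8."""
    observed = np.asarray(observed, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    residual = observed - predicted
    r2 = 1.0 - np.sum(residual ** 2) / np.sum((observed - observed.mean()) ** 2)
    r = np.corrcoef(observed, predicted)[0, 1]
    rmse = np.sqrt(np.mean(residual ** 2))
    return r2, r, rmse

print(model_metrics([0.10, 0.20, 0.40, 0.60], [0.12, 0.19, 0.41, 0.58]))
```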

3. Results and discussion

There is a lack of studies on individual camera components and on the accuracy of the calibration models developed for them. Modern digital cameras have an internal gamma-correction system that maps the radiation incident on the sensor to output values that emulate the response of the human eye, improving the visual quality of the image (Withagen, Groen, and Schutte 2007). Because of the gamma effect, a pixel that receives twice the radiance does not present twice the DN; likewise, to the human eye, an object that supplies twice the radiance is not perceived as twice as bright. In the obtained images the gamma effect therefore emulates the behaviour of the human eye, and for this reason the linear relationship between DN and radiance is affected (Gehrke and Greiwe 2014), which may result in nonlinear behaviour (Ritchie et al. 2008). Because of the limited precision of the gamma correction in consumer cameras and the need to choose the model that best fits the data (Withagen, Groen, and Schutte 2007), a second-degree polynomial equation was chosen to describe the obtained models. All models and calibration equations obtained were significant according to Student's t-test (p ≤ 0.05).
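For clarity, the relationship described above can be summarized as follows; the explicit power-law form of the gamma correction is an assumption added here (the article describes the effect only qualitatively), while the second-degree polynomial is the model form fitted throughout the article:

\[
\mathrm{DN} \propto L^{1/\gamma} \quad (\gamma > 1), \qquad \rho \;=\; a\,\mathrm{DN}^{2} + b\,\mathrm{DN} + c .
\]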

3.1. Vignetting correction

The mean, standard deviation, and coefficient of variation of the DNs of the red, green, and blue bands of the two cameras, acquired using the optical integrating sphere, are presented in Table 1. As stated by Kelcey and Lucieer (2012), the vignetting effect can cause radiometric interference on the CCD detector; it has also been noted that vignetting correction itself may alter the radiometric response of the image pixels (Laliberte et al. 2011).

Table 1. Mean, standard deviation, and coefficient of variation of the digital numbers (DNs) of the red, green, and blue bands acquired using the optical integrating sphere.

Digital camera  Spectral band  Mean    Standard deviation  Coefficient of variation (%)
Camera 1        Red            209.47  1.47                0.70
Camera 1        Green          161.89  1.60                0.99
Camera 1        Blue           111.48  1.56                1.40
Camera 2        Red            215.07  1.39                0.64
Camera 2        Green          168.23  1.37                0.81
Camera 2        Blue           113.95  1.60                1.40

In the acquired images, very low coefficients of variation (Table 1) were observed for the DNs of the red, green, and blue bands, between 0.64 (minimum) and 1.40 (maximum), which indicates that, although the images may present a vignetting effect, the homogeneity of the DN distribution over the CCD detector was maintained. Thus, considering that the DNs of the R, G, and B bands collected from both cameras using the optical integrating sphere, which is assumed to be an optimally uniform distributor of light, showed very low coefficients of variation and that the homogeneous distribution of DNs over the CCD detector was preserved, we opted not to consider the vignetting effect and not to perform any correction.

3.2. Laboratory calibration

3.2.1. Calibration of images with no optical filter and with the uv-ir-cut-off optical filter

The calibration models obtained for digital images without a filter and with the uv-ir-cut-off optical filter are presented in Table 2 for the R, G, and B bands. For the R band, the use of the optical filter did not raise the Pearson coefficient; for the G and B bands, however, the use of the uv-ir-cut-off filter resulted in higher Pearson coefficients (from 0.91 to 0.95 for the G band and from 0.80 to 0.92 for the B band). It was observed that, except for the light grey tarpaulin, the DNs obtained without an optical filter were higher than those obtained with the uv-ir-cut-off filter, which transmits only Vis radiation. The R, G, and B bands may be sensitive to NIR radiation, which can cause an additive effect on their DNs, resulting in higher values in the targets' spectral features (Bobbe, Mclean, and Zigadlo 1995; Zigadlo et al. 2001). Thus, the Pearson coefficients show that the use of the optical filter that transmits only Vis radiation increases the correlation, especially for the blue band.

Table 2. Correlation between reflectance factors and DNs, and calibration models, for the red, green, and blue bands without and with the uv-ir-cut-off optical filter under laboratory conditions.

Optical filter     Spectral band  Equation                        R²     r      RMSE
No optical filter  Red            y = 3E-05x² − 0.0063x + 0.3784  0.959  0.915  0.1546
No optical filter  Green          y = 1E-05x² − 0.0019x + 0.1054  0.960  0.916  0.1188
No optical filter  Blue           y = 5E-05x² − 0.0064x + 0.2529  0.941  0.800  0.0213
Uv-ir-cut-off      Red            y = 2E-05x² − 0.0030x + 0.1634  0.909  0.914  0.1677
Uv-ir-cut-off      Green          y = 1E-05x² − 0.0012x + 0.0784  0.991  0.956  0.0248
Uv-ir-cut-off      Blue           y = 2E-05x² − 0.0011x + 0.0820  0.951  0.928  0.0486

3.2.2. Calibration of digital images with NIR optical filters

Table 3 presents the correlation between reflectance values and the DNs of the NIR bands, processed by approaches (a) and (b), respectively, using the 630, 680, 720, 760, 850, and 950 nm filters. Considering all the NIR filters used, processing by equalization of the pixel values of the R, G, and B bands (approach (b)) was more efficient than processing that used the values of the R band as the NIR band (approach (a)). DN acquisition for the NIR band by equalization of the R, G, and B pixels showed high correlation with the hyperspectral sensor (r > 0.95). The highest Pearson coefficient was obtained for the 950 nm filter (r = 0.9983).


Table 3. Correlation between reflectance factors and DNs, and calibration models, for image processing approaches (a) and (b) for the NIR 630 nm, NIR 680 nm, NIR 720 nm, NIR 760 nm, NIR 850 nm, and NIR 950 nm bands using infrared optical filters under laboratory conditions.

Approach      Spectral band  Equation                        R²     r      RMSE
(a) Approach  NIR 630 nm     y = 2E-05x² − 0.0059x + 0.3962  0.441  0.660  0.1910
(a) Approach  NIR 680 nm     y = 4E-05x² − 0.0147x + 1.2644  0.696  0.775  0.2623
(a) Approach  NIR 720 nm     y = 6E-05x² − 0.0219x + 2.0051  0.813  0.862  0.1066
(a) Approach  NIR 760 nm     y = 8E-05x² − 0.0309x + 2.9457  0.875  0.898  0.1543
(a) Approach  NIR 850 nm     y = 6E-05x² − 0.0200x + 1.8165  0.925  0.945  0.1578
(a) Approach  NIR 950 nm     y = 9E-05x² − 0.0360x + 3.5085  0.985  0.979  0.2463
(b) Approach  NIR 630 nm     y = 2E-05x² + 0.0009x − 0.0266  0.923  0.955  0.1580
(b) Approach  NIR 680 nm     y = 1E-05x² + 0.0011x − 0.0450  0.966  0.978  0.2930
(b) Approach  NIR 720 nm     y = 2E-05x² − 0.0021x + 0.1023  0.992  0.983  0.1248
(b) Approach  NIR 760 nm     y = 1E-05x² − 0.0001x + 0.0130  0.993  0.989  0.1149
(b) Approach  NIR 850 nm     y = 2E-05x² − 0.0017x + 0.0512  0.996  0.992  0.1879
(b) Approach  NIR 950 nm     y = 7E-06x² + 0.0019x − 0.3177  0.997  0.998  0.7298

However, for NIR image acquisition under field conditions, the 760 nm filter was used because of the camera's shutter opening time, which is adjusted automatically according to the amount of radiance required to sensitize the CCD detector. The 760 nm filter transmits a larger amount of radiation than the 950 nm filter (Figure 3), so its exposure time is shorter. Thus, given the shutter opening time and the shaking of the UAV (caused by its own operation and by windy conditions, even with the positioning system in use), blurry images, which would compromise the radiometric data acquisition, are avoided.

3.3. Field calibration

3.3.1. Calibration of digital images with the uv-ir-cut-off and NIR 760 nm optical filters

The second-degree polynomial behaviour of the calibration models and the correlation between DNs and reflectance under field conditions are shown in Figure 6(a–d) for the R, G, B, and NIR bands, respectively.

Figure 6. Correlation between reflectance factors and DNs, and calibration models, for the red (a), green (b), blue (c), and NIR 760 nm (d) bands under field conditions. Fitted models (reflectance factor versus DN): (a) red, y = 2E-05x² − 0.0042x + 0.2191, R² = 0.9182, r = 0.8883; (b) green, y = 2E-05x² − 0.0053x + 0.3681, R² = 0.9934, r = 0.9375; (c) blue, y = 2E-05x² − 0.0037x + 0.2788, R² = 0.9881, r = 0.9421; (d) NIR 760 nm, y = 1E-06x² + 0.0067x − 0.6451, R² = 0.9389, r = 0.9689.


The obtained data showed strong correlations for the G, B, and NIR bands, whose Pearson coefficients were higher than 0.93. The minimum Pearson coefficient was obtained for the R band (r = 0.88). Comparing laboratory and field data, except for the B band, the Pearson coefficients obtained in the laboratory were higher than those obtained under field conditions, which can be explained by the controlled conditions found in the laboratory. To evaluate the ability of the generated models to account for atmospheric interference, an image in the R, G, and B bands, using the uv-ir-cut-off filter, was acquired at 600 m flight altitude.

3.3.2. Calibration of uv-ir-cut-off optical filter digital images acquired at 600 m

The correlations between the DNs obtained on the 600 m flight and the reflectance collected at ground level are presented in Table 4 for the R, G, and B bands. The DN values obtained at 600 m flight altitude were higher than those obtained at 200 m (data not shown). This probably occurred because of atmospheric interference, in which radiation scattered by aerosols reaches the sensor and raises the recorded values. Nevertheless, high Pearson coefficients (r = 0.99) were obtained for all analysed bands from the correlation between the DNs acquired at 200 m and at 600 m flight altitude. The calibration equations obtained at 600 m presented higher Pearson coefficients than those obtained at 200 m. The change in spatial resolution could explain the acquisition of more precise models: in the 200 m flight the pixels measured 3 cm, compared with 25 cm pixels in the 600 m flight. Thus, the radiance of the material constituting the reference targets shows better homogeneity with 25 cm pixels than with 3 cm pixels. The non-uniformity of tarpaulins as reference targets was also observed by Honkavaara et al. (2009).

Smith and Milton (1999) propose that illumination and atmospheric-composition interferences are eliminated by models generated from the correlation of ground and in-flight reflectance values. Accordingly, the models obtained from the images acquired at 200 and 600 m show strong correlation with the spectral data collected at ground level. Although the atmosphere may have an additive effect on the DNs obtained at higher altitudes, excellent Pearson coefficients were obtained at both 200 m and 600 m flight altitude, and the 200 m images had the advantage of a better visual description of the targets.

Table 4. Correlation between reflectance factors and DNs acquired at 600 m, and calibration models, for the red, green, and blue bands under field conditions.

Optical filter  Spectral band  Equation                        R²     r      RMSE
Uv-ir-cut-off   Red            y = 3E-05x² − 0.0061x + 0.3787  0.910  0.897  0.1547
Uv-ir-cut-off   Green          y = 3E-05x² − 0.0076x + 0.5716  0.991  0.994  0.0826
Uv-ir-cut-off   Blue           y = 2E-05x² − 0.0049x + 0.3836  0.990  0.951  0.0369
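To make the use of these equations concrete, the sketch below applies the red-band model from Table 4 to convert an offset-corrected DN from the 600 m imagery into a reflectance factor; the input DN is hypothetical.

```python
def red_reflectance_600m(dn):
    """Red-band model from Table 4 (600 m flight, uv-ir-cut-off filter):
    reflectance = 3E-05*DN^2 - 0.0061*DN + 0.3787."""
    return 3e-05 * dn ** 2 - 0.0061 * dn + 0.3787

print(red_reflectance_600m(150))  # about 0.14 for DN = 150
```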


3.4. Laboratory and field cross calibration

Even though the calibration models obtained under field conditions showed high Pearson coefficients, this article proposes a cross-calibration model between laboratory data (whose values show no variation) and field data (whose values may vary). The proposed methodology optimizes time, reduces the amount of data to be collected during spectral data acquisition (especially in multitemporal evaluations), and avoids taking expensive equipment into the field, preventing possible damage. First, the correlation between laboratory and field reflectance was established and, second, the correlation between laboratory and field DNs.

3.4.1. Correlation of reflectance and DNs between laboratory and field conditions

Table 5 presents the correlations between laboratory and field reflectance and between laboratory and field DNs for the R, G, B, and NIR bands. The Pearson coefficients obtained for the reflectance of the R, G, B, and NIR bands (r = 0.99) and for the DNs of the R, G, B, and NIR bands (r > 0.90) demonstrate the high correlation between laboratory and field values.

Table 5. Correlation between laboratory and field reflectance and between laboratory and field DNs for the R, G, B, and NIR bands.

Spectral data  Spectral band  Equation              R²     r      RMSE
Reflectance    Red            y = 1.1799x + 0.0106  0.991  0.995  0.2977
Reflectance    Green          y = 1.1639x − 0.0015  0.990  0.995  0.0122
Reflectance    Blue           y = 1.0026x + 0.0116  0.988  0.994  0.0087
Reflectance    NIR            y = 1.4305x − 0.0225  0.987  0.993  0.0177
DN             Red            y = 0.9467x + 7.9477  0.973  0.986  13.7639
DN             Green          y = 0.8657x + 52.500  0.984  0.992  7.8441
DN             Blue           y = 1.0741x + 69.894  0.824  0.908  17.0639
DN             NIR            y = 0.6447x + 32.979  0.852  0.923  19.9993

3.4.2. Cross calibration of field digital images and laboratory reflectance

Table 6 presents the calibration models for the R, G, B, and NIR bands developed from the correlation between the DNs collected in the field (variable conditions) and the reflectance collected in the laboratory (non-variable conditions). The obtained models presented high Pearson coefficients (r > 0.90); the minimum was observed for the R band and the highest for the NIR band (r = 0.95).

Table 6. Calibration models for the R, G, B, and NIR bands developed from the correlation between DNs collected in the field and reflectance collected in the laboratory.

Optical filter  Spectral band  Equation                         R²     r      RMSE
Uv-ir-cut-off   Red            y = 2E-05x² − 0.0036x + 0.1744   0.950  0.904  0.0510
Uv-ir-cut-off   Green          y = 2E-05x² − 0.0041x + 0.2859   0.998  0.946  0.0897
Uv-ir-cut-off   Blue           y = 2E-05x² − 0.0039x + 0.2846   0.976  0.931  0.1023
NIR 760 nm      NIR 760 nm     y = −2E-05x² + 0.0112x − 0.9068  0.938  0.954  2.4059


Under these circumstances, the calibration models for the R, G, B, and NIR bands point out the potential of cross calibration, in such a way that the reference targets' reflectance values can be acquired in the laboratory while the DNs are collected under field conditions from digital images obtained by cameras attached to UAVs.

3.5. Pseudoreflectance laboratory and field calibrations

The reflectance factor is obtained from the ratio between the radiance of the reference target and the radiance of a standard surface with known spectral behaviour. Since the DNs result from the incident radiance distributed over the camera's detector, it is possible to obtain pseudoreflectance factors solely from the DNs of the calibration targets and of the standard surface (the Spectralon panel in this study).

3.5.1. Laboratory calibration

Table 7 presents the correlation between the pseudoreflectance factors obtained from the DNs and the reflectance values obtained by the spectroradiometer under laboratory conditions, for the R, G, B, NIR 630 nm, NIR 680 nm, NIR 720 nm, NIR 760 nm, NIR 850 nm, and NIR 950 nm bands. The ratio between the calibration targets' DNs and the standard surface's DNs presented strong correlation with the reflectance values obtained by the spectroradiometer: the Pearson coefficients were higher than 0.91 (r > 0.91) for all Vis and NIR bands using the different optical filters. However, for the R band, two reference targets (the red and yellow tarpaulins) had overestimated values (above 100% reflectance) after the ratio of their DNs to the Spectralon panel's DNs. This probably occurred because their DNs were often close to saturation (DN = 255), while the Spectralon panel sometimes showed values slightly below 255.

Table 7. Correlation between pseudoreflectance factors obtained from DNs and reflectance values obtained by the spectroradiometer under laboratory conditions, for the R, G, B, NIR 630 nm, NIR 680 nm, NIR 720 nm, NIR 760 nm, NIR 850 nm, and NIR 950 nm bands.

Spectral band  Equation                         R²     r      RMSE
Red            y = 1.0380x² − 0.7577x + 0.1634  0.909  0.914  0.0612
Green          y = 0.6831x² − 0.2997x + 0.0784  0.999  0.956  0.0037
Blue           y = 0.8789x² − 0.2595x + 0.0820  0.951  0.928  0.0183
NIR 630 nm     y = 0.3985x² + 0.1309x − 0.0266  0.923  0.955  0.0886
NIR 680 nm     y = 0.3606x² + 0.2058x − 0.0450  0.966  0.979  0.0743
NIR 720 nm     y = 0.8647x² − 0.4866x + 0.1023  0.992  0.983  0.1479
NIR 760 nm     y = 0.6936x² − 0.2349x + 0.0130  0.993  0.989  0.1170
NIR 850 nm     y = 0.8326x² − 0.4113x + 0.0563  0.996  0.992  0.0811
NIR 950 nm     y = 0.3634x² + 0.4439x − 0.3177  0.997  0.998  0.0565

3.5.2. Field calibration

Table 8 presents the correlation between the pseudoreflectance factors obtained from the DNs and the reflectance values obtained by the spectroradiometer under field conditions, for the R, G, B, and NIR bands. The Pearson coefficients obtained point to a strong correlation between the pseudoreflectance values, obtained from the DNs, and the reflectance values obtained by the spectroradiometer.


Table 8. Correlation between pseudoreflectance factors obtained from DNs and reflectance values obtained by the spectroradiometer under field conditions, for the R, G, B, and NIR 760 nm bands.

Optical filter  Spectral band  Equation                         R²     r      RMSE
Uv-ir-cut-off   Red            y = 1.3593x² − 1.0338x + 0.2191  0.918  0.888  0.0691
Uv-ir-cut-off   Green          y = 1.4133x² − 1.3281x + 0.3681  0.993  0.937  0.0117
Uv-ir-cut-off   Blue           y = 1.0132x² − 0.9213x + 0.2788  0.988  0.942  0.0091
NIR 760 nm      NIR 760 nm     y = 0.0489x² + 1.2417x − 0.6451  0.938  0.968  1.0182

The minimum Pearson coefficient was observed for the R band (r = 0.88); for the other bands, the Pearson coefficients were higher than 0.93, with the highest value observed for the NIR band (r = 0.96). As in the laboratory calibration, overestimated values were observed after the conversion to the pseudoreflectance factor, probably for the same reason as in the laboratory. Although the scientific reliability of this model is lower than that of the others, the acquisition of pseudoreflectance factors solely from DNs may be viable for professional activities, since hyperspectral sensors have a high cost compared with digital cameras.

4. Conclusions

Digital camera calibration allows large-scale spectral data acquisition by UAVs in such a way that the obtained results can be trusted. The use of two identical cameras makes possible the simultaneous acquisition of red, green, blue, and NIR images. The internal errors of the cameras used showed a homogeneous distribution over the images and did not affect the quality of the calibration models. Furthermore, the ISO 800 setting provided very good image quality.

The internal gamma effect was observed, resulting in a nonlinear relationship between radiance and DNs and leading to second-degree polynomial calibration models. When the cameras were calibrated in the laboratory, the use of the optical filter that transmits only Vis radiation improved the quality of the obtained values. Regarding the infrared radiation, the images processed by pixel equalization (conversion into grey scale) were more efficient, and the 950 nm filter showed the highest Pearson coefficient. However, for use in cameras attached to UAVs, the 760 nm filter, which also presented very good efficiency, proved more viable, since in aerial imagery the limited stability of the UAV can lead to the acquisition of blurry images.

Under field conditions, as in the laboratory, the images acquired in the R, G, B, and NIR bands presented high correlation with the hyperspectral sensor. The calibration models used incorporate illumination and atmospheric conditions and can be applied to photographs taken at different altitudes and under different illumination conditions.

The proposed cross-calibration model showed high correlation between the DNs collected in the field (variable conditions) and the reflectance values acquired in the laboratory (controlled conditions). Thus, with the proposed model, the calibration time can be optimized, since the amount of data to be acquired is limited to the DNs of the calibration targets in the field.


The acquisition of pseudoreflectance factors also presented high correlation with the hyperspectral sensor. This model makes it possible to acquire spectral data without a hyperspectral sensor, which, despite its lower scientific precision, might be useful in daily professional activities. The selection of reference targets for calibration is important, since their DNs may be saturated (values close to the maximum) while their reflectance values differ, resulting in lower correlation in the obtained models. The techniques and models proposed in this article enable reliable spectral information to be obtained with digital cameras, providing precision and scientific rigour to aerial imagery acquired from UAVs, and highlight the need for camera calibration on every single mission flight.

Acknowledgements

This work was supported by the Coordination for the Improvement of Higher Education Personnel (CAPES) and the National Council for Scientific and Technological Development (CNPq). The authors would like to thank the personnel of the Remote Sensing and Geoprocessing Laboratory of the Agronomy Department of Maringá State University.

Disclosure statement

No potential conflict of interest was reported by the authors.

Funding

This work was supported by the Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (Capes), Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq), and Financiadora de Estudos e Projetos (FINEP) [grant number 01.09.1567.00].

ORCID

Luís Guilherme Teixeira Crusiol http://orcid.org/0000-0002-2387-964X

References Aicardi, I., M. Garbarino, A. Lingua, E. Lingua, R. Marzano, and M. Piras. 2016. “Monitorin Post-Fire Forest Recovery Using Multi-Temporal Digital Surface Models Generated from Different Platforms.” EARSeL eProceedings 15 (1): 1–8. doi:10.12760/01-20116-1-01. ASD Inc. 2008. “Viewspec Pro User’s Guide: Viewspec ProTM User Manual.” ASD Document 600555 Rev. A. Barsi, J. A., K. Lee, G. Kvaran, B. L. Markham, and J. A. Pedelty. 2014. “The Spectral Response of the Landsat-8 Operational Land Imager.” Remote Sensing 6: 10232–10251. doi:10.3390/ rs61010232. Bemis, S. P., S. Micklethwaite, D. Turner, M. R. James, S. Akciz, S. T. Thiele, and H. A. Bangash. 2014. “Ground-Based and UAV-Based Photogrammetry: A Multi-Scale, High-Resolution Mapping Tool for Structural Geology and Paleoseismology.” Journal of Structural Geology 69: 163–178. doi:10.1016/j.jsg.2014.10.007.


Berni, J. A. J., P. J. Zarco-Tejada, L. Suarez, and E. Fereres. 2009. “Thermal and Narrowband multispectral Remote Sensing for Vegetation Monitoring from an Unmanned Aerial Vehicle.” IEEE Transactions on geoscience and Remote Sensing 47: 722–738. doi:10.1109/ TGRS.2008.2010457. Bobbe, T., J. Mclean, and J. P. Zigadlo. 1995. “An Evaluation of Natural Color and Color Infrared Digital Cameras as a Remote Sensing Tool for a Natural Resources Management.” Airborne Reconnaissance XIX: 151–157. doi:10.1117/12.218601. Brook, A., and E. Ben Dor. 2011. “Supervised Vicarious Calibration (SVC) of hyperspectral RemoteSensing Data.” Remote Sensing of Environment 115: 1543–1555. doi:10.1016/j.rse.2011.02.013. Clark, T. A., and J. G. Fryer. 1998. “The Development of Camera Calibration Methods and Models.” Photogrammetric Record 16 (51): 51–66. doi:10.1111/0031-868X.00113. Del Pozo, S., P. Rodríguez-Gonzálvez, D. Hernández-López, and B. Felipe-García. 2014. “Vicarious radiometric Calibration of a Multispectral Camera on Board an Unmanned Aerial System.” Remote Sensing 6: 1918–1937. doi:10.3390/rs6031918. Dinguirard, M., and P. N. Slater. 1999. “Calibration of Space-Multispectral imaging Sensors.” Remote Sensing of Environment 68: 194–205. doi:10.1016/S0034-4257(98)00111-4. Fraser, C. S. 1997. “Digital Camera Self-Calibration.” ISPRS Journal of Photogrammetry & Remote Sensing 52: 149–159. doi:10.1016/S0924-2716(97)00005-1. Fraser, C. S. 1998. “Automated Processes in Digital Photogrammetric Calibration, Orientation, and Triangulation.” Digital Signal Processing 8: 277–283. doi:10.1006/dspr.1998.0321. Gehrke, R., and A. Greiwe. 2014. “RGBI Images with UAV Off-The-Shelf Compact Cameras: An Investigations of Linear Sensor Characteristics.” EARSeL eProceedings Special Issue, 34th EARSeL Symposium, Warsaw, June 16–20, 53–58. Getzin, S., R. S. Nuske, and K. Wiegand. 2014. “Using Unmanned Aerial Vehicles (UAV) to Quantify Spatial Gap Patterns in Forests.” Remote Sensing 6: 6988–7004. doi:10.3390/rs6086988. Gonzalez-Dugo, V., P. Zarco-Tejada, E. Nicolás, P. A. Nortes, J. J. Alarcón, D. S. Intrigliolo, and E. Fereres. 2013. “Using High Resolution UAV Thermal Imagery to Assess the Variability in the Water Status of Five Fruit Tree Species within a Commercial Orchard.” Precision Agriculture 14: 660–678. doi:10.1007/s11119-013-9322-9. Healey, G. E. 1994. “Radiometric CCD Camera Calibration and Noise Estimation.” IEEE Transaction on Pattern Analysis and Machine Intelligence 16 (3): 267–276. doi:10.1109/34.276126. Honkavaara, E., R. Arbiol, L. Markelin, L. Martinez, M. Cramer, S. Bovet, L. Chandelier, et al. 2009. “Digital Airborne Photogrammetry – A New Tool for Quantitative Remote Sensing? – A State-OfThe-Art Review on Radiometric Aspects of Digital Photogrammetric Images.” Remote Sensing 1: 577–605. doi:10.3390/rs1030577. Honkavaara, E., H. Saari, J. Kaivosoja, I. Pölönen, T. Hakala, P. Litkey, J. Mäkynen, and L. Pesonen. 2013. “Processing and Assessment of Spectrometric, Stereoscopic Imagery Collected Using a Lightweight UAV Spectral Camera for Precision Agriculture.” Remote Sensing 5: 5006–5039. doi:10.3390/rs5105006. Hunt, E. R. Jr., W. D. Hively, S. J. Fujikawa, D. S. Linden, C. S. T. Daughtry, and G. W. Mccarty. 2010. “Acquisition of NIR-Green-Blue Digital Photographs from Unmanned Aircraft for Crop Monitoring.” Remote Sensing 2: 290–305. doi:10.3390/rs2010290. Immerzeel, W. W., P. D. A. Kraaijenbrink, J. M. Shea, A. B. Shrestha, F. Pellicciotti, M. F. P. Bierkens, and S. M. D. Jong. 2014. 
“High-Resolution Monitoring of Himalayan Glacier Dynamics Using Unmanned Aerial Vehicles.” Remote Sensing of Environment 150: 93–103. doi:10.1016/j. rse.2014.04.025. Johnson, D. M. 2014. “An Assessment of Pre- and Within- Season Remotely Sensed Variables for Forecasting Corn and Soybean Yields in the United States.” Remote Sensing of Environment 141: 116–128. doi:10.1016/j.rse.2013.10.027. Kelcey, J., and A. Lucieer. 2012. “Sensor Correction of a 6-BAND Multispectral Imaging Sensor for UAV Remote Sensing.” Remote Sensing 4: 1462–1493. doi:10.3390/rs4051462. Kuusk, A., and M. Paas. 2007. “Radiometric Correction of Hemispherical Images.” ISPRS Journal of Photogrammetry and Remote Sensing 61: 405–413. doi:10.1016/j.isprsjprs.2006.10.005.


Laliberte, A. S., M. A. Goforth, C. M. Steele, and A. Rango. 2011. “Multispectral Remote Sensing from Unmanned Aircraft: Image Processing Workflows and Applications for Rangeland Environments.” Remote Sensing 3: 2529–2551. doi:10.3390/rs3112529. Lebourgeois, V., A. Bégué, S. Labbé, B. Mallavan, L. Prévot, and B. Roux. 2008. “Can Commercial Digital Cameras Be Used as Multispectral Sensors? A Crop Monitoring Test.” Sensors 8: 7300– 7322. doi:10.3390/s8117300. Lelong, C. C. D., P. Burger, G. Jubelin, B. Roux, S. Labbé, and F. Baret. 2008. “Assessment of Unmanned Aerial Vehicles Imagery for Quantitative Monitoring of Wheat Crop in Small Plots.” Sensors 8: 3557–3585. doi:10.3390/s8053557. Lofton, J., B. S. Tubana, Y. Knke, J. Teboh, H. Viator, and M. Dalen. 2012. “Estimating Sugarcane Yield Potential Using an In-Season Determination of Normalized Difference Vegetative Index.” Sensors 12: 7529–7547. doi:10.3390/s120607529. Mangold, K., J. A. Shaw, and M. Vollmer. 2013. “The Physics of Near-Infrared Photography.” European Journal of physics 34: S51–S71. doi:10.1088/0143-0807/34/6/S51. Merino, L., F. Caballero, J. R. Martínez-De-Dios, I. Maza, and A. Ollero. 2012. “An unmanned Aircraft System for Automatic Forest Fire Monitoring and Measurement.” Journal of Intelligent & Robotic Systems 65: 533–548. doi:10.1007/s10846-011-9560-x. Mishra, N., M. O. Haque, L. Leigh, D. Aaron, D. Helder, and B. Markham. 2014. “Radiometric Cross Calibration of Landsat 8 Operational Land Imager (OLI) and Landsat 7 Enhanced Thematic Mapper Plus (ETM+).” Remote Sensing 6: 12619–12638. doi:10.3390/rs61212619. Nanni, M. R., and J. A. M. Demattê. 2006. “Spectral Reflectance Methodology in Comparison to Traditional Soil Analysis.” Soil Science Society of America Journal 70: 393–407. doi:10.2136/ sssaj2003.0285. Nebiker, S., A. Annen, M. Scherrer, and D. Oesch. 2008. “A Light-Weight Multispectral Sensor for Micro UAV – Opportunities for Very High Resolution Airborne Remote Sensing.” The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences 37 (part B1): 1193–1200. http://www.isprs.org/proceedings/XXXVII/congress/1_pdf/ 204.pdf. Oleire-Oltmanns, S. D., I. Marzolff, K. D. Peter, and J. B. Ries. 2012. “Unmanned Aerial Vehicle (UAV) for Monitoring Soil Erosion.” Remote Sensing 4: 3390–3416. doi:10.3390/rs4113390. Ponzoni, F. J., J. Zullo Junior, and R. A. C. Lamparelli. 2008. “In-Flight Absolute Calibration of the CBERS-2 CCD Sensor Data.” Annals of the Brazilian Academy of Sciences 80 (2): 373–380. doi:10.1590/S0001-37652008000200015. Quemada, M., J. L. Gabriel, and P. Zarco-Tejada. 2014. “Airborne Hyperspectral Images and GroundLevel Optical Sensors as assessment Tools for Maize Nitrogen Fertilization.” Remote Sensing 6: 2940–2962. doi:10.3390/rs6042940. Ritchie, G. L., D. G. Sullivan, C. D. Perry, J. E. Hook, and C. W. Bednarz. 2008. “Preparation of a LowCost Digital Camera System for Remote Sensing.” Applied Engineering in Agriculture 24 (6): 885– 894. doi:10.13031/2013.25359. Sanseemoung, G., P. Soni, H. P. W. Jayasuriya, and V. M. Salokhe. 2012. “Application of Low Altitude Remote Sensing (LARS) Platform for Monitoring Crop Growth and Weed Infestation in a Soybean Plantation.” Precision Agriculture 13: 611–627. doi:10.1007/s11119-012-9271-8. Schaepman-Strub, G., M. E. Schaepman, T. H. Painter, S. Dangel, and J. V. Martonchik. 2006. “Reflectance Quantities in Optical Remote Sensing – Definitions and Case Studies.” Remote Sensing of Environment 103: 27–42. doi:10.1016/j.rse.2006.03.002. 
Smith, G. M., and E. J. Milton. 1999. “The Use of the Empirical Line Method to Calibrate Remotely Sensed Data to Reflectance.” International Journal of Remote Sensing 20 (13): 2653–2662. doi:10.1080/014311699211994. Vasuki, Y., E. Holden, P. Kovesi, and S. Micklethwaite. 2014. “Semi-Automatic Mapping of Geological Structures Using UAV-Based Photogrammetric Data: An Image Analysis Approach.” Computers & Geosciences 69: 22–32. doi:10.1016/j.cageo.2014.04.012. Ventura, D., M. Bruno, G. J. Lasinio, A. Belluscio, and G. Ardizzone. 2016. “A Low-Cost Drone Based Application for Identifying and Mapping of Coastal Fish Nursery Grounds.” Estuarine, Coastal and Shelf Science 171: 85–98. doi:10.1016/j.ecss.2016.01.030.


Withagen, P. J., F. C. A. Groen, and K. Schutte. 2007. “CCD Color Camera Characterization for Image Measurements.” IEEE Transactions on Instrumentation and Measurement 56 (1): 199–203. doi:10.1109/TIM.2006.887667. Xiang, H., and L. Tian. 2011. “Development of a Low-Cost Agricultural Remote Sensing System Based on an Autonomous Unmanned Aerial Vehicle (UAV).” Biosystems Engineering 108: 174– 190. doi:10.1016/j.biosystemseng.2010.11.010. Zarco-Tejada, P. J., V. Gonzáles-Dugo, and J. A. J. Berni. 2012. “Fluorescence, Temperature and Narrow-Band Índices Acquired from a UAV Platform for Water Stress Detection Using a MicroHyperespectral Imager and a Thermal Camera.” Remote Sensing of Environment 117: 322–337. doi:10.1016/j.rse.2011.10.007. Zhang, J., J. Hu, J. Lian, Z. Fan, X. Ouyang, and W. Ye. 2016. “Seeing the Forest from Drones: Testing the Potential of Lightweight Drones as a Tool for Long-Term Forest Monitoring.” Biological Conservation 198: 60–69. doi:10.1016/j.biocon.2016.03.027. Zhang, Z. 2000. “A Flexible New Technique for Camera Calibration.” IEEE Transactions on Pattern Analysis and Machine Intelligence 22 (11): 1330–1334. doi:10.1109/34.888718. Zigadlo, J. P., C. L. Holden, M. E. Schrader, and R. M. Vogel. 2001. “Eletronic Color Infrared Camera.” US Patent 6292212 B1.