Reliable and Accurate Wheel Size Measurement

Sensors Article

Reliable and Accurate Wheel Size Measurement under Highly Reflective Conditions

Xiao Pan, Zhen Liu and Guangjun Zhang *

Ministry of Education Key Laboratory of Precision Opto-Mechatronics Technology, Beihang University, Beijing 100083, China; [email protected] (X.P.); [email protected] (Z.L.)
* Correspondence: [email protected]; Tel.: +86-010-8233-8768

Received: 4 November 2018; Accepted: 30 November 2018; Published: 6 December 2018

 

Abstract: The structured-light vision sensor, as an important tool for obtaining 3D data, is widely used in fields of online high-precision measurement. However, the captured stripe images can show the high-dynamic characteristics of low signal-to-noise ratio and uneven brightness due to the complexity of the onsite environment. These conditions seriously affect measurement reliability and accuracy. In this study, a wheel size measurement framework based on a structured-light vision sensor, which has high precision and reliability and is suitable for highly reflective conditions, is proposed. Initially, a quality evaluation criterion for stripe images is established, and the entire stripe is divided into high- and low-quality segments. In addition, multi-scale Retinex theory is adopted to enhance stripe brightness, which improves the reliability of subsequent stripe center extraction. Experiments verify that this approach can remarkably improve measurement reliability and accuracy and has important practical value.

Keywords: high dynamic; reflective conditions; stripe image enhancement; wheel size measurement

1. Introduction

High-speed and high-precision vision measurement in outdoor environments has become an increasingly important tool for industrial scenes, e.g., vision perception [1], defect detection [2,3], and size measurement [4,5]. Among them, the online measuring instrument represented by a structured-light vision sensor is widely used, e.g., in wheel size measurement [6,7], reconstruction of large forging parts [8], and monitoring of pantograph running status [9,10], where online wheel size measurement has remarkable practical meaning [11]. However, this measurement has three main characteristics in the measuring process: (1) High speed: because the moving speed of the measured wheel is considerably fast, real-time calculation is required of the equipment. (2) High reflection: long-term friction results in smooth wheel tread surfaces and specular reflection under active illumination, thereby causing overexposure (generally, the camera is set to working modes of a large aperture and short exposure time). (3) Irregular surface: the topography of the measured wheel is irregular, and light is reflected away from the camera, generating low-brightness images. Given the restriction of the sensor observation frustum, ensuring a good imaging status over the entire surface is difficult. The imaging characteristics in highly reflective conditions are mainly reflected in dramatic brightness changes, that is, a high dynamic range. The dynamic range (DR) of an image represents the ratio of its maximum and minimum brightness. A high DR indicates a large difference between the maximum and minimum brightness of the image. In high-dynamic images, the stripe brightness and width distribution are uneven.
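For illustration (this snippet is ours, not part of the paper), the DR of a grayscale image can be computed as the ratio of its maximum to minimum nonzero brightness:

```python
import numpy as np

def dynamic_range(img, eps=1e-6):
    """Ratio of maximum to minimum nonzero brightness (illustrative DR definition)."""
    bright = img[img > 0].astype(np.float64)
    return bright.max() / max(bright.min(), eps)

# Synthetic stripe image: a dim column (gray 10) next to an overexposed column (gray 250).
img = np.zeros((8, 8), dtype=np.uint8)
img[:, 1] = 10
img[:, 5] = 250
print(dynamic_range(img))  # 25.0
```

A ratio of 25 between the brightest and dimmest stripe segments is exactly the situation a single exposure setting cannot cover well.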
Figure 1 presents the camera imaging light path, where the measured wheel has high speed, high reflection, and an irregular surface; the camera is set to working modes of a large aperture, short exposure, and high sensitivity; and the captured stripe images have the characteristics of overexposure and uneven brightness. On the one hand, the stripe cannot be extracted in low-brightness segments. On the other hand, overexposed segments are not extracted correctly. The two phenomena in highly reflective conditions restrict the measurement reliability and accuracy of the structured-light vision system, especially in outdoor environments. In addition, current measurement systems of structured-light vision in outdoor illumination environments are vulnerable to external factors, such as an inaccurate trigger, vibration, surface material, light conditions, and pollutants. These factors result in image unevenness, low signal-to-noise ratio (SNR), leakage, and false-positive cases, which seriously affect system reliability. In this case, we commonly decrease the exposure time to reduce the brightness of overexposed segments and increase the exposure time to enhance weak-lighting segments.

Sensors 2018, 18, 4296; doi:10.3390/s18124296; www.mdpi.com/journal/sensors
Nevertheless, in practical applications, some contradictions exist between these aspects. On the one hand, the whole image is dimmed by decreasing the exposure time. Although the original bright segments are darkened, the dark segments become darker and may not even be detected. On the other hand, if only the exposure time is increased, thereby forming smears, then the original bright segments become brighter or even overexposed. Furthermore, some unused stripes may appear. The uneven brightness distribution, various widths, and low SNR of stripe images seriously affect the reliability and performance of the measurement system in highly reflective conditions. For example, in an online wheel size measurement system, when the stripe image quality is low, a high leakage and false-positive rate can result, thereby seriously affecting the normal usage and reliability of the equipment. This case not only brings additional work for train inspectors, but also generates considerable security risks on normal train operation.

Figure 1. Camera imaging light path of the online wheel size measurement system.

At present, experts have been investigating wheel size measurement sensors [12–18], and numerous types of products have been invented and applied, such as MERMEC [19], IEM [20], and KLD [21], which are suitable for low-speed trains. A stable and low-speed environment is selected to avoid measurement instability caused by outdoor high reflection. When the trains reach a high speed, the measurement accuracy seriously decreases. MERMEC adopts a specially designed high-resolution camera, and its measurement accuracy can reach 0.2 mm when the pass speed does not exceed 20 km/h. To date, existing measurement systems are mainly installed at the fence or garage entrance, under ideal environmental conditions. However, systems [17] installed in the main line are seriously affected by outdoor lighting and other factors. No existing measurement system of structured-light vision is robust and reliable and meets the requirement of online measurement in highly reflective conditions.

Therefore, the quality of stripe images in highly reflective conditions must be improved to increase system reliability. To our knowledge, no solution has been proposed to improve the measurement system of structured-light vision in highly reflective conditions. The conventional methods often focus on image brightness enhancement. Recently, various image enhancement methods have been proposed to improve low-brightness images. These methods consist of spatial and frequency domains. Frequency methods mainly address the transform coefficients in a certain frequency domain of the image, and then the enhanced image is obtained by inverse transformation. In Reference [22], a multi-resolution overlapping sub-block equalization method based on wavelets is proposed. The image is decomposed and equalized and then reconstructed through an inverse wavelet transform, but the method is unsuitable


for images whose histograms have concentrated distributions. The gray-level transformation method adjusts the DR or contrast of the image, which is an important means of enhancement. This method transforms the grayscale in the original image into a new value, which can increase the gray DR. The categories of transformation function forms include linear, piecewise linear, and nonlinear. For the nonlinear gamma correction method [23,24], the selection of a nonlinear correction function is particularly important, its local processing capacity is poor, and the effect on structured-light image enhancement is not evident. In addition, histogram equalization [25] can increase the uniformity of the gray distribution and enhance image contrast. However, this method does not consider the frequency and details of the image, which is likely to cause overenhancement. The gradient enhancement method [26] converts the image to the logarithm domain with gradient processing, thereby reducing the gradient value for compressing the image DR and increasing large gradients to enhance edges in the image. Nevertheless, this method is unsuitable for real-time application. The Retinex [27] enhancement method utilizes the Gaussian smoothing function to estimate the luminance component of the original image. This method is suitable for handling images with low local gray values, thereby effectively enhancing the details of the dark part and maintaining the original image brightness to a certain extent while compressing the image contrast. In recent years, numerous forms based on Retinex theory, e.g., single scale [28], multi-scale [29], and color image enhancement [30], have achieved an ideal enhancement effect. The accuracy of stripe center extraction determines the measurement precision, whereas the stripe quality affects the accuracy of stripe center extraction. Therefore, this paper presents a method for wheel size measurement in highly reflective conditions.
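As a minimal sketch of the Retinex idea reviewed above (our illustration, not the authors' implementation): single-scale Retinex estimates the illumination with a Gaussian smoothing of the image and subtracts it in the log domain; multi-scale Retinex (MSR) averages several scales. The small sigmas here suit the toy image; full-resolution images typically use much larger scales.

```python
import numpy as np

def gaussian_kernel(sigma):
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2.0 * sigma**2))
    return k / k.sum()

def blur(img, sigma):
    """Separable Gaussian smoothing with edge padding (the illumination estimate)."""
    k = gaussian_kernel(sigma)
    r = len(k) // 2
    padded = np.pad(img, r, mode='edge')
    tmp = np.apply_along_axis(np.convolve, 1, padded, k, mode='valid')
    return np.apply_along_axis(np.convolve, 0, tmp, k, mode='valid')

def single_scale_retinex(img, sigma):
    img = img.astype(np.float64) + 1.0  # avoid log(0)
    return np.log(img) - np.log(blur(img, sigma))

def multi_scale_retinex(img, sigmas=(2, 4, 8)):
    """Equal-weight average of single-scale Retinex outputs."""
    return sum(single_scale_retinex(img, s) for s in sigmas) / len(sigmas)

# Faint vertical stripe on a slowly varying illumination gradient.
y = np.linspace(0.0, 1.0, 32)
img = np.outer(50 + 100 * y, np.ones(32))
img[:, 15] += 30
out = multi_scale_retinex(img)
```

Because the smoothed illumination estimate tracks the slow gradient but not the narrow stripe, the stripe column stands out in `out` with a roughly uniform response along its length, which is the property exploited later for low-brightness stripe segments.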
First, the stripe imaging process in highly reflective conditions is analyzed. Second, a stripe quality evaluation is proposed to accurately locate the stripe segments to be enhanced. The image brightness is enhanced based on multi-scale Retinex (MSR). Physical experiments prove that, in the outdoor complex environment of the railway, the proposed method, compared with existing methods, can improve the accuracy of the online wheel size measurement system and effectively reduce leakage and false-positive rates.

The remainder of this paper is organized as follows. Section 2 briefly reviews the wheel profile reconstruction and overviews the scheme of high-dynamic image processing. Section 3 describes the stripe imaging in highly reflective conditions and presents an effective enhancement approach for a high-dynamic stripe image. Section 4 discusses the experiments and evaluations. Section 5 draws the conclusion.

2. System Overview

The online wheel size dynamic measurement system is composed of on-rail equipment, control unit, data processing, and wheel size calculation modules. The on-rail equipment includes two structured-light vision sensors, which are installed under a measured wheel. The control unit receives the wheel arrival signal from a magnetic trigger and generates synchronous trigger signals to the cameras and lasers. Data processing involves image acquisition, image processing, stripe center extraction, and 3D reconstruction, which facilitate wheel size calculation. Wheel cross profile acquisition is a critical process in measurement, and its layout is shown in Figure 2. Two structured-light vision sensors are installed inside and outside the measured wheel, and their laser planes are adjusted to be coplanar in space and pass through the wheel center. When the wheel passes the trigger position, the inner and outer sensors observe the portion of the wheel cross profile.
Among them, the inner camera is mainly used to detect the inner rim and part of the flange, and the outer camera is utilized to detect the tread and part of the flange. Let Ki, Ko be the inner and outer camera intrinsic parameters, Fi, Fo be the laser plane functions in the camera coordinate frame (CCF), and Roi, Toi be the transformation matrices from the outer camera to the inner camera. The sensors are calibrated offline [31,32]. Let Ci and Co be the wheel profiles from the inner and outer sensors. After transforming Co to the inner CCF, C′o = Roi · Co + Toi, the complete wheel profile is Cc = Ci ∪ C′o, which is used to calculate the parameters in accordance with the wheel size definition.
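The profile merging step Cc = Ci ∪ (Roi · Co + Toi) can be illustrated as follows. The function name and the 2D toy data are ours; in the actual system, Roi is a 3×3 rotation matrix and Toi a 3×1 translation vector obtained from offline calibration.

```python
import numpy as np

def merge_profiles(C_i, C_o, R_oi, T_oi):
    """Transform the outer-sensor profile into the inner camera frame and
    concatenate it with the inner profile: Cc = Ci ∪ (Roi·Co + Toi)."""
    C_o_in_inner = (R_oi @ C_o.T).T + T_oi
    return np.vstack([C_i, C_o_in_inner])

# Toy example: 2D points in the common laser plane, a 90-degree rotation plus translation.
R = np.array([[0.0, -1.0], [1.0, 0.0]])
T = np.array([10.0, 0.0])
C_i = np.array([[0.0, 0.0], [1.0, 0.0]])
C_o = np.array([[0.0, 1.0]])
merged = merge_profiles(C_i, C_o, R, T)  # last row is [9., 0.]
```

After this transformation, both partial profiles live in the inner CCF and can be treated as one point set for fitting the wheel size parameters.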


[Figure 2: measured wheel observed by inner and outer laser/camera pairs; the outer camera is related to the inner camera by Roi, Toi.]

Figure 2. Layout of wheel profile acquisition sensors.

Specifically, we focus on solving the stripe image enhancement in a wheel size measurement system, the diagram of which is shown in Figure 3. Figure 3 illustrates that stripe skeleton extraction is implemented on the raw images captured by each camera. Based on the stripe quality evaluation criteria, the stripe segments to be enhanced are established along the stripe skeleton trajectory. Based on MSR theory, the stripes with low-brightness segments are enhanced for center extraction.

[Figure 3: pipeline of raw images, stripe skeleton extraction, stripe quality evaluation, enhanced segment estimation, enhanced stripe images, and stripe center extraction.]

Figure 3. Diagram of stripe image enhancement in highly reflective conditions.

The reliable and accurate wheel size measurement framework can be concluded as follows:

Step 1. Stripe images are synchronously captured from sensors installed around the measured wheel.
Step 2. Image dilation and low-threshold segmentation are adopted to obtain the stripe area in which valid stripes are located, and the central skeleton points are extracted.
Step 3. Stripe image quality evaluation criteria are established in accordance with the center extraction principle.
Step 4. Based on the stripe quality criteria, the stripe segments with low brightness are obtained along the stripe skeleton trajectory.
Step 5. MSR brightness enhancement is implemented on the low-quality stripe segments generated in Step 4. Moreover, the valid enhanced stripe is segmented accurately based on the reflectivity.
Step 6. The center points of the enhanced stripe are extracted, the wheel cross profile is reconstructed, and, finally, the wheel size parameters are calculated.


3. Algorithm Implementation

In this section, the characteristics of wheel imaging and the cause of low brightness are analyzed. High-dynamic stripe enhancement includes stripe quality evaluation, locating enhanced segments, stripe enhancement, and enhanced stripe segmentation.

3.1. Stripe Imaging Model and Analysis

3.1.1. Stripe Imaging Description

The structured-light vision sensor projects a laser onto the measured object surface and obtains the image of the reflected laser line. The camera-sensitized illumination model can be expressed as I = Ie + Id + Is, where Ie is the ambient component, Id denotes the diffuse component, and Is represents the specular component. Ie is independent of the viewing angle and related to the light source and object reflectivity, which is negligible in structured-light measurement due to an optical filter. Id is determined by the view angle and object surface reflectivity. When the view angle is small, the diffuse reflection component received is considerable. Meanwhile, the view angle, object reflectivity, and specular direction determine Is. When the angle between the view vector and specular direction decreases, the specular component increases and the stripe is prone to overexposure. Therefore, in the structured-light imaging process, we attempt to avoid the effect of the specular component while increasing the diffuse component to improve the image SNR and uniform brightness and ensure measurement accuracy.

Figure 4a,b shows the stripe images captured by the outer and inner cameras, respectively. After the line laser is irradiated to the wheel surface, most of the parts, except those absorbed by the wheel, are reflected and composed of Id and Is. Most of the laser energy is reflected away due to the high reflectivity of the wheel surface, large changes in topography, and the imaging angle of the camera, resulting in partial darkness of the stripe image, as shown in the Figure 4a r1 and r2 segments. Similarly, for the large view angle of the inner camera and the reflectivity of the wheel inner side, the stripe brightnesses of the Figure 4b r1 and r2 segments are dark in the inner rim.

[Figure 4 panels: (a) Camera-1 with segments r1–r4; (b) Camera-2 with segments r1, r2; arrows indicate diffuse reflection, specular reflection, and view direction.]

Figure 4. Laser brightness perception of stripe images captured by outer and inner cameras. (a) Laser brightness perception of the image from the outer camera; (b) laser brightness perception of the image from the inner camera.

Figure gray value distribution in the stripe and and normal directions. Ideally, the gray Figure55displays displaysthe the gray value distribution in the stripe normal directions. Ideally, the distribution is even in the stripe direction and is inclined to a Gaussian distribution in the normal gray distribution is even in the stripe direction and is inclined to a Gaussian distribution in the direction. However, the grayscale distribution of the image captured on the site normal direction. However, the grayscale distribution of the image captured onchanges the sitestrongly. changes Figure 5b shows the gray distribution in the stripe in direction anddirection the curveand fluctuate severely and strongly. Figurethat 5b shows that the gray distribution the stripe the curve fluctuate are accompanied by two dark curve segments marked with a red dotted frame. Figure 5c illustrates severely and are accompanied by two dark curve segments marked with a red dotted frame. Figurethe 5c

Sensors 2018, 18,FOR 4296 Sensors 2018, 18, x xFOR PEER Sensors 2018, 18, PEERREVIEW REVIEW

6 of6206of of 2020

illustratesthe thegray graydistribution distributioncurves curves in in aa normal normal direction direction at illustrates at approximately approximatelyeight eightsampling samplingpoints, points, gray distribution curves in a normal direction at approximately eight sampling points, presenting as presentingasasspiked, spiked,steep, steep,and and narrow. If If we increase the camera exposure time, then the presenting the time, camera then thebright bright spiked, steep, and narrow. If wenarrow. increase thewe camera exposure thenexposure the bright time, segments become segments become brighter, which leads leadsTherefore, to overexposure. overexposure. Therefore, the method isisto segments become which to Therefore, thebest best method toenhance enhance brighter, whichbrighter, leads to overexposure. the best method is to enhance the brightness of dark the brightness of dark segments and maintain the original ones. the brightness of dark segments and maintain the original ones. segments and maintain the original ones.

Figure 5. Gray distribution of the stripe image in the stripe and normal directions. (a) Stripe image from the outer camera. (b) Gray value distribution in the stripe direction. (c) Gray distribution curves of the stripe image in the normal direction.

Figure 6 displays the online stripe images captured from the outer camera. Three different segment types are present, namely, low brightness, overexposure, and stray light. Overexposure and other stray lights that are unavoidable in real measurement are reduced by decreasing the exposure time. Generally, the low-brightness stripe image is the main factor causing the leakage, which is the key focus of this study.

[Figure 6 panels ①–④ with legend markers for low brightness, overexposure, and reflection image.]

Figure 6. Online stripe images captured from the outer camera. ① and ③ present the stripe images captured with low exposure; ② and ④ exhibit the stripe images captured with high exposure.

captured low exposure. and ④ exhibit theofstipe images captured with7ahigh exposure. Figure with 7 shows the center②extraction results stripe images. Figure displays the images

with low brightness in tread segments. Figure 7b illustrates the stripe images with overexposure in shows the center extraction results of stripe images. Figure 7a displays the images theFigure flange 7segments. The former cases cause useful segments to not be extracted accurately. By with low brightness in tread segments. Figure 7b illustrates the stripe images with overexposure contrast, the key segments in the tread are almost not extracted, which causes wheel parameters toin the flange segments. The former cases cause useful segments to not be extracted accurately. By

Sensors 2018, 18, 4296

7 of 20

Figure 7 shows the center extraction results of stripe images. Figure 7a displays the images with low brightness in tread segments. Figure 7b illustrates the stripe images with overexposure in the flange segments. The former cases cause useful segments to not be extracted accurately. By contrast, the key segments in the tread are almost not extracted, which causes wheel parameters to not Sensors 2018, 18, x FOR PEER REVIEW 7 of 20 be calculated.

Figure 7. Center extraction results of stripe images. (a) Stripe images with low-brightness segments; (b) stripe images with overexposed segments.

3.1.2. Stripe Brightness Analysis

When the incident light is a laser, the wheel surface is irregularly undulating even though the light is regular. Therefore, the reflected light shows irregularity, with a bright, dark, or uneven distribution. Let α1 and α2 be the incident and scatter angles, respectively. Then, the scattered light intensity is described as:

Is(x, y) = I0(x, y) exp(−σφ²), σφ = (2π/λ) σh (cos α1 + cos α2), (1)

where I0(x, y) represents the incident light, σh denotes the surface roughness, and λ refers to the wavelength.

In the online wheel size dynamic measurement system, the near-infrared laser wavelength is λ = 808 nm, and the wheel surface roughness is σh ∈ [40, 100]. In Figure 8, the abscissa is the ratio of the surface roughness to the wavelength, and the vertical axis is the normalized intensity of scattered light. The scattered light increases as the ratio of the surface roughness to the wavelength increases and decreases as the view angle increases. Given that the relative roughness of the surface is small, it can be regarded as a fixed value. Curve S1 in Figure 8 illustrates that when the surface is smooth, the scattered light intensity is weak; when the angle between the observation direction and the surface normal vector is large, the scattered light is also weak. Curve S2 in Figure 8 indicates that the stripe observed by the camera becomes dark after a long period of friction of the tread area, whereas other rough areas remain bright.
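The behavior of Eq. (1) can be sketched numerically. In the snippet below, the angle and roughness values are illustrative only (they are not parameters of the measurement system); the function itself follows the formula term by term:

```python
import math

def scattered_intensity(i0, sigma_h, wavelength, alpha1, alpha2):
    """Scattered intensity Is from Eq. (1).

    i0         -- incident intensity I0
    sigma_h    -- surface roughness (same length unit as wavelength)
    wavelength -- laser wavelength lambda
    alpha1/2   -- incident and scatter angles in radians
    """
    sigma_phi = (2.0 * math.pi / wavelength) * sigma_h * (
        math.cos(alpha1) + math.cos(alpha2))
    return i0 * math.exp(-sigma_phi ** 2)

# Eq. (1) decays as the roughness-to-wavelength ratio grows: the component
# concentrated about the specular direction is strong for a smooth surface
# (the worn tread) and weak for a rough one.
smooth = scattered_intensity(1.0, 0.02, 0.808, 0.3, 0.3)
rough = scattered_intensity(1.0, 0.10, 0.808, 0.3, 0.3)
```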

Figure 8. Scattering intensity versus the ratio of the surface roughness to the wavelength.

Figure 4a exhibits that the stripe in the tread area is darker than other rough areas due to its smoothness. By contrast, Figure 4b shows a dark area in the inner rim. In the stripe image captured by the outer camera, shown in Figure 4a, the tread segments r1 and r2, due to the smooth surface and large view angle, allow minimal light into the digital sensor. In the stripe image from the inner camera, the reflectivity is reduced by oil contamination above the inner rim, and the stripe is dimmed in segments r1 and r2 of Figure 4b.

3.2. High-Dynamic Stripe Image Processing

3.2.1. Quality Evaluation of Stripe Image

A good laser stripe is a prerequisite for structured-light vision sensor measurement. Therefore, laser stripe image quality evaluation is conducive to determining whether, or which, stripe segments need to be enhanced. Xie Fei [33] conducted a statistical behavior analysis for the laser stripe center detector based on the Steger algorithm [34], determining the quantitative relationship between the center point uncertainty and its surrounding SNR. Based on the stripe extraction algorithm, the stripe satisfies a Gaussian distribution in the normal direction, and its center coordinate is the laser energy point. The features of an ideal stripe are uniform brightness in the stripe direction, a Gaussian distribution in the normal direction, and spatial continuity. Consequently, three aspects are considered: the gray distribution in the stripe direction, the gray distribution in the normal direction, and spatial continuity.

a. Stripe direction

Let gi be the brightness of each stripe pixel, µg be the mean value, and σg be the variance. Generally, a high µg and small σg indicate that the stripe has uniform brightness without fluctuations; in other words, the overall quality is good. Based on experience, an image with µg > 0.8 and σg < 0.2 is a good stripe image.

b. Normal direction

Based on the center extraction algorithm, the gray distribution in the normal stripe direction should follow a Gaussian, which induces a small positioning error. We determine the difference between the original and Gaussian-filtered stripes to describe the normal gray distribution; a small difference proves that the stripe conforms to the Gaussian distribution. We set the difference (namely, image noise) ρi = ∑|hi − hi ⊗ Gµ,σ| and calculate its mean µρ and variance σρ, where hi is the normal sample gray and Gµ,σ represents the Gaussian convolution mask. A small µρ and σρ manifest few burrs and uneven parts, which indicates good quality. The template size of the selected Gaussian filter should refer to the actual width of the stripe. Generally, we set µρ < 0.024 and σρ < 0.008 as the evaluation criteria in the normal direction.

c. Stripe continuity

A good laser stripe should be continuous in space. Correspondingly, when a disconnected segment is detected, image enhancement is implemented. We set Dp as the percentage of the effective stripe in the entire stripe. Let the distance between adjacent stripe points be di, the mean distance of di over the entire stripe be µd, and the variance be σd. Similarly, a small µd and σd indicate that the stripe is complete, with few disconnected parts. Usually, when the adjacent stripe points satisfy µd < 5 pixels, the stripe is valid for measurement; this parameter is determined by the real measurement requirement. In addition, µi and σi, i = g, ρ, d, are normalized to [0, 1]. Regarding stripe quality, the highest priority is stripe continuity, which ensures measurement reliability. Then, the gray value in the stripe direction and the noise in the normal direction determine the measurement accuracy.
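Criteria (a) and (c) can be computed directly from a list of stripe samples. The following sketch assumes gray values normalized to [0, 1]; the function names and demo data are illustrative, while the thresholds µg > 0.8, σg < 0.2, and µd < 5 pixels are the ones stated above:

```python
import math

def stripe_direction_ok(gray, mu_min=0.8, sigma_max=0.2):
    """Criterion (a): high mean brightness, low variance along the stripe."""
    mu = sum(gray) / len(gray)
    var = sum((g - mu) ** 2 for g in gray) / len(gray)
    return mu > mu_min and var < sigma_max

def continuity_ok(points, d_max=5.0):
    """Criterion (c): mean gap between adjacent stripe points below d_max px."""
    gaps = [math.dist(p, q) for p, q in zip(points, points[1:])]
    mu_d = sum(gaps) / len(gaps)
    return mu_d < d_max

good = stripe_direction_ok([0.9, 0.85, 0.88, 0.92])     # bright, uniform
dark = stripe_direction_ok([0.3, 0.25, 0.28, 0.31])     # low brightness
cont = continuity_ok([(0, 0), (1, 0), (2, 1), (3, 1)])  # ~1 px gaps
```

A segment failing either test would be routed to the brightness enhancement of Section 3.2.3 rather than directly to center extraction.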


3.2.2. Enhanced Stripe Segments Location

Based on the stripe quality evaluation criteria described in Section 3.2.1, we conduct stripe quality evaluation to locate the stripe segments to be enhanced. To obtain the coordinates of the stripe image center roughly, the stripe is segmented out, and binary operation and skeleton extraction are implemented, thereby obtaining the continuous stripe points along the stripe direction. Figure 9 shows the establishment process of the stripe quality regions from ① to ④. Quality evaluation is implemented at each stripe point in accordance with the evaluation criteria; thus, we can obtain low-quality segments for brightness enhancement and extract good-quality segments, avoiding additional calculations.

2

3

4

1

2

3

4

R1

R3 R2

(a)

(b)

Figure 9. Location of stripe segments that are to be enhanced. (a) Images captured by the outer camera; (b) inner images. ① is the original image, ② shows the segmented and binarized image, ③ presents the stripe skeleton, and ④ indicates the regions established through stripe quality evaluation.

Figure 9 displays the images captured by the outer and inner cameras, where ① is the original image, whose brightness is uneven and which contains both good- and low-quality segments. ② presents the segmented and binarized image obtained from the original image. ③ shows the stripe skeleton points extracted from ②. In ④, the red points indicate the stripe segments with good quality, and the green dotted rectangle regions are the segments to be enhanced, e.g., segments R1 and R2 in the tread and flange regions and R3 in the rim.

3.2.3. Image Brightness Enhancement

Retinex theory was introduced by Edwin H. Land [32], who found that the color perceived by the human vision system is determined by the reflection of an object and does not depend on the light from the scene. Based on this analysis, the stripe image brightness depends mainly on the surface reflectivity. Hence, stripe brightness enhancement based on Retinex is appropriate for solving high-reflection problems. The Retinex algorithm holds that the image is composed of two parts, namely, the reflected and incident components. To acquire the reflection component and further restore the original appearance of the object, the illuminance component is obtained by computing the brightness between pixels. The image can be expressed as follows:

I(x, y) = R(x, y) · L(x, y), (2)

where L(x, y) represents the incident light, R(x, y) denotes the reflection property of the object, and I(x, y) indicates the image to be enhanced. In fact, the incident light L(x, y) directly determines the dynamic range a pixel can reach in the image, and R(x, y) determines the intrinsic nature of the image. The purpose of Retinex theory is to obtain R(x, y) from I(x, y). Equation (2) is transformed into the logarithmic domain:

log(I(x, y)) = log(R(x, y)) + log(L(x, y)). (3)


Land proposed a center/surround Retinex algorithm with the basic idea that the brightness of each center pixel is estimated by setting different weights around the pixel. Jobson finally determined that a Gaussian surround function achieves good results. Equation (3) can be expressed as:

log(R(x, y)) = log(I(x, y)) − log(F(x, y) · I(x, y)), (4)

where the Gaussian function F(x, y) = (1/(2πσ²)) exp(−(x² + y²)/(2σ²)), and σ refers to the scale parameter, which directly affects the estimation of the incident component. When σ is small, the Gaussian template is small and the Gaussian function is relatively steep. The incident component estimated after convolution is then relatively rugged, with strong dynamic compression; detail is retained, but brightness fidelity is poor. On the contrary, when σ is large, the Gaussian template is large and the Gaussian function is relatively gentle; the convolved incident component is relatively smooth and brightness fidelity is good, but the dynamic compression ability is poor and the enhanced details are not evident. Thus, the Retinex algorithm cannot guarantee both detail and brightness enhancement with a single σ. The multi-scale Retinex (MSR) is a generalization of the single-scale Retinex algorithm, ensuring both detail and brightness enhancement:

r(x, y) = ∑_{j=1}^{K} Wj { log I(x, y) − log[ I(x, y) · Fj(x, y) ] }, (5)

where r(x, y) = log(R(x, y)), K is the total number of scales σ, Wj signifies the weight, and ∑_{j=1}^{K} Wj = 1.

Under normal circumstances, MSR adopts high, medium, and low scales, that is, K = 3 and W1 = W2 = W3 = 1/3. When the reflection is computed, the enhanced pixels are mapped by normalizing the corresponding reflection to [0, 255]. The pseudocode of high-dynamic stripe image enhancement is shown in Algorithm 1 as follows:

Algorithm 1. High-Dynamic Stripe Image Enhancement
Input: Low-brightness stripe image I.
Output: High-quality stripe image Ie.
1: Function stripeImgHandler
2:   Compute surface reflectivity f;
3:   Normalize f to gray values and get enhanced image It;
4:   Calculate the initial value of the segmentation threshold T:
5:     T = 0.5 × (min(f) + max(f))
6:   Set segment flag = false;
7:   While (!flag) do
8:     Find the image indices g = find(f > T);
9:     Calculate the stripe area segmentation threshold:
10:      Tn = 0.5 × (mean(f(g)) + mean(f(~g)))
11:      flag = abs(T − Tn) < 0.1;
12:      T = Tn
13:   End while
14:   Segment It with T;
15:   return Ie;
16: end function
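A minimal MSR sketch following Eqs. (4) and (5) is given below. The three scale values are illustrative defaults, not the paper's tuned parameters, and the separable Gaussian blur is a plain numpy stand-in for the surround function:

```python
import numpy as np

def _gauss_blur(x, sigma):
    """Separable Gaussian blur with edge padding (surround function Fj)."""
    r = int(3 * sigma)
    t = np.arange(-r, r + 1)
    k = np.exp(-t ** 2 / (2.0 * sigma ** 2))
    k /= k.sum()
    pad = np.pad(x, r, mode='edge')
    tmp = np.apply_along_axis(lambda v: np.convolve(v, k, 'valid'), 1, pad)
    return np.apply_along_axis(lambda v: np.convolve(v, k, 'valid'), 0, tmp)

def multi_scale_retinex(img, sigmas=(5, 15, 45), eps=1.0):
    """Eq. (5): r = sum_j Wj [log I - log(Fj * I)], with Wj = 1/K.

    The reflection estimate is then normalized to [0, 255] as in the paper.
    """
    x = img.astype(np.float64) + eps          # offset avoids log(0)
    r = np.zeros_like(x)
    for s in sigmas:                          # accumulate each scale
        r += np.log(x) - np.log(_gauss_blur(x, s))
    r /= len(sigmas)                          # equal weights Wj = 1/K
    r = (r - r.min()) / (r.max() - r.min() + 1e-12)
    return (255 * r).astype(np.uint8)

# A dim stripe on a dark background is pulled up relative to its surround.
demo = np.zeros((64, 64))
demo[30:34, :] = 20
enhanced = multi_scale_retinex(demo)
```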

Figure 10 shows the stripe images before and after enhancement. Figure 10a presents the original image with uneven luminance distribution. In particular, the brightness of the middle region (which is used to calculate the tread wear) is dark, which easily affects tread-base point extraction. In Figure 10b, the tread area is recovered, thereby ensuring the accuracy of stripe center extraction.

Figure 10. Stripe images before and after brightness enhancement. (a) Stripe image before enhancement; (b) stripe image after enhancement.

3.2.4. Stripe Segmentation

The enhanced image often contains an enhanced background and stray stripes; thus, segmenting the effective stripe is necessary. This study does not directly use an image segmentation method based on a gray threshold, because the image brightness changes greatly and is difficult to segment properly. Nevertheless, the reflection of the stripe area is distinct from the background. We adopt reflectivity to generate a valid stripe mask for filtering out the enhanced stripe. Figure 11a,b shows the gray and reflectivity distributions, respectively. The gray image is not considerably different from the background. By contrast, the reflectivity distribution is evident, and the stripe section can be segmented accurately.
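The reflectivity-based mask can be generated with the iterative threshold of Algorithm 1, lines 5-13. In this numpy sketch, the convergence tolerance 0.1 follows the pseudocode, while the demo reflectivity values are purely illustrative:

```python
import numpy as np

def reflectivity_mask(f, tol=0.1):
    """Iteratively split reflectivity values f into stripe / background.

    Starts from the midpoint of the value range, then repeatedly re-centers
    the threshold between the two class means until it stabilizes.
    """
    t = 0.5 * (f.min() + f.max())
    while True:
        fg = f > t
        t_new = 0.5 * (f[fg].mean() + f[~fg].mean())
        if abs(t - t_new) < tol:
            return f > t_new
        t = t_new

# Bright stripe reflectivity against a dim background:
refl = np.array([[10, 12, 11, 200, 210, 205],
                 [11, 13, 10, 198, 215, 202]], dtype=float)
mask = reflectivity_mask(refl)
```

The resulting boolean mask then filters the enhanced image It, keeping only the stripe region.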

Figure 11. Stripe image gray and reflectivity distribution. (a) Gray distribution of the stripe image; (b) reflectivity distribution of the stripe image.

4. Experiments and Evaluations

4.1. Experimental Setup

To verify the effectiveness of the proposed method, we applied it to the online wheel size measurement system installed in Lixian County, Hebei Province, China. Figure 12 shows that four structured-light vision sensors are equipped under the rail, and each wheel is measured by two structured-light sensors mounted inside and outside the rail. Given that this device is installed outdoors and on the main line, complex lighting with various interferences, complicated wheel appearance, and reflectivity cause the collected images to show high-dynamic characteristics, especially in the wheel tread. Table 1 lists the sensor configuration parameters.


Figure 12. Wheel size measurement system in an outdoor complex environment.

Table 1. Calibration parameters of wheel sensors.

Camera intrinsics (fx, fy, u0, v0, k1, k2):
Camera 1: 1978.7510, 1978.6804, 680.5250, 537.8380, −0.1383, 0.1901
Camera 2: 1971.5685, 1972.6837, 697.6733, 536.6688, −0.1328, 0.1476

Light-plane equations (a, b, c, d):
PlaneFunc1: 0.3190, 0.8092, −0.4934, 204.2166
PlaneFunc2: 0.1931, −0.8568, 0.4782, −196.3384

Transformation T12:
R = [−0.4728 −0.4973 −0.7275; 0.6178 0.4016 −0.6760; 0.6283 −0.7690 0.1173],
t = [333.1798; 279.5493; 348.4826]

4.2. Comparison of Stripe Image Processing Results

The processing methods include the proposed method (PM), histogram equalization (HE), contrast adjustment (CA), logarithmic transformation (LT), convolution filter (CF), median filtering (MF), and sharpening operation (SO). The abbreviations for each method are utilized to facilitate the description.
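The parameters of Table 1 define, for each camera, a back-projection ray and a light plane; intersecting them yields a 3D profile point. The sketch below ignores the distortion coefficients k1, k2 and assumes the plane is written in the form a·x + b·y + c·z + d = 0, which is one common convention (the paper does not state which sign convention it uses):

```python
def pixel_to_3d(u, v, fx, fy, u0, v0, plane):
    """Intersect the camera ray through pixel (u, v) with the light plane.

    Pinhole model without distortion: the ray is t * (xn, yn, 1) in camera
    coordinates, and t is solved from a*x + b*y + c*z + d = 0.
    """
    xn = (u - u0) / fx            # normalized image coordinates
    yn = (v - v0) / fy
    a, b, c, d = plane
    t = -d / (a * xn + b * yn + c)  # depth along the ray
    return (t * xn, t * yn, t)

# Camera 1 intrinsics and PlaneFunc1 from Table 1, an arbitrary pixel:
P = pixel_to_3d(700.0, 600.0,
                1978.7510, 1978.6804, 680.5250, 537.8380,
                (0.3190, 0.8092, -0.4934, 204.2166))
```

By construction the returned point lies on the light plane; the transformation T12 would then bring points from one sensor frame into the other to stitch the full wheel cross contour.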

4.2.1. Comparison of Stripe Quality

To verify the improvement of the stripe image, its quality before and after the enhancement treatment was measured. We select four data sets, each containing 100 images, where data sets 1, 2, 3, and 4 contain 40%, 50%, 60%, and 70% low-brightness stripes (induced by a highly reflective object surface), respectively. The images were enhanced by applying the abovementioned methods, and the stripe quality evaluation parameters were calculated. Table 2 shows the statistics of the image quality evaluation parameters for the different data sets and enhancement methods. The stripe quality evaluation parameters corresponding to the proposed method are the best on the different data sets, indicating that the proposed method can effectively improve the stripe quality under highly reflective conditions.

4.2.2. Time Statistics

To verify the processing efficiency of each method, the calculation time of the different processing steps is calculated separately. The computer configuration is a 32-bit Windows 7 operating system with a 3.1 GHz CPU and 8 GB memory. Table 3 lists the time consumption of each process. To simplify the expression, SD, SE, SS, SCE, TBA, and AC stand for stripe detection, stripe enhancement, stripe segmentation, stripe center extraction, total before acceleration, and after acceleration, respectively.

Table 2. Stripe quality evaluation under different processing methods.

Data    Methods  µg     σg     µρ     σρ     Dp        µd       σd
DATA1   Origin   0.767  0.230  0.060  0.176  62.016%   0.142    0.214
        LT       0.671  0.130  0.053  0.154  47.651%   0.221    0.294
        MF       0.829  0.179  0.064  0.179  48.756%   0.233    0.271
        SO       0.749  0.220  0.058  0.170  60.221%   0.215    0.250
        Our      0.769  0.153  0.063  0.170  98.480%   0.024    0.009
DATA2   Origin   0.820  0.184  0.007  0.023  53.280%   0.116    0.172
        LT       0.673  0.090  0.007  0.027  51.181%   0.076    0.181
        MF       0.882  0.140  0.007  0.025  50.524%   0.091    0.198
        SO       0.794  0.171  0.007  0.024  52.624%   0.116    0.179
        Our      0.807  0.131  0.008  0.027  99.868%   0.008    0.0
DATA3   Origin   0.838  0.180  0.008  0.020  38.523%   - - - -  - - - -
        LT       0.689  0.082  0.008  0.029  34.496%   0.014    0.0
        MF       0.939  0.133  0.009  0.026  35.167%   0.010    0.0
        SO       0.818  0.187  0.008  0.020  37.583%   - - - -  - - - -
        Our      0.690  0.188  0.005  0.017  96.004%   0.016    0.012
DATA4   Origin   0.590  0.191  0.005  0.026  53.667%   0.673    0.0
        LT       0.582  0.228  0.007  0.041  34.704%   0.344    0.4743
        MF       0.676  0.228  0.004  0.027  35.241%   0.233    0.384
        SO       0.563  0.178  0.005  0.028  53.846%   0.677    0.0
        Our      0.564  0.264  0.006  0.028  90.595%   0.038    0.038

Note: "- - - -" indicates that the result is calculated unsuccessfully.

Table 3. Time statistics for stripe image processing (ms).

Methods  SD       SE    SS       SCE   TBA   Levels  AC
HE       - - - -  28    - - - -  520   548   18      30
CA       - - - -  15    - - - -  427   442   14      31
LT       - - - -  29    - - - -  71    100   4       25
CF       - - - -  90    - - - -  57    147   5       30
MF       - - - -  14    - - - -  24    38    2       19
SO       - - - -  24    - - - -  27    51    2       26
PM       222      137   32       30    421   13      33

Note: "- - - -" indicates that the item is missing.

We adopt multi-thread flow acceleration and a serialized output program, which can theoretically reach unlimited speed. However, when considering the hardware overhead, including thread switching and additional calculation, the actual processing speed based on 13-level acceleration can increase to 30 Hz at train speeds of up to 189 km/h. Although the proposed method induces numerous additional calculations, it can still meet the measurement needs of high-speed trains through multi-thread flow acceleration, which satisfies the online testing requirements of normal freight and passenger trains. Furthermore, the acceleration framework is versatile and suitable for other image processing methods.

4.2.3. Comparison of Various Image Enhancement Methods

To verify the effectiveness of the proposed method, we compare various enhancement methods. Figures 13 and 14 show the outer and inner stripe image enhancement results, respectively. Each row contains four types of images to be processed and presents the processing result of one image enhancement method. Figures 13 and 14 illustrate that this method achieves a greater improvement than the other methods. By contrast, the HE method leads to serious stripe overexposure. The CA method adjusts all pixels' gray values, but the brightness of the enhanced part remains relatively low. LT, MF, and SO do not play a great role in brightness enhancement.
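Returning to the acceleration scheme of Section 4.2.2: the multi-thread flow with serialized output amounts to a frame pipeline in which frames are processed concurrently and the results are re-emitted in frame order. A minimal sketch of that pattern (the stage function is a placeholder, not the system's actual SD/SE/SS/SCE code, and the worker count mirrors the 13 acceleration levels only for illustration):

```python
from concurrent.futures import ThreadPoolExecutor

def process_frame(frame_id):
    """Placeholder for the per-frame chain SD -> SE -> SS -> SCE."""
    return frame_id, f"contour-{frame_id}"

def pipeline(frame_ids, levels=13):
    """Run frames concurrently, then emit results serialized by frame index."""
    frame_ids = list(frame_ids)
    with ThreadPoolExecutor(max_workers=levels) as pool:
        results = dict(pool.map(process_frame, frame_ids))
    # serialized output: frame order is restored regardless of finish order
    return [results[i] for i in frame_ids]

out = pipeline(range(5))
```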



Figure 13. Enhanced results of outer stripe images based on different methods. Columns (1)-(4) present the tread segments with different missing percentages. Row (a) displays the raw images. Rows (c-h) exhibit the results processed by the HE, CA, LT, CF, MF, and SO methods, respectively, and row (b) shows the results generated through the proposed method.


Figure 14. Enhanced results of inner stripe images based on different methods. Columns (1)-(4) present the tread segments with different missing percentages. Row (a) displays the raw images. Rows (c-h) exhibit the results processed by the HE, CA, LT, CF, MF, and SO methods, respectively, and row (b) shows the results generated through the proposed method.

Figures 15 and 16 illustrate the outer and inner stripe image center extraction results based on different methods. The center extraction results of the other methods are unsatisfactory and thus cannot be applied to actual measurement. Figure 15c,f presents numerous stray stripes extracted. In Figure 15d,e,g,h, the stripe is incomplete for measurement. In Figure 15b, the entire stripe is


extracted correctly, especially in the tread area. Correspondingly, the same result exists in Figure 16; that is, only the proposed method can guarantee the integrity and accuracy of stripe extraction.

Figure 15. Center extraction results of outer stripe images based on different methods. Columns (1)–(4) present the tread segments with different missing percentages. Row (a) shows the raw images. Row (b) exhibits the center extraction results of images processed through the proposed method, and rows (c)–(h) display the center extraction results of images processed through the HE, CA, LT, CF, MF, and SO methods, respectively.

Figure 16. Center extraction results of inner stripe images based on different methods. Columns (1)–(4) present the tread segments with different missing percentages. Row (a) shows the raw images. Row (b) exhibits the center extraction results of images processed through the proposed method, and rows (c)–(h) display the center extraction results of images processed through the HE, CA, LT, CF, MF, and SO methods, respectively.

Given that the coordinates of the center points obtained by the other methods differ greatly from the stripe required for actual measurement, further comparison is unnecessary. In this study, only the reconstructed result corresponding to the original image is compared with that of the proposed method. Figure 17 shows the wheel cross contour reconstruction without image processing. Some parts are missing in the stripe images, including the tread and inner rim, causing the final wheel contour to be incomplete and unusable for calculation. On the contrary, Figure 18 illustrates that all stripe images are enhanced properly and the final wheel cross contour is complete; thus, the originally missed wheels can still be measured accurately.

Figure 17. Wheel cross contour reconstruction without image processing. Column (1) presents the raw images. Column (2) shows the center extraction results of (1). Column (3) exhibits the wheel cross contour results of each wheel.

4.3. Performance Evaluation of Online Measurement

To evaluate the online measuring performance of the proposed method, we repeated the experiment with the standard wheel, performed comparative experiments under highly reflective conditions, and conducted onsite dynamic measurement. As the precise size of each passing wheel cannot be determined onsite, the statistical leakage and false-positive rates are mainly reported.

4.3.1. Measurement Accuracy

A standard freight wheel of known size is adopted as the measured object. We implemented 100 repeated experiments under various highly reflective conditions (the highly reflective wheel moving back and forth in the measuring process) and computed statistics for three key wheel size parameters: flange thickness, tread wear, and rim thickness. Table 4 presents the measurement accuracy of the wheel size obtained by the different methods. Only the results of the PM, HE, CF, and SO methods could be calculated, and the measurement accuracy of the proposed method is considerably higher than that of the other methods.
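The accuracy statistics of such a repeated-measurement experiment can be reproduced from raw data in a few lines; the nominal size and noise level below are synthetic placeholders, not the data behind Table 4:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical 100 repeated measurements of flange thickness (mm);
# the nominal value and noise level are synthetic, not the paper's data.
nominal = 32.0
meas = nominal + rng.normal(0.0, 0.05, size=100)

# One common accuracy figure: maximum absolute deviation from the known
# size of the standard wheel; the sample std captures repeatability.
max_dev = np.abs(meas - nominal).max()
std = meas.std(ddof=1)
print(f"max deviation {max_dev:.3f} mm, std {std:.3f} mm")
```

Reporting the worst-case deviation over all repetitions (rather than only the mean error) is what makes the comparison meaningful for a safety-related inspection task.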

Figure 18. Wheel cross contour reconstruction with image processing. Column (1) presents the raw images. Column (2) shows the center extraction results of (1). Column (3) displays the wheel cross contour results of each wheel.

Table 4. Measurement accuracy of wheel size by using different methods (mm).

Processing Method    Flange Thickness    Tread Wear    Rim Thickness
WP                   ------              ------        ------
PM                   0.17                0.14          0.56
HE                   5.82                5.40          8.28
CA                   ------              ------        ------
LT                   ------              ------        ------
CF                   6.62                2.51          2.40
MF                   ------              ------        ------
SO                   6.20                5.16          2.00

Note: "WP" stands for "without processing", and "------" indicates that the result could not be calculated.

4.3.2. Detection Rate Statistics

To evaluate the robustness and accuracy of the proposed method, equipment installed on the Shuohuang railway, Hebei, China, is selected as the experimental subject. We analyze the leakage and false-positive rates from 00:00:00 to 13:00:00 on 12 November 2017, during which a total of 56 vehicles and 23,436 wheel sets passed through the measurement equipment.

In practical application, the inspection department is most concerned about the reliability and accuracy of the equipment during operation. The leakage rate is used to evaluate the reliability of the equipment and is a prerequisite for accurate measurement. The false-positive rate is utilized to check the accuracy of the equipment, and lowering it can effectively improve inspector efficiency. Figure 19 illustrates the leakage and false-positive rates of the different methods in an outdoor complex environment.
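Both rates follow directly from the detection counts. The sketch below uses hypothetical counts and a hypothetical rate definition (false alarms over successfully measured wheel sets), chosen only to land near the values reported for the proposed method:

```python
# Leakage rate: wheel sets that pass without a valid measurement.
# False-positive rate: correctly measured wheel sets that are
# nevertheless flagged as faulty. All counts below are hypothetical,
# chosen to land near the rates reported for the proposed method.
total_wheel_sets = 23436
missed = 89        # hypothetical: no valid measurement produced
false_alarms = 33  # hypothetical: good wheels incorrectly flagged

leakage_rate = 100.0 * missed / total_wheel_sets
false_positive_rate = 100.0 * false_alarms / (total_wheel_sets - missed)
print(f"leakage {leakage_rate:.2f}%, false-positive {false_positive_rate:.2f}%")
# -> leakage 0.38%, false-positive 0.14%
```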

Figure 19. Leakage and false-positive rates of different methods in an outdoor complex environment. (a) Leakage rates. (b) False-positive rates.

Figure 19a shows that only the SO, CF, and proposed methods can effectively reduce the leakage rate; because the other methods provide only limited brightness enhancement of the original image, the leakage rate is not considerably improved. Compared with SO and CF, the proposed method reduces the leakage rate to 0.38%. Moreover, as shown in Figure 19b, the false-positive rates of the SO and CF methods are considerably greater than the 0.14% achieved by the proposed method. Therefore, the proposed method can effectively reduce both the leakage and false-positive rates and can meet the requirements of the railway inspection department.

5. Conclusions

High-speed, dynamic, and precise laser vision sensors are widely used for online measurement in complex environments, for example, in online wheel size measurement systems. However, the stripe images show high-dynamic characteristics of low SNR due to onsite highly reflective conditions, which seriously affects measurement accuracy and reliability. To address this issue, a high-precision and high-reliability wheel size measurement system suitable for highly reflective conditions was proposed in this study. Initially, a stripe image quality evaluation mechanism was established and used to assess which part of the stripe should be enhanced. MSR theory was then adopted to enhance the stripe brightness, and reliable stripe segmentation was achieved through reflectivity, thereby solving the problem that low-brightness stripes cannot be extracted under reflective conditions. Physical experiments compared the results before and after enhancement, proving that the method can effectively improve measurement reliability and greatly reduce the leakage and false-positive rates. The method thus has great practical value for improving the accuracy and reliability of a wheel size measurement system under highly reflective conditions. Nevertheless, some room for improvement remains: in the enhanced images, the width of the stripe changes slightly. Therefore, research on a multi-scale stripe center extraction algorithm is necessary so that the stripe center points can be extracted adequately. Furthermore, to achieve a higher measurement frequency, we plan to utilize FPGA or GPU hardware to accelerate stripe processing and center extraction, rendering the method suitable for a wider range of applications. In some cases, improvement was impossible: for the stray stripe images with extremely high reflectivity and overexposure shown in Figure 7a, decreasing the exposure and rejecting stray stripes was the best available approach, and where no laser was captured in the tread segments, as in Figure 7b, our method likewise cannot restore the stripes, which is unavoidable.

Author Contributions: X.P. conceived and designed the experiments. X.P. and Z.L. performed the experiments and analyzed the data. X.P. wrote the paper under the supervision of Z.L. and G.Z.

Funding: This work was supported by the National Natural Science Foundation of China under Grant No. 51575033. This work was also supported by the Central University in Beijing Major Achievements Transformation Project (Online Dynamic Detection System for Train Pantograph and Catenary) and the National Key Scientific Instrument and Equipment Development Projects of China (2012YQ140032).

Conflicts of Interest: The authors declare no conflict of interest.

References

1. Huh, S.; Shim, D.H.; Kim, J. Integrated navigation system using camera and gimbaled laser scanner for indoor and outdoor autonomous flight of UAVs. In Proceedings of the IEEE International Conference on Intelligent Robots and Systems, Tokyo, Japan, 3–7 November 2013; pp. 3158–3163.
2. Krummenacher, G.; Cheng, S.; Koller, S. Wheel Defect Detection with Machine Learning. IEEE Trans. Intell. Transp. Syst. 2018, 19, 1176–1187.
3. Liu, L.; Zhou, F.; He, Y. Vision-based fault inspection of small mechanical components for train safety. IEEE Trans. Intell. Transp. Syst. 2016, 10, 130–139.
4. Torabi, M.; Mousavi, S.M.; Younesian, D. A High Accuracy Imaging and Measurement System for Wheel Diameter Inspection of Railroad Vehicles. IEEE Trans. Ind. Electron. 2018, 65, 8239–8249.
5. Liu, Z.; Sun, J.; Wang, H.; Zhang, G. Simple and fast rail wear measurement method based on structured light. Opt. Lasers Eng. 2011, 49, 1343–1351.
6. Wu, K.; Zhang, J.; Yan, K. Online measuring method of the wheel set wear based on CCD and image processing. Proc. SPIE 2005, 5633, 409–415.
7. Xing, Z.; Chen, Y.; Wang, X. Online detection system for wheel-set size of rail vehicle based on 2D laser displacement sensors. Optik 2016, 127, 1695–1702.
8. Wang, B.; Liu, W.; Jia, Z. Dimensional measurement of hot, large forgings with stereo vision structured light system. Proc. Inst. Mech. Eng. 2011, 225, 901–908.
9. Di Stefano, E.; Ruffaldi, E.; Avizzano, C.A. Automatic 2D-3D vision based assessment of the attitude of a train pantograph. In Proceedings of the IEEE International Smart Cities Conference, Trento, Italy, 12–15 September 2016.
10. Meng, H.; Yang, X.R.; Lv, W.G.; Cheng, S.Y. The Design of an Abrasion Detection System for Monorail Train Pantograph Slider. Railw. Stand. Des. 2017, 8, 031.
11. Chi, M. Influence of wheel-diameter difference on running security of vehicle system. Traffic Transp. Eng. 2008, 8, 19–22.
12. Gong, Z.; Sun, J.; Zhang, G. Dynamic Measurement for the Diameter of a Train Wheel Based on Structured-Light Vision. Sensors 2016, 16, 564.
13. Feng, Q.; Chen, S. New method for automatically measuring geometric parameters of wheel sets by laser. In Proceedings of the Fifth International Symposium on Instrumentation and Control Technology, Beijing, China, 24–27 October 2003; pp. 110–113.
14. Gao, Y.; Feng, Q.; Cui, J. A simple method for dynamically measuring the diameters of train wheels using a one-dimensional laser displacement transducer. Opt. Laser Eng. 2014, 53, 158–163.
15. Zhang, Z.; Gao, Z.; Liu, Y.; Jiang, F. Computer Vision Based Method and System for Online Measurement of Geometric Parameters of Train Wheel Sets. Sensors 2012, 12, 334–346.
16. Zhang, Z.; Su, Y.; Gao, Z.; Wang, G.; Ren, Y.; Jiang, F. A novel method to measure wheelset parameters based on laser displacement sensor on line. Opt. Metrol. Insp. Ind. Appl. 2010, 7855, 78551.
17. Chen, X.; Sun, J.; Liu, Z.; Zhang, G. Dynamic tread wear measurement method for train wheels against vibrations. Appl. Opt. 2015, 54, 5270–5280.
18. Cheng, X. A Novel Online Detection System for Wheelset Size in Railway Transportation. Sensors 2016, 1, 1–15.
19. MERMEC Group. Wheel Profile and Diameter. Available online: http://www.mermecgroup.com/diagnosticsolutions/train-monitoring/87/1/wheelprofile-and-diameter.php (accessed on 15 January 2016).
20. IEM, Inc. Wheel Profile System. Available online: http://www.iem.net/wheel-profilesystem-dp2 (accessed on 15 January 2016).
21. KLD Labs, Inc. Wheel Profile Measurement. Available online: http://www.kldlabs.com/index.php?s=wheel+profile+measurement (accessed on 15 January 2016).
22. Jin, Y.; Fayad, L.M.; Laine, A.F. Contrast enhancement by multiscale adaptive histogram equalization. In Proceedings of the International Symposium on Optical Science and Technology, San Diego, CA, USA, 29 July–3 August 2001.
23. Hang, F.L.; Levine, M.D. Finding a small number of regions in an image using low-level features. Pattern Recognit. 2002, 35, 2323–2339.
24. Farid, H. Blind inverse gamma correction. IEEE Trans. Image Process. 2001, 10, 1428–1433.
25. Lucchese, L.; Mitra, S.K. Color Image Segmentation: A State-of-the-Art Survey. Eng. Comput. 2001, 67, 1–15.
26. Subr, K.; Majumder, A.; Irani, S. Greedy Algorithm for Local Contrast Enhancement of Images. In Proceedings of the International Conference on Image Analysis and Processing, Cagliari, Italy, 6–8 September 2005.
27. Land, E.H. Recent Advances in Retinex Theory and Some Implications for Cortical Computations: Color Vision and the Natural Image. Proc. Natl. Acad. Sci. USA 1983, 80, 5163–5169.
28. Jobson, D.J.; Rahman, Z.; Woodell, G.A. Properties and performance of a center/surround retinex. IEEE Trans. Image Process. 1997, 6, 451–462.
29. Tarel, J.P.; Hautiere, N.; Caraffa, L. Vision Enhancement in Homogeneous and Heterogeneous Fog. IEEE Intell. Transp. Syst. 2012, 4, 6–20.
30. Jobson, D.J.; Rahman, Z.; Woodell, G.A. A multiscale retinex for bridging the gap between color images and the human observation of scenes. IEEE Trans. Image Process. 1997, 6, 965–976.
31. Zhang, Z. A Flexible New Technique for Camera Calibration. IEEE Trans. Pattern Anal. Mach. Intell. 2000, 22, 1330–1334.
32. Zhou, F.; Zhang, G. Complete calibration of a structured light stripe vision sensor through planar target of unknown orientations. Image Vis. Comput. 2005, 1, 59–67.
33. Qi, L.; Zhang, Y.; Zhang, X.; Wang, S.; Xie, F. Statistical behavior analysis and precision optimization for the laser stripe center detector based on Steger's algorithm. Opt. Express 2013, 21, 13442–13449.
34. Steger, C. An unbiased detector of curvilinear structures. IEEE Trans. Pattern Anal. Mach. Intell. 1998, 20, 113–125.

© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).