Article

Three-Dimensional Digital Image Correlation Based on Speckle Pattern Projection for Non-Invasive Vibrational Analysis

by Alvaro Souto Janeiro 1,*, Antonio Fernández López 1, Marcos Chimeno Manguan 1 and Pablo Pérez-Merino 2,*

1 Department of Aeronautics, Universidad Politécnica de Madrid, 28040 Madrid, Spain
2 Centre for Microsystems Technology, Ghent University and Imec, Technologiepark 126, 9052 Ghent, Belgium
* Authors to whom correspondence should be addressed.
Sensors 2022, 22(24), 9766; https://doi.org/10.3390/s22249766
Submission received: 9 November 2022 / Revised: 6 December 2022 / Accepted: 12 December 2022 / Published: 13 December 2022
(This article belongs to the Special Issue Structural Health Monitoring Based on Sensing Technology)

Abstract

Non-contact vibration measurements are relevant for non-invasively characterizing the mechanical behavior of structures. This paper presents a novel methodology for full-field vibrational analysis at high frequencies using the three-dimensional digital image correlation technique combined with the projection of a speckle pattern. The method includes stereo calibration and image processing routines for accurate three-dimensional data acquisition. Quantitative analysis allows the extraction of several deformation parameters, such as the cross-correlation coefficients, shape and intensity, as well as the out-of-plane displacement fields and mode shapes. The potential of the methodology is demonstrated on an Unmanned Aerial Vehicle wing made of composite material, followed by experimental validation with reference accelerometers. The results obtained with the projected three-dimensional digital image correlation show errors below 5% compared with the accelerometer measurements, therefore demonstrating high sensitivity for detecting the dynamic modes of structures made of composite material.

1. Introduction

Non-contact optical sensing technologies combined with data processing algorithms play an essential role in non-invasively assessing a structure’s dynamic behavior. Structures with non-uniform material properties and/or complex geometry exhibit spatially localized and temporally transient structural behaviors. Therefore, full-field vibration-based measurements at very high spatial resolution are required to characterize and analyze the structural dynamics and to accurately identify the modal parameters (i.e., mode shapes, frequencies and damping ratios) [1,2].
In recent years, optical-based vibration measurements have been proposed to measure structural dynamics, as they overcome the limitations of contact sensors (e.g., accelerometers), provide high-resolution measurements and do not change the structural dynamic behavior during testing. A number of optical techniques exist for analyzing the dynamic response of a structure, such as laser Doppler vibrometry [3], electronic speckle interferometry [4,5], digital speckle shearography [6,7] and digital image correlation [8,9,10,11], among others. Laser Doppler vibrometry is the most extensively used technique for high spatial resolution vibration measurements owing to its superior resolution, sensitivity, robustness and non-contact nature, but it is an expensive and time-consuming method that only offers pointwise measurements. Alternatively, scanning and continuous scanning laser Doppler vibrometry integrate a scanning system for full-field vibration measurements; however, several authors have described (1) pseudo-vibration due to surface roughness and (2) measurement errors due to defocusing and motion artifacts at some scanning points [12,13].
Benefiting from the rapid development of high-speed advanced sensors and image processing techniques, digital image correlation (DIC) has been presented as a contactless alternative to the techniques above due to its simplicity, robustness, full-field and large-deformation capabilities, and sub-pixel precision better than 0.01 pixels when the Newton–Raphson method [14] and bicubic interpolation [15,16] are used. DIC is a simple optical method that uses image registration to measure the strain and out-of-plane displacement of a structure and can be extended to full-field measurements in three dimensions (3D) by incorporating two sensors or other stereo-imaging strategies (3D-DIC) [17,18,19,20].
Optimal use of 3D-DIC requires synchronizing the mechanical load with high-resolution, high-speed sensors to provide the tightest possible tolerance for the spatio-temporal displacement calculation. Furthermore, the surface of interest must have a high-quality speckle pattern, which deforms together with the structure [21]. This configuration, together with the development of image processing algorithms, allows the matching of dense sets of points of the speckle pattern over a temporal sequence: before (unloaded), during (loaded) and after (unloaded) the mechanical excitation. On this matter, various authors have proposed methodological approaches, on the one hand, to assess the quality of the speckle pattern and, on the other hand, to improve the performance of the computational algorithms, such as the Shannon entropy [22], the mean intensity gradient [23], the mean intensity of the second derivative or, more recently, the sum of the square of subset intensity gradients (SSSIG) [24,25].
The speckle pattern for 3D-DIC can be classified into three groups: (i) a natural random pattern [26]; (ii) a synthetic speckle pattern, i.e., black dots randomly distributed on a white background using high-contrast paint sprayed with an airbrush gun [27]; and (iii) a projected speckle pattern [28,29]. The synthetic speckle pattern is the most widely used for 3D-DIC applications; however, the physical surface treatment can be a source of contamination for the structure and the manufacturing facilities, and it presents the main limitation of speckle distortion in high-temperature environments [30]. Moreover, it is a time-consuming method that does not ensure the desired size and density of the spots on the surface, and its invasive nature clearly limits measurements on optical materials or in vivo measurements in biomedical applications [31,32]. In contrast, the projection of the speckle pattern has multiple advantages with respect to painted speckle patterns: the speckle pattern can easily be adapted to the experiment’s needs (it can be changed during the trial to achieve higher performance), and it is projected rather than painted onto the surface of analysis. This extends the technique to areas or experiments in which it is not possible to paint the speckle onto the surface of a structure: optical surfaces (such as reflecting mirrors and lenses, solar cells and glass structures [33]) or biomedical applications (e.g., three-dimensional analysis of corneal biomechanics and intraocular pressure [34]). Moreover, it should be mentioned that vibration analysis fits very well with speckle pattern projection in 3D-DIC since the displacement of the vibration modes always occurs out-of-plane [35,36,37,38].
In the last few years, two main directions have been described in the state-of-the-art for the development of non-invasive projected patterns: fringe patterns and laser speckle patterns. Fringe patterns have been combined with speckle patterns and a single camera for two-dimensional DIC in mechanical trials that required a low image acquisition rate [39,40]. The random granular effect in the laser speckle pattern originates in the high coherence of the light source projected onto the sample and in the decorrelation that occurs when the sample moves in the out-of-plane direction. However, this technique entailed some increase in the uncertainty of the out-of-plane displacements [41]. In addition to the speckle pattern, different studies have described other factors that affect the performance and accuracy of DIC: the criterion used to evaluate the similarity between subsets [42,43], the iterative algorithm used to process the information [44], the sub-pixel interpolation scheme [45,46], the quality and resolution of the images [47], the speckle pattern quality [48] and the subset size [24].
This study proposes a custom-developed 3D-DIC methodology based on speckle pattern projection and image processing routines for full-field vibrational analysis at high frequency. The potential of the technique has been demonstrated on an Unmanned Aerial Vehicle (UAV) wing made of composite material, followed by experimental validation with reference accelerometers. The vibrational parameters, such as 3D displacement fields, natural frequencies and modal shapes, have been obtained and studied to demonstrate the capability of the developed methodology.

2. Materials and Methods

2.1. Projected 3D-DIC: Experimental Set-Up

Figure 1 illustrates the custom experimental set-up. The system comprises an optical channel for speckle pattern projection and two high-speed sensors. The optical channel consists of an on-axis illumination module (broadband halogen fiber-optic illuminator, OSL2, Thorlabs, Bergkirchen, Germany), which emits stable power in the spectral range between 185 and 2000 nm, and the optical elements required to collimate the beam and generate a uniform irradiance across the sample: two converging lenses and a speckle pattern printed on acetate. The off-axis imaging channel consists of two high-speed, high-resolution cameras (Mini AX100, Photron, Bucks, UK; 1024 × 1024 pixels at 4000 fps; pixel size: 20 µm; accuracy: 0.01 pixel), each provided with a 150 mm focal length objective lens (Irix 150 mm f/2.8, Irixlens, Switzerland) and mounted on an X–Y translation stage with a high-precision 360-degree rotation mount (PR01, Thorlabs, Bergkirchen, Germany).

2.2. Projected 3D-DIC: Fundamentals

For high-frequency vibration analysis in projected 3D-DIC, the grey intensity of the reference image (unloaded state) is compared with that of the high-speed images acquired under deformation (loaded state). However, DIC is not applied directly to the full reference and deformed images; the acquired images are processed and divided into subsets (Figure 2).
Figure 3 describes the developed procedure for structural characterization using projected 3D-DIC. First, before obtaining the calibration images, the high-speed sensors, the optical channel for speckle pattern projection and the sample must be perfectly aligned and uniformly illuminated for 3D-DIC applications. The high-precision rotation stage ensures an accurate orientation of the cameras at 30 degrees with respect to the sample for stereo imaging. This configuration resulted in a scale of 0.5 mm/pixel. Once the speckle pattern is focused, image acquisition can be performed for both the unloaded and loaded states. Then, image processing algorithms and routines were developed to obtain the 3D displacement field, including: (i) the estimation of the calibration and distortion parameters, (ii) the alignment algorithm to obtain the displacement between subsets (2D-DIC) and (iii) the 3D reconstruction.

2.2.1. Camera Calibration Parameters

The calibration of the high-speed cameras was performed in two different steps: (i) data acquisition for the calibration; and (ii) calculation of the 3D coordinates through an iterative process that requires the estimation of intrinsic (the optical characteristics of the sensor) and extrinsic (position and orientation of the sensor in the set-up) parameters. Figure 4 shows the scheme of the coordinate system. To calibrate a single camera, the method developed by Zhang [49] was implemented, where a set of images of a chessboard with different positions and orientations covering the largest possible area of the analysis region was required. The relationship between 3D points and planar points is defined by Equation (1), using Zhang’s notation.
A 3D point is defined by [X Y Z 1]^T and its coordinates in the image plane are given by [u v 1]^T. The matrix constituted by the intrinsic parameter set is denoted by A. In this matrix, u_0 and v_0 are the coordinates of the principal point resulting from the intersection of the image plane with the optical axis, which is represented by C in Figure 4. Coefficients α and β are the focal lengths along the image axes and γ is the skew coefficient. The extrinsic parameter set is constituted by the rotation matrix R and the translation vector T between the world coordinate system and the camera system. Finally, λ is an arbitrary scale factor.
$$\lambda \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = A\,[R \mid T] \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix} = \begin{bmatrix} \alpha & \gamma & u_0 \\ 0 & \beta & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} r_{1,1} & r_{1,2} & r_{1,3} & t_1 \\ r_{2,1} & r_{2,2} & r_{2,3} & t_2 \\ r_{3,1} & r_{3,2} & r_{3,3} & t_3 \end{bmatrix} \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix} \quad (1)$$
The following items describe the process of solving the single-camera calibration problem:
  • A set of thirty images of the chessboard with different orientations was recorded. The coordinates of each corner ([u v 1]^T) were obtained and associated with the corresponding three-dimensional point [X Y Z 1]^T;
  • The analytical solution of Equation (1) was calculated using Zhang’s method;
  • A nonlinear optimization based on the maximum likelihood criterion was developed. The procedure followed is explained in detail in Appendix A.
The two high-speed cameras were synchronized to acquire the images simultaneously for stereoscopic calibration and measurements. Therefore, the rotation matrix R and the translation vector T between the two cameras were obtained from the pairs of calibration images; each pair of images corresponds to the same chessboard scene. The nonlinear iterative procedure for the stereoscopic calibration is described in Appendix B.
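As an illustration of the two calibration steps described above, the following minimal sketch uses OpenCV's off-the-shelf implementation of Zhang's method rather than the authors' custom routines; the file names, chessboard size and square size are assumptions made for the example.

```python
# Minimal single- and stereo-calibration sketch (illustrative, not the paper's code).
import glob
import cv2
import numpy as np

board_size = (7, 6)     # inner corners of the chessboard (assumed)
square_size = 20.0      # assumed square size in mm

# 3D object points of the chessboard corners in its own plane (Z = 0)
objp = np.zeros((board_size[0] * board_size[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:board_size[0], 0:board_size[1]].T.reshape(-1, 2) * square_size

obj_pts, img_pts_l, img_pts_r = [], [], []
for fl, fr in zip(sorted(glob.glob("left_*.png")), sorted(glob.glob("right_*.png"))):
    gl = cv2.imread(fl, cv2.IMREAD_GRAYSCALE)
    gr = cv2.imread(fr, cv2.IMREAD_GRAYSCALE)
    ok_l, corners_l = cv2.findChessboardCorners(gl, board_size)
    ok_r, corners_r = cv2.findChessboardCorners(gr, board_size)
    if ok_l and ok_r:
        obj_pts.append(objp)
        img_pts_l.append(corners_l)
        img_pts_r.append(corners_r)

# Step (i): single-camera calibration (intrinsics A and distortion k) for each camera
_, A_l, k_l, _, _ = cv2.calibrateCamera(obj_pts, img_pts_l, gl.shape[::-1], None, None)
_, A_r, k_r, _, _ = cv2.calibrateCamera(obj_pts, img_pts_r, gr.shape[::-1], None, None)

# Step (ii): stereo calibration, yielding the rotation R and translation T between cameras
_, A_l, k_l, A_r, k_r, R, T, _, _ = cv2.stereoCalibrate(
    obj_pts, img_pts_l, img_pts_r, A_l, k_l, A_r, k_r, gl.shape[::-1],
    flags=cv2.CALIB_FIX_INTRINSIC)
print("R =", R, "\nT =", T.ravel())
```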

2.2.2. Distortion Correction

Nonlinear distortion is inherent to imaging systems and comprises radial distortion [50] and tangential distortion [51]. The undistorted image coordinates (ũ, ṽ) are expressed by Equations (2) and (3), where x and y are the coordinates with respect to the principal point of the image plane (C in Figure 4), k_i are the distortion coefficients and r is given by $r = \sqrt{x^2 + y^2}$.
$$\tilde{u} = u + x\,\underbrace{\left( k_1 r^2 + k_2 r^4 + k_3 r^6 \right)}_{\text{radial}} + \underbrace{2 k_4 x y + k_5 \left( r^2 + 2 x^2 \right)}_{\text{tangential}} \quad (2)$$
$$\tilde{v} = v + y\,\underbrace{\left( k_1 r^2 + k_2 r^4 + k_3 r^6 \right)}_{\text{radial}} + \underbrace{k_4 \left( r^2 + 2 y^2 \right) + 2 k_5 x y}_{\text{tangential}} \quad (3)$$
Both Equations (2) and (3) have been integrated into the calibration estimation procedure, in particular into the nonlinear optimization of the single-camera calibration and into the nonlinear optimization employed in the stereoscopic calibration.
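For clarity, a short sketch of how the correction of Equations (2) and (3) can be applied numerically is given below; it assumes, as in the text, that x and y are pixel coordinates taken relative to the principal point, and the function and variable names are illustrative only.

```python
import numpy as np

def correct_distortion(u, v, u0, v0, k):
    """Correction of Equations (2)-(3): returns (u_tilde, v_tilde) from the
    measured pixel coordinates (u, v).

    u0, v0 : principal point; k = (k1, k2, k3, k4, k5), with k1-k3 the radial
    and k4-k5 the tangential coefficients. Works element-wise on NumPy arrays."""
    x, y = u - u0, v - v0          # coordinates relative to the principal point
    r2 = x**2 + y**2               # r^2 = x^2 + y^2
    radial = k[0] * r2 + k[1] * r2**2 + k[2] * r2**3
    u_t = u + x * radial + 2.0 * k[3] * x * y + k[4] * (r2 + 2.0 * x**2)
    v_t = v + y * radial + k[3] * (r2 + 2.0 * y**2) + 2.0 * k[4] * x * y
    return u_t, v_t
```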

2.2.3. Alignment Algorithm

The developed routine to obtain the displacement between two subsets is based on the inverse compositional Gauss–Newton algorithm (IC-GN) [15,52] and consists of the alignment of a template (deformed) image to a target image. For that, the grey-level intensity of each reference subset is correlated with the grey-level intensity of the corresponding template subset. The mathematical expression of the IC-GN algorithm based on the zero-mean normalized sum of squared difference (ZNSSD) correlation criterion is shown in Equations (4) to (11), in which the functions F and G represent the subset of the reference image and the subset of the deformed image, respectively. The term H corresponds to the Hessian matrix calculated by means of Equation (8), p is the pre-computed deformation parameter vector, ξ denotes the local coordinates of a pixel point within each subset and W is the warp function given by Equation (9).
The working principle of the IC-GN algorithm is represented in Figure 5. The initial estimation of p = [u_0, 0, 0, v_0, 0, 0] is usually calculated by directly applying the ZNSSD criterion or an equivalent criterion, such as the zero-normalized cross-correlation (ZNCC). This provides an initial estimation of the u and v components of the displacement vector between the reference subset and the target subset. After that, an iterative loop is performed until the convergence criterion is achieved (represented in Figure 5).
$$C_{ZNSSD}(\Delta p) = \sum_{\xi}\left\{ \frac{F\left(x + W(\xi;\Delta p)\right) - \bar{F}}{\Delta F} - \frac{G\left(x + W(\xi;p)\right) - \bar{G}}{\Delta G} \right\}^2 \quad (4)$$
$$\Delta F = \sqrt{\sum_{\xi}\left[ F\left(x + W(\xi;\Delta p)\right) - \bar{F} \right]^2} \quad (5)$$
$$\Delta G = \sqrt{\sum_{\xi}\left[ G\left(x + W(\xi;p)\right) - \bar{G} \right]^2} \quad (6)$$
$$\Delta p = -H^{-1} \sum_{\xi}\left[ \nabla F \,\frac{\partial W}{\partial p} \right]^{T}\left[ F(x + \xi) - \bar{F} - \frac{\Delta F}{\Delta G}\left( G\left(x + W(\xi;p)\right) - \bar{G} \right) \right] \quad (7)$$
$$H = \sum_{\xi}\left[ \nabla F \,\frac{\partial W}{\partial p} \right]^{T}\left[ \nabla F \,\frac{\partial W}{\partial p} \right] \quad (8)$$
$$W(\xi;p) = \begin{bmatrix} (1 + u_x)\Delta x + u_y \Delta y + u \\ v_x \Delta x + (1 + v_y)\Delta y + v \\ 1 \end{bmatrix} \quad (9)$$
$$p = \left[ u, u_x, u_y, v, v_x, v_y \right]^{T} \quad (10)$$
$$\xi = \left[ \Delta x, \Delta y, 1 \right]^{T} \quad (11)$$
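As an illustration of the initial guess step mentioned above (not the authors' code), the following sketch estimates the integer-pixel displacement of a single subset by ZNCC before the IC-GN refinement; the subset and search-window sizes, as well as all names, are assumptions made for the example.

```python
import numpy as np

def zncc_initial_guess(ref_img, def_img, center, subset=30, search=20):
    """Integer-pixel initial guess (u0, v0) for the IC-GN iterations via
    zero-normalized cross-correlation (ZNCC)."""
    cy, cx = center
    h = subset // 2
    f = ref_img[cy - h:cy + h, cx - h:cx + h].astype(float)
    fz = f - f.mean()
    nf = np.linalg.norm(fz)
    if nf == 0:
        raise ValueError("empty reference subset: no speckle contrast")
    best_score, best_uv = -np.inf, (0, 0)
    for dv in range(-search, search + 1):
        for du in range(-search, search + 1):
            g = def_img[cy + dv - h:cy + dv + h, cx + du - h:cx + du + h].astype(float)
            if g.shape != f.shape:      # subset falls outside the image
                continue
            gz = g - g.mean()
            ng = np.linalg.norm(gz)
            if ng == 0:                 # empty target subset: correlation undefined
                continue
            score = float(np.sum(fz * gz) / (nf * ng))
            if score > best_score:
                best_score, best_uv = score, (du, dv)
    return best_uv, best_score
```

A low best score (or an empty subset) is exactly the failure mode discussed later for small subset sizes, where the returned displacement becomes unreliable.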

2.2.4. Three-Dimensional Reconstruction

The three-dimensional reconstruction, or triangulation, is the reconstruction of a 3D point from its 2D projections in two or more cameras. The least squares method (LSM) [53] was implemented because of its accuracy and computational efficiency. The relationships between the left image coordinates (u_l, v_l), the right image coordinates (u_r, v_r) and the world coordinate system (X, Y, Z) are indicated in Equations (12) and (13), respectively [53]. Combining both equations, the world coordinates can be expressed by Equation (14), where b is given by Equation (15) and M⁺ is the left pseudo-inverse (M⁺ = (MᵀM)⁻¹Mᵀ) of the matrix M defined in Equation (16).
$$\begin{bmatrix} u_l \\ v_l \end{bmatrix} = \begin{bmatrix} \alpha_l & \gamma_l & u_{0l} \\ 0 & \beta_l & v_{0l} \end{bmatrix} \begin{bmatrix} X/Z \\ Y/Z \\ 1 \end{bmatrix} \quad (12)$$
$$\begin{bmatrix} u_r \\ v_r \end{bmatrix} = \begin{bmatrix} \alpha_r & \gamma_r & u_{0r} \\ 0 & \beta_r & v_{0r} \end{bmatrix} \begin{bmatrix} \dfrac{r_{1,1}X + r_{1,2}Y + r_{1,3}Z + t_1}{r_{3,1}X + r_{3,2}Y + r_{3,3}Z + t_3} \\ \dfrac{r_{2,1}X + r_{2,2}Y + r_{2,3}Z + t_2}{r_{3,1}X + r_{3,2}Y + r_{3,3}Z + t_3} \\ 1 \end{bmatrix} \quad (13)$$
$$\begin{bmatrix} X \\ Y \\ Z \end{bmatrix} = M^{+} b \quad (14)$$
$$b = \begin{bmatrix} 0 \\ 0 \\ -\left( t_1\alpha_r + t_2\gamma_r + t_3(u_{0r} - u_r) \right) \\ -\left( t_2\beta_r + t_3(v_{0r} - v_r) \right) \end{bmatrix} \quad (15)$$
$$M = \begin{bmatrix} \alpha_l & \gamma_l & u_{0l} - u_l \\ 0 & \beta_l & v_{0l} - v_l \\ M_{3,1} & M_{3,2} & M_{3,3} \\ M_{4,1} & M_{4,2} & M_{4,3} \end{bmatrix} \quad (16)$$
$$M_{3,j} = r_{1,j}\,\alpha_r + r_{2,j}\,\gamma_r + r_{3,j}\left( u_{0r} - u_r \right), \quad j = 1, 2, 3$$
$$M_{4,j} = r_{2,j}\,\beta_r + r_{3,j}\left( v_{0r} - v_r \right), \quad j = 1, 2, 3$$
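The least-squares reconstruction of Equations (12)-(16) can be sketched as follows, assuming the left camera frame coincides with the world frame (as implied by Equation (12)); the function name and argument layout are illustrative only.

```python
import numpy as np

def triangulate_lsm(uv_l, uv_r, A_l, A_r, R, T):
    """Least-squares triangulation of Equations (12)-(16).

    uv_l, uv_r : matched image coordinates in the left/right cameras;
    A_l, A_r   : 3x3 intrinsic matrices [[alpha, gamma, u0], [0, beta, v0], [0, 0, 1]];
    R, T       : rotation and translation from the left to the right camera frame."""
    al, gl_, u0l = A_l[0, 0], A_l[0, 1], A_l[0, 2]
    bl, v0l = A_l[1, 1], A_l[1, 2]
    ar, gr_, u0r = A_r[0, 0], A_r[0, 1], A_r[0, 2]
    br, v0r = A_r[1, 1], A_r[1, 2]
    ul, vl = uv_l
    ur, vr = uv_r
    t1, t2, t3 = np.ravel(T)

    # Build M (Equation (16)) and b (Equation (15))
    M = np.zeros((4, 3))
    M[0] = [al, gl_, u0l - ul]
    M[1] = [0.0, bl, v0l - vl]
    M[2] = ar * R[0] + gr_ * R[1] + (u0r - ur) * R[2]   # M_{3,1..3}
    M[3] = br * R[1] + (v0r - vr) * R[2]                # M_{4,1..3}
    b = np.array([0.0, 0.0,
                  -(t1 * ar + t2 * gr_ + t3 * (u0r - ur)),
                  -(t2 * br + t3 * (v0r - vr))])

    # Left pseudo-inverse solution of Equation (14): [X Y Z]^T = M^+ b
    XYZ, *_ = np.linalg.lstsq(M, b, rcond=None)
    return XYZ
```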

2.3. Projected 3D-DIC: Test and Validation

The structure under testing was the horizontal tail plane (HTP) of the DIANA Unmanned Aerial Vehicle (UAV), with a mass of 530 g and the following dimensions: 455 mm (length) × 265 mm (width). The DIANA UAV was manufactured with HexPly® M56/35%UD194/IMA12K prepreg and was composed of a main skin made of a symmetric laminate [60,–60,0], a longitudinal stringer and reinforcements at the borders made of a symmetric laminate [60,–60,03]. All mechanical joints were made with adhesive. For validation, three accelerometers were placed on the structure (mass of each accelerometer: 8.6 g). Their positions are described in Table 1 (referenced to the upper left vertex) and illustrated in Figure 6. The accelerometers employed were triaxial sensors with a measuring range of 50 g, a typical sensitivity of 100 mV/g and an analysis range of up to 10 kHz. For the series of tests, the UAV wing was installed on a slipping table fixed to an electromagnetic shaker (LDS 406) powered by a PA100E power amplifier. The drive signal, as well as the acquisition of the accelerometer signals, was handled by an LDS Focus II acquisition system. First, a modal identification survey was performed to identify the natural frequencies of the structure; then, the base of the structure was displaced sinusoidally at each natural frequency in order to validate the projected 3D-DIC against the accelerometers.

3. Results

In order to evaluate the potential of the projected 3D-DIC for long-duration random vibration measurements, a series of trials was conducted focusing on two major results: (i) modal characterization, i.e., the modal identification survey to obtain the natural frequencies of the structure inherent to the vibration testing; and (ii) validation of the method by exciting the UAV wing at the detected natural frequencies of the structure. The 3D-DIC results were compared with the accelerometer measurements.

3.1. Modal Characterization

For modal determination, a frequency sweep from 0 to 200 Hz was performed to obtain the most relevant natural frequencies of the structure (accelerometers) by programming a sine test at a rate of 1/8 min with a frequency resolution that increases linearly with the frequency sweep. The accelerometers have a resonance frequency of 10 kHz, which ensures high accuracy in the tested frequency range. The most relevant natural frequencies obtained from the UAV wing were 42.8 Hz (bending), 57.5 Hz (twisting), 83.1 Hz (twisting) and 97.5 Hz (twisting). Hence, the high-speed cameras were programmed for high-frequency sampling (1000 Hz) at maximum resolution (1024 × 1024 pixels). Three different subset sizes were considered: 20 × 20 pixels, 30 × 30 pixels and 40 × 40 pixels. Figure 7 shows the cross-correlation coefficients for each subset size. It should be noted that as the subset size decreases, the probability that the cross-correlation provides erroneous displacements increases, mainly for two reasons: (i) decreasing the subset size increases the magnitude of the residual correlation coefficients, which increases the probability of obtaining wrong displacements; and (ii) the smaller the subset, the greater the probability that a subset is empty, with a low number of speckle points in its interior. If the cross-correlation is applied to an empty subset, the displacement obtained is a random number. For the subset size of 20 × 20 pixels, the cross-correlation residues reached values comparable to the maximum value of the cross-correlation coefficients. The subset sizes of 30 × 30 and 40 × 40 pixels showed reliable results, so the configuration with the smaller subset size (30 × 30 pixels) was used for further analysis.
Figure 8 shows the amplitude of the displacements for each natural frequency. It should be mentioned that the UAV wing was excited at each of these natural frequencies by means of a sinusoidal load at the corresponding frequency. The amplitude of the displacement was calculated using the following expression:
$$D = \sqrt{d_X^2 + d_Y^2 + d_Z^2}$$
where d_X, d_Y and d_Z represent the displacement components and D represents the total displacement. Moreover, the sign of the displacement was taken into account: the parts of the wing with positive displacement deformed in the opposite direction to the areas with negative displacement, and the parts of the wing with null displacement were not deformed. Figure 8 and Videos S1 and S2 show the raw data (unprocessed images of camera 1 and camera 2) and the temporal evolution of the displacements for the first natural frequency (Figure 9 shows a screenshot of each visualization; the videos can be found as Supplementary Materials).
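A minimal sketch of how D and its sign could be evaluated per subset is given below; taking the sign from the out-of-plane component is an assumption made for illustration, since the text only states that the sign of the displacement was taken into account.

```python
import numpy as np

def signed_total_displacement(dX, dY, dZ):
    # Total displacement per subset: D = sqrt(dX^2 + dY^2 + dZ^2)
    D = np.sqrt(dX**2 + dY**2 + dZ**2)
    # Assumed sign convention: taken from the out-of-plane component, so that
    # areas deforming in opposite directions receive opposite signs.
    return np.sign(dZ) * D
```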

3.2. Experimental Validation

The displacements obtained with the projected 3D-DIC for the first vibration mode (42.8 Hz) were compared with the results obtained with the accelerometers. The accelerometer values were normalized with respect to accelerometer 1: the ratio between accelerometers 2 and 1 was 2.75, and the ratio between accelerometers 3 and 1 was 3.25. The displacements obtained with the projected 3D-DIC were also normalized with respect to the displacement at the coordinates of accelerometer 1. For that, an offset was applied to all displacements in order to fix the displacement of accelerometer 1 to 1 pixel. After that, the displacements were contrasted against the ratios of accelerometers 2 and 3. Since the accelerometers may not have been placed exactly at the coordinates indicated in Table 1, a sweep was performed from −1 subset to +1 subset around the position of both accelerometers (accelerometers 2 and 3). The result of this operation is shown in Figure 10. For Figure 10a, the errors ranged from 1.8% (center) to 3.38% (bottom left), while for Figure 10b, the errors ranged from 3.7% (center) to 6.3% (bottom left).

4. Discussion

This study proposed an accurate method for full-field vibration measurement. The method is based on projected 3D-DIC and is capable of non-invasively identifying the full-field mode shapes and natural frequencies of a UAV wing made of composite material, which could potentially benefit structural dynamics and health monitoring applications [54]. Previous studies described fringe patterns [55,56], laser speckle [41,57] or a combination of both [40] to create a non-invasive pattern for full-field vibration measurement in DIC applications. In a recent study, Felipe-Sesé et al. [40] proposed a combination of a fringe pattern, a laser speckle and single-camera DIC to measure the three-dimensional shapes and the associated displacement field of a plate excited by a shaker. The authors validated their results against theoretical predictions and obtained a maximum error near the edges of the structure of 4.95%. Moreover, Pang et al. [58] evaluated the flexural behavior of concrete sleepers with a laser speckle imaging sensor. The authors validated the methodology with foil strain gauges and described a maximum error of 7.15% in their measurements.
In this study, a speckle pattern projection has been chosen not only as an alternative to conventional invasive painted speckle patterns but also to other non-invasive and more complex options (i.e., laser speckle or fringe patterns). Overall, the study holds similarities with those recently reported in the state-of-the-art and adds the novelty of analyzing a composite material at different vibration modes, showing high sensitivity to detect the temporal evolution of the vibration modes. Accurate 3D mode shapes were obtained for different resonances, and the results show errors below 3% and 5% compared with reference accelerometers 2 and 3, respectively. Accelerometer 3 was placed near the edge of the structure, and the discontinuity in this region might be the reason for the reduced accuracy [59,60].
Although the measurements were performed at a sampling frequency of 1000 Hz, one limitation of the study is that the most relevant vibration modes of the UAV structure occurred in the range from 0 to 100 Hz. Another limitation is associated with out-of-plane motion tracking, since the projected speckle pattern does not move rigidly together with the surface of the structure. However, the latter might not be considered a weakness for vibrational analysis, because the excitation of the structure is applied longitudinally on each axis, and the out-of-plane displacement measurements over the whole measured area are fundamental for the evaluation of the vibration modes of the structure. The advantages of the proposed methodology are its high-frequency capacity (up to 4000 Hz under the same configuration) and its great versatility to (i) customize the projected speckle pattern to cancel the residues of the cross-correlation coefficients and (ii) configure the set-up for measurements in a selected spectral range (i.e., near-infrared). This methodology could be directly applied to the vibrational analysis of optical surfaces, such as solar cells, glass structures and lenses, or even to biomechanical analysis of human tissues (e.g., transparent tissues such as the cornea).

5. Conclusions

3D-DIC based on speckle pattern projection is an accurate method to characterize the structural dynamics of a composite material. The proposed method successfully obtained full-field vibration measurements with high spatial resolution and is capable of reliably extracting the shape and intensity, as well as the out-of-plane displacement fields and mode shapes, of the structure.

6. Patents

The authors indicate the following financial disclosure(s): PCT/ES2018/070757, “Device and method for obtaining mechanical, geometric and dynamic measurements of optical surfaces” (P.P.-M., A.F.L.).

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/s22249766/s1, Video S1: Raw data visualization, Video S2: Displacement fields.

Author Contributions

Conceptualization, A.S.J., A.F.L. and P.P.-M.; methodology, A.S.J., A.F.L., M.C.M. and P.P.-M.; software, A.S.J., A.F.L. and P.P.-M.; validation, A.S.J., A.F.L., M.C.M. and P.P.-M.; formal analysis, A.S.J., A.F.L., M.C.M. and P.P.-M.; investigation, A.S.J., A.F.L., M.C.M. and P.P.-M.; resources, A.F.L. and P.P.-M.; data curation, A.S.J., A.F.L., M.C.M. and P.P.-M.; writing—original draft preparation, A.S.J., A.F.L., M.C.M. and P.P.-M.; writing—review and editing, A.S.J., A.F.L., M.C.M. and P.P.-M.; visualization, A.S.J.; supervision, A.F.L. and P.P.-M.; project administration, A.F.L. and P.P.-M.; funding acquisition, A.F.L. and P.P.-M. All authors have read and agreed to the published version of the manuscript.

Funding

This project has received funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Sklodowska-Curie grant agreement No 101028137. This research has also received funding from the national research program Retos de la Sociedad under the Project STARGATE: Desarrollo de un sistema de monitorización estructural basado en un microinterrogador y redes neuronales (reference PID2019-105293RB-C21); FIPSE 3388-18 (Fundación para la Innovación y la Prospectiva en Salud en España) and DTS18/00107 (Instituto de Salud Carlos III, Spanish Government).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

The authors thank Antonio Castelo, Jose Alfonso Endrina and Carlos Molina (Alava Ingenieros, SA) for their technical assistance with the high-speed cameras.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A. Estimation of the Single Camera Calibration Parameters

After acquiring the images of the chessboard calibration pattern, the coordinates of all corner points were obtained through Harris corner detection [61]. Then, the camera calibration parameters were calculated by minimizing the difference between the detected and reprojected two-dimensional corner coordinates by means of Equation (1). The procedure to obtain the solution was explained by Zhang [49]; however, it should be noted that the analytical solution of Zhang’s method needs further refinement because of nonlinear distortion. Therefore, in order to minimize the error of the calibration parameters, a nonlinear iterative loop was implemented using the following equations. Figure A1 illustrates the method for the estimation of the single-camera calibration parameters.
$$P(w)^{l+1} = P(w)^{l} + \left\{ 1 - (1 - \eta)^{l+1} \right\} H(w)^{-1} E(w) \quad (\mathrm{A1})$$
$$\rho = \left\{ A, k, R_j, T_j \right\} \quad (\mathrm{A2})$$
$$w = \left\{ \rho, M_{i,j} \right\} \quad (\mathrm{A3})$$
$$F(w) = \hat{m}(w) = \begin{bmatrix} x_p \\ y_p \end{bmatrix} = \begin{bmatrix} \alpha x_u + \gamma y_u + u_0 \\ \beta y_u + v_0 \end{bmatrix} \quad (\mathrm{A4})$$
$$x_u = \frac{X_c}{Z_c}\, a + 2 k_4 \frac{X_c}{Z_c}\frac{Y_c}{Z_c} + k_5 \left( r^2 + 2\frac{X_c^2}{Z_c^2} \right) \quad (\mathrm{A5})$$
$$y_u = \frac{Y_c}{Z_c}\, a + k_4 \left( r^2 + 2\frac{Y_c^2}{Z_c^2} \right) + 2 k_5 \frac{X_c}{Z_c}\frac{Y_c}{Z_c} \quad (\mathrm{A6})$$
$$a = 1 + k_1 r^2 + k_2 r^4 + k_3 r^6 \quad (\mathrm{A7})$$
$$\begin{bmatrix} X_c \\ Y_c \\ Z_c \end{bmatrix} = R \begin{bmatrix} X \\ Y \\ Z \end{bmatrix} + T \quad (\mathrm{A8})$$
$$E(w) = m_{i,j} - \hat{m}(w) \quad (\mathrm{A9})$$
$$H(w) = \begin{bmatrix} \dfrac{\partial^2 F(w)}{\partial w_1 \partial w_1} & \dfrac{\partial^2 F(w)}{\partial w_1 \partial w_2} & \cdots & \dfrac{\partial^2 F(w)}{\partial w_1 \partial w_k} \\ \dfrac{\partial^2 F(w)}{\partial w_2 \partial w_1} & \dfrac{\partial^2 F(w)}{\partial w_2 \partial w_2} & \cdots & \dfrac{\partial^2 F(w)}{\partial w_2 \partial w_k} \\ \vdots & \vdots & \ddots & \vdots \\ \dfrac{\partial^2 F(w)}{\partial w_k \partial w_1} & \dfrac{\partial^2 F(w)}{\partial w_k \partial w_2} & \cdots & \dfrac{\partial^2 F(w)}{\partial w_k \partial w_k} \end{bmatrix} \quad (\mathrm{A10})$$
where,
  • A is the matrix of the intrinsic parameters constituted by α, β, γ, u_0 and v_0, according to Zhang’s nomenclature [49].
  • k is a vector with the radial and tangential distortion parameters.
  • R_j is the rotation matrix corresponding to image j.
  • T_j is the translation vector corresponding to image j.
  • P(w)^l is the vector constituted by the w parameters at iteration l.
  • E(w) is a vector with the error estimation of each corner.
  • H(w)^−1 is the inverse of the Hessian matrix defined in Equation (A10).
  • η is the smoothing factor.
  • m_{i,j} are the two-dimensional corner coordinates of the chessboard, obtained with the Harris corner detector.
  • m̂(w) are the corner coordinates estimated by Equation (A4).
  • X_c, Y_c and Z_c are the three-dimensional coordinates in the camera frame.
  • x_u and y_u are the undistorted coordinates.
  • x_p and y_p are the projected pixel coordinates of the corners.
  • M_{i,j} are the three-dimensional coordinates of the corners.
Figure A1. Iterative refinement of single-camera parameters.
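As an illustration only, the damped update of Equation (A1) can be sketched as a generic Gauss–Newton refinement; the `residual` and `jacobian` callables stand in for the reprojection error E(w) and its derivatives and are assumptions of this example, not the authors' implementation.

```python
import numpy as np

def refine_parameters(P0, residual, jacobian, eta=0.5, n_iter=50, tol=1e-8):
    """Damped iterative refinement in the spirit of Equation (A1):
    P_{l+1} = P_l + {1 - (1 - eta)^(l+1)} H^{-1} E,
    with H approximated from the Jacobian (Gauss-Newton)."""
    P = np.asarray(P0, dtype=float)
    for l in range(n_iter):
        E = residual(P)                    # stacked reprojection errors of all corners
        J = jacobian(P)                    # dE/dP
        H = J.T @ J                        # Gauss-Newton approximation of the Hessian
        step = np.linalg.solve(H, J.T @ E)
        damping = 1.0 - (1.0 - eta) ** (l + 1)   # smoothing factor of Equation (A1)
        P_new = P - damping * step         # minimizes ||E|| with J defined as dE/dP
        if np.linalg.norm(P_new - P) < tol:
            return P_new
        P = P_new
    return P
```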

Appendix B

The stereoscopic calibration parameters are constituted by the rotation matrix from the left camera system to the right camera system (R_rl), which defines the orientation between both cameras, and the translation vector from the left camera system to the right camera system (T_rl). As with the estimation of the single-camera parameters, the solution was calculated by minimizing the difference between the known two-dimensional corner coordinates (Harris corner detector) and the estimated two-dimensional corner coordinates. However, it is important to highlight that, before estimating the stereoscopic parameters, the parameters of each camera must be calculated. As shown in Equation (A11), the number of parameters grows significantly by incorporating the left single-camera parameters (denoted with superscript l), the right single-camera parameters (denoted with superscript r) and the stereoscopic parameters (without superscript). The iterative refinement starts with an initial estimation of the ρ parameters. Most of the parameters were previously calculated by means of the single-camera calibration, and the initial estimations of R_rl and T_rl were calculated from the mean values obtained via Equations (A13) and (A14).
$$\rho = \left\{ A^{l}, A^{r}, k^{l}, k^{r}, R_j^{r}, R_j^{l}, R_{rl}, T_j^{r}, T_j^{l}, T_{rl} \right\} \quad (\mathrm{A11})$$
$$w = \left\{ \rho, M_{i,j}^{r}, M_{i,j}^{l} \right\} \quad (\mathrm{A12})$$
$$R_{rw} = R_{rl}\, R_{lw} \quad (\mathrm{A13})$$
$$T_{rw} = T_{rl} + R_{rl}\, T_{lw} \quad (\mathrm{A14})$$
The iterative loop (see Figure A2) was defined by the following steps:
  • Estimation of the projected pixel coordinates corresponding to the corners of the images (left camera) by using Equation (A4).
  • Estimation of the rotation matrix from the world system to the right camera system (R_rw) by using Equation (A13) and of the translation vector from the world system to the right camera system (T_rw) by means of Equation (A14).
  • Estimation of the projected pixel coordinates corresponding to the corners of the images (right camera) by using Equation (A4).
  • Calculation of the vector error as the difference between the two-dimensional known coordinates and the estimated coordinates of the corners of all the images (both cameras, left and right).
  • Extraction of the hessian matrix by Equation (A10).
  • Update of the parameters by means of Equation (A1).
Figure A2. Iterative refinement of stereoscopic parameters.

References

  1. Ewins, D. Modal Testing: Theory, Practice, and Application; Research Studies Press: Baldock, Hertfordshire, UK, 2000. [Google Scholar]
  2. Fan, W.; Qiao, P. Vibration-based damage identification methods: A review and comparative study. Struct. Health Monit. 2011, 10, 83–111. [Google Scholar] [CrossRef]
  3. Martarelli, M.; Ewins, D.J. Continuous scanning laser Doppler vibrometry and speckle noise occurrence. Mech. Syst. Signal. Process 2006, 20, 2277–2289. [Google Scholar] [CrossRef]
  4. Creath, K.; Slettemoen, G.A. Vibration-observation techniques for digital speckle-pattern interferometry. J. Opt. Soc. Am. 1985, 2, 1629–1636. [Google Scholar] [CrossRef] [Green Version]
  5. Wang, W.C.; Hwang, C.H.; Lin, S.Y. Vibration measurement by the time-averaged electronic speckle pattern interferometry methods. Appl. Opt. 1996, 35, 4502–4509. [Google Scholar] [CrossRef] [PubMed]
  6. Hung, Y.Y. Shearography: A new optical method for strain measurement and nondestructive testing. Opt. Eng. 1982, 21, 213391. [Google Scholar] [CrossRef]
  7. Tay, C.J.; Fu, Y. Determination of curvature and twist by digital shearography and wavelet transforms. Opt. Lett. 2005, 30, 2873–2875. [Google Scholar] [CrossRef] [PubMed]
  8. Baqersad, J.; Poozesh, P.; Niezrecki, C.; Avitabile, P. Photogrammetry and optical methods in structural dynamics—A review. Mech. Syst. Signal. Process. 2017, 86, 17–34. [Google Scholar] [CrossRef]
  9. Beberniss, T.J.; Ehrhardt, D.A. High-speed 3D digital image correlation vibration measurement: Recent advancements and noted limitations. Mech. Syst. Signal. Process. 2017, 86, 35–48. [Google Scholar] [CrossRef]
  10. Helfrick, M.N.; Niezrecki, C.; Avitabile, P.; Schmidt, T. 3D digital image correlation methods for full-field vibration measurement. Mech. Syst. Signal. Process. 2011, 25, 917–927. [Google Scholar] [CrossRef]
  11. Gao, Y.; Cheng, T.; Su, Y.; Xu, X.; Zhang, Y.; Zhang, Q. High-efficiency and high-accuracy digital image correlation for three-dimensional measurement. Opt. Lasers Eng. 2015, 65, 73–80. [Google Scholar] [CrossRef]
  12. Rothberg, S.J.; Allen, M.S.; Castellini, P.; Di Maio, D.; Dirckx, J.J.J.; Ewins, D.J.; Halkon, B.J.; Muyshondt, P.; Paone, N.; Ryan, T.; et al. An international review of laser Doppler vibrometry: Making light work of vibration measurement. Opt. Lasers Eng. 2017, 99, 11–22. [Google Scholar] [CrossRef] [Green Version]
  13. Sheng, Z.; Chen, B.; Hu, W.; Yan, K.; Miao, H.; Zhang, Q.; Yu, Q.; Fu, Y. LDV-induced stroboscopic digital image correlation for high spatial resolution vibration measurement. Opt. Express 2021, 29, 28134–28147. [Google Scholar] [CrossRef] [PubMed]
  14. Bruck, H.; McNeill, S.; Sutton, M.; Peters, W. Digital image correlation using Newton-Raphson method of partial differential correction. Exp. Mech. 1989, 29, 261–267. [Google Scholar] [CrossRef]
  15. Pan, B.; Li, K.; Tong, W. Fast, robust and accurate digital image correlation calculation without redundant computations. Exp. Mech. 2013, 53, 1277–1289. [Google Scholar] [CrossRef]
  16. Pan, B.; Li, K. A fast digital image correlation method for deformation measurement. Opt. Lasers Eng. 2011, 49, 841–847. [Google Scholar] [CrossRef]
  17. Cardenas-Garcia, J.; Yao, H.; Zheng, S. 3D reconstruction of objects using stereo imaging. Opt. Lasers Eng. 1995, 22, 193–213. [Google Scholar] [CrossRef]
  18. Genovese, K.; Casaletto, L.; Rayas, J.; Flores, V.; Martinez, A. Stereo-digital image correlation (DIC) measurements with a single camera using a biprism. Opt. Lasers Eng. 2013, 51, 278–285. [Google Scholar] [CrossRef]
  19. Yao, L.; Ma, L.; Zheng, Z.; Wu, D. A low cost 3D shape measurement method based on a strip shifting pattern. ISA Trans. 2007, 46, 267–275. [Google Scholar] [CrossRef]
  20. Hamzah, R.; Kadmin, A.; Hamid, M.; Ghani, S.; Ibrahim, H. Improvement of stereo matching algorithm for 3D surface reconstruction. Signal Process. Image Commun. 2018, 65, 165–172. [Google Scholar] [CrossRef]
  21. Li, J.; Xie, X.; Yang, G.; Zhang, B.; Siebert, T.; Yang, L. Whole-field thickness strain measurement using multiple camera digital image correlation system. Opt. Lasers Eng. 2017, 90, 19–25. [Google Scholar] [CrossRef]
  22. Liu, X.Y.; Li, R.L.; Zhao, H.W.; Cheng, T.H.; Cui, G.J.; Tan, Q.C.; Meng, G.W. Quality assessment of speckle patterns for digital image correlation by Shannon entropy. Optik 2015, 126, 4206–4211. [Google Scholar] [CrossRef]
  23. Pan, B.; Lu, Z.; Xie, H. Mean intensity gradient: An effective global parameter for quality assessment of the speckle patterns used in digital image correlation. Opt. Lasers Eng. 2010, 48, 469–477. [Google Scholar] [CrossRef]
  24. Pan, B.; Xie, H.; Wang, Z.; Qian, K.; Wang, Z. Study on subset size selection in digital image correlation for speckle patterns. Opt. Express 2008, 16, 7037–7048. [Google Scholar] [CrossRef] [PubMed]
  25. Bomarito, G.; Hochhalter, J.; Ruggles, T.; Cannon, A. Increasing accuracy and precision of digital image correlation through pattern optimization. Opt. Lasers Eng. 2017, 91, 73–85. [Google Scholar] [CrossRef]
  26. Gauvin, C.; Jullien, D.; Doumalin, P.; Dupre, J.; Gril, J. Image correlation to evaluate the influence of hygrothermal loading on wood. Strain 2014, 50, 428–435. [Google Scholar] [CrossRef] [Green Version]
  27. Tung, S.H.; Sui, C.H. Application of digital-image-correlation techniques in analysing cracked cylindrical pipes. Sadhana 2010, 35, 557–567. [Google Scholar] [CrossRef] [Green Version]
  28. Song, J.; Yang, J.; Liu, F.; Lu, K. High temperature strain measurement method by combining digital image correlation of laser speckle and improved RANSAC smoothing algorithm. Opt. Lasers Eng. 2018, 111, 8–18. [Google Scholar] [CrossRef]
  29. Brillaud, J.; Lagattu, F. Limits and possibilities of laser speckle and white-light image-correlation methods: Theory and experiments. Appl. Opt. 2002, 41, 6603–6613. [Google Scholar] [CrossRef]
  30. Yang, X.; Liu, Z.; Xie, H. A real time deformation evaluation method for surface and interface of thermal barrier coatings during 1100 °C thermal shock. Meas. Sci. Technol. 2012, 23, 105604. [Google Scholar] [CrossRef]
  31. Sutton, M.A.; Ke, X.; Lessner, S.M.; Goldbach, M.; Yost, M.; Zhao, F.; Schreier, H.W. Strain field measurements on mouse carotid arteries using microscopic three-dimensional digital image correlation. J. Biomed. Mater. Res. 2008, 84, 178–190. [Google Scholar] [CrossRef]
  32. Hokka, M.; Mirow, N.; Nagel, H.; Irqsusi, M.; Vogt, S.; Kuokkala, V.T. In-vivo deformation measurements of the human heart by 3D digital image correlation. J. Biomech. 2015, 48, 2217–2220. [Google Scholar] [CrossRef] [PubMed]
  33. Bedon, C.; Fasan, M.; Amadio, C. Vibration Analysis and Dynamic Characterization of Structural Glass Elements with Different Restraints Based on Operational Modal Analysis. Buildings 2019, 9, 13. [Google Scholar] [CrossRef] [Green Version]
  34. Pérez-Merino, P.; Bekesi, N.; Fernández-López, A. Intraocular pressure and three-dimensional corneal biomechanics. In Air-Puff Tonometers Challenges and Insights; Koprowski, R., Ed.; IOP Publishing Ltd.: Bristol, UK, 2019. [Google Scholar]
  35. Passieux, J.C.; Navarro, P.; Périé, J.N.; Marguet, S.; Ferrero, J.F. A digital image correlation method for tracking planar motions of rigid spheres: Application to medium velocity impacts. Exp. Mech. 2014, 54, 1453–1466. [Google Scholar] [CrossRef] [Green Version]
  36. Hagara, M.; Hunady, R. The influence of sampling frequency on the results of motion analysis performed by high-speed digital image correlation. Appl. Mech. Mater. 2015, 816, 397–403. [Google Scholar] [CrossRef]
  37. Frankovský, P.; Delyová, I.; Sivák, P.; Bocko, J.; Živčák, J.; Kicko, M. Modal Analysis Using Digital Image Correlation Technique. Materials 2022, 15, 5658. [Google Scholar] [CrossRef]
  38. Reu, P.L.; Rohe, D.P.; Jacobs, L.D. Comparison of DIC and LDV for practical vibration and modal measurements. Mech. Syst. Signal Process. 2017, 86, 2–16. [Google Scholar] [CrossRef] [Green Version]
  39. Molina-Viedma, A.J.; Felipe-Sesé, L.; López-Alba, E.; Díaz, F.A. Comparative of conventional and alternative Digital Image Correlation techniques for 3D modal characterisation. Measurement 2020, 151, 107101. [Google Scholar] [CrossRef]
  40. Felipe-Sesé, L.; Molina-Viedma, A.J.; López-Alba, E.; Díaz, F.A. RGB colour encoding improvement for three-dimensional shapes and displacement measurement using the integration of fringe projection and digital image correlation. Sensors 2018, 18, 3130. [Google Scholar] [CrossRef] [Green Version]
  41. Briers, D.; Duncan, D.D.; Hirst, E.R.; Kirkpatrick, S.J.; Larsson, M.; Steenbergen, W.; Stromberg, T.; Thompson, O.B. Laser speckle contrast imaging: Theoretical and practical limitations. J. Biomed. Opt. 2013, 18, 066018. [Google Scholar] [CrossRef] [Green Version]
  42. Pan, B.; Xie, H.; Wang, Z. Equivalence of digital image correlation criteria for pattern matching. Appl. Opt. 2010, 49, 5501–5509. [Google Scholar] [CrossRef]
  43. Mohammad, H.K.; Asemani, D. Surface defect detection in tiling Industries using digital image processing methods: Analysis and evaluation. ISA Trans. 2014, 53, 834–844. [Google Scholar]
  44. Bing, P.; Xie, H.-M.; Xu, B.-Q.; Dai, F.-L. Performance of sub-pixel registration algorithms. Meas. Sci. Technol. 2006, 17, 1615. [Google Scholar] [CrossRef]
  45. Schreier, H.W.; Braasch, J.R.; Sutton, M.A. Systematic errors in digital image correlation caused by intensity interpolation. Opt. Eng. 2000, 39, 2915–2921. [Google Scholar] [CrossRef] [Green Version]
  46. Yang, Q.; Zhang, Y.; Zhao, T.; Chen, Y. Single image super-resolution using self-optimizing mask via fractional-order gradient interpolation and reconstruction. ISA Trans. 2018, 82, 163–171. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  47. Reu, P.; Sweatt, W.; Miller, T.; Fleming, D. Camera system resolution and its influence on digital image correlation. Exp. Mech. 2015, 55, 9–25. [Google Scholar] [CrossRef]
  48. Park, J.; Yoon, S.; Kwon, T.H.; Park, K. Assessment of speckle-pattern quality in digital image correlation based on gray intensity and speckle morphology. Opt. Lasers Eng. 2017, 91, 62–72. [Google Scholar] [CrossRef]
  49. Zhang, Z. A flexible new technique for camera calibration. IEEE Trans. Pattern Anal. Mach. Intell. 2000, 22, 1330–1334. [Google Scholar] [CrossRef] [Green Version]
  50. Ma, L.; Chen, Y.; Moore, K. Rational radial distortion models with analytical undistortion formulae. arXiv 2003, arXiv:cs/0307047. [Google Scholar]
  51. Heikkila, J.; Silvén, O. A four-step camera calibration procedure with implicit image correction. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, San Juan, PR, USA, 17–19 June 1997; pp. 1106–1112. [Google Scholar]
  52. Wu, R.; Kong, C.; Li, K.; Zhang, D. Real-time digital image correlation for dynamic strain measurement. Exp. Mech. 2016, 56, 833–843. [Google Scholar] [CrossRef]
  53. Zhong, F.; Shao, X.; Quan, C. A comparative study of 3D reconstruction methods in stereo digital image correlation. Opt. Lasers Eng. 2019, 122, 142–150. [Google Scholar] [CrossRef]
  54. Mousa, M.A.; Yussof, M.M.; Udi, U.J.; Nazri, F.M.; Kamarudin, M.K.; Parke, G.A.R.; Assi, L.N.; Ghahari, S.A. Application of Digital Image Correlation in Structural Health Monitoring of Bridge Infrastructures: A Review. Infrastructures 2021, 6, 176. [Google Scholar] [CrossRef]
  55. Zhang, S. High-Speed 3D Imaging with Digital Fringe Projection Techniques; CRC Press: Boca Raton, FL, USA, 2018. [Google Scholar]
  56. Felipe-Sesé, L.; Molina-Viedma, A.J.; Díaz, F.A. FP+DIC for low-cost 3D full-field experimental modal analysis in industrial components. Mech. Syst. Signal Process. 2019, 128, 329–339. [Google Scholar] [CrossRef]
  57. Bruno, L.; Panotta, L.; Poggialini, A. Laser speckle decorrelation in NDT. Opt. Lasers Eng. 2000, 34, 55–65. [Google Scholar] [CrossRef]
  58. Pang, Y.; Lingamanaik, S.; Chen, B.K.; Yu, S.F. Measurement of deformation of the concrete sleepers under different support conditions using non-contact laser speckle imaging sensor. Eng. Struct. 2020, 205, 110054. [Google Scholar] [CrossRef]
  59. Hassan, G.M. Deformation measurement in the presence of discontinuities with digital image correlation: A review. Opt. Lasers Eng. 2021, 137, 106394. [Google Scholar] [CrossRef]
  60. Hassan, G.M.; Dyskin, A.V.; MacNish, C.K.; Pasternak, E.; Shufrin, I. Discontinuous digital image correlation to reconstruct displacement and strain fields with discontinuities: Dislocation approach. Eng. Fract. Mech. 2018, 189, 273–292. [Google Scholar] [CrossRef]
  61. Harris, C.; Stephens, M. A Combined Corner and Edge Detector. In Proceedings of the 4th Alvey Vision Conference, Manchester, UK, 31 August–2 September 1988; pp. 147–151. [Google Scholar]
Figure 1. Experimental custom 3D-DIC set-up based on speckle pattern projection for full-field vibration measurements: (a) schematic layout of the experimental set-up and testing conditions (UAV wing made of composite material (test sample) clamped to the shaker); (b) set-up employed in the vibration test; (c) image of the UAV wing fixed to the electromagnetic shaker (lateral excitation); (d) optical channel for speckle pattern projection and high-speed sensors (distance of the tested sample during the measurements: 3.75 m; instantaneous field of view of the sensor: 0.5 mm).
Figure 2. Schematic representation of DIC based on the correlation between the reference subset (a) and the target subset (b). The variables u and v represent the displacement experienced by the subset in the deformed image with respect to the reference image.
Figure 3. Structural characterization with the projected 3D-DIC configuration (schematic illustration).
Figure 4. Camera coordinate system composed of the camera systems of both cameras: camera 1 (left) and camera 2 (right). The relation between both coordinate systems is given by a rotation matrix ( R ) and a translation vector ( T ).
Figure 5. Scheme of the principle of the inverse compositional Gauss–Newton algorithm.
Figure 6. Unmanned Aerial Vehicle (UAV) wing made of composite material: (a) Isometric view and (b) dimensions and accelerometer locations (Accelerometer 1: Acel 1; Accelerometer 2: Acel 2; Accelerometer 3: Acel 3). Mass of the tested system: 530 g (UAV wing) + 140 g (steel adaptor) + 25.8 g (3 accelerometers).
Figure 7. Distribution of the cross-correlation coefficients for three different subset sizes: (a) 20 × 20 subset, (b) 30 × 30 subset and (c) 40 × 40 subset.
Figure 8. Static representations of the vibration mode for the first relevant natural frequencies from two different perspective points: (a) displacement field for 42.8 Hz, (b) displacement field for 57.5 Hz, (c) displacement field for 83.1 Hz and (d) displacement field for 97.5 Hz. The color bars represent the displacement field (pixels).
Figure 9. Screenshots of supplementary videos: (a) raw data acquisition (Video S1) and (b) the displacement field corresponding to the post-processing of raw data from two different perspective points (Video S2). The color bars represent the displacement field (mm).
Figure 10. Percentage of error as a result of contrasting the ratio of the accelerometers with respect to the displacements calculated by the 3D-DIC technique in a surface range of −1 to +1 subsets around the accelerometer: (a) accelerometer 2; (b) accelerometer 3. The color bars represent the percentage of error.
Table 1. Coordinates of the accelerometers.
Accelerometers      Coordinates (mm)    Coordinates (Subsets)
Accelerometer 1     [205, 330]          [11, 21]
Accelerometer 2     [105, 100]          [7, 7]
Accelerometer 3     [60, 140]           [4, 11]
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
