Article

CNN–Aided Optical Fiber Distributed Acoustic Sensing for Early Detection of Red Palm Weevil: A Field Experiment †

1 Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division, King Abdullah University of Science and Technology (KAUST), Thuwal 23955-6900, Saudi Arabia
2 Department of Electronic and Information Engineering, The Hong Kong Polytechnic University, Hong Kong SAR, China
3 Zhongshan Institute of Changchun University of Science and Technology, Zhongshan 528400, China
4 Center of Date Palms and Dates, Ministry of Environment, Water and Agriculture, Al-Hassa 31982, Saudi Arabia
* Author to whom correspondence should be addressed.
This paper is an extended version of our paper published in Ashry, I.; Mao, Y.; Wang, B.; Sait, M.; Guo, Y.; Al-Shawaf, A.; Ng, T.K.; Ooi, B.S. CNN-based detection of red palm weevil using optical-fiber-distributed acoustic sensing. In Proceedings of the Photonic Instrumentation Engineering IX, San Francisco, CA, USA, 22–27 January 2022; SPIE: Bellingham, WA, USA, 2022; Volume 12008, p. 120080U-1.
Sensors 2022, 22(17), 6491; https://doi.org/10.3390/s22176491
Submission received: 27 July 2022 / Revised: 21 August 2022 / Accepted: 23 August 2022 / Published: 29 August 2022
(This article belongs to the Special Issue Distributed Optical Fiber Sensors: Applications and Technology)

Abstract

Red palm weevil (RPW) is a harmful pest that destroys many date, coconut, and oil palm plantations worldwide. Applying curative methods to trees infested with RPW is not difficult; the early detection of RPW, however, remains a major challenge, especially on large farms. In a controlled environment and on an outdoor farm, we report on the integration of optical fiber distributed acoustic sensing (DAS) and machine learning (ML) for the early detection of true weevil larvae less than three weeks old. Specifically, temporal and spectral data recorded with the DAS system and processed with a 100–800 Hz band-pass filter are used to train convolutional neural network (CNN) models, which distinguish between “infested” and “healthy” signals with a classification accuracy of ∼97%. In addition, a strict ML-based classification approach is introduced that improves the false alarm performance metric of the system by ∼20%. In a controlled environment experiment, the highest infestation alarm counts of the infested and healthy trees are 1131 and 22, respectively, highlighting our system’s ability to distinguish between infested and healthy trees. On an outdoor farm, in contrast, the acoustic noise produced by wind is the major source of false alarms in our system. The best performance of our sensor is obtained when wind speeds are less than 9 mph. In a representative outdoor experiment under this condition, the highest infestation alarm counts of the infested and healthy trees are 1622 and 94, respectively.

1. Introduction

The red palm weevil (RPW), Rhynchophorus ferrugineus (Olivier), is one of the world’s major invasive pest species, attacking date, coconut, ornamental, and oil palms in a variety of agricultural ecosystems worldwide [1,2]. In the past four decades, the RPW has spread rapidly and has been detected in more than 60 countries across the Mediterranean, North Africa, the Middle East, and parts of the Caribbean and Central America [1,3]. This plague has a significant social and economic impact on the date palm industry and the livelihoods of farmers in the affected areas [4,5]. The RPW causes economic losses estimated at millions of USD annually, whether through lost production or pest control costs. In Italy, Spain, and France, for example, the combined cost of RPW control and losses is expected to reach about $235 million by 2023, unless a strict containment program is implemented [1].
Treating RPW-infested trees by chemical injection [6], for example, is straightforward and effective; detecting the RPW threat at an early stage, however, is challenging. Since RPW larvae feed internally in tree trunks, they are difficult to detect in palm groves before the tree shows visible signs of distress at a well-advanced infestation stage, when the tree is difficult to save by treatment [7]. In the literature, sniffing dogs [8], electronic noses [9], X-ray-based tomography [10], and thermal imaging [11] show promising results for the early detection of RPW; however, their slow scanning processes make them impractical on large farms. For large-scale implementation, in contrast, the most promising early detection methods rely on acoustic sensors that identify the gnawing sounds of RPW larvae chewing on the core of a palm trunk [12,13,14]. Current acoustic detection methods implant acoustic probes into individual tree trunks and construct a wireless network to communicate with the sensors [13]. These methods suffer from the following drawbacks: (1) assigning an acoustic probe to each tree is not cost-effective, especially for a large farm with hundreds of trees; (2) the detection provides point sensing at the location of the inserted acoustic probe, so the sensor cannot monitor the entire tree trunk with uniform sensitivity; and (3) acoustic probes are invasive and may damage trees or create nests for insects.
For the early detection of RPW, we recently introduced the use of an optical fiber distributed acoustic sensor (DAS), designed using the phase-sensitive optical time-domain reflectometer (Φ-OTDR) [12,15,16]. The original approach is described in [12], where, starting at a DAS interrogation unit, a single optical fiber cable is extended and wound non-invasively around tree trunks, making it possible to monitor a vast farm in a short time. Compared to the point sensing offered by acoustic probes, the optical fiber DAS can provide distributed monitoring of many trees, and also along the trunk of each tree. However, in [12], the distinction between healthy and infested trees is based on a simple signal processing method (signal-to-noise ratio (SNR) measurement), which is difficult to rely on in an outdoor farm with various noise sources. In [15], we then presented the use of neural network-based machine learning (ML) algorithms as powerful tools for classifying healthy and infested trees using the data recorded by an optical fiber DAS. However, that work was carried out in a laboratory environment using an artificial sound of RPW larvae, produced by a loudspeaker implanted within a tree. Finally, in [16], we extended the aforementioned work to use the ML-assisted optical fiber DAS to detect true weevil larvae in a well-controlled environment.
Here, we substantially extend our aforementioned work to use a convolutional neural network (CNN)-aided optical fiber DAS to recognize healthy and truly RPW-infested trees on an outdoor farm. The overall sensing approach is presented in Figure 1, where the optical fiber DAS unit records and processes acoustic signals from individual trees on the farm. The processed data are then passed to the trained CNN model, which distinguishes healthy and infested trees. Training, validation, and testing of the CNN model are performed using acoustic temporal/spectral “infested” signals (from trees infested with 2–3-week-old RPW larvae) and “healthy” signals (from healthy trees placed in calm or noisy environments). Additionally, we discuss the limitations of using the designed sensor outdoors. To the best of our knowledge, no such deployment of an ML-assisted optical fiber DAS for RPW detection on an outdoor farm has been previously conducted. Integrating ML with optical fiber DAS to detect the true sound of RPW larvae, especially on outdoor farms, would be very useful for controlling the spread of RPW infestation, and this work adds an important step toward designing a practical RPW detection sensor.

2. Experimental Setup

The Φ-OTDR-based optical fiber DAS used for the detection of RPW is schematically shown in Figure 2a [17], where a narrow linewidth laser produces continuous wave (CW) light of a 1550-nm wavelength, a 40-mW optical power, and a 100-Hz linewidth. Using an acousto-optic modulator (AOM), the CW light is modulated into optical pulses of a 5-kHz repetition rate and a 50-ns width (∼5-m spatial resolution DAS). Next, the optical pulses are amplified with an erbium-doped fiber amplifier (EDFA) and then injected through a circulator into a standard single-mode fiber (SMF) of ∼1-km length. The SMF is extended throughout the farm, and we loop a ∼5-m fiber section around each tree trunk. We further add a layer of plastic wrap over the fiber section to reinforce the fiber attachment to the tree and to mitigate the impact of environmental acoustic noise. The backscattered Rayleigh signal from the SMF is directed via the circulator toward another EDFA for power amplification, and the amplified spontaneous emission (ASE) noise of the EDFA is discarded using a fiber Bragg grating (FBG). Finally, the filtered Rayleigh signal is detected by a photodetector (PD) and sampled by a digitizer. The design of the optical fiber DAS system is conventional and was initially described in [18,19]; the combination of the DAS system with ML for the early detection of RPW outdoors, however, is new and significantly beneficial.
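For reference, the quoted ∼5-m spatial resolution follows from the 50-ns pulse width via the standard Φ-OTDR relation; the effective group index n ≈ 1.468 used below is a typical value for standard SMF, assumed here rather than stated in the text:

```latex
\Delta z = \frac{c\,\tau}{2n}
         = \frac{(3\times10^{8}\,\mathrm{m/s})\,(50\times10^{-9}\,\mathrm{s})}{2\times1.468}
         \approx 5.1\,\mathrm{m}
```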
Figure 2b shows an example of a Rayleigh trace recorded along the ∼1-km SMF. The high-power signal at the beginning of the SMF is common and is caused by the Fresnel reflection from the front facet of the SMF. In the ideal scenario, when the refractive index is unperturbed along the optical fiber, the subsequent temporal Rayleigh traces along the fiber should be identical [17,20]. Thus, the differential signal between each subsequent temporal Rayleigh trace and an initial reference one should ideally be zero along the entire fiber. When weevil larvae chew on a tree trunk, their eating sound perturbs the refractive index of the SMF, which alters the Rayleigh intensity only at the site of the infested tree. Applying the normalized differential method [21] and the fast Fourier transform (FFT) to the subsequent Rayleigh traces yields the temporal and spectral acoustic signals along the optical fiber, respectively.
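A minimal sketch of this processing chain is given below. The simple subtract-and-normalize step is only an illustration; the exact normalization of the method in [21] may differ, and all array shapes and names are assumptions:

```python
import numpy as np

def acoustic_signal(traces, fs=5_000):
    """Recover temporal/spectral acoustic signals along the fiber.

    traces: Rayleigh intensity, shape (n_traces, n_spatial_points),
    one trace per probe pulse at the 5 kHz repetition rate.
    """
    reference = traces[0]                                   # initial reference trace
    temporal = (traces - reference) / (reference + 1e-12)   # differential signal vs. time
    # One-sided FFT along the time axis gives the spectrum at each fiber location.
    spectral = np.abs(np.fft.rfft(temporal, axis=0))
    freqs = np.fft.rfftfreq(temporal.shape[0], d=1.0 / fs)
    return temporal, spectral, freqs
```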

3. Classifying “Infested” and “Healthy” Acoustic Signals Using CNNs

In general, neural networks provide high efficiency in image classification [22]. Recently, more advanced methods, such as integrating principal component analysis (PCA) with local binary patterns (LBP) [23] and mathematical morphology spectrum entropy [24], have been used to improve the accuracy and generalization ability of hyperspectral image classification and signal feature extraction, respectively. CNN architectures, in particular, can handle a large amount of data, similar to that produced by the optical fiber DAS, and at the same time can reveal patterns associated with the larvae eating sound [15]. In this section, we compare the efficiencies of classifying “infested” and “healthy” acoustic signals when using the DAS temporal and spectral data as separate inputs to CNN architectures. In addition, to reduce the sensor false alarm rate, we present an approach for integrating the classification results generated from the temporal and spectral data.
In terms of data organization and labeling for the CNN architectures, the spatial sampling of the digitizer is ∼0.5 m and we wind a ∼5-m fiber section around each tree trunk; thus, the fiber around each tree trunk is represented by 10 spatial points. For each spatial point on the tree trunk, a digitizer reading lasts for a 100-ms period, corresponding to 500 temporal measurements at the 5-kHz pulse repetition rate. Since CNNs have proven highly effective at classifying images [22], we organize the temporal data into a 2D matrix (10 spatial points × 500 temporal measurements). Similarly, the spectral data are organized as a (10 spatial points × 250 spectral components) 2D matrix, obtained by applying the FFT to the temporal data of each spatial point.
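To make the image construction concrete, the sketch below slices one 100-ms window for one tree into the two CNN inputs; the slicing convention and the choice to keep the first 250 positive-frequency FFT bins are assumptions consistent with the stated matrix sizes:

```python
import numpy as np

def make_examples(temporal, tree_start):
    """Build the temporal and spectral CNN input images for one tree.

    temporal: acoustic signal, shape (n_time, n_spatial_points);
    tree_start: index of the first of the 10 points wound on the trunk.
    """
    # 10 spatial points x 500 temporal measurements (100 ms at 5 kHz).
    image_t = temporal[:500, tree_start:tree_start + 10].T       # (10, 500)
    # FFT per spatial point; keep 250 positive-frequency bins -> (10, 250).
    image_s = np.abs(np.fft.fft(image_t, axis=1))[:, 1:251]
    return image_t, image_s
```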
During the CNN training process, we rely on supervised learning: the data are labeled based on the tree condition (infested or healthy) and the SNR value of the temporal acoustic signal at the location of the tree. The “infested” data are recorded from six trees artificially infested with weevil larvae less than three weeks old (Figure 3a), which is considered an early stage of infestation [12]. A detailed description of the artificial infestation process and the age control of weevil larvae is provided in [12]. To ensure that the recorded “infested” acoustic signals are caused by the larvae, we place the artificially infested trees in a well-controlled environment so that the trees are not exposed to major acoustic noise, such as that produced by outdoor wind [15]. Under these conditions for the infested trees, if the SNR is greater than 2 dB (the minimum acceptable SNR for optical fiber DAS [21]), we label and record the signal as “infested”. On the other hand, the “healthy” data are collected from 10 healthy trees, of which six are on an outdoor farm that includes typical sources of acoustic noise produced by wind, birds, humans, etc., and the other four are in the above-mentioned controlled environment. We divide the “healthy” data into “calm” and “noisy” signals, where the SNR is <2 dB and >2 dB, respectively.
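The labeling rules reduce to a few comparisons; a small illustrative helper (names assumed) is:

```python
def label_example(tree_condition, snr_db):
    """Label one example per the SNR rules above (illustrative names).

    Returns None for an infested-tree example below the 2 dB minimum
    acceptable DAS SNR [21], which is therefore not recorded.
    """
    if tree_condition == 'infested':
        return 'infested' if snr_db > 2.0 else None
    # Healthy data are subdivided by SNR into 'calm' and 'noisy'.
    return 'healthy/noisy' if snr_db > 2.0 else 'healthy/calm'
```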
In total, for the CNN architecture associated with the temporal/spectral data, we record 18,000 examples of the “infested” signals and another 18,000 examples (9000 “calm” and 9000 “noisy”) of the “healthy” signals. To evaluate the performance of the CNN architectures, the recorded temporal/spectral examples are split into 60% (21,600 examples) training, 20% (7200 examples) validation, and 20% (7200 examples) testing datasets. All of the examples are processed by applying a [100–800 Hz] band-pass filter. This filter mitigates the environmental acoustic noise, which typically has low frequencies (less than 100 Hz), and discards the high-frequency (larger than 800 Hz) noise produced by the electronic/optical components in the DAS system, without affecting the dominant weevil larvae acoustic frequencies [12,15]. Figure 3b,c show representative examples of the input images for the CNN models when using the “infested”, “calm”, and “noisy” temporal data and their corresponding spectral images, respectively.
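The filtering and splitting steps might look as follows; the Butterworth design and filter order are assumptions, since the paper does not state the filter family, and the stratified split mirrors the 60/20/20 proportions:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt
from sklearn.model_selection import train_test_split

def bandpass_100_800(x, fs=5_000, order=4):
    """[100-800 Hz] band-pass; Butterworth of order 4 is an assumed choice."""
    sos = butter(order, [100, 800], btype='bandpass', fs=fs, output='sos')
    return sosfiltfilt(sos, x, axis=-1)

def split_dataset(X, y, seed=0):
    """60/20/20 stratified train/validation/test split of the 36,000 examples."""
    X_tr, X_tmp, y_tr, y_tmp = train_test_split(
        X, y, test_size=0.4, random_state=seed, stratify=y)
    X_val, X_te, y_val, y_te = train_test_split(
        X_tmp, y_tmp, test_size=0.5, random_state=seed, stratify=y_tmp)
    return (X_tr, y_tr), (X_val, y_val), (X_te, y_te)
```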
Figure 4a shows the architecture of the CNN model used to handle the temporal (spectral) input data. The CNN architecture includes an input layer, two pairs of convolutional and max pooling layers, a flattened layer, a fully-connected layer, and an output layer, respectively. The first convolutional layer has the ReLU activation function and comprises 16 (32) filters of a 3 × 50 (3 × 5) size and a 1 × 1 (1 × 1) stride, while the first max pooling layer has a 2 × 2 (2 × 2) pool size. The second convolutional layer also has the ReLU activation function and includes 32 (32) filters of a 3 × 3 (3 × 3) size and a 1 × 1 (1 × 1) stride, while the second max pooling layer has a 2 × 2 (2 × 2) pool size. Following the flattened layer, the fully-connected layer has the ReLU activation function and includes 50 (50) nodes. Finally, the output layer of the CNN contains a single node with a sigmoid activation function for binary classification (“infested” or “healthy” signal).
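A minimal Keras sketch of this architecture is shown below; valid padding, the Adam optimizer, and the binary cross-entropy loss are assumptions not spelled out in the text:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_cnn(input_shape=(10, 500, 1), f1=16, k1=(3, 50)):
    """CNN of Figure 4a for the temporal data; for the spectral model,
    call with input_shape=(10, 250, 1), f1=32, k1=(3, 5)."""
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(f1, k1, strides=(1, 1), activation='relu'),
        layers.MaxPooling2D(pool_size=(2, 2)),
        layers.Conv2D(32, (3, 3), strides=(1, 1), activation='relu'),
        layers.MaxPooling2D(pool_size=(2, 2)),
        layers.Flatten(),
        layers.Dense(50, activation='relu'),
        layers.Dense(1, activation='sigmoid'),   # "infested" vs. "healthy"
    ])
    model.compile(optimizer='adam', loss='binary_crossentropy',
                  metrics=['accuracy'])
    return model
```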
The adopted CNN model, shown in Figure 4a, contains many structural configuration features and parameters for the training process. The settings of the structure and parameters are very flexible, and there is no universal rule across different tasks. We follow standard practice and use the classification accuracy as the primary evaluation criterion, trying different parameters repeatedly until the performance stops improving. For instance, regarding the number of intermediate layers in the model, we start with one pair of convolutional and max pooling layers and increase the count gradually. We find that two pairs provide noticeably higher accuracy than one pair, while additional pairs add time consumption without further performance gain; thus, we finally use two pairs of convolutional and max pooling layers. Some key parameters, such as the convolution window size and sliding step, are limited by the input image size and determined by repeated trials. Moreover, we keep the model’s default values for parameters that do not affect the performance.
Figure 4b,d show the evolution of the training/validation accuracy and loss with the epoch when the temporal and spectral data are used, respectively. At the end of the training cycles, validation accuracy values of 96.97% and 96.78% are obtained for the temporal and spectral data, accordingly. Following the training and validation processes, we use the testing datasets to evaluate the performance of the two CNN models. The confusion matrices for the temporal and spectral data are shown in Figure 4c,e, respectively. Classification accuracies of 97.0% and 97.1% are obtained using the CNN models of the temporal and spectral data, respectively. The results of the confusion matrices in this contrast experiment confirm the effectiveness of the CNN models in distinguishing between the “infested” and “healthy” signals.
The FalseAlarm (false infested or false positive) rate is a critical performance metric of the CNN models that should be decreased in our experiments, to avoid removing or treating a healthy tree because of sensor false alarms. Given the false positives FP and the true positives TP in a confusion matrix, the FalseAlarm is expressed as FalseAlarm = FP/(TP + FP) [25]. Using the results of the confusion matrices in Figure 4c,e, the FalseAlarm value is 3.64% and 3.56% for the CNN models of the temporal and spectral data, respectively. To reduce the FalseAlarm, we merge the classification results of the two CNN models such that a temporal example and its corresponding spectral one are marked as “infested” if and only if both CNN models produce “infestation” classification results. In other words, if a temporal example is classified as “infested” by the temporal CNN model while its corresponding spectral example is classified as “healthy” by the spectral CNN model, we classify this overall example as “healthy”. By adopting this approach, the sensor FalseAlarm is decreased to 2.82%. Compared with the original 3.64% and 3.56% FalseAlarm values of the temporal and spectral data, the new 2.82% FalseAlarm obtained with this strict decision-making method represents improvements of 22.5% and 20.8%, respectively. Consequently, we apply the introduced merged classification approach to count the infestation alarms when classifying the infested and healthy trees in the subsequent section.
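In code, the metric and the strict merging rule are one line each; the 0.5 decision threshold on the sigmoid outputs is an assumed default:

```python
import numpy as np

def false_alarm(tp, fp):
    """FalseAlarm = FP / (TP + FP), per [25]."""
    return fp / (tp + fp)

def merged_prediction(p_temporal, p_spectral, threshold=0.5):
    """Strict rule: flag 'infested' only when BOTH CNN models agree."""
    return (np.asarray(p_temporal) > threshold) & (np.asarray(p_spectral) > threshold)
```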

4. Classifying Infested and Healthy Trees Using CNNs

In this section, we use the aforementioned merged classification approach with the trained CNN models to distinguish between infested and healthy trees. In other words, in an experiment involving infested and healthy trees, we record equal data examples from the individual trees and pass them to the CNN models with the merged classification approach to count the number of infestation alarms for each tree. These experiments are carried out when trees are located in a controlled environment and in an outdoor farm.
We focus first on the controlled environment experiments, where the trees are located in a closed room with windows, so the trees may be exposed to mild acoustic noise produced by birds flying around the room and/or humans inside the room. We arrange two different experiments (Exp. 1 and Exp. 2) in the controlled environment such that each experiment involves four trees (two infested and two healthy). Part of the data collected in Exp. 1 is used to train the CNN models; the trees and data of Exp. 2, however, are never included in training the CNN models. This experimental design is important for investigating the generalization of the trained CNN models. The infested trees in Exp. 1 and Exp. 2 contain larvae less than three weeks old, which is controlled during the artificial infestation process [12]. The height range of the infested and healthy trees placed in the controlled environment is 1–1.5 m. Figure 5a shows an example of a tree used in the experiments, with the optical fiber wrapped around it and a plastic wrap added as an outer layer over the fiber and the tree. For each tree in Exp. 1 (Exp. 2), we record 129,761 (144,755) temporal images along with an identical number of corresponding spectral images. As Figure 5b,c show, the merged classification approach generalizes and can efficiently distinguish between the infested and healthy trees in the two experiments, providing an obvious contrast in the number of alarms between the infested and healthy trees. Thus, these contrast experiments demonstrate the efficacy of the reported method for identifying the infested and healthy trees in the designed controlled environment.
In turn, we also conduct experiments on an outdoor farm that includes different types of acoustic noise produced by wind, birds, farm animals, and vehicles. The experiments conducted on the outdoor farm (Exp. 3–Exp. 5) involve the same set of trees, consisting of two artificially infested trees and 17 healthy trees, but are carried out under different wind speed ranges. The two artificially infested trees and two of the healthy trees are short (1–1.5 m) (Figure 6a), while the remaining 15 healthy trees are typical and tall (Figure 6b). Again, the trees and data of Exp. 3–Exp. 5 are completely new to the trained CNN models. For each tree in Exp. 3, we record 172,686 data examples, during which the wind speed changes within a [0, 8] mph range. Considering the number of infestation alarms in Figure 6c of Exp. 3, the merged classification approach can distinguish well between the infested and healthy trees on the outdoor farm.
To investigate the impact of the wind speed on the performance of our sensor, we further carry out Exp. 4 and Exp. 5 at different wind speed ranges. In particular, Exp. 4 is carried out in the “light air” and “light breeze” conditions, where 16,694 data examples per tree are recorded while the wind speed is within a [3, 5] mph range. In contrast, we collect 22,763 data examples per tree for Exp. 5 in the “gentle breeze” and “moderate breeze” conditions, where the wind speed is within a [9, 14] mph range. In Exp. 4, when the wind speed is relatively low, the system performs outstandingly and perfectly discriminates between the infested and healthy trees (Figure 6d). As the wind speed increases to the range of Exp. 5, the performance of the sensing system degrades (Figure 6e). These results are in good agreement with our findings in [15]; wind is the main source of noise in our system, compared to the acoustic noise of birds and humans, which is greatly attenuated when propagating through the air before reaching the fiber [26]. Thus, these contrasting experiments conducted outdoors show that the best performance of our sensor is obtained when wind speeds are less than 9 mph.

5. Discussion

In the experiments conducted in the controlled environment and on the outdoor farm, one can observe that the infestation alarm count for an infested tree is much lower than the tree’s total number of recorded examples. This is attributed to the fact that the larvae may not produce sound continuously and/or their sound is sometimes not strong enough to be picked up by the optical fiber. Thus, it is important to differentiate between classifying “infested” and “healthy” acoustic signals, presented in Section 3, and classifying infested and healthy trees, described in Section 4. For the acoustic signals, calculating the FalseAlarm values is straightforward because the data size and class are known. For the real scenario of classifying trees, however, the FalseAlarm cannot be calculated because even an infested tree produces both “infested” and “healthy” signals. Thus, we rely on counting the infestation alarms to distinguish between the healthy and infested trees. Considering the practical application of the sensor, we can select a few healthy trees as references and, based on their maximum infestation alarm count, set an appropriate threshold on the infestation alarm count above which a tree is declared infested.
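One simple way to realize this reference-based decision rule is sketched below; the 1.2 safety margin is an illustrative assumption, not a value from the paper:

```python
def infestation_threshold(healthy_alarm_counts, margin=1.2):
    """Alarm-count threshold derived from a few reference healthy trees."""
    return margin * max(healthy_alarm_counts)

# Example with the controlled-environment numbers quoted in the abstract:
# a maximum healthy count of 22 gives a threshold of ~26 alarms, far below
# the 1131 alarms recorded for an infested tree.
```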
We also compare our optical fiber DAS and CNN method with existing RPW detection technologies; Table 1 summarizes the comparison. We can observe that acoustic detection methods have attracted the most research interest for RPW detection in past years. Among all methods based on acoustic sensors, our DAS technique with a CNN algorithm demonstrates advantages in most aspects of concern, including high detection accuracy, 24/7 unattended monitoring, early detection capability, low cost for large-scale applications, and moderate computational complexity. However, our sensor suffers from degraded performance outdoors at high wind speeds, which will require further investigation and improvement. Thus, we believe that our DAS-based method is worthy of implementation in large-scale practical applications.
To sum up, this work uses optical fiber DAS to monitor RPW infestation in outdoor date plantations. The acoustic data recorded by the optical fiber DAS are passed to a trained CNN model that decides whether the acoustic signal is “infested” or “healthy”. For each tree, the infestation alarm count produced by the CNN model can then be used to decide whether the tree is infested or healthy. The significance of this work is to pave the way for future experiments, as we plan to use our sensor to detect RPW in naturally infested trees. This may require challenging arrangements, however, as it is difficult to find a tree at an early stage of infestation; a tree only shows visible signs of distress at a very advanced stage. In addition, we will consider training the CNN model on more diverse data to increase the contrast between the infestation alarm counts of the infested and healthy trees.

6. Conclusions

We report on the integration of optical fiber DAS and CNN for the early detection of RPW on large farms. The temporal and spectral acoustic signals recorded by the optical fiber DAS are used to train CNN models, which classify the “infested” and “healthy” signals with accuracy values of 97.0% and 97.1%, respectively. Merging the classification results of the temporal and spectral CNN models reduces the FalseAlarm performance metric of the sensor by ∼20%. Our sensor succeeds in recognizing the infested and healthy trees in a controlled environment and on an outdoor farm, with high efficiency when the outdoor wind speeds are less than 9 mph. The main advantage of the reported sensor, compared to other current technologies, is that it can provide 24/7 monitoring with wide coverage of the farming area using only a single optical fiber cable. On the other hand, the performance of the reported sensor still requires improvement when working outdoors at high wind speeds.

Author Contributions

Conceptualization, I.A., B.W. and Y.M.; methodology, I.A., B.W., Y.M., M.S., Y.G., Y.A.-F. and A.A.-S.; software, I.A., B.W. and Y.M.; validation, I.A., B.W., Y.M., M.S. and Y.G.; formal analysis, I.A., B.W. and Y.M.; investigation, I.A., B.W. and Y.M.; resources, Y.A.-F. and A.A.-S.; data curation, I.A., B.W. and Y.M.; writing—original draft preparation, I.A.; writing—review and editing, B.W., T.K.N. and B.S.O.; visualization, I.A. and B.W.; supervision, T.K.N. and B.S.O.; project administration, T.K.N. and B.S.O.; funding acquisition, B.S.O. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by KAUST–Research Translation Funding (REI/1/4247-01-01), KAUST (BAS/1/1614-01-01), and NEOM (RGC/3/4932-01-01).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available on reasonable request from the corresponding author. The data are not publicly available due to privacy.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Food Chain Crisis. Available online: http://www.fao.org/food-chain-crisis/how-we-work/plant-protection/red-palm-weevil/en/ (accessed on 14 December 2020).
2. Wahizatul, A.A.; Zazali, C.; Abdul, R.; Nurul’Izzah, A.G. A new invasive coconut pest in Malaysia: The red palm weevil (Curculionidae: Rhynchophorus ferrugineus). Planter 2013, 89, 97–110.
3. Ferry, M.; Gomez, S. The red palm weevil in the Mediterranean area. Palms 2002, 46, 172–178.
4. El-Mergawy, R.; Al-Ajlan, A. Red palm weevil, Rhynchophorus ferrugineus (Olivier): Economic importance, biology, biogeography and integrated pest management. J. Agric. Sci. Technol. A 2011, 1, 1–23.
5. Mukhtar, M.; Rasool, K.G.; Parrella, M.P.; Sheikh, Q.I.; Pain, A.; Lopez-Llorca, L.V.; Aldryhim, Y.N.; Mankin, R.; Aldawood, A.S. New initiatives for management of red palm weevil threats to historical Arabian date palms. Fla. Entomol. 2011, 94, 733–736.
6. Llácer, E.; Miret, J.A.J. Efficacy of phosphine as a fumigant against Rhynchophorus ferrugineus (Coleoptera: Curculionidae) in palms. Span. J. Agric. Res. 2010, 3, 775–779.
7. Mankin, R. Towards user-friendly early detection acoustic devices and automated monitoring for red palm weevil management. In Proceedings of the Symposium Proceedings, Amsterdam, The Netherlands, 25–27 September 2019; pp. 142–147.
8. Suma, P.; La Pergola, A.; Longo, S.; Soroker, V. The use of sniffing dogs for the detection of Rhynchophorus ferrugineus. Phytoparasitica 2014, 42, 269–274.
9. Rizzolo, A.; Bianchi, G.; Lucido, P.; Cangelosi, B.; Pozzi, L.; Villa, G.; Clematis, F.; Pasini, C.; Curir, P. Electronic nose for the early detection of red palm weevil (Rhynchophorus ferrugineous Olivier) infestation in palms: Preliminary results. In Proceedings of the II International Symposium on Horticulture in Europe 1099, Angers, France, 1–5 July 2012; pp. 347–355.
10. Haff, R.; Slaughter, D. Real-time X-ray inspection of wheat for infestation by the granary weevil, Sitophilus granarius (L.). Trans. ASAE 2004, 47, 531.
11. Golomb, O.; Alchanatis, V.; Cohen, Y.; Levin, N.; Cohen, Y.; Soroker, V. Detection of red palm weevil infected trees using thermal imaging. In Precision Agriculture’15; Wageningen Academic Publishers: Wageningen, The Netherlands, 2015; pp. 322–337.
12. Ashry, I.; Mao, Y.; Al-Fehaid, Y.; Al-Shawaf, A.; Al-Bagshi, M.; Al-Brahim, S.; Ng, T.K.; Ooi, B.S. Early detection of red palm weevil using distributed optical sensor. Sci. Rep. 2020, 10, 3155.
13. Rach, M.M.; Gomis, H.M.; Granado, O.L.; Malumbres, M.P.; Campoy, A.M.; Martín, J.J.S. On the design of a bioacoustic sensor for the early detection of the red palm weevil. Sensors 2013, 13, 1706–1729.
14. Siriwardena, K.; Fernando, L.; Nanayakkara, N.; Perera, K.; Kumara, A.; Nanayakkara, T. Portable acoustic device for detection of coconut palms infested by Rynchophorus ferrugineus (Coleoptera: Curculionidae). Crop. Prot. 2010, 29, 25–29.
15. Wang, B.; Mao, Y.; Ashry, I.; Al-Fehaid, Y.; Al-Shawaf, A.; Ng, T.K.; Yu, C.; Ooi, B.S. Towards detecting red palm weevil using machine learning and fiber optic distributed acoustic sensing. Sensors 2021, 21, 1592.
16. Ashry, I.; Mao, Y.; Wang, B.; Sait, M.; Guo, Y.; Al-Shawaf, A.; Ng, T.K.; Ooi, B.S. CNN-based detection of red palm weevil using optical-fiber-distributed acoustic sensing. In Proceedings of the Photonic Instrumentation Engineering IX, San Francisco, CA, USA, 22–27 January 2022; SPIE: Bellingham, WA, USA, 2022; Volume 12008, p. 120080U-1.
17. Bao, X.; Zhou, D.P.; Baker, C.; Chen, L. Recent development in the distributed fiber optic acoustic and ultrasonic detection. J. Light. Technol. 2016, 35, 3256–3267.
18. Taylor, H.F.; Lee, C.E. Apparatus and Method for Fiber Optic Intrusion Sensing. U.S. Patent 5,194,847, 16 March 1993.
19. Juškaitis, R.; Mamedov, A.; Potapov, V.; Shatalin, S. Distributed interferometric fiber sensor system. Opt. Lett. 1992, 17, 1623–1625.
20. Lu, Y.; Zhu, T.; Chen, L.; Bao, X. Distributed vibration sensor based on coherent detection of phase-OTDR. J. Light. Technol. 2010, 28, 3243–3249.
21. Ashry, I.; Mao, Y.; Alias, M.S.; Ng, T.K.; Hveding, F.; Arsalan, M.; Ooi, B.S. Normalized differential method for improving the signal-to-noise ratio of a distributed acoustic sensor. Appl. Opt. 2019, 58, 4933–4938.
22. Schmidhuber, J. Deep learning in neural networks: An overview. Neural Networks 2015, 61, 85–117.
23. Chen, H.; Miao, F.; Chen, Y.; Xiong, Y.; Chen, T. A hyperspectral image classification method using multifeature vectors and optimized KELM. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2021, 14, 2781–2795.
24. Yao, R.; Guo, C.; Deng, W.; Zhao, H. A novel mathematical morphology spectrum entropy based on scale-adaptive techniques. ISA Trans. 2022, 126, 691–702.
25. Wu, H.; Chen, J.; Liu, X.; Xiao, Y.; Wang, M.; Zheng, Y.; Rao, Y. One-dimensional CNN-based intelligent recognition of vibrations in pipeline monitoring with DAS. J. Light. Technol. 2019, 37, 4359–4366.
26. Mao, Y.; Ashry, I.; Alias, M.S.; Ng, T.K.; Hveding, F.; Arsalan, M.; Ooi, B.S. Investigating the performance of a few-mode fiber for distributed acoustic sensing. IEEE Photonics J. 2019, 11, 1–10.
27. Pinhas, J.; Soroker, V.; Hetzroni, A.; Mizrach, A.; Teicher, M.; Goldberger, J. Automatic acoustic detection of the red palm weevil. Comput. Electron. Agric. 2008, 63, 131–139.
28. Potamitis, I.; Ganchev, T.; Kontodimas, D. On automatic bioacoustic detection of pests: The cases of Rhynchophorus ferrugineus and Sitophilus oryzae. J. Econ. Entomol. 2009, 102, 1681–1690.
29. Gutiérrez, A.; Ruiz, V.; Moltó, E.; Tapia, G.; del Mar Téllez, M. Development of a bioacoustic sensor for the early detection of red palm weevil (Rhynchophorus ferrugineus Olivier). Crop. Prot. 2010, 29, 671–676.
30. Tofailli, K. The early detection of red palm weevil: A new method. In Proceedings of the IV International Date Palm Conference 882, Abu Dhabi, United Arab Emirates, 15–17 March 2010; pp. 441–449.
31. Hetzroni, A.; Soroker, V.; Cohen, Y. Toward practical acoustic red palm weevil detection. Comput. Electron. Agric. 2016, 124, 100–106.
32. Rasool, K.G.; Husain, M.; Salman, S.; Tufail, M.; Sukirno, S.; Mehmood, K.; Farooq, W.A.; Aldawood, A.S. Evaluation of some non-invasive approaches for the detection of red palm weevil infestation. Saudi J. Biol. Sci. 2020, 27, 401–406.
33. Koubaa, A.; Aldawood, A.; Saeed, B.; Hadid, A.; Ahmed, M.; Saad, A.; Alkhouja, H.; Ammar, A.; Alkanhal, M. Smart Palm: An IoT framework for red palm weevil early detection. Agronomy 2020, 10, 987.
34. Mohamed, A.; Hany, A.; Adly, I.; Atwa, A.; Ragai, H. AI for acoustic early detection of the red palm weevil. In Proceedings of the 2021 16th International Conference on Computer Engineering and Systems (ICCES), Cairo, Egypt, 15–16 December 2021; pp. 1–4.
35. Kagan, D.; Alpert, G.F.; Fire, M. Automatic large scale detection of red palm weevil infestation using street view images. ISPRS J. Photogramm. Remote Sens. 2021, 182, 122–133.
36. Karar, M.E.; Abdel-Aty, A.H.; Algarni, F.; Hassan, M.F.; Abdou, M.; Reyad, O. Smart IoT-based system for detecting RPW larvae in date palms using mixed depthwise convolutional networks. Alex. Eng. J. 2022, 61, 5309–5319.
Figure 1. Overall approach for RPW detection using an optical fiber DAS and machine learning.
Figure 2. (a) Experimental setup of the Φ-OTDR-based optical fiber DAS used for the detection of RPW. Cir.: circulator. (b) A representative Rayleigh trace recorded by the optical fiber DAS along a 1-km SMF.
Figure 3. (a) Weevil larvae less than three weeks old. Representative examples of the temporal “infested”, “calm”, and “noisy” images (b), and their corresponding spectral images (c).
Figure 4. (a) The CNN architecture for classifying “infested” and “healthy” temporal (spectral) signals. Conv: convolutional; FC: fully-connected. The dimensions of the CNN architecture associated with the spectral data are written in green. Training and validation history (b,d) and confusion matrices (c,e) when using the temporal/spectral data.
Figure 5. (a) Example of a tree used in the controlled environment experiments. Infestation alarm count produced by our sensor during Exp. 1 (b) and Exp. 2 (c). I: infested tree; H: healthy tree.
Figure 6. On the outdoor farm, there are two short infested and two short healthy trees (a), and another 15 tall typical trees (b). Infestation alarm count produced by our sensor during Exp. 3 (c), Exp. 4 (d), and Exp. 5 (e). I: infested tree; H: healthy tree.
Table 1. Comparison of our DAS+CNN method with existing sensors for RPW detection, in chronological order.
Method | Processing Technique | Invasive or Not | Performance or Accuracy | Advantages (Disadvantages)
An acoustic sensor (commercial piezoelectric microphone), 2008 [27] | Speech recognition method, vector quantization (VQ), and Gaussian mixture modeling (GMM) | Not | 98% accuracy | Automatic detection using simple commercial hardware (A sound-isolated box is used)
An acoustic sensor (piezoelectric sensor), 2009 [28] | Feature extraction, GMM | Invasive | 99.1% accuracy | Automatic detection with well-designed algorithms (High computational complexity)
An acoustic sensor (electronic device with acoustic probe), 2010 [29] | FFT, studying the sound intensity around 2250 Hz | Invasive | The infested sound intensity increases around 1 dB from −20 dB | Detection of a small number of larvae with a simple signal processing method (Low contrast between infested and non-infested sound)
An acoustic device (acoustic probe and headphone set), 2010 [14] | Bandpass filtering, amplification | Invasive | 97% accuracy | Simple and portable hardware (Manual identification with four detection positions needed)
A radiography system (X-ray technology), 2012 [30] | Visual detection based on X-ray photos | Not | Observable larvae on the photos | Simple and visual operation (Difficult for large-scale applications)
An acoustic sensor (audio probe), 2013 [13] | Filtering and amplification, feature vector quantization | Invasive | 90% accuracy | Autonomous and continuous detection with explicit audio analysis algorithm (Extensive field experiments are needed in the future)
Thermal imaging (infrared thermal camera), 2015 [11] | Thermal infrared images (TIR), leaf temperature maps, canopy representative temperature, crop water stress index (CWSI) | Not | Less than 75% accuracy | Large-scale and non-invasive detection (Susceptible to environmental conditions)
An acoustic sensor (piezoelectric microphone), 2016 [31] | Likelihood indication by observer, speech recognition algorithm same as that in Ref. [27] | Not | 75% accuracy by humans, 80% accuracy by machine | Manual and automated detection are compared (Susceptible to wind)
Some optical devices (digital camera, thermal camera, TreeRadarUnit (Radar 2000, Radar 900), resistograph, magnetic DNA biosensor, and near-infrared spectroscopy (NIRS)), 2020 [32] | Visual analysis, the analysis of variance (ANOVA) PROC GLM procedure, response of the leaf spectral absorbance | Not | Accuracy: visual approach 87%, Radar 2000 77%, Radar 900 73%, resistograph 73%, thermal camera 61%, digital camera 52%, and magnetic DNA 63% | All used methods are non-invasive with a detailed comparison (Accuracy needs to be further improved)
An IoT system (commercial accelerometer sensor), 2020 [33] | FFT, the estimation of power spectral density (PSD), peaks average difference (PAD) analysis | Invasive | Observable signature of the infestation | Simple hardware with a connection to network (Low sensitivity and contrast)
An acoustic sensor (USB microphone), 2021 [34] | Feature extraction using Mel Frequency Cepstrum Coefficient (MFCC), discrete Fourier transform (DFT), artificial neural network (ANN), Alexnet-convolutional neural networks (CNN) | Not | 99.2% accuracy | Simple hardware and concise algorithm (A plastic tube is used to imitate the real tree)
A large-scale imaging detection method (aerial and street view), 2021 [35] | CNN, faster R-CNN ResNet-50 FPN, XResNet | Not | Aerial and street images can be mapped to actual palm trees | Automatic large-scale detection (Limited number of infested palm tree images available online)
An IoT system (acoustic detection of the public TreeVibes dataset), 2021 [36] | Modified mixed depthwise CNN (MixConvNet) | Invasive | 95.90% accuracy | Integration in a smartphone application with advanced algorithm (Only verified on the public TreeVibes dataset)
An optical fiber distributed acoustic sensor (ours) | CNN | Not | Around 97.0% accuracy | Provides 24/7 monitoring on large-scale farms (Low performance at high wind speeds)
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
