An integrated UGV-UAV system for construction site data collection
Introduction
The construction industry is one of the major economic sectors in most countries, with 9%–15% of total Gross Domestic Product (GDP) being allocated to the built environment [1]. Despite this economic importance, the industry is plagued by low productivity and inefficiency. Over the past few decades, productivity rates in many sectors have increased steadily; in the construction industry, however, productivity has barely increased and may even have decreased [2]. Automated systems and robotics are widely regarded as technologies with the potential to revolutionize the construction industry by addressing these productivity challenges while improving quality [3].
In the past few years, on-site semi-automated and automated Unmanned Vehicles (UVs) have received significant attention for construction applications. They have been proposed for various activities including inspection and structural health monitoring [4, 5], floor cleaning [6], building component production [[7], [8], [9]], building component assembly [[10], [11], [12]], material handling [13], and construction surveying and monitoring [[14], [15], [16], [17], [18], [19]]. In the latter activity, however, the implementation of UVs for automated data collection has largely been limited to outdoor environments. Typical autonomous UVs on job sites rely on either GPS [[20], [21], [22]] or BIM-driven maps [17, 23, 24] for autonomous navigation. GPS technology is mainly suitable for outdoor applications, while BIM-based motion planning solutions are inefficient on a cluttered construction site with many temporary structures that are not present in the BIM.
An autonomous UV system capable of data collection on construction sites needs the following capabilities: 1) the ability to collect the multiple types of sensory and visual data required for construction performance monitoring, 2) the ability to process the collected data in real time on a platform with low computational complexity, and 3) the capability to navigate efficiently and autonomously on a construction site [25]. The authors' previous work on an integrated mobile robotic system [15] presented a vision-based UGV for autonomous data collection on construction sites. That system addressed the first two requirements for an efficient autonomous data collection system. However, it is inefficient for autonomous navigation in cluttered indoor scenes where not all locations are accessible to the UGV for data collection, such as areas blocked by barriers or surfaces at different elevations.
To address this issue, this study proposes a mobile robotic system that integrates two custom-built aerial and ground UVs. The complementary skills of each vehicle overcome the specific limitations of the other. Unmanned aerial vehicles (UAVs) offer a broad field of view and rapid coverage of search areas, which is ideal for mapping and monitoring tasks. However, they are constrained by their low payload (hundreds of grams) relative to their size and their short operational time (tens of minutes). UGVs, on the other hand, can carry substantial payloads and operate for extended periods. They offer high-resolution sensing, but with a narrower field of view and lower coverage speed than UAVs, and they are much more susceptible to obstacles, occlusions, and other sensing limitations. Coordinated operations between UGVs and UAVs can therefore create highly beneficial synergies, such as multi-domain sensing and enhanced line-of-sight communications.
The objective of this paper is to present a collaborative and explorative approach using a UGV and a UAV. To achieve this goal, the UGV-UAV system periodically visits a set of places of interest that have been pre-selected by the construction management team. During this mission, the UGV moves autonomously on the site and continuously scans the environment with its sensors. The relative pose between the two vehicles is estimated continuously, which enables the UAV to follow the UGV's path during navigation. If a place of interest is not accessible to the UGV due to environmental constraints, the UGV sends the UAV to the desired location to scan the area of interest. The UAV does not process data onboard; instead, its sensor data are transmitted over the network to the UGV's onboard computer, where all the processing takes place. The UAV then returns to the UGV, and the heterogeneous team continues towards the next area of interest. Note that the places of interest for data collection are selected by the construction management team either directly through the laptop on the UGV before starting the mission, or by commanding the UGV remotely using Secure Shell (SSH), which enables the operator to provide waypoints even during the mission.
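The waypoint-commanding workflow described above can be sketched as a thread-safe queue that the operator appends to (e.g., over an SSH session) while the mission loop consumes it. This is an illustrative sketch only; the class and method names below are hypothetical and not taken from the authors' implementation.

```python
from collections import deque
from threading import Lock


class WaypointQueue:
    """Thread-safe queue of places of interest for the UGV mission.

    Waypoints can be loaded before the mission starts or appended
    remotely (e.g., via an SSH session) while the mission is running.
    """

    def __init__(self, initial=None):
        self._lock = Lock()
        self._queue = deque(initial or [])

    def add(self, x, y):
        # Called by the operator (directly or over SSH) at any time.
        with self._lock:
            self._queue.append((x, y))

    def next_goal(self):
        # Called by the mission loop when the current goal is reached.
        with self._lock:
            return self._queue.popleft() if self._queue else None


# Pre-selected places of interest, plus one added mid-mission:
mission = WaypointQueue([(2.0, 3.5), (8.0, 1.0)])
mission.add(5.5, 7.0)  # hypothetical operator command during the mission
while (goal := mission.next_goal()) is not None:
    print("navigating to", goal)
```

The lock makes mid-mission insertion safe even if the operator's remote command and the mission loop run on separate threads.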
To obtain an efficient system for indoor surveillance and monitoring, a custom-built indoor blimp was designed as the UAV for this study. The use of an indoor blimp limits the application of the system to indoor environments; however, outdoor blimps, which are more robust to air disturbances, could make the system suitable for outdoor applications. Using a blimp instead of other types of UAVs (e.g., quadrotor, hexacopter, etc.) has the advantages of being safe in a cluttered indoor space and having lower cost, energy consumption, and noise.
The main contributions of this paper are 1) a comprehensive literature review of previous integrated UAV-UGV systems across different applications and 2) an integrated UAV-UGV system that can autonomously navigate and collect visual data for construction monitoring applications. This system addresses most of the limitations of past studies (to be further detailed in Section 2.3 UAV-UGV collaborative systems). The critical aspects of the system are 1) localization of both the UGV and the UAV, 2) contextual awareness of the environment, 3) mapping of the surrounding environment, and 4) efficient path planning. Based on these aspects, the proposed system performs context awareness (via semantic image segmentation), localization, mapping, and control planning in real time, as illustrated in Fig. 1. To evaluate its performance, the system was deployed in a cluttered indoor construction-like environment (to be further detailed in Section 5 Experimental setup and results) for data collection purposes, demonstrating the feasibility of real-time performance for construction applications.
Background
The proposed system focuses on autonomous cooperation between a UGV and a UAV for navigation and indoor environment monitoring. Simultaneous Localization and Mapping (SLAM) and image segmentation algorithms are used for robot localization, environment mapping, and scene understanding. Therefore, this section provides background information on visual SLAM and scene understanding. Additionally, it reviews previous UAV-UGV cooperative systems and discusses their applications, sensors for
Hardware description
The proposed integrated system consists of two custom-built unmanned and autonomous platforms: a ground vehicle and a blimp. Mounted in the center of the UGV's chassis is a laptop that carries out most of the necessary computation for all modules, with the following specifications: 16 GB DDR3 RAM, Intel Core i7-4710HQ quad-core Haswell processor, and NVIDIA GeForce GTX 960M. Two Raspberry Pis, one on the UGV and the other on the UAV, are used to control the actuators. The laptop and Raspberry Pis are
System architecture
This section describes the proposed integrated UAV-UGV system for autonomous data collection on construction sites. This multi-robot system uses ROS to integrate a variety of open-source packages. This architecture enables data to pass between different modules, called nodes, across multiple computers through publishers (i.e., nodes that continually broadcast messages) and subscribers (i.e., nodes that receive messages from publishers). Each of these modules has many capabilities
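The publisher/subscriber pattern ROS uses to route messages between nodes can be illustrated with a minimal in-process sketch. This is not ROS itself: real nodes would use rospy/roscpp and a ROS master, and the topic names below are invented for illustration.

```python
from collections import defaultdict


class TopicBus:
    """Minimal stand-in for ROS topic routing: publishers broadcast
    messages on named topics; subscribers register callbacks."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        # A subscriber node registers a callback for a topic.
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        # Every subscriber on this topic receives the message.
        for callback in self._subscribers[topic]:
            callback(message)


bus = TopicBus()
received = []
# A "node" on the UGV's laptop subscribes to a (hypothetical) UAV camera topic...
bus.subscribe("/uav/camera/image", received.append)
# ...and the UAV's Raspberry Pi publishes frames to it.
bus.publish("/uav/camera/image", "frame_0001")
print(received)  # ['frame_0001']
```

In the actual system this decoupling is what lets the UAV's Raspberry Pi stream sensor data over the network while all heavy processing subscribes to those topics on the UGV's laptop.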
Experimental setup and results
The proposed system was tested in the Constructed Facilities Lab (CFL) at North Carolina State University. This lab environment resembles an indoor construction site, as it contains stacks of construction materials and ongoing fabrication of structural components (e.g., columns, walls, bridge spans, etc.). Fig. 16 shows the test environments (A and B) and the corresponding 2D global map. Before the experiments, through an initial data collection, a comprehensive map of the construction site
Conclusion and future work
Over the past few years, autonomous UAVs and UGVs have gained significant popularity in the construction industry, notably for construction site surveying, existing structure inspection, and work-in-progress monitoring. However, numerous open problems remain for further research, such as efficient autonomous navigation in cluttered GPS-denied environments where some places are inaccessible to UGVs. To address this issue and increase the degree of automation through vision-based data
Declaration of competing interest
All authors confirm that there are no known conflicts of interest associated with this publication and there has been no significant financial support for this work that could have influenced its outcome.
Acknowledgments
We would like to thank all the students who were part of the ECE 592 class of Spring 2018 for their assistance with this project.
References (102)
- et al., Understanding the implications of digitisation and automation in the context of Industry 4.0: a triangulation approach and elements of a research agenda for the construction industry, Comput. Ind. (2016)
- The future of construction automation: technological disruption and the upcoming ubiquity of robotics, Autom. Constr. (2015)
- et al., Tunnel structural inspection and assessment using an autonomous robotic system, Autom. Constr. (2018)
- et al., Localisation of a mobile robot for bridge bearing inspection, Autom. Constr. (2018)
- et al., Floor cleaning robot with reconfigurable mechanism, Autom. Constr. (2018)
- et al., Feasibility verification of brick-laying robot using manipulation trajectory and the laying pattern optimization, Autom. Constr. (2009)
- et al., Large-scale 3D printing of ultra-high performance concrete – a new processing route for architects and builders, Mater. Des. (2016)
- et al., Automated re-prefabrication system for buildings using robotics, Autom. Constr. (2017)
- et al., Feasibility study for drone-based masonry construction of real-scale structures, Autom. Constr. (2018)
- et al., Automated content-based filtering for enhanced vision-based documentation in construction toward exploiting big visual data from drones, Autom. Constr. (2019)