An integrated UGV-UAV system for construction site data collection

https://doi.org/10.1016/j.autcon.2019.103068

Highlights

  • We present a cooperative UAV-UGV system for autonomous data collection in construction.

  • We provide a comprehensive literature review on previous integrated UAV-UGV systems.

  • We integrate multiple modules on UAV and UGV for real-time construction applications.

  • The UAV succeeds in following the UGV through real-time UAV-UGV interaction.

  • We propose the first heterogeneous blimp-UGV team for GPS-denied cluttered environments.

Abstract

Recent efforts have sought to increase the degree of automation and the frequency of data collection for construction applications using Unmanned Aerial/Ground Vehicles (UAVs/UGVs). However, data collection in current practice remains largely manual, and therefore costly, time-consuming, and error-prone. Vision-based mobile robotic systems that are aware of their surroundings and capable of autonomous navigation are becoming essential to many construction applications, namely surveying, monitoring, and inspection. Nevertheless, such systems suffer from a series of performance issues. One major problem is inefficient navigation in indoor, cluttered scenes with many obstacles and barriers, where some places are inaccessible to a UGV. To address this problem, this paper designs a UAV-UGV team that integrates two custom-built mobile robots. The UGV autonomously navigates through space, leveraging its sensors. The UAV acts as an external eye for the UGV, observing the scene from vantage points that are inaccessible to the UGV. The relative pose of the UAV is estimated continuously, which allows it to maintain a fixed position relative to the UGV. The key aspects in developing this system for autonomous navigation are the localization of both the UAV and the UGV, mapping of the surrounding environment, and efficient path planning using multiple sensors. The proposed system is tested in an indoor, cluttered, construction-like environment. Its performance demonstrates the feasibility of developing and deploying a robust, automated data collection system for construction applications in the near future.

Introduction

The construction industry is one of the major economic sectors in most countries, with 9%–15% of total Gross Domestic Product (GDP) attributed to the built environment [1]. Despite this economic importance, the industry is plagued by low productivity and inefficiencies. During the past few decades, the productivity rate in many sectors has been steadily increasing; in the construction industry, however, this rate has barely increased and may even have decreased [2]. Automated systems and robotics are technologies with the potential to revolutionize the construction industry by addressing these productivity challenges while improving quality [3].

In the past few years, on-site semi-automated and automated Unmanned Vehicles (UVs) have received significant attention for construction applications. They have been proposed for various activities, including inspection and structural health monitoring [4,5], floor cleaning [6], building component production [7–9], building component assembly [10–12], material handling [13], and construction surveying and monitoring [14–19]. For the latter activity, implementation of UVs for automated data collection has been limited to outdoor environments. Typical autonomous UVs on job sites use either GPS [20–22] or BIM-driven maps [17,23,24] for autonomous navigation. GPS technology is mainly suitable for outdoor applications, and BIM-based motion planning solutions are inefficient on a cluttered construction site with many temporary structures that are not present in the BIM.

An autonomous UV system capable of data collection on construction sites needs the following capabilities: 1) the ability to collect the multiple types of sensory and visual data required for construction performance monitoring, 2) the ability to process the collected data in real time on a platform of low computational complexity, and 3) the ability to navigate efficiently and autonomously on a construction site [25]. The authors' previous work on an integrated mobile robotic system [15] presented a vision-based UGV for autonomous data collection on construction sites. That system addressed the first two requirements of an efficient autonomous data collection system. However, it is inefficient for autonomous navigation in a cluttered indoor scene where not all locations are accessible to the UGV for data collection. Sites occluded with barriers or containing surfaces at different elevations are examples of such scenes.

To address this issue, this study proposes a mobile robotic system that integrates two custom-built aerial and ground UVs. The complementary skills of each vehicle overcome the specific limitations of the other. Unmanned Aerial Vehicles (UAVs) offer a broad field of view and rapid coverage of search areas, which is ideal for mapping and monitoring tasks. However, they are constrained by low payloads (hundreds of grams) relative to their size and short operational times (tens of minutes). UGVs, on the other hand, can carry substantial payloads and operate for extended periods. They offer high-resolution sensing, but with a narrower field of view and lower coverage speed than UAVs, and they are much more susceptible to obstacles, occlusions, and other sensing limitations. Coordinated operation of UGVs and UAVs can therefore create highly beneficial synergies, such as multi-domain sensing and enhanced line-of-sight communications.

The objective of this paper is to present a collaborative and explorative approach using a UGV and a UAV. To this end, the UGV-UAV system periodically visits a set of places of interest pre-selected by the construction management team. During the mission, the UGV autonomously moves on the site and continuously scans the environment with its sensors. The relative pose between the two vehicles is estimated continuously, which enables the UAV to follow the UGV's path during navigation. If a place of interest is not accessible to the UGV due to environmental constraints, the UGV sends the UAV to the desired location to scan the area of interest. The UAV does not process data onboard; instead, its sensor data are transmitted over the network to the UGV's onboard computer, where all the processing takes place. The UAV then returns to the UGV, and the heterogeneous team continues towards the next area of interest. The places of interest for data collection are selected by the construction management team either directly through the laptop on the UGV before the mission starts, or by commanding the UGV remotely over Secure Shell (SSH), which enables the operator to provide waypoints even during the mission.
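The station-keeping behavior described above, where the UAV holds a fixed pose relative to the UGV, amounts to rotating a body-frame offset into the world frame. The following minimal sketch illustrates the idea; it is not the authors' implementation, and all function and variable names are hypothetical:

```python
import math

def uav_goal_from_ugv(ugv_x, ugv_y, ugv_yaw, offset_forward, offset_left, offset_up):
    """Rotate a fixed UGV-body-frame offset into the world frame to obtain
    the UAV's goal position, so the UAV holds station relative to the UGV."""
    cos_y, sin_y = math.cos(ugv_yaw), math.sin(ugv_yaw)
    goal_x = ugv_x + cos_y * offset_forward - sin_y * offset_left
    goal_y = ugv_y + sin_y * offset_forward + cos_y * offset_left
    goal_z = offset_up  # blimp altitude above the ground plane
    return goal_x, goal_y, goal_z

# Example: UGV at (2, 3) heading 90 degrees; UAV should hover 1 m behind, 2 m up.
gx, gy, gz = uav_goal_from_ugv(2.0, 3.0, math.pi / 2, -1.0, 0.0, 2.0)
```

In practice the relative pose would come from the system's continuous UAV-UGV pose estimate rather than from known world coordinates, and a controller would steer the UAV toward the computed goal.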

To obtain an efficient system for indoor surveillance and monitoring, a custom-built indoor blimp is designed as the UAV for this study. The use of an indoor blimp limits the application of the system to indoor environments; however, outdoor blimps, which are more stable against air disturbances, have the potential to make the system suitable for outdoor applications. Using a blimp instead of other types of UAV (e.g., quadrotor, hexacopter) has the advantages of being safe in a cluttered indoor space and having lower cost, energy consumption, and noise.

The main contributions of this paper are 1) a comprehensive literature review of previous integrated UAV-UGV systems with different applications and 2) an integrated UAV-UGV system that can autonomously navigate and collect visual data for construction monitoring applications. This system addresses most of the limitations of past studies (detailed further in Section 2.3, UAV-UGV collaborative systems). The critical aspects of the system are 1) localization of both the UGV and the UAV, 2) contextual awareness of the environment, 3) mapping of the surrounding environment, and 4) efficient path planning. Based on these aspects, the proposed system performs context awareness (via semantic image segmentation), localization, mapping, and control planning in real time, as illustrated in Fig. 1. To evaluate its performance, the system is deployed in an indoor, cluttered, construction-like environment (detailed further in Section 5, Experimental setup and results) for data collection purposes, demonstrating the feasibility of real-time performance for construction applications.


Background

The proposed system focuses on autonomous cooperation between a UGV and a UAV for navigation and indoor environment monitoring. Simultaneous Localization and Mapping (SLAM) and image segmentation algorithms are used for robot localization, environment mapping, and scene understanding. Therefore, this section provides background information on visual SLAM and scene understanding. Additionally, it delves into previous UAV-UGV cooperative systems and discusses their application, sensors for

Hardware description

The proposed integrated system consists of two custom-built unmanned and autonomous platforms, a ground vehicle and a blimp. Mounted in the center of the UGV's chassis is a laptop with the following specifications to carry out most of the necessary calculations of all modules: 16 GB DDR3 RAM, Intel Core i7-4710HQ quad-core Haswell processor, and NVIDIA GeForce GTX 960M. Two Raspberry Pis, one on the UGV and the other on the UAV, are used to control the actuators. The laptop and Raspberry Pis are

System architecture

This section describes the proposed integrated UAV-UGV system for autonomous data collection on construction sites. This multi-robot system uses ROS to integrate a variety of open-source packages. This architecture enables passing data between different modules, called nodes, across multiple computers through publishers (i.e., nodes that continually broadcast messages) and subscribers (i.e., nodes that receive messages from a publisher). Each of these modules has many capabilities
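The publisher/subscriber mechanism described above can be sketched in plain Python. Note that ROS itself provides this pattern (e.g., via `rospy.Publisher` and `rospy.Subscriber`); the broker class below is an illustrative stand-in with hypothetical names, not ROS code:

```python
from collections import defaultdict

class TopicBroker:
    """Minimal stand-in for the ROS topic layer: publishers broadcast
    messages on named topics; subscribers register callbacks per topic."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, msg):
        # Deliver the message to every callback registered on this topic.
        for callback in self._subscribers[topic]:
            callback(msg)

broker = TopicBroker()
received = []
# A node on the UGV subscribes to the UAV's camera stream.
broker.subscribe("/uav/camera/image_raw", received.append)
# A node on the UAV publishes a frame; the UGV callback fires on delivery.
broker.publish("/uav/camera/image_raw", "frame_0001")
```

In the actual system, this decoupling is what lets the UAV's sensor data be processed on the UGV's onboard computer: nodes only agree on topic names and message types, not on which machine runs which module.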

Experimental setup and results

The proposed system has been tested in the Constructed Facilities Lab (CFL) at North Carolina State University. This lab environment resembles an indoor construction site, as it contains stacks of construction materials and ongoing fabrication of structural components (e.g., columns, walls, bridge spans). Fig. 16 shows the test environments (A and B) and the corresponding 2D global map. Before the experiments, through an initial data collection, a comprehensive map of the construction site

Conclusion and future work

Over the past few years, autonomous UAVs and UGVs have gained significant popularity in the construction industry, namely for construction site surveying, existing-structure inspection, and work-in-progress monitoring. However, numerous open problems remain for further research, such as efficient autonomous navigation in cluttered, GPS-denied environments where some places are inaccessible to UGVs. To address this issue and increase the degree of automation through vision-based data

Declaration of competing interest

All authors confirm that there are no known conflicts of interest associated with this publication and there has been no significant financial support for this work that could have influenced its outcome.

Acknowledgments

We would like to thank all the students who were part of the ECE 592 class of Spring 2018 for their assistance with this project.

References (102)

  • K. Asadi et al.

    Vision-based integrated mobile robotic system for real-time applications in construction

    Autom. Constr.

    (2018)
  • Y.-H. Lin et al.

    The IFC-based path planning for 3D indoor spaces

    Adv. Eng. Inform.

    (2013)
  • T. Pire et al.

    S-PTAM: stereo parallel tracking and mapping

    Robot. Auton. Syst.

    (2017)
  • D. Gutierrez-Gomez et al.

    Dense RGB-D visual odometry using inverse depth

    Robot. Auton. Syst.

    (2016)
  • L.S. Pheng et al.

    Managing Productivity in Construction: JIT Operations and Measurements

    (2018)
  • A. Więckowski

    “JA-WA” - A wall construction system using unilateral material application with a mobile robot

    Autom. Constr.

    (2017)
  • A. Mirjan et al.

    Building a Bridge with Flying Robots

    (2016)
  • K. Asadi et al.

    Vision-based obstacle removal system for autonomous ground vehicles using a robotic arm

  • Y. Ham et al.

    Visual monitoring of civil infrastructure systems via camera-equipped Unmanned Aerial Vehicles (UAVs): a review of related works

    Vis. Eng.

    (2016)
  • J. Park et al.

    A BIM and UWB integrated mobile robot navigation system for indoor position tracking applications

    Journal of Construction Engineering and Project Management

    (2016)
  • K. Asadi et al.

    Building an Integrated Mobile Robotic System for Real-Time Applications in Construction

  • K. Asadi et al.

    Real-time image localization and registration with BIM using perspective alignment for indoor monitoring of construction

    J. Comput. Civ. Eng.

    (2019)
  • Yuneec, Typhoon H Drones (Last accessed: 01/07/2020),...
  • Parrot, Discover Parrot's FPV drones (Last accessed: 01/07/2020),...
  • DJI, Drone solution for a new generation of work (Last accessed: 01/07/2020),...
  • Y. Fang et al.

    Case study of BIM and cloud-enabled real-time RFID indoor localization for construction management applications

    J. Constr. Eng. Manag.

    (2016)
  • P. Liu et al.

    A review of rotorcraft unmanned aerial vehicle (UAV) developments and applications in civil engineering

    Smart Struct. Syst.

    (2014)
  • M. Blösch et al.

    Vision based MAV navigation in unknown and unstructured environments

  • M. Zollhöfer et al.

    State of the art on 3D reconstruction with RGB-D cameras

    Comput. Graphics Forum

    (2018)
  • C. Forster et al.

    SVO: fast semi-direct monocular visual odometry

  • C.D. Herrera et al.

    DT-SLAM: deferred triangulation for robust SLAM

  • J. Engel et al.

    LSD-SLAM: large-scale direct monocular SLAM

  • R. Mur-Artal et al.

    ORB-SLAM: a versatile and accurate monocular SLAM system

    IEEE Trans. Robot.

    (2015)
  • R. Mur-Artal et al.

    ORB-SLAM2: an open-source SLAM system for monocular, stereo, and RGB-D cameras

    IEEE Trans. Robot.

    (2017)
  • D. Gálvez-López et al.

    Bags of binary words for fast place recognition in image sequences

    IEEE Trans. Robot.

    (2012)
  • C. Kerl et al.

    Dense visual SLAM for RGB-D cameras

  • T. Qin et al.

    VINS-Mono: a robust and versatile monocular visual-inertial state estimator

    IEEE Trans. Robot.

    (2018)
  • T. Schneider et al.

    Maplab: an open framework for research in visual-inertial mapping and localization

    IEEE Robot. Autom. Lett.

    (2018)
  • F. Endres et al.

    3-D mapping with an RGB-D camera

    IEEE Trans. Robot.

    (2014)
  • M. Labbé et al.

    Appearance-based loop closure detection for online large-scale and long-term operation

    IEEE Trans. Robot.

    (2013)
  • A. Hornung et al.

    OctoMap: an efficient probabilistic 3D mapping framework based on octrees

    Auton. Robot.

    (2013)
  • M. Labbé et al.

    RTAB-Map as an open-source lidar and visual simultaneous localization and mapping library for large-scale and long-term online operation

    J. Field Rob.

    (2019)
  • A.G. Howard et al.

    MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications

    (2017)
  • A. Paszke et al.

    ENet: A Deep Neural Network Architecture for Real-Time Semantic Segmentation

    (2016)
  • K. Asadi et al.

    LNSNet: lightweight navigable space segmentation for autonomous robots on construction sites

    Data

    (2019)
  • K. Asadi, P. Chen, K. Han, T. Wu, E. Lobaton, Real-Time Scene Segmentation Using a Light Deep Neural Network...
  • Y.U. Cao et al.

    Cooperative mobile robotics: antecedents and directions

    Auton. Robot.

    (1997)
  • B. Arbanas et al.

    Decentralized planning and control for UAV-UGV cooperative teams

    Auton. Robot.

    (2018)
  • A. Downs et al.

    Registration of range data from unmanned aerial and ground vehicles

  • C. Forster et al.

    Air-ground localization and map augmentation using monocular dense reconstruction
