Review

Field Robots for Intelligent Farms—Inhering Features from Industry

Centre for Automation and Robotics (UPM-CSIC), Arganda del Rey, 28500 Madrid, Spain
* Author to whom correspondence should be addressed.
Agronomy 2020, 10(11), 1638; https://doi.org/10.3390/agronomy10111638
Submission received: 1 September 2020 / Revised: 16 October 2020 / Accepted: 22 October 2020 / Published: 24 October 2020

Abstract

Estimations of world population growth urgently require improving the efficiency of agricultural processes, as well as improving safety for people and environmental sustainability, which can be opposing characteristics. Industry is pursuing these objectives by developing the concept of the “intelligent factory” (also referred to as the “smart factory”), and, by studying the similarities between industry and agriculture, we can exploit the achievements attained in industry for agriculture. This article focuses on studying those similarities regarding robotics to advance agriculture toward the concept of “intelligent farms” (smart farms). Thus, this article presents some characteristics that agricultural robots should gain from industrial robots to attain the intelligent farm concept regarding robot morphologies and features as well as communication, computing, and data management techniques. The study, restricted to robotics for outdoor farms because robotics for greenhouse farms deserves a specific study, reviews different structures for robot manipulators and mobile robots along with the latest techniques used in intelligent factories to advance the characteristics of robotics for future intelligent farms. This article determines similarities, contrasts, and differences between industrial and field robots and identifies some techniques proven in industry with extraordinary potential for use on outdoor farms, such as those derived from methods based on artificial intelligence, cyber-physical systems, the Internet of Things, Big Data techniques, and cloud computing procedures. Moreover, different types of robots already in use in industry and services are analyzed and their advantages for agriculture reported (parallel, soft, redundant, and dual manipulators) as well as ground and aerial unmanned robots and multi-robot systems.

1. Introduction

According to the Food and Agriculture Organization (FAO), the world’s population is expected to reach 9.6 billion by 2050. Feeding this huge population is widely considered one of humanity’s greatest outstanding challenges. Cultivated land is close to its maximum in developed countries, and as predicted by the European Agricultural Machinery Association (CEMA) [1], the body representing the European agricultural machinery industry, food production must increase by 70% to successfully feed the human population circa 2050. This mission demands more efficient infrastructure, farms, and production devices capable of preserving resources in a sustainable, environmentally friendly, and cost-effective manner.
The precision farming concept, which consists of assembling different methods and techniques to manage variations in the field to increase crop productivity, improve business profitability, and ensure eco-environmental sustainability, has provided some significant solutions. After more than three decades of development, the basic technologies on which precision farming was constructed are becoming mature enough to aid in accomplishing this mission. Figure 1 illustrates some of these techniques along with their connections, distinguishing those based on information and communication technologies (ICT) from those that rely on field robotics [2,3,4,5,6,7].
Currently, agricultural activities involving robotics exhibit a high degree of technology and are capable of performing autonomous tasks. Most of these tasks are related to harvesting and weeding, followed by disease detection and seeding, according to a recent study on research and commercial agricultural robots for field operations [8]. For example, fertilization-spreading tasks can be executed autonomously once the appropriate implement tanks have been filled with fertilizer and attached to fueled autonomous vehicles, but only until the system runs out of fertilizer or fuel. Then, human operators have to participate in refilling and refueling/recharging. The same concept applies to planting and spraying.
Furthermore, harvesting systems must offload the yield when their collecting tanks are full, and this operation is mostly performed by operators. Similarly, the attachment of agricultural tools, and tool interchanging, requires the involvement of operators. However, these manual operations are amenable to automation, as similar operations have already been automated in industry. Another step forward for agriculture is therefore to relegate operators to mere supervisors by combining these types of activities with other already automated farm management activities, organizing a fully automated system that approaches the model of the fully automated factory, where raw materials enter and finished products leave with no human intervention. A fully automated farm can thus be seen as an agricultural field where materials (seeds, fertilizers, herbicides, etc.) enter and crops leave with no human intervention. Moreover, this parallelism can be extended so that farms closely resemble the intelligent factory model. This idea is the intelligent farm concept: an upper layer over the fully automated farm that builds a completely connected and flexible system that [9] optimizes system performance through a wider network, learns from new conditions in real or quasi-real time, adapts the system to new working conditions, and performs whole production processes autonomously.
An intelligent farm is founded on autonomous decision making [10] to guarantee asset efficiency; improve product quality, product safety, and environmental sustainability; decrease production costs; minimize delivery time to consumers; increase market share; enhance profitability; and maintain the labor force. To achieve the intelligent farm, many of the systems and components currently used in agriculture will require design modifications and supplementary improvements, especially robotic systems. Therefore, this article analyzes the similarities, contrasts, and differences between industrial and field robots and focuses on presenting some characteristics that agricultural robots should inherit from a broad classification of industrial robots (parallel, soft, redundant, and dual manipulators) to achieve the intelligent farm concept, considering not only robot morphologies and features but also communication, IoT sensing [11], computing and data management methods [12], and cyber-physical techniques [13]. Robots for greenhouse farms merit a specific review and are out of the scope of this study.
This article is organized as follows. Section 2 compares the activities in intelligent factories with activities carried out on farms. Then, Section 3 states the techniques on which intelligent factories rely as an approach to the intelligent farm concept. Section 4, Section 5, Section 6 and Section 7 present the status of robotic systems applicable to agriculture: manipulators, unmanned ground robots, unmanned aerial robots, and multi-robot systems. Finally, Section 8 presents some conclusions.

2. Intelligent Factories versus Intelligent Farms

2.1. Similarities

The conceptual translation of the automated factory concept into the automated farm can be approached by enumerating the similarities and dissimilarities between the two scenarios. For example, factories and farms share the following similarities:
Factories use robot manipulators that use different tools to perform dissimilar tasks. These tool changes are mainly made automatically. Similarly, farm vehicles have to change agricultural implements to perform different functions; such implement substitutions should be performed automatically.
Factory manipulators cooperate and collaborate to accomplish tasks. Similarly, mobile robots operating on farms could cooperate and collaborate to complete missions.
Factories have simple fixed sensor networks to measure different magnitudes. Likewise, fixed sensor networks could be installed at strategic measuring points in some specific farm areas.
Both factories and farms can capitalize on wireless technologies for communications.
For both factories and farms (if a network is available), everything can be controlled by a central controller (factory/farm management system).

2.2. Contrasts

However, factories and farms present the following slight contrasts:
Factories use equipment such as manipulators and computer numerical controlled machines that are placed at fixed positions to sense and perform operations on goods and materials supplied through assembly lines (process lines) or mobile robots. In contrast, on farms, goods and materials are often at fixed positions, while the machinery must be moved using autonomous vehicles.
Factories have complex sensors and equipment (e.g., vision systems and laser cutting tools) at fixed points in the production lines. On farms, complex sensors and equipment (e.g., vision systems and nitrogen detectors) must be moved to the appropriate measuring points on the farm.
Factories use both wired and wireless communication networks to send data to the central decision-making system. On farms, field vehicles must be equipped with innovative wireless communication systems to send collected data to the decision-making system.
Energy and material supplies differ between the two settings. In factories, materials are supplied through belts or robots, and energy is delivered by cables (electric) or hoses (hydraulic, pneumatic); on farms, energy (fuel/charge) and agricultural materials (seeds, fertilizers, pesticides, etc.) must be supplied at refueling/refilling stations. These dissimilarities pose challenges that must be addressed. Regarding refueling, it is worth mentioning that the trend in many countries is toward electric vehicles. For example, the United Kingdom has begun replacing London’s fleet of combustion-engine taxis with electrically driven ones, while Norway and France plan to end sales of diesel and gasoline models in 2025 and 2040, respectively. Thus, the intelligent farm concept should consider this trend and shift from refueling to recharging.

2.3. Strong Differences

Furthermore, factory and farm scenarios differ widely in the environment, the targets, and the duty cycles as follows:
Environment: Robotic applications in industry are concerned with repetitive, precise, and scheduled actions to be executed in steady, well-defined, and structured environments, while robotic applications in agriculture must address dynamic, complex, and unstructured environments and crops that feature rapid changes with position and time [14]. For example, terrain and vegetation characteristics as well as light conditions and visibility vary continuously; these variations can range from millimeters to kilometers and from seconds to months. In addition, these identifiable characteristics are imprecise and exhibit intrinsic uncertainty [15]. Furthermore, working conditions in agriculture are severe; for example, wind, dust, rain, extreme temperature, humidity, and vibrations are uncontrolled factors, whereas working conditions in industrial factories are easier to maintain within proper thresholds. Thus, new technologies are needed to overcome severe conditions on farms.
Targets: Tasks for mobile robots in agriculture can be divided into treatment/actuation tasks (e.g., planting, spraying, fertilizing, and cereal harvesting) and manipulation tasks (e.g., fruit and vegetable harvesting). To fulfill treatment/actuation tasks, a mobile robot must feature positional accuracy and leveling (if the implement is assumed to be rigidly attached to the mobile platform). In this case, the positional accuracy provided by commercial differential GPS (DGPS) systems is usually enough. However, for fruit harvesting, the mobile platform needs only rough positioning, but the manipulators, in conjunction with the pertinent sensors, must be capable of accurate positioning.
Many agricultural tasks involve objects such as plants or fruits that are soft, vary widely in shape and size, and are extremely sensitive to environmental physical conditions, primarily temperature, humidity, pressure, and wind [16]. In contrast, in many industrial tasks, robots manipulate rigid, nondeformable objects with static physical features that are easy to grasp and handle. Thus, agriculture demands complex sensors and intelligent manipulation systems that provide gentle, precise, and usually complex handling maneuvers to maintain quality. These characteristics have hindered the introduction of robotic systems to replace humans in tasks that require manipulating agricultural products, and these tasks are often still handled manually [17]. However, manual labor can comprise 40% of the total cost, and labor remains the largest single cost contributor in agriculture [18]; this cost makes the introduction of robotic manipulators economically attractive and will incidentally accelerate the introduction of mobile robots into agriculture.
Duty cycle: Additionally, achieving a level of automation for farms as high as that found in factories is difficult because of the seasonality of agricultural tasks. Mobile robots can be adapted to several tasks over an entire year; however, these robots will never achieve the nonstop production achieved by industrial production lines.
Considering the similarities and dissimilarities between factories and farms and the needed improvements for farms to achieve a fully automated farm status, the next step to configure an intelligent farm is to follow the paradigm of an intelligent factory.

3. Approach to the Intelligent Farm Concept

Intelligent factories depend on the strong combination of five concepts: (1) artificial intelligence, (2) cyber-physical systems (CPSs), (3) the Internet of Things (IoT), (4) Big Data, and (5) cloud computing. Intelligent farms should also be based on these principles to reduce the traditional delays in applying new technologies to agriculture with respect to their application in industry. These concepts are introduced in the following paragraphs.

3.1. Artificial Intelligence

Currently, there is no widely accepted definition of artificial intelligence (AI) [19], but “the capability of a machine (usually a computer) to imitate intelligent human behavior” gives insight into the meaning of the term. AI is already used intensively in robotics, specifically in the navigation of mobile robots [20], and it also has potential in factories: quality checking, failure prediction, predictive maintenance, digital twins, analysis of environmental impact, use of data, etc., which can be applied on farms for similar purposes. A thorough review of AI in factories is summarized in [21].

3.2. Cyber-Physical Systems

A cyber-physical system is the integration of computation with physical processes: embedded hardware and communication networks monitor and control the physical processes in closed loop [22]. Therefore, a CPS consists of the process itself, hardware, and software. The embedded hardware forms the physical device, and the software is considered the virtual device. The physical device comprises sensors, controllers, computers, data acquisition devices, the communication network, etc. The virtual device consists of mathematical models that represent the behavior of the physical device and the relevant control algorithms (see Figure 2).
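The closed loop that couples the virtual device (models and control algorithms) to the physical device can be sketched with a toy example. In the following Python snippet, the dynamics, parameter values, and names are entirely illustrative: a simulated soil-moisture process stands in for the physical side, and a proportional control law for the virtual side.

```python
from dataclasses import dataclass

@dataclass
class SoilPlant:
    """Physical device: a toy first-order soil-moisture process."""
    moisture: float = 0.20  # volumetric water fraction (illustrative)

    def step(self, irrigation: float) -> float:
        # irrigation adds water; evaporation drains it proportionally
        self.moisture += 0.1 * irrigation - 0.02 * self.moisture
        return self.moisture

def controller(setpoint: float, measurement: float, kp: float = 2.0) -> float:
    """Virtual device: a proportional control law closing the loop."""
    return max(0.0, kp * (setpoint - measurement))

# closed loop: sense -> decide -> actuate, as in a CPS
plant = SoilPlant()
for _ in range(200):
    plant.step(controller(0.35, plant.moisture))
```

Pure proportional control settles with a small steady-state offset below the setpoint; in a real CPS, this loop would be closed over the monitoring and communication network sketched in Figure 2.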

3.3. The Internet of Things

The IoT is a computing concept, not yet precisely defined, that describes how physical objects can be connected to the Internet and identified by other devices. This communication is made through sensor and wireless technologies as well as RFID (radio frequency identification) and QR (quick response) codes. In CPSs, the physical devices are connected through the IoT, while the virtual devices are connected through the Internet, as illustrated in Figure 2 [13].

3.4. Big Data

Although there is no unified definition of Big Data, the term refers to the enormous amount of data currently available in many disciplines, which makes efficient analysis with conventional data processing techniques difficult. These data can be in the form of texts, geometries, images, videos, sounds, and geospatial information and are (1) available on the Internet, (2) provided by public institutions, or (3) collected with mechanisms based on the IoT. Big Data encompasses collecting, storing, analyzing, searching, sharing, transferring, and visualizing process data to obtain relevant information [12,13,23].

3.5. Cloud Computing

Managing big datasets requires special computing and storage tools, and cloud computing techniques provide convenient, fast, economical, and secure mechanisms for doing so. Cloud computing is defined by the National Institute of Standards and Technology as “a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction” [24].
A cloud computing solution can be implemented as (1) a public cloud where the access is provided via interfaces in a pay-per-use procedure; (2) a private cloud, which is similar to an organization’s Intranet; and (3) a hybrid cloud, which combines public and private cloud features [25].
Cloud providers operate hardware and software computing infrastructures to provide services to demanding users over the Internet offering three types of services: software as a service (SaaS), platform as a service (PaaS), and infrastructure as a service (IaaS) [25].
Figure 3 sketches the concepts of an intelligent factory (left) and an intelligent farm (right), showing the common elements (center).

4. Manipulators for Agriculture

A manipulator is an electromechanical device consisting of several rigid bodies connected by joints to form an open (serial) chain (see Figure 4a). One end of the chain, the base, is attached to a structure (the ground, a mobile platform, etc.), whereas the other end (the end effector or tool) can be positioned and oriented in a working volume by controlling the position of each joint. End effectors include grippers, devices (sprayers, dispensers, etc.), and sensors (humidity, temperature, nitrogen, etc.). Industrial manipulators were deployed in industry around 1961, and just a few years later (1968), the first robot manipulator for agriculture was proposed to solve harvesting problems [26]. Since then, many solutions have been used in industry and tested in agriculture, including parallel structures, soft manipulators, redundant manipulators, and dual arms.

4.1. Arms and End Effectors

Normally in agriculture, an arm manipulator is used to move its end effector to a position at a given point with the required orientation to interact with an object. This interaction can be by contact (e.g., fruit recollection and mechanical weeding) or noncontact (e.g., spraying applications, sensing). The type of end effector depends on the type of interaction: grippers for fruit recollection, rotating hoes for mechanical weeding, sprayers for herbicide application, cameras, nitrogen detectors for measuring, etc.
Manipulators are mainly characterized according to their number of independent motions (degrees of freedom (DOF)), types of joints (rotary, prismatic), and mechanical structure, defined by the combination of the types of joints used and their relative positions and orientations, e.g., articulated, Cartesian, polar, cylindrical, and gantry. A manipulator can be adapted to several agricultural tasks by simply using the appropriate end effector. Sometimes, commercial industrial manipulators are adapted to agricultural needs, but for many applications, a specific manipulator design is required.
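The mapping from joint positions to the end-effector pose is the manipulator’s forward kinematics. As a minimal illustration (the link lengths here are arbitrary), consider a planar articulated arm with two rotary joints:

```python
import math

def fk_planar(theta1: float, theta2: float, l1: float = 1.0, l2: float = 0.8):
    """Forward kinematics of a 2-DOF planar articulated arm: the two joint
    angles (rotary joints) determine the end-effector position in the plane."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y
```

For instance, fk_planar(0, 0) places the fully stretched arm at (1.8, 0); each additional joint or prismatic axis adds one DOF and one term to this mapping.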
Conventional manipulator structures adapted to agriculture lack sufficient speed, especially in harvesting operations, and the ability to interact with the environment; e.g., when reaching for a piece of fruit, an arm can crash into branches or other pieces of fruit. The speed issue can be avoided using nonconventional robotic structures such as parallel delta structures, while the problem of interaction with the environment can be solved using redundant structures or soft mechanisms.

4.2. Parallel Manipulators

Unlike a serial manipulator, a parallel robot has its end effector connected to its base by several (usually three to six) independent linkages working in parallel. The term “parallel” refers to the topology rather than geometry and does not mean that links are parallel to each other [27]. Figure 5a,b illustrates two basic schemes for this structure. Parallel robots feature lower inertia and higher precision, payload capacity, acceleration, speed, and stiffness than serial manipulators; nevertheless, the workspaces of these structures are smaller and the control is more complex, which strongly limit the potential number of applications [28].
A 6-DOF parallel manipulator can control the position and orientation of the end effector. When the linkages of a 6-DOF parallel manipulator are based on prismatic linkages, the structure is known as the Stewart platform (see Figure 5a). A lower number of DOF restricts the orientation of the end effector, and for 3-DOF, the end effector can only be positioned. A special parallel structure, known as the delta manipulator (see Figure 5b,c), is based on articulated linkages that use parallelograms to restrict the motion of the end effector to pure translation [29]. This system is mainly used as a 3-DOF manipulator; however, some versions with more than 3 DOF have already been developed.
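Although the forward kinematics of a Stewart platform is difficult, its inverse kinematics is notably simple: each prismatic leg length is just the distance between its base anchor and the corresponding platform anchor under the commanded pose. The sketch below uses an illustrative hexagonal anchor layout:

```python
import numpy as np

def stewart_leg_lengths(base_anchors, platform_anchors, p, R):
    """Inverse kinematics of a Stewart platform: given the platform pose
    (position p, rotation matrix R), each prismatic leg length is the
    distance from its base anchor to the transformed platform anchor."""
    world_anchors = (R @ platform_anchors.T).T + p
    return np.linalg.norm(world_anchors - base_anchors, axis=1)

# hexagonal anchor layout (coordinates are illustrative)
ang = np.deg2rad(np.arange(0, 360, 60))
base = np.column_stack([np.cos(ang), np.sin(ang), np.zeros(6)])
plat = base.copy()

# platform raised 1 m with no rotation: every leg is exactly 1 m long
lengths = stewart_leg_lengths(base, plat, np.array([0.0, 0.0, 1.0]), np.eye(3))
```

This closed-form inverse mapping is one reason parallel platforms achieve high stiffness and precision, even though their overall control remains more complex than that of serial arms.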
Because of their high speed, delta structures are used for pick-and-place operations in industry; therefore, delta manipulators could be applied to harvesting fruits when trees are grown on trellises. There have been some attempts to test this type of manipulator in weed management using mechanical and chemical tools [30].
As opposed to tasks in industry, where objects normally come in a few different forms and sizes, objects in agriculture present a large variety of forms and dimensions. These features force us to use special grippers that integrate sophisticated sensors to handle that variability. In addition, objects in industry are normally rigid, whereas crops have a large variety of rigidities that are much lower than those of the mechanisms in contact with the crops. Some crops, especially high-value crops, are deformable products that have to be handled softly, which requires haptic grippers based on force and touch sensors.

4.3. Soft Manipulators

A practical solution for handling deformable products is the use of soft end effectors. Some fruits and vegetables can be picked with suction end effectors, e.g., apples and eggplants, but other products, e.g., grapes and lettuce, need to be handled gently and require a kind of human-hand emulator, not only to provide mechanical ability but also to detect touch and supply the right pressure. A soft gripper carried by a rigid manipulator that provides accuracy and force can be used in many harvesting tasks; however, for other tasks in which the manipulator can interact with products or branches, the manipulator itself needs to exhibit soft features and requires sensorimotor joint coordination to achieve the “soft” characteristics. In general, fruit and vegetable harvesting requires soft grippers and soft manipulators that also provide accuracy, robustness, and force. These attributes can be achieved by series elastic actuators [31], which incorporate elastic structures similar to those of humans. Soft end effectors and soft manipulators are sometimes referred to as soft robotics [32].
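The key property of a series elastic actuator is that a spring sits between the motor and the link, so the output torque is read, and commanded, through the spring deflection. A minimal sketch of this idea (the function names and values are hypothetical):

```python
def sea_output_torque(k_spring: float, theta_motor: float, theta_link: float) -> float:
    """Series elastic actuator: output torque equals the spring constant
    times the deflection between the motor side and the link side."""
    return k_spring * (theta_motor - theta_link)

def motor_setpoint_for_torque(k_spring: float, tau_desired: float, theta_link: float) -> float:
    """Invert the spring law: the motor position that yields tau_desired,
    turning force control into ordinary motor position control."""
    return theta_link + tau_desired / k_spring
```

Because contact force is proportional to a measurable deflection, a gripper built this way can bound its grasp force gently instead of crushing soft produce.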

4.4. Redundant Manipulators

The interaction of manipulators with crops and branches can be avoided by using redundant manipulators. An ordinary industrial manipulator has enough DOF to achieve any desired position and orientation (pose) in its end-effector workspace, which is limited because of the intrinsic mechanical constraints (link lengths and joint angles) or extrinsic obstacles encountered in the workspace. In contrast, a redundant manipulator has more DOF than needed to access its complete workspace. This attribute enables redundant manipulators to reach a given point in their workspaces, avoiding their joint limits and the surrounding obstacles in the workspaces. In addition, this structure is more robust with respect to mechanical and electronic joint failure. Figure 4a illustrates a redundant manipulator scheme and its versatility for avoiding obstacles in harvesting tasks. Most redundant manipulators in industry are based on 7-DOF structures, but 8-DOF manipulators [33] and 9-DOF [34] manipulators can be found in the literature for different applications. For a larger number of DOF, this type of manipulator is considered a hyperredundant manipulator. No doubt, this type of structure has great application potential in agriculture.
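The extra DOF of a redundant arm are typically exploited through the null space of the task Jacobian: joint motions projected into that null space change the arm’s posture (for instance, swinging an elbow away from a branch) without moving the end effector to first order. The numpy sketch below, with illustrative names and gains, shows one resolved-rate step for a planar 3-DOF arm performing a 2-DOF positioning task:

```python
import numpy as np

def fk_position(thetas, lengths):
    """End-effector position of a planar serial arm (absolute joint sums)."""
    cum = np.cumsum(thetas)
    return np.array([np.sum(lengths * np.cos(cum)),
                     np.sum(lengths * np.sin(cum))])

def jacobian_planar(thetas, lengths):
    """2 x n Jacobian of the planar arm's end-effector position."""
    n = len(thetas)
    cum = np.cumsum(thetas)
    J = np.zeros((2, n))
    for i in range(n):
        # joint i moves every link from i onward
        J[0, i] = -np.sum(lengths[i:] * np.sin(cum[i:]))
        J[1, i] = np.sum(lengths[i:] * np.cos(cum[i:]))
    return J

def redundant_step(thetas, lengths, dx, q_pref, k0=0.1):
    """One resolved-rate step: the pseudoinverse term tracks the desired
    end-effector motion dx, while the null-space projector nudges the
    joints toward a preferred posture q_pref (e.g., away from obstacles)
    without disturbing the end effector to first order."""
    J = jacobian_planar(thetas, lengths)
    J_pinv = np.linalg.pinv(J)
    N = np.eye(len(thetas)) - J_pinv @ J  # null-space projector
    return thetas + J_pinv @ dx + N @ (k0 * (q_pref - thetas))
```

Calling redundant_step with dx = 0 reshapes the arm toward q_pref while the end effector, for example one holding a piece of fruit, stays essentially fixed.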

4.5. Hyperredundant Manipulators and Continuum Manipulators

When a manipulator exhibits a very large degree of redundancy (much higher than 6 DOF), the manipulator is called hyperredundant. The border between the redundant and hyperredundant denominations is unclear, but a clear example of a hyperredundant manipulator, as shown in Figure 6a, is a prototype composed of 10 modules or sections with 3 DOF each (a total of 30 DOF) built in the mid-1990s [35]. Another example is the manipulator illustrated in Figure 6b and presented in [36]. In addition to the advantages mentioned for redundant manipulators, hyperredundant manipulators are capable of new forms of grasping by surrounding objects, with clear applications in agriculture.
There is a type of structure that consists of several sections characterized by a continuous and independent bending of each section. This structure can generate a sequence of smooth curves that place the end effector in the desired position. A manipulator based on this type of structure is known as a “continuum” manipulator, sometimes also referred to as hyperredundant, but most researchers prefer to consider it a different type of manipulator. A continuum manipulator resembles the motion of an elephant’s trunk, a swan’s neck, an octopus’s tentacles, or a snake, to mention a few examples. This structure has recently been used to configure a harvesting manipulator for agriculture [37]. The authors reported that the continuum mechanism worked properly for harvesting fruits, with no damage to fruits, leaves, and branches, but the structure needed to be improved to increase its payload.

4.6. Dual-Arm Manipulators

The idea of using two manipulators with shared workspaces was first developed in industry for the assembly of parts and components. Currently, that idea has been renewed, and such a device has been called a dual-arm manipulator. Robotics companies are introducing products in the industrial market [38,39], but research is ongoing for applications in the agricultural sector [18,40]. The advantages of using dual-arm manipulators can be found for (1) bimanual grasping, (2) grasping with one arm and cutting with the other arm, (3) handling deformable products, (4) moving away leaves and branches with one arm to obtain products with the other arm, etc. (see Figure 7).
Table 1 summarizes some attempts to use manipulator technology for agriculture.

5. Unmanned Ground Robots for Agriculture

Autonomous ground robots were initially built on new technologies fostered for military applications, which demonstrated important achievements in adapting military vehicles into autonomous robots. When those technologies became commercially available, especially GPS, they were used to convert conventional vehicles into autonomous ground vehicles, and autonomous tractors for agricultural applications then emerged. After demonstrating the capabilities of these agricultural robots, developers focused on designing agricultural vehicles following robotics criteria. The following paragraphs present the technologies used and the two trends in developing agricultural field robots.

5.1. Approach to UGVs for Agriculture

The commercial availability of the global navigation satellite system (GNSS) has provided easy ways to configure autonomous vehicles or navigation systems to assist drivers in outdoor environments, especially in agriculture, where many highly accurate vehicle steering systems have become available [46,47]. These systems aid operators in the precise guidance of tractors using LIDAR (light/laser detection and ranging) or GNSS technology but do not endow a vehicle or tool with any level of autonomy. Other critical technologies, such as safety systems responsible for detecting obstacles in the robots’ path and safeguarding humans and animals in the robots’ surroundings as well as preventing collisions with obstacles or other robots, must be incorporated to configure autonomous vehicles. Additionally, wireless communication between the robot and the operator and external servers (cloud, CPSs, and IoT technologies) will be critical to incorporate decision-making systems built on Big Data analysis as well as to expand decision processes into machine learning and artificial intelligence fields.
The technology required to deploy more robotic systems in agriculture is available today, as are the clear economic and environmental benefits of doing so. However, although agricultural machinery manufacturers have not missed the marketing potential of showing concepts [48,49], they remain unwilling to deploy fully robotic systems on the market. Therefore, additional hard work must be done by both researchers and private companies to discover new solutions, as encouraged by the Standing Committee on Agricultural Research [50]. In addition, farmers are normally reluctant to adopt new technologies; thus, emulation tools for new equipment will help introduce agricultural robotic systems [51].
Autonomous systems used for precision agriculture tasks (see Figure 1) involve the coordination of the following devices (see Figure 8):
D1. Detection system: This sensor system provides data concerning soil, crops, and weed management;
D2. Autonomous robots: Each robot comprises a mobile platform, a robot controller, and a safety system;
D3. Agricultural tools: These tools act directly on soil/crops and can be based on physical (mechanical, thermal, etc.) and chemical (pesticides, fertilizers, etc.) principles; and
D4. Farm manager: This normally includes a decision support system, a planner, and a supervisor.
The relationships among these systems are illustrated in Figure 8 along with their working process, configured as a sequence of actions (Ai, i = 0 to 6). This process starts with system deployment, A0, and then the other actions are cyclically repeated until the conclusion of the agricultural task—field completion, A6. These actions are as follows:
A0. The system is deployed in the working field.
A1. The sensors onboard the multirobot system (detection system, D1) detect crop, soil, and environmental features in a small area in front of the robots.
A2. The collected data are sent, via the robot controllers, to the decision support system, which determines the subsequent commands for the robots and agricultural implements.
A3. Motion commands and intervention instructions are sent to the multirobot system, which follows the planned trajectories; the robots move to the supplied positions.
A4. The robots send the pertinent commands to the agricultural tools to perform the treatment.
A5. The treatment is applied; the process then repeats from action A1 or exits if the task is complete.
A6. The task is complete.
Robotics for agriculture applies to both manipulators (D3—agricultural tools) and mobile robots (D2—autonomous robots). The former are important devices mainly for harvesting on the basis of the structures presented in Section 4; the latter are required for almost all agricultural tasks, and the robot structures are strongly dependent on the application environment.
The development of autonomous ground vehicles for agriculture has focused on two approaches: (1) the automation of conventional vehicles and (2) the design of specific robot structures.

5.2. Automation of Conventional Vehicles

The first approach consisted of retrofitting commercial agricultural tractors with (1) actuators to manage a vehicle’s throttle and steering; (2) sensors to obtain information about the vehicle’s environment and self-position; and (3) computers to manage the information obtained by the sensors (including self-localization), generate a path (planning), and follow the path by controlling the actuators (steering).
These components enable vehicles to navigate autonomously. In addition, the computer also manages communication with operators and external servers and controls the diverse elements of the agricultural implements carried or towed by the vehicle. Table 2 summarizes some of the autonomous vehicles built by following this first approach.
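As one concrete example of the path-following step in such a retrofitted vehicle, steering controllers are often based on pure-pursuit geometry; the sketch below computes a front-wheel steering angle from the vehicle pose and a look-ahead point. The wheelbase value is illustrative, not tied to any particular tractor.

```python
import math

def pure_pursuit_steering(pose, target, wheelbase=2.5):
    """Steering angle (rad) driving a car-like vehicle from
    `pose` = (x, y, heading) toward a look-ahead `target` = (x, y)."""
    x, y, theta = pose
    # Express the target in the vehicle frame (rotate by -heading)
    dx, dy = target[0] - x, target[1] - y
    lx = math.cos(-theta) * dx - math.sin(-theta) * dy
    ly = math.sin(-theta) * dx + math.cos(-theta) * dy
    ld2 = lx * lx + ly * ly          # squared look-ahead distance
    if ld2 == 0:
        return 0.0
    curvature = 2.0 * ly / ld2       # pure-pursuit arc curvature
    return math.atan(wheelbase * curvature)
```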

5.3. Design of Specific Robot Structures

The second approach to configuring autonomous mobile robots for agriculture relies on the design of specific robot structures. This design considers the different techniques used in the development of general ground mobile robots, which use wheels, tracks, and legs as locomotion systems.
Recently, some hybrid locomotion systems based on wheels and legs (wheel-legged robots) have appeared in the literature, exhibiting interesting characteristics. Independently of the locomotion structure used, a robot also has to include elements to sense, act, and control the robot's subsystems, as indicated in the description of the retrofitting of conventional vehicles (actuators, sensors, and computers). Figure 9 illustrates the main mobile platform structures and their steering schemes, and Table 3 summarizes some autonomous vehicles built as specific structures.
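The main wheeled steering schemes can be contrasted through their kinematic models; the sketch below gives the basic velocity relations for a differential/skid-steer platform and an Ackermann (car-like) platform. The track and wheelbase values are illustrative.

```python
import math

def diff_drive_twist(v_left, v_right, track=1.2):
    """Linear (m/s) and angular (rad/s) velocity of a differential or
    skid-steer platform from its wheel speeds; `track` is the lateral
    wheel separation in metres."""
    v = (v_left + v_right) / 2.0
    omega = (v_right - v_left) / track
    return v, omega

def ackermann_omega(v, steer_angle, wheelbase=2.5):
    """Yaw rate (rad/s) of an Ackermann platform moving at v (m/s)
    with front-wheel steering angle in radians (bicycle model)."""
    return v * math.tan(steer_angle) / wheelbase
```

Differential platforms can turn in place (opposite wheel speeds), whereas the Ackermann model has a minimum turning radius set by the maximum steering angle, which is one reason independent-steering and wheel-legged designs offer better maneuverability between crop rows.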

6. Unmanned Aerial Robots for Agriculture

An aerial robot is an aircraft with no human pilot on board. This type of robot can be remotely operated by a human operator or autonomously operated under onboard computer control (unmanned aerial vehicle (UAV)). These devices, popularly known as drones, are built in a variety of types defined by the technology used to fly them, e.g., fixed-wing (planes), single-rotor helicopters, hybrid systems (vertical takeoff and landing), and multirotors. Although drones were originally developed for military applications, their use has rapidly expanded to other fields, including surveillance, aerial photography and video, delivery of goods, weather forecasting, traffic forecasting, and agriculture, where drones are being used to support spatial data collection to inform further policies and decisions [73].
Multirotor technology has been the most widely used drone technology over the last decade. This type of vehicle is lifted and propelled by four (quadrotor) or six (hexarotor) rotors and uses a control system that balances the thrust of each rotor to control the drone's lift and its yaw, pitch, and roll angles. In this manner, the controller can produce stable flight [74].
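The thrust-balancing step can be sketched as a mixer that maps a collective thrust command and roll/pitch/yaw corrections to the four rotor outputs of a quadrotor. The sign conventions below correspond to one possible X-configuration and are illustrative only.

```python
def quadrotor_mix(thrust, roll, pitch, yaw):
    """Maps collective thrust and roll/pitch/yaw commands to four rotor
    outputs of an X-configured quadrotor (signs illustrative).
    Rotors: 0 front-left, 1 front-right, 2 rear-right, 3 rear-left."""
    return [
        thrust + roll + pitch - yaw,   # front-left  (CW)
        thrust - roll + pitch + yaw,   # front-right (CCW)
        thrust - roll - pitch - yaw,   # rear-right  (CW)
        thrust + roll - pitch + yaw,   # rear-left   (CCW)
    ]
```

Note that the roll, pitch, and yaw terms cancel in the sum, so attitude corrections redistribute thrust among the rotors without changing the total lift.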
The use of multirotor drones in agriculture has increased drastically in recent years, and several specific designs have been developed [61,74] (see Figure 10). Equipped with the appropriate sensors (vision, infrared, multispectral, and hyperspectral cameras, etc.), drones allow farmers to obtain data (vegetation, leaf area, and reflectance indexes) from their fields to study subtle changes in crops that cannot be detected by scouting on the ground. These data allow farmers to infer crop diseases, pests, nutrient deficiencies, water stress or excess, and other circumstances that could affect crop growth. With this information, farmers can plan possible remedies (irrigation, fertilization, weed control, etc.). Drones will certainly be used intensively on intelligent farms and must be designed to incorporate the new technologies being adopted in intelligent factories.
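A typical reflectance index derived from such multispectral imagery is the normalized difference vegetation index (NDVI), computed per pixel from the near-infrared and red bands; the sketch below is a minimal scalar version.

```python
def ndvi(nir, red):
    """Normalized difference vegetation index from near-infrared and
    red reflectances (each in 0-1).  Values near 1 indicate dense,
    healthy vegetation; values near 0 indicate bare soil."""
    denom = nir + red
    return (nir - red) / denom if denom else 0.0
```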

7. Multirobot Systems and Fleets

As discussed above, many research efforts have developed specifically designed autonomous applications for agriculture [58,75,76], and many other efforts are aimed at operating groups of vehicles under unified control. This approach embodies the concept of multirobot systems, which constitute a forward step in automating agricultural activities. Multirobot systems use several mobile robots to execute different tasks cooperatively and collaboratively. Using groups of robots, in which every robot interacts with others to achieve a well-defined objective, is an emerging and necessary concept for applying autonomous systems to daily agricultural tasks [77,78,79,80].
The theoretical foundations of multirobot systems or robot fleets have been studied in recent years [81,82], but the first experimental fleets were tested only recently; no commercial equipment is available yet. The RHEA system was one of the first attempts to build a fleet of robots for agriculture [61] and focused on the design, development, and testing of a new generation of automatic and robotic systems for effective chemical and physical weed management (see Figure 11). The fleet consisted of both aerial and ground vehicles equipped with a machine vision system to detect weeds. The ground robots were also equipped with agricultural implements for treatment applications. The RHEA system showed that multirobot systems enable the use of small vehicles that accomplish work volumes similar to those of a large machine but provide the following additional benefits:
more accurate positioning during operation;
an intrinsically lighter weight than conventional machines, which reduces soil compaction and improves vehicle safety for people, crops, and the robot itself—all of which are features currently requested for agricultural equipment [83];
easier acquisition because the multirobot approach allows farmers to acquire high-technology equipment in an incremental manner;
increased fault tolerance—a failure in a small robot means one less robot at work, while a failure in a large vehicle halts the entire field operation; and
mission coordination and reconfiguration—at any time, the fleet behavior can be changed to optimize the mission or accommodate sudden changes in field conditions.
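The fault-tolerance and reconfiguration benefits above can be illustrated with a minimal task-allocation sketch: field rows are assigned greedily to the least-loaded robot, and re-running the allocation after removing a failed robot replans the remaining work. Row identifiers, lengths, and robot names are hypothetical.

```python
def assign_rows(rows, robots):
    """Greedy allocation of field rows to robots.  `rows` is a list of
    (row_id, length) pairs; each row goes to the robot with the least
    accumulated workload.  Re-running after removing a failed robot
    redistributes its work among the remaining units."""
    load = {r: 0.0 for r in robots}
    plan = {r: [] for r in robots}
    # Longest rows first keeps the workloads balanced
    for row_id, length in sorted(rows, key=lambda r: -r[1]):
        r = min(robots, key=lambda rb: load[rb])
        plan[r].append(row_id)
        load[r] += length
    return plan
```

Real fleet managers add travel costs, implement compatibility, and deadlines, but the replanning property is the same: the mission is a function of the currently available robots.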
An important limitation of multirobot systems is that the total number of devices (e.g., sensors, actuators, and computers/controllers) increases according to the number of robots; thus, a large number of robots can result in complex and expensive systems that are attractive only when applied to high-value crops (e.g., grapes, peppers, and golf course turf).

8. Conclusions

8.1. Summary and Considerations

As presented in Section 2, intelligent factories and intelligent farms have many common difficulties and some common solutions based on AI, CPSs, the IoT, Big Data, and cloud computing, but there are specific aspects of intelligent farms, such as those related to robotization of tasks, that should be investigated independently.
Field robots for use in intelligent farms can be of two types: mobile robots, capable of moving throughout the working field, and manipulators, normally attached to mobile robots and capable of performing some types of actions on crops. Mobile robots can be unmanned ground robots or unmanned aerial robots, whereas manipulators can follow diverse techniques that provide different characteristics: (1) rigid manipulators; (2) soft manipulators; (3) parallel robots; (4) dual-arm manipulators; (5) redundant, hyperredundant, or continuum manipulators; etc.
Conventional rigid industrial manipulators can be applied to agriculture, but the use of soft manipulators and soft end effectors will be essential to manage deformable high-value crops. Additionally, dual-arm manipulators can also handle these crops properly and should be considered harvesting devices. This type of manipulator also has the potential to mimic human behavior by pushing leaves and branches aside to search for and reach crops. This activity of reaching the crops in environments with leaves and branches can also be undertaken using redundant manipulators. These structures, presented in Section 4.4, allow the manipulator to surround obstacles easily. When the redundancy is high, this type of manipulator is referred to as hyperredundant and is more versatile for surrounding obstacles than simple redundant manipulators. Finally, another type of manipulating structure capable of surrounding objects is the continuum structure, a specific concept that is sometimes considered a hyperredundant manipulator. Softness, duality, and redundancy are features to take into consideration for future intelligent farms, especially for harvesting high-value crops. For other types of crops whose productivities depend on harvesting speeds, parallel manipulators can be a practical solution. All these structures should be considered as future manipulators for intelligent farms. Table 4 summarizes the main features of some current manipulators as a starting point to develop new manipulators for intelligent farms.
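The obstacle-surrounding ability of redundant manipulators comes from the null space of the manipulator Jacobian: joint motions in that space reconfigure the arm (e.g., moving an elbow away from a branch) without moving the end effector. A minimal velocity-level sketch, using an arbitrary Jacobian rather than a specific robot model:

```python
import numpy as np

def redundant_ik_step(J, dx, q_dot_secondary):
    """One velocity-level IK step for a redundant manipulator: the
    pseudoinverse term tracks the end-effector velocity `dx`, while the
    null-space projector lets a secondary joint motion act without
    disturbing the end-effector task."""
    J_pinv = np.linalg.pinv(J)          # least-squares pseudoinverse
    n = J.shape[1]
    N = np.eye(n) - J_pinv @ J          # null-space projector of J
    return J_pinv @ dx + N @ q_dot_secondary
```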
Unmanned ground robots were first based on retrofitting commercial agricultural tractors, but a new trend is the development of specifically designed mobile platforms based on robotics principles. These new developments provide supplementary benefits regarding maneuverability, adaptability to crops, and adaptability to terrain. Mobile robots based on independent steering devices offer the best maneuverability, but wheel-legged structures can offer similar characteristics while improving adaptability to crops (narrow-row crops, wide-row crops, etc.) and terrain (sloped terrain, irregular terrain, etc.). Ground mobile robots will be essential for intelligent farms and should be developed using robotics principles while incorporating the knowledge gained in the development of conventional tractors.
The use of unmanned aerial robots in agriculture has exploded in recent years, although the maximum payload these robots can carry is a clear limitation. This restricts their use to sensing tasks and carrying portable sensors. Increasing the payload by increasing the number of rotors or the rotor efficiency will allow the use of more, and more sophisticated, sensors. Miniaturization of sensors and onboard equipment, always an ongoing activity, will also help to increase the use of UAVs. In any case, applications of UAVs for direct actuation on crops seem unlikely to become affordable with current technology in the near term. Nevertheless, some studies on applying treatments at discrete points, e.g., pesticide application on processionary moth nests, are being conducted [84].
Several small robots working collaboratively and cooperatively can (1) achieve the same work as a large vehicle but with lighter total weight, which reduces soil compaction; (2) improve maneuverability and safety because of the lower inertia of the robots; (3) increase robustness regarding fault tolerance because the system can still work with a few units out of service; and (4) replan the mission according to the status of the robots or the environment. These abilities, regardless of the robot types and robot structure, position multirobot systems as the main candidates for outdoor unmanned ground vehicles (UGVs) in intelligent farms [53,75]. On the basis of existing agricultural vehicles and prototypes, UGVs for deployment in intelligent farms should meet the characteristics presented in Table 5.
Manipulators and mobile robots also need to be resilient; efficient in resource management; user-friendly; and capable of enabling AI techniques, CPSs, the IoT, and cloud computing techniques to support services (SaaS, PaaS, and IaaS). Finally, vehicles in general are shifting from using combustion engines to using electric motors as traction systems; therefore, robots in future intelligent farms should be powered with batteries, as should the rest of the devices, including manipulators.

8.2. Final Remarks

The increase in the world population demands more efficient, safer, and more environmentally friendly production systems and processes. Industry started to pursue this global objective early in the 2000s through the concept of the "intelligent factory", where the production process is performed autonomously, optimizing system performance, learning from new conditions in real time, and adapting procedures to those new conditions. Agriculture, which has traditionally adopted technological advances later than industry, should follow the idea of the intelligent factory and develop the intelligent farm concept.
This article has analyzed the features that agricultural robots should inherit from industrial robots to accomplish the intelligent farm concept concerning robot structures and functionalities as well as communication, computing, and data management methods. The study focused on robotics for outdoor farms and analyzed different manipulator and mobile robot structures as well as the latest technologies used in intelligent factories, with the objective of advancing the characteristics of robotics for future intelligent farms.
This article determined similarities, contrasts, and differences between industrial and field robots and identified those techniques, already proven in industry, that exhibit the potential to be used in outdoor farms. These techniques derive from artificial intelligence, cyber-physical systems, the Internet of Things, Big Data techniques, and cloud computing procedures and methods. Furthermore, some systems already in use in industry and services, such as parallel, soft, redundant, and dual manipulators as well as ground and aerial unmanned robots and multirobot systems, were analyzed and their advantages in agriculture were reported.

Author Contributions

Conceptualization P.G.-d.-S. and R.F.; bibliographic review: manipulators, D.S. and E.N., unmanned robots, L.E. and M.A., multi-robots, P.G.-d.-S.; writing—original draft preparation, P.G.-d.-S.; writing—review and editing R.F. and M.A.; supervision P.G.-d.-S. All authors have read and agreed to the published version of the manuscript.

Funding

The research leading to these results has received funding from (i) FEDER/Ministerio de Ciencia, Innovación y Universidades—Agencia Estatal de Investigación/Proyecto ROBOCROP (DPI2017-84253-C2-1-R) and (ii) RoboCity2030-DIH-CM Madrid Robotics Digital Innovation Hub (“Robótica aplicada a la mejora de la calidad de vida de los ciudadanos. Fase IV”; S2018/NMT-4331), funded by “Programas de Actividades I+D en la Comunidad de Madrid” and co-funded by Structural Funds of the EU.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. CEMA. European Agricultural Machinery. Available online: https://www.cema-agri.org/ (accessed on 2 July 2020).
2. Pflanz, M.; Nordmeyer, H.; Schirrmann, M. Weed Mapping with UAS Imagery and a Bag of Visual Words Based Image Classifier. Remote Sens. 2018, 10, 1530.
3. Deng, J.; Zhong, Z.; Huang, H.; Lan, Y.; Han, Y.; Zhang, Y. Lightweight Semantic Segmentation Network for Real-Time Weed Mapping Using Unmanned Aerial Vehicles. Appl. Sci. 2020, 10, 7132.
4. Hu, J.; Wang, T.; Yang, J.; Lan, Y.; Lv, S.; Zhang, Y. WSN-Assisted UAV Trajectory Adjustment for Pesticide Drift Control. Sensors 2020, 20, 5473.
5. Suardi, A.; Stefanoni, W.; Alfano, V.; Bergonzoli, S.; Pari, L. Equipping a Combine Harvester with Turbine Technology Increases the Recovery of Residual Biomass from Cereal Crops via the Collection of Chaff. Energies 2020, 13, 1572.
6. Gonzalez-De-Soto, M.; Emmi, L.; Perez-Ruiz, M.; Agüera, J.; Gonzalez-De-Santos, P. Autonomous systems for precise spraying—Evaluation of a robotised patch sprayer. Biosyst. Eng. 2016, 146, 165–182.
7. Gonzalez-De-Soto, M.; Emmi, L.; Garcia, I.; Gonzalez-De-Santos, P. Reducing fuel consumption in weed and pest control using robotic tractors. Comput. Electron. Agric. 2015, 114, 96–113.
8. Fountas, S.; Mylonas, N.; Malounas, I.; Rodias, E.; Santos, C.H.; Pekkeriet, E. Agricultural Robotics for Field Operations. Sensors 2020, 20, 2672.
9. Burke, R.; Mussomeli, A.; Laaper, S.; Hartigan, M.; Sniderman, B. The Smart Factory: Responsive, Adaptive, Connected Manufacturing; Deloitte University Press: Westlake, TX, USA, 2017; Available online: https://dupress.deloitte.com/dup-us-en/focus/industry-4-0/smart-factory-connected-manufacturing.html (accessed on 2 July 2020).
10. Robert, M.; Thomas, A.; Bergez, J.-E. Processes of adaptation in farm decision-making models. A review. Agron. Sustain. Dev. 2016, 36, 64.
11. Brewster, C.; Roussaki, I.; Kalatzis, N.; Doolin, K.; Ellis, K. IoT in Agriculture: Designing a Europe-Wide Large-Scale Pilot. IEEE Commun. Mag. 2017, 55, 26–33.
12. Wolfert, S.; Ge, L.; Verdouw, C.; Bogaardt, M.-J. Big Data in Smart Farming—A review. Agric. Syst. 2017, 153, 69–80.
13. Ochoa, S.F.; Fortino, G.; Di Fatta, G. Cyber-physical systems, internet of things and big data. Futur. Gener. Comput. Syst. 2017, 75, 82–84.
14. Hiremath, S.A.; Van Der Heijden, G.W.A.M.; Van Evert, F.K.; Stein, A.; Ter Braak, C.J.F. Laser range finder model for autonomous navigation of a robot in a maize field using a particle filter. Comput. Electron. Agric. 2014, 100, 41–50.
15. Bechar, A. Robotics in horticultural field production. Stewart Postharvest Rev. 2010, 6, 1–11.
16. Eizicovits, D.; Berman, S. Efficient sensory-grounded grasp pose quality mapping for gripper design and online grasp planning. Robot. Auton. Syst. 2014, 62, 1208–1219.
17. Zion, B.; Mann, M.; Levin, D.; Shilo, A.; Rubinstein, D.; Shmulevich, I. Harvest-order planning for a multiarm robotic harvester. Comput. Electron. Agric. 2014, 103, 75–81.
18. Bechar, A.; Eben-Chaime, M. Hand-held computers to increase accuracy and productivity in agricultural work study. Int. J. Prod. Perform. Manag. 2014, 63, 194–208.
19. Wang, P. On Defining Artificial Intelligence. J. Artif. Gen. Intell. 2019, 10, 1–37.
20. Nirmala, G.; Geetha, S.; Selvakumar, S. Mobile Robot Localization and Navigation in Artificial Intelligence: Survey. Comput. Methods Soc. Sci. 2017, IV, 12–22. Available online: http://cmss.univnt.ro/wp-content/uploads/vol/split/vol_IV_issue_2/CMSS_vol_IV_issue_2_art.002.pdf (accessed on 2 July 2020).
21. Li, B.-H.; Hou, B.-C.; Yu, W.-T.; Lu, X.-B.; Yang, C.-W. Applications of artificial intelligence in intelligent manufacturing: A review. Front. Inf. Technol. Electron. Eng. 2017, 18, 86–96.
22. Lee, E.A. Cyber Physical Systems: Design Challenges. In Proceedings of the 11th IEEE International Symposium on Object and Component-Oriented Real-Time Distributed Computing (ISORC), Orlando, FL, USA, 5–7 May 2008; pp. 363–369.
23. Bordel, B.; Alcarria, R.; Robles, T.; Martín, D. Cyber–physical systems: Extending pervasive sensing from control theory to the Internet of Things. Pervasive Mob. Comput. 2017, 40, 156–184.
24. Mell, P.; Grance, T. The NIST Definition of Cloud Computing, Version 15, 10-7-09. National Institute of Standards and Technology. Information Technology Laboratory. Available online: https://csrc.nist.gov/publications/detail/sp/800-145/final (accessed on 2 July 2020).
25. Jadeja, Y.; Modi, K. Cloud computing—concepts, architecture and challenges. In Proceedings of the 2012 International Conference on Computing, Electronics and Electrical Technologies (ICCEET), Kumaracoil, India, 21–22 March 2012; pp. 877–880.
26. Schertz, C.; Brown, G. Basic Considerations in Mechanizing Citrus Harvest. Trans. ASAE 1968, 11, 343–346.
27. Merlet, J.-P. Parallel Robots, 2nd ed.; Springer: Berlin/Heidelberg, Germany, 2000.
28. Patel, Y.D.; George, P.M. Parallel Manipulators Applications—A Survey. Mod. Mech. Eng. 2012, 2, 57–64.
29. Lin, J.; Luo, C.-H.; Lin, K.-H. Design and Implementation of a New DELTA Parallel Robot in Robotics Competitions. Int. J. Adv. Robot. Syst. 2015, 12, 153.
30. EcoRobotix Ltd. Available online: https://www.youtube.com/watch?v=PQK3nP8jrLA (accessed on 2 July 2020).
31. Pratt, G.A.; Williamson, M.M. Series elastic actuators. In Proceedings of the 1995 IEEE/RSJ International Conference on Intelligent Robots and Systems. Human Robot Interaction and Cooperative Robots, Pittsburgh, PA, USA, 5–9 August 1995; Volume 1, p. 399.
32. Kim, S.; Laschi, C.; Trimmer, B. Soft robotics: A bioinspired evolution in robotics. Trends Biotechnol. 2013, 31, 287–294.
33. Iossifidis, I.; Steinhage, A. Controlling an 8 DOF Manipulator by Means of Neural Fields. In Proceedings of the International Conference on Field and Service Robotics FSR2001, Helsinki, Finland, 11–13 June 2001; pp. 1–7.
34. Palankar, M.; De Laurentis, K.J.; Alqasemi, R.; Veras, E.; Dubey, R.; Arbel, Y.; Donchin, E. Control of a 9-DoF Wheelchair-mounted robotic arm system using a P300 Brain Computer Interface: Initial experiments. In Proceedings of the 2008 IEEE International Conference on Robotics and Biomimetics, Bangkok, Thailand, 22–25 February 2009; pp. 348–353.
35. Chirikjian, G.S.; Burdick, J.W. A Hyper-Redundant Manipulator. IEEE Robot. Autom. Mag. 1994, 1, 22–29.
36. Tang, L.; Wang, J.; Zheng, Y.; Gu, G.; Zhu, L.; Zhu, X. Design of a cable-driven hyper-redundant robot with experimental validation. Int. J. Adv. Robot. Syst. 2017, 14, 1–12.
37. Shao, T.; Du, M.; Bao, G.; Zhang, L.; Yang, Q. Fruit harvesting continuum manipulator inspired by elephant trunk. Int. J. Agric. Biol. Eng. 2015, 8, 57–63.
38. Motoman. Development of Dual-arm Robot MOTOMAN-SDA20D, Data Base for Noteworthy Contributions for Science and Technology, Japan. 2018. Available online: https://dbnst.nii.ac.jp/english/detail/2047 (accessed on 2 July 2020).
39. ABB. YuMi—Creating an Automated Future Together. 2018. Available online: https://new.abb.com/products/robotics/industrial-robots/yumi (accessed on 2 July 2020).
40. Bechar, A.; Vigneault, C. Agricultural robots for field operations: Concepts and components. Biosyst. Eng. 2016, 149, 94–111.
41. De-An, Z.; Jidong, L.; Ji, W.; Ying, Z.; Yu, C. Design and control of an apple harvesting robot. Biosyst. Eng. 2011, 110, 112–122.
42. Baur, J.; Pfaff, J.; Ulbrich, H.; Villgrattner, T. Design and development of a redundant modular multipurpose agricultural manipulator. In Proceedings of the 2012 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM), Kaohsiung, Taiwan, 11–14 July 2012; pp. 823–830.
43. Milutinovic, D.; Slavkovic, N.; Kokotovic, B.; Milutinovic, M.; Zivanovic, S.; Dimic, Z. Kinematic Modeling of Reconfigurable Parallel Robots Based on Delta Concept. J. Prod. Eng. 2012, 15, 71–74.
44. Zhao, Y.; Gong, L.; Liu, C.; Huang, Y. Dual-arm Robot Design and Testing for Harvesting Tomato in Greenhouse. IFAC-PapersOnLine 2016, 49, 161–165.
45. Sepulveda, D.; Fernandez, R.; Navas, E.; Armada, M.; Gonzalez-De-Santos, P. Robotic Aubergine Harvesting Using Dual-Arm Manipulation. IEEE Access 2020, 8, 121889–121904.
46. Autopilot. Available online: http://www.trimble.com/Agriculture/autopilot.aspx (accessed on 2 July 2020).
47. AutoTrac. Available online: https://www.deere.com/en_INT/products/ (accessed on 2 July 2020).
48. New Holland. Available online: http://agriculture1.newholland.com/nar/en-us/about-us/whats-up/news-events/2016/new-holland-nh-drive-new-concept-autonomous-tractor (accessed on 2 July 2020).
49. CASE. 2017. Available online: https://www.therobotreport.com/case-ih-displays-new-cab-less-concept-tractor/ (accessed on 2 July 2020).
50. EU SCAR. Agricultural Knowledge and Innovation Systems towards the Future—A Foresight Paper; Publications Office of the European Union: Luxembourg, 2016.
51. Tsolakis, N.; Bechtsis, D.; Bochtis, D. AgROS: A Robot Operating System Based Emulation Tool for Agricultural Robotics. Agronomy 2019, 9, 403.
52. O'Connor, M.; Bell, T.; Elkaim, G.; Parkinson, B. Automatic Steering of Farm Vehicles Using GPS. In Proceedings of the 3rd International Conference on Precision Agriculture, Minneapolis, MN, USA, 23–26 June 1996; pp. 767–777.
53. Noguchi, N.; Reid, J.F.; Will, J.; Benson, E.R.; Stombaugh, T.S. Vehicle automation system based on multi-sensor integration. In Proceedings of the 1998 Annual International Meeting (ASAE), Orlando, FL, USA, 12–16 July 1998; Paper No. 983111.
54. Pilarski, T.; Happold, M.; Pangels, H.; Ollis, M.; Fitzpatrick, K.; Stentz, A. The Demeter System for Automated Harvesting. Auton. Robot. 2002, 13, 9–20.
55. Stentz, A.; Dima, C.; Wellington, C.; Herman, H.; Stager, D. A System for Semi-Autonomous Tractor Operations. Auton. Robot. 2002, 13, 87–104.
56. Thuilot, B.; Cariou, C.; Cordesses, L.; Martinet, P. Automatic guidance of a farm tractor along curved paths, using a unique CP-DGPS. In Proceedings of the 2001 IEEE/RSJ International Conference on Intelligent Robots and Systems. Expanding the Societal Role of Robotics in the Next Millennium (Cat. No.01CH37180), Maui, HI, USA, 29 October–3 November 2001; Institute of Electrical and Electronics Engineers (IEEE): New York, NY, USA, 2002; Volume 2, pp. 674–679.
57. Subramanian, V.; Burks, T.F.; Arroyo, A. Development of machine vision and laser radar based autonomous vehicle guidance systems for citrus grove navigation. Comput. Electron. Agric. 2006, 53, 130–143.
58. Nørremark, M.; Griepentrog, H.; Nielsen, J.; Søgaard, H. The development and assessment of the accuracy of an autonomous GPS-based system for intra-row mechanical weed control in row crops. Biosyst. Eng. 2008, 101, 396–410.
59. Emmi, L.; Gonzalez-De-Soto, M.; Pajares, G.; Gonzalez-De-Santos, P. Integrating Sensory/Actuation Systems in Agricultural Vehicles. Sensors 2014, 14, 4014–4049.
60. Emmi, L.; Gonzalez-De-Soto, M.; Pajares, G.; Gonzalez-De-Santos, P. New Trends in Robotics for Agriculture: Integration and Assessment of a Real Fleet of Robots. Sci. World J. 2014, 2014, 1–21.
61. Gonzalez-De-Santos, P.; Ribeiro, A.; Fernandez-Quintanilla, C.; Lopez-Granados, F.; Brandstoetter, M.; Tomic, S.; Pedrazzi, S.; Peruzzi, A.; Pajares, G.; Kaplanis, G.; et al. Fleets of robots for environmentally-safe pest control in agriculture. Precis. Agric. 2016, 18, 574–614.
62. Bergerman, M.; Maeta, S.M.; Zhang, J.; Freitas, G.M.; Hamner, B.; Singh, S.; Kantor, G. Robot Farmers: Autonomous Orchard Vehicles Help Tree Fruit Production. IEEE Robot. Autom. Mag. 2015, 22, 54–63.
63. Kayacan, E.; Kayacan, E.; Ramon, H.; Saeys, W. Towards agrobots: Identification of the yaw dynamics and trajectory tracking of an autonomous tractor. Comput. Electron. Agric. 2015, 115, 78–87.
64. Ruckelshausen, A.; Biber, P.; Dorna, M.; Gremmes, H.; Klose, R.; Linz, A.; Rahe, R.; Resch, R.; Thiel, M.; Trautz, D.; et al. BoniRob: An autonomous field robot platform for individual plant phenotyping. Precis. Agric. 2009, 9, 841–847.
65. Bawden, O.; Kulk, J.; Russell, R.; McCool, C.; English, A.; Dayoub, F.; Lehnert, C.; Perez, T. Robot for weed species plant-specific management. J. Field Robot. 2017, 34, 1179–1199.
66. Underwood, J.P.; Calleija, M.; Taylor, Z.; Hung, C.; Nieto, J.; Fitch, R.; Sukkarieh, S. Real-time target detection and steerable spray for vegetable crops. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA)—Workshop on Robotics in Agriculture, Seattle, WA, USA, 25–30 May 2015; pp. 504–510.
67. Precision Makers—Greenbot. Available online: http://www.precisionmakers.com/greenbot/ (accessed on 2 July 2020).
68. Raussendorf. Raussendorf Maschinen. Available online: http://www.raussendorf.de/en/fruit-robot.html (accessed on 2 July 2020).
69. Bogue, R. Robots poised to revolutionise agriculture. Ind. Robot. Int. J. 2016, 43, 450–456.
70. Kongskilde. New Automated Agricultural Platform—Kongskilde Vibro Crop Robotti. 2017. Available online: http://conpleks.com/robotech/new-automated (accessed on 2 July 2020).
71. AGREENCULTURE—Centéol 2018 & Movies. Available online: https://www.agreenculture.net/copy-of-challenge-centeol-2018 (accessed on 2 July 2020).
72. Naïo Technologies—Multifunctional Vineyard Weeding Robot—TED. Available online: https://www.naio-technologies.com/en/agricultural-equipment/vineyard-weeding-robot/ (accessed on 2 July 2020).
73. Sylvester, G. (Ed.) E-Agriculture in Action: Drones for Agriculture; Food and Agriculture Organization of the United Nations and International Telecommunication Union: Bangkok, Thailand, 2018.
74. Patel, P.N.; Patel, M.A.; Faldu, R.M.; Dave, Y.R. Quadcopter for Agricultural Surveillance. Adv. Electron. Electr. Eng. 2013, 3, 427–432.
75. Bakker, T.; Van Asselt, K.; Bontsema, J.; Müller, J.; Van Straten, G. A path following algorithm for mobile robots. Auton. Robot. 2010, 29, 85–97.
76. Nagasaka, Y.; Saito, H.; Tamaki, K.; Seki, M.; Kobayashi, K.; Taniwaki, K. An autonomous rice transplanter guided by global positioning system and inertial measurement unit. J. Field Robot. 2009, 26, 537–548.
77. Peleg, D. Distributed Coordination Algorithms for Mobile Robot Swarms: New Directions and Challenges. In Proceedings of the Distributed Computing—IWDC 2005, Kharagpur, India, 27–30 December 2005; Springer: Berlin/Heidelberg, Germany, 2005; pp. 1–12.
78. Blackmore, S.; Stout, B.; Wang, M.; Runov, B. Robotic agriculture—the future of agricultural mechanisation. In Proceedings of the 5th European Conference on Precision Agriculture, Uppsala, Sweden, 9–12 June 2005; Stafford, J.V., Ed.; Wageningen Academic Publishers: Wageningen, The Netherlands, 2005; pp. 621–628.
79. Cheung, B.K.-S.; Choy, K.; Li, C.-L.; Shi, W.; Tang, J. Dynamic routing model and solution methods for fleet management with mobile technologies. Int. J. Prod. Econ. 2008, 113, 694–705.
80. Sørensen, C.; Bochtis, D. Conceptual model of fleet management in agriculture. Biosyst. Eng. 2010, 105, 41–50.
81. Bautin, A.; Simonin, O.; Charpillet, F. Towards a communication free coordination for multi-robot exploration. In Proceedings of the 6th National Conference on Control Architectures of Robots, Grenoble, France, 24–25 May 2011. Available online: https://hal.inria.fr/inria-00599605/document (accessed on 2 July 2020).
82. Bouraqadi, N.; Fabresse, L.; Doniec, A. On fleet size optimization for multi-robot frontier-based exploration. In Proceedings of the 7th National Conference on Control Architectures of Robots, Nancy, France, 10–11 May 2012; pp. 1–9. Available online: http://car2012.loria.fr/files/2012/Bouraqadi-CAR2012.pdf (accessed on 2 July 2020).
83. Blackmore, B.S.; Have, H.; Fountas, S. A specification of behavioural requirements for an autonomous tractor. In Proceedings of the 6th International Symposium on Fruit, Nut and Vegetable Production Engineering Conference, Potsdam, Germany, 11–14 September 2001; pp. 25–36.
84. Bacco, M.; Berton, A.; Ferro, E.; Gennaro, C.; Gotta, A.; Matteoli, S.; Paonessa, F.; Ruggeri, M.; Giuseppe, V.; Zanella, A. Smart Farming: Opportunities, Challenges and Technology Enablers. In Proceedings of the 2018 IoT Vertical and Topical Summit on Agriculture—Tuscany (IOT Tuscany), Tuscany, Italy, 8–9 May 2018; pp. 1–6.
Figure 1. Technologies involved in precision agriculture.
Figure 2. Cyber-physical systems and the Internet of Things.
Figure 3. Smart factory (left) and smart farm (right) with their common elements (center).
Figure 4. (a) Serial and redundant manipulator structure—qi indicates a degree of freedom; (b) redundant manipulator—courtesy of the CROPS (Intelligent sensing and manipulation for sustainable production and harvesting of high value crops: clever robots for crops) consortium and Prof. D. Rixen, Technical University of Munich.
Figure 5. (a) Parallel manipulator structure; (b) delta manipulator structure; (c) delta manipulator (courtesy of Professor J. Lin, Chien Hsin University of Science and Technology [29]).
Figure 6. (a) A redundant manipulator structure and (b) an elephant-trunk/snake manipulator (courtesy of Professor Gu, Shanghai Jiao Tong University [36]).
Figure 7. Dual-arm manipulator harvesting eggplants.
Figure 8. Components of a precision agriculture robotic system.
Figure 9. Mobile platform structures: (a) front steering wheels and rear traction wheels (RHEA prototype); (b) skid steering wheels (courtesy of AGREENCULTURE); (c) independent steering wheels (courtesy of Naïo Technologies); and (d) a wheel-legged prospective structure.
Figure 10. Hexrotor (AirRobot AR-200) carrying a vision camera for remote detection of weeds.
Figure 11. A fleet of robots (the RHEA fleet).
Table 1. Examples of different types of manipulators for agriculture.

| Institution | Year | Characteristic and Applications |
| --- | --- | --- |
| School of Electrical and Information Engineering, Jiangsu University, China | 2011 | Apple harvesting manipulator [41]. |
| Technical University of Munich, Germany | 2012 | Redundant modular multipurpose agricultural manipulator [42]. |
| University of Belgrade, Faculty of Mechanical Engineering, Serbia | 2012 | Reconfigurable parallel robots based on the delta concept with applications in agriculture [43]. |
| Key Laboratory of E&M—Zhejiang University of Technology, China | 2015 | Continuum manipulator for agriculture [37]. |
| School of Mechanical Engineering, Shanghai Jiao Tong University, China | 2016 | Dual-arm robot for harvesting tomatoes in greenhouses [44]. |
| Centre for Automation and Robotics (UPM-CSIC) | 2020 | Dual-arm manipulation for aubergine harvesting [45]. |
Table 2. Agriculture autonomous robots based on retrofitting conventional vehicles.

| Institution/Year | Characteristic and Applications |
| --- | --- |
| Stanford University (United States), 1996 | An automatic control system was developed and tested on a large farm tractor using four GPS antennas [52]. |
| University of Illinois (United States), 1998 | A guidance system created by integrating a machine-vision sensor, a real-time kinematic GPS (RTK-GPS), and a geometric direction sensor. Sensor fusion was based on an extended Kalman filter and a two-dimensional probability density function statistical method [53]. |
| Carnegie Mellon University (United States)—Demeter project, 1999 | A self-propelled hay harvester for agricultural operations. Positional data are fused from a differential global positioning system (GPS), a wheel encoder (dead reckoning), and gyroscopic sensors [54]. |
| Carnegie Mellon University (United States)—Autonomous Agricultural Spraying project, 2002 | The objective was to automate ground vehicles for pesticide spraying to achieve a system that was significantly cheaper, safer, and friendlier to the environment. A remote operator could supervise the nighttime operation of up to four spraying vehicles [55]. |
| LASMEA-CEMAGREF (France), 2001 | This study investigated vehicle guidance using a carrier-phase differential GPS as the only sensor. The vehicle heading was derived with a Kalman state reconstructor, and a nonlinear, velocity-independent control law was designed that relied on chained-system properties [56]. |
| University of Florida (United States), 2006 | An autonomous guidance system for citrus groves based on machine vision (average guidance error of approximately 0.028 m) and laser radar (average guidance error of approximately 0.025 m). The system was tested on a curved path at a speed of approximately 3.1 m s−1 [57]. |
| University of Aarhus and the University of Copenhagen (Denmark), 2008 | An automatic intra-row weed control system connected to an unmanned tractor via a hydraulic side-shifting frame attached to the rear three-point hitch of the vehicle [58]. |
| RHEA (Robot Fleets for Highly Effective Agriculture and Forestry Management) consortium, 2014 | Automation and sensor integration of a fleet of three medium-sized tractors that cooperated and collaborated in physical/chemical weed control and pesticide application to trees [59,60,61]. |
| Carnegie Mellon University (United States), 2015 | Self-driving orchard vehicles for orchard tasks, i.e., tree pruning and training, blossom and fruit thinning, fruit harvesting, mowing, spraying, and sensing [62]. |
| University of Leuven (Belgium), 2015 | Tractor guidance using model predictive control for yaw dynamics [63]. |
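Several of the guidance systems in Table 2 fuse RTK-GPS positioning with other sensors through an extended Kalman filter [53,54]. The sketch below illustrates only the core predict/update cycle for a constant-velocity vehicle model with a GPS position measurement; because this model is linear, the EKF reduces to an ordinary Kalman filter. All matrices, noise levels, and measurement values are illustrative assumptions, not parameters taken from the cited systems.

```python
import numpy as np

# State: [x, y, vx, vy]; constant-velocity motion model.
dt = 0.1
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)
Q = 0.01 * np.eye(4)                            # process noise (assumed)
H_gps = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], dtype=float)   # GPS observes (x, y) only
R_gps = 0.05**2 * np.eye(2)                     # ~5 cm std dev, RTK-like

def predict(x, P):
    """Propagate state and covariance through the motion model."""
    return F @ x, F @ P @ F.T + Q

def update(x, P, z, H, R):
    """Correct the prediction with a measurement z."""
    y = z - H @ x                       # innovation
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    return x + K @ y, (np.eye(len(x)) - K @ H) @ P

# Two synthetic GPS fixes of a vehicle moving along +x.
x, P = np.zeros(4), np.eye(4)
for z in [np.array([0.1, 0.0]), np.array([0.2, 0.01])]:
    x, P = predict(x, P)
    x, P = update(x, P, z, H_gps, R_gps)
```

Additional sensors (e.g., a vision-based lateral offset or a gyroscope heading rate) are fused the same way: each gets its own `H` and `R` and calls `update` in turn between predictions.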
Table 3. Agriculture autonomous robots based on specific structures.

| Vehicle/Year | Steering Scheme | Applications |
| --- | --- | --- |
| BoniRob/2009 | Independent steering/traction wheels on 1-DOF legs (wheel-legs) | General agricultural tasks—BoniRob follows an app concept, similar to that of smartphones, that lets third parties integrate their own modules for specific applications [64]. |
| AgBot II/2014 | Two front skid-steering wheels and two rear caster wheels | Fertilizing tasks in large horticultural crops; weed management: detects and classifies weeds and destroys them with mechanical or chemical devices [65]. |
| Ladybird/2015 | Independent steering wheels | Crop assessment using hyperspectral cameras, thermal and infrared detection systems, panoramic and stereovision cameras, LIDAR, and GPS [66]. |
| Greenbot/Precision Makers 2015 | Independent steering/traction wheels | Regularly repeated tasks in agriculture and horticulture [67]. |
| Cäsar/2016 | Independent steering/traction wheels | Autonomous use in enclosed fruit plantations and vineyards: pest and soil management, fertilization, harvesting, and transport [68]. |
| RIPPA/2016 | Independent steering/traction wheels | Vegetable-growing industry; spot spraying of weeds using a directed micro-dose of liquid [69]. |
| Vibro Crop Robotti/2017 | Skid-steering wheels | Precision seeding and mechanical row-crop cleaning [70]. |
| Céeol/2019 | Skid-steering tracks | Furrow preparation, fertilization, weeding, harvesting, and soil analysis [71]. |
| Naïo/2019 | Independent steering/traction wheels | Mowing, leaf thinning, trimming, etc., for vineyards [72]. |
Table 4. Main characteristics of some current manipulators.

| Manipulator/Status (Year) | Type | DOF | Dimensions (m) | Speed | Payload (kg) |
| --- | --- | --- | --- | --- | --- |
| CROPS/Prototype (2014) | Serial/redundant | 9 | 1.53 (maximum extension) | 15 s/piece | ≈0.75 |
| ABB IRB-360-1/Commercial (2012) | Delta | 4 | Radius (2): 0.4; height: 0.2 | 10 m s−1 | 1 |
| FESTO Elephant trunk/Prototype (2010) | Continuum | -- | Length: 0.85 | -- | 3 |
| Yaskawa SDA5D/Commercial (2018) | Dual-arm | 15 (1) | Horizontal reach: 0.845; vertical reach: 1.118 | 200°/s (3) | 5 per arm |

(1) 7 DOF (degrees of freedom) per arm plus the base. (2) Cylindrical workspace. (3) On average, each joint.
Table 5. Main characteristics of current mobile robots.

| Vehicle (Year) | Type | Dimensions (m) | Weight (kg) | Speed (m s−1) | Payload (kg) |
| --- | --- | --- | --- | --- | --- |
| BoniRob (2009) | Independent steering | 2.8 × 2.4 × 2.2 | 1100 | 1.5 | 150 |
| AgBot II (2014) | Skid and caster wheels | 3 × 2 × 1 | 400 | 2.7 | 200 |
| Ladybird (2015) | Independent steering | -- | 325 | 1.2 | -- |
| Greenbot (2015) | Independent steering | 1 × 1.8 × 0.47 | -- | 1 | 1500 |
| Cäsar (2016) | Independent steering | 3 × 1.3 × 0.92 | 1000 | 3.33 | 2000 |
| Céeol (2019) | Skid tracks | 0.80 × 1.8 × 1 | 550 | 3.33 | 350/700 |
| Naïo (2019) | Independent steering | Length: 2.30; width: 1.5–2; height: 1.5–2 | 800 | 1.12 | 240 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Gonzalez-de-Santos, P.; Fernández, R.; Sepúlveda, D.; Navas, E.; Emmi, L.; Armada, M. Field Robots for Intelligent Farms—Inhering Features from Industry. Agronomy 2020, 10, 1638. https://doi.org/10.3390/agronomy10111638