DYNA

Print version ISSN 0012-7353

Dyna rev.fac.nac.minas vol.86 no.210 Medellín jul./sep. 2019

https://doi.org/10.15446/dyna.v86n210.74701 

Articles

Use of remotely piloted aircraft in precision agriculture: a review


Luana Mendes dos Santos a, Gabriel Araújo e Silva Ferraz a, Brenon Diennevan Souza Barbosa a & Alan Delon Andrade a

a Engineering Department, Universidade Federal de Lavras, Lavras, Brazil. luanna_mendess@yahoo.com.br, gabriel.ferraz@deg.ufla.br, b.diennevan@outlook.com, alanmg13@gmail.com


Abstract

The objective of this review was to examine the current use of remotely piloted aircraft (RPA) for obtaining data to support precision farming techniques and to present successful examples of the technology in use. RPA have applications in monitoring, mapping, and the extraction of vegetation indices (VI), volume, and plant height, among others, and have been studied in several agricultural crops, supporting decisions on agrochemical application and planting failures and enabling growth monitoring that favors increased crop productivity. One of the potentialities evaluated with RPA is the use of VI, which may be extracted from digital images obtained by cameras that contain only the visible band; this may be an alternative for farmers who do not have access to RPA equipped with high-tech embedded sensors. RPA are therefore a tool that may contribute to decision making by allowing the acquisition of images with high spatial and temporal resolution.

Keywords: unmanned aircraft system (UAS); drone; photogrammetry


1. Introduction

Precision agriculture (PA) consists of an integrated, information- and production-based farming system designed to increase long-term, site-specific and whole-farm production efficiency, productivity and profitability while improving environmental quality [25]. According to [28], massive data collection and real-time analysis will be required in the near future, involving sensors that record variations and plant characteristics, with quick and accurate diagnoses to intervene in the crop. Such sensors can be based on satellites, airplanes or remotely piloted aircraft (RPA), or fixed on mobile robots or agricultural machines. They make it possible to identify weeds, pests and diseases, leaf health through reflectance, water stress, nutritional deficits, non-uniformity, planting failures, crop anomalies and other factors significant for decision making in the use of fertilizers and pesticides, and for mitigation measures.

Currently, farm management requires high-resolution images, usually at the centimeter level [26,57]. According to [57], there are still limitations to applying remote-sensing-based sensor systems in farm management, such as the timely collection and delivery of images, the lack of high-spatial-resolution data, image interpretation and data extraction, and the integration of these data with agronomic data in specialized systems.

Given these limitations, RPA are expected to be efficient remote sensing tools for PA. RPA play an expanded role that complements imagery from manned aircraft and satellites, providing agricultural support [15]. They enable flexible collection of data with high spatial and temporal resolution, allowing the efficient use of resources, protecting the environment, and providing information for management treatments such as localized application with variable-rate machines, sowing, and the application of fertilizers and pesticides [16].

Thus, PA is able to follow technological development based on platforms that are more flexible and accessible to farmers, such as RPA. Allied with PA, this technology allows closer and more frequent follow-up of diverse crops, with better management and handling.

Therefore, this review aimed to present definitions related to RPA, to show their current applications in agriculture as reported by several authors, and to examine the potentialities and alternatives of this new technology applied to PA.

2. Remotely piloted aircraft as a tool for precision agriculture

Remote digital imaging has emerged and become popular through unmanned aircraft systems (UAS). According to [20], UAS are composed of aircraft and associated elements operated without a pilot on board, which may be remotely piloted aircraft (RPA) or autonomous aircraft. Fully autonomous aircraft are those that cannot be intervened with once the flight has started.

According to [20], the term unmanned aerial vehicle (UAV) is obsolete, since aviation organizations use the term aircraft. A worldwide-recognized term was therefore adopted for the sake of standardization. Another reason is the operational complexity of these aircraft, which require a ground station, the link between the aircraft and the pilot, and other elements necessary for a safe flight; the term 'vehicle' would thus be inappropriate to represent the entire system.

Research related to UAS has been developed worldwide, providing information on these aircraft, popularly known as drones. The name originated in the United States and refers to the drone, a male bee, because of the buzzing sound produced during operation. Other terms found in the literature include unmanned aerial vehicles (UAV) [3,26,31,40], remotely piloted aircraft systems (RPAS) [7], remotely piloted aircraft (RPA) [9,20,56], remotely piloted vehicles (RPV) [12,39], unmanned aircraft systems (UAS) [20,52] and remotely operated aircraft (ROA) [46].

In view of the several terminologies that have arisen to represent these aircraft, [20] designated the term RPA as the standard, which is the term used in this review.

RPA are aircraft in which the pilot is not on board and control is remote, through an interface such as a computer, simulator, digital device or remote control, in contrast to autonomous aircraft, which, once programmed, receive no external intervention during the flight. RPA are considered a subcategory of unmanned aircraft.

Most RPA have: a GNSS receiver, which provides position information; motors; an autopilot and/or remote control; cameras for obtaining images and videos; landing gear; a radio transmitter and receiver; and systems for measuring altitude and direction.

RPA can be classified as fixed- or rotary-wing. Fixed-wing aircraft are suitable for longer distances and reach higher speeds, although they are heavier, larger and more costly than rotary-wing aircraft, as observed by [36] in their studies of the potential and operability of RPA. In most cases, take-off and landing of fixed-wing RPA are manual, except for those with an autopilot.

Rotary-wing platforms are more autonomous, have vertical take-off and landing, and are usually smaller and lighter than fixed-wing platforms; because of their size, they can hover, which is advantageous for capturing images of desirable targets. They have rotary wings with one propeller (helicopter), two (bicopter), three (tricopter), four (quadcopter), six (hexacopter) or eight (octocopter). However, they have some limitations, such as flight autonomy, since they are usually powered by batteries that last 25 min on average, and they do not support a large number of embedded sensors because they are small [36].

Precision agriculture uses accurate methods and techniques to monitor areas more efficiently. Geospatial technologies are used to identify variations in the field and to apply strategies to deal with this variability [57]. Among these technologies are geographic information systems, global navigation satellite systems (GNSS) and remote sensing, which can aid in obtaining data and provide localized support in agricultural management.

Remote sensing obtains target information through sensors on orbital platforms, reaching a spatial resolution of 0.61 m in the panchromatic band of the QuickBird satellite, whereas RPA can achieve a spatial resolution of 5 cm, making them useful in studies that require a higher level of detail, as shown by the comparative studies of [33]. The spatial resolution of RPA is directly related to flight altitude: [38] achieved spatial resolutions of 8.1 mm and 65 mm at flight altitudes of 15 and 120 m, respectively.
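The dependence of spatial resolution on flight altitude follows the standard photogrammetric ground sample distance (GSD) relation, GSD = H · p / f, where H is the flight altitude, p the sensor pixel pitch and f the focal length. A minimal sketch, using illustrative camera parameters (4.5 μm pixel pitch, 8.8 mm focal length; assumed values, not taken from the studies cited above):

```python
def ground_sample_distance(altitude_m, pixel_pitch_mm, focal_length_mm):
    """Ground sample distance in meters per pixel for a nadir-pointing camera:
    GSD = altitude * pixel_pitch / focal_length."""
    return altitude_m * pixel_pitch_mm / focal_length_mm

# Illustrative camera: 4.5 um (0.0045 mm) pixel pitch, 8.8 mm focal length
for altitude in (15, 120):
    gsd = ground_sample_distance(altitude, 0.0045, 8.8)
    print(f"{altitude:>3} m altitude -> {gsd * 1000:.1f} mm/pixel")
```

Because GSD scales linearly with altitude, the 120 m flight yields pixels eight times coarser than the 15 m flight for the same camera.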

According to [57], in urgent cases of crop monitoring, nutrient-deficit analysis and crop forecasting, orbital sensors cannot provide continuous, high-frequency data with a high level of detail. They also have limitations such as high costs, lack of operational flexibility and low spatial and temporal resolution [52]. Another factor that interferes with image acquisition is the weather: on cloudy days, for example, solar energy is blocked and surface information is lost [16].

Despite advances in remote sensing science, these limitations motivated studies searching for different platforms that could obtain data remotely at minimum cost. Examples of such platforms are airships [21,47], balloons [48] and kites [1,53]. Although these are low-cost platforms compared with orbital ones, they rely on manual maneuvers that are operationally impractical in certain locations, making crop monitoring and follow-up difficult [52,57].

According to [52], several peculiarities make RPA a potential technology, notably their low cost compared with piloted aircraft and orbital platforms; autonomous data collection and the ability to perform missions; the ability to operate under adverse weather conditions and in hazardous environments; and lower exposure risk for the pilot.

Thus, RPA are being studied and used to obtain images with high temporal resolution (e.g., collected several times a day) and high spatial resolution (in centimeters) at low operating costs [12,16,18,24,34,42,54]. They can be applied in smaller areas and at specific locations, with the ease of obtaining data in less time, e.g., for monitoring the growth of several crops.

2.1. Studies performed with remotely piloted aircraft in agriculture

Research in the scientific literature shows that digital images obtained by RPA have been used successfully to generate vigor maps in grape plantations. In research performed by [32], images were collected using a hexacopter platform with a near-infrared (NIR) camera, from which normalized difference vegetation index (NDVI) values of a vineyard were obtained; these data agreed with reflectance data obtained in the field, making it possible to generate maps and evaluate the winemaking potential of the vineyard using such platforms more autonomously.

[16] developed a method for preprocessing images obtained by RPA, since processing is the product-generation phase that requires time and processing power. These authors also used a camera with an NIR filter, which made it possible to extract NDVI from the images and hence estimate wheat and barley biomass.

For monitoring palm cultivated for oil extraction, [36] tested three RPA models and observed the crop-monitoring possibilities offered by the obtained images, such as pruning control and the identification of pathologies through infrared images. The tested models were the eBee (fixed wing), X8 (fixed wing) and Phantom 2 (rotary wing); only the eBee carried an NIR camera, making it suitable for the analysis of some pathologies in palm, such as chlorosis. The eBee had the highest cost, reaching 45 min of flight time and an estimated coverage of 100 ha per flight at 150 m altitude. The X8 had a medium cost, also with a 45-min flight time and 100 ha covered per flight, but presented some difficulty in maneuverability compared with the other models. The Phantom 2 was the cheapest model, with the shortest flight time (25 min) and the capacity to cover 12 ha per flight at 150 m altitude.

To map invasive species with RPA for subsequent machine-directed herbicide application in maize, [5] concluded that the use of post-emergence image data obtained by RPA decreased herbicide use, with savings in untreated areas from 14 to 39.2% relative to uniform spraying and savings from 16 to 45 € ha⁻¹. These studies corroborate the cyclical efficiency of PA, in which data collection, analysis and timely treatment of the problem improve productivity, reduce the use of inputs and ensure uniformity of application.

[19] studied the green normalized difference vegetation index [GNDVI = (NIR - Green) / (NIR + Green)] and were able to relate it linearly to leaf area index and biomass using images obtained by RPA. They thus obtained the leaf area index of the wheat crop at high resolution.
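The GNDVI formula above is a simple per-pixel band ratio. A minimal sketch with hypothetical reflectance values (not data from [19]):

```python
def gndvi(nir, green):
    """GNDVI = (NIR - Green) / (NIR + Green), computed per pixel."""
    return [(n - g) / (n + g) for n, g in zip(nir, green)]

# Hypothetical per-pixel reflectances: healthy vegetation reflects strongly
# in the NIR, so GNDVI approaches 1; sparser cover gives lower values.
nir_band = [0.52, 0.48, 0.35, 0.60]
green_band = [0.12, 0.10, 0.20, 0.08]
print([round(v, 3) for v in gndvi(nir_band, green_band)])
# -> [0.625, 0.655, 0.273, 0.765]
```

Like NDVI, the normalization bounds the index to [-1, 1] and makes it less sensitive to overall illumination than the raw band difference.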

[38] used RPA images and observed that mapped regions with significant reductions in leaf area index (LAI) and NDVI can be targeted for field inspection of nutrient deficiencies, such as potassium; pests and diseases favored by this nutritional deficiency can thus be detected early in the canola crop.

[10] obtained a good correlation between thermal images acquired by RPA and field data measured with thermal radiometers in an apple orchard in which half of the area was irrigated and the other half subjected to water stress. They observed significantly higher canopy temperatures in the stressed trees.

[22] collected georeferenced images of a coffee crop during the 2002 harvest using RPA and compared the image pixels with reflectance data collected in the field, creating a crop maturity index.

Research on vegetation dynamics in forests, identification of clearings and monitoring was performed by [8], who concluded that RPA can be used for such studies, obtaining high-resolution images and also serving as a tool for forest control. The authors consider that the low cost of RPA missions allows the vegetation dynamics to be recorded throughout the year.

[15] used RPA to collect images for monitoring and decision support in coffee plantations; according to the authors, several aspects of crop management may benefit from aerial observation. The study demonstrated the ability of RPA to monitor over a long period and to obtain images with high spatial resolution, mapping invasive weed outbreaks and revealing irrigation and fertilization anomalies. The authors concluded that RPA are a broad tool that complements the use of satellites and piloted aircraft in agricultural support.

2.2. Embedded technology in RPA

Although RPA have excellent spatial and temporal resolution, they do not have good spectral resolution. Spectral resolution is related to the wavelengths detected by the sensor and the number of spectral ranges, or bands. The sensors embedded in cameras make an RPA more expensive. For instance, a conventional camera has low spectral resolution because it contains only three bands: blue (0.45 to 0.49 μm), green (0.49 to 0.58 μm) and red (0.62 to 0.70 μm), so digital images can be obtained only in the visible range. By comparison, studies performed by [41] verified that a multispectral camera can discriminate vegetation better than a conventional camera because of its number of bands. The infrared (IR) spectral range comprises the near infrared (NIR; 0.78 to 2.5 μm), mid-infrared (MIR; 2.5 to 5.0 μm) and far-infrared (FIR; 5.0 to 10.0 μm), which make it possible to extract spectral responses from images through vegetation indices, such as NDVI.

NDVI is the most widespread index and is calculated from the red and NIR spectral ranges. There are also sensors that measure thermal radiation, whose spectral range goes from the mid-infrared (MIR) to the far-infrared (FIR).

Most studies have used NIR cameras for several purposes [5,13,16,19,32,36]; however, due to the high cost of these sensors, some have opted for cameras containing only the visible band [23,24,36,42].

In this context, [55] proposed the modified photochemical reflectance index (MPRI), which uses only the green and red bands. [11] studied the spatial and temporal variability of the MPRI vegetation index applied to images of São Carlos grass obtained by a remotely piloted aircraft with an embedded digital camera. The authors concluded that this technique is a potential aid for control and management in grass cultivation areas.
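The contrast between an NIR-dependent index and a visible-only index can be sketched side by side. The band-ratio formulas follow the definitions given in the text (NDVI from red and NIR; MPRI from green and red only); the reflectance values are hypothetical:

```python
def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red): requires an NIR-capable sensor."""
    return [(n - r) / (n + r) for n, r in zip(nir, red)]

def mpri(green, red):
    """MPRI = (Green - Red) / (Green + Red): needs only a conventional
    visible-band (RGB) camera."""
    return [(g - r) / (g + r) for g, r in zip(green, red)]

# Hypothetical reflectances for three pixels (vigorous, moderate, bare soil)
nir, red, green = [0.55, 0.40, 0.25], [0.05, 0.10, 0.20], [0.15, 0.12, 0.18]
print(ndvi(nir, red))    # NIR-based index
print(mpri(green, red))  # visible-only alternative
```

Both indices rank the three pixels the same way here, which illustrates why a visible-only index can be a workable substitute when no NIR sensor is available.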

In research performed by [42] using an RPA equipped with a conventional low-cost camera, six visible spectral indices (CIVE, ExG, ExGR, Woebbecke Index, NGRDI, VEG) were tested; ExG and VEG achieved the best precision for wheat mapping, with values varying from 87.73% to 91.99% at 30 m flight altitude and from 83.74% to 87.82% at 60 m flight altitude.

[30] evaluated the use of conventional RGB cameras with the red-band filter removed, making the camera sensitive to the blue, green and NIR bands. This proved to be a promising tool for studies on vegetation monitoring, the identification of phenological trends and plant health.

Platforms of this type, equipped with conventional sensors, are becoming an integral part of global society [45]. Even without cameras of high spectral resolution, satisfactory results have been obtained in agriculture. Such platforms act as "an eye in the sky" [35], paving the way for small farmers to use these technologies to inspect and monitor crops in a fast and detailed way. This technology is intended to be the producer's eyes, assisting with real-time inspections in the future and identifying anomalies, such as pests and diseases, before they spread, thus aiding decision making in the field.

2.3. Flight planning and image acquisition

Flight planning can be performed with software in the office and the flight plan then sent to the RPA. Another possibility is to plan through applications installed on smartphones or tablets connected to the RPA remote control, allowing the mission to be planned minutes before the flight. For each aircraft, whether fixed- or rotary-wing, there is compatible software or an application to plan and execute missions, as shown by the studies of [36], which used eMotion 2 for the eBee, Mission Planner for the X8 and the DJI-Phantom application for the Phantom 2.

Some items must be evaluated before the flight: the area, potential hazards from and to the flight, flight planning, preparation and configuration of equipment, equipment checks, and the execution of the flight and data collection.

Before the flight, it is necessary to evaluate the area and observe safety factors for the operation of the aircraft, for the operator and for people in the vicinity, including: weather conditions; wind speed; the presence of objects, poles, trees and electric transmission towers; appropriate flight locations away from airports and densely populated areas; landing and take-off sites; and ground conditions. Limiting factors related to the specific laws of each city, state or country should also be observed.

Flight planning is a fundamental preliminary step toward obtaining quality products that meet the objectives. Planning is performed with software or free applications available on the market, which act as a ground control station for the RPA; the planned route can then be sent to the RPA (uplink) when the flight is executed [57]. The captured images can in turn be transmitted to the ground station (downlink) or stored in memory on board the RPA.

In general, software or applications allow the flight to be planned and executed, with some flight settings defined before the mission. In compatible software and applications, a new project can be created for each RPA, in which the user enters project settings such as the coordinate system, area, flight direction and camera settings, which are important for correct mission operation; this information is stored in a database and then sent to the aircraft for execution [14].

After the project is activated, the area of interest must be delimited and the image overlaps, flight altitude and aircraft speed defined. Some software does not offer the option of setting forward and lateral overlap.

Investigations performed by [27] show that overlap is a factor that affects the accuracy and quality of the final product. The authors tested two forward and lateral overlap settings (80%-50% and 70%-40%) and found that the greater overlap (80% forward and 50% lateral) was the most recommended for orthomosaic preparation. However, depending on the purpose and product, higher overlaps increase the image-capture time, resulting in a larger point cloud and hence longer processing time. Whether a greater amount of overlap is needed should be studied and evaluated for each application.

[39] recommend using sufficient longitudinal and transverse coverage, of at least 70 and 40%, respectively.
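Given an image footprint on the ground, the chosen overlap percentages fix the spacing between flight lines and between successive exposures. A minimal sketch (the footprint dimensions are illustrative, not from [27] or [39]):

```python
def flight_spacing(footprint_w_m, footprint_h_m, side_overlap, forward_overlap):
    """Distance between adjacent flight lines (across-track) and between
    successive exposures (along-track), in meters, rounded to the cm."""
    line_spacing = footprint_w_m * (1.0 - side_overlap)
    photo_spacing = footprint_h_m * (1.0 - forward_overlap)
    return round(line_spacing, 2), round(photo_spacing, 2)

# Illustrative 100 m x 75 m footprint with the two settings tested in [27]
print(flight_spacing(100, 75, 0.50, 0.80))  # -> (50.0, 15.0)
print(flight_spacing(100, 75, 0.40, 0.70))  # -> (60.0, 22.5)
```

Note how the higher-overlap setting tightens both spacings, which is exactly why it captures more images and produces a denser point cloud at the cost of longer flight and processing time.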

Once the flight plan is set, the mission must be saved. To execute the mission, the smartphone, tablet or computer must be connected to the aircraft's remote control via USB, and the remote control and the aircraft turned on. When the desired mission is selected and sent to the aircraft, a pre-flight checklist appears; after checking, the aircraft is ready to take off.

2.4. Image processing software

The processing of images generated by RPA follows a semi-automatic workflow, and most software performs a similar sequence: calibrating the camera, aligning the images, and generating point clouds, from which the Digital Surface Model (DSM) and Digital Elevation Model (DEM) are produced. These can be used to produce orthomosaics and 3D models and to obtain metric information, such as area, volume and heights, among others [17,29].

After the products are obtained, image analysis can be performed, such as land-use classification through the study of the spectral bands, or object-oriented image analysis using supervised classification techniques, as in the studies by [33] and [23], and the production of maps using vegetation indices, as in the studies by [32] and [42].

Because RPA fly at low altitude, spatial resolution increases; however, they cannot cover large extents as orbital platforms do, and a large number of images must be captured to cover larger areas of interest. Mosaicking these images is therefore a necessary preprocessing step [57]. Studies by [51] developed a completely automated method for mosaic preparation, an advance in the preprocessing of such images that reduces processing time.

An automatic georeferencing method for small areas was proposed by [54], reaching an accuracy of 0.90 m. This procedure reduces data processing time, but errors of 0.90 m in nutrient application, spraying and planting, which require accuracy, may cause application errors and thus compromise the crop. Therefore, studies that improve the accuracy of automatic processing are valuable in the current scenario of this technology.

Accurate ground control points (GCPs) are important data for the geometric correction of products obtained by remote sensing, and GCPs can significantly increase the accuracy of maps. These control points are targets marked on the ground, or characteristic points of the terrain, such as road crossings and building corners, that can be identified in the image [50]. Their coordinates must be collected with a global positioning system (GPS) receiver in Real-Time Kinematic (RTK) or Post-Processed Kinematic (PPK) mode for good accuracy. Research performed with RPA imagery has shown good results using a minimum number of points spread over the study area. For example, [49] used 24 targets as GCPs collected with a dual-frequency RTK GPS in a 30 x 50 m area, obtaining position errors of around ± 0.05 m horizontally and ± 0.20 m vertically, and [44] collected 23 GCPs with a dual-frequency RTK GPS in a 125 x 60 m area and obtained Root-Mean-Square Error (RMSE) values of around 0.04-0.05 m horizontally and 0.03-0.04 m vertically.
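RMSE figures like those quoted above are computed from the residuals between GCP coordinates measured in the field and those estimated in the georeferenced product. A minimal sketch with hypothetical residuals (not data from [44] or [49]):

```python
def rmse(residuals):
    """Root-mean-square error of a list of GCP residuals, in meters."""
    return (sum(e * e for e in residuals) / len(residuals)) ** 0.5

# Hypothetical horizontal residuals (m) at five independent check points
horizontal = [0.03, -0.05, 0.04, -0.02, 0.05]
print(f"horizontal RMSE: {rmse(horizontal):.3f} m")  # -> horizontal RMSE: 0.040 m
```

Because the residuals are squared, RMSE penalizes a single large outlier point more heavily than the mean absolute error would, which is why check points are usually assessed individually as well.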

For the automatic assembly of the orthomosaic, several software packages already facilitate and automate this preprocessing. They allow a high degree of automation, enabling people without photogrammetric expertise to produce accurate DSM, DEM and orthomosaic products in a shorter time than with conventional photogrammetry [52].

The open-source and free software currently available (Table 1) is less reliable and accurate than commercial packages such as PhotoScan for large blocks and complex images [6,29]. For image acquisition in small, occasional areas, such software meets the demand; however, obtaining the products (orthomosaics, DEM, DSM) requires processing time, which remains a problem with this technology. According to [29], approximately 25% of the time is spent on flight planning and RPA image collection, 15% on obtaining control points (GCPs) in the field and 60% on photogrammetric processing. Even with automatic software, processing power is required to produce the final products.

Table 1.  3D building software, mosaic and other image-based products 

Source: The Authors.

2.5. Applications and potentialities

There are aircraft applications for obtaining data in agriculture across crops ranging from wheat [16,19,42], grape [32], canola [38] and maize [5] to perennial crops such as apple [10], palm [36] and coffee [15,22], using several platforms, both fixed-wing [23,36] and rotary-wing [31,32,38,54]; carrying RGB [23], multispectral [31] and thermal cameras [10]; and employing several forms of image treatment for each purpose.

However, the application of RPA in agriculture is still in its infancy: studies related to agriculture remain limited, and several fronts can be developed, such as improving image processing time, sharing products in real time, increasing the capacity of embedded sensors, improving RPA autonomy and making the technology available to farmers.

Within this context, it is possible to study the use of these aerial platforms combined with sensors, mainly in the visible spectrum, for applications within PA: vegetation indices in the visible range, observation of planting failures, image segmentation and classification, quantification of planted areas, analysis of crop uniformity, detection of plant height and crown diameter from images, monitoring of invasive plants, plant lodging and crop anomalies, the formation of historical series for crops, and growth monitoring through seasonal flights, among other applications. Multispectral and thermal sensors can also be embedded to correlate with water stress, nutritional stress, pests and diseases and other factors related to plant health, which broader bands of the electromagnetic spectrum can capture more efficiently. Another potential use of these aircraft is the application of agrochemicals (RPA sprayers), identifying the site that needs pesticide or fertilizer and applying it automatically. This procedure can be useful for small areas and sensitive crops, such as strawberry and tomato, that need specific and localized care in the pursuit of quality and sustainability. It is worth mentioning that developments and studies seeking to integrate image capture with real-time processing, providing the farmer with a fast and accurate crop diagnosis, will contribute significantly to the advance of PA in the field.

3. Considerations

During the last decade, RPA technologies have evolved and expanded their range of applications. Aerial imagery from RPA applying remote sensing to precision agriculture includes not only weed mapping but also vigor mapping, mapping and detection of nutrient deficiencies, susceptibility to pests, biomass estimation and pasture monitoring.

There are still shortcomings in the use of this technology, such as limited sensor capacity and low spectral resolution. Even so, it is possible to perform RPA-based remote sensing studies with conventional cameras for agricultural applications, potentially enabling better field management through images of high temporal and spatial resolution.


How to cite: Santos, L.M., Ferraz, G.A.S., Barbosa, B.D.S. and Andrade, A.D., Use of remotely piloted aircraft in precision agriculture: a review. DYNA, 86(210), pp. 284-291, July - September, 2019

L.M. Santos, received the BSc. degree in Agricultural and Environmental Engineering in 2016 from the Universidade Federal Rural do Rio de Janeiro, Brazil, the MSc. in Agricultural Engineering in 2018 from the Universidade Federal de Lavras, Brazil, and a safety engineering degree from Unilavras. Currently, she is a PhD student in Agricultural Engineering at the Universidade Federal de Lavras. Her research interests include: precision agriculture using unmanned aircraft systems. ORCID: 0000-0001-8406-2820

G.A.S. Ferraz, received the BSc. in Agricultural Engineering in 2008, the MSc. in Agricultural Engineering in 2010, and the PhD degree in Agricultural Engineering in 2012, all of them from the Universidade Federal de Lavras, Lavras, Brazil. From 2011 to 2014, he worked at the Universidade Federal Rural do Rio de Janeiro, and since 2014 at the Universidade Federal de Lavras, where he is a full professor in the Engineering Department. His research interests include: precision agriculture, geostatistics, agricultural power machinery and agricultural machines. ORCID: 0000-0001-6403-2210

B.D.S. Barbosa, received the BSc. degree in Agricultural and Environmental Engineering in 2014 from the Universidade Federal de Minas Gerais, Brazil, and the MSc. in Water Resources Engineering from the Universidade Federal de Lavras, Brazil, in 2015. Currently, he is a PhD student in Agricultural Engineering at the Universidade Federal de Lavras. His PhD research focuses on precision agriculture using unmanned aerial vehicles. ORCID: 0000-0001-6791-2504

A.D. Andrade, received the BSc. degree in Agricultural Engineering in 2016 and the MSc. degree in Agricultural Engineering in 2018, both from the Universidade Federal de Lavras, Lavras, Brazil. His research interests include: agricultural machines, precision agriculture and machinery testing. ORCID: 0000-0002-5851-5950

Received: September 05, 2018; Revised: August 26, 2019; Accepted: September 09, 2019

This is an open-access article distributed under the terms of the Creative Commons Attribution License.