DYNA

Print version ISSN 0012-7353; online version ISSN 2346-2183

Dyna rev.fac.nac.minas vol. 88, no. 217, Medellín, Apr./June 2021. Epub Nov. 08, 2021

https://doi.org/10.15446/dyna.v88n217.91879 

Articles

Use of unmanned aircraft systems for bridge inspection: a review


Didier Aldana-Rodríguez a  
http://orcid.org/0000-0002-6483-9580

Diego Leonardo Ávila-Granados b  
http://orcid.org/0000-0001-6782-3667

Jorge Armando Villalba-Vidales a  
http://orcid.org/0000-0001-6921-0565

a Facultad de Ingeniería y Ciencias Básicas, Fundación Universitaria Los Libertadores, Bogotá, Colombia. daldanar@libertadores.edu.co, javillalbav01@libertadores.edu.co

b Escuela de Aviación del Ejército Nacional de Colombia, Bogotá, Colombia. diegoavilagranados@cedoc.edu.co


Abstract

This review describes the use of Unmanned Aircraft Systems (UAS) for bridge inspection, with an emphasis on multi-rotor UAS. It outlines the different levels of automation and autonomy in UAS operation and which of those levels are achieved during inspections. A description of the UAS payload, consisting of the equipment required to acquire data and images, is included. The review also compiles the techniques used to create models from images in order to detect failures and perform Structural Health Monitoring (SHM), such as 3D reconstruction, infrared thermography, Structure From Motion (SFM) and Convolutional Neural Networks (CNN), along with the software required to apply them. It then explains the generation of mathematical models to characterize multi-rotors and generate efficient trajectories. Finally, the review concludes by describing the operational limitations of UAS and future challenges.

Keywords: bridges; unmanned aircraft system; 3D reconstruction; infrared thermography; structural health monitoring


1. Introduction

Bridge inspection plays an important role in the construction and infrastructure sector, since it is required to maintain structures' safe operation, extend their useful life and ensure reliability through sustainable processes that guarantee the efficient use of resources [1]. Inspections consist of periodically checking bridges to perform Structural Health Monitoring (SHM) with Non-Destructive Testing (NDT) techniques, which seek to detect failures and discontinuities in structural materials and components without physically affecting the examined components, so that the bridge can be repaired and refurbished if required, ensuring its continuous operation under safety standards.

Currently, SHM of bridges is mostly accomplished through visual inspection supported by sensors and specialized cameras. Its objective is to obtain data and images that make it possible to accurately determine variations in physical characteristics, possible defects, and discontinuities in the structural components of bridges. These inspections are performed by inspectors using manual techniques, accessing the bridge with ladders, scaffolding, vehicles with lifts, or climbing with ropes and harnesses, activities that create potentially unsafe conditions for people. According to [1,2], during visual inspections, exhaustive studies and detailed evaluations of bridge conditions are expensive, technically complex, and time-consuming, especially during image acquisition and data processing, which are the most demanding activities. The purpose of visually inspecting a bridge is to detect defects such as cracks, fractures, corrosion, pores and delaminations, among others. High-resolution images are required to precisely detect these defects. The images must be taken at a distance determined by the specifications and characteristics of the cameras and sensors used, also considering the geometric and physical characteristics of the bridge's structure. These aspects, according to [2], occasionally restrict, hinder, or prevent inspectors' access to specific parts of the bridge, affecting the quantity, quality and clarity of the images an inspector can obtain. Such a scenario introduces subjectivity into the results, leading to decision-making based on the analysis of one or a few images.

Considering these limitations, Unmanned Aircraft Systems (UAS) are well suited as platforms for observing and acquiring data and high-resolution images, making them an innovative, simple, cheap, efficient and safe choice for inspecting and monitoring bridge conditions. These are significant advantages compared to traditional methods and the use of manned aircraft. This review provides useful information for better understanding the use of UAS, contributing elements for developing future research projects, academic processes, equipment selection and appropriate techniques for inspecting and evaluating the structural conditions of bridges. In the last decade, the use of UAS for structural inspection has increased, but significant technological development has not been evident. In addition, the scientific literature covering the subject is scarce in comparison with that of traditional techniques, which makes this field a relevant subject for researching and analyzing potential applications.

According to [3], the most common name for these vehicles is drones, referring to the drone bee, since that bee's particular sound resembles the sound of these airborne vehicles. Over the past 30 years, the term has evolved from Unmanned Aerial Vehicle (UAV) to more precise terms, such as Remotely Piloted Aircraft System (RPAS) and Unmanned Aircraft System (UAS), terms and acronyms embraced by the scientific and academic community, government aviation regulators [4,5,6] and companies dedicated to manufacturing or servicing these vehicles. A UAS is considered a system because it integrates three subsystems: i) the unmanned aircraft, ii) the ground control station, and iii) the communications link between the aircraft and the ground station [7-9]. These subsystems are synergistically linked to achieve autonomous, controlled and stable flight. A UAS can be remotely controlled by a human on the ground, fly autonomously under the control of a computer, or operate through a combination of both methods. This leads to a system with different degrees of automation and operational autonomy that, according to the National Highway Traffic Safety Administration (NHTSA) of the United States [10], can be classified into six levels: i) level 0, the pilot has full manual control of vehicle navigation; ii) level 1, a certain degree of automation is applied to two flight modes, the first corresponding to holding altitude during dynamic flight and the second to static, sustained flight; iii) level 2, the UAS navigates based on several flight modes programmed by the pilot, maintaining its route autonomously if there are no unexpected changes in the flight environment; iv) level 3, unlike the previous level, the UAS understands changes in the flight environment and adjusts the flight modes to navigate in the new environment; v) level 4, the UAS can adapt and react when there is an anomaly in the system, an accident or a sudden collision with an object; and vi) level 5, the UAS can navigate autonomously in all environments and situations. A more detailed classification by levels is proposed in [11], as follows: i) remotely operated vehicle, ii) vehicle with the capacity to complete a mission, iii) robust real-time response to failures or events, iv) vehicle that adapts during failures or events, v) real-time coordination between vehicles, vi) real-time cooperation between vehicles and vii) fully autonomous aircraft.
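
As a minimal illustration (not taken from any of the cited works), these NHTSA-style levels can be encoded as a simple enum so that an inspection planner could gate its features by the autonomy available; the names and the helper function are assumptions for the sketch:

```python
# A minimal sketch: the six NHTSA-style automation levels as an enum.
from enum import IntEnum

class AutomationLevel(IntEnum):
    MANUAL = 0              # pilot has full manual control
    ASSISTED = 1            # altitude hold / stationary hover assistance
    PROGRAMMED = 2          # follows pre-programmed flight modes
    ENVIRONMENT_AWARE = 3   # adapts flight modes to a changing environment
    FAULT_ADAPTIVE = 4      # reacts to system anomalies or collisions
    FULLY_AUTONOMOUS = 5    # autonomous in all environments and situations

def requires_pilot(level: AutomationLevel) -> bool:
    # Below level 3, the pilot remains responsible for reacting to
    # unexpected changes in the flight environment.
    return level < AutomationLevel.ENVIRONMENT_AWARE

print(requires_pilot(AutomationLevel.PROGRAMMED))  # True
```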

UAS are also classified into two types according to their takeoff and landing features. The first type is Horizontal Takeoff and Landing (HTOL), characterized by fixed wings, long range and high speeds. The second type is Vertical Takeoff and Landing (VTOL), characterized by one or more rotating wings and the ability to perform sustained, stable static flight [12,13], which is an advantage for bridge inspection. VTOLs are slower than HTOLs but, on the other hand, smaller, lighter and cheaper. According to their configuration and number of motors, VTOLs are divided into helicopters and multi-rotors. The latter are widely used for civil purposes due to their good maneuverability, good controllability and lower acquisition cost. They are called tricopters if equipped with three motors, quadcopters with four, hexacopters with six, and octocopters with eight [14].

Multi-rotors have five basic components [15,16]: i) a frame, which can be made of plastic, carbon fiber, wood or aluminum; ii) a motor-propeller assembly, in which fixed-pitch propellers (pitch being the distance traveled through the air during one complete 360-degree rotation of the propeller) are coupled to brushless electric motors located on the arms of the frame; iii) an Electronic Speed Controller (ESC), which manages the current supplied to each motor according to the RPM required by the multi-rotor's operation and is driven by Pulse Width Modulation (PWM); iv) a Flight Controller (FC), considered the brain of the drone, in charge of sending control signals to the ESCs. These signals are generated from the commands the FC receives through a signal receiver (Rx) transmitted by the ground station (Tx), and from the signals of various types of sensors, both internal and external to the FC [17]. The sensors are generally devices such as gyroscopes, accelerometers, barometers and magnetometers, which allow the FC to determine the attitude, altitude, speed and position of the aircraft, supported by a satellite navigation system such as GPS or GLONASS. v) A Lithium Polymer (LiPo) battery with high electrical power and energy density that feeds the UAV's electronic components [18].
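
To make the sensor-fusion role of the FC concrete, the sketch below shows one common way a controller can estimate pitch by blending gyroscope and accelerometer readings with a complementary filter. It is a hedged illustration with assumed sample values, not the firmware of any particular FC:

```python
# A sketch of complementary filtering: the gyro integral is smooth but
# drifts; the accelerometer pitch is noisy but drift-free; blend them.
import math

def accel_pitch(ax: float, ay: float, az: float) -> float:
    """Pitch angle (rad) implied by the gravity vector the accelerometer measures."""
    return math.atan2(-ax, math.sqrt(ay * ay + az * az))

def complementary_filter(pitch_prev, gyro_rate_y, ax, ay, az, dt, alpha=0.98):
    # Integrate the gyro rate, then correct the drift with the accelerometer.
    gyro_pitch = pitch_prev + gyro_rate_y * dt
    return alpha * gyro_pitch + (1.0 - alpha) * accel_pitch(ax, ay, az)

pitch = 0.0
# Example IMU sample: 0.01 rad/s pitch rate, near-level gravity vector, 5 ms loop.
pitch = complementary_filter(pitch, 0.01, 0.02, 0.0, 9.81, 0.005)
print(f"estimated pitch: {math.degrees(pitch):.4f} deg")
```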

2. Materials and methods

This article was created through a systematic review, as described in [19]. SCOPUS and Google Scholar were used as search tools. On SCOPUS, the search was performed over two different time windows: articles released during the last five years (January 2015 to February 2020) and articles released before 2015. The keywords used for the search were: i) drones, ii) unmanned aircraft vehicle, iii) unmanned aircraft system, iv) remotely piloted aircraft system and v) bridge inspection. A total of 257 articles were found, which were reduced to 112 after filtering the results according to their relevance (determined by number of citations). The guiding questions described in [19] were used to perform the analysis. As a result, 55 articles obtained from the SCOPUS search were selected. This group of papers was called the Academic Relevant Space (ARS) [19]. The same keywords used for SCOPUS were used on Google Scholar, from which a total of 11 references were chosen. These references, added to the ARS from SCOPUS, comprise the 66 references taken into consideration for writing this review.
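
The screening step can be summarized programmatically. The sketch below is a minimal illustration of the year-window and citation-count filtering described above, with made-up records and a hypothetical threshold rather than the authors' actual criteria:

```python
# A minimal sketch, not the authors' workflow: filter search results by
# publication window and by citation count as a relevance proxy.
records = [
    {"title": "UAV bridge inspection A", "year": 2018, "citations": 120},
    {"title": "UAV bridge inspection B", "year": 2016, "citations": 3},
    {"title": "UAV bridge inspection C", "year": 2014, "citations": 45},
]

MIN_CITATIONS = 10  # hypothetical relevance threshold

recent = [r for r in records if 2015 <= r["year"] <= 2020]   # last-five-years window
earlier = [r for r in records if r["year"] < 2015]           # pre-2015 window
relevant = [r for r in recent + earlier if r["citations"] >= MIN_CITATIONS]
print(len(relevant), "of", len(records), "records kept")
```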

3. Studies performed with unmanned aircraft systems for bridge inspection

Bridges are structures of vital importance for the transportation of people and goods. For this reason, their periodic inspection is pertinent and necessary to ensure continuous and safe operation, as well as to extend their useful life. This is extremely important for bridges that are subject to structural degradation, aging and mechanical damage caused by load-induced fatigue, thermal expansion and/or contraction, delamination in concrete, and cracks. Cracks are considered one of the most important parameters for monitoring and evaluating structural conditions [20].

Although there is a variety of bridge types, bridges are usually divided into three major sections: i) foundation, ii) substructure and iii) superstructure. The foundation contains the piles that provide support and a solid base for the bridge; they also transmit the weight and loads to the terrain. The caps, which are made of concrete and contribute to transferring loads to the ground, are located above the piles. The abutments are found in the substructure; these are vertical walls located at the ends of the bridge, which retain the soil around it [21]. If the bridge is composed of several sections, as shown in Fig. 1, piers and pier caps are located at the ends of each section to support the sections and disperse the vibrations produced by traffic crossing the bridge. The decks that directly support the traffic loads are located in the section called the superstructure. These elements are attached to the pier caps through bearings, which transfer loads from the decks to the substructure.

Source: The Authors

Figure 1 Main Parts of a Bridge 

Inspections performed manually on bridges by means of visual inspection techniques are expensive, risky, time-consuming and require the expertise of highly qualified inspectors, which introduces a high degree of subjectivity into the data analysis process and into decision-making for maintenance. Additionally, equipment such as ladders, ropes and lifting baskets mounted on land or water vehicles is required to inspect areas that are difficult to access [3,21,33]. Due to the large size of certain structures, there is a high risk derived from working at heights. For this reason, [20] highlights the need for an intelligent and precise technique to perform a Structural Health Monitoring (SHM) study. Methods based on emerging technologies and digital techniques, combined with the use of UAS equipped with cameras and sensors of various types, allow evaluating and monitoring bridges' structural conditions through image-based approaches as a source of information.

The use of multi-rotor UAS for bridge inspection has seen significant development in the last 10 years, since these vehicles are smaller and more maneuverable than manned and unmanned fixed-wing aircraft. Additionally, they have a certain degree of trajectory control and flight autonomy, facilitating their use for inspecting complex areas that are difficult to access. Moreover, a variety of equipment and sensors for inspecting structures can be mounted on UAS, such as high-resolution digital cameras, thermographic cameras [22,23], Light Detection and Ranging (LIDAR) devices for terrain characterization [24], radiation detectors [25] and humidity and temperature sensors [26], among others that make up the UAS payload. These elements allow inspectors to obtain the information needed to detect and analyze various types of defects and discontinuities in bridge structural components and materials.

3.1 Methodology to perform bridge inspection with UAS

A detailed methodology for acquiring data autonomously from images obtained using UAS is presented in [20]. The proposed methodology consists of ten macro-processes, as follows: i) task definition, ii) criteria assessment, iii) mission preparation and control, iv) flight path generation, v) data acquisition, vi) photogrammetry and 3D reconstruction, vii) 3D modeling and visualization, viii) anomaly detection, ix) mechanical interpretation and x) structural condition assessment. Inspection parameters and criteria are defined in processes i and ii, determining which properties and quantities must be obtained, such as structure geometry, anomalies, defects and discontinuities, so that they can be detected and evaluated. The requirements and criteria for rejecting and accepting these anomalies are also established. Legal and safety specifications are defined during process iii, such as the minimum approach distance between the object and the drone, as well as minimum and maximum flight heights. The technical specifications of the equipment on the drone, such as flight systems, cameras, lenses and sensors, are also considered. Processes iv and v include planning the flight path along determined waypoints; the camera orientation is set to obtain high-resolution images with a percentage of overlap between them. These processes are also described in [22,28]. At this point, a flight is performed to obtain images of the component to be analyzed. As a result of steps iv and v, the flight path has been optimized with the specific points to be analyzed, and a set of raw images is obtained, along with the camera's real position and orientation and the time stamp of each image. Processes vi and vii consist of pre-processing the images acquired with the UAS through radiometric and geometric enhancement. A 3D model is built to perform a georeferenced photogrammetric analysis using a simple meshing of the inspected component. 3D reconstruction is performed using the Structure From Motion (SFM) technique [20,33], which obtains 3D models from 2D images, and georeferencing is accomplished by adjusting the gathered data with Dense Stereo Matching [27]. As a result of steps vi and vii, a dense cloud of georeferenced 3D points is acquired [17,24,28]. Process viii focuses on executing an automated analysis of the images, identifying and sizing the anomalies through their metrics and characterization, quantifying variables such as pixel size and object resolution. In parallel, a mesh and texture are created to build a 3D surface model, achieving the data integration needed to map the anomalies. Step ix includes analyzing the point cloud and recording geometric changes in the structure and changes in the location and dimensions of the anomalies. The anomalies are interpreted based on the evaluation criteria and the mechanical properties of the materials that make up the inspected component. Finally, the structural evaluation of the bridge's condition is completed in process x.
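
For orientation, the ten macro-processes can be strung together as a schematic pipeline. The sketch below is purely illustrative: every function is a trivial placeholder standing in for a substantial system (flight planning, SFM reconstruction, anomaly detection), and all names and values are assumptions, not an implementation of [20]:

```python
# A schematic, runnable sketch of the ten-macro-process pipeline; every
# step is a placeholder with made-up data.
def define_task_and_criteria(bridge):   return {"bridge": bridge, "max_crack_mm": 2.0}
def prepare_mission(criteria):          return {**criteria, "min_distance_m": 2.0}
def generate_flight_path(mission):      return [(0, 0, 10), (5, 0, 10), (10, 0, 10)]
def acquire_images(waypoints):          return [f"img_{i}.jpg" for i, _ in enumerate(waypoints)]
def reconstruct_3d(images):             return {"points": 1_000_000, "images": images}
def detect_anomalies(cloud):            return [{"type": "crack", "width_mm": 0.8}]
def assess_condition(anomalies, criteria):
    return all(a["width_mm"] <= criteria["max_crack_mm"] for a in anomalies)

criteria  = define_task_and_criteria("bridge-01")   # processes i-ii
mission   = prepare_mission(criteria)               # process iii
waypoints = generate_flight_path(mission)           # process iv
images    = acquire_images(waypoints)               # process v
cloud     = reconstruct_3d(images)                  # processes vi-vii (SFM)
anomalies = detect_anomalies(cloud)                 # process viii
print("bridge passes:", assess_condition(anomalies, criteria))  # processes ix-x
```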

3.2 Equipment, software and techniques used in bridge inspection with UAS

The research developed in [28] analyzed the abutments, piers and pylons of an aged bridge to detect, characterize and quantify cracks. An Inspire 2 quadcopter equipped with a Zenmuse X5S camera with a 20.8-megapixel resolution was used to do so. The methodology consisted of the following steps: i) acquiring the images using the camera mounted on the drone; ii) generating a point cloud to build a damage map or 3D inspection map by means of the Pix4D Mapper commercial software [20,28], a process that took 150 min; iii) detecting cracks through deep learning methods [29], such as region analysis with a Convolutional Neural Network (CNN) [30,31] (a CNN is a deep learning algorithm that takes an image as input and weighs the importance of various aspects or objects in the image, differentiating them from one another); iv) quantifying cracks through image processing, binarizing the images to convert Red-Green-Blue (RGB) images [22,28,38,39] into binary ones with AutoCAD 2017 software, a process that took 30 min, with noise filtered out to clean the image; and v) displaying the images on an inspection map using the Sobel algorithm [32] for edge and contour detection. This methodology was applied to the region of interest (ROI) of the bridge, which in this case was the lateral part of the decks and the pier caps. The UAS was manually operated at a distance of two meters to avoid losing the GPS signal and to ensure a sufficient camera Field of View (FOV). A total of 384 images were obtained, revealing twelve cracks between 0.55 mm and 1.92 mm wide and between 8.32 mm and 78.43 mm long. After comparing these results to an analysis of the same cracks by traditional methods, an error of between 1 and 2% was found.
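
The binarization and edge-detection steps lend themselves to a compact illustration. The sketch below uses OpenCV rather than the AutoCAD-based workflow of [28], so it should be read as an analogous reproduction of steps iv and v under assumed inputs; the file name is hypothetical:

```python
# A hedged OpenCV sketch of binarization followed by Sobel edge detection.
import cv2
import numpy as np

img  = cv2.imread("bridge_pier.jpg")                 # hypothetical RGB input (BGR in OpenCV)
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

# Binarization: Otsu's method picks the threshold automatically; cracks
# appear as dark, thin regions against the lighter concrete.
_, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)

# Noise filtering analogous to the cleanup step described in the study.
binary = cv2.medianBlur(binary, 5)

# Sobel operator [32] for edge and contour detection.
gx = cv2.Sobel(gray, cv2.CV_64F, 1, 0, ksize=3)
gy = cv2.Sobel(gray, cv2.CV_64F, 0, 1, ksize=3)
edges = np.uint8(np.clip(np.hypot(gx, gy), 0, 255))

cv2.imwrite("binary.png", binary)
cv2.imwrite("edges.png", edges)
```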

An inspection of the Placer River Trail Bridge in Alaska was conducted in [33]. The structure is 85 meters long and has a wooden superstructure. The research compared a traditional inspection developed with the LIDAR method, applied manually by an inspector, to a hybrid autonomous method performed by a UAS. During the study, the effectiveness and efficiency of the method were validated by comparing the number of points, point density and noise level of the images. The pictures were gathered using a DJI S800 hexacopter equipped with a 24.3-megapixel SONY NEX 7 camera and a GoPro Hero 3 camera. Data acquisition and flight path planning were performed using the Mission Planner software [34] from 3D Robotics, allowing the researchers to obtain 2,626 images and 20 videos. Subsequently, the team created the 3D reconstruction of the bridge from the images using the Dense Structure From Motion (DSFM) technique [35] jointly with the Hierarchical Point Cloud Generation (HPCG) technique described in detail in [36]. By employing this hybrid method, 1,412,060,890 points were obtained, with a density of 5,656,185 points per cubic meter and an image noise level (distortion) of 4.5 mm. Using the traditional LIDAR method, 202,790,259 points were obtained, with a density of 1,478,099 points per cubic meter and a noise level of 1.8 mm. The larger number of points in the hybrid technique increases the density and consequently the geometric resolution of the image, which benefits the detectability of possible defects or discontinuities in the ROI or damage region (DR). Nevertheless, the increase in the total number of processed points increases the complexity and processing time of creating the model.
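
As a quick consistency check on these figures, density is simply the point count divided by the reconstructed volume; the short sketch below reworks the numbers quoted above (the implied volumes are a derived observation, not a figure reported in [33]):

```python
# Reworking the reported point counts and densities: density = points / volume.
hybrid_points, hybrid_density = 1_412_060_890, 5_656_185   # points, points/m^3
lidar_points,  lidar_density  = 202_790_259,  1_478_099

print(f"hybrid implied volume ~ {hybrid_points / hybrid_density:.0f} m^3")
print(f"lidar implied volume  ~ {lidar_points / lidar_density:.0f} m^3")
print(f"density ratio         ~ {hybrid_density / lidar_density:.1f}x")   # ~3.8x
```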

The objective of the study performed in [37] was to inspect a steel bridge to detect damage through a fracture critical member (FCM) analysis. Two aspects were analyzed and compared: i) the Maximum Crack to Camera distance (MCC), or maximum distance for detection, and ii) the Achievable Crack to Platform distance (ACP). Two experiments were conducted: i) an external field inspection on a 120 m long bridge specimen simulating a section of a steel bridge and ii) a real inspection of a bridge in Utah. The following equipment was used: i) a DJI MAVIC quadcopter equipped with a 12-megapixel camera, ii) a 3DR IRIS quadcopter equipped with a 12-megapixel GoPro Hero 4 camera and iii) a quadcopter assembled by the researchers, equipped with a 16-megapixel Nikon COOLPIX L830 camera. The measurements were made under the lighting conditions generally found during bridge inspections: i) dark conditions under the bridge on a cloudy day, ii) intermediate lighting conditions under the bridge on a clear day and iii) artificial lighting conditions using electric lamps. The results revealed that cracks can be detected at a greater MCC distance as illumination increases. The MCC distances increased from 0.2 m in darkness to 0.6 m with the GoPro, and from 0.4 m to 1.10 m with the DJI camera. A 1.10 m MCC was obtained with the DJI camera under artificial lighting, while the Nikon camera went from a 0.3 m MCC in dark conditions to 1 m under artificial lighting. Another aspect that affects the efficiency of crack detection is the camera's ability to increase its ISO sensitivity, which governs how much light must pass through the lens in low-light conditions. ISO sensitivity values ranged between 280 and 1600 for the Nikon camera and between 480 and 1600 for the Mavic DJI camera, while the GoPro remained fixed at 400 because its ISO does not vary with lighting conditions, making GoPros less suitable for crack detection. In the absence of a GPS signal, the DJI MAVIC quadcopter, unlike the other two multi-rotors, used an alternate positioning system based on stereovision and sonar to maintain altitude, which guaranteed a 0.25 m ACP and a 0.25 m MCC with clear crack detection both in real time and in post-flight image analysis.
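
The dependence of detectable crack size on stand-off distance can be framed with the standard ground-sample-distance relation, GSD = distance × sensor width / (focal length × image width in pixels). The sketch below evaluates it at the MCC distances reported above; the sensor parameters are illustrative assumptions, not the specifications of the studied cameras:

```python
# A hedged sketch: object-space resolution (mm per pixel) versus stand-off distance.
def gsd_mm(distance_m: float, sensor_width_mm: float,
           focal_length_mm: float, image_width_px: int) -> float:
    """Size (mm) on the structure covered by one pixel at the given distance."""
    return (distance_m * 1000.0) * sensor_width_mm / (focal_length_mm * image_width_px)

# Illustrative small-drone sensor: 6.17 mm wide, 4000 px across, 4.7 mm focal length.
for d in (0.25, 0.6, 1.10):
    print(f"at {d:4.2f} m: {gsd_mm(d, 6.17, 4.7, 4000):.3f} mm/pixel")
# At 0.25 m one pixel covers ~0.08 mm, so a 0.5 mm crack spans several pixels;
# at 1.10 m it covers ~0.36 mm, leaving far fewer pixels per crack.
```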

Another objective of [37] was to determine the effects of wind on detecting cracks and fractures using UAS. To do so, four beams, the abutments and two girders of a bridge located over the Fall River near the city of Ashton, Idaho, were inspected. The procedure was performed near midday under wind speeds of between 7 m/s and 11 m/s. With the drone assembled by the researchers, it was not possible to maintain control or maneuverability within the mentioned wind speed range. With the IRIS drone, an ACP distance of 0.6 m was achieved with no real-time crack detection, although cracks were found later during post-flight image analysis. In contrast, with the MAVIC DJI drone, an ACP distance of 0.25 m was achieved and cracks were detected both in real time and during post-flight analysis for speeds near 7 m/s under all lighting conditions; however, at speeds near 11 m/s, no cracks were detected. In addition, it was possible to detect other defects and anomalies of interest in real time, such as corrosion at the bottom of the south beam, efflorescence, cracks in the concrete, possible delamination in the abutment and minor corrosion in the splice plate of one of the beams.

[37] also analyzed the effectiveness, in terms of probability of detection (POD), of finding cracks in a 120-meter bridge test specimen located at Purdue University. The specimen is used to train inspectors because it has many previously characterized cracks. The results of inspecting the specimen with three drones (Mavic DJI, Inspire 1 and DJI Phantom 3) were compared to the average results obtained by 30 human inspectors. Effectiveness was evaluated by quantifying the number of hits during an inspection versus the actual number of cracks in the specimen. The procedure was performed with a wind speed of 4 m/s and, for evaluation purposes, the following parameters were considered: i) number of cracks reported (calls), ii) number of true positives (hits), iii) number of false positives (fallouts), iv) number of false negatives (misses), v) hit/call ratio, vi) true positive ratio (TPR), calculated by dividing the hits by the sum of hits and misses, and vii) false positive ratio (FPR), calculated by dividing the false positives by the number of calls. The evaluation showed that UAS-assisted inspections lasted between 1.5 and 3 times longer than real-time human inspections and produced approximately twice as many calls, although there was no significant difference in calls during post-flight image analysis. In the TPR index, there was only approximately a 10% difference between UAS and human inspection, and UAS inspections produced between 10% and 20% fewer false positives than human inspections. Based on the results of all the experiments, the study concluded that UAS performance in FCM inspection is of a quality similar to that of inspections performed by human inspectors.
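
These evaluation metrics translate directly into code. The sketch below is a literal transcription of the definitions above, exercised with made-up counts rather than data from [37]:

```python
# The POD evaluation metrics as defined in the text; counts are placeholders.
def inspection_metrics(hits: int, fallouts: int, misses: int) -> dict:
    calls = hits + fallouts                     # i) cracks reported (true + false positives)
    return {
        "hit/call": hits / calls,               # v)  hit-to-call ratio
        "TPR": hits / (hits + misses),          # vi) true positive ratio
        "FPR": fallouts / calls,                # vii) false positive ratio
    }

print(inspection_metrics(hits=18, fallouts=4, misses=6))
# {'hit/call': 0.818..., 'TPR': 0.75, 'FPR': 0.181...}
```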

In [38], the team conducted a study on the deck and girders of a bridge in Idaho. The structure has a length of 675 m, a deck approximately 8 m wide, and an area greater than 2,787 square meters. The inspection was performed by manual flight in First Person View (FPV) mode, with a DJI Phantom 3 quadcopter equipped with a 12-megapixel camera. The approach distance was 2 to 3 meters, which allowed recording 4K video. The procedure's objective was to inspect the connections between components to evaluate the condition of bolts and rivets and the possible presence of rust on them. The bearings were also inspected for misalignment, bulging or tearing, as well as leaks, concrete spalling, steel loss and cracks in the joints. The project faced a limitation due to the loss of the GPS signal while approaching the structure or flying under it; therefore, it was not possible to establish a flight path with viewpoints and/or waypoints. To overcome this limitation, [39] proposed a UAS navigation system using an Ultrasonic Beacon System (UBS). This system was developed to provide high-precision positioning based on ultrasonic sensors, which enables applications in environments without GPS. It can be considered an alternative for generating a mapping and location system with centimeter accuracy, and it is easy to integrate into UAS through low-cost hardware. Additionally, a CNN was used to detect cracks in the concrete of a bridge [40,41] and locate them accurately through a method called geotagging. Three drones were used during the research: two manufactured by the team itself and equipped with Sony FDR-X3000 cameras, and a commercial Parrot Bebop 2. The flight plan was made using Mission Planner. The study consisted of the following steps: i) manufacturing two multi-rotors instead of using a commercial one, since the latter does not allow modifying the source code for autonomous navigation, ii) installing a mobile beacon on the drones to determine their 3D location, iii) modifying the source code of the flight controller firmware, iv) integrating the UBS with the autonomous flight controller and v) replacing the GPS coordinates in the image data with the beacon positions for geotagging. The flight was precise, but there were some fluctuations in altitude. Cracks in the concrete were detected with an accuracy of 96.6%. After comparing the results of the UAS images to those obtained by manual collection, both were shown to be highly accurate.
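
Step v, substituting beacon fixes for GPS geotags, can be illustrated with a nearest-timestamp match. The sketch below is an assumed workflow, not the authors' code; the timestamps and coordinates are placeholders:

```python
# A minimal geotagging sketch: pair each image with the nearest beacon fix in time.
def geotag(images, beacon_log):
    """images: [(timestamp_s, filename)]; beacon_log: [(timestamp_s, x, y, z)] in meters."""
    tagged = []
    for t_img, name in images:
        # Choose the beacon fix whose timestamp is closest to the image capture time.
        t, x, y, z = min(beacon_log, key=lambda fix: abs(fix[0] - t_img))
        tagged.append({"image": name, "xyz_m": (x, y, z), "dt_s": abs(t - t_img)})
    return tagged

beacon_log = [(0.0, 1.0, 2.0, 5.0), (0.5, 1.2, 2.0, 5.1), (1.0, 1.4, 2.1, 5.1)]
images = [(0.48, "img_001.jpg"), (0.97, "img_002.jpg")]
for rec in geotag(images, beacon_log):
    print(rec)
```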

In [22], a multispectral UAS detection system was used to evaluate bridge decks and detect internal delamination in the concrete. A quadcopter assembled on an F550 frame was used to inspect a 31 ft × 13 ft × 8 in concrete deck specimen. The payload comprising the multispectral system consisted of a GoPro Hero3 RGB camera and a FLIR TAU 2 thermographic camera with an operating range of -40 °C to 80 °C. The IR-RGB multispectral system was used to locate subsurface delamination by analyzing the thermograms obtained with the thermographic camera, and to locate surface cracks through high-resolution RGB images. During the study, the researchers detected regions with sub-surface delamination appearing as hot spots in the thermograms, which display temperature gradients by means of thermal contrasts on a scale that associates temperature values with a color gradient.

Another research article that addresses the use of infrared thermography (IRT) linked to a UAS for bridge inspection is [23]. Its main objective was to assess the reliability of using a multi-rotor UAS equipped with an on-board thermographic camera to determine the condition of the reinforced concrete decks of a bridge in the city of London. The technique is based on evaluating certain properties of concrete, such as density, thermal conductivity and specific heat. The study was conducted in the following sequence: i) determining the UAS' capacity for acquiring thermal images, ii) developing a procedure focused on image analysis, iii) creating a mosaic thermogram of the entire deck and iv) producing a condition map with the geometry and dimensions of the detected delaminations. These objectives were pursued through the following methodological steps: i) applying passive IRT to evaluate two deteriorated decks, ii) improving the thermal contrast of the images by means of ImageJ software, iii) overlaying the images using Matlab software to produce the thermal gradient map of the decks, iv) identifying defects through the thermal contrasts achieved in step ii, which are caused by the interruption of heat flow in the concrete, and v) quantifying the delaminated areas through the thermal contrasts of the images. This methodology is consistent with the one described in ASTM D4788-03 [42], which defines the standard procedure and equipment required to conduct a passive infrared thermography test to detect delamination in concrete bridge decks. The researchers used an Inspire 1 Pro drone equipped with a Vue Pro thermal camera. The experiments were conducted 6 hours after sunrise, under the following conditions: a temperature of 26 degrees Celsius, relative humidity of 22%, a wind speed of 22 km/h and dry decks. Four images were taken at a height of 10 m, with a 50% overlap between images and a spatial resolution of 2.5 cm at that height. The total inspection time was 20 minutes. The images were enhanced with ImageJ, and the team joined the overlapping images using a Gaussian smoothing filter [43] to obtain a 640 × 780 mosaic thermogram with 499,200 pixels. It is worth mentioning that the authors developed code in Matlab to extract the pixels from the images; they then generated a threshold classification to sort and choose the appropriate photos. The total percentage of delaminated area on the bridge deck was determined by calculating the percentage of pixels in the higher-temperature areas. The results were compared to those obtained with traditional techniques: inspecting the deck using the traditional hammer technique identified 17% of the deck as delaminated, while the total delaminated zone calculated from the UAS and IRT study was 15.4%, a difference of only 1.6% between the two methods. Similarly, the total delaminated zones detected in a second bridge deck by the hammer drilling method and by IRT were 32% and 29.3% respectively, a 2.7% difference. These results demonstrate the feasibility of using IRT articulated with UAS for evaluating the condition of concrete bridge decks. Table 1 compares and lists the important and relevant aspects and parameters of each study.
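
The pixel-counting step, quantifying delaminated area from the higher-temperature zones of the mosaic, reduces to thresholding an array. The sketch below works on a synthetic 640 × 780 thermogram with an injected hot spot and a simple mean-plus-two-sigma criterion; both are assumptions for illustration, not the thresholds used in [23]:

```python
# A numpy sketch of step v: delaminated area as the share of hot pixels.
import numpy as np

rng = np.random.default_rng(0)
thermogram = rng.normal(26.0, 0.3, size=(640, 780))   # synthetic deck, deg C
thermogram[100:180, 200:320] += 1.5                   # injected hot spot (delamination)

threshold = thermogram.mean() + 2.0 * thermogram.std()  # illustrative criterion
delaminated_pct = 100.0 * np.mean(thermogram > threshold)
print(f"delaminated area: {delaminated_pct:.1f}% of {thermogram.size} pixels")
```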

Table 1. UAS applications in bridge inspection.

Source: Prepared by the authors

3.3 Trajectory and mission planning for UAS

Flight planning is one of the most important factors for an inspection's success and for the quality of the images obtained with UAS. Flight trajectories can be generated with advanced techniques such as image-based recognition [28,44]; Simultaneous Localization and Mapping (SLAM) [45], a technique that maps an unknown environment in real time while simultaneously locating the vehicle within it; and LiDAR Odometry and Mapping in real time (LOAM) [46], which estimates position through LIDAR. Commercial planning software, such as Mission Planner [33], DJI Ground Station Pro [47], Pix4D Capture [48] or DroneDeploy [49], is available to perform the mission. Mission planning software contains graphic interfaces that allow the user to define the trajectory and tasks of the UAS. Drones may follow three types of trajectory control: i) point-to-point control, consisting of going from point A to point B regardless of the trajectory between the points, ii) trajectory tracking, in which the drone is required to follow a certain trajectory, and iii) obstacle avoidance, in which the drone is required to avoid obstacles along a certain trajectory. By using a path-planning method, the multi-rotor can acquire the ability to avoid obstacles, track targets and move from one point to another with precision and operational safety [11]. Once a dynamic and mathematical model of the drone has been obtained, optimal flight paths can be defined by means of high- or medium-level programming languages according to the user's needs [50]. There are currently several techniques and models for planning optimal trajectories with minimum energy consumption, smooth transitions between states and minimization criteria, establishing initial conditions for the position, speed and acceleration of the path to follow [51]; these parameters are required to avoid obstacles.
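
One common way to impose boundary conditions on position, speed and acceleration is a quintic polynomial per axis between waypoints. The sketch below solves for the six coefficients from the six boundary conditions; it is a generic textbook construction offered for illustration, not a method taken from the cited works:

```python
# A hedged sketch: quintic point-to-point trajectory with boundary conditions
# on position, velocity and acceleration at both ends of the segment.
import numpy as np

def quintic_coeffs(p0, v0, a0, pf, vf, af, T):
    """Coefficients c of p(t) = sum(c[i] * t**i) meeting the six boundary conditions."""
    A = np.array([
        [1, 0,  0,     0,       0,        0],         # p(0) = p0
        [0, 1,  0,     0,       0,        0],         # v(0) = v0
        [0, 0,  2,     0,       0,        0],         # a(0) = a0
        [1, T,  T**2,  T**3,    T**4,     T**5],      # p(T) = pf
        [0, 1,  2*T,   3*T**2,  4*T**3,   5*T**4],    # v(T) = vf
        [0, 0,  2,     6*T,     12*T**2,  20*T**3],   # a(T) = af
    ], dtype=float)
    return np.linalg.solve(A, np.array([p0, v0, a0, pf, vf, af], dtype=float))

# Rest-to-rest move of 5 m in 4 s along one axis.
c = quintic_coeffs(p0=0.0, v0=0.0, a0=0.0, pf=5.0, vf=0.0, af=0.0, T=4.0)
print("x(2.0) =", np.polyval(c[::-1], 2.0))  # ~2.5 m at the midpoint, by symmetry
```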

3.4 Dynamic modeling and control algorithms for UAS

The dynamic modeling of quadcopter drones is performed with two sets of differential equations: one related to translational motion and the other modeling rotational motion. These equations may or may not be linear; in either case, they can be linearized [52,53] if the quadcopter is operating around a specific point at low speeds [11]. Control strategy selection depends on the linearity or non-linearity of the modeled system, taking into account that linear strategies are less complex and easier to implement but very sensitive to disturbances, while non-linear strategies are more complicated to implement but less sensitive to disturbances. Two types of linear controllers for quadcopters can be found: the Proportional, Integral and Derivative (PID) controller and the Linear Quadratic Regulator (LQR) [54-57]. These are designed by state feedback and by using linear matrix inequalities (LMI) [58,59]. There are also several non-linear approaches for quadcopters that allow good modeling precision but increase the complexity of the analysis, such as control based on neural networks, adaptive control [60], fault-tolerant control, robust control, backstepping control, H∞ control [61], model predictive control and control based on disturbance observers [62-65].
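
As a concrete example of the linear strategy, the sketch below closes a single-axis attitude loop with a PID controller around a toy double-integrator model. The gains, time step and unit inertia are illustrative assumptions; real quadcopter firmware cascades several such loops and adds saturation, filtering and motor mixing:

```python
# A minimal PID sketch on a simplified single-axis (roll) model.
class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral, self.prev_error = 0.0, 0.0

    def update(self, setpoint, measured, dt):
        error = setpoint - measured
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Toy roll-axis model: angular acceleration proportional to control torque.
pid, roll, roll_rate, dt = PID(kp=6.0, ki=3.0, kd=2.5), 0.2, 0.0, 0.01
for _ in range(500):                     # 5 s of simulation, target roll = 0 rad
    torque = pid.update(0.0, roll, dt)
    roll_rate += torque * dt             # unit inertia assumed
    roll += roll_rate * dt
print(f"roll after 5 s: {roll:.4f} rad")  # settles toward zero
```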

4. Conclusions, limitations, and future challenges

Among the limitations of using UAS for inspecting bridges and structures in general is the loss of the connection to the Global Navigation Satellite System (GNSS) while operating vehicles under or close to the structures. This condition forces manual operation because it makes it difficult or impossible to plan trajectories and/or autonomous missions. A challenge for the industry is therefore to increase the power and efficiency of satellite navigation systems and to develop simple, portable alternative methods for dealing with the loss of satellite signal. Lighting conditions are another important factor that limits operations, since the dark conditions usually found under bridges reduce the detectability of flaws and affect the approach distances between the drone and the structure, making artificial lighting external to the UAS necessary. A major future challenge is the optimal adaptation of powerful autonomous lighting systems, since installing a powerful lighting system on a UAS is currently inefficient: the extra current consumption of the lights reduces flight time.

Another significant issue is increasing the payload capacity of multi-rotor UAS, since the simultaneous use of sensors and cameras is currently limited. This highlights the need to research materials that reduce weight and improve aerodynamic efficiency. A critical subject is flight time, which only ranges between 20 and 35 minutes, limiting the vehicles to short periods of operation and requiring the availability of several batteries and access to charging points in the field; this increases costs and inspection times. Therefore, developing components with high energy efficiency, alternative power sources, or improvements to the LiPo batteries that currently power UAS is relevant. The difficulty of operating in confined spaces or very close to certain bridge components is another limitation, given the structural fragility of the propellers and drone arms. This risk has been mitigated by using protective baskets [66], which, despite being functional, affect aerodynamic efficiency due to the extra drag and weight they add to the system, as well as reducing maneuverability and controllability. Finally, the operational limitation produced by wind must be overcome, since high-quality images cannot be acquired at wind speeds higher than 7 m/s.

Based on the results of the studies covered in this review, it is reasonable to state that UAS are an efficient tool to complement and reduce the workload of traditional inspections performed by humans, increasing operational and occupational safety. However, it is pertinent to clarify that, although a significant amount of research is being performed on the subject, there is still a lack of technological development in UAS and onboard equipment for executing fully autonomous missions to inspect bridges and structures in general.

References

[1] Morgenthal, G. and Hallermann, N., Quality assessment of Unmanned Aerial Vehicle (UAV) based visual inspection of structures. Advances in Structural Engineering, 17(3), pp. 289-302, 2014. DOI: 10.1260/1369-4332.17.3.289

[2] Hallermann, N., Morgenthal, G. and Rodehorst, V., Unmanned Aerial Systems (UAS) - survey and monitoring based on high-quality airborne photos, IABSE Symposium Report, International Association for Bridge and Structural Engineering, pp. 1-8, 2015. DOI: 10.2749/222137815818358583

[3] Hallermann, N., Morgenthal, G. and Rodehorst, V., Vision-based deformation monitoring of large-scale structures using Unmanned Aerial Systems, IABSE Symposium Report, International Association for Bridge and Structural Engineering, pp. 2852-2859, 2014. DOI: 10.2749/222137814814070343

[4] International Civil Aviation Organization - OACI, Doc 10019 AN/507, Manual sobre sistemas de aeronaves pilotadas a distancia (RPAS), Montréal, [online]. 2015. [date of reference: February 4th of 2020]. Available at: https://www.icao.int/isbn/Lists/Publications/DispForm.aspx?ID=2757

[5] A-NPA No. 16/2005, Policy for unmanned aerial vehicle (UAV) certification, European Aviation Safety Agency - EASA, Köln, Germany, [online]. 2005. [date of reference: February 4th of 2020]. Available at: https://www.easa.europa.eu/document-library/notices-of-proposed-amendments/npa-16-2005

[6] UAEAC, Reglas generales de vuelo y de operación - RAC 91, [online]. Bogotá, Colombia. [date of reference: February 5th of 2020]. Available at: https://www.aerocivil.gov.co/autoridad-de-la-aviacion-civil/reglamentacion/rac

[7] Adabo, G.J., Unmanned aircraft system for high voltage power transmission lines of Brazilian electrical system, AUVSI Unmanned Systems, pp. 1556-1563, 2013.

[8] Valavanis, K. and Vachtsevanos, G., Handbook of unmanned aerial vehicles, 1st ed., Springer, Netherlands, 2014, pp. 82-92.

[9] Ułanowicz, L., Jóźko, M. and Szczepaniak, P., Controlling the operation process of the unmanned aerial system, Journal of KONBiN, 44(1), pp. 5-36, 2018. DOI: 10.1515/jok-2017-0059

[10] Standard J3016_201806, Taxonomy and definitions for terms related to driving automation systems, NHTSA/SAE, [online]. 2014. [date of reference: February 5th of 2020]. Available at: https://www.sae.org/standards/content/j3016_201806/

[11] Miranda, C.R., Garrido, M.R., Aguilar, B.L. and Guerrero, E.J., Drones: modelado y control de cuadricópteros, 1ra ed., Alfaomega, España, 2020, pp. 3-36.

[12] Pratt, K.S., CONOPS and autonomy recommendations for VTOL small unmanned aerial system based on Hurricane Katrina operations. Journal of Field Robotics, 26(8), pp. 636-650, 2009. DOI: 10.1002/rob.20304

[13] Sa, I., Hrabar, S. and Corke, P., Inspection of pole-like structures using a vision-controlled VTOL UAV and shared autonomy, IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 4819-4826, 2014. DOI: 10.1109/IROS.2014.6943247

[14] Cho, O.H., Ban, K.J. and Kim, E.K., Stabilized UAV flight system design for structure safety inspection, Conference on Advanced Communication Technology, pp. 1312-1316, 2014. DOI: 10.1109/ICACT.2014.6779172

[15] Gaponov, I. and Razinkova, A., Quadcopter design and implementation as a multidisciplinary engineering course, in: Proceedings of the IEEE International Conference on Teaching, Assessment, and Learning for Engineering (TALE), pp. H2B-16-H2B-19, 2012. DOI: 10.1109/TALE.2012.6360335

[16] Javir, A.V., Pawar, K., Dhudum, S., Patale, N. and Patil, S., Design, analysis and fabrication of quadcopter. Journal of Advance Research in Mechanical & Civil Engineering, 2(3), pp. 16-27, 2015.

[17] Juniper, A., The complete guide to drones, 1st ed., Octopus Publishing Group, Hachette, UK, 2015, pp. 106-127.

[18] Elliott, A., Build your own drone manual: owners' workshop manual, 1st ed., Haynes North America Incorporated, USA, 2017, pp. 98-151.

[19] Pérez, J., Revisión sistemática de la literatura en ingeniería, Medellín, 2019.

[20] Morgenthal, G., et al., Framework for automated UAS-based structural condition assessment of bridges. Automation in Construction, 97, pp. 77-95, 2019. DOI: 10.1016/j.autcon.2018.10.006

[21] Tomiczek, A.P., et al., Bridge inspections with small unmanned aircraft systems: case studies. Journal of Bridge Engineering, 24(4), art. 05019003, 2019. DOI: 10.1061/(ASCE)BE.1943-5592.0001376

[22] Khan, F., et al., Investigation on bridge assessment using unmanned aerial systems, Structures Congress, pp. 404-413, 2015. DOI: 10.1061/9780784479117.035

[23] Tarek, O. and Moncef, L.N., Thermal detection of subsurface delaminations in reinforced concrete bridge decks using unmanned aerial vehicle. American Concrete Institute, ACI Special Publication, 331, pp. 1-14, 2019.

[24] Bolourian, N., et al., High-level framework for bridge inspection using LiDAR-equipped UAV, in: Proceedings of the International Symposium on Automation and Robotics in Construction, pp. 683-688, 2017. DOI: 10.22260/ISARC2017/0095

[25] Cotua, O. and Causil, L., Diseño y ensamble de la arquitectura física de un dron, para dosimetría ambiental en los cultivos bioenergéticos, BSc. Thesis, Department of Systems Engineering, Universidad Cooperativa de Colombia, Bogotá, Colombia, 2019.

[26] Chiu, W.K., et al., Large structures monitoring using unmanned aerial vehicles. Procedia Engineering, 188, pp. 415-423, 2017. DOI: 10.1016/j.proeng.2017.04.503

[27] Xiao, X., et al., Multi-view stereo matching based on self-adaptive patch and image grouping for multiple unmanned aerial vehicle imagery. Remote Sensing, 8(2), art. 89, 2016. DOI: 10.3390/rs8020089

[28] Kim, I.-H., et al., Application of crack identification techniques for an aging concrete bridge inspection using an unmanned aerial vehicle. Sensors, 18(6), art. 1881, 2018. DOI: 10.3390/s18061881

[29] Kang, D. and Cha, Y.-J., Damage detection with an autonomous UAV using deep learning, Sensors and Smart Structures Technologies for Civil, Mechanical, and Aerospace Systems, art. 1059804, 2018. DOI: 10.1117/12.2295961

[30] Liu, L., et al., CNN based automatic coating inspection system. Advances in Science, Technology and Engineering Systems Journal, 3(12), pp. 469-478, 2018. DOI: 10.25046/aj030655

[31] Jin-Hwan, L., et al., Diagnosis of crack damage on structures based on image processing techniques and R-CNN using unmanned aerial vehicle (UAV), Sensors and Smart Structures Technologies for Civil, Mechanical, and Aerospace Systems, art. 1059811, 2018. DOI: 10.1117/12.2296691

[32] Rodríguez, J., Landassuri, V. and Flores, J., Reconocimiento de patrones numéricos para vuelo controlado de un AR Drone utilizando redes neuronales artificiales. Res. Comput. Sci., 107, pp. 61-71, 2015. DOI: 10.13053/RCS-107-1-6

[33] Khaloo, A., et al., Unmanned aerial vehicle inspection of the Placer River Trail Bridge through image-based 3D modelling. Structure and Infrastructure Engineering, 14(1), pp. 124-136, 2018. DOI: 10.1080/15732479.2017.1330891

[34] Sanchez-Lopez, J.L., et al., A vision based aerial robot solution for the mission 7 of the international aerial robotics competition, in: International Conference on Unmanned Aircraft Systems (ICUAS), pp. 1391-1400, 2015. DOI: 10.1109/ICUAS.2015.7152435

[35] Livyatan, H., et al., Dense structure from motion, U.S. Patent No. 9,959,595, May 1st of 2018.

[36] Khaloo, A. and Lattanzi, D., Extracting structural models through computer vision, Structures Congress 2015, pp. 538-548, 2015. DOI: 10.1061/9780784479117.047

[37] Dorafshan, S., Thomas, R.J. and Maguire, M., Fatigue crack detection using unmanned aerial systems in fracture critical inspection of steel bridges. Journal of Bridge Engineering, 23(10), art. 04018078, 2018. DOI: 10.1061/(ASCE)BE.1943-5592.0001291

[38] Gillins, M., Gillins, D. and Parrish, C., Cost-effective bridge safety inspections using unmanned aircraft systems (UAS), Geotechnical and Structural Engineering Congress, pp. 1931-1940, 2016. DOI: 10.1061/9780784479742.165

[39] Kang, D. and Cha, Y.-J., Autonomous UAVs for structural health monitoring using deep learning and an ultrasonic beacon system with geo-tagging. Computer-Aided Civil and Infrastructure Engineering, 33(10), pp. 885-902, 2018. DOI: 10.1111/mice.12375

[40] Cha, Y.-J., Choi, W. and Büyüköztürk, O., Deep learning-based crack damage detection using convolutional neural networks. Computer-Aided Civil and Infrastructure Engineering, 32(5), pp. 361-378, 2017. DOI: 10.1111/mice.12263

[41] Cha, Y.-J., et al., Autonomous structural visual inspection using region-based deep learning for detecting multiple damage types. Computer-Aided Civil and Infrastructure Engineering, 33(9), pp. 731-747, 2017. DOI: 10.1111/mice.12334

[42] ASTM D4788-03, Standard test method for detecting delaminations in bridge decks using infrared thermography, American Society for Testing and Materials, [online]. 2013. [date of reference: March 26th of 2020]. Available at: https://www.aenor.com/normas-y-libros/buscador-de-normas/astm?c=026572

[43] Wang, J., Fu, P. and Gao, R.X., Machine vision intelligence for product defect inspection based on deep learning and Hough transform. Journal of Manufacturing Systems, 51, pp. 52-60, 2019. DOI: 10.1016/j.jmsy.2019.03.002

[44] Han, K., Lin, J. and Golparvar-Fard, M., A formalism for utilization of autonomous vision-based systems and integrated project models for construction progress monitoring, in: Proceedings of the Conference on Autonomous and Robotic Construction of Infrastructure, pp. 124-138, 2015.

[45] Munguia, R., Urzua, S., Bolea, Y. and Grau, A., Vision-based SLAM system for unmanned aerial vehicles. Sensors, 16, art. 372, 2016. DOI: 10.3390/s16030372

[46] Lin, J. and Zhang, F., Loam livox: a fast, robust, high-precision LiDAR odometry and mapping package for LiDARs of small FoV, in: International Conference on Robotics and Automation (ICRA), pp. 3126-3131, 2020. DOI: 10.1109/ICRA40945.2020.9197440

[47] Andaluz, V., et al., Robot nonlinear control for unmanned aerial vehicles' multitasking. Assembly Automation, 38(5), pp. 645-660, 2018. DOI: 10.1108/AA-02-2018-036

[48] Kung, O., et al., Simplified building models extraction from ultra-light UAV imagery. ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, 38(1/C22), pp. 217-222, 2011. DOI: 10.5194/isprsarchives-XXXVIII-1-C22-217-2011

[49] Nazmus, S., Muhammad, M. and Sharif, M., Application framework for forest surveillance and data acquisition using unmanned aerial vehicle system, International Conference on Engineering Technology and Technopreneurship (ICE2T), pp. 1-6, 2017. DOI: 10.1109/ICE2T.2017.8215979

[50] Gillula, J.H., et al., Design of guaranteed safe maneuvers using reachable sets: autonomous quadrotor aerobatics in theory and practice, IEEE International Conference on Robotics and Automation, pp. 1649-1654, 2010. DOI: 10.1109/ROBOT.2010.5509627

[51] Bachrach, R., et al., On the design and use of a micro air vehicle to track and avoid adversaries. The International Journal of Robotics Research, 29(5), pp. 529-546, 2010. DOI: 10.1177/0278364909348805

[52] Mistler, V., Benallegue, A. and Msirdi, M., Exact linearization and noninteracting control of a 4 rotors helicopter via dynamic feedback, IEEE International Workshop on Robot and Human Interactive Communication, pp. 586-593, 2001. DOI: 10.1109/ROMAN.2001.981968

[53] Belkheiri, M., et al., Different linearization control techniques for a quadrotor system, in: 2nd International Conference on Communications, Computing and Control Applications (CCCA), pp. 1-6, 2012. DOI: 10.1109/CCCA.2012.6417914

[54] Chovancová, A., Fico, T., Hubinsky, P. and Duchon, F., Comparison of various quaternion-based control methods applied to quadrotor with disturbance observer and position estimator. Robotics and Autonomous Systems, 79, pp. 87-98, 2015. DOI: 10.1016/j.robot.2016.01.011

[55] Bouabdallah, S., Noth, A. and Siegwart, R., PID vs LQ control techniques applied to an indoor micro quadrotor, International Conference on Intelligent Robots and Systems, pp. 2451-2456, 2004. DOI: 10.1109/IROS.2004.1389776

[56] Reyes, V., et al., LQR control for a quadrotor using unit quaternions: modeling and simulation, International Conference on Electronics, Communications and Computing, CONIELECOMP, pp. 172-178, 2013. DOI: 10.1109/CONIELECOMP.2013.6525781

[57] Fresk, E. and Nikolakopoulos, G., Full quaternion-based attitude control for a quadrotor, European Control Conference, pp. 3864-3869, 2013. DOI: 10.23919/ECC.2013.6669617

[58] Ajmera, J. and Sankaranarayanan, V., Point-to-point control of a quadrotor: theory and experiment, IFAC-PapersOnLine, 49(1), pp. 401-406, 2016. DOI: 10.1016/j.ifacol.2016.03.087

[59] Ryan, T. and Kim, H., LMI-based gain synthesis for simple robust quadrotor control, IEEE Transactions on Automation Science and Engineering, 10(4), pp. 1173-1178, 2013. DOI: 10.1109/TASE.2013.2259156

[60] Wang, X., Shirinzadeh, B. and Ang, M., Nonlinear double-integral observer and application to quadrotor aircraft. IEEE Transactions on Industrial Electronics, 62(2), pp. 1189-1200, 2015. DOI: 10.1109/TIE.2014.2341571

[61] Raffo, G., Ortega, M. and Rubio, F., Backstepping/nonlinear H∞ control for path tracking of a quadrotor unmanned aerial vehicle, American Control Conference, pp. 3356-3361, 2008. DOI: 10.1109/ACC.2008.4587010

[62] Lopes, R., Santana, P., Borges, G. and Ishihara, J., Model predictive control applied to tracking and attitude stabilization of a VTOL quadrotor aircraft, International Congress of Mechanical Engineering, COBEM, pp. 176-185, 2011.

[63] Dong, W., Gu, G.Y., Zhu, X. and Ding, H., High-performance trajectory tracking control of a quadrotor with disturbance observer. Sensors and Actuators, 211, pp. 67-77, 2014. DOI: 10.1016/j.sna.2014.03.011

[64] Dong, W., Gu, G.Y., Zhu, X. and Ding, H., An adaptive trajectory control for UAV using a real-time architecture, International Conference on Unmanned Aircraft Systems, ICUAS, pp. 32-42, 2014. DOI: 10.1109/ICUAS.2014.6842236

[65] Nicol, C., Macnab, C. and Serrano, A., Robust adaptive control of a quadrotor helicopter. Mechatronics, 21, pp. 927-938, 2011. DOI: 10.1016/j.mechatronics.2011.02.007

[66] Salaan, C., et al., Close visual bridge inspection using a UAV with a passive rotating spherical shell. Journal of Field Robotics, 35(6), pp. 850-867, 2018. DOI: 10.1002/rob.21781

D. Aldana-Rodríguez received his BSc. in Electronic Engineering in 2008, a second BSc. in Aeronautical Engineering in 2017, and his MSc. in Mechanical Engineering in 2018. He is a drone pilot and UAS operator trained at the Colombian school of sports aviation. He is currently a full-time professor in the Aeronautical Engineering program at the Fundación Universitaria Los Libertadores, Colombia. ORCID: 0000-0002-6483-9580

D.L. Ávila-Granados received his BSc. in Aeronautical Engineering in 2013 and his MSc. in Mechanical Engineering in 2020. He is currently an instructor at the Engineering Faculty of the Colombian Army Aviation Academy, and formerly served as Project Leader Engineer and Chief of Aeronautical Parts Fabrication at CIAC. ORCID: 0000-0001-6782-3667

J.A. Villalba-Vidales received his BSc. in Mechanical Engineering in 2008 and his MSc. in Mechanical Engineering in 2015. He is currently a full-time professor in the Mechanical Engineering program at the Fundación Universitaria Los Libertadores, Colombia. ORCID: 0000-0001-6921-0565

How to cite: Aldana-Rodríguez, D., Ávila-Granados, D.L. and Villalba-Vidales, J.A., Use of unmanned aircraft systems for bridge inspection: a review. DYNA, 88(217), pp. 32-41, April - June, 2021.

Received: November 26, 2020; Revised: February 01, 2021; Accepted: February 15, 2021

Creative Commons License: The authors; licensee Universidad Nacional de Colombia.