Automated inspection of complex engineering structures using robotics
Abstract: This work presents methods for the automated inspection of complex engineering structures using robotics. As part of this thesis, the required fundamentals are presented and a conclusion is given; the latter discusses the results and their potential for future research and applications. Moreover, this work addresses the question of how human inspectors can be integrated into the automated process.
As a basis, various sensor systems are evaluated for sub-millimeter crack detection in concrete structures. While the 3D information from laser scanning does not achieve the required resolution, photogrammetric approaches are superior due to their use of high-resolution 2D images. Laser triangulation reaches the highest 3D resolution but is limited by its depth of field and field of view (FoV). To capture subsurface damage, the automated fusion of surface and subsurface data from ground penetrating radar and laser scanning is analyzed. Apart from damage detection, precise 3D referencing of detected damage is essential. Since many engineering structures are global navigation satellite system (GNSS)-denied areas, simultaneous localization and mapping (SLAM) algorithms are required. To compare state-of-the-art SLAM algorithms, this work presents an automated method to evaluate the absolute and relative accuracy of different sensor setups on a mobile robotic platform, reaching SLAM accuracies of a few centimeters in a bridge scenario. Since laser scanning data cannot be used for sub-millimeter crack detection, an image-based method for semi-automatic 3D crack map generation is presented, using deep learning for semantic segmentation of cracks. The experiments show that cracks are often segmented wider than they actually are and that crack-like structures can lead to false positives. The limiting factor is the trade-off between the ground sampling distance and the image context provided by a sufficiently large FoV. Additionally, diverse surfaces and limited training data, owing to time-intensive crack annotation, complicate the process. As part of this work, a crack width correction reduces the number of false positives. The localization of damage is achieved with a photogrammetric approach based on structure from motion. Photogrammetric point cloud data and textured mesh data are not sufficient due to limited resolution or artifacts.
Therefore, the segmented crack information is projected onto the as-planned surface mesh using raytracing. This results in high-resolution crack point cloud data imprinted on the surface. Subsequent clustering based on defined distance criteria, together with projecting the medial axis derived from the 2D data, allows the width to be calculated along all branches of each individual crack.
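The per-branch width measurement described above can be illustrated with a 2D distance transform: on the medial axis, the distance to the nearest background pixel is half the local crack width. The following is a minimal sketch, not the thesis's implementation; the function name, the assumption that the medial axis has already been extracted by a prior skeletonization step, and the use of SciPy's `distance_transform_edt` are illustrative choices.

```python
import numpy as np
from scipy import ndimage


def crack_widths_mm(mask: np.ndarray, axis_px: np.ndarray, gsd_mm: float) -> np.ndarray:
    """Crack width along its medial axis (illustrative sketch).

    mask    : 2D boolean array, True on crack pixels (e.g. from segmentation).
    axis_px : (K, 2) integer (row, col) coordinates of medial-axis pixels,
              assumed to come from a prior 2D skeletonization step.
    gsd_mm  : ground sampling distance in mm per pixel.
    """
    # Euclidean distance of every crack pixel to the nearest background pixel;
    # on the medial axis this is half the local crack width.
    dist = ndimage.distance_transform_edt(mask)
    half_width_px = dist[axis_px[:, 0], axis_px[:, 1]]
    return 2.0 * half_width_px * gsd_mm
```

Note that the result is quantized to pixel distances, so sub-pixel width estimation would need additional interpolation near the crack edges.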
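The SLAM accuracy evaluation mentioned earlier compares estimated trajectories against a reference. A common metric for the absolute accuracy is the absolute trajectory error (ATE) after a rigid alignment; the sketch below, with illustrative names, shows one standard way to compute it (Kabsch/Umeyama alignment without scale) and is not taken from the thesis.

```python
import numpy as np


def ate_rmse(est: np.ndarray, ref: np.ndarray) -> float:
    """Absolute trajectory error (RMSE) after rigid alignment.

    est, ref : (N, 3) arrays of time-synchronized positions.
    Aligns est to ref with the closed-form Kabsch/Umeyama solution
    (rotation + translation, no scale), then reports the RMSE of the
    remaining point-wise distances.
    """
    mu_e, mu_r = est.mean(axis=0), ref.mean(axis=0)
    E, R_ = est - mu_e, ref - mu_r
    # Optimal rotation from the SVD of the cross-covariance matrix.
    U, _, Vt = np.linalg.svd(E.T @ R_)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_r - R @ mu_e
    aligned = est @ R.T + t
    return float(np.sqrt(np.mean(np.sum((aligned - ref) ** 2, axis=1))))
```

Relative accuracy is typically evaluated analogously via the relative pose error over fixed-length sub-trajectories instead of globally aligned positions.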
In summary, this thesis presents new methods and discusses future ways to overcome the trade-off between capturing large surfaces and sub-millimeter crack detection and measurement, as well as subsurface structure detection, precise localization and mapping in GNSS-denied areas, and the diversity and complexity of engineering structures. Thereby, this work contributes to the future of automated inspection of complex engineering structures using robotics.
- Location
-
Deutsche Nationalbibliothek Frankfurt am Main
- Extent
-
Online resource
- Language
-
English
- Notes
-
Universität Freiburg, Dissertation, 2024
- Classification
-
Electrical engineering, electronics
- Keywords
-
Automated testing
Quality control
Industrial robot
Fault detection
Automation
Path planning
Robotics
Structural Health Monitoring
SLAM methods
Collaborative robot
Engineering structure
Machine learning
Drone
Inspection
- Event
-
Publication
- (where)
-
Freiburg
- (who)
-
University
- (when)
-
2024
- Creator
- Contributing persons and organizations
- DOI
-
10.6094/UNIFR/255668
- URN
-
urn:nbn:de:bsz:25-freidok-2556682
- Rights information
-
Open Access; access to the object is unrestricted.
- Last updated
-
14.08.2025, 10:51 CEST
Data partner
Deutsche Nationalbibliothek
Contributors
Created
- 2024