A research initiative at the University of Klagenfurt is exploring the use of autonomous drones for inspecting critical infrastructure, using industrial imaging technology from IDS Imaging Development Systems GmbH. The project integrates AI-based object-relative navigation and a USB3 Vision industrial camera from the uEye LE family to facilitate real-time inspection tasks.
Funded by Austria’s Federal Ministry for Climate Action, Environment, Energy, Mobility, Innovation and Technology (BMK), the drone system is designed to identify and navigate around infrastructure components such as power poles and insulators. It captures image data from specific distances to enable repeatable and accurate comparisons over multiple inspection flights.
The system uses a TWINs Science Copter platform with a Pixhawk PX4 autopilot and an NVIDIA Jetson AGX Orin 64GB Developer Kit as the onboard computer. The IDS U3-3276LE C-HQ camera, equipped with a Sony Pregius IMX265 global shutter sensor, provides high-resolution images at up to 58 frames per second. These images feed the drone's AI navigation system, which extracts semantic information to identify and localize inspection targets in real time.
Unlike GNSS-based systems, which can suffer from signal degradation in environments with physical obstructions, the AI-based approach uses semantic data to establish the relative position and orientation between the drone and its inspection targets. This enables consistent and safe navigation, particularly in hard-to-reach or GNSS-degraded environments.
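To illustrate the geometry behind object-relative navigation (this is a simplified sketch, not the project's actual pipeline): assuming a calibrated pinhole camera and a detector that reports a bounding box for a component of known physical size, the relative distance and offset of the target can be estimated directly from the image. All intrinsics, names, and values below are hypothetical placeholders.

```python
import numpy as np

# Hypothetical intrinsics for an IMX265-class sensor downsampled to 1280 x 960.
FX, FY = 1100.0, 1100.0   # focal lengths in pixels (assumed)
CX, CY = 640.0, 480.0     # principal point (assumed)

def relative_position(bbox_px, object_height_m):
    """Estimate the target position in the camera frame from one detection.

    bbox_px: (u_min, v_min, u_max, v_max) bounding box in pixels.
    object_height_m: known physical height of the inspected component.
    Returns an (x, y, z) vector in metres, camera frame, z pointing forward.
    """
    u_min, v_min, u_max, v_max = bbox_px
    u_c = 0.5 * (u_min + u_max)      # box centre, horizontal
    v_c = 0.5 * (v_min + v_max)      # box centre, vertical
    h_px = v_max - v_min             # apparent height in pixels

    z = FY * object_height_m / h_px  # pinhole model: depth from apparent size
    x = (u_c - CX) * z / FX          # lateral offset
    y = (v_c - CY) * z / FY          # vertical offset
    return np.array([x, y, z])

# Example: a 0.60 m insulator seen as a 130-pixel-tall box near the image centre.
print(relative_position((600, 400, 680, 530), 0.60))
```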
The project uses the ROS (Robot Operating System) in combination with the IDS peak SDK for camera integration and control, allowing automated configuration of exposure, white balance, gain, and downsampling. Navigation, mission control, and data management are handled through the CNS Flight Stack, an open-source software platform for UAV operations.
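As a rough illustration of this kind of camera setup, the following sketch uses the IDS peak Python bindings (ids_peak) to open the first connected camera and switch exposure, white balance, and gain to automatic mode through the GenICam node map. The exact node names and values are assumptions based on typical GenICam usage, and in the project this configuration would sit inside a ROS node rather than a standalone script.

```python
from ids_peak import ids_peak

ids_peak.Library.Initialize()
try:
    # Enumerate connected cameras and open the first one.
    device_manager = ids_peak.DeviceManager.Instance()
    device_manager.Update()
    if device_manager.Devices().empty():
        raise RuntimeError("No IDS camera found")

    device = device_manager.Devices()[0].OpenDevice(ids_peak.DeviceAccessType_Control)
    nodemap = device.RemoteDevice().NodeMaps()[0]

    # Automated configuration as described above; node names follow the
    # GenICam SFNC and the values are placeholders.
    nodemap.FindNode("ExposureAuto").SetCurrentEntry("Continuous")
    nodemap.FindNode("BalanceWhiteAuto").SetCurrentEntry("Continuous")
    nodemap.FindNode("GainAuto").SetCurrentEntry("Continuous")
    nodemap.FindNode("AcquisitionFrameRate").SetValue(50.0)
finally:
    ids_peak.Library.Close()
```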
Real-time navigation is achieved through sensor fusion using data from the camera, IMU, LIDAR, and GNSS. An Extended Kalman Filter (EKF) processes these inputs to maintain drone stability and positional accuracy. Image data is captured at a rate of 50 fps with a resolution of 1280 x 960 pixels to match the processing capabilities of the onboard AI.
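As a schematic view of the fusion step (not the CNS Flight Stack's actual implementation), the filter alternates between a prediction driven by high-rate IMU data and corrections from position measurements such as GNSS or camera-based estimates. The sketch below shows a simplified, linear version of that predict/correct cycle for a single axis; all noise values are placeholders.

```python
import numpy as np

class MinimalKF:
    """Toy constant-velocity filter over [position, velocity] in one axis."""

    def __init__(self):
        self.x = np.zeros(2)            # state: position, velocity
        self.P = np.eye(2)              # state covariance
        self.Q = np.diag([1e-4, 1e-2])  # process noise (placeholder values)
        self.R = np.array([[0.05]])     # measurement noise (placeholder)

    def predict(self, accel, dt):
        """Propagate the state with an IMU acceleration sample."""
        F = np.array([[1.0, dt], [0.0, 1.0]])
        self.x = F @ self.x + np.array([0.5 * accel * dt**2, accel * dt])
        self.P = F @ self.P @ F.T + self.Q

    def update(self, pos_meas):
        """Correct the state with a position fix (e.g. GNSS or vision)."""
        H = np.array([[1.0, 0.0]])
        y = pos_meas - H @ self.x             # innovation
        S = H @ self.P @ H.T + self.R         # innovation covariance
        K = self.P @ H.T @ np.linalg.inv(S)   # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(2) - K @ H) @ self.P

kf = MinimalKF()
kf.predict(accel=0.1, dt=0.02)        # high-rate IMU-style prediction
kf.update(pos_meas=np.array([0.01]))  # lower-rate position correction
print(kf.x)
```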
Test flights are conducted in the university’s drone hall, where an electricity pole model with insulators serves as the primary inspection target. The system aims to support broader applications in mobile robotics by demonstrating the role of compact, high-performance industrial cameras in autonomous infrastructure monitoring.
According to the research team, the ongoing work focuses on optimizing the AI model for onboard use and further developing the inspection system to enhance reliability and efficiency in real-world scenarios.
Image: Alpen-Adria-Universität (aau) Klagenfurt