Traditionally, many wind farms forgo preventive maintenance programmes because of the high costs involved. Wind farms that do carry out preventive inspections typically rely on ground crews using conventional cameras and zoom lenses, an approach that is inefficient and time-consuming. As a consequence, most damage and structural defects on the turbines go undetected at an early stage, leading either to more expensive repairs or to catastrophic failure of the wind turbine. With the emergence of drone technology, these inspections can now be performed with far better results, allowing maintenance operators to acquire images from any angle of each wind turbine. To detect anomalies in a wind turbine, the acquired media has traditionally been captured and then processed at a later stage in dedicated data centres with powerful hardware. If an anomaly is detected, a second visit to the affected wind turbine is scheduled to acquire more detailed images of the issue and to confirm or discard the previously flagged potential defect. This modus operandi is, of course, neither ideal nor cost-effective.
The goal of the use case is to validate the NebulOuS framework by deploying wind turbine inspection software in an optimal way, making appropriate use of cloud and fog resources. Drones capture high-resolution images of wind turbine blades, which are processed by AI-enabled algorithms to automatically detect turbine damage and extract other information valuable for turbine maintenance. This process, however, generates a considerable amount of data that is typically stored in the camera memory and later copied to a database for offline processing by the AI algorithms. NebulOuS will handle this process automatically and in real time, remaining infrastructure-agnostic and requiring no human intervention. NebulOuS will exploit 5G networks and cope with data streaming, efficiently utilising cloud and edge computing paradigms to process data as close to its source as possible. NebulOuS will enable the video streams collected by the drone to be processed in near real time during the actual inspection flight. Upon detection of an anomaly, the drone's route can be re-adjusted to collect additional data from the specific turbine in case of a confirmed defect. Data is processed first at the edge, by a lightweight version of the image recognition software, and only if a potential anomaly is detected is the data redirected to private or public cloud resources for full processing. This considerably lowers both the network bandwidth and the computational resources consumed, as only potential anomalies are transmitted to the central node for full analysis. It also minimises the amount of data collected and stored for further offline analysis. To this end, this use case will demonstrate, in a real-life deployment, the use of drones for inspection routines on wind turbines in a wind farm in Portugal's north region.
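The edge-first filtering described above can be sketched as follows. This is a minimal illustration under stated assumptions: the `Frame` type, the `lightweight_score` placeholder, and the 0.5 threshold are hypothetical, not part of the NebulOuS or TTA software.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    """A single captured image from the drone (hypothetical type)."""
    turbine_id: str
    data: bytes

def lightweight_score(frame: Frame) -> float:
    """Placeholder for the lightweight edge model; returns an anomaly
    score in [0, 1]. A real deployment would run a compact vision model
    on the drone GPU or edge server here."""
    return 0.9 if b"crack" in frame.data else 0.1

def edge_filter(frames, threshold: float = 0.5):
    """Forward to the cloud tier only the frames whose edge score
    exceeds the threshold; all other frames stay at the edge."""
    forwarded = []
    for frame in frames:
        if lightweight_score(frame) >= threshold:
            # In the real system this frame would be streamed over 5G
            # to private or public cloud resources for full analysis.
            forwarded.append(frame)
    return forwarded
```

With this pattern, only the small fraction of frames flagged as potential anomalies consumes uplink bandwidth and cloud compute, which is the bandwidth-saving behaviour the use case targets.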
Operation & Deployment of Use Case: In the envisioned scenario, drone operators will deploy several drones equipped with cameras. The image recognition application comprises six components (e.g., an image quality checker, a computer-vision component for geo-referencing within the farm, a crack detector) that today must be developed independently and integrated by software integrators at significant effort. In this use case, a possible deployment of these components spans: i) resources on the drone (GPU); ii) an edge server located in a 5G cell; and iii) cloud resources.
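The three-tier deployment above can be illustrated with a simple placement heuristic. This is a sketch only: the component names beyond those given in the text, and the rule itself, are hypothetical assumptions; NebulOuS determines placements automatically rather than by a fixed rule.

```python
def place(component: str, latency_sensitive: bool, heavy_compute: bool) -> str:
    """Assign a pipeline component to one of the three tiers named in
    the text, based on coarse requirements (illustrative heuristic,
    not the NebulOuS optimiser)."""
    if latency_sensitive and not heavy_compute:
        return "drone_gpu"       # e.g. the image quality checker
    if latency_sensitive:
        return "edge_server_5g"  # e.g. a lightweight crack detector
    return "cloud"               # e.g. full-resolution damage analysis

# Example placement for three of the components mentioned in the text
# (the requirement flags are assumed for illustration):
deployment = {
    "image_quality_checker": place("image_quality_checker", True, False),
    "geo_referencing": place("geo_referencing", True, True),
    "crack_detector_full": place("crack_detector_full", False, True),
}
```

The design point is that placement is driven by component requirements (latency, compute) rather than hard-wired to infrastructure, which is what makes the deployment infrastructure-agnostic.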
Main Partners Supporting the Use Case:
UW (Portugal, SME): Resources will be considered for hosting application components at UW's headquarters development centre and at a wind farm in Portugal. UW will provide the edge infrastructure required to ensure that the TTA software can process data over a 5G network.
TTA (Poland, SME): provider of AI-powered software for automated damage detection & predictive risk monitoring.
Funded by the European Union. Views and opinions expressed are however those of the author(s) only and do not necessarily reflect those of the European Union or Directorate-General for Communications Networks, Content and Technology. Neither the European Union nor the granting authority can be held responsible for them.