Abstract
An improved YOLOv4 method for wind turbine blade damage detection is proposed to address the problem that the high complexity of deep convolutional neural network models makes online detection difficult on embedded devices. First, the MobileNetv3 network replaces the CSPDarknet53 backbone feature extraction network in YOLOv4, and feature layers of the same shape are fed into the enhanced feature extraction network. Second, the ECA attention mechanism is added to the enhanced feature extraction network, and the bounding-box loss function and classification loss function of YOLOv4 are optimized. Finally, the algorithms before and after the improvement are compared with other detection algorithms. The results show that the improved YOLOv4 algorithm detects a single image in 0.018 s with a detection accuracy of 95.7%. With these improvements to the YOLOv4 network, the lightweight model meets the needs of embedded devices for detecting wind turbine blade damage while maintaining detection accuracy.
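For context on the ECA attention mechanism cited above: ECA applies a 1D convolution across channel descriptors, and the ECA-Net paper chooses the kernel size adaptively from the channel count C via k = |log2(C)/γ + b/γ|_odd, with defaults γ = 2 and b = 1. A minimal sketch of that kernel-size rule (the function name is illustrative, not from the paper's code) might look like:

```python
import math

def eca_kernel_size(channels: int, gamma: int = 2, b: int = 1) -> int:
    """Adaptive 1D-convolution kernel size from the ECA-Net paper.

    Implements k = |log2(C)/gamma + b/gamma|_odd with the paper's
    defaults gamma=2, b=1; the function name is illustrative.
    """
    t = int(abs((math.log2(channels) + b) / gamma))
    # ECA requires an odd kernel so the convolution is centred on each channel
    return t if t % 2 == 1 else t + 1

# e.g. a 256-channel feature map, a shape common in YOLOv4-style necks
print(eca_kernel_size(256))  # -> 5
```

Larger channel counts thus attend over a wider local neighbourhood of channels, while the attention module itself stays almost parameter-free, which is consistent with the lightweight design goal described in the abstract.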
Key words
wind turbines /
blades /
damage detection /
deep learning /
YOLOv4
References
[1] YANG J Y, WANG W Y, DAI J C. Research on dynamic characteristics of wind power blades based on perturbation mode analysis[J]. Acta energiae solaris sinica, 2023, 44(11): 231-238.
[2] WEN X S, DENG Y Q, WANG Y, et al. Review of research on lightning striking performance of wind turbine blade[J]. High voltage engineering, 2020, 46(7): 2511-2521.
[3] LIU J, YANG N, TAN Y T, et al. Assessment of icing state of wind turbine blades based on WD-LSTM[J]. Acta energiae solaris sinica, 2022, 43(8): 399-408.
[4] WANG X P, ZHANG J F, LI W R, et al. Study on monitoring method of wind power blades erosion severity under wind-sand storm based on machine vision technology[J]. Acta energiae solaris sinica, 2020, 41(5): 166-173.
[5] GUO Y F, QUAN W M, WANG W Y, et al. Crack diagnosis method of wind turbine blade based on convolution neural network with 3D vibration information fusion[J]. Acta optica sinica, 2020, 40(22): 2212004.
[6] MENG H, LIEN F S, LI L. Elastic actuator line modelling for wake-induced fatigue analysis of horizontal axis wind turbine blade[J]. Renewable energy, 2018, 116: 423-437.
[7] CHOU J S, CHIU C K, HUANG I K, et al. Failure analysis of wind turbine blade under critical wind loads[J]. Engineering failure analysis, 2013, 27: 99-118.
[8] ACKERMANN T. Wind energy technology and current status: a review[J]. Renewable and sustainable energy reviews, 2000, 4(4): 315-374.
[9] LI D S, HO S C M, SONG G B, et al. A review of damage detection methods for wind turbine blades[J]. Smart materials and structures, 2015, 24(3): 033001.
[10] GIRSHICK R, DONAHUE J, DARRELL T, et al. Rich feature hierarchies for accurate object detection and semantic segmentation[C]//Proceedings of the 2014 IEEE Conference on Computer Vision and Pattern Recognition. New York, USA, 2014: 580-587.
[11] REN S Q, HE K M, GIRSHICK R, et al. Faster R-CNN: towards real-time object detection with region proposal networks[J]. IEEE transactions on pattern analysis and machine intelligence, 2017, 39(6): 1137-1149.
[12] HE K M, GKIOXARI G, DOLLÁR P, et al. Mask R-CNN[C]//2017 IEEE International Conference on Computer Vision (ICCV). Venice, Italy, 2017: 2980-2988.
[13] DUAN K W, BAI S, XIE L X, et al. CenterNet: keypoint triplets for object detection[C]//2019 IEEE/CVF International Conference on Computer Vision (ICCV). Seoul, Korea (South), 2019: 6568-6577.
[14] REDMON J, DIVVALA S, GIRSHICK R, et al. You only look once: unified, real-time object detection[C]//2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). Las Vegas, NV, USA, 2016: 779-788.
[15] LIN T Y, GOYAL P, GIRSHICK R, et al. Focal loss for dense object detection[C]//2017 IEEE International Conference on Computer Vision (ICCV). Venice, Italy, 2017: 2999-3007.
[16] HOWARD A, SANDLER M, CHEN B, et al. Searching for MobileNetV3[C]//2019 IEEE/CVF International Conference on Computer Vision (ICCV). Seoul, Korea (South), 2019: 1314-1324.
[17] WANG Q L, WU B G, ZHU P F, et al. ECA-Net: efficient channel attention for deep convolutional neural networks[C]//2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR). Seattle, WA, USA, 2020: 11531-11539.
[18] GEVORGYAN Z. SIoU loss: more powerful learning for bounding box regression[J]. arXiv preprint arXiv:2205.12740, 2022.
[19] BOCHKOVSKIY A, WANG C Y, LIAO H Y M. YOLOv4: optimal speed and accuracy of object detection[J]. arXiv preprint arXiv:2004.10934, 2020.
[20] YANG C, LIU X, ZHOU H, et al. Towards accurate image stitching for drone-based wind turbine blade inspection[J]. Renewable energy, 2023, 203: 267-279.
Funding
Central Government Guided Local Science and Technology Development Fund Project (2022ZYT012); Regional Joint Fund of the Hunan Provincial Natural Science Foundation (2023JJ50234); Scientific Research Project of the Hunan Provincial Department of Education (22B0465)