TY - JOUR
T1 - Barely visible impact damage detection in composite structures using deep learning networks with varying complexities
AU - Tabatabaeian, Ali
AU - Jerkovic, Bruno
AU - Harrison, Philip
AU - Marchiori, Elena
AU - Fotouhi, Mohammad
PY - 2023
Y1 - 2023
N2 - Visual inspection is one of the most common non-destructive testing (NDT) methods that offers a fast evaluation of surface damage in aerospace composite structures. However, it is highly dependent on human-related factors and may not detect barely visible impact damage (BVID). In this research, low velocity impact tests with different energy levels are conducted on two groups of composite panels, namely ‘reference’ and ‘sensor-integrated’ samples. Then, the results of impact tests, together with C-scan and visual inspection images, are analysed to define the BVID range and create an original image dataset. Next, four different deep learning models are trained, validated and tested to capture the BVID only from the images of the impacted and non-impacted surfaces. The results show that all four networks can learn and detect BVID quite well, and the sensor-integrated samples reduce the training time and improve the accuracy of deep learning models. ResNet outperforms other networks with the highest accuracy of 96.2% and 98.36% on the back-face of reference and sensor-integrated samples, respectively. The proposed damage recognition method can act as a fast, inexpensive and accurate structural health monitoring tool for composite structures in real-life applications.
KW - Barely visible impact damage
KW - Deep learning
KW - Hybrid composite sensors
KW - Structural health monitoring
UR - http://www.scopus.com/inward/record.url?scp=85166217357&partnerID=8YFLogxK
U2 - 10.1016/j.compositesb.2023.110907
DO - 10.1016/j.compositesb.2023.110907
M3 - Article
AN - SCOPUS:85166217357
SN - 1359-8368
VL - 264
JO - Composites Part B: Engineering
JF - Composites Part B: Engineering
M1 - 110907
ER -