Dataset for training neural networks in concrete crack detection: laboratory-classified beam and column images

Bibliographic Details
Main Authors: Alexandre Almeida Del Savio, Ana Luna Torres, Daniel Cárdenas-Salas, Mónica Vergara Olivera, Gianella Urday Ibarra
Format: Article
Language: English
Published: Elsevier 2025-08-01
Series: Data in Brief
Subjects:
Online Access: http://www.sciencedirect.com/science/article/pii/S2352340925003737
Description
Summary: The construction industry is increasingly incorporating artificial intelligence into its processes to improve the efficiency and accuracy of structural analysis and monitoring. However, obtaining high-quality datasets to train algorithms for detecting concrete cracks in structural components remains challenging, as such cracks normally develop over an extended period under real-world conditions. We introduce a curated dataset of 1,132 manually classified images of concrete cracks in beams and columns. These images were captured in a controlled laboratory environment using a static IP camera and annotated with the LabelImg tool. The dataset includes five object classes representing distinct cracks and failures in beams and columns, along with corresponding .txt files containing classification and coordinate data. The dataset is designed to facilitate the development and validation of neural network-based computer vision models for automated crack detection. It provides a valuable resource for researchers in structural engineering, enabling further advances in automated structural health monitoring and contributing to the broader adoption of AI in the construction industry.
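The summary describes per-image .txt files holding a class label and coordinate data, as produced by LabelImg. A minimal sketch of reading such a file is shown below, assuming the common YOLO-style export (one line per object: class id followed by normalized center coordinates and box size); the exact field order is an assumption, not confirmed by this record, and the class names are hypothetical placeholders, not the dataset's actual five classes.

```python
# Hedged sketch: parse a YOLO-style LabelImg annotation line.
# Assumed line format (not confirmed by the record):
#   "<class_id> <x_center> <y_center> <width> <height>"
# with coordinates normalized to the range [0, 1].

def parse_annotation_line(line):
    """Split one annotation line into a class id and a normalized box."""
    parts = line.split()
    return {
        "class_id": int(parts[0]),
        "x_center": float(parts[1]),
        "y_center": float(parts[2]),
        "width": float(parts[3]),
        "height": float(parts[4]),
    }

def parse_annotation_file(text):
    """Parse a whole .txt annotation file (one object per line)."""
    return [parse_annotation_line(ln) for ln in text.splitlines() if ln.strip()]

# Hypothetical example content for one image's .txt file:
sample = "2 0.512 0.430 0.118 0.260\n0 0.250 0.700 0.300 0.150\n"
for box in parse_annotation_file(sample):
    print(box["class_id"], box["x_center"], box["width"])
```

Such a parser would let the annotations be paired with their images when preparing training batches for a detection model.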
ISSN:2352-3409