Publication: Implementing transfer learning for sound event classification using the ReaLISED audio database
Date
2025-01-04
Journal title
Journal ISSN
Volume title
Publisher
Elsevier
Abstract
Currently, smart spaces employ diverse sensors to gather real-time data on activities within their environments, enabling systems to trigger alarms or execute tasks. These environments support a range of applications that integrate advanced sensory systems such as cameras, microphones, and motion sensors. Acoustic sensors are particularly valuable, extracting information from sound waves about both the source and the characteristics of the environment. This article focuses on detecting and classifying common household sounds using the ReaLISED database, which comprises 2479 recordings of 18 sound events. The audio data were collected in real homes with everyday objects and activities, rather than simulated with reverberation models. Artificial Intelligence techniques, including Machine Learning classifiers, yielded satisfactory outcomes. Transfer Learning methods were then explored, leveraging pre-trained models such as Google's YAMNet for sound event recognition, which can potentially outperform the previous classifiers. Preliminary results show a relative reduction of around 25% in the probability error rate compared with previous studies, providing valuable insight into possible enhancements.
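The transfer-learning approach described in the abstract can be sketched in miniature: a frozen pre-trained audio model (such as YAMNet) supplies fixed-size embeddings, and only a lightweight classifier is trained on top of them. The sketch below is illustrative, not the paper's actual pipeline; the embeddings are synthetic stand-ins (the class counts and the 1024-dimensional embedding size match YAMNet's output, but all data here is randomly generated).

```python
# Transfer-learning sketch: a frozen pre-trained audio model (e.g. YAMNet)
# produces fixed-size embeddings; only a shallow classifier is trained.
# Embeddings here are synthetic stand-ins -- in a real pipeline they would
# come from the YAMNet model on TensorFlow Hub, which emits one
# 1024-dimensional vector per ~0.96 s audio frame.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_classes, dim = 18, 1024          # 18 event classes; YAMNet embedding size

# Synthetic embeddings: one Gaussian cluster per sound-event class.
centers = rng.normal(size=(n_classes, dim))
X = np.vstack([c + 0.5 * rng.normal(size=(100, dim)) for c in centers])
y = np.repeat(np.arange(n_classes), 100)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y)

# The "transfer" step: the feature extractor stays frozen, and only this
# shallow classifier is fitted to the embeddings.
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
print(f"held-out accuracy: {accuracy:.3f}")
```

Because the heavy lifting is done by the frozen feature extractor, the trainable part is small enough to fit well on a modest dataset such as the 2479 ReaLISED recordings.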
Description
This work is part of the project PID2021-129043OB-I00 funded by MCIN/AEI/10.13039/501100011033/FEDER, UE, and project EPU-INV/2020/003, funded by the Community of Madrid and University of Alcala.
Keywords
Smart spaces, Acoustic sensors, Audio detection, Transfer learning
Citation
Measurement: Sensors 38: 101711