Publication:
Implementing transfer learning for sound event classification using the ReaLISED audio database

dc.contributor.author: Mohíno Herranz, Inmaculada
dc.contributor.author: García Gómez, Joaquín
dc.contributor.author: Alonso Díaz, Sagrario
dc.contributor.author: García Gallegos, Jaime
dc.contributor.author: Perez Sanz, Fernando
dc.contributor.author: Aguilar Ortega, Miguel
dc.contributor.author: Gil Pita, Roberto
dc.contributor.funder: Agencia Estatal de Investigación (España)
dc.contributor.funder: Comunidad de Madrid
dc.date.accessioned: 2025-12-18T08:26:31Z
dc.date.available: 2025-12-18T08:26:31Z
dc.date.issued: 2025-01-04
dc.description: This work is part of the project PID2021-129043OB-I00 funded by MCIN/AEI/10.13039/501100011033/FEDER, UE, and project EPU-INV/2020/003, funded by the Community of Madrid and University of Alcala.
dc.description.abstract: Currently, smart spaces use diverse sensors to gather real-time data on activities within an environment, enabling systems to trigger alarms or execute tasks. These environments support various applications that integrate advanced sensory systems such as cameras, microphones, and motion sensors. Acoustic sensors are crucial, extracting valuable information from sound waves, including details about the source and the characteristics of the environment. This article focuses on detecting and classifying common household sounds using the ReaLISED database, which comprises 2479 recordings of 18 sound events. The audio data were collected in real homes with everyday objects and activities, rather than being simulated with reverberation models. Artificial Intelligence techniques, including Machine Learning classifiers, yielded satisfactory results. Transfer Learning methods were explored, leveraging pre-trained models such as Google's YAMNet for sound event recognition, which can potentially outperform the previous classifiers. Preliminary results show a relative reduction of around 25% in the probability of error compared with previous studies, providing valuable insight into possible enhancements.
dc.description.peerreviewed: Peerreview
dc.identifier.citation: Measurement: Sensors 38: 101711
dc.identifier.doi: doi.org/10.1016/j.measen.2024.101711
dc.identifier.e-issn: 2665-9174
dc.identifier.other: https://www.sciencedirect.com/science/article/pii/S2665917424006871
dc.identifier.uri: https://hdl.handle.net/20.500.12666/1600
dc.language.iso: eng
dc.publisher: Elsevier
dc.relation: VIGILANCIA ACUSTICA INTELIGENTE DE INTERIORES PARA EL ANALISIS Y MODELADO DE ESCENAS PARA LA DETECCION DE COMPORTAMIENTOS ANOMALOS DE PERSONAS Y MAQUINAS
dc.rights.accessRights: info:eu-repo/semantics/openAccess
dc.subject: Smart spaces
dc.subject: Acoustic sensors
dc.subject: Audio detection
dc.subject: Transfer learning
dc.title: Implementing transfer learning for sound event classification using the ReaLISED audio database
dc.type: info:eu-repo/semantics/article
dc.type.coar: http://purl.org/coar/resource_type/c_2df8fbb1
dc.type.hasVersion: info:eu-repo/semantics/publishedVersion
dspace.entity.type: Publication
oaire.awardNumber: PID2021-129043OB-I00
oaire.awardTitle: VIGILANCIA ACUSTICA INTELIGENTE DE INTERIORES PARA EL ANALISIS Y MODELADO DE ESCENAS PARA LA DETECCION DE COMPORTAMIENTOS ANOMALOS DE PERSONAS Y MAQUINAS
oaire.awardURI: https://hdl.handle.net/20.500.12666/1599
relation.isAuthorOfPublication: 76d131d2-cd15-4863-bec5-0cfe59ac5913
relation.isAuthorOfPublication: 524643fd-c895-4ebc-a7b9-4703638710fb
relation.isAuthorOfPublication: 336476e8-afa5-4909-b07e-aecd668ca031
relation.isAuthorOfPublication.latestForDiscovery: 76d131d2-cd15-4863-bec5-0cfe59ac5913
relation.isProjectOfPublication: 38f69e0b-e620-445b-bd74-9db4aacb3484
relation.isProjectOfPublication.latestForDiscovery: 38f69e0b-e620-445b-bd74-9db4aacb3484
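
The transfer-learning approach described in the abstract (reusing Google's pre-trained YAMNet model to classify the ReaLISED sound events) can be illustrated with a minimal Python sketch. This is not the authors' published code: it assumes the common pattern of loading YAMNet from TensorFlow Hub, averaging its 1024-dimensional frame embeddings into one feature vector per clip, and training a lightweight classifier on top; the variables waveforms and labels are hypothetical placeholders for 16 kHz mono ReaLISED clips and their 18 event classes.

import numpy as np
import tensorflow_hub as hub
from sklearn.linear_model import LogisticRegression

# Load the pre-trained YAMNet model published by Google on TensorFlow Hub.
yamnet = hub.load('https://tfhub.dev/google/yamnet/1')

def clip_embedding(waveform):
    # YAMNet expects mono float32 audio sampled at 16 kHz with values in [-1.0, 1.0].
    # It returns per-frame class scores, 1024-dim embeddings, and a log-mel spectrogram.
    scores, embeddings, spectrogram = yamnet(waveform.astype(np.float32))
    # Average the frame embeddings into a single fixed-length vector for the whole clip.
    return embeddings.numpy().mean(axis=0)

# Hypothetical data: 'waveforms' is a list of 16 kHz mono NumPy arrays and
# 'labels' the corresponding event classes (18 classes in ReaLISED).
# X = np.stack([clip_embedding(w) for w in waveforms])
# classifier = LogisticRegression(max_iter=1000).fit(X, labels)

Once the embeddings have been extracted, the logistic regression can be swapped for any of the Machine Learning classifiers mentioned in the abstract without changing the feature-extraction step.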

Files

Original bundle

Name: Implementing transfer learning for sound event classification using the realised audio database.pdf
Size: 1 MB
Format: Adobe Portable Document Format

License bundle

Name: license.txt
Size: 4.77 KB
Format: Item-specific license agreed upon to submission