Audio signal based 3D space reconstruction using deep convolutional neural networks

Postgraduate Thesis uoadl:2880712

Unit:
Specialization: Electronic Automation (E/A, with additional specialization in Informatics and Information Systems)
Library of the School of Science
Deposit date:
2019-09-13
Year:
2019
Author:
Fragkiadakis Emmanouil
Supervisors info:
Διονύσιος Ρεϊσης (Dionysios Reisis), Associate Professor, NKUA
Original Title:
Audio signal based 3D space reconstruction using deep convolutional neural networks
Languages:
English
Translated title:
Audio signal based 3D space reconstruction using deep convolutional neural networks
Summary:
This thesis focuses on the architecture of an artificial neural network
capable of "learning" to perceive three-dimensional space through
sound. Inspired by the bats' sophisticated echolocation mechanism,
which provides detailed information about the surrounding space, the
main goal is to research, design and train an artificial neural
network that can generate a stereoscopic representation of 3D space
relying solely on acoustic signals. The network uses the sound's
spectrum to determine whether and how the sound interacted with solid
matter (reflection, diffusion etc.), thereby extracting information
about the objects the sound waves collided with. The large training
set required for this task was initially generated by software that
simulates 3D space, with ray tracing used to reproduce the scene's
acoustics. Finally, an automated apparatus was designed and
constructed to enable the creation of large training sets from
real-world recordings.
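The pipeline described above (a spectral front-end whose output is fed to a convolutional network) can be illustrated with a minimal sketch. This is not the thesis' actual implementation: the window/hop sizes, the toy echo signal, and the single hand-written convolution are all illustrative assumptions, standing in for the spectrogram extraction and the first CNN layer.

```python
import numpy as np

def spectrogram(signal, win=256, hop=128):
    """Magnitude spectrogram via a windowed FFT (STFT).
    Frames the signal, applies a Hann window, keeps positive frequencies."""
    frames = []
    for start in range(0, len(signal) - win + 1, hop):
        frame = signal[start:start + win] * np.hanning(win)
        frames.append(np.abs(np.fft.rfft(frame)))
    return np.array(frames).T  # shape: (win // 2 + 1, n_frames)

def conv2d_valid(image, kernel):
    """A single 'valid' 2-D convolution, the core operation of a CNN layer."""
    kh, kw = kernel.shape
    h = image.shape[0] - kh + 1
    w = image.shape[1] - kw + 1
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Toy "echolocation" input: a chirp plus a delayed, attenuated copy
# standing in for a reflection off a solid object.
fs = 8000
t = np.arange(fs) / fs
chirp = np.sin(2 * np.pi * (200 + 400 * t) * t)
echo = np.zeros_like(chirp)
echo[1000:] = 0.5 * chirp[:-1000]
spec = spectrogram(chirp + echo)

# A temporal edge-detecting filter; a trained CNN would learn such kernels.
edges = conv2d_valid(spec, np.array([[1.0, -1.0]]))
print(spec.shape, edges.shape)
```

In a full system, stacks of learned kernels like the one above (followed by nonlinearities and pooling) would map the time-frequency representation to the target stereoscopic output.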
Main subject category:
Science
Keywords:
CNN, neural networks, ANN, stereoscopic, classification
Index:
No
Number of index pages:
0
Contains images:
Yes
Number of references:
6
Number of pages:
122
File:
File access is restricted only to the intranet of UoA.

AUDIO SIGNAL BASED 3D SPACE RECONSTRUCTION USING DEEP CONVOLUTIONAL NEURAL NETWORKS.pdf
2 MB