The picture above shows a detail of the Babybot's external ear: a conical, hand-made
plastic lobe that covers a microphone. With two such ears and their directional
tuning it is possible, under certain hypotheses, to localize sound in space.
We investigated the link between sound and vision, and in particular the
contribution of vision to the acquisition of a multi-modal (visuo-acoustic)
map of the environment.
We modeled the process of sound localization by computing the differences in
phase and intensity between the signals impinging on the two microphones. The
directional tuning given by the external ears maximizes the difference between
these two signals, allowing better localization.
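
As a rough illustration of these two cues, the sketch below (Python with NumPy)
estimates the intensity difference from the RMS energies of the two channels and
the phase (time) difference from the lag of the cross-correlation peak, then maps
the time difference to an azimuth with a simple far-field two-microphone model.
The microphone spacing, the speed of sound, and the cross-correlation approach
itself are illustrative assumptions here, not the actual Babybot implementation.

    import numpy as np

    SOUND_SPEED = 343.0    # speed of sound in air (m/s); assumed value
    MIC_DISTANCE = 0.14    # assumed spacing between the two microphones (m)

    def interaural_cues(left, right, fs):
        """Estimate the interaural time (phase) and intensity differences
        between two microphone signals sampled at rate fs (Hz)."""
        left = np.asarray(left, dtype=float)
        right = np.asarray(right, dtype=float)

        # Intensity difference: ratio of RMS energies, expressed in dB.
        rms_left = np.sqrt(np.mean(left ** 2))
        rms_right = np.sqrt(np.mean(right ** 2))
        iid_db = 20.0 * np.log10(rms_left / rms_right)

        # Time difference: lag (in samples) of the cross-correlation peak.
        # With this convention a positive lag means the left signal lags,
        # i.e. the source is closer to the right microphone.
        corr = np.correlate(left, right, mode="full")
        lag = int(np.argmax(corr)) - (len(right) - 1)
        itd = lag / fs
        return itd, iid_db

    def azimuth_from_itd(itd):
        """Map an interaural time difference to an azimuth (radians) using a
        simple far-field, free-field two-microphone model."""
        s = np.clip(itd * SOUND_SPEED / MIC_DISTANCE, -1.0, 1.0)
        return np.arcsin(s)

    if __name__ == "__main__":
        # Synthetic check: a noise burst delayed by ~0.2 ms and attenuated
        # at the right microphone, i.e. a source on the left side.
        fs = 44100
        rng = np.random.default_rng(0)
        sig = rng.standard_normal(2048)
        left = sig
        right = np.roll(sig, 9) * 0.8
        itd, iid_db = interaural_cues(left, right, fs)
        print(f"ITD = {itd * 1e3:.3f} ms, IID = {iid_db:.1f} dB, "
              f"azimuth = {np.degrees(azimuth_from_itd(itd)):.1f} deg")

In this toy setup the cross-correlation stands in for a phase-difference estimate;
a pinna-like lobe in front of each microphone would make both cues vary more
strongly with source direction, which is the effect described above.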