BIRDS: Acoustic Sensor Array of Bird Communication Networks
The intent of this project is to permit humans to understand the grammar and meaning of bird songs.
This is an NSF funded project that includes media artist Victoria Vesna who is currently developing an installation in collaboration with evolutionary biologist Charles Taylor (UCLA) and physicist Takashi Ikegami (University of Tokyo).
Recent advances in sensor arrays, computation, and computational linguistics finally make this long-sought goal achievable. The approach taken in this project is to:
- collect large volumes of bird-song recordings from acoustic sensor arrays in a variety of natural settings;
- process the data with software, some existing and some to be developed, using new advances in beam-forming to localize sources, filter out noise, identify events of interest, classify them by species and individual, and combine the results with behavioral observations;
- store this information in a large database shared among the collaborating research groups; and
- analyze the data with computational-linguistic tools to identify the syntax of the songs, combine that syntax with information about the context in which each song occurred, and apply new software methods to identify the songs' meaning.
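The localization step above rests on measuring when the same sound reaches different microphones in the array. A minimal sketch of that idea, using pure-Python cross-correlation to estimate the time delay of arrival between two microphones, is shown below; the function names, sample rate, and test signal are illustrative assumptions, not part of the project's actual software.

```python
# Hypothetical sketch of time-delay-of-arrival (TDOA) estimation, the core
# measurement behind beam-forming source localization. All names and signal
# parameters here are illustrative assumptions.

def cross_correlate(a, b, max_lag):
    """Return (best_lag, score): the lag of b relative to a that maximizes
    the overlap between the two signals."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        score = 0.0
        for i in range(len(a)):
            j = i + lag
            if 0 <= j < len(b):
                score += a[i] * b[j]
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag, best_score

SAMPLE_RATE = 8000  # Hz (assumed)

# A short synthetic "chirp": a rectangular pulse in an otherwise silent signal.
pulse = [0.0] * 100
for i in range(40, 60):
    pulse[i] = 1.0

# Simulate the same sound reaching microphone B 7 samples after microphone A.
delay = 7
mic_a = pulse
mic_b = [0.0] * delay + pulse[:-delay]

lag, _ = cross_correlate(mic_a, mic_b, max_lag=20)
print(lag)  # estimated delay in samples: 7
```

With the speed of sound c ≈ 343 m/s, the estimated lag converts to a path-length difference `lag / SAMPLE_RATE * c`; delays from several microphone pairs can then be triangulated to place the singing bird in space.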
We will then test predictions derived from those inferences and explore the consequences for individual and community ecology. The art installation now in development will draw on this large database, aiming to engage audiences by reintroducing the sounds of birds into a daily environment from which technological noise has edited them out.
Acknowledgements & Credits
Victoria Vesna, Charles Taylor (UCLA), Takashi Ikegami (University of Tokyo).