This is an audio transcript of the Tech Tonic podcast: ‘Can AI help us speak to animals? Part one’
John Thornhill
Listen (chirping birds). What can you hear? Breathe in and listen more closely. If we listen closely to the sounds of the natural world, we can hear a lot more than we first realise. But human hearing is limited, and outside the range of our ears, the world can be a noisy place. If we could expand our hearing to the lower ranges, to what’s called infrasound, we might hear icebergs splitting halfway across the world, even the rhythmic pulsing of the Earth’s crust as waves crash across its continental shelves. And here’s something else you’d pick up below the human hearing range: the sound of species communicating — elephants, tigers and peacocks interacting in the infrasound. Then in the ultrasound, there’s chatter on coral reefs. Corn plants are clicking (clicking), and mice and beetles (animal sounds) are emitting sound waves at frequencies too high for the human ear. Well, as microphones have got better, this world of sound is opening up to us. But we’re not just hearing things we could never hear before. Scientists are also using the latest AI to process and make sense of the sounds of the animal and plant world. And some now believe we could one day understand what these species are saying — that, in fact, we might be on the brink of a Google Translate for the non-human world.