Let’s meet some Elephant Whisperers and Listeners who are keeping their eyes and ears peeled for all the secrets that elephants share in the deep forest.
From AI-led anti-poaching tools and virtual fences at Zambia’s Kafue National Park, to recording the sounds of humpback whales and using machine learning in the NOAA Pacific Islands, to using heat signatures for the conservation of koalas in Australia: technology is rising like a trumpet in forests and islands across the globe. Today, more than ever, technology is helping create better conservation strategies, strengthening anti-poaching squads to catch hunters, and deepening our understanding of forest elephants.
Take Cornell University’s Elephant Listening Project, which uses acoustic monitoring to help conserve the tropical forests of Africa, with a focus on forest elephants. The project began in 2007 to gather new data from the forests of Central Africa, achieve conservation goals, and build capacity in Central Africa to ensure sustainable conservation into the future. It has collected sounds from the forest at 32 different locations, many monitored continuously for years, and holds the world’s largest archive of sounds from Central Africa: almost one million hours. More than that, it has to its credit work underlining the effects of oil exploration on elephant behavior, studies of seasonal use at a multitude of forest clearings, successful alerts on illegal hunting, and probes into the nocturnal behavior of forest elephants.
We got to chat with a key member of this team, Daniela Hedwig, who studied the vocal behavior of gorillas in the Central African Republic and Uganda for her PhD. She is a Research Associate and the team lead of the Elephant Listening Project at the K. Lisa Yang Center for Conservation Bioacoustics, Cornell University. She has also led a biomonitoring program in a national park in Gabon and worked with WWF Germany on wildlife-crime issues, focusing on the African elephant poaching crisis. In this tete-a-tete, we learnt a lot about where technology has created a rumble and what can still be improved with AI and real-time data. All ears?
Tell us the core idea of this project. Why listening to, and not watching, elephants?
Elephants are very vocal creatures. Until 1984, when Katy Payne discovered that much of their communication happens below the threshold of human hearing, not much was known about the ‘sound’ aspect of elephants. For instance, at the Dzanga Bai in the Central African Republic, ELP researchers found that humans may hear only around 40 per cent of all the calls elephants produce. Most of the higher, human-audible frequencies rapidly attenuate in the warm, humid air; by the time a call has traveled 50-100 meters, only the lowest, inaudible frequencies are left.
Hence, it is all about acoustic arrays, which allow us to tell where a call originated. These arrays are very powerful tools. When calls are so low that we can’t hear them from an observation tower, an array can be the answer: it can tell us who gave a call that was associated with some interesting behavior. Acoustics also gives us a much larger range compared to images or video.
How does it all work?
We use devices that are hung on trees: Autonomous Recording Units (ARUs), which can record unsupervised in nature for extended periods of time. A typical array needs five or more recording units, with clocks synchronized to millisecond-level accuracy. They come with a battery, an internal microphone, and computer-configurable settings, so we can set different recording schedules and define the sampling rate and the kind of frequencies we want to record. The devices also have to be spaced close enough together that three or more record the same call. They help us estimate the size of an elephant population from calls sampled across a huge landscape, and they tell us a lot about elephant behavior and movements.
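For the technically curious, the localization idea behind these arrays can be sketched in a few lines of code: given the positions of synchronized recorders and the times at which the same call reaches each of them, the likely origin is the point whose predicted arrival-time differences best match the observed ones. The minimal Python sketch below uses made-up recorder coordinates, arrival times, and an assumed speed of sound; it is a simplified stand-in for the project’s actual localization software, not a description of it.

```python
# Toy time-difference-of-arrival (TDOA) localization, for illustration only.
# All coordinates and arrival times below are invented; the sound speed is an
# assumed value for warm, humid forest air.
import numpy as np

SOUND_SPEED = 350.0  # m/s (assumption)

# Hypothetical (x, y) positions, in metres, of four synchronized recorders
recorders = np.array([[0.0, 0.0], [1000.0, 0.0], [0.0, 1000.0], [1000.0, 1000.0]])

# Hypothetical arrival times (seconds) of the same call at each recorder
arrival_times = np.array([2.429, 3.304, 2.917, 3.634])

def tdoa_mismatch(candidate, recorders, arrival_times):
    """Squared error between observed and predicted arrival-time differences
    if the call had come from `candidate` (the emission time cancels out)."""
    predicted = np.linalg.norm(recorders - candidate, axis=1) / SOUND_SPEED
    obs_diff = arrival_times - arrival_times[0]
    pred_diff = predicted - predicted[0]
    return np.sum((obs_diff - pred_diff) ** 2)

# Brute-force grid search over and around the area covered by the array
best, best_err = None, np.inf
for x in np.linspace(-200, 1200, 141):   # 10 m steps
    for y in np.linspace(-200, 1200, 141):
        err = tdoa_mismatch(np.array([x, y]), recorders, arrival_times)
        if err < best_err:
            best, best_err = (x, y), err

print("Estimated call origin:", best)  # roughly (300 m, 400 m) for these inputs
```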
Has the technology become less primitive and more rugged over the years?
Notably, the K. Lisa Yang Center for Conservation Bioacoustics was an early leader in the design and use of ARUs, both in the ocean and on land. In fact, our first terrestrial recorders were built into PVC sewer pipe with a laptop computer hard drive for data storage. They used circuit boards that wrote the sound files in binary format. Earlier, the devices needed about 16 kg of batteries to run for three months. But we have evolved to SWIFT, designed by BRP, which requires only 1 kg of battery power to run for three months. It can handle 8 kHz sampling and can last for four months. This matters a lot because we work in remote forests, especially as it entails sending a team to maintain the recording grid and collect the data. We use sound to better understand the biodiversity of forests in Africa. We are basically eavesdropping, recording the elephants’ sounds and the soundscapes of the forest.
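To get a sense of why light, low-power recorders and efficient data handling matter so much, here is a back-of-envelope calculation of how quickly continuous recording piles up. The 8 kHz sample rate comes from the interview; the bit depth, recorder count, and deployment length are illustrative assumptions, not ELP figures.

```python
# Rough data-volume estimate for continuous low-frequency recording.
sample_rate_hz = 8_000        # mentioned in the interview
bytes_per_sample = 2          # 16-bit mono audio (assumption)
seconds_per_day = 24 * 60 * 60

gb_per_day = sample_rate_hz * bytes_per_sample * seconds_per_day / 1e9
print(f"One recorder: ~{gb_per_day:.2f} GB/day")          # ~1.38 GB/day

days = 4 * 30                 # roughly a four-month deployment (assumption)
recorders = 50                # hypothetical grid size (assumption)
print(f"{recorders} recorders, {days} days: ~{gb_per_day * days * recorders / 1000:.1f} TB")
# ~8 TB, in the same ballpark as the volumes mentioned later in this interview
```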
How does this compare to traditional monitoring approaches?
Earlier, every five years, people would walk on foot, count elephant dung, and infer something about the elephant population. It was labor-intensive and gave only a snapshot. Acoustic monitoring, by contrast, is continuous, with updates as frequent as monthly or weekly. It can also be deployed across large landscapes, and it is unbiased.
Expand on that please.
When you look at how anti-poaching teams work, you have to realize that they carry heavy bags and walk through deep forests in tough conditions. Earlier, their efforts were limited to the direction they chose to walk in. Now, we can create maps that show gunshot data and overlay it on elephant habitats. This can be a big help for law-enforcement people in devising better anti-poaching efforts and better protection of elephants. We can also guide teams on elephant corridors and predict suitable habitats.
[Image: Elephants are the gardeners and architects of forests]
What else do you use acoustics for?
We also try to match certain types of calls with behaviors to build a sort of “dictionary” of elephant vocalizations. Elephants are capable of making extremely low frequency and powerful calls—sometimes as loud as construction tools (90 to 117 dB Sound Pressure Level). Under the best ambient conditions, these low sounds carry over distances of several kilometers and might enable elephants to stay in contact despite separation in the dense rain forest.
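To put those numbers in perspective, here is a crude geometric-spreading estimate of how much of that source level survives over distance. It deliberately ignores the frequency-dependent atmospheric absorption and vegetation effects mentioned earlier, which are exactly what strip away the higher frequencies. The 117 dB figure is from the interview; treating it as the level near the source, and the distances chosen, are illustrative assumptions.

```python
# Spherical-spreading loss only: level drops by 20*log10(distance) dB relative
# to the level at 1 m. A deliberate oversimplification for illustration.
import math

source_level_db = 117.0  # dB SPL, treated here as the level at ~1 m (assumption)

for distance_m in (100, 1_000, 3_000):
    spreading_loss_db = 20 * math.log10(distance_m)
    print(f"{distance_m:>5} m: ~{source_level_db - spreading_loss_db:.0f} dB")
# ~77 dB at 100 m, ~57 dB at 1 km, ~47 dB at 3 km (spreading loss alone)
```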
Is all this, especially the anti-poaching data, real-time in nature?
No, that’s one constraint we are struggling with. We want to get there. Perhaps, in the next five to six years.
How can AI-based algorithms help, especially now that you have so much data for patterns?
Yes, a lot of data has to be verified and interpreted manually. We are dealing with very long recordings, as much as 8 TB of data every four months. It’s almost impossible for humans to manage all this. Algorithms can help with automatic detection. We are working with computer scientists to sharpen and accelerate that part, especially on making sure that the technology does not miss signals and gunshots. The current AI-based detectors are power-hungry and computationally inefficient. We are working on improvements.
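As a flavour of what automatic detection involves at its simplest, the sketch below flags stretches of audio where energy in the low-frequency band typical of elephant rumbles rises well above the background. It is a toy energy detector written for this article, not the ELP’s actual detection pipeline; the band edges, threshold, and synthetic test signal are all assumptions.

```python
# Toy low-frequency event detector: flag time windows whose 10-40 Hz energy
# exceeds the median background by a fixed margin. Illustrative only.
import numpy as np
from scipy import signal

def detect_low_frequency_events(audio, sample_rate, low_hz=10.0, high_hz=40.0,
                                threshold_db=10.0):
    """Return (start_s, end_s) spans where band energy exceeds the median
    by `threshold_db` decibels."""
    freqs, times, spec = signal.spectrogram(audio, fs=sample_rate,
                                            nperseg=4096, noverlap=2048)
    band = (freqs >= low_hz) & (freqs <= high_hz)
    band_energy_db = 10 * np.log10(spec[band].sum(axis=0) + 1e-12)
    above = band_energy_db > np.median(band_energy_db) + threshold_db

    events, start = [], None
    for t, flag in zip(times, above):
        if flag and start is None:
            start = t
        elif not flag and start is not None:
            events.append((start, t))
            start = None
    if start is not None:
        events.append((start, times[-1]))
    return events

# Synthetic demo: 60 s of faint noise at 8 kHz with a 20 Hz "rumble" from 20-25 s
sr = 8000
t = np.arange(60 * sr) / sr
audio = 0.01 * np.random.randn(t.size)
rumble = (t >= 20) & (t < 25)
audio[rumble] += 0.2 * np.sin(2 * np.pi * 20 * t[rumble])
print(detect_low_frequency_events(audio, sr))  # approximately [(20, 25)]
```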
Are technology companies contributing something?
Yes. AWS hosts our data for free. Microsoft and some other companies have also come in at points along this journey. Our data is open, and we are keen to share it with researchers in other conservation areas. Technology is difficult to maintain in remote parts of Africa, with power scarcity, the constraints of solar panels, no Wi-Fi, and other practical issues. There is a lot of room for technology to get better there.
Daniela Hedwig
Team lead, Elephant Listening Project
By Pratima H
pratimah@cybermedia.co.in