Smartphone microphones used for disaster search and rescue
When a natural disaster strikes, time is of the essence if people are trapped under rubble. While conventional search-and-rescue methods use radar-based detection or employ acoustics that rely on sounds made by victims, Shogo Takada is working on a way to use smartphone microphones to assist in locating disaster victims.
Takada’s method combines two types of sound sources: monopole and dipole. Monopole sources radiate sound equally in all directions, whereas dipole sources radiate from the front and back but cancel out to the sides. Because dipole sources are directional, they can help researchers estimate the azimuth angle of the sound source, giving them information about the source’s location.
In a disaster situation, a rescuer would emit two dipole sound signals, which would be received by the microphone of a trapped victim’s phone; the phone would then send an electromagnetic signal to report their location. In the presence of sound-reflecting debris, the rescuer can also emit a monopole sound to help reduce the effect of the reflections. All of the sound sources can be incorporated into a formula to estimate the location of the trapped person.
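As a rough illustration of how combining directional dipole sources with a monopole reference could yield an azimuth estimate, the sketch below assumes two orthogonally oriented dipoles with ideal cosine and sine directivity and a monopole used only to normalise away propagation loss. The function name, signal model and normalisation are illustrative assumptions, not the formula used in Takada’s study.

```python
import numpy as np

def estimate_azimuth(dipole_x_amp, dipole_y_amp, monopole_amp):
    """Toy azimuth estimate from amplitudes received at one microphone.

    Assumes two dipole sources oriented along x and y with ideal cos/sin
    directivity, plus a monopole source used as an amplitude reference so
    that distance and attenuation through debris cancel out. This is a
    hypothetical sketch, not the method presented at the meeting.
    """
    # Normalise each dipole amplitude by the monopole reference.
    gx = dipole_x_amp / monopole_amp   # ~ cos(theta)
    gy = dipole_y_amp / monopole_amp   # ~ sin(theta)
    return np.degrees(np.arctan2(gy, gx))

# Example: a victim located at roughly 30 degrees from the x-oriented dipole,
# with an unknown propagation loss common to all three sources.
true_angle = np.radians(30.0)
attenuation = 0.2
print(estimate_azimuth(attenuation * np.cos(true_angle),
                       attenuation * np.sin(true_angle),
                       attenuation))  # prints ~30.0
```

Because the monopole term divides out the unknown attenuation, the estimate in this toy model depends only on the direction to the microphone, which is the intuition behind pairing directional and omnidirectional sources.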
“This method is effective for locating victims buried under debris or soil caused by earthquakes or landslides because sound waves can propagate through them,” said Takada, a student at The University of Tokyo. “It could also be used to locate rescuers affected by secondary disasters.”
Takada’s technique has already been validated in a field test at a disaster training site, where it estimated the direction of a hypothetical victim with an azimuth error of 5.04° while searching over an area of 10 m².
“One limitation is that the method assumes the victim should possess a device equipped with a microphone,” Takada noted. “This is a more restrictive condition compared to traditional techniques that detect sounds or voices emitted by the victim.”
However, given the widespread use of smartphones, Takada believes that this technique is promising and plans to refine it further.
“In future work, we plan to develop a method that can estimate not only the azimuth angle but also the elevation angle of the sound source,” Takada said. “Additionally, we aim to expand the system to use two sound sources to achieve three-dimensional localisation.”
Takada presented his results as part of the Sixth Joint Meeting of the Acoustical Society of America and Acoustical Society of Japan, held in Honolulu, Hawaii, earlier this month.
