Ruohan Gao, a
@univofmaryland assistant professor of computer science, is working to help virtual worlds feel closer to how people experience real life, through sight, sound, and, eventually, touch.
“We want to mimic, and maybe enhance, how humans hear, see and feel the world,” said Gao.
Gao is exploring that idea through SonoWorld, a system that turns a single image into an explorable 3D audiovisual environment with spatial sound aligned to the visual scene. As users move through the virtual world, sounds shift with them, making the environment feel more connected to real-world spaces.
Gao developed the project with Distinguished University Professor Ming C. Lin and UMD computer science Ph.D. students Derong Jin and Xiyi Chen. Their paper will be presented at the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) in June.
Read more: Link in bio