On the third floor of Arup Global Acoustics’ office in Lower Manhattan is a small, fabric-enclosed space that looks and feels like a cocoon. It features an enormous curved screen, a bone-shaking sound system, and an Oculus Rift VR headset. Arup, a design and engineering firm, calls it the SoundLab.
The company line is that the SoundLab creates virtual sonic environments to help architects improve the acoustics of their buildings. What it really does, though, is transport you visually and aurally into any space—a concert hall, a subway platform, a cathedral. It’s one thing to make it look like you’re there; VR is good at that. It’s quite another to make it sound like it. “We’re using the data to help you feel,” says Raj Patel, a principal at Arup Global Acoustics.
Architects working on SFMOMA used the SoundLab to tune two theaters there; the same technology helped dampen the potential cacophony of the Fulton Street Transit hub in Manhattan. To make it work, engineers build a computer model of a building or space and then map a web of measurements called “impulse responses” that create an acoustic signature. Then the engineers record source sounds—a piano concerto, for example, or the rumble of a subway—in a sound-dampening anechoic chamber. Using acoustic simulation software, the engineers pass that audio signal through the virtual room’s acoustic signature, simulating how it would sound anywhere in that space.
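At its core, this technique is convolution reverb: the dry recording is convolved with the room’s impulse response to produce what the recording would sound like in that room. Here is a minimal NumPy sketch of the idea—everything in it (the sine-tone “source,” the synthetic decaying-noise impulse response, the function name) is an illustrative stand-in, not Arup’s actual data or software.

```python
import numpy as np

SAMPLE_RATE = 48_000  # samples per second

def simulate_room(dry_signal: np.ndarray, impulse_response: np.ndarray) -> np.ndarray:
    """Convolve an anechoic ("dry") recording with a room's impulse
    response to approximate how it would sound in that room."""
    wet = np.convolve(dry_signal, impulse_response)
    peak = np.max(np.abs(wet))
    return wet / peak if peak > 0 else wet  # normalize to avoid clipping

# Hypothetical stand-ins for measured data: a 440 Hz tone as the dry
# source, and exponentially decaying noise as a toy impulse response.
rng = np.random.default_rng(0)
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE                        # 1 second of audio
dry = np.sin(2 * np.pi * 440 * t)                               # anechoic tone
n_ir = SAMPLE_RATE // 2                                         # ~0.5 s reverb tail
decay = np.exp(-6 * np.arange(n_ir) / SAMPLE_RATE)              # exponential decay
ir = rng.standard_normal(n_ir) * decay                          # toy "room"

wet = simulate_room(dry, ir)
```

In a real SoundLab-style pipeline there would be one measured impulse response per listening position and per speaker, so the same dry recording can be re-rendered from any seat in the virtual room.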
The realism is startling. Sitting inside the SoundLab, surrounded by a sphere of more than 40 speakers, was almost indistinguishable from sitting in a concert hall. I heard melodies over my shoulder. I heard reverberations from high above. But more than that—changing the room changed the sound. A long, narrow room felt more intimate; a larger one felt fuller. That level of control lets designers go way beyond how a piano sounds in a concert hall; they can tune things as mundane as mechanical noise and structural vibrations.
When the SoundLab started almost 20 years ago, processing all this data required 400 computers running 24 hours at a stretch. These days, its flagship simulators—Arup has 14 SoundLabs worldwide—run on two Mac Pros with quad-core 3.7GHz Intel Xeon E5 processors and two Windows machines with 4.6GHz Intel Core i7-5960X processors.
Of course, sound isn’t the only way to evoke the feeling of a space. Arup engineers are working on ways to simulate sensations like movement and vibration, too. A motorized “shake table” lets them model how, say, a concert hall might vibrate during a concert—or an earthquake. Patel and his colleagues are coordinating with researchers to recreate sensations of light, touch, temperature, and even smell. “There are lots of people exploring these things separately,” says Patel. “The ultimate thing would be to pull all of it together in one environment. At some point it’s all going to come together.” And that lab is going to be a hell of a place to visit.
via Wired Top Stories http://ift.tt/1TnLJK4