To more fully serve users’ personal and business needs, digital interfaces are evolving beyond screen-based devices; augmented reality, eye tracking, hand gesture, and brain interface technologies are being refined for mainstream use. Though the technology is on the way, its impact has yet to be determined.
How will these emerging interfaces disrupt the way users interact with digital media?
NYC Media Lab, a public-private partnership launched by the City (NYCEDC), Polytechnic Institute of New York University (NYU-Poly), and Columbia University and based at NYU-Poly, set out to answer this question. Last month, NYC Media Lab gathered 150 researchers, startups, and media executives at Razorfish for Future Interfaces, an evening of live demos, stimulating panel discussions on the latest interface developments, and big conjectures on the future.
Interactive interfaces, such as touch screens or voice recognition, are hardly new technologies. The first touch screen patent was issued 42 years ago, noted NYC Media Lab Executive Director Roger Neal, but the rate of change within the last five years has been particularly astounding.
Panelist Ray Velez, Chief Technology Officer at Razorfish, said, “One of the key inflection points we're seeing right now is that there's a huge change of pace in terms of what technology can do for us, and how much new technology is coming out. But what's more interesting is the pace of adoption.”
Chief Science Advisor at Tactonic Technologies and Professor at NYU Courant Institute, Ken Perlin, added, “There's a stealth quality about Moore's Law. That means that things can chug along for decades and people don't realize that they're about to rise above a certain threshold. I think what we're seeing is that certain levels of graphics, displays, connectivity, and technological enablement — because of this exponential growth that's been going on for decades — have just hit a place where we're getting this wonderful perfect storm.”
Attendees were invited to interact with emerging interfaces made possible by new technological developments. Demos included multi-pressure touch panels from Tactonic Technologies; interactive projected retail displays from Perch Interactive; an augmented reality game from Columbia University’s Computer Graphics and User Interfaces Lab; a heartbeat-emitting, transhuman structure from Parsons MFA Design and Technology; projects from NYU ITP, including a musical-playing, spandex-powered installation; and immersive, multiscreen retail concepts from Razorfish’s Emerging Experiences lab.
The demos offered a glimpse of the future of human-machine interaction, with possibilities beyond the touch screen. Panelist Carla Diana, faculty member at the School of Visual Arts and the University of Pennsylvania, said, “The next step is that our interfaces may or may not be glass.” Panelist Chris Bregler, Director of NYU Movement Lab and Professor at NYU Courant Institute, agreed: “Just forget everything you’ve done before with a mouse, keyboard, or screen -- we can go beyond that.”
For more about NYC Media Lab, click here.