Nik Karpinsky quickly tapped out a few computer commands until Zeus, in all his bearded and statuesque glory, appeared in the middle of a holographic glass panel mounted to an office desk.
The white statue stared back at Karpinsky. Then a hand appeared and turned the full-size head to the right and to the left. Yes, it was quite clear: Zeus really was pictured in 3-D.
And there it was, from one computer workstation on the second floor of Iowa State University's Howe Hall to another down on the first floor: 3-D teleconferencing that's live, real-time and streaming at 30 frames per second.
"Four years ago, this would not have been possible," said Karpinsky, an Iowa State doctoral student in human computer interaction who's been working day and night to make the technology a reality.
Part of the problem is the complexity of the technology, said Song Zhang, Iowa State's William and Virginia Binger Assistant Professor of Mechanical Engineering, an associate of the U.S. Department of Energy's Ames Laboratory and the leader of the 3-D imaging project.
"There are a lot of skills involved," he said. "You have to do programming, optical engineering, hardware, software and networking."
To make it all work, Karpinsky and Zhang had to solve three big technical problems: capturing the 3-D images, transmitting the images and displaying the images.
"I was originally worried about transmission," Karpinsky said. "But we had to focus on all three."That optical hardware is networked and connected to a standard computer with a graphics card. The computer combines, processes and compresses the images. (And it really compresses them – from 700 megabits per second to less than 14 megabits per second.)
The compression allows transmission of 3-D images to another computer, even over wireless networks.
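To put those reported figures in perspective, here is a back-of-the-envelope sketch of what the compression buys at 30 frames per second. The 700 Mbps, 14 Mbps and 30 fps numbers come from the article; everything else (the per-frame breakdown and the script itself) is a hypothetical illustration, not the team's actual pipeline or pixel format.

```python
# Rough bandwidth arithmetic for the 3-D stream described in the article.
# Reported figures: 30 fps, ~700 Mbps uncompressed, <14 Mbps compressed.
FRAME_RATE = 30            # frames per second (reported)
RAW_STREAM_MBPS = 700      # uncompressed 3-D stream, megabits/s (reported)
COMPRESSED_MBPS = 14       # compressed stream, megabits/s (reported)

# Bits carried by a single frame before and after compression.
raw_bits_per_frame = RAW_STREAM_MBPS * 1_000_000 / FRAME_RATE
compressed_bits_per_frame = COMPRESSED_MBPS * 1_000_000 / FRAME_RATE
compression_ratio = RAW_STREAM_MBPS / COMPRESSED_MBPS

print(f"Raw data per frame:        {raw_bits_per_frame / 8 / 1e6:.1f} MB")
print(f"Compressed data per frame: {compressed_bits_per_frame / 8 / 1e3:.0f} KB")
print(f"Compression ratio:         {compression_ratio:.0f}x")
```

Run as written, this works out to roughly 2.9 MB of raw data per frame shrinking to under 60 KB, a compression ratio of about 50x, which is why a 14 Mbps stream can travel over ordinary wireless networks.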
The idea, Karpinsky said, is for the projectors to become the eyes of the teleconferencing system: "What the projector sees is what you see."
Karpinsky and Zhang see a bright future for the technology they've developed with support from the National Science Foundation and Iowa State's Virtual Reality Applications Center.
Zhang said the next steps include developing and testing applications for smart phones. He thinks the technology is only a few years away.
"In the future, we can do all of this 3-D video conferencing on the phone," he said. "These phones are powerful enough to do all the computation."
Zhang also wants to develop the 3-D teleconferencing technology for use in powerful virtual reality environments such as Iowa State's C6, a six-sided room that surrounds users with 100 million pixels of 3-D images.
(Karpinsky won't be part of the continuing research and development work at Iowa State. He graduates this semester and will move to Washington state to work for Microsoft. He'll also work for a startup imaging company called Phasica3D that spun out of Iowa State research.)
All of these 3-D developments, Zhang said, are coming far faster than he expected.
"When Nik first proposed this idea to me," he said, "I never believed we could reach this level by now."
Source: phys.org