Potential applications abound. "In addition to sound and vision, virtual reality programs could include touch as well," said Mandayam A. Srinivasan, director of MIT's Touch Lab and leader of the MIT team.
Imagine haptic (touch) feedback for a surgeon practicing telemedicine. What about artists from around the world collaborating on a virtual sculpture? They could create different forms, colors, sounds and textures accessible over the Internet. Students in a physics class might "feel" the forces within the nucleus of an atom. "That application could also be sent across a very widespread network," Srinivasan said.
"We really don't know all of the potential applications," he concluded. "Just like Bell didn't anticipate all of the applications for the telephone."
The feat was first accomplished on May 23 of this year. The researchers plan to demonstrate it anew at an Internet2 conference Oct. 28-29 at the University of Southern California. That two-part demo will transmit touch signals between California and MIT, and between California and University College London (UCL).
"As far as we know, this is the first time that touch signals have been transmitted over long distances, particularly across the Atlantic," said Srinivasan, who holds appointments in MIT's Research Laboratory of Electronics and Department of Mechanical Engineering. In 1998, his group transmitted touch signals between two rooms at MIT, allowing two users to perform a cooperative manipulation task in a shared virtual environment.
"Touch is the most difficult aspect of virtual environments to simulate, but we have shown in our previous work with MIT that the effort is worth-while. Now we are extending the benefits of touch feedback to long distance interaction," said Mel Slater, Professor of Virtual Environments in UCL's Computer Science Department and Srinivasan's UCL counterpart.
Srinivasan and Slater's colleagues on the work are former MIT graduate student Boon K. Tay; current MIT graduate student Jung Kim of mechanical engineering; and J. Jordan, J. Mortensen and M. Oliveira at UCL.
All are authors of a paper describing a 20-volunteer experiment on the work. That experiment showed that people completing a collaborative long-distance computer task that included the sense of touch reported a significantly greater sense of having a partner than those without access to the touch interface. The paper was presented Oct. 9 in Porto, Portugal, at PRESENCE 2002: The 5th Annual International Workshop on Presence.
HOW IT WORKS
The demonstration of long-distance touch involves a computer and a small robotic arm that takes the place of a mouse. A user can manipulate the arm by clasping its end, which resembles a thick stylus. The overall system creates the sensation of touch by exerting a precisely controlled force on the user's fingers. The arm, known as the PHANToM, was invented by others at MIT in the early 1990s and is available commercially through SensAble Technologies. The current researchers modified the PHANToM software for the transatlantic application.
On the computer screen, each user sees a three-dimensional room. Within that room are a black box and two tiny square pointers that show the users where they are in the room. They then use the robotic arms to collaboratively lift the box.
That's where the touch comes in. As a user at MIT moves the arm--and therefore the pointer--to touch the box, he can "feel" the box, which has the texture of hard rubber. The user in London does the same thing. Together they attempt to pick up the box--one applying force from the left, the other from the right--and hold it as long as possible. All the while, each user can feel the other's manipulations of the box.
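The shared-box interaction described above can be sketched in a few lines of code. The following is a minimal illustrative model, not the actual PHANToM or UCL software; the spring-style contact force, the friction grip condition, and every name and constant below are assumptions for illustration only.

```python
STIFFNESS = 800.0  # N/m: assumed spring constant for the hard-rubber surface
FRICTION = 0.5     # assumed friction coefficient between pointer and box
WEIGHT = 2.0       # N: assumed weight of the virtual box

def contact_force(penetration_m: float) -> float:
    """Reaction force a user feels when pressing into the box surface."""
    return STIFFNESS * max(penetration_m, 0.0)

def box_is_held(left_penetration_m: float, right_penetration_m: float) -> bool:
    """The box lifts only if friction at BOTH grips can carry its weight."""
    grip = min(contact_force(left_penetration_m),
               contact_force(right_penetration_m))
    return FRICTION * grip >= WEIGHT / 2  # each side carries half the weight

print(contact_force(0.005))        # ~4 N of reaction at 5 mm penetration
print(box_is_held(0.005, 0.006))   # True: both grips press firmly enough
print(box_is_held(0.005, 0.001))   # False: the right grip is too weak
```

A real haptic system runs this kind of computation hundreds or thousands of times per second, exchanging positions over the network so that each side can feel the other's manipulations.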
An MIT News Office writer participated in a recent demonstration. The force from the participant in London felt so real that the writer jumped backward.
Jung Kim, the MIT researcher who participated in the May demonstration, describes the experience as "amazing. The first touch from the other side of the world!"
There are still technical problems to solve, however, before everyday applications become available. Chief among them is the long delay, due to Internet traffic, between the moment one user "touches" the on-screen box and the moment the second user feels the resulting force. "Each user must do the task very slowly or the synchronization is lost," Srinivasan said. When that happens, the box vibrates both visually and to the touch, making the task much more difficult.
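The vibration Srinivasan describes is a well-known effect of delayed force feedback, and a toy simulation can illustrate it. The sketch below is not the researchers' model, and all of its constants are illustrative: a virtual box is pulled toward a grip point by a stiff spring. When the restoring force is computed from the box's current position, the motion settles; when it is computed from a delayed position reading, as happens over a network, the feedback loop injects energy and the oscillation grows.

```python
def simulate(delay_steps: int, steps: int = 1000, dt: float = 0.01) -> float:
    """Peak displacement of a box pulled toward a grip point by a spring
    whose force is computed from a delayed position reading."""
    k, c = 100.0, 2.0                  # spring stiffness and damping (assumed)
    x, v = 1.0, 0.0                    # box starts displaced from the grip point
    history = [x] * (delay_steps + 1)  # position samples the controller sees
    peak = 0.0
    for _ in range(steps):
        x_seen = history[0]            # oldest sample: the delayed reading
        v += (-k * x_seen - c * v) * dt
        x += v * dt
        history.append(x)
        history.pop(0)
        peak = max(peak, abs(x))
    return peak

print(simulate(delay_steps=0))   # no delay: the box settles toward the grip
print(simulate(delay_steps=5))   # 50 ms delay: the oscillation grows instead
```

Moving slowly, as Srinivasan advises, keeps the velocities (and hence the energy injected per cycle) small enough for the users to finish the task before the vibration takes over.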
Srinivasan is confident, however, that the time delay can be reduced. "Even in our normal touch, there's a time delay between when you touch something and when those signals arrive in your brain," he said. "So in a sense, the brain is teleoperating through the hand."
A one-way trip from hand to brain takes about 30 milliseconds; that same trip from MIT to London takes 150-200 milliseconds, depending on network traffic. "If the Internet time delays are reduced to values less than the time delay between the brain and hand, I would expect that the Internet task would feel very natural," Srinivasan said.
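Using the figures quoted above, the gap works out as follows (a back-of-envelope calculation, not a measurement):

```python
# One-way delay figures quoted in the article.
HAND_TO_BRAIN_MS = 30            # approximate neural conduction delay
MIT_TO_LONDON_MS = (150, 200)    # observed Internet delay, best to worst

# How many times longer the network path is than the neural one.
best = MIT_TO_LONDON_MS[0] / HAND_TO_BRAIN_MS
worst = MIT_TO_LONDON_MS[1] / HAND_TO_BRAIN_MS
print(f"The Internet delay is {best:.1f}x to {worst:.1f}x the neural delay")
```

By Srinivasan's reasoning, closing that five-to-sevenfold gap is what would make the remote task feel natural.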
Although reducing network delay is the researchers' main hurdle, they also hope to improve the robotic arm and its capabilities, as well as the algorithms that let a user "feel" via computer.
The MIT researchers supplied the haptics expertise for the work; the UCL team covered software development and network issues. The two groups began their collaboration in 1998 when Slater was at MIT on sabbatical.