Haptics: The Technology of Simulating Human Touch
- Substitute Hands
- Worlds in Your Hand
- In the Realm of Illusion
- HUI, Not GUI
- Where Is This Taking Us?
At a lunch table some time back, I listened to several of my colleagues eagerly describing the robots that would make their lives easier. Typical was the servo arm mounted on a sliding rod in the laundry room. It plucked dirty clothes from the hamper one at a time. Using information from the bar code (which new laws would insist be sewn into every label), the waldo would sort these items into a top, middle, or lower nylon sack.
As soon as a sack was full of, say, permanent press or delicates, the hand would tip the contents into the washing machine. In this way, garments could be shepherded through the entire cycle until the shirts were hung on a nearby rack, socks were matched and pulled together, and pajamas were patted smooth and stacked on the counter.
Sounds like a great idea, right? I mean, how hard could it be for a robotic hand to feel its way around a collar until it connects with a label? As it turns out, that's pretty tricky. In fact, one of the things that keeps us from the robotic servants we feel sure are our due, and from virtual reality that lets us ski without risking a broken leg, is our limited knowledge of touch.
We understand quite a bit about how humans see and hear, and much of that information has been tested and refined by our interaction with computers over the past several years. But if we are going to get VR that really lets us practice our parasailing, the reality that we know has to be mapped and synthesized and presented to our touch so that it is effectively "fooled." And if we want androids that can sort the laundry, they have to be able to mimic the human tactile interface.
That leads us to the study of haptics, the technology of touch.
Research that explores the tactile intersection of humans and computers can be pretty theoretical, particularly when it veers into the realm of psychophysics. Psychophysics is the branch of experimental psychology that deals with the physical environment and our perception of it. Researchers in the field use experiments to pin down parameters such as sensory thresholds for signal perception, which mark the boundaries of what we can perceive.
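To make the idea of a sensory threshold concrete, here is a minimal sketch of the kind of staircase procedure psychophysicists use: the stimulus is weakened after each detection and strengthened after each miss, and the reversal points are averaged to estimate the threshold. The observer model, its parameters, and the function names are illustrative assumptions, not data or code from any real experiment.

```python
import math
import random

def simulated_observer(intensity, true_threshold=0.30):
    """Hypothetical observer: detection probability rises with stimulus
    intensity along a logistic curve centered on a hidden threshold."""
    p_detect = 1.0 / (1.0 + math.exp(-20.0 * (intensity - true_threshold)))
    return random.random() < p_detect

def staircase_threshold(trials=60, start=1.0, step=0.05):
    """Simple 1-up/1-down staircase: lower the stimulus after each detection,
    raise it after each miss, then average the reversal points to estimate
    the level the observer detects about half the time."""
    intensity = start
    last_response = None
    reversals = []
    for _ in range(trials):
        detected = simulated_observer(intensity)
        if last_response is not None and detected != last_response:
            reversals.append(intensity)
        last_response = detected
        intensity = max(0.0, intensity - step if detected else intensity + step)
    return sum(reversals) / len(reversals) if reversals else None

if __name__ == "__main__":
    print("Estimated detection threshold:", staircase_threshold())
```

Run a few times and the estimate hovers near the hidden threshold, which is exactly the sort of perceptual boundary such experiments are designed to locate.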
But once haptics research moves from theory into hardware and software, it concentrates on two primary areas of endeavor: tactile human-computer interfaces and devices that can mimic human physical touch, most specifically and most commonly artificial hands.
Substitute Hands
A lot of information can be conveyed by the human hand. Watching The Quiet Man the other night, I was struck by the scene in which the priest, played by Ward Bond, insists that Victor McLaglen shake hands with John Wayne. Angrily, McLaglen complies, but clearly the pressure he exerts far exceeds the requirements of the gesture. Both men are visibly "not wincing" as the Duke drawls, "I never could stand a flabby handshake myself."
When they release and back away from each other, the audience is left flexing its collective fingers in response.
In this particular exchange, complex social messages are presented to audience members, who recognize the indicators of pressure, position, and grip without being involved in the tactile cycle. Expecting mechanical hands to do everything ours can is a tall order, so researchers have been inching toward that goal for a long time by making them do just some of it.
Teleoperators, for example, are remotely controlled robotic arms and hands. They were first built to touch things too hot for humans to handle: the radioactive materials of the atomic energy programs that followed the Manhattan Project in the late 1940s and 1950s.
While operators had to be shielded from the radiation by a thick wall, the radioactive material itself had to be shaped with careful precision. A remote-controlled servo arm seemed like the perfect solution.
Accordingly, two identical mechanical arms were stationed on either side of a one-meter-thick quartz window. The joints of one were connected to the joints of the other by means of pulleys and steel ribbons. In other words, whatever an operator made the arm do on one side of the barrier was echoed by the device on the other side.
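In software terms, that pulley-and-ribbon coupling is simply a loop that copies joint positions from one arm to the other. The sketch below uses hypothetical MasterArm and SlaveArm classes standing in for real hardware drivers; it illustrates the master-slave principle rather than any actual teleoperator's control code.

```python
import math
import time

class MasterArm:
    """Hypothetical stand-in for the operator's arm: here it just traces
    slow sine waves on each joint, as if the operator were moving it."""
    def read_joint_angles(self):
        t = time.monotonic()
        return [10 * math.sin(t), 5 * math.sin(0.5 * t), 2 * math.sin(2 * t)]

class SlaveArm:
    """Hypothetical stand-in for the arm on the far side of the barrier."""
    def __init__(self):
        self.joint_angles = [0.0, 0.0, 0.0]

    def command_joint_angles(self, angles):
        self.joint_angles = list(angles)

def mirror(master, slave, cycles=5, period_s=0.01):
    """The software equivalent of the pulleys and steel ribbons: whatever
    the master joints do, the slave joints are commanded to repeat."""
    for _ in range(cycles):
        slave.command_joint_angles(master.read_joint_angles())
        time.sleep(period_s)

if __name__ == "__main__":
    m, s = MasterArm(), SlaveArm()
    mirror(m, s)
    print("Slave joints now at:", s.joint_angles)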
These were effective and useful instruments, allowing the operator to move toxic substances from a remote location, but they were "dumb." They offered no electronic control and were not linked to a computer.
Researchers working on this problem today concentrate on devices that can "feel" the density, shape, and character of materials that may be miles away and visible only on a computer screen. This kind of teleoperator depends on a haptic interface and requires some understanding of how touch works.
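One common way such an interface lets an operator "feel" a remote or virtual surface is penalty-based rendering: no force while the probe is free, and a spring-like push back once it penetrates the surface, with the spring stiffness standing in for the material's firmness. The one-dimensional sketch below is a simplified illustration of that idea; the contact_force helper and its numbers are assumptions chosen for demonstration only.

```python
def contact_force(probe_pos, wall_pos=0.0, stiffness=800.0):
    """Penalty-style haptic rendering in one dimension: while the probe is
    outside the surface there is no force; once it penetrates, push back
    with a spring force proportional to penetration depth. The stiffness
    (N/m) is what the operator perceives as the material's 'give'."""
    penetration = wall_pos - probe_pos   # probe below wall_pos means contact
    return stiffness * penetration if penetration > 0 else 0.0

# A quick pass over a few probe positions (meters) shows the force the
# operator's hand would feel as the remote tool presses into the material.
for x in (0.010, 0.000, -0.002, -0.005):
    print(f"probe at {x:+.3f} m -> feedback force {contact_force(x):6.1f} N")
```

A softer stiffness value makes the rendered material feel spongy, a stiffer one makes it feel rigid, which is one concrete sense in which a haptic interface conveys the "character" of something the operator never actually touches.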