Back in the good ol' days before the internet and Yo and Meerkat and Snapchat, we had one way to talk to people in faraway lands: the telephone. But a microphone and speaker aren't much use if you're hearing impaired.
In 1979, a Bell Labs research project devised a way of communicating in sign language using just the bandwidth of a single phone line. Thirty-five years ago, video chat wasn't feasible for most users, so the challenge was to distil sign language down to something that could be encoded and sent over that one very limited line.
The solution was to place 27 retro-reflective dots on the hands and face of each individual, and then point a bright light and a camera at the signer. The dots — and only the dots — were detected by the transmitting device, which could then send the positional information to a screen at the other end. The researchers, Kenneth Knowlton and Vivien Tartter, outlined their achievement in this paper in Nature.
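To get a feel for how little data 27 tracked points represent, here's a minimal sketch in Python. It's my own illustration, not the paper's actual encoding: assuming each dot is reduced to an (x, y) pair quantized to one byte per axis, a full frame fits in 54 bytes.

```python
import struct

# Hypothetical sketch: pack one frame of 27 tracked dots into a compact
# byte payload. The dot count matches the article; the one-byte-per-axis
# quantization and wire format are illustrative assumptions, not the
# encoding Knowlton and Tartter actually used.

NUM_DOTS = 27

def pack_frame(dots):
    """dots: list of 27 (x, y) pairs, each coordinate normalized to [0.0, 1.0]."""
    if len(dots) != NUM_DOTS:
        raise ValueError("expected exactly 27 dot positions")
    payload = bytearray()
    for x, y in dots:
        # Quantize each normalized coordinate to a single byte (0-255).
        payload += struct.pack("BB", int(x * 255), int(y * 255))
    return bytes(payload)

def unpack_frame(payload):
    """Inverse of pack_frame: recover approximate (x, y) pairs at the receiving end."""
    coords = struct.unpack("BB" * NUM_DOTS, payload)
    return [(coords[i] / 255, coords[i + 1] / 255)
            for i in range(0, len(coords), 2)]

if __name__ == "__main__":
    # Fake frame of dot positions, just to exercise the round trip.
    frame = [(i / NUM_DOTS, (i * 7 % NUM_DOTS) / NUM_DOTS) for i in range(NUM_DOTS)]
    wire = pack_frame(frame)
    print(len(wire), "bytes per frame")  # 54 bytes = 432 bits
    restored = unpack_frame(wire)
```

Even at a modest frame rate, positions like these amount to far less data than video of the same scene, which is the whole point of tracking dots instead of pixels.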
As well as being an impressive technological feat — remember, they pulled this off before the Walkman was a thing — it's also a good lesson in designing technology around the user. It would have been easier to force hearing-impaired users to rely on written communication, or to learn some other, more machine-friendly sign language. Instead, Knowlton and Tartter found a solution with zero learning curve and total user-friendliness. Better than I can say for certain things today. [AT&T]