We’re so used to seeing megapixels these days that we don’t appreciate how even a few pixels can make the difference between ‘seeing’ and being blind. I recently got the chance to try a tongue display—a device that literally displays images to the tongue using small currents—at the University of Wisconsin–Madison and at Wicab Inc. If you experimented when you were a kid (!) it’s a little like putting a 9V battery on your tongue: or rather, like 100 9V batteries.
At the university, I used the display to follow a path set by a computer. Arrows and other symbols or pictograms were ‘shown’ to me via my tongue to tell me what to do, and I carried out the instructions using a joystick. I could only tell how well I had done (pretty well!) by moving some windows aside to look at the path I had taken after the fact. Although it may look like I’m staring intently at the screen in the picture, I actually can’t see anything on it that would help me with the task.
As impressive as this was, using the display to show me my immediate environment was much more interesting. Here, the display is connected to a camera mounted on a pair of glasses that cover your eyes. After an initial training period looking at white lines and shapes against a black background, I was allowed to wander around the Wicab offices and an empty warehouse space. With my head continually scanning back and forth and up and down—a natural response when looking at the world with such a narrow field of view, and at such low resolution—I was able to walk around without bumping into walls (more or less), could follow a path defined by lines on the ground, and could even see objects floating in space.
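The pipeline is roughly: camera frame in, a coarse grid of stimulation levels out, one level per electrode on the tongue. Here is a minimal sketch of that idea; the 10×10 grid size, the block-averaging, and the linear brightness-to-current mapping are my own assumptions for illustration, not Wicab’s actual processing.

```python
import numpy as np

def frame_to_electrode_grid(frame, grid=(10, 10)):
    """Block-average a grayscale camera frame down to a coarse grid;
    each cell becomes one tactile 'pixel' on the tongue."""
    h, w = frame.shape
    gh, gw = grid
    # Crop so the frame divides evenly into grid cells.
    frame = frame[:h - h % gh, :w - w % gw]
    cells = frame.reshape(gh, frame.shape[0] // gh,
                          gw, frame.shape[1] // gw)
    return cells.mean(axis=(1, 3))

def grid_to_amplitudes(grid, max_level=1.0):
    """Map mean brightness (0-255) to a normalized stimulation level.
    (A linear mapping is an assumption; real devices likely calibrate.)"""
    return np.clip(grid / 255.0, 0.0, 1.0) * max_level

# A bright vertical line on a dark background, like the training shapes.
frame = np.zeros((100, 100))
frame[:, 48:52] = 255.0
amps = grid_to_amplitudes(frame_to_electrode_grid(frame))
print(amps.shape)  # (10, 10): one current level per electrode
```

Even this toy version makes the resolution trade-off vivid: a whole camera frame collapses into a 10×10 pattern of tingles, which is why the constant head-scanning described above becomes necessary.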
Apparently I did better than most people on my first try, which my trainer put down to my having done the navigation demo the day before. Even with that caveat, I was very surprised at how much I got out of it. Three things struck me the most. First, after just a few minutes the experience starts to feel very ‘visual’: when I look back on it, I don’t remember my tongue tingling, but rather the pictures displayed, as if I had seen them through my eyes. Second, I had been told that, after training, people could tell the difference between a baseball and a tennis ball. Although I didn’t get to that point, by the end of the demonstration I was convinced that this was true.
Finally, before I used the device, I was skeptical: surely 100 pixels’ worth of image really wasn’t worth the effort, even to someone blind. I stand corrected.
Photo, top: the Tongue Display Unit, version 1. The saliva in our mouths helps it make excellent electrical contact.
Photo, centre: Navigating around a virtual maze by accepting instructions from the display. I’m ‘looking’ at the pictures inside my head that are generated by the tongue display: the maze is hidden by another window on the screen.
Photo, bottom: Walking around the Wicab warehouse, staying on the path defined by the lines on the floor.
Originally posted on Brains and Machines.