A couple of years ago I wrote about an idea I had for visualizing the implicit heat maps in Wifi signal strength using actual heat.
I never built the device, but I kept thinking about the ideas it was pointing at, generalized them into an observation I called "new data for old senses," and wrote some notes that I never shared here. Today PT at Makezine posted a link to a project along the lines I was thinking about: a Wifi sensor that uses vibration to give you a sense of the Wifi landscape around you without having to look at anything, which was the crux of my idea in 2005. So, since the idea is now out there, here are my notes:
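A minimal sketch of the vibration idea: map a received signal strength reading (RSSI, in dBm) onto a 0-1 vibration intensity, then onto a buzz pattern. The RSSI bounds and the pulse scheme are my assumptions for illustration, not anything from the linked project.

```python
def rssi_to_intensity(rssi_dbm, floor=-90.0, ceiling=-30.0):
    """Linearly map an RSSI reading (dBm) to a vibration intensity in [0, 1].

    The bounds are assumed: -90 dBm (barely usable signal) maps to 0,
    -30 dBm (very strong) maps to 1. Readings outside the range are clamped.
    """
    intensity = (rssi_dbm - floor) / (ceiling - floor)
    return max(0.0, min(1.0, intensity))


def pulse_pattern(intensity, period_s=1.0):
    """Turn an intensity into one on/off cycle: stronger signal, longer buzz.

    Returns (on_time, off_time) in seconds, which a hypothetical motor
    driver could play back in a loop as you walk around.
    """
    on = intensity * period_s
    return on, period_s - on
```

With these assumed bounds, a middling reading of -60 dBm lands at intensity 0.5, i.e. a half-second buzz out of every second.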
I'm interested in the idea of using senses that don't normally get used for device communication as secondary display channels. This is a way to open up what Jon Udell calls the vast middle ground between devices that demand our full attention and those that demand none.
We have more senses than sight and sound, which are channels already crowded with important information. So how do we use our "secondary" senses to communicate "secondary" information?
What are other kinds of senses and other kinds of data we can use?
Here are the somatic senses (thanks, Google!):
- touch (the feeling of pressure and vibration)
- thermoception (the feeling of temperature)
- nociception (the feeling of pain)
- proprioception (the feeling of joint movement)
What to visualize? Liz has been doing a bunch of stuff about visualizing people's relationship with the RF spectrum and geography, but I've been thinking that there are several granularities that would change in perceptible and interesting ways. At human scale in a city there's Wifi strength; at car scale there are things like crime maps, and at airplane scale there are political boundaries (voting records, natural phenomena).
The bottom line is:
How can we introduce secondary information into people's awareness in a secondary way, using their less-used senses and without adding cognitive noise to the primary channels of sight and sound?