One likely method of interacting with our interfaces will be a puck that works like a mouse but is held in the hand and moved in three-space. This can be emulated today (July 2013) with an iPhone or, better, an iPod touch. One app that facilitates this is Mobile Mouse, which I can attest is quite handy.
In the future we may see a number of these devices, with people interacting with their phones, pads, and computers in much the same way. (Speech may well supplant typing for many uses.)
I'd like to envision a combination of a touchpad on the back of such a device plus the layered conventions we are seeing in the beta of iOS 7.
As for the device itself, Google filed a patent for something like this on the back of phones, as application 13/593,117. It isn't quite what we envisioned, but it was written broadly. Apparently this is an old idea, and a patent has already been refused. The abilities that interest me are:
- the ability to use pressure (or multiple taps) to move along the z-axis. Suppose we have several planes in an iPhone interface, each with its own selection affordances: you could tap on the back of the device to expand the layers, then press (or multi-tap) to choose one of them.
- then, the ability to move a cursor around, as on a normal trackpad, to select or control the interface at the chosen layer.
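The interaction above can be sketched as a small state machine: a back tap fans out the layer stack, pressure (or a tap count) picks a depth, and subsequent drags move a cursor within the chosen plane. The sketch below is purely illustrative; the class and method names are invented, and no real iOS API is implied.

```python
class LayeredInterface:
    """Hypothetical stack of UI planes selectable along the z-axis
    via taps and pressure on the back of the device."""

    def __init__(self, layers):
        self.layers = layers          # front-to-back list of layer names
        self.expanded = False         # are the planes fanned out?
        self.active = 0               # index of the currently selected plane
        self.cursor = (0.0, 0.0)      # normalized cursor position on the plane

    def back_tap(self):
        """A single tap on the back fans out the layer stack."""
        self.expanded = True

    def back_press(self, pressure):
        """Pressure in [0, 1] picks a depth: a harder press selects a
        deeper layer. (A multi-tap count could substitute for pressure.)"""
        if not self.expanded:
            return None
        depth = min(int(pressure * len(self.layers)), len(self.layers) - 1)
        self.active = depth
        self.expanded = False         # collapse back to the chosen plane
        return self.layers[self.active]

    def back_drag(self, dx, dy):
        """Trackpad-style cursor movement within the active layer,
        clamped to the plane's bounds."""
        x, y = self.cursor
        self.cursor = (min(max(x + dx, 0.0), 1.0),
                       min(max(y + dy, 0.0), 1.0))
        return self.cursor


ui = LayeredInterface(["notifications", "home", "settings"])
ui.back_tap()          # expand the planes
ui.back_press(0.9)     # a hard press selects the deepest layer
ui.back_drag(0.3, 0.2) # then move the cursor on that plane
```

The point of the sketch is only that the two gestures compose cleanly: depth selection and in-plane pointing are independent axes of the same back-of-device surface.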
A concept like this is implied in our design for a video annotator.
Some early observations on that are here. I think it is revolutionary.