New tracking technologies already promise a future where nothing is ever lost, and researchers are taking the idea one step further. Why spend time following sound or visual cues when electrical muscle stimulation can simply turn your head toward a lost item? That's not frightening at all.
Apple's implementation of using AirTags to find a lost item, like a set of keys, is quite clever. Connected mobile devices provide on-screen visual cues indicating which direction the lost item is in and how far away it is. But an animated arrow on a screen isn't always an accurate way to find a missing item: it doesn't indicate whether the item was left on top of a fridge or accidentally kicked underneath it. AirTags also chirp to give searchers extra clues to the trackers' exact location. But what if there were an even more straightforward approach?
The Human Computer Integration Lab at the University of Chicago proposes a completely different idea. Remember the scene in Jurassic Park where Alan Grant has to physically turn Ellie Sattler's head to get her to see the brachiosaurus? If you replaced Alan Grant's hand with probes that stimulated the neck muscles of Laura Dern's character, you would more or less understand what these researchers are proposing.
In a paper published for the Conference on Human Factors in Computing Systems (CHI), which is currently taking place in New Orleans, the researchers detail an approach they call Electrical Head Actuation, in which the visual and auditory cues normally used to help locate an object are replaced with electrical muscle stimulation that physically turns the user's head toward it.