As part of IBM's "5 in 5" list of predictions, the company says that "mind reading" (more like mind control) will no longer be a science fiction dream and that within five years, we'll all be controlling our computers and smartphones just by wiggling our brains.
While Apple focuses on speech technology with Siri, IBM believes the next revolution will involve our brains. To make mind control a reality, we'd all need to wear something like Emotiv's EPOC neuroheadset, which is equipped with sensors that read electrical brain signals.
According to IBM Research News, "the idea is to use these electrical synapses to also do everyday activities such as placing a phone call, turning on the lights or even in the healthcare space for rehabilitation."
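To make the idea concrete, here's a minimal, purely hypothetical sketch of the kind of mapping IBM describes: take a raw signal reading from a headset and translate it into an everyday action. The function name, thresholds, and command labels are all invented for illustration; a real headset SDK works very differently.

```python
# Hypothetical sketch: map a raw "brain signal" amplitude to a device
# command via simple thresholding. Thresholds and labels are invented.

def classify_signal(amplitude_uv: float) -> str:
    """Map a signal amplitude (in microvolts) to a made-up command."""
    if amplitude_uv > 50.0:
        return "place_call"      # strong, deliberate signal
    elif amplitude_uv > 20.0:
        return "toggle_lights"   # moderate signal
    return "idle"                # background noise, do nothing

# Simulated stream of sensor readings
readings = [5.0, 22.5, 61.0, 18.0]
commands = [classify_signal(r) for r in readings]
print(commands)  # ['idle', 'toggle_lights', 'place_call', 'idle']
```

In practice, distinguishing "call my mother" from "turn on the lights" takes far more than a threshold, which is part of why the five-year timeline is ambitious.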
That's great news, but we're not quite ready for mind reading yet. Devices like the EPOC are still kind of bulky and make you look like a dorky cyborg.
The magic will happen when the sensors get miniaturized.
By 2017, like most technology, the EPOC and similar devices will probably get smaller. I can imagine a version with completely dry sensors that I'd wear all the time, perhaps embedded in a baseball cap, detecting a finer range of thought patterns and connected directly to my mobile phone, letting me interact with the world just by thinking particular thoughts. I could wonder what the traffic will be like on the way home, and that information would pop up in front of me.
Voice control is really coming into its own with Kinect, Siri and Google/Bing search, but I'm skeptical that mind control will become real in five years. I've played with "mind concentrating" gear like Mattel's Radica Mindflex Duel game, and it didn't exactly blow my mind. It barely worked. So excuse me if I'm not convinced that mind control/mind reading will be functional within the next few years.
There's no doubt the technology will eventually reach everyday users, but I think that's still a distant dream.