The world is changing so fast that sometimes it is hard to follow, but people get used to each change within a few years and are always waiting for the next fabulous innovation. With the big boom of slate devices, we have noticed that people cannot go outside without them: they want to stay in touch at all times with friends, news, business... Slates are small devices that people can bring anywhere and use anytime. Recently a news broadcast reported that people now bring their slates to the beach during their holidays! Wow, is that a good thing or not? I would say it depends on the use you make of it. If you bring it to the beach just to check your email or similar things, I would say you could leave it in your apartment instead and take a break for that short period of the year, when you can think of other things. If you are using it to read a book, why not, but your slate has a much higher chance of being stolen or broken at the beach than a real book of your choice. These are different scenarios I recently heard about on TV: people now spend much more time in front of their slate screens, waiting for whatever it is they are expecting during the day.
Touch screens are part of our lives
Today people are so used to touch screens that they naturally know what to do with them, as a matter of habit. Recently I was in a computer shop looking for a specific device when I noticed a child come up next to me and start exploring the computers in front of me too. He went up to a model of his choice and tried to slide the content, but it was not working, so he tried again. I could see his sad face, and then he went to another device and repeated the same gesture until he succeeded. All this to say that this child, like all the others, has a first instinct to touch, and it is natural.
With so much focus placed on smartphones, touch and gesture control and their evolution, what will be the next big development in how people use digital devices?
Check out this nice article from Alex Hudson, Touchscreens “a small step” in innovation, if you are curious and want to know more.
A bit of history
More than 20 years ago, I got my first job as an HMI (Human Machine Interface) developer and designer in the industrial world. My goal was to build touch user interfaces to control machinery in the pipe and cable industry (already running under Windows 3.11 and C++). At that time touch technology was already there, present in devices purely dedicated to industry. The first technology I used was based on a screen frame made of IR LEDs. Then, a few years later, came capacitive and resistive touch screens. Could you have imagined yourself 10 years ago interacting with your friends around a table that responds to your touch, able to collaborate on different content?
If we look at today, it is definitely not the touch technology itself which is new and important to note, simply because it has now been introduced to the public worldwide, but rather the way we are using it in our daily lives. Apple has of course been a real actor in bringing that technology into the home, but behind that, Microsoft was working on the first version of Surface, taking the technology to another level. With touch technology, not to say multi-touch, we enter deeper into the NUI world, which is in permanent motion. Touch is everywhere; multi-touch is everywhere. Screen manufacturers have understood that they must be present in this world, have started to implement touch on every kind of screen, and are investing a lot while trying to be innovative at the same time (not that easy, I would say).
But wait a moment: who says that touch can only be on screens? Actually it can be, and it will be, on any type of surface.
For those who wonder what the world could be, or what the world we are going to live in will be, here is the link to Corning's vision of a world of glass:
Touch is also being adapted for non-screen applications. For example, Microsoft is working on a touch interface called “Skinput” that allows users to interact by tapping their own skin.
I think (I hope I am not wrong, and do not hesitate to correct me if I am) that Nintendo's Wii Remote showed the way for interaction with a simple remote control, and that it was the start of a lot of different ideas from different companies.
Gesture recognition is the way to track user motions and translate those movements into instructions. The Nintendo Wii and PlayStation Move motion gaming systems work through controller-based accelerometers and gyroscopes that sense tilting, rotation and acceleration. A more intuitive type of NUI is outfitted with a camera and software in the device that recognizes specific gestures and translates them into actions. Microsoft’s Kinect, for example, is a motion sensor for the Xbox 360 gaming console that allows users to interact through body motions, gestures and spoken commands. Kinect recognizes individual players’ bodies and voices. Gesture recognition can also be used to interact with computers.
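To make the controller-based idea a bit more concrete, here is a tiny toy sketch (my own illustration, not Nintendo's or Sony's actual code) of how raw accelerometer samples can be turned into instructions: a "shake" is detected when the acceleration magnitude repeatedly spikes above a threshold, and a tilt angle is estimated from the direction of gravity. The threshold and sample values are invented for the example.

```python
import math

GRAVITY = 9.81          # m/s^2, magnitude of a resting sample
SHAKE_THRESHOLD = 15.0  # magnitude above which we count a spike (assumed value)

def magnitude(sample):
    """Euclidean norm of an (x, y, z) accelerometer sample."""
    x, y, z = sample
    return math.sqrt(x * x + y * y + z * z)

def detect_shake(samples, min_spikes=3):
    """Return True if enough samples spike above the threshold."""
    spikes = sum(1 for s in samples if magnitude(s) > SHAKE_THRESHOLD)
    return spikes >= min_spikes

def tilt_angle(sample):
    """Angle in degrees between the device's z axis and gravity."""
    _, _, z = sample
    cos_a = max(-1.0, min(1.0, z / magnitude(sample)))
    return math.degrees(math.acos(cos_a))

# Resting flat: gravity only on z -> no shake, ~0 degrees of tilt.
resting = [(0.0, 0.0, 9.81)] * 10
# Vigorous motion: large spikes on the x axis.
shaking = [(20.0, 0.0, 9.81), (-18.0, 0.0, 9.81), (22.0, 0.0, 9.81), (0.0, 0.0, 9.81)]

print(detect_shake(resting))                # False
print(detect_shake(shaking))                # True
print(round(tilt_angle((0.0, 0.0, 9.81))))  # 0
```

Real controllers of course fuse gyroscope data and filter out noise, but the principle is the same: raw motion numbers in, discrete instructions out.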
The Kinect, and this is only my own opinion, is still considered by the majority as a toy for simple, fun interaction. Some effort is still needed for it to be converted into real business added value, but I am sure it will come. There are actually some areas where Kinect could bring added value: think, for instance, of the medical sector, where doctors' hands, once washed, must not touch any other surface in order to avoid bacteria in clean rooms.
It has been a long time since I first heard about speech recognition, and I think this technology is one of the most difficult ones: not only because of translating words into actions, but also because of taking into account your voice variations and the learning curve. Let's imagine a funny thing. You have just finished installing at your home's front door a voice recognition device which replaces your traditional key lock or fingerprint reader. You test that everything works well, and you are really happy that, once you pronounce your name or some other word, your door opens as if by magic. Unfortunately, a few days later you get sick and try to get into your home the same way, but this time the system does not recognize you. Houston, we have a problem. We are starting to see speech recognition arrive on some smartphones and even the Kinect, but having tried it on the latest iPhone, for instance, I cannot say it is really usable; it is more of a gadget so far.
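The sick-voice failure can be sketched in a few lines. Speaker verification systems typically compare a feature vector extracted from the voice against an enrolled template and accept only above a similarity threshold; every number below is made up purely for illustration, not taken from any real system.

```python
import math

THRESHOLD = 0.95  # acceptance threshold (assumed value)

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def unlock(enrolled, attempt):
    """Open the door only if the attempt is close enough to the template."""
    return cosine_similarity(enrolled, attempt) >= THRESHOLD

enrolled = [0.8, 0.6, 0.1, 0.4]      # template recorded at enrollment
healthy  = [0.79, 0.61, 0.12, 0.38]  # same speaker, normal voice
sick     = [0.5, 0.9, 0.4, 0.1]      # same speaker, hoarse voice

print(unlock(enrolled, healthy))  # True  -> the door opens
print(unlock(enrolled, sick))     # False -> Houston, we have a problem
```

The hoarse voice shifts the features enough to fall below the threshold, which is exactly the trade-off the door-lock scenario runs into: lower the threshold and impostors get in, raise it and a cold locks you out.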
Eye tracking allows users to guide a system through eye movements. I recently saw a TV broadcast showing a company that was monitoring pilot customers with an eye tracking system: how customers selected a product from the store, what they looked at first, and where their eyes rested most of the time. This type of analysis helps the marketing team arrange the way products are presented in the store.
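At its core, the retail analysis described above is just aggregation: bucket raw gaze points into coarse zones of the shelf and rank the zones by how often the eyes landed there. Here is a minimal sketch of that idea; the zone size and the sample coordinates are invented for the example.

```python
from collections import Counter

ZONE_SIZE = 100  # pixels per square grid zone (assumed value)

def zone_of(point):
    """Map an (x, y) gaze sample to a coarse grid cell."""
    x, y = point
    return (x // ZONE_SIZE, y // ZONE_SIZE)

def dwell_ranking(gaze_points):
    """Rank grid zones by how many gaze samples landed in each."""
    counts = Counter(zone_of(p) for p in gaze_points)
    return counts.most_common()

# Fake recording: the customer's eyes linger mostly on zone (1, 0).
samples = [(150, 40), (160, 55), (140, 70), (320, 90), (155, 60), (10, 210)]
print(dwell_ranking(samples))  # most-looked-at zone first: [((1, 0), 4), ...]
```

A real eye tracker adds calibration and fixation detection, but the marketing output, a ranked heat map of attention, comes from exactly this kind of counting.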
For UX interaction with eye tracking, I will point you to a nice article from UX Magazine.
Touch interaction on any surface: it's just the beginning
A few days ago I came across an interview in which Andy Wilson of Microsoft Research (a founding member of Surface v1) describes what the world of tomorrow could be in touch interaction. He talks about our future kitchen and the way we could interact with our home devices, but more generally about having touch projection on any surface. Check out the video and the prototype of what you could have integrated into your next shirt buttons.
The future holds a lot more cool stuff, but let's step back down to earth and dream, for now, about what tomorrow could be. Some might be frightened, others excited. There is so much to say about each of these technologies, but my intention was not to write a novel, simply to drop a few words about them.
It reminds me of every time I show my kids an old video tape and they simply ask me:
“Daddy what’s that…”