Often known as 'UI' for short, a User Interface is, in essence, the instrumentation through which a person controls a piece of software or hardware. An exceptional user interface is one that offers a 'user-friendly' experience, that is, one that allows a person to communicate with the hardware or software in an intuitive manner. When it comes to user interface devices, the common names that come to mind are the keyboard, the mouse, and so on. But recently I came across a video showcasing a probable vision of the next level of User Interface, and trust me, it was awe-inspiring! Finally, the things I saw in movies like Mission: Impossible, along with other innovative ideas, are gradually coming true, and above all, they have started making sense to me. So, I am taking this opportunity to discuss the next level of 'User Interface (UI)' with you all, backed by solid research, and I am hoping for your active participation :)
Future UI: Mere vision or the next 'big thing'?
After watching the video, I could easily conclude that the next level of User Interface will be all about the absence of mice and keyboards for interacting with our systems. Now, how can we interact without any device to give commands? The answer: gestures, voice recognition, and many more to come! Nearly all software available today has a Graphical User Interface, also known as a GUI. A Graphical User Interface includes elements like the toolbar, menu bar, windows, and several other controls. The user interface differs between operating systems; for example, the UI in Microsoft Windows differs from that of a Macintosh. However, they do have certain elements in common, like the desktop, windows, icons, etc. It is because of these common elements that we have little trouble switching between operating systems. In simple words, a user interface can be considered a medium through which messages are transmitted between a user and a machine, and it mostly takes the form of software installed on a machine. Such machines include popular names like smartphones, iPads, PDAs, and many more.
The demand for next-generation user interfaces arose with the advent of next-generation devices like smartphones, PDAs, iPads, and e-book readers. Browsing the internet on a smartphone is never easy because of the small screen, and interfaces like a stylus or a virtual keyboard make things even more difficult. Things become more complex still when you have to operate these devices in crowded places. New user interfaces based on hand gestures or voice recognition can make things much easier. If included in modern devices, these interfaces could simplify our lifestyle considerably, thereby re-designing the human-machine interface. For instance, you could surf from one television channel to another simply by using a hand gesture or a voice command. Next-generation user interfaces will be oriented towards creating an interactive medium that successfully eliminates the communication gap between man and machine.
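To make the TV example above concrete, here is a minimal sketch of how recognized gestures or voice commands might be dispatched to device actions. The command names, gesture labels, and the simple channel model are all my own assumptions for illustration; a real system would sit behind an actual recognizer.

```python
# Hypothetical sketch: dispatching recognized gestures or voice
# commands to television actions. All names here are invented.

class Television:
    def __init__(self, channels=100):
        self.channels = channels
        self.current = 1

    def channel_up(self):
        # Wrap around from the last channel back to 1.
        self.current = self.current % self.channels + 1

    def channel_down(self):
        # Wrap around from channel 1 to the last channel.
        self.current = (self.current - 2) % self.channels + 1

def handle_command(tv, command):
    """Map a recognized gesture or voice command to a TV action."""
    actions = {
        "swipe_right": tv.channel_up,       # hand gesture
        "swipe_left": tv.channel_down,      # hand gesture
        "next channel": tv.channel_up,      # voice command
        "previous channel": tv.channel_down,
    }
    action = actions.get(command)
    if action:
        action()          # unknown commands are simply ignored
    return tv.current

tv = Television()
handle_command(tv, "swipe_right")   # channel 1 -> 2
handle_command(tv, "next channel")  # channel 2 -> 3
```

The point of the sketch is that once a recognizer turns a gesture or an utterance into a discrete command, wiring it to the device is just a lookup table: the hard part is the recognition, not the dispatch.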
Let me tell you about the latest technological advancements that may interest you in terms of next-generation User Interfaces:
Intel is in talks with 'Nuance' to develop a voice control function for the former's new laptop ranges. If reports are to be believed, this voice application will be able to grasp a user's accent easily and function accordingly. Impressive, isn't it? But that's not all: Samsung is about to unveil a product that features voice and motion control along with face recognition technology. Apple is already earning rave reviews for 'Siri', the voice recognition application in its smartphones and upcoming product lines. Eye-tracking specialist 'Tobii' recently unveiled its eye-gazing technology for laptops, which may soon eliminate the need for a mouse. These recent innovations from the tech giants make me feel that soon we are going to miss our mice and keyboards, as we are about to take a giant leap into a next-generation user interface that has no physical limitations.
Technology behind gesture-control User Interfaces
Well, I was so engrossed in exploring the technology behind motion control devices that I went through all sorts of technology news until I finally found what I was looking for: the technology behind these marvelous UI devices. The first technology behind next-generation user interface devices is 'Electromyography' (EMG), which senses the electrical muscle activity produced by finger gestures. Another technology still in the making is the use of 'bio-acoustic sensors', which can detect the transmission of energy through the body. Both of these technologies are activated by wearing an armband on the upper forearm; the armband then sends wireless signals to a computing device. The difference between this next-generation User Interface technology and traditional mice and keyboards is that the latter rely on physical transducers to capture the natural dexterity of hands and fingers, whereas a computer-muscle interface derives its input signals from gestures identified by an 'Electromyographic' recognizer.
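To give a feel for how an EMG-based recognizer might work, here is a toy sketch that maps the amplitude of muscle activity on a couple of armband channels to gesture labels. The channel names, thresholds, and gesture set are all my assumptions for illustration; real computer-muscle interfaces use trained pattern recognizers over many electrodes, not fixed thresholds.

```python
# Toy sketch of EMG gesture classification. Channel names
# ("flexor", "extensor"), the 0.5 threshold, and the gesture labels
# are invented for illustration only.

def rms(samples):
    """Root-mean-square amplitude, a common measure of EMG activity."""
    return (sum(s * s for s in samples) / len(samples)) ** 0.5

def classify_gesture(channels, threshold=0.5):
    """Guess a gesture from the RMS activity of each armband channel.

    `channels` maps a hypothetical sensor name to its raw samples.
    """
    activity = {name: rms(samples) for name, samples in channels.items()}
    active = {name for name, level in activity.items() if level > threshold}
    if {"flexor", "extensor"} <= active:
        return "fist"          # both muscle groups contracting at once
    if "flexor" in active:
        return "wrist_flex"
    if "extensor" in active:
        return "wrist_extend"
    return "rest"

reading = {"flexor": [0.9, -0.8, 1.0], "extensor": [0.05, -0.02, 0.03]}
classify_gesture(reading)  # -> "wrist_flex"
```

The idea mirrors the paragraph above: the armband's electrodes deliver raw electrical activity, and a recognizer turns that activity into discrete gestures that replace the clicks and keystrokes a physical transducer would have produced.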
The only apt way to find out whether this technology will be implemented successfully is to play the waiting game! And looking at the technological advancements we have today, I feel that very soon the User Interface (UI) will be re-defined!
Do share your comments!