

The Future of UIs

by Mariano Perez Chaher

Technology has evolved in ways that few people could have imagined 20 years ago. Nowadays, walking down the street, you will find that almost everyone has a smartphone. Even children who cannot yet speak can operate the simple interface that is the touchscreen of a tablet.

However, the reason I started writing this article is that a tragic event happened to me last week. As I was getting onto my bike to go to Zernike, I heard a crashing sound against the concrete. It was the phone I had got exactly one year ago. What made this fall different from every other was that part of the screen shattered and half of the touchscreen was no longer responsive.

So, when I tried to reply to someone’s message and couldn’t, because I had to tap exactly on the broken part of the screen, I thought to myself: “If only the controls were not touch-based, and sensors detected my fingers from a distance without depending on the fragile glass screen, I would still be able to use my phone.” I remembered hearing about new user-interface technologies being developed, and I started doing some research.

The first one I had heard of is the Leap Motion Controller, a really small device that can be connected to any PC or Mac and lets the user control the computer with hand gestures inside the device’s detection range. The project started in 2008, and it has now become a commercial device for developers to experiment with, besides being used for games (which is also fun).

The Leap Motion Controller connects to the computer over a simple USB cable, and through two infrared cameras and three infrared LEDs it detects the user’s hands in 3D space and sends the data to the computer. The LEDs emit non-visible (infrared) light, which reflects off the user’s hands and is picked up by the cameras as the system’s input.
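To get a feel for what the computer does with that data, here is a minimal sketch, not the actual Leap Motion SDK, of how a tracked 3D hand position might be mapped onto 2D screen coordinates to drive a cursor. The interaction-box bounds and screen size are invented example values.

```python
# Hypothetical example: mapping a 3D hand position (reported in millimetres
# inside an "interaction box" above the sensor) to pixel coordinates.
# The box bounds and screen resolution are made-up values, not Leap specs.

def hand_to_screen(pos_mm, box_min, box_max, screen_w, screen_h):
    """Map a 3D hand position (x, y, z in mm) to pixel coordinates.

    x spans the screen horizontally, y (height above the sensor) spans
    it vertically; z (distance toward the user) is ignored here.
    """
    x, y, _ = pos_mm
    # Normalise x and y to the [0, 1] range inside the interaction box.
    nx = (x - box_min[0]) / (box_max[0] - box_min[0])
    ny = (y - box_min[1]) / (box_max[1] - box_min[1])
    # Clamp so the cursor stays on screen even if the hand leaves the box.
    nx = min(max(nx, 0.0), 1.0)
    ny = min(max(ny, 0.0), 1.0)
    # Screen y grows downward, so flip the vertical axis.
    return (int(nx * (screen_w - 1)), int((1.0 - ny) * (screen_h - 1)))

# A hand centred in a box spanning x in [-120, 120] mm, y in [80, 320] mm
# lands in the middle of a 1920x1080 screen:
print(hand_to_screen((0.0, 200.0, 50.0), (-120, 80), (120, 320), 1920, 1080))
```

The clamping step matters in practice: without it, a hand drifting out of the sensor’s range would send the cursor off-screen instead of stopping at the edge.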

 

With my own eyes I have seen this technology make a 100 kg prototype robot stop moving, rise, and lower, all by moving a hand in front of the robot’s sensors. The device also contributes to a better virtual-reality experience, as the user’s hands appear directly in the virtual scene. However, not everything is perfect: the controller only works within its own applications, not across the computer’s OS, so its functions are limited, for now.

The second UI appears more promising, as it is being developed directly by Intel, the company that produces most of the processors in our computers. This means the technology may be built directly into the computer rather than being a third-party device, which would allow for a more complete experience.

This technology is called “Intel RealSense”, and it is essentially a camera that can perceive 3D objects: its two cameras detect how deep a space is, letting it distinguish objects the way our eyes do. It also has infrared cameras to detect input, as the Leap Motion does.
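The two-camera idea rests on a classic geometric principle: a point seen by two side-by-side cameras shifts horizontally between the two images (the “disparity”), and the nearer the point, the larger the shift. A back-of-the-envelope sketch of that relation follows; the focal length and camera separation are invented example values, not RealSense specifications.

```python
# Stereo depth principle: depth = focal_length * baseline / disparity.
# focal_px is the camera focal length in pixels and baseline_m is the
# distance between the two cameras in metres -- both are example values.

def depth_from_disparity(disparity_px, focal_px=700.0, baseline_m=0.05):
    """Depth in metres from the pixel disparity between the two images."""
    if disparity_px <= 0:
        # Zero disparity would mean the point is infinitely far away
        # (or not matched between the two views at all).
        raise ValueError("point not visible in both cameras")
    return focal_px * baseline_m / disparity_px

# With these example parameters, a 35-pixel shift puts a point 1 m away,
# and halving the distance doubles the shift:
print(depth_from_disparity(35.0))
print(depth_from_disparity(70.0))
```

This is also why depth precision degrades with distance: far-away points produce disparities of only a pixel or two, so small matching errors translate into large depth errors.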

These Intel cameras are already on the market, and some laptops are shipping with RealSense built in, which sounds really promising for the future. But you may ask yourself, “why do I need a camera that can detect depth?” Well, industries could scan 3D objects and share them, which would make some architecture and engineering jobs much easier. Unfortunately, I don’t believe this technology can yet completely replace our touchscreens and mice: it probably lacks accuracy in detecting our movements, and it is not fully integrated into computer operating systems yet.

 

Therefore, I am writing this article to raise people’s interest and awareness of a technology that could once again completely change how we interact with computers, because I believe its adoption will begin when the most influential companies, such as Microsoft and Apple, start to notice it. Also, I want to look as cool as Iron Man does when he waves his hand around the room. For now, though, I will have to go to a store and see if I can repair my phone, or switch to the indestructible old Nokia 3310; but given its lack of WhatsApp, I don’t think that will happen.

 

Leap Motion: www.leapmotion.com

Intel RealSense: http://www.intel.com/content/www/us/en/architecture-and-technology/realsense-overview.html

 

