Instant Expert: Natural User Interfaces
Sometimes I feel like throwing my hands up in the air…
It’s not just your hands. With natural user interfaces, your entire body could be the controller. Have you seen Microsoft’s Kinect?
I have. So the future of computing is Kinectimals?
Not quite. But natural user interfaces – let’s call them NUIs from now on – are the next big thing in everyday computing.
What’s so natural about them?
Most computer interfaces are artificial: when you use a mouse, there's a degree of abstraction between you and the computer – you move the mouse, and the mouse moves the pointer. With NUIs, that abstraction isn't there: to select something, you simply point at it.
Is this the bit where I’m supposed to mention Minority Report?
You can if you like.
Okay then. Minority Report.
That isn’t actually a bad example, although in practice waving your arms about isn’t exactly relaxing. But there are some interactions that are much simpler with natural user interfaces than graphical ones. For example, turning a rotary control by twisting your hand is much easier than fiddling with the mouse to adjust a tiny slider – or worse, trying to adjust an on-screen rotary control with the mouse.
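Under the hood, that twist gesture is simple trigonometry: measure the angle of the line between two fingertips at the start and end of the gesture, and the difference is how far the knob has turned. A minimal sketch (the function and the touch-point format are illustrative, not any particular toolkit's API):

```python
import math

def twist_angle(start_touches, end_touches):
    """Return the rotation (in degrees) implied by a two-finger twist.

    Each argument is a pair of (x, y) fingertip positions. The angle of
    the line joining the two fingertips is measured at the start and end
    of the gesture; the difference is how far the user has turned the
    on-screen 'knob'.
    """
    def angle(touches):
        (x1, y1), (x2, y2) = touches
        return math.degrees(math.atan2(y2 - y1, x2 - x1))
    return angle(end_touches) - angle(start_touches)

# Fingers start level and end one above the other: a quarter turn.
turn = twist_angle([(0, 0), (100, 0)], [(0, 0), (0, 100)])
```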
So this stuff is actually practical?
Very much so. You wouldn’t want to control all of Photoshop with multi-touch, but resizing an image by pinching your fingers or rotating a wireframe by sliding your hands apart is faster and more intuitive than trying to do the same thing with a mouse, even if you’ve been doing it with a mouse for as long as you can remember. Also, as graphics tablet users have known for years, drawing with a pen can be more intuitive than drawing with a mouse. Why not get rid of the mat and draw directly on the screen?
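The pinch gesture itself is just geometry: the ratio of the finger spread at the end of the gesture to the spread at the start gives the resize factor. A rough sketch, with hypothetical names:

```python
import math

def pinch_scale(start_touches, end_touches):
    """Return the scale factor implied by a two-finger pinch.

    Each argument is a pair of (x, y) touch points. Spreading the
    fingers apart gives a factor greater than 1 (zoom in); pinching
    them together gives a factor less than 1 (zoom out).
    """
    def spread(touches):
        (x1, y1), (x2, y2) = touches
        return math.hypot(x2 - x1, y2 - y1)
    return spread(end_touches) / spread(start_touches)

# Fingers start 100 pixels apart and end 200 apart: the image doubles.
scale = pinch_scale([(0, 0), (100, 0)], [(0, 0), (200, 0)])  # → 2.0
```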
Not everybody uses the PC to draw.
No, they don’t, but natural interfaces can do other things too. Browsing through a large asset library, a giant Lightroom folder, a long video file or just a big online shop is much faster if you can do real-world-style flipping – especially if your PC can recognise the force of your gesture and accelerate or decelerate accordingly. And NUIs are particularly good at replicating real-world controls, so for example we’ve seen multi-touch used to great effect in music recording and production: a touch screen makes a pretty good set of sliders for a mixing desk.
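That force-sensitive flipping is typically built as momentum scrolling: the list keeps moving at the flick's release velocity, and a friction factor slows it down frame by frame, so a forceful flick travels further than a gentle one. A minimal sketch (the friction constant is an arbitrary choice, not taken from any real toolkit):

```python
def momentum_positions(start, velocity, friction=0.9, min_speed=1.0):
    """Simulate momentum scrolling after a flick gesture.

    Starting at `start` with `velocity` pixels per frame, the speed
    decays by `friction` each frame until it drops below `min_speed`.
    Returns the position at each frame of the animation.
    """
    positions = []
    pos, v = start, velocity
    while abs(v) >= min_speed:
        pos += v
        positions.append(pos)
        v *= friction
    return positions

fast = momentum_positions(0, 40.0)  # a forceful flick...
slow = momentum_positions(0, 10.0)  # ...travels further than a gentle one
```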
Where does Kinect come into all of this?
To be honest, it probably doesn’t – or at least, it doesn’t in its current gaming-only form. But the idea of a PC that can recognise you and customise itself accordingly, or one that can respond sensibly to voice commands, isn’t that far-fetched.
Voice control? I’ve tried that. It’s rubbish.
Do you mean dictation programs?
Yes. They’re rubbish.
Some of them tried to do too much, certainly: navigating Windows using nothing but speech wasn’t exactly pleasant. But that doesn’t mean voice recognition isn’t any good. Quite the opposite: modern systems are very good indeed, especially when you team them up with decent microphones and audio processing circuits to filter out background noise.
Let me get this straight. You’re telling me that I’ll control my next PC by shouting at it, waving at it and dancing in front of it?
Probably not dancing, no, and you shouldn’t throw your keyboard, mouse, graphics tablet or trackball in the bin any time soon. Natural user interfaces are still in their infancy, and for work tasks they’re more likely to supplement existing input devices than to replace them altogether.
And in the long term?
Yes, Minority Report. Isn’t technology exciting?