Dr. Frank Poole: [playing chess with HAL, Poole studies the chessboard] Let's see, king... anyway, Queen takes Pawn. Okay?
HAL: Bishop takes Knight's Pawn.
Dr. Frank Poole: Huh, lousy move. Um, Rook to King 1.
HAL: I'm sorry, Frank, I think you missed it. Queen to Bishop 3, Bishop takes Queen, Knight takes Bishop. Mate.
Dr. Frank Poole: Huh. Yeah, it looks like you're right. I resign.
HAL: Thank you for a very enjoyable game.
Dr. Frank Poole: Yeah, thank you.
By Erik Christiansen
The quote above is from 2001: A Space Odyssey. If you haven't seen the movie by now, you should. Many science fiction films have tried to conceptualize and recreate what the future of artificial intelligence (AI) would look like: Jarvis (Tony Stark's computer assistant in the Iron Man movies), Bishop (the android from Aliens), and Data (from Star Trek: The Next Generation), to name a handful.
Though the movie was a bit before my time, I feel that 2001: A Space Odyssey came the closest. HAL wasn't a biped. He didn't have a face. He was just a wall ornament: ever present, always listening, always watching. His voice wasn't menacing (until later in the film). He sounded like your psychologist, calm and collected. To me, though, his best feature was that he knew what to do before you asked him. He monitored the spaceship automatically, predicted component failures, and gave free advice. When I first saw the movie, at the age of 6, I was blown away. "This is so much better than our computer," I would say.
It's a cliché to say that humanity is closer than ever to having a HAL-like AI, but we are.
What I didn't expect was that it would come to mobile first, and then make its way to the PC. As I write this, I'm keeping an eye on my Google Now feed, Google's answer to Apple's Siri. Currently Google Now is showing me the weather in Edmonton (24 degrees C), my estimated time home via train, shipping estimates for an Amazon package, stock prices, and movie times (based on my proximity to the closest theater). I may sound like a techno-snob, but you can't understand how essential this is until you've used it daily. And now I can't go back.
This is the new software battleground. Phones and tablets will continue to be rectangular slates with glass fronts for a long time. There are only so many ways to create a touch-based user interface. There are only so many widget and icon combinations. The number of multi-touch gestures we can perform is limited.
The response of many consumers to personal assistants is that they're not perfect, and therefore useless and excessive. That's true to an extent. Google Now doesn't always correctly interpret what I say, especially when I'm in a loud environment. (Arguably, Siri does a better job with voice recognition but lacks some of the useful information.) Sometimes it's faster to type in a search. But personal assistants are so much more than dictation tools. Now that companies like Google, Apple, and Microsoft have this huge pool of data about their customers, they can use it to give us value. Useful information can be predicted. Things I always want to know (like my work commute) shouldn't have to be searched for every time.
Having relied upon Google Now for a full year to give me key information, I haven't really run into a snag. It's remarkably reliable, and I've come to trust it fully. I'm never worried that it will break and leave me stranded when I need it most.
I depend on Google Now and Siri much of the time. When they're not accessible (in a spot with no Wi-Fi), I feel like my devices have lost important functionality. I'm naked without them. They're so important to me (and my colleagues) that a good assistant has become something we look for in future devices, a feature that has reached a level of importance equivalent to battery life and security. I'm living in the future, and I don't want to go back to anything else...