Friday 13 July 2012

Ghost(s) in the Machine(s)


I think I've lost count of the number of times I've told people I'm just waiting for the day I can download my mind into a giant robot & run around wrecking stuff (or, more lately, that we could upload ourselves into a world where we can all be giant robots, or anything else for that matter), but this has always been a "one day" dream; something I might live to see, with luck. Now, however, we may find ourselves on the brink of seeing a whole new form of intelligent life, combining many aspects of these ideas, emerge under our noses (and above our heads) within a handful of years. The thing is, are we ready for it yet?

Even with a direct brain hookup, it's still Pong for one.
- Image: Danshope.com / Wired.com

Last week I read about yet another experiment in which somebody controlled a piece of technology using their mind. Impressive? Yes. New? Not really. Hell, a monkey could do it. No, what was new (for me) was the use of the term "embodying", as the person controlling the robot was able, through cameras & microphones, to see and hear "as the robot". Using this equipment, volunteers were able to use the robot as a proxy, or surrogate, to move about and do things in a room on the other side of the world. With all senses coming from the surrogate, some even reported quickly feeling that it was they themselves doing the moving and doing (when suddenly confronted with a mirror, one reported "Oh look, I have blue eyes!"). This again is nothing particularly new; "telepresence" researchers have been experimenting (albeit badly) with people using robot avatars (in meetings etc.) for decades.

This is what the boardroom at Dalek HQ looks like.
- Images: New York Times


The catch in this experiment, however, is that, in order to gain the needed precision, volunteers had to be strapped into an MRI machine for the duration, so that the tiny electrical activations within their brains could be monitored in real-time, and in detail. With its noise, close confinement & expense, such a system would seem to hold attraction or use only for day-players and tourists. Even with a computer body somewhere else to call your own, the sense of confinement would soon have people racing back to their real body and the real world.

But that's only if they have a real world to return to.

Around the world millions of people are paralysed, permanently bedridden or even "locked in", with little or no means of communicating with the outside world; they have no "real world" to speak of. Many face the stark reality of living simply as the alternative to being dead, and some would choose the latter, when the former may sometimes differ from it only in brain-state and pulse rate. In recent years technology has offered the "locked-in" a chance to communicate with the outside world through sensors that read brain activity or tiny physical movements and allow a computer program to generate text or speech. Indeed, such a system has been on public display for many years through the "voice" of Stephen Hawking (who has, incidentally, repeatedly refused to upgrade his voice to a more realistic tone, having come to accept his distinctive tinny vocalisations as a part of who he is).

What if "surrogate" technologies such as the experiment mentioned above could allow the "locked in" a third choice, beyond death or survival? What if they could get outside, get a job and rejoin the wider world... as a robot?


"Bite my shiny telepresent ass."
- Image: Matt Groening / Fox

Of course, at first the idea seems absurd. The person in question wouldn't be a robot any more than a person playing a computer game or driving a remote-controlled toy "is" their character or vehicle; the thing is, by many definitions they're not currently living the life of a person either. For healthy people, the idea of spending whole days, let alone every waking hour, lying on a slab, hooked up to machines, is something most would not want to consider. But for those already so confined, this might represent a previously impossible escape into a wider world.

Who says your new body has to have legs?
- Image: Coolrobots.net


For the last few decades (arguably back as far as WW2) research teams have been working on remote-controlled robots (generally known as Waldo units) to undertake specialised jobs too dangerous or even not physically possible for a human body to perform. These range from firefighting, medical applications, logistics, search & rescue, working in radioactive environments and mapping, to exploration and even warfare. To date these have always been operated by guys with joysticks (or occasionally a neural hook-up): "day players", who log off and go home at the end of their shift. So what of the potential for "residential" robot jockeys?

What if your "mini-me" became your actual you?
- Image: Coolrobots.net

Trapped in their own minds, these are people often desperate to interact with the world but with little hope of ever doing so. Given access to a remote body, and with no usable body of their own as far as physical action is concerned, they could take up permanent residence. All at once they would not so much be locked into their body as voluntarily locked out of it. An instant WALDO generation.

With only a little technological progress from where we are today, we could find ourselves sharing our streets with human minds in robot bodies, long before the innovations or surgical techniques required to do so physically are even on the horizon. In 1987 RoboCop (and later its dubious sequels) asked about the psychological and societal effects of trying to make a man into a robot. Now I'm left with an interesting thought: what if ED 209 was a person? What if the giant (or small) robots roaming our streets were the eyes, ears, hands and feet (and antennae) of real people, far away? What if ED 209 was just Ed, with a wife, two kids and a mortgage; and your Roomba used to play jazz guitar?


"I'm looking for volunteers for the company 5-a-side football team. You have 20 seconds to comply... Nah, I'm just pulling your leg... Which isn't easy with guns for hands! I don't know, sometimes I crack myself up... Did you see your faces? Priceless."
- Image: Orion Pictures


Legally, the way is already clear, even if it might require a little polishing. Crimes committed by a surrogate (under direct control, of course) would be attributed to the "pilot", possibly via existing provisions concerning vehicles, objects or weapons. Crimes committed against surrogates would currently only constitute property damage as, psychological trauma aside, the "pilot" would not be physically hurt (developments in haptic technology notwithstanding). But how would the world react to the first crimes committed by embodied surrogates? When the first killing spree is carried out not by a madman with a gun but by the mind within a warplane or a tank?


This one, at least, is friendly.
- Image: La Machine


The main issues I can foresee are societal. How quickly would/could we adjust to the idea that the robots around us aren't just smart but are actually people - ghosts in a machine? Could we get used to people changing bodies the way we currently change jobs? Would this effect be beneficial for the development and adoption of Artificial Intelligence, or shove it back into a small niche of uses for which full autonomy is required or wireless link-ups are not practical?

Far from being a nebulous "what-if" for sci-fi authors to speculate on, our robotic future may be waiting patiently in the (hospital) wings. You may not be able to download your mind into a giant robot just yet...


Image: Spectrum.ieee.org

But you'll be able to stream it.

Sunday 1 July 2012

SLR - Simple Language Recognition


For all my love of technology I have come to the realisation that I spend an awful lot of time swearing at it, in its many shapes & forms. But what recently surprised me is quite how much of that swearing is done at my camera.

Yeah, that's right, focus on the gravel. I'm sure it contains some rare and fascinating aggregate.

I love my digital camera. For too long, after some grit destroyed my last-camera-but-one, and a combination of a go-kart-related injury and my losing it in a bar did for the last one (OK, so on balance losing it turned out to be more fatal than my mate running it over), I had no camera other than the one hiding behind the screen-lock and loading time of my phone; so to regain the familiar presence of a point-and-shoot snapper at my hip was a feeling of freedom and reassurance that the world was once again mine to capture (in a presently non-giant-robot-related sense). Although there's no beating a huge, expensive SLR for quality of shots or the simple beauty of the composition, I don't want to have to do the agonising maths entailed in deciding whether to bring something like that along with me whenever I step out of the front door. Am I likely to want it? Can I carry it around all day? Will it be safe? Will I lose it? Will (as happened to a devastated friend of mine) I put it down a little too close to a candle and melt my £300 telephoto lens? Nope, I wanted something I could just tuck into my jeans pocket every day and go from opportunity to shot in less than 10 seconds. And that's exactly what I got. So why does it frustrate me so much?

Because it's bloody stupid, that's why.

I bet she's taking a good picture. Damn her eyes.

Don't get me wrong here. This camera represents the most I've ever spent on a piece of photographic equipment and it has the bells and whistles to show it. It's shock-proof, dust-proof, waterproof (to 5 metres), has a telescopic internal optical zoom (that means no projecting lens to get grit stuck in it), a touch-screen and a switch-on-to-shot time that means I rarely miss a thing. Yet when, in the run-up to buying it, I noticed a photographer friend of mine scowling at his (he'd bought the same model for much the same reason as me, although to supplement his existing SLR) and asked him if he'd recommend it, I was baffled by his reply: "This camera has almost everything I wanted. And I wouldn't wish it on my worst enemy." Well of course I ignored him, didn't I?

I'm a sucker for a special feature.

You see, apart from an "intelligent" (woeful) "touch-to-focus" system, my camera (as with many more recent models) boasts a "superior auto" facility, in which on-board software cunningly identifies what it thinks I'm trying to take a photo of and automatically adjusts its settings accordingly. In fractions of a second it will take into account light levels, colour saturation and even more complex things such as the presence of faces, horizons or moving objects at different depths of field (although it has such an adorably loose definition of what constitutes a face that it brings to mind the various infant face-recognition experiments of the 1970s & 80s - and has taught me a new word: pareidolia). The problem is, it's just good enough to be infuriating. Sure, it's fantastic if you want to focus on your friends for a quick snap, or catch an opportunity-shot, as I did the other day, of some vandals running away from smashing up a bus (see picture), but if you want to try something more ambitious you're back into the realms of chance. Because, for all its "intelligence" (or even "superior"ity), it's all just software looking at what's in front of it and taking a punt at what it thinks you might want.
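
For the curious, my guess (and it is only a guess; the real firmware is proprietary and doubtless far cleverer) is that the punt boils down to something like this little Python sketch: a handful of crude measurements mapped to a preset, with no notion whatsoever of what I actually wanted.

    # Pure conjecture: a toy version of the kind of heuristic a
    # "superior auto" mode might use. Reduce the scene to a few crude
    # statistics, then map them to a preset. Intent never enters into it.
    from dataclasses import dataclass

    @dataclass
    class SceneStats:
        mean_brightness: float  # 0.0 (black) to 1.0 (white)
        face_count: int         # whatever the (adorably loose) face detector says
        motion_score: float     # frame-to-frame difference, 0.0 to 1.0

    def pick_scene_mode(stats: SceneStats) -> str:
        if stats.face_count > 0:
            return "portrait"   # a face (or face-like hedge) wins outright
        if stats.motion_score > 0.5:
            return "sports"     # something moved, so freeze it
        if stats.mean_brightness < 0.2:
            return "night"      # dark, so brighten it
        return "landscape"      # otherwise: enjoy your gravel

    print(pick_scene_mode(SceneStats(mean_brightness=0.6, face_count=0, motion_score=0.1)))
    # -> "landscape" (even if what you wanted was the bug, not the leaf)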

That's the last time you'll smash up a 57 on a Friday afternoon.

Right now, well into the second century of photography, the best way to take a photo is still to spend half your life savings on a box of mirrors and glass, go on a photography course and learn how to tweak your apertures and focal lengths yourself by turning physical dials. Sure, for art exhibitions and weddings I completely understand this, but what you should be appreciating is the photographer's eye for composition, lighting and focus, rather than their ability to physically tweak their dials in the right way at the right time. The thing is, SLRs are no longer the only cameras capable of taking such shots; nowadays even mobile phones are packing high-resolution cameras. The problem is getting out of the realms of chance: getting the shot you want, rather than what you end up with after you press the button and your camera guesses what you meant.

We'll never know what this was meant to be.

Modern digital cameras have tried to overcome the technical difficulties of photography by introducing a baffling array of "scene selections" that pre-configure settings such as lighting, colour-saturation, shot speed and aperture, or even by including a manual focus option, but often the action of finding and configuring these takes long enough that the moment is lost, even if you can find the appropriate one and aren't forced to work it out through trial and error. I found, with an old camera, that I could take excellent "motion-streak" photos by setting it to "night mode" with a slow flash and a long exposure. My new camera, of course, has a much better "night mode", which uses advanced software to carefully remove this effect, thus improving actual night shots and ruining my burgeoning artistic endeavour. What if I wanted it blurry, dammit? 
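
As an aside, the arithmetic behind that trick is simple enough to sketch in Python. The standard exposure value at a given ISO is EV = log2(N^2/t), for aperture N and shutter time t; my guess, and it is only a guess, is that the "better" night mode gathers the same total light as my streaky old settings by stacking several short frames in software, leaving nothing exposed long enough to smear.

    import math

    def exposure_value(f_number: float, shutter_s: float) -> float:
        """Standard exposure value at ISO 100: EV = log2(N^2 / t)."""
        return math.log2(f_number ** 2 / shutter_s)

    # The old, streaky night mode: one long 2 s exposure at f/2.8.
    print(round(exposure_value(2.8, 2.0), 1))        # EV 2.0, lights smear nicely
    # My guess at the new mode: eight sharp 1/4 s frames stacked in software,
    # gathering the same total light (2 s worth) with nothing left to streak.
    print(round(exposure_value(2.8, 8 * 0.25), 1))   # EV 2.0, art ruined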

No, no, no. Look, the light's got all smeared...
There, that's better. This is a Night Shot. Party time.

You shouldn't need to schlep around a digital SLR the size (and value) of a small child in order to take the snaps that your compact has the ability, just not the inclination, to take. The problem is not that digital compact cameras lack the settings or features required to take the photographs that we want; it's that those settings are buried too deep in software to be accessible (or, if accessible, user-friendly).

Aha! When I focus on the leaf, lock the focus, then very carefully move the camera over whilst maintaining the same relative distance I can... This is ridiculous.

But that's only a hop and a step away from where the technology needs to be.

Last year the iWorld (and nearly the rest of the world) went iMad for Siri, the sinister in-phone butler that does pretty well at turning "natural language" commands into instructions that it can follow (with some exceptions). So "What's the weather like?" brings up a weather map and "What am I doing on Tuesday, my life is hollow?" returns a non-judgmental look at your agenda. Rather less fanfare greeted Google Goggles, which does its best to recognise whatever objects the phone camera is pointed at and to identify and find links to them. Put the two together and we're not a million miles away from a "virtual photographer" that can take simple requests about what to photograph and how to do so. How many times have we cursed aloud at our digital cameras over their choice of lighting or focus, when we could be using pretty much the same strings of words (barring the odd profanity) as instructions just as we line up our shot?

"I want to take the picture just as the foot touches the [blue] ball."
"I want to take the picture just as the first person crosses the [white] line."
"I want a picture of the [red] bug with the background defocused [not the side of that bloody leaf again]."
"I want a picture of their faces as they cross the line [not a blur in front of a perfectly-focused tree in the middle-distance]."
"No, I wanted the Sun reflecting on that puddle, not the random arrangement of sticks at the side that looks like an emoticon smiley. Even though that is quite cool."
"Just for once I want a picture of the moon."

OK, so perhaps in the short term we might not be able to engage in such artistic discourse with our little photographic friend, but the software certainly exists to interpret simple instructions regarding colour and even shape, as well as key words such as "horizon" and "person", which would help in several of the situations above. Of course I would fully expect to find myself swearing at my poor, hapless camera as it misinterpreted my words for the n-th time, but at least it would be able to triangulate what I'm saying with what it's seeing, rather than just taking a punt based on the latter.
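
To make that concrete, here's a minimal sketch, entirely my own invention (every name in it is hypothetical), of what the keyword layer might look like in Python; the genuinely hard part, actually finding the red bug in the frame, would fall to the Goggles-style vision side.

    # Hypothetical sketch: pull colour and subject words out of a spoken
    # instruction and turn them into focus/trigger hints for the camera.
    COLOURS = {"red", "blue", "white", "green"}
    SUBJECTS = {"ball", "line", "bug", "person", "face", "horizon", "moon"}

    def parse_instruction(text: str) -> dict:
        """Map a phrase like 'the red bug, background defocused' to hints."""
        words = {w.strip(".,!?") for w in text.lower().split()}
        return {
            "focus_colour": next(iter(words & COLOURS), None),
            "focus_subject": next(iter(words & SUBJECTS), None),
            "defocus_background": "defocused" in words,
            "fire_on_motion": bool(words & {"touches", "crosses", "leave"}),
        }

    print(parse_instruction("I want a picture of the red bug with the background defocused"))
    # -> {'focus_colour': 'red', 'focus_subject': 'bug',
    #     'defocus_background': True, 'fire_on_motion': False}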


No, you're right. The distorted reflection of that overhanging hedge does hold more artistic interest than this spider walking on the frikkin' water.

Can you see this baby frog poking its head out of the water to take its first breath of air? No, me neither. But the limescale on the side of the tank came out beautifully.

Yes I do appreciate the almost fractal-like branching of the leaf lobes, but I wouldn't have minded a better look at the robin chewing on that live wasp.

Ooh, look! You can actually see the bubbles in the plastic of that bench!

Typical. There you are, carefully lining up your shot of some leaves when a bunch of deer wander in and ruin it. Joke's on them though - still got my leaf shot.

Neither rain nor snow nor performing slug will stay this camera from the swift completion of its appointed duty to accurately document the wood-grains on this floor.

And who knows, maybe someday not too far down the line we'll be able to wirelessly think our photographic criteria to our digital photographic assistants, or DPAs (to mint a term), rather than leaving all the mind-machine interfacing to fighter pilots and the terminally locked-in. And eventually, once we've finally got the things hooked directly into our optic nerves, we'll have nobody to blame for our botched shots but ourselves, our short attention spans and the fact that all our eyes ever really see is the inside of a bio-storage pod.

"This time I'm going to get that jumping shot just as their feet leave the ground."
Via http://hookup-articles.blogspot.co.uk/2010/04/mind-reading-machine.html

Right, I'm going out to the garden to take a photo of that spider web that some inconsiderate arachnid has built right next to a feature that looks like Chris Packham. Wish me luck.

Wherever there is an animal doing a thing, there you will find him.

Dammit.

Sunday 25 March 2012

Jeeves 2.0 - Your Personal Cloud Awaits

OK, I'm going to go right ahead and say it: owls are a pretty much hopeless base on which to build a communications infrastructure. In J.K. Rowling's ludicrously popular Harry Potter books these secretly intelligent and loyal predators are an ever-present mail system that underpins much of the magical world, but let's face it, they're rubbish, aren't they? They spend half their time getting lost, killed or just being grumpy about whether or not they've been kept in fresh mice (which one hopes aren't secretly magical or sentient too), and by the time the world (seemingly represented by England) is in crisis everyone resorts to jury-rigging their magical projections into mouthpieces anyway.

There is really nothing stopping wizards from using a phone.

So why even bother to have them in the books? Because they're mysterious, they're cute and, whether we've actually seen one of the razor-brained killers for ourselves or not, they're something personal that follows the protagonists through life and causes them great personal pain when they're lost or stop functioning.

Sounds like an iPhone to me.

Of course, by future/magical standards, phones are currently at least as rubbish as owls; not least because we are at least as much at their beck and call as they are at ours. They may not be hungry for fresh-dead rodents, but who hasn't had their smartphone whinge at them about being hungry until it finally turns its metaphorical back and refuses to communicate until it's been fed?
There is really nothing stopping you from hiring a PA.

A few months ago, with great fanfare, Apple announced the launch of Siri to accompany the new iPhone 4S. As a voice-activated personal assistant it was pretty likely to change everybody's lives by being able to tell people what the weather was likely to be like soon, or get confused about who you wanted to call; indeed, it has proved to be about as useful to enthusiastic adopters as Android 4.0's face-unlock and the Xbox Kinect's Minority-Report-esque hand-swiping interface. It's a nice party-trick to show your friends, but at the end of the day you'll just push the buttons and get it done in half the time and far more reliably.

But I've now got so far off the point that I've nearly got lost. Here's the point:

What we need is our own personal robots.



Wait.



I'm going to take a deep breath and say something I never thought I'd let myself think:

Maybe we don't all need our own personal robots.

Let's face it: the little ones are annoying & the big ones want to kill you.
Image: Blade Runner. Copyright: Warner Bros.

With all the advances in robots that can walk, fly, run, jump, unscrew lids from things and maintain pleasant conversation, there isn't really much short-term prospect that they'll be able to do all of this, unobtrusively follow us around (all of us) and quietly run our lives without constantly getting in our/each other's/their own way or running out of juice and falling over. And what exactly are they going to be getting done as they walk/crawl/fly/slither along? Opening doors for us as they remember our friends' birthdays? I've already outsourced my memory to my phone and the doors already open themselves. On the human planetary colony of Solaria envisaged in Isaac Asimov's The Naked Sun, the tiny human population all have dozens of personal robots to assist them with every facet of their lives, but I get the feeling that even one would start to become a hindrance as much as a benefit, until they're good enough to replace us anyway.

I think what we really need (to tide us over until we all retreat inside a superpowered personal future, at least) is a personalised infrastructure. My music and photos and videos don't live at home, or even on my phone or MP3 player, any more; they're somewhere out "in the Cloud" and can simply be summoned to wherever I need them, whenever I want. Last weekend (Sunday evening, to be precise) I thought of a film that my brother had recommended, looked it up on my IMDB app, ordered it using my Amazon app, and it was on my desk at work before lunchtime the next day. But we don't even need to rely on people to recommend things for us, or even deliver them, any more. My new Virgin Media TiVo box arrived last week and after a few days of training it's already pretty good at predicting and automatically recording shows it thinks I'll like (and even has the decency to tell me why). I sold my car last year because I couldn't justify the running costs or afford the insurance any more, but increasing numbers of my friends are turning to ZipCar or similar car clubs, which let them track down a compatible car using GPS on their phone, hop in, drive it wherever they need to, then drop it off somewhere else later. Who needs a car any more when you can have them stashed all over the place just waiting for you?

Of course these systems still leave a lot to be desired. Who hasn't waited days for a parcel that hasn't arrived, or had to spend their Saturday morning queuing at the Post Office for something that couldn't quite be smashed in through the letter box? Who hasn't waited weeks for the one movie they actually wanted from LoveFilm whilst patiently sitting through the dross they don't even remember ordering? And I'm wearily informed by my ZipCar-using friends that picking up (or dropping off) a car isn't quite as easy as radioing KITT and asking him to meet you around the corner in 30 seconds.

Bloody Entropy.

But it's pretty good though, isn't it? Of course we may never (need to) get to the 1950s vision of a vacuum-tube-based future, much parodied by Futurama et al, in which goods and even people are whizzed effortlessly around the metropolis through an innumerable series of interconnected tunnels that connect pretty much everything to pretty much everything else. Sadly, after a brief high-water mark, the tubes have been relegated to shunting small change around supermarkets. For now our only communication tubes are digital and the physical goods still tend to require some dude in a van to deliver them. For now*.

Imagine the tailbacks around IKEA.

So perhaps a future of personal robotic servants isn't one we need to be aiming towards. Perhaps we'll be happier and more efficient with a PA in our pocket (or wired directly into our brains) and an infrastructure intelligent and integrated enough to get on with organising the rest for us, without a power-hungry butler occupying the seat next to us (I can't get a seat on the bus half the time anyway). When I was at university (way back in the mists of time) I was briefly fascinated when the campus got a live webcam streaming a view of the piazza outside the Student Union, so that you could see what was going on before you bothered leaving your room to join in (or just get hammered back in halls). Last November I went laser-shooting at Bunker 51, somewhere under the derelict factories of East London, and found myself staring at the prototype of a remotely operated, webcam-targeted paintball turret that armchair warriors can use to punish real people in real woods from the safety of home, using a PlayStation 3 app. And who wants the robots calling those kinds of shots?
Still can't make my tea though.