Monday, July 21, 2014
I'm a big fan of technology. There are technological solutions to many of the world's issues, from hunger to clean water. The obstacles we face usually involve getting people out of the way of a solution rather than the problem itself.
That said, I'm definitely not a fan of humanizing technology.
Let me be very clear. I am all in favor of making technology easy. I want getting food and clean water to be so cheap and comfortable that you'd have to spend money and effort not to get them. I want to be able to ask my computational partner, "Do you think I'd like that?" and have it come back to me with a cogent answer.
But I don't want it to try to act like a human being. I'm not interested in the illusion of an experiential entity.
There's a strong pursuit of this illusion going on now. The most recent one to come across my desk is Jibo, a robotic device that is intended to integrate into a family. To act in the role of a person. (See here.)
Jibo was developed by MIT's Cynthia Breazeal with the express intent of becoming "part of the family." (See here.) She feels emotion is the "next wave of this humanized high-touch engagement with technology."
I'm not a fan.
I like a humanized robot voice better than one that sounds artificial, mainly because it's more intelligible. And I really like some of the capabilities advertised for Jibo. It has strong facial recognition coupled with a good camera, so it can take pictures like nobody's business. Since it can identify users, it's a hands-free portal to the net and can be a real internet helper with reminders and such. My objection isn't what it can do but how it presents itself.
For example, in the Jibo advertisement the robot does things like read a story to a child. According to the article Jibo will "put a smile on your face and make you feel better."
It worries me that someone thinks a consumer machine is necessary for that.
I have a dog. It greets me when I come home and puts a smile on my face and makes me feel better. Sometimes it makes a mess. It has to be taken out and walked. If I step on its tail by accident, it yelps in pain, so I mustn't do that. The dog gives me a gift and requires an obligation. This is a transaction between us. This is a relationship between two experiential entities.
But Jibo is not an experiential entity. It is a machine that is intended to counterfeit the trappings of an experiential entity. It is designed to present the attributes of a relationship without the realization of a relationship.
This bothers me.
I think it's important that we understand who we're talking to. Who we're in actual relationships with. I think, for example, we should be concerned about the welfare of our children, parents, and friends. The welfare of celebrities doesn't remotely interest me. I think we should concern ourselves with factual news rather than pleasant fiction. The relationships I have are with my wife, son, and dog. Not my toaster. Not my microwave oven. Not my nice talking internet portal.
Fundamentally, it's a distraction. I talked about the Turing Test a while back. Turing had the idea that a device that could imitate a human being could be as intelligent as a human being. Back then most of the literature conflated intelligence with the ability to experience things. After all, how could you be smart and not experience the world?
Now we know that it is quite possible. Intelligence and experience are very separable. We can have quite intelligent systems that have zero experiential nature and really dumb systems that fool people sufficiently to be thought intelligent. In fact, some of the truly creepy ideas in SF are systems that are very intelligent in their execution of tasks without such a nature. Think of Terminator.
Or, even better, remember the old movies Colossus: The Forbin Project or WarGames. Now think of them again and realize there was nothing in those machines but elaborate rules engines. Nothing inside but LEDs and a hum.
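To make the point concrete, here's a minimal sketch of how little machinery it takes to seem conversational. This is a hypothetical, ELIZA-style rules engine of my own construction (not anything from Jibo or the films): a few pattern-to-response rules, no state, no comprehension.

```python
import re

# A handful of pattern -> response rules. Each rule fills its matched
# text back into a canned template, which is enough to feel "attentive."
RULES = [
    (re.compile(r"\bI feel (.+)", re.IGNORECASE), "Why do you feel {0}?"),
    (re.compile(r"\bI am (.+)", re.IGNORECASE), "How long have you been {0}?"),
    (re.compile(r"\bmy (\w+)", re.IGNORECASE), "Tell me more about your {0}."),
]
DEFAULT = "Please, go on."

def reply(utterance: str) -> str:
    """Return the first matching canned response; nothing is understood."""
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(*match.groups())
    return DEFAULT

print(reply("I feel lonely today"))  # Why do you feel lonely today?
print(reply("my dog greets me"))     # Tell me more about your dog.
```

Three rules and a fallback already produce replies people are tempted to read as empathy. That's the illusion in miniature: the trappings of a relationship with nothing experiential behind them.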
Now, if there were a person (entity, experiential organism, alien) in Jibo, I'd be all over it.