Documenting the Coming Singularity

Sunday, May 20, 2007

Machine Consciousness: No Practical Value?

I read an interesting article this morning from the Burlington Free Press featuring the work of Josh Bongard, hired by the University of Vermont as an assistant professor of computer science. The article focused much of its attention on Bongard's self-aware robot, Black Starfish.

The question naturally arises: What is self-awareness?
"You have a sense of your own body," Bongard said. That much seemed to be true of Starfish, whose creators assigned it the task of moving across a surface without telling it how to do so. Instead, the robot was programmed to test the locomotive possibilities for itself with a "series of playful actions" that Bongard compared to what a human infant engages in. The robot learns what its body can do and what it can't, ultimately teaching itself to walk, as Bongard describes it.
The article concludes with these thoughts:
He acknowledges that the notion of conferring a machine with self-awareness -- a quality which some people consider exclusively human -- can be "controversial."

He distinguishes, though, between self-awareness and consciousness. Starfish had one characteristic but not the other, he believes.

To be conscious, he said, one must be aware of one's own self-awareness.

Could a conscious robot be built?

"It's theoretically possible," Bongard replied, "but I'm not sure of the practical value."
I certainly give a lot of weight to statements made by an expert in the field, but Bongard's final words, as reported in the article, are curious. I wonder what Bongard considers "practical value" to encompass. The article does not explore what he might mean, so it's impossible to say for sure; but if we take the comment at face value, perhaps he means that a conscious machine intelligence would serve no purpose for the betterment of humanity, at least none that he can think of.

My previous post mentions the ethical problems involved in how we would treat a conscious machine entity, and the idea that useful, intelligent machines need not be conscious. So perhaps Bongard is correct in a narrow sense, speaking as a scientist. But is practical value the only good? I think not.

I am reminded of a wonderful line in the movie Communion, in which Christopher Walken, playing the part of Whitley Strieber and discussing the reality (as he sees it) of ETs visiting Earth, says, "The world is getting so small, it would be nice to meet someone new!"

That sort of captures one of the motivations behind the goal of machine consciousness: It would be someone new. Practical value? Maybe not. Valuable? Definitely.



Singularity & The Price of Rice is updated daily; the easiest way to get your daily dose is by subscribing to our news feed. Stay on top of all our updates by subscribing now via RSS or Email.

0 comments :