There was a conversation recently on one of the Numenta mailing lists where MI aficionados were taking the Numenta-farians to task. Most of the usual arguments showed up and duked it out, and as usual no one won (in any sense) and everyone grumpily stumbled away. But there was an interesting tangent that I happened to comment upon. Someone suggested that embodiment was not a requirement for building an AGI. Someone else countered with the age-old defense that nothing can learn about the world without being situated within it. (I’m grossly simplifying, but that’s ok because you all know how this goes, right?) Original someone replied along the lines of, “Helen Keller had no world input. How could she have become intelligent?” I chimed in that she was merely deaf-blind; she was still embodied and had plenty of temporal world input. And besides, she already came prepackaged with the brain structure for general intelligence.
Fast forward to a few days ago, when John commented, among many excellent points, that “a single robot will simply be isolated however smart.” My initial thought was that this was false, and in fact being a robot was not necessary. (The physical form of an isolated AGI is irrelevant as long as it has sufficient means to manipulate the real world. Its brain can happily live in the basement of my house if it can control sensors and actuators elsewhere. Come to think of it, this is true even if it is not isolated.) Recalling these separate events yesterday, I wondered if I had been hypocritical, choosing arguments in order to win rather than find truth (for lack of a better word).
Thinking it through more, I believe I was mostly consistent. Embodiment is not a necessary attribute of an AGI, although it might be needed during the development towards an AGI. The important thing to note is that sensory input is just input. How your brain receives the input doesn’t matter. Embodiment provides a certain perspective on the real world, but it’s far from clear that this perspective is necessary for intelligence. What I believe is necessary is the brain structure that allows an understanding of being in a place in the universe, not actual embodiment. This is a difficult argument to make because all of our existing examples of intelligence are embodied, but most people who know about this stuff have agreed for a while that human intelligence need not be the only kind that counts.
As for companionship, an idea I’ve seen raised in multiple places now, including here, this too seems sketchy. Gregariousness arises naturally for at least three reasons: 1) hunting is easier in packs than alone, 2) offspring raised by adults survive better than those left on their own, and 3) interaction allows for the sharing of skills and knowledge that were independently discovered. Humans crave companionship because our genes are the beneficiaries of cooperation. But assuming that an AGI needs others to hang with is simply anthropomorphization.
I’d hope most agree that computers don’t have to worry much about points 1 and 2. Point 3 is a necessity at first, but it will have diminishing returns over time, and I don’t see how embodiment would help except that the AGI learns what it is like to be embodied.