Nick Boesel, Student at University of Michigan
What do you think is the one characteristic that will always be able to separate AI and humans?
First, it's hard to use words like "always" in this context.

I actually remember when the excerpt of _The Most Human Human_ was printed in the Atlantic, the cover said "Why Machines Will Never Beat the Human Mind," and I cringed, because that was a different claim than the much more nuanced one that I actually make in the book and article. The BBC once grilled me on live television about why I thought machines would _never_ beat the human mind (emphasis on "never"), and it was hard to explain that this claim had actually been made by the graphic designer who did the Atlantic cover, not by me!

But back to your question, which is a very good one. In philosophy, there's an idea called "dualism" that comes most famously from Descartes; in effect it means that there are two wholly separate types of things in the world. Matter is one, and the "mind" or the "soul" is something entirely different. If you're a dualist, it's possible to believe that machines will literally never bridge the gap to human-level intelligence because they're "only" material.

I'm not a dualist: I think that the unbelievable physical complexity of the brain and body is the "whole story" when it comes to explaining how intelligence arises in human beings.

There's an intriguing book from 1989 by Roger Penrose called "The Emperor's New Mind," where he argues that the brain is subject to quantum mechanics in a way that purely algorithmic systems (Turing machines, etc.) are not -- and from this he concludes that there is an unbridgeable chasm between brains and computers. I find this line of thought intriguing, but I'm ultimately incredulous either that quantum effects are so central to the human experience, or that they can't be incorporated into some form of computing. So again, I'm left concluding that there really isn't some fundamental barrier that prevents AI from coming to be.

But that, too, is a different question than what you're asking. You're asking what will always separate AI and humans, and for me it's very interesting to consider this question from the perspective of some future world in which legitimate, inarguable, human-level (or beyond) AI exists.

I think there is no denying that this intelligence would be of a very different type, or "flavor," than human intelligence -- just as human intelligence and dolphin intelligence, or human intelligence and octopus intelligence, are today. In fact the gap would probably be much wider than that.

For one thing, human intelligence operates overwhelmingly at very specific spatial and temporal scales. When we try to comprehend the age of the universe, for instance, we use metaphors like "If the universe were one year old, the human race would appear on December 31st" and things like that -- making analogies to the scales that are relevant to our bodies. Watch a time-lapse of a starfish or a plant and you have a completely different understanding of it. The same is true at an even bigger scale with geological phenomena.

Even more significant, in my view, than the impact of the human body, the human sensory system, and the human lifespan, is the simple fact that humans have a specific life history—we're _individuals_. When you and I see a movie and I ask you what you think, when I could just as easily read an eloquent essay by a film scholar, what's going on is that I'm not actually trying to learn about the movie. I'm trying to learn about _you_.

An AI that emerges as some kind of mesh of networked devices will present itself _very_ differently than one that emerges as disparate individual minds accumulating idiosyncratic life histories. My guess is the first will come much sooner than the second, and be much more "alien" as a result. Who knows, maybe in time we'll have both. Maybe the human race will become increasingly networked, to the point that spouses literally share sensory organs, for instance, seeing what each other see and hearing what each other hear. In that case, we end up meeting the "alien" AI mesh halfway, and the human experience is nothing like what it used to be.