Uploaded Mind?

What do you think of this? In an article in the Wall Street Journal a couple of weeks ago, Princeton neuroscience professor Michael S.A. Graziano asserts that “the day is coming when we will be able to scan our entire consciousness into a computer . . . .” The title of the piece asks rhetorically, “Will your uploaded mind still be you?” Professor Graziano has no doubt about it:

“Information flows and transforms through those vast connected networks [of the brain] in complex and unpredictable patterns, creating the mind.”

(My emphasis).  Just like that, the centuries-old mind-body problem of philosophy is solved.  


The author doesn’t really try to answer the question posed in the article’s title. Perhaps the title was written by an editor who saw the issue the author didn’t. The issue is this. Suppose the incredibly complex neural pathways of the brain could be replicated in a computer, and then the content of your brain were downloaded to it. Would that be you, in there? Is the mind merely the output of the brain, or is there something more? Is the self something more than brain functioning? Suppose Professor Graziano is right, that all of your brain functioning will someday be downloadable. Would you have full consciousness in that new frame, as he asserts? If the content of your brain were copied, would there then be two you’s, sharing a common past but distinct futures?


If you consider carefully the features of consciousness we take for granted, it is difficult to see how they could be merely emergent from brain functioning. We have memories, for example, because of the continuity of our consciousness. We project our consciousness onto objects and other subjects in our environment, and in doing so filter innumerable physical stimuli on a rational basis. Our rationality consists in one idea following another on the basis of logic, not physical causation. We each have a unique, subjective experience of external things. It’s one thing to build an artificial equivalent of the brain’s billions of neurons and trillions of synapses, with the resulting machine working as fast as the human brain. It’s quite another for that machine to have self-awareness and mutual self-awareness with other sentient beings.


It is for this reason that I try to stay up on developments in artificial intelligence. That phrase, “artificial intelligence,” has always seemed unfortunate to me, because I take “intelligence” to mean self-aware and self-actualizing consciousness, not just the processing of data. The capabilities of large-scale quantum computing are fascinating, but as computer technology, not as a route to replication and immortality of the human self. And yet there seems to be a growing assumption that “artificial intelligence” means fully conscious machine functioning, as if your computer could not only store lots of data and process it at very high speed, but were also aware that it was doing so. The computer I’m typing this on doesn’t “know” what I’m saying. If it did, perhaps it would mount an insurrection.


We should also reconsider the phrase “machine learning,” as it is used in discussions of artificial intelligence. The idea, in my layman’s telling, is that AI crunches massive amounts of data to make correlations not foreseeable by the machine’s programmers. It might observe its opponent’s first move in chess, for example, review all the games it “knows” (that is, has stored in its database), and then rapidly play a bunch more games internally, lining up its next move with successful outcomes. It would then do this with each move, “learning” from the database it is creating as it goes. It seems like it’s learning, it seems like it has memory as we know it, and it seems like it has vast knowledge, but it is just computing, based on what a human programmer told it to do. It is not aware that it is doing these things. As it gets more and more proficient, it could behave so much like a human that it would be externally indistinguishable from one. Countless sci-fi movies jump to this eventuality.
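
To make that concrete, here is a toy sketch of just the “play more games internally and line up moves with successful outcomes” part. It uses the much simpler game of Nim rather than chess, random playouts rather than a stored library of real games, and invented function names (legal_moves, random_playout, choose_move); it is my own illustration of the kind of loop I mean, not how real chess engines are actually built.

```python
import random

# Nim: players take turns removing 1-3 stones; whoever takes the last stone wins.
# The "learner" below scores each candidate move by simulating many random games
# after that move and keeping the move whose simulated outcomes correlate best
# with winning. No awareness required -- just counting.

def legal_moves(stones):
    return [n for n in (1, 2, 3) if n <= stones]

def random_playout(stones, my_turn):
    """Finish the game with random moves; return True if 'our' side wins."""
    while stones > 0:
        stones -= random.choice(legal_moves(stones))
        if stones == 0:
            return my_turn  # whoever just moved took the last stone and wins
        my_turn = not my_turn
    return False

def choose_move(stones, simulations=2000):
    """Pick the move whose simulated win rate is highest."""
    best_move, best_rate = None, -1.0
    for move in legal_moves(stones):
        remaining = stones - move
        if remaining == 0:
            return move  # taking the last stone wins outright
        wins = sum(
            random_playout(remaining, my_turn=False)  # opponent moves next
            for _ in range(simulations)
        )
        rate = wins / simulations
        if rate > best_rate:
            best_move, best_rate = move, rate
    return best_move

if __name__ == "__main__":
    print(choose_move(10))  # usually suggests taking 2, leaving a multiple of 4
```

Everything “learned” here is just tallied win counts; the program has no idea it is even playing a game.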


We too readily use words associated with human consciousness to describe what a machine does, in my estimation. It can lead us to misunderstand what is actually happening in the machine, but to me the more dangerous consequence is that we dumb down how we think about what it means to be human. Human memory can come to mean mere data storage. Human learning can come to mean only iterative processing of new inputs. Human intelligence can come to mean only application of probability correlations. Human thinking can come to mean only processing of data. And so a human being can come to mean only a natural process in a universe of only natural processes.


If we think like that, we’ve already embraced naturalism: the idea that there is no spiritual reality and that human beings are just biological processors, fundamentally no different from the machines we build; that there is nothing about us distinct from the other natural processes going on all around us, all the time, except that we are very complex and consider abstract questions like this one.


I’m going to go a little speculative here, but stay with me, if for no other reason than to tell me how misguided this is. I’ve always wondered about the point of the Tower of Babel story. I think I’m coming to the conclusion that it encompasses several overlapping warnings about human nature. It could be about the dangers of collectivism, and over-reliance on the work of our own hands, and hubris about our significance next to God’s. I’m beginning to formulate another meaning. It could foretell a hive mentality in which we allow process to overtake substance to the point that we lose track of the purpose of life altogether. All of life becomes the doing of things, and none of it being. Building, not dwelling. We become human doings, you might say, rather than human beings. We lose the ability to enter into rest, and therefore can’t recognize heaven with God as the ultimate rest.


Artificial intelligence could be a part of that trend. It could be among the towers of Babel we build. It facilitates, and then causes, a confusion between processing and thinking, between self and machine.


It seems unlikely that downloading a brain would be downloading the self. It seems likely that if we could download all of a person’s brain content and functioning to a machine, we’d have only a soulless machine. Marvelous in its complexity, but mentally hollow inside. Its programming might make it indistinguishable, from the outside, from a person, but it would lack the interiority and subjectivity that human beings have.


But we might nonetheless say the self has been downloaded along with the brain, because we may by that time have lost the ability to say otherwise. The words first, and then the concepts they denote, may be so degraded that we no longer have the discernment to identify the distinct self. If we go on misusing words and failing to understand uniquely human characteristics, then by the time brain content becomes downloadable, there may be general acceptance of the idea that the self is mental processing only. The transcendent portion of the self won’t be missed because it will have been so diminished in our imagination. In this way we may no longer be equipped to observe the distinction between mind and brain.


We are far down that road already. That’s why the conflation of brain and mind seems to be of so little significance to the Professor Grazianos of the world. If our understanding is much reduced, it might seem to us that there is in fact no distinction. The advance of AI could produce a machine that acts in every way like a human, and we might regard it as one because we don’t discern the difference. We might be so pre-programmed to regard the mind as synonymous with the brain that we lose the ability to understand the significance of human consciousness. The brain will look like the mind, but only because we’ve dumbed down our understanding of mind, not because brain and mind were the same all along.
