A New Default Self

Why are you unhappy?
Because 99.9 percent
Of everything you think,
And of everything you do,
Is for yourself –
And there isn’t one.
– Wei Wu Wei

Wei Wu Wei is the pen name of Terence Gray, a 20th-century Anglo-Irish author of pithy provocations aimed, like the one in the epigraph, at the prevailing notion of selfhood. By flatly denying the existence of self, he means to shock us into realizing that the self we take for granted does not stand up to scrutiny. Like Eastern sages and Western post-modernists, Wei Wu Wei outs the current default self as a vacuous fabrication.

The purpose of this essay is to describe the current default self and suggest a new one that can withstand the post-modern critique and incorporate the findings of brain science. And there’s a bonus! Such a model of selfhood will turn out to be just what we need to keep our footing as the thinking machines we’re designing come to rival the brains Nature gave us.

Though preoccupied with self, most of us give little or no thought to the nature of selfhood. What do we mean when we invoke the self-referential pronouns — me, myself, and I?

Young children think of the self as the body. In adolescence, the sense of self shifts to the mind. With maturity, the mind monitors not only the outer world but itself, and we come to see our self as our “mind’s mind,” that is, as the interior observer who witnesses what’s going on, offers a running commentary on how we’re doing, and consciously chooses when and how to interact with others.

To some, the witness feels like a little man in their head. It has even been billed as ‘captain of the soul.’ But sober reflection reveals that the witness is not calling the shots. The witness is simply one of many functions of the nervous system, one that tracks the rest.

The mind’s signature function is the minting of serviceable identities, which, as Shakespeare famously noted, it’s called upon to do throughout life. Since “All the world’s a stage… and one man in his time plays many parts,” we should never mistake a current identity for our “real” self.

To get a handle on the slippery self, it helps to think of brain tissue as hardware, and the ever-changing neural connections as software. Both computers and brains are vulnerable to flaws in their hardware and software, and both require an energy supply.

At present computers and brains work according to very different principles, but we should expect this difference to narrow. When computers work like brains, there is no reason to expect them not to do what brains do. And since the biological constraints on size and speed will be lifted in the “brains” we build, we’d better be prepared for them to perform as well as, or better than, ours do.

The first computers were free-standing machines. Later, we learned how to hook them up and the result was an enormous increase in computing power. A parallel shift in our notion of selfhood is called for. The current default self, subscribed to by most people most of the time, is a stand-alone model. The new default self, to be posited in this essay, is more like a computer network.

Most people speak as if they were separate, autonomous, independent beings, with minds and wills of their own. From early on, we’re told to “stand on our own two feet,” to “think for ourselves.” Self-reliance and self-sufficiency are touted as virtues; dependency, a weakness. We put the “self-made” man or woman on a pedestal and teach the young to emulate these role models.

Call this stand-alone self the “Singular Self.” Recognizing its limitations, the Singular Self is quick to ally with others, but not so quick to acknowledge — let alone compensate — them for their contributions.

The Singular Self is the current default self. According to sages, scientists, and post-modern philosophers, it does not exist. But, better than flatly denying its existence, or exposing it as illusory, is to call it what it is: a useful lie.

The very name — “self” — is a misnomer. The term carries strong connotations of autonomy and individuality. It’s as if it were chosen to mask our interdependence. The self does not stand alone. On the contrary, the autonomous self and individual agency are both illusory. Selves depend on input from other selves to take form and to do anything. Deprived of inputs from others, selves are stillborn. Contrary to the name we call it by, the self is anything but self-sufficient.

Selves are not only more inclusive, they are also more extensive than commonly believed. They extend beyond our own bodies and minds to include what we usually think of as other selves. The situation is analogous to memory. We think of our memories as located in our minds, but when we drive to town, it’s the road that holds the memory of the route, reminding us at every turn how to proceed.

So, too, is selfhood dispersed. Much of the information we require in order to function is stored outside our bodies and brains — in other brains, books, maps, machines, objects, databases, the Internet, and the cloud. We’re dependent on external inputs to accumulate enough excitation to reach the threshold of emission for specific behaviors.

As evidence accumulates that the “rugged individualism” of singular selfhood is a myth, and the profound interdependence of selves becomes apparent, our default self is gradually shifting from singular to plural. But until the co-dependent, co-creative nature of selfhood becomes obvious, a distinct term may come in handy. Call the emerging self the “Plural Self” (aka the Superself).

Sir Winston Churchill famously said, “In wartime, truth is so precious that she should always be attended by a bodyguard of lies.” The truth, long protected by the self-serving lie of the Singular Self, is the Plural Self.

Whereas the Singular Self downplays our mutual dependence, the Plural Self embraces interdependence. Whereas the Singular Self excludes, the Plural Self contains multitudes. The Singular Self prioritizes agency; the Plural Self, harmony.

The current ideological divide in politics stems from antithetical views of the self. Conservatives caution that a pluralistic notion of selfhood may inhibit individual agency, whereas Progressives argue that Singular Selfhood rationalizes an inequitable distribution of recognition and reward.

As ways are found to safeguard individual initiative from the inertia of more inclusive decision-making, the Plural Self will supplant the Singular Self as the new default self. With luck, this will happen in time to welcome intelligent machines into the club.

I explore this topic in depth in my book Genomes, Menomes, Wenomes: Neuroscience and Human Dignity, currently the top-ranked book in neuropsychology in Amazon’s free Kindle Store.

6 Reasons You Can’t Win (And 3 Reasons You Can Anyway)

6 Reasons You Can’t Win

Why are you unhappy?
Because 99.9 percent
Of everything you think,
And of everything you do,
Is for yourself –
And there isn’t one.

– Wei Wu Wei

1. An interior witness acts as an impartial judge of our shifting fortunes, tracking our wins and losses. No matter what we have to show for ourselves, regardless of the evidence in our defense, questions remain, doubts persist, our Kafkaesque trial grinds on. Even in the event of acquittal, the feeling that we’ve fooled the jury creeps in. An unambiguous outcome in our favor is not an option. As Czech president Vaclav Havel noted, “The higher I am, the stronger my suspicion that there has been some mistake.” We can lose, but we can’t win.

2. As individuals, our point of view is inseparable from our personal history. Our sight is necessarily partial; our beliefs, unavoidably partisan. Unaware of what can’t be seen from the ground we stand on, we win only by accident; losing is the rule.

3. When we think we’ve won, Nature moves the goal posts. You win the game only to discover that you’re behind the eight ball in a new one. Explanation is never complete; new and better answers invariably present new and deeper questions. Return to go.

4. Dreams shatter on the rocks of reality; imagination runs aground on the shoals of practicality. Think of Don Quixote: If ever there was an impossible dreamer it was the Man from La Mancha.

In his quest for immortal fame, Don Quixote suffered repeated defeats. Because he obstinately refused to adjust ‘the hugeness of his desire’ to ‘the smallness of reality,’ he was doomed to perpetual failure. (Simon Leys after Miguel de Unamuno)

Our achievements pale beside the dreams that inspire them. When at last the Don realized that his dream was impossible, he returned home, put down his lance, and died.

5. We desire the eternal, but are bound in time. Death exempts no one; extinction annuls whole species, and likely won’t cut human beings any slack.

6. The heart, formerly the seat of the soul, is now seen as a pump made of muscle. The same unsentimental methodology is applicable to the brain. Not only will humans figure out how it works, they’ll build better ones. We’re on course to design beings who will supersede us. Hoist by our own petard!

For these reasons — our reach exceeds our grasp, we’re never good enough, Nature’s depth is infinite, and death is implacable — you can’t win.

But wait!

3 Reasons You Can Win Anyway

Man is a creature who makes pictures of himself and then comes to resemble the picture. – Iris Murdoch

1. Our notion of selfhood is misconceived. Autonomous, independent beings we’re not. Selfhood is anything but self-sufficient. No self can stand alone. Our existence is not independent of everyone else’s. On the contrary, without others, selves are stillborn. To exist is to co-exist. We are all each other.

Instead of identifying as a separate self — a stand-alone, mortal creature of limited vision — identify as a “superself” — a being for whom existence is co-existence. Superselves are whole-sighted and non-partisan. They do not take sides; they explain. As an interdependent superself, you contain multitudes. The multitudinous superself is extended in space and time, and so it is as connected and robust as singular selves are insular and vulnerable.

2. “The successful man adapts himself to the world, the loser persists in trying to adapt the world to himself. Therefore all progress depends on the loser.” (after George Bernard Shaw)

How, then, could losing ever be equated with failure? As every win is tainted by fear of losing the next round, so every loss is mitigated by lessons learned in defeat. Winning and losing are not antithetical; they’re partners in the quest. As Don Quixote abandoned his quest, his faithful squire Sancho Panza took it up. One man’s loss became everyman’s win.

3. We can just as well program intelligent machines to incorporate the better angels of our nature as to reproduce our pathologies and perpetuate our depredations. We need not design our successors for senescence and death, but can instead make them eternally self-renewing.

The Question: Will the partnership between Man and Machine end in our demise, or is this the beginning of a beautiful friendship?

I explore this topic in depth in my book Genomes, Menomes, Wenomes: Neuroscience and Human Dignity, currently the top-ranked book in neuropsychology in Amazon’s free Kindle Store.

What Was the Most Important Thing People Learned in the 20th Century?

What was the principal takeaway from the 20th century?

Atomic energy? DNA? Penicillin? Or, something from the world of art or philosophy or psychology? The title question leaves plenty of room for debate.

My answer is that the most important lesson of the century was disabusing ourselves of the notion that some people are inferior. Put the other way round, the most important misconception of the last century was the belief that some people were superior.

At the beginning of the 20th century, the existence of superior individuals and groups was widely accepted. Although there were some who disagreed, far more were eager to believe that their own kind were exceptional, and they were willing to degrade and exploit those whom they saw as their inferiors. Belief in the validity of such judgmental comparisons underlay much of the manmade suffering for which the 20th century is rightly known.

Well into the last century:

* Imperial powers believed themselves superior to the peoples they colonized and exploited.

* The doctrine of White Supremacy took many forms, including Jim Crow and Apartheid.

* Gentiles deemed Jews an inferior race.

* Ethnocentrism was the norm.

* The rich looked down their noses at the poor.

* Male supremacy and patriarchy were all but universal.

* Dominion over the Earth was defended as a God-given right.

* Co-religionists typically believed their faith superior to others.

* Heterosexuals regarded their moral superiority as self-evident.

* People with physical or mental disabilities were stigmatized.

* Native-born citizens felt superior to immigrants, and earlier immigrants felt superior to later arrivals.

* Traditional hierarchies of class and caste persisted. White-collar workers looked down on blue.

* The academic world both mirrored and reinforced these valuations. Intelligence tests were regarded as certifying mental superiority and were used to justify consigning low-scorers to low-status jobs.

No doubt further examples will come to mind. But before going on, it is crucial to get one thing straight. I am not saying that differences do not exist or that performance cannot be judged, let alone that competition is bad. Of course some golfers are better than others, some musicians have more fans, some nations have higher income per capita, and some politicians outpoll their rivals.

What I’m saying is that ranking higher on a particular scale does not support a more general claim of superiority as a person. The winners of a race in a track meet are not superior human beings. If you insist, you can say that Mary was “superior” in the 400-meter dash on Saturday, but really all that means is that she crossed the finish line ahead of her competitors on that day in that event. The gold medal is her rightful reward, but it doesn’t mean that she’s a superior person. Larger, broader claims to superiority are unfounded, unseemly, and, as the 20th century amply demonstrates, treacherous.

The trouble with the superior/inferior distinction is that it’s used to confer or deny ancillary benefits, ones that go far beyond just rewards for winning a particular competition. Worse, claims to superiority are invoked to justify degradation, exploitation, and even the extermination of “inferior” individuals, groups, ethnicities, cultures, and peoples.

Because untold suffering has been licensed by presumed superiority, my nominee for the most important takeaway from the 20th century is the hard-won realization that applying the superior/inferior distinction to persons or peoples is specious. Such comparisons are odious. They present a grave danger not only to those deemed inferior, but also to those who pride themselves on their superiority.

This is not to say that imperialism, colonialism, exceptionalism, racism, sexism, ageism, ableism, homophobia, etc. have been eradicated. Hate-mongers and demagogues are constantly popping up and pandering to those who, doubting their own worth, hunger for assurances of superiority. American politicians, even those who know better, cultivate feelings of superiority by concluding their speeches with “America is the greatest country on Earth.” While such nationalistic puffery used to be music to patriots’ ears, it is increasingly cringe-making. To those who’ve come of age in a globalized world, exceptionalism rings false.

I can hear the objections already. Everywhere you look, some group, braced by a sense of its superiority, is demeaning or belittling those it regards as beneath it. Yes, such behavior persists into the 21st century, but increasingly it’s met with skepticism if not condemnation.

Here’s evidence of this change:

* Imperialism yielded to decolonization. The British, French, and others withdrew from Asia and Africa. Imperial designs of the Germans, Italians, and Japanese–intoxicated with their presumed ethnic superiority–led to the utter destruction of these would-be conquerors. The collapse of the Soviet Empire in the final decade of the century punctuated the end of empire.

* White Supremacy has become indefensible; the N-word unspeakable.

* Male supremacy and patriarchy are in retreat.

* Environmental protection and animal rights are gathering support.

* Homosexuality came to be seen as inborn, like heterosexuality. Lady Gaga’s hit–“Born This Way”–sums it up.

* Disabilities were de-stigmatized and people with disabilities laid claim to equal dignity.

* By century’s end, reflexive acceptance of entitlement and authority was out. Public skepticism, if not cynicism, toward anyone or any nation pretending to superiority was the new norm.

The hateful epithets that fell easily from people’s lips until mid-century have lost legitimacy; they embarrass not their targets but those who utter them. The ethnocentrism of 1900 now seems myopic. In its place is the idea that different cultures, like different languages, are simply different. Each is a complex social system with its own strengths and weaknesses. Ethnic or sectarian differences are not grounds for exploitation or predation.

One person is no more superior to another than a dachshund to a poodle, a dog to a cat, or a butterfly to a rose. Persons, groups, nations are incommensurate.

Individuals and groups react negatively to being labeled inferior, and sooner or later they will get even with those who abuse them. As Shakespeare slyly points out in The Merchant of Venice, the victimized, once they gain the upper hand, are usually inclined “to better the instruction.” To put it bluntly, condescension is a time bomb.

It cost millions of lives, but it seems to have dawned on us that a vital part of a good defense is not giving offense in the first place. What’s more offensive than claiming superiority for your religion or country, and expecting others to welcome your tutelage?

Postscript and Preview

Learning from the past is hard enough. Foretelling the future is impossible. Still, we must take the long view if only because a glimpse of where we’re headed can persuade us to change course to avoid a calamity.

So I conclude with another question and hazard another guess:

Which of the ideas that we now take for granted will do us the most damage over the course of this century? Or, putting it the other way round, for which of our delusions will our descendants most pity us?

To encourage you to formulate your own answer, I’ll give you mine.

The 21st century will reveal that, like superiority, selfhood is illusory.

What I’m suggesting is that there really are no separate selves. The word self is itself a misnomer. Autonomous, stand-alone selfhood is an illusion. Not only are we not better than anyone else, our selves are so entangled and enmeshed with other selves as to make individual selves indistinguishable. Separate selves, like superior selves, are a dangerous delusion.

Senator Elizabeth Warren pleased some and angered others when she pointed out that none of us can do anything by ourselves. That “it takes a village.” That’s an understatement. Actually, each of us is a village. We’ve been internalizing our “village” since our first stirrings in the womb.

Not only can no one do anything by him or herself, no self can even be by itself. To exist is to co-exist. Absent human interaction, minds do not develop or they break down. That’s why solitary confinement is torture. Our selves are either continually, communally co-created or they disintegrate.

During the current century we’ll have to reconceive our relationship to smart machines as their creative intelligence overtakes our own. Dealing with this humbling development will change our sense of self even more profoundly than the 20th-century realization that we’re not as special as we thought.

Reimagining human selfhood will take the combined efforts of philosophers, theologians, psychologists, neuroscientists, artists, and others. I’m sure that the answer I’ve broached here will give way to a succession of better ones. Coming to a new understanding of the relationship between individuality and collectivity–between self and other–and then reorganizing our social and political relationships accordingly will be the defining challenge and crowning achievement of the 21st century.

By 2100, we’ll have very different answers to the age-old questions: Who am I? Who are you? Who are we? Our new answers will cause us, in partnership with the intelligent machines we build, to remake the world.

An expanded version of this exploration of the future of the Self–and how our understanding of selfhood affects our sense of individuality, our interpersonal relationships, and our politics–is available as a free e-booklet here.

Ducking Death; Surviving Superannuation

This is the sixth and final post in the series Why Everything You Know about Your “Self” Is Wrong. The series explores how our understanding of selfhood affects our sense of individuality, our interpersonal relationships, and our politics.

We must believe in free will. We have no choice.
– Isaac Bashevis Singer

What Kind of Computer Is the Brain?

Computers can’t do everything humans do—not yet, anyway—but they’re gaining on us. Some believe that, within this century, human intelligence will be seen as a remarkable, but nonetheless primitive, form of machine intelligence. Put the other way round, it’s likely that we will learn how to build machines that do everything we do—even create and emote. As computer pioneer Danny Hillis famously put it, “I want to build a machine who is proud of me.”

The revolutions wrought by the Copernican and Darwinian models shook us because they were seen as an attack on our status. Without proper preparation, the general public may experience the advent of sophisticated thinking machines as an insult to human pride and throw a tantrum that dwarfs all prior reactionary behavior.

At the present time, there are many candidate models of brain function, but none is so accurate and complete as to subsume all the others. Until the brain is understood as well as the other organs that sustain life, a new sense of self will co-exist with the old.

The computer pioneer John von Neumann expressed the difference between the machines we build and the brains we’ve got by dubbing them “serial” and “parallel” computers, respectively. The principal difference between serial and parallel computers is that the former carry out one command after another, sequentially, while in the latter thousands of processes go on at once, side by side, influencing one another. And because the brain’s parallel processes run on connections that are themselves plastic, every interaction—whether with the world, with other individuals, or with parts of itself—rewires the menome (the connectome of the nervous system). The brain that responds to the next input differs, at least slightly, from the one that responded to the last one. When we understand how brains work well enough to build better ones, the changes to our sense of self will swamp those of prior intellectual revolutions.
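
To make von Neumann’s contrast concrete, here is a minimal, purely illustrative sketch in Python (not drawn from the essay): the serial update recomputes one unit at a time, while the parallel update recomputes every unit of a small network at once, each unit reading the same snapshot of all the others. The network size, random weights, and tanh nonlinearity are arbitrary choices made only for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8                                   # a tiny "network" of eight units
weights = rng.normal(scale=0.5, size=(n, n))
state = rng.normal(size=n)

def serial_update(state, weights):
    """One command after another: units are recomputed one at a time,
    each later unit already seeing its predecessors' new values."""
    new = state.copy()
    for i in range(n):
        new[i] = np.tanh(weights[i] @ new)
    return new

def parallel_update(state, weights):
    """Many processes at once: every unit is recomputed simultaneously
    from the same snapshot of the whole network."""
    return np.tanh(weights @ state)

print(serial_update(state, weights))
print(parallel_update(state, weights))  # the two styles already differ after one step
```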

The genome that characterizes a species emerges via a long, slow Darwinian process of natural selection. The menomes that characterize individuals also originate via a Darwinian process, but the selection is among neural circuits and occurs much more rapidly than the natural selection that drives speciation. That the brain can be understood as a self-configuring Darwinian machine, albeit one that generates outcomes in fractions of a second instead of centuries, was first appreciated in the 1950s by Peter Putnam. Though the time constants differ by orders of magnitude, Putnam’s functional model of the nervous system recognized that the essential Darwinian functions of random variation and natural selection are mirrored in the brain in processes that he called random search and relative dominance.
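
As a rough illustration of the two functions named above, the toy loop below performs random search (proposing slightly varied copies of a candidate “circuit”) and relative dominance (letting whichever variant currently scores best take over). The scoring function, step size, and vector of connection strengths are invented for the sketch; nothing here is taken from Putnam’s actual model.

```python
import random

random.seed(1)

def fitness(circuit):
    # Stand-in scoring function, chosen only so the loop has something to
    # optimize; a nervous system's real "score" would be behavioral success.
    return -sum((w - 0.7) ** 2 for w in circuit)

# An arbitrary starting "circuit": a handful of connection strengths.
best = [random.uniform(-1, 1) for _ in range(5)]

for _ in range(1000):
    # Random search: propose a slightly varied copy of the current circuit.
    candidate = [w + random.gauss(0, 0.05) for w in best]
    # Relative dominance: whichever variant scores better takes over.
    if fitness(candidate) > fitness(best):
        best = candidate

print([round(w, 2) for w in best])  # drifts toward the favored configuration
```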

In 1949, Donald O. Hebb enunciated what is now known as the “Hebb Postulate,” which states that “When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A’s efficiency, as one of the cells firing B, is increased.” Peter Putnam’s “Neural Conditioned Reflex Principle” is an alternative statement of Hebb’s postulate, expanded to include the establishment and strengthening of inhibitory or negative facilitations as well as the excitatory or positive correlations encompassed in the Hebb Postulate. The Hebb-Putnam postulate can be summed up as “Neurons that fire together wire together.”
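
Read as a learning rule, the postulate amounts to a simple weight update. The sketch below is illustrative only: the first branch strengthens the A-to-B connection when the two cells fire together (Hebb), and the second weakens it when A fires without firing B, standing in loosely for the inhibitory facilitations of Putnam’s extension. The learning rates are arbitrary.

```python
# Illustrative Hebbian update: "neurons that fire together wire together."
# w is the strength of the A -> B connection; a and b are the two cells'
# activity levels (0 = silent, 1 = firing) on a given trial.

def hebb_putnam_update(w, a, b, lr_excite=0.1, lr_inhibit=0.02):
    if a and b:            # A and B fired together: strengthen the connection (Hebb)
        w += lr_excite * a * b
    elif a and not b:      # A fired but failed to fire B: weaken it slightly
        w -= lr_inhibit    # (loose stand-in for Putnam's negative facilitation)
    return w

w = 0.5
for a, b in [(1, 1), (1, 1), (1, 0), (1, 1), (0, 1)]:
    w = hebb_putnam_update(w, a, b)
    print(round(w, 3))
```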

The reason replicating, or even simulating, brain function sounds like science fiction is that we’re used to relatively simple machines—clocks, cars, washing machines, and serial computers. But, just as certain complex, extended molecules exhibit properties that we call life, so sufficient complexity and plasticity are likely to endow neural networks with properties essentially indistinguishable from the consciousness, thought, and volition that we regard as integral to selfhood.

We shouldn’t sell machines short just because the only ones we’ve been able to build to date are “simple-minded.” When machines are as complex as our brains, and work according to the same principles, they’re very likely to be as awe-inspiring as we are, notwithstanding the fact that it will be we who’ve built them.

Who isn’t awed by the Hubble telescope or the Large Hadron Collider at CERN? These, too, are “just” machines, and they’re not even machines who think. (Here I revert to who-language. The point is that who-language and what-language work equally well. What is uncalled for is reserving who-language for humans and casting aspersions on other animals and machines as mere “whats.” With each passing decade, that distinction will fade.)

The answer to “Who am I?” at the dawn of the age of smart machines is that, for the time being, we ourselves are the best model-building machines extant. The counter-intuitive realization that the difference between us and the machines we build is a bridgeable one has been long in coming, and we owe it to the clear-sighted tough love of many pioneers, including La Mettrie, David Hume, Mark Twain, John von Neumann, Donald Hebb, Peter Putnam, Douglas Hofstadter, Pierre Baldi, Susan Blackmore, David Eagleman, and a growing corps of neuroscientists.

Yes, it’s not yet possible to build a machine that exhibits what we loosely refer to as “consciousness,” but, prior to the discovery of the genetic code, no one could imagine cellular protein factories assembling every species on the tree of life, including one species—Homo sapiens—that would explain the tree itself.

The Self Is Dead. Long Live the Superself.

The generalization of the self-concept to the superself is unlikely to meet with a reception much different from the one accorded Twain’s What Is Man?

The co-creation characteristic of the superself will be scorned as collectivism, if not socialism. Reciprocal dignity will be ridiculed as utopian. Asking “What am I?” instead of “Who am I?” will be dismissed as reductive, mechanistic, and heartless.

Although the superself incorporates the witness, and so has a religious provenance, it’s fair to ask if it will ever speak to the heart as traditional religious models have done. It’s not easy coming to terms with life as a property of inanimate matter, arranged just so, and it will likely be even more difficult to accept ourselves as extended, self-conscious, willful machines.

Many will feel that this outlook is arid and bleak, and want to know: Where’s the mystery? How about love? Doesn’t this mean that free will is an illusion? Awe and wonder and the occasional “Eureka!” may be enough for science, but religious models have offered fellowship, absolution, forgiveness, salvation, and enlightenment. People of faith will want to know what’s holy in this brave new world.

The perspectives of religion and science on selfhood, though different, are not incompatible. Without oversimplifying or mystifying either, it’s possible to identify common ground, and, going forward, a role for both traditions. I propose such a collaboration in Religion and Science: A Beautiful Friendship?

My guess is that once we’re in the presence of machines that can do what we do, the model of selfhood we’ll settle on will be even more fecund than the traditional one. That co-agency replaces individual volition will not undermine a sense of purpose, though it will require a redefinition of personal responsibility. There’s no reason to think that machines that are sophisticated enough to outperform us will evoke less wonder and reverence than organisms that have arisen via natural selection. Mystery does not attach itself exclusively to human beings. Rather, it inheres in the non-human as well as the human, in the inanimate as well as the animate. As Rabbi Abraham Heschel notes, “Awe is an intuition of the dignity of all things, a realization that things not only are what they are but also stand, however remotely, for something supreme.”

Contrary to our fears, the capacity of superselves for love, fellowship, and agency will be enlarged not diminished. As the concept of superself displaces that of individual selfhood, the brotherhood of man and its operating principle—equal dignity for all—become self-evident and self-enforcing. Nothing in this perspective bars belief in a Deity for those so inclined. Having said that, it’s implicit in this way of beholding selfhood that if there were a God, He’d want us to behave as if there weren’t. Like any good parent, He’d want to see us wean ourselves and grow up.

The superself, with its inherent co-creation and co-agency, not only transforms our relationships with each other, it also provides a new perspective on death. As mentioned, it’s arguable whether selves survive the death of the bodies in which they’re encoded. But survivability is much less problematic for superselves. Why? Because they are dispersed and so, like the Internet that was designed to survive nuclear war, provide a more redundant and robust defense against extinction. As William Blake noted two centuries ago:

The generations of men run on in the tide of Time,
But leave their destin’d lineaments permanent for ever and ever.

In the same sense that the soul is deemed to survive the death of the individual, the wenome survives the disintegration of the body and the mind. The absence of a particular individual, as defined by a unique genome and menome, puts hardly a dent in the wenome. The building blocks of superselfhood can be thought of as genes, memes, and wemes. All three encodings are subject to evolutionary pressure.

Although some may feel this reformulation of selfhood asks them to give away the store, it will gradually become apparent that it’s only the storefront that requires a do-over. To give up stand-alone selfhood in exchange for an open-ended leadership role in cosmic evolution is a trade-off that many will find attractive.

As Norbert Wiener, the Father of Cybernetics, wrote in 1949:

We can be humble and live a good life with the aid of machines, or we can be arrogant and die.

Robert W. Fuller is an author and independent scholar from Berkeley, CA. His most recent book is The Rowan Tree: A Novel.

What Is Man?

This is the fifth post in the series Why Everything You Know about Your “Self” Is Wrong. The series explores how our understanding of selfhood affects our sense of individuality, our interpersonal relationships, and our politics.

“What Is Man?” is the title of a little book by Mark Twain. He held it back for twenty years because he knew the public would hate it. The “what” in the title foreshadows its discomfiting message.

Twain broke with the tradition of asking “Who Am I?” and its species-wide variant “Who Is Man?” on the grounds that a “who-question” is a leading question. It predisposes us to expect the answer to be a sentient being, not unlike ourselves, “whom” we’re trying to identify.

Twain’s answer was that Man is a machine, and he was right about the public reception accorded his thesis: the twentieth century was no more ready for Mark Twain’s mechanistic perspective than the eighteenth had been for Julien Offray de La Mettrie’s metaphor of “Machine Man.”

The rejection accorded the works of La Mettrie and Twain is not surprising, because it’s implicit in our idea of a machine that at least experts understand how it works. Only in the twentieth century did science gain an understanding of the body, and we’re just beginning to understand the workings of the mind. Twain’s trepidation in anticipation of public scorn is reminiscent of Darwin’s procrastination in publishing his theory of evolution, with its shocking implication that we were descended from apes.

At the dawn of the twenty-first century, Twain’s answer is no more popular than it was with his contemporaries. But recent research has produced a growing awareness that Mark Twain, while he may have been a killjoy, was, as usual, ahead of his time.

Twentieth-century science has shown that humans, like other animals, function according to the same principles as the cosmos and everything in it. The Hindu seers who proclaimed, “I Am That” were onto something. Man does not stand apart from the rest of the cosmos. He is made of the same stuff and governed by the same laws as everything else. The gap between “I” and “That” does indeed seem to be narrowing.

As curmudgeons like Twain have delighted in pointing out, Man is in fact quite unexceptional. We do not live at the center of the universe: Copernicus and Galileo pointed out that it does not revolve around us. Humans are just one of many animals: Darwin, Wallace, and others placed us, kicking and screaming, in the company of apes. But, having eaten several servings of humble pie, surely no one will take it amiss if we allow ourselves one small brag.

Although not exceptional in ways we once believed, we are exceptionally good at building tools and machines. And that includes machines that do what we do. Machines that dig, sow, and reap. Machines that kill and machines that save lives. Machines that calculate, and, projecting, machines who think. Our brains will soon be viewed as improvable, constrained as they were by the stringent conditions of self-emergence via natural selection, gestation in a uterus, and birth through a baby-sized aperture in the pelvis.

No higher intelligence seems required to create life, including human life. What we revere as life is “just” a property of a handful of chemicals, RNA and DNA holding pride of place among them. But, that’s not a bad thing, because if we’ve come this far without intelligent design, the sky’s the limit once we lend our own inventiveness to the evolutionary process.

This has long been foreseen, but never accepted. Once we get used to it, this perspective will enable us to reduce suffering on a scale only dreamt of. Why? Because the lion’s share of human suffering can be traced to false self-conceptions. The indignities that foul human relationships, at every level, from interpersonal to international, stem from a model of autonomous selfhood in which self is pitted against self.

Rather than masking the indissoluble interconnectedness of selves—as the notion of individual selfhood does—superselfhood embraces it. It’s not just that we can’t do anything without help; we can’t even be apart from continual imitation. Entropic forces disintegrate any identity that is not shored up through a mimetic process of mutual recognition. Since mimesis is distorted and undermined by indignity, reciprocal dignity gradually, but ineluctably, displaces opportunistic predation as a strategy for optimizing group efficiency and productivity. As a source of inefficiency, malrecognition—with all its attendant dysfunctionality—will be rooted out much as we made it our business to combat malnutrition once we understood its toll.

Martin Luther King, Jr. gave expression to this emergent morality when he wrote: “The arc of the moral universe is long, but it bends toward justice.”

The Superself: Genome, Menome, and Wenome

[This is the fourth post in the series Why Everything You Know about Your “Self” Is Wrong. The series explores how our understanding of selfhood affects our sense of individuality, our interpersonal relationships, and our politics.]

Why are you unhappy?
Because 99.9 percent
Of everything you think,
And of everything you do,
Is for yourself—
And there isn’t one.
— Wei Wu Wei

The Superself: Genome, Menome, and Wenome

The ‘illusion of individuality’ operates at two levels: First, the universal level of the ego, the constructed “I” which is a developmental imperative in the early years, yet the object of unlearning in later years in many traditions (particularly Buddhist); and the other, socio-economic, level of western individualism. Most of us, most of the time, cannot comprehend the implications of communal culture to human wellbeing.
— David Adair

To recap, the genome is the blueprint for our physical body. The menome is the connectome of the nervous system. By analogy, the wenome is the connectome of everything else, most importantly the cultural web of personal and social relationships in which we’re immersed and entangled. The wenome comprises the rules, customs, rituals, manners, images, tunes, songs, languages, laws, constitutions, and institutions that define the culture by which our genome and menome are conditioned.

In this view, our selves are far more extensive than we’ve been led to believe. They extend beyond our own bodies to include what we think of as other selves and the world. We live in the minds of others, and they in ours.

The situation is analogous to memory. We think of our memories as located in our heads and bodies but when you drive to town, it’s the road that holds the memory of the route, reminding you at every turn how to proceed.

So, too, is selfhood dispersed. It resides not only in the genome and the menome, but in the wenome. Much of the information we require in order to function is stored outside our bodies and brains—in other brains, books, maps, machines, objects, databases, the Internet, and the cloud. We’re dependent on these inputs to muster enough excitation to reach the threshold of emission of specific behaviors. Our genome and menome cannot form in the absence of other genomes and menomes. The self does not stand alone, but rather is widely dispersed, encompassing, most immediately, our social milieu, and ultimately the entire cosmos.

As the illusory nature of autonomous selfhood becomes evident, and the full extent of the interdependence of selves becomes undeniable, our sense of selfhood will shift outward, from the limited identifications of the past to an amalgamation of these traditional facets of selfhood—the superself.

Recognition and Malrecognition

As mentioned, an inability to recruit recognition from others cripples an identity. That’s why solitary confinement is torture. Recognition is to the formation of identity as nutrition is to the building of the body. Put the other way round, malrecognition, like malnutrition, is injurious, and can be fatal. Think of the juvenile murderer sentenced to life in prison, or of orphans whose development is stunted for lack of an adult model. On the plus side, consider the benefits to children who grow up in the company of curious, creative adults.

In acknowledgement of the analogy between programming a computer and raising a child, both processes are described as culminating in a launch. In the world of computers, “failure to launch” betrays a bug in the software that crashes the computer. In raising children, failure to launch reveals that an embryonic identity has not found a niche in which it can garner enough recognition to develop. As nutritional deficiencies limit physical development, recognition deficiencies cripple identity formation. We became aware of the terrible costs of malnutrition in the twentieth century. The twenty-first will witness an analogous awakening to the crippling effects of malrecognition.

To address the epidemic of malrecognition that now afflicts humankind, it helps to shift our vantage point from within to without, from subjective to objective, from introspection to inspection. If we interpret the menome as software that is continually being modified, then we can debug, patch, and rewrite it until the “program” no longer crashes the “computer.”

If this seems reductive and mechanistic, recall that before we understood the heart was a pump made of muscle, it was regarded as the seat of the soul. It’s hard to imagine surgery to the soul, but the muscle that pumps our blood is now routinely repaired. In that spirit, the mind can be viewed as a kind of computer (albeit one we are just now beginning to understand).

We balked at the seeming loss of the exceptional status implicit in Darwin’s theory of evolution, but eventually made peace with the incontrovertible fact of our simian ancestry. We shall follow the same arc as we come to see our selves as holders of an historic role in the lineage of ever-smarter machines, to wit, the role of building machines that are smarter than we ourselves are! This could be the final step in achieving a humility consonant with our actual place in the cosmos. There’s no better preparation for facing such an apparent comedown than to revisit a question posed by Mark Twain—What Is Man?—and we’ll do that in the next and penultimate post in this series.

“Self” Is a Misnomer

[This is the third post in the series Why Everything You Know about Your “Self” Is Wrong. The series explores how our understanding of selfhood affects our sense of individuality, our interpersonal relationships, and our politics.]

As suggested in the two preceding posts in this series, selfhood was on the ropes even before postmodernism delivered the knockout blow.

Postmodernism’s Coup de Grace to the Self

Humpty Dumpty sat on a wall,
Humpty Dumpty had a great fall.
All the king’s horses and all the king’s men
Couldn’t put Humpty together again.

In recent decades, deconstructing selfhood has become a cottage industry (with headquarters in Paris). The “fall” that postmodernism has inflicted on our commonsense notion of selfhood is as irreversible as Humpty Dumpty’s. Three examples follow:

While acknowledging that the philosopher David Hume scooped him by centuries, the novelist John Barth points out that the person who did things under his name decades ago seems like a Martian to him now:

How glibly I deploy even such a fishy fiction as the pronoun I, as if–although more than half of the cells of my physical body replace themselves in the time it takes me to write one book, and I’ve forgotten much more than I remember about my childhood, and the fellow who did things under my name forty years ago seems as alien to me now in many ways as an extraterrestrial–as if despite those considerations there really is an apprehensible antecedent to the first person singular. It is a far-fetched fiction indeed, as David Hume pointed out 250 years ago.
–John Barth

The novelist Milan Kundera exposes the common fallacy that the self can be detached from its unique history. Read Kundera’s comment and you’ll never again hear yourself saying, “If I were you…” without realizing that the premise can never be met, so the only proper recipient of your advice is yourself.

Who has not sometimes wondered: suppose I had been born somewhere else, in another country, in another time, what would my life have been? The question contains within it one of mankind’s most widespread illusions, the illusion that brings us to consider our life situation a mere stage set, a contingent, interchangeable circumstance through which moves our autonomous, continuing “self.” Ah, how fine it is to imagine our other lives, a dozen possible other lives! But enough daydreaming! We are all hopelessly riveted to the date and place of our birth. Our “self” is inconceivable outside the particular, unique situation of our life; it is only comprehensible in and through that situation.
–Milan Kundera

Theater critic John Lahr observes that selfhood is a confabulation dependent on the agreement of others.

The ‘I’ that we confidently broadcast to the world is a fiction–a jerry-built container for the volatile unconscious elements that divide and confound us. In this sense, personal history and public history share the same dynamic principle: both are fables agreed upon.
–John Lahr

The glue that holds the “jerry-built” identity together is recognition; the cement that fortifies it against disintegration is agreement. I’ll return shortly to the indispensable part played by other selves in the creation and maintenance of our own.

“Self” Is a Misnomer

The very name–self–is a misnomer, and it’s a whopper. How so?

At the beginning of the twentieth century, Charles Cooley observed that “We live in the minds of others without knowing it.” If we live in others’ minds, surely others live in ours.

The word “self” carries strong connotations of autonomy, individuality, and self-sufficiency. It’s as if it were chosen to mask our interdependence. It’s hardly an exaggeration to say that in buying into this notion of selfhood, humankind got off on the wrong foot.

The self does not stand alone; it is not a thing, let alone a thing in itself. Rather, we experience selfhood as a renewable capacity to construct and field identities. Like evanescent particles in a cloud chamber, the existence of the self is inferred from its byproducts.

The “self” may appear to act alone but it depends on input from other selves to manifest agency. There’s more to selfhood than our genome and our menome. We’ve overlooked a crucial element of selfhood–inputs from other selves–without which the menome, starved for recognition, is stillborn.

As our genome needs nutrition to build our body, so our menome depends on recognition from others to create and husband a viable identity. The autonomous self and individual agency are both illusory. Contrary to the name we call it by, the self is anything but self-sufficient.

The Co-Creation of Identity

To exist is to coexist.
–Gabriel Marcel

As Cooley and others have pointed out, we may first recognize our own nascent identity as what someone else–a parent, teacher, or friend–sees taking shape within us. One of the primary responsibilities of parents is the incubation of identity in the next generation. No wonder we love our parents and teachers: it is they who have coaxed our starter self onto the world stage and indicated a niche where it might thrive.

As collaborators in the formation of others’ identities, we repay the debt we owe those who, by reflecting an incipient identity back to us, served as midwife to our own.

Perhaps because they sense the creeping disintegration of their story, the elderly often feel the need to rehearse it. Listening to them recount their anecdotes is an act of compassion. Those who lend us their ears are involved not only in the creation of the identity that serves as our face to the world, but also in its maintenance. Personas, like magnetic poles, are not created, nor do they endure, in isolation.

The discovery of the profound interdependence of selves obviously has a bearing on our relationships. In the following posts, I’ll explore the implications of the co-creation of each other’s selves.

Am I a Home for Identities?

[This is the second post in the series Why Everything You Know about Your “Self” Is Wrong. The series explores how our understanding of selfhood affects our sense of individuality, our interpersonal relationships, and our politics.]

In the first post in this series, we disentangled the notion of selfhood from the body, the mind, and the witness. Another common mistake is to identify a current identity as our “real” self. With age, most people realize that they are not the face they present to the world, not even the superposition of the various identities they’ve assumed over the course of their lifetime.

By my late thirties, I had accumulated enough personal history to see that I had presented several quite different Bobs to the world. Principal among my serial identities were student, teacher, and educator. Alongside these occupational personas were the familial ones of son, husband, and father. As Shakespeare famously noted:

All the world’s a stage,
And all the men and women merely players:
They have their exits and their entrances;
And one man in his time plays many parts …

Like many an Eastern sage, Shakespeare saw that we assume a series of parts while at the same time watching over ourselves as if we’re a member of the audience. That is, we both live our lives and, at the same time, witness our selves doing so. We don’t stop there: we even witness ourselves witnessing.

We know that our current persona will eventually give way to another. In contrast, the self ages little, perhaps because it partakes of the detached agelessness of the witness.

Distinct identities are strung together on the thread of memory, all of them provisional and perishable. No less fascinating than the birth, life, and death of our bodies are the births, lives, and deaths of these makeshift, transient identities. Reincarnation of the body is arguable; metamorphosis of identity is not.

The witness’s detachment facilitates the letting go of elements of identity in response to changing circumstances. As we age, the feeling that life is a battle is gradually replaced with the sense that it’s a game played with a shifting set of allies and opponents who, upon closer examination, are unmasked as collaborators. Without opposition, we might never notice the partiality and blind spots inherent in our unique vantage point.

The more flexible, forgiving attitude that results when we see our self as a home for transient identities turns out to be the perspective we need to maintain our dignity in adversity and accord it to others in theirs. Former antagonists—which may include colleagues, spouses, and parents—come to be seen as essential participants in our development, and we in theirs.

To keep an identity in working order, we continually emend and burnish it, principally by telling and retelling our story to ourselves and anyone who’ll listen. Occasionally, our narrative is revised in a top-to-bottom reformulation that in science would be called a paradigm shift. Though most incremental changes are too small and gradual to be noticed over months or even years, they add up, and suddenly, often in conjunction with a change in job, health, or relationship, we may come to see ourselves quite differently, revise our grand narrative, and present a new face to the world. Whole professions—therapy, coaching, counseling—have grown up to help people weather such identity crises.

It is tempting to think of the self as simply a home for the identities we adopt over our lifetime, but on reflection, this, too, falls short. Our self is also the source of the identities that sally forth as our proxies. That is, we experience the self as more than a retirement home for former identities; it’s also the laboratory in which they’re minted, tested, and from which they step onto the stage. One can think of the self as a crucible for identity formation.

Before examining this process, we consider two more candidates for the mantle of selfhood: the soul and pure consciousness.

Am I My Soul?

If selfhood, as currently understood, has a shortcoming, it’s its mortality. We grudgingly accept physical aging, but who has not balked at the idea of the apparent extinction of his or her self upon physical death? Alas, our precious but nebulous self—whatever it may be—appears to expire with the demise of our body.

To mitigate this bleak prospect, many religions postulate the existence of an immortal soul, and go on to identify self with soul. After we’ve clarified the concept of selfhood, we’ll discover that, even without hypothesizing an immortal soul, death loses some of its finality and its sting.

Am I Consciousness?

A last redoubt for the self as we’ve known it is to identify it as pure, empty consciousness. But what exactly is consciousness? Arguments run on about whether animals have it, and if so how much, without ever clarifying what consciousness is. Moreover, identifying one’s self as pure consciousness is just another identification, namely that of systematically dis-identifying with everything else.

Even if you don’t find pure, empty consciousness a bit spare or monotonous, there’s another problem with equating it with selfhood. Whatever it may be, stripped-down consciousness is deficient in agency, and agency—that is, not just being, but doing—is inextricably connected to selfhood because mentation does not occur apart from its potential to actualize behavior. To think is to rehearse action without triggering it. Thought involves the excitation of motor neurons, but below the threshold at which the actions those neurons drive would be emitted. In computer parlance, thought is virtual behavior.
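
The claim that thought is subthreshold rehearsal can be caricatured in a few lines of Python. The threshold value and return strings are arbitrary; the only point is that the same kind of excitation either remains “virtual” or spills over into overt behavior, depending on whether it crosses a firing threshold.

```python
THRESHOLD = 1.0  # arbitrary firing threshold for a motor unit

def motor_unit(excitation):
    """Return an overt action, or None if the excitation stays subthreshold."""
    if excitation >= THRESHOLD:
        return "movement emitted"
    return None  # subthreshold activity: rehearsal without action

print(motor_unit(0.6))  # None -> "thought": the motor circuit is primed, nothing moves
print(motor_unit(1.4))  # "movement emitted" -> the same circuit, pushed over threshold
```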

In the next post, I’ll bring in the postmodern perspective, which will complete the deconstruction of naïve selfhood, and set the stage for a self that’s congruent with the findings of both traditional introspection and contemporary neuroscience.