Sunday, January 12, 2014
© Charles D. Hayes
If you doubt that we’re born to deny reality, you’re actually proving the point. The evidence is indisputable that we human beings have built-in reality buffers. We smoke, drink, overeat, waste resources, and engage in every possible kind of risk-taking activity, oblivious to or disregarding the likely results of our actions.
At the core of our tendency to deny reality is the barefaced inevitability of our own death. Unless we are threatened with imminent annihilation or given a short time to live, we are predisposed to perceive the future as something open-ended and unlimited, regardless of our age. We are loath to admit that our existence is finite.
Some of us are so sensitive about the subject of death that people or practices that appear to be different from the familiar give us pause. We reject otherness, change, and uncertainty because they represent the possibility of our demise. Thousands of religious belief systems exist throughout the world, and yet the adherents within each of them resolutely believe that theirs is the only correct worldview. Similarly, conspiracy theorists prefer to believe in string-pulling manipulation by powerful forces rather than accept the frightening prospect that no one is in control.
A recent entry in the study of our pronounced capacity for self-deception is Denial: Self-Deception, False Beliefs, and the Origins of the Human Mind by Ajit Varki and Danny Brower. The authors contend that the ability to deny reality is the very psychological mechanism that made our survival possible and that optimism is itself a strategy of denial. Physician and writer Abraham Verghese has called this "the most exciting idea in evolution since Darwin." Yes, it's exciting, but it's certainly not new.
Back in 1974, Ernest Becker won the Pulitzer Prize for The Denial of Death, an examination of our propensity for self-deception about our own mortality. And before the ideas of Richard Dawkins, Sam Harris, Christopher Hitchens, and Daniel Dennett gained prominence, we had the work of John F. Schumaker. His Wings of Illusion and The Corruption of Reality took the subject of belief and self-deception to points that other theorists are just now beginning to discover.
Looking deeply into our existential predicament is a sobering experience. Our sun is a second-rate star in a modest galaxy, where no one thing or location can be deemed more important than any other—with the exception of those upon whose light and gravity we depend. The earth is hurtling through space at thousands of miles per hour and appears to be headed nowhere in particular.
The same perspective applies to our lives as individuals. We represent an amalgamation of biology, culture, time, and place, with no particular significance attributable to any of these components. The only thing that is special about any of us is our uniqueness with regard to others, which is only a matter of degree.
We come into the world with biological predispositions, and we absorb cultural biases and beliefs as readily as plants photosynthesize sunlight. These factors make it impossible for any single individual or group to claim title to precisely the right place to be, the right things to believe, or the right things to do—although you would never know it by the proliferation of pretense all around us.
Dig deep enough into our ontological dilemma and the evidence of cosmic chaos is overwhelming. In the face of it, people find comfort in an illusion of permanency, which seems highly preferable to any objective recognition of how much our lives are influenced by chance. In a universe where disorder rules, our lives amount to nothing more than a posture we assume, and yet, as individuals we feel that our lives represent the ground zero of meaningful experience. In one sense, what we do means nothing, but in another sense, it can mean everything.
In his book An Appetite for Wonder, scientist Richard Dawkins writes about how, because of timing, something as simple as a sneeze can have a domino effect on the future. A personal example brought this home to me recently. I intended to call a friend one day, but I didn’t. Some hours later, that friend was killed in a traffic accident. Now, I’m reasonably sure that if I had phoned him as I’d intended, he would still be alive because he would not have been at the intersection at the moment the accident happened. A matter of a few seconds would have changed the outcome.
Imagine how different life might be for us now if, in November of 1963, President Kennedy's motorcade had not driven through Dealey Plaza in Dallas. We can speculate ad nauseam, but trying to mentally reverse past events is both futile and counterproductive. If I had called my friend as planned, it might indeed have changed the course of his day and averted the accident. When we begin to reason like this, however, the questions persist. Was it, for instance, the last conversation I did have with him that somehow set him up for his misfortune? Such lines of thinking are seductive, but they always reach a dead end and encourage magical thinking.
I'm not in any way suggesting that we are responsible for unknowable future events; only in hindsight can events appear inevitable, and the present is rife with chaotic possibilities. On the contrary, thinking through hypothetical situations can help inoculate us against the comforting illusions that shelter us from seeing just how precariously our lives depend upon luck.
My sense is that everything does happen for any number of reasons, but nothing can happen in the lives of human beings that cannot be altered by chance. We are bound together in a chain of chaotic events so seamlessly connected that they appear tranquil right up to the moment when reality crashes the party. By design, our brains impose a sense of order on a world driven by mayhem.
Subjectivity is the substance we are made of. Our worldviews represent our social bonds steeped in emotional experience. Our mortal fears surface when our beliefs are seriously questioned, because the process threatens to raise a window on reality that most of us would prefer stayed closed. Yet, in a cosmic wink, we will all be gone, centuries will pass, and what is commonly believed today will someday be thought quaint if not absurd.
Ecologists tell us that a sustainable human population on our planet is somewhere in the neighborhood of 1.5 billion people, yet by the end of this century the world's population is projected to reach roughly seven times that number. This statement alone should remove any doubt about our being deniers of reality.
John F. Schumaker says we need to determine an optimal level of reality distortion that won’t exact the price of civilization. In his words, "The impossible challenge is to face the truth without panic, to derive all meaning from where we are and what we are." Illusions aside, this is all we’ve got.
It’s easy to appreciate how illusions have helped us survive. Evolution equipped us for self-deception in part so that we would readily take risks without calculating our chances of success. Obviously this approach has worked.
In centuries past, illusions have aided our survival, but now we’re speeding forward without questioning our assumptions. Because of our burgeoning numbers, the future, if we are to have one, demands that we trade our illusions for objectivity. What helped us thrive as a species in the distant past now threatens our very existence.
My Books and Essays on Amazon
New Fiction: The Call of Mortality
My Other Blog
Follow me on Twitter @CDHWasilla
Friday, August 16, 2013
Educational philosopher Robert Hutchins championed the life of the mind, which he encouraged through the study of literature and ongoing dialog with learned peers. He called this "The Great Conversation" and published a book by that name in 1952. I credit Hutchins’ work, and particularly the Great Books series that he fostered, with motivating me to embark upon my pursuit of self-education and to continue the course of lifelong learning that I follow to this day.
Key to developing a life of the mind and learning to think for oneself is the notion of ongoing dialog. Hutchins placed great value on exposing oneself to multiple points of view in order to formulate an opinion of one’s own. Real learning happens not when we swallow whole what someone else believes, but rather when we work through an issue in our own minds.
What has disturbed me about public discourse today, especially in social media, is my sense that it mostly serves as an echo chamber. No one seems to talk reasonably with anyone who has divergent ideas. Instead, disagreements become shouting matches with voices expressed in text, in all caps, and with character bursts too short for satisfactory explanation. Lately, though, I’ve had to admit that beneath the media hype about polarized camps, many people are engaged in meaningful dialog and minds are changing. The growing support for gay marriage proves the point.
For months my email signature contained the assertion that "life is too short to text and too important to tweet." My nephew, however, has convinced me to think of Twitter as a news source, a library index card, or a subject header that opens an opportunity for conversation and further learning. It gives me the ability to delve deeper into a subject when it points me to material that I otherwise would not know about. So, much to my own surprise, I’ve now removed that statement from my signature and have activated a Twitter account.
From the beginning, my skepticism about Twitter stemmed from the current mania for the brevity of bulleted lists and the admonition that everything worth reading must come up front, as if introductions and first chapters in books were the only text worthy of one's time. In far too many cases, the "keep it short, keep it simple, put it up front" practice, combined with texting and tweeting, appears to have resulted in a lack of the depth necessary for basic comprehension. To my mind, it's essential to know why and how a person has reached whatever conclusion is put forth.
The Great Books program advanced by Robert Hutchins offered an approach to the humanities through the exploration of our finest literature, a method that today is largely marginalized. These days, advocates for the humanities and a liberal education increasingly attempt to make their case by beating around the bush. Too often they fail to explain why we should value the humanities. And yet, the argument desperately needs to be made because the humanities contain seeds of goodwill that are capable of turning red and blue states purple.
The human condition is the conundrum at the crux of civilization. Our respective cultures provide a barrage of edicts about what we are to do in life with too little regard for how we are to cope with the inescapable anxiety that comes with our fragile existence. This makes us egregiously vulnerable to political manipulation.
Too many of us are ill-equipped to manage the conditions we find ourselves in, unless we have learned enough about our species to deal with our conscious and subconscious angst about our own inevitable demise. We come pre-programmed with a clash-driven political nature that is due, in part, to our split-brain architecture, which enables us to compartmentalize conflicting information. As a result, we tend to readily hate those whom we view as different based upon the flimsiest of criteria.
The promise of faith helps some people by offering the assurance that if you believe this, there is nothing to worry about. For others, though, the fact that their belief system is not universal, in and of itself, is cause for the kind of contempt that routinely ferments into a hatred of nonbelievers.
If one is born into a poorly educated culture, the future portends a life of poverty and a worldview filled with scorn and social paranoia. Taken further, if one's culture is ignorant and socially oppressive, a life of political zealotry may prove to be irresistible and scapegoats will be enthusiastically provided for persecution.
We can't put our own lives in perspective without developing the ability to envision ourselves in a global context. We must also realize that our propensity to view our own respective cultures as naturally superior to all others is a primeval short circuit that fosters ridicule and disdain. Ultimately it can lead to self-destruction.
Life is an existential dilemma, and it is subjective to the nth degree. For each of us, life is an unsolvable mystery, and yet the pursuit of the puzzle can provide enough existential relief to make living pleasurable. Learning continuously along the way leaves plenty of room for those who view the world differently.
We human beings need to know all we can about being human, especially about other cultures with differing customs, traditions, and worldviews. Knowledge of many subjects we consider electives, such as psychology, anthropology, and sociology, is desperately needed by all citizens. Promoting that greater level of understanding could dispel the needless social anxiety that is born of ignorance and perpetuated by our tribal penchant for exclusiveness.
Today the amount of time that people waste hating others because of absurd misunderstandings is astounding. The more we learn about the outer world, the larger our inner world becomes. We are less threatened by things we don't understand because we know that the process of trying to understand can relieve us of needless anxiety.
I have joined Twitter with the goal of following not only those with whom I agree but also those with whom I don’t agree—especially those whose messages I think are destructive. The point of entering into a public conversation, in my view, is not to enhance one's career, social standing, and earning potential, or to live a life of ease made possible by magical software. It is, instead, an aspiration to experience the kind of life that transcends our respective cultures, a life with the independence of mind to determine value without coercion and to develop our sense of humanity, regardless of which culture we were born into.
No doubt this is a tall order, and it may indeed be overly idealistic. Still, I think it’s clear that only in hindsight can we judge the effectiveness of something we would characterize as a great conversation. Steven Pinker’s The Better Angels of Our Nature offers a compelling argument that over the long term, we are making moral progress. All we have to do now is speed it up. Twitter has crossover potential. Maybe it can help. Follow me on Twitter: @CDHWasilla
Saturday, May 18, 2013
In 2011, Stephanie Coontz published A Strange Stirring, a book about the status of women at the dawn of the 1960s. Even though I lived through those times as an adult, the memories Coontz brought to mind were shocking. Fifty years since that era, gender inequality still exists, especially when it comes to employment compensation, but the fact that today many of the cultural assumptions of the ’60s seem far afield is a sign of genuine progress.
Now comes the movie 42, about the life of Jackie Robinson, the first African-American player of the modern era to break into the major leagues. When I try to make sense of my reaction to the film, the feelings it evokes are much deeper and much more appalling than those I felt reading Coontz. Although I was very young in the 1940s, I lived through those times, too, and I have vivid memories of the racism that reigned uncensored in our society.
I grew up in a racist region of the country, in a racist community and, I'm ashamed to say, in a racist family. Racism in those days might as well have been in our drinking water. Assumptions of white superiority were simply taken as gospel truth. Children parroted the same bigoted notions in public as those spouted from the mouths of their ill-educated parents. In hindsight, I can see that our mind-set amounted to a common form of malignant arrogance that grows by feeding on itself as it binds one group together against another.
The dialog in 42 brought to memory old conversations that, if heard today, would be considered astounding, even, I suspect, in the Deep South. Watching actor Chadwick Boseman portraying Jackie Robinson at bat, trying to concentrate while a white baseball manager shouts racial epithets, makes you want to crawl under your seat, only because you can't get your hands on the offender.
The one thing that troubles me about the film, which I thought was very well acted, is my suspicion that young people accustomed to speedy media solutions will come away with the impression that acceptance of Robinson into the white world of sports was something that occurred rather quickly, perhaps after just a few winning games. The truth, however, was far different. Another couple of decades of overt racial hostility would follow before the Civil Rights era even began to take hold.
What 42 makes crystal clear is how shallow and superficial the strain of contempt is that enables and sustains racism as prejudice is handed down from one generation to the next. The process is born of fear, misunderstanding, hearsay, innuendo, inarticulate chit-chat, frustrated exasperation, and just plain old stupidity. Unsupported nonsense derived from stereotypes passes from one person to another, while scrutiny of what is said is waived on the strength of the relationship. In other words, if one's friend or family member said it, then it must be true, and it will be defended by virtue of group loyalty, even if it has no validity.
As I watched 42, it occurred to me that the whole ethos of identity politics depends upon half-truths spoken under stress and driven by the existential angst that comes with the human condition. It's really that simple. When identity is the most important theme at hand, nothing holds it together quite as well as old-fashioned belligerence expressed through inarticulate gestures of discrimination aimed at elevating one group at the expense of another.
Stupidity is a bonding element, and outright hatred is the greatest unifier of all. Thus, whatever we bring together in communal disinformation must be defended with much larger doses of deceit because nothing short of outright lies will make sense. Watch 42, listen to the white baseball manager's racial rant, and you will see what I mean.
The lesson to be learned from collective stupidity is how to spot it, how to arrest its propagation by taking a time-out as in sports, and how to maintain an intellectual default toward demanding more information. Classic examples of popular ignorance as a bonding substance are the millions of emails sent daily with the intent of binding one's group by alienating another. The remedy here is simple. When you receive one of these malicious messages, ask the sender to stop.
The identity of the other changes over time, but the methods remain the same; the only variable is the degree of vitriol. The same strain of identity culture that in one era focuses on racial hatred is a versatile social conduit that can be turned toward homophobia, anti-immigrant sentiment, gun-rights absolutism, the facile notion of imaginary superiority propounded by Ayn Rand's John Galt wannabes, or any sort of distinctiveness revered by one's identity group.
The point is simple and yet profound: as long as large groups of Americans rely on their sense of identity to further their political interests, no one need bother with factual matters, because facts simply don't count.
In Moneyball, another baseball movie, Brad Pitt, playing general manager Billy Beane, describes the game to his players by saying, "It's a process, it's a process." Indeed it is, and so is the transmission of popular culture: it's a process, especially the way we pass it on. If we can shut down the anti-intellectual aspect of that process, the part based solely on who we think we are, and choose to stop demonizing others, we can begin to live as if reality matters more than identity. Once that happens, there is a possibility of achieving a livable democracy. If not, I fear the chance is lost.
Unless people can recognize their tendency to demonize those who seem other and are willing to correct the behavior, they can watch a movie like 42, sympathize with Jackie Robinson, and go right back the next day to spewing racial hatred in a milder form so as to keep their identity intact. In a few days, the whole lesson will be forgotten. This is why it takes generations to change what should actually happen in the time it takes for a television commercial to play. Progress will come only when honesty can trump identity.
Saturday, April 6, 2013
Instantaneous annihilation by a massive object from space seems like a merciful death compared to losing oneself day by day, moment by moment, in the passageways of one's own mind. Okay, it's not an asteroid, but what's coming is just as bad, if not worse. I'm talking, of course, about Alzheimer's disease, and for more than five million people the asteroid analogy is too late; it has already struck with a vengeance. The result is nearly $200 billion a year in medical expenses, with more than 15 million people acting as unpaid caregivers. If we felt the full impact of these conditions at once, instead of gradually over many years, the collective gasp of anguish would drown out most other concerns.
Alzheimer's brings with it a gift of guilt that keeps on giving, because there are no satisfactory solutions. If you take care of family members with the disease, you feel guilty. If you find them a great place to be cared for, you feel guilty. It's what this disease does to your family members that causes guilt to follow your every decision because, no matter which option you choose, things always get worse. In terms of the cost of stress, Alzheimer's is off the charts for both victims and caregivers. Both are wounded. Caregivers live shorter lives because of the emotional toll. My grandmother took care of my grandfather at home, and the cost to her own health was enormous.
My asteroid metaphor is especially apt because of what our demographics tell us is to come. By 2050 our gray asteroid grows from $200 billion to an estimated trillion dollars or more, and the cost in individual anguish escalates by orders of magnitude that we are barely capable of imagining. If this disease were something a company could be held liable for causing, the punitive damages in a court settlement would likely be an amount too great for us to comprehend, because not even our government counts that high.
Now, consider what steps we would be taking if we were dealing with a real asteroid whose orbit would, at some future date, bring it into direct contact with the earth. Our efforts, of course, would depend to a degree on its size. An object large enough to be considered a planet killer would be viewed differently from one that threatened only local destruction in the immediate vicinity of impact.
Alzheimer's by any measure is a seismic global event. Moreover, it's only one disease that increasingly affects an aging population. There are many kinds of dementia that are hard to distinguish from one another, as well as some medical treatments that actually cause symptoms that mimic Alzheimer's. A few years after my grandfather’s death at the age of 92, I learned that a conflict in his medications was very likely responsible for at least some, and perhaps all, of his dementia. And I have no reason to doubt that many aging people today who are under the care of more than one physician are still given conflicting medications because of the pace and economics of medical practice in America.
In his book The World Until Yesterday: What Can We Learn from Traditional Societies, Jared Diamond writes about visiting a village on the Fijian island of Viti Levu. While he was there, an islander accused him of being from a country where we throw our old people away, referring to the fact that we often put our family members in retirement or nursing homes. Diamond also tells us about cultures where the old are killed or abandoned as a matter of what is considered economic necessity.
As for Americans, he writes, "Care for the elderly goes against all those interwoven American values of independence, individualism, self-reliance, and privacy." I suspect that it is no small part of this ethos that adds to our guilt, no matter what actions we take with our aging parents and relatives. Guilt is often an overt expression of the exasperation that comes from feelings of utter helplessness.
And yet, every time we try to have a serious public discussion about the end of life, there is a chorus of political vitriol about death panels. Speaking only for myself, I would rather die a violent and painful death than be among those I could only identify as strangers, while being angry, confused, and existentially lost for what could amount to years somewhere in the shadowy corridors of my own mind.
Long before we experience the full effect of the gray asteroid of 2050, we need to find a way to let individuals decide for themselves whether they want to end their lives under medical supervision when their minds are gone and there is absolutely no hope for recovery. I suspect if the Fiji Islanders knew more about our society, they would declare that we often show more compassion for our pets than our old people.
The statistics are truly frightening. One in eight people over age 65 has Alzheimer's, and nearly half of us who reach the age of 85 will suffer its ravages. There are some hopeful signs in medical research for ways to fight Alzheimer's, but nothing close to a cure or prevention as of yet, and the asteroid gets closer every day.
The Obama administration has set an ambitious goal of achieving an effective treatment for Alzheimer's disease by 2025. Its budget, unfortunately, doesn't measure up to the gravity of the challenge, and if it tries to invest more money in the effort, we can surely expect more filibusters on the horizon.
In the meantime, if I get Alzheimer's, I would rather that the money required to keep my body alive go instead to looking out for young people. How about you?
Sunday, January 27, 2013
Picture a young man who’s born and raised in the post-war South, trained in the Marines, and steeped in the ideological culture of Texas law enforcement. That’s who I was in the early 1960s. Like millions of others, I had internalized the popular ideas of my geographic region, which imbued me with a xenophobic and racist worldview as the one true window on reality. I was up to my neck in mainstream indifference. It would be another decade before I embarked on the process of self-education that would enable me to begin awakening intellectually.
Mainstream indifference is a form of ignorance born of inattention and apathy. Depending solely upon appearances, it is fed by pettiness and gravitation toward whatever seems easiest. It revels in anti-aesthetics, bad faith, an absence of mindfulness, and a total lack of reflection about matters vital for making sense of the world. Not just half-hearted, these are half-headed efforts. Devoid of compassion, mainstream indifference is a hostile, authoritative, and testosterone-laden environment where the weak are ridiculed and the poor are held in contempt, regardless of the circumstances for their plight.
This anti-intellectual mindset leads to the kind of situation where, as recently as 1998, unthinking white men could assume it was acceptable to drag a black man behind a pickup truck until he was dead, as happened to 49-year-old James Byrd Jr. in Jasper, Texas, or to murder a young man like Matthew Shepard in Wyoming simply because he was gay.
In effect, mainstream indifference is a selfish, cliché-ridden, and narrow-minded refuge for racists, bigots, misanthropes, and misogynists. It’s a psychological wasteland where thoughtless people are bound together by a yoke of stupidity that’s wholly accepted as plain old common sense. Such thinking frequently betrays itself, however, as seething hatred, complete with public demonstrations of contempt for “others” when, actually, a lack of curiosity is the real culprit. The social realm where it thrives is anti-intellectual to the bone, feeding upon a disdain for eloquence in literature, the arts, and all serious endeavors that require cerebral verve.
This deeply internalized conviction is often vested in superstition, intermingled with conspiracy theory, and held so dear that it cannot be acknowledged for what it really is—a profoundly malignant strain of despair shared by a fearful populace who are unified by their own lack of awareness and bonded by a form of hatred so spurious that it feeds off itself. I understand this level of relating because I was a frequent participant before I began my own journey of self-education. I have seen how such insensitivity infects otherwise good people who don’t set out in any way to harm others but wind up doing so because of an inherent default to the worst human instincts. Indifference lies at its core.
In 1987, Holocaust survivor and Nobel Laureate Elie Wiesel put this mystery of human nature in crystal clear perspective. He said, "The opposite of memory is not forgetfulness. The opposite of memory is indifference. What is the opposite of art? Not ugliness. Indifference. What is the opposite of faith? Not heresy, but indifference. What is the opposite of life? Not death, but indifference to life and death."
Indeed, history has shown that indifference is often a breeding ground for evil, allowing social relations to deteriorate to a point where facts are less important than choosing sides. In a democracy dependent on accountable citizenship, indifference is a spiritless sidestepping of responsibility and a serious impediment to achieving authenticity.
My perspective about learning and relating to others stems from the advantage of seriously pursuing education later than most, when I already had some worldly experience under my belt. Even though it’s nearly impossible to remember what it’s like not to know something after you’ve learned it, I still have a keen understanding of what it’s like to internalize a racist social outlook without the cognizance to know better. Hatred thrives on indifference, but knowledge fosters tolerance, even a measure of tolerance for indifference. I’m quite certain that, had I not embraced self-education as a lifelong endeavor, I would have become a frustrated and anxious individual by now, very likely convinced that any reason there might be for my not achieving more in life was someone else’s fault.
Today millions of Americans have such an outlook, and what’s so disappointing is that I know how they feel. After more than three decades of voracious reading, writing, and reflecting, however, I’m convinced that curiosity can overpower indifference. I also know that reaching a level of interest about any subject powerful enough to become a self-sustaining form of motivation can be a hard thing to do. Still, I think for most people it’s not a question of having enough time but rather how they choose to spend what time they do have. Intellectual maturity is a function of deliberate learning, not of age. True adulthood is not possible without it.
Reflective maturity involves the kind of intellectual honesty that enables clear scrutiny of our hidden prejudices as well as the ability to discern patterns of self-defeating behavior. This need not be an unpleasant experience. Maturity is not the time to shrink from responsibility; it’s the time to assume it. Later life is not a time to become set in our ways, but rather a time to figure out how and why we have “ways” at all. It’s a time for lifelong liberals to look for value in conservatism and a time for conservatives to do the reverse.
Learning in the September of one’s life is exhilarating because of the vast perspective that years of lived experience provide. Maturity achieved is an unspoken yet glaring declaration not only that one has lived, but also that one has learned from the experience. (Adapted from The Rapture of Maturity: A Legacy of Lifelong Learning.)
Sunday, July 22, 2012
© Charles D. Hayes
A radio news broadcast recently reported that we are using up natural resources at a pace that exceeds our planet's regenerative capacity by half; if we continue, by 2050 we will require three planets to cover the deficit. This was followed by a discussion about our enormous budget deficit and political gridlock. These are formidable issues, although the evidence is overwhelming that few people are paying close attention.
Albert Einstein was quick to argue that the thinking required to solve problems needs to be greater in substance than the thinking that allowed them to occur. And yet today, at a time when deep reading and critical thinking are desperately needed, more and more people are devoting time to 140-character hot-button discussions, leaving too little time for serious analysis. Nietzsche's herd mentality comes to mind, and I can imagine Emerson spinning in his grave at the very notion of a world consumed by chit-chat. He was so incensed by small talk that I can picture him summing up today's chatter with something like "twits tweet." Nietzsche, I suspect, would have burst a blood vessel at the thought of millions of people following one another for 140-character tidbits, when 140 pages of serious study would barely get the job done.
Now, I am not a Luddite. I love technology. I'm not blind to the positive effects of a world connected by broadband. There are too many upsides to list. But there are also downsides. Increasingly I see young people (and some not so young) spending their days flitting this way and that, like subatomic particles moved by unseen forces, while focused on a hand-held gadget. An alarming number of teenagers spend their days in a frenzy of texting that continues into the night, sleeping with their phones at the ready. This flurry of activity makes David Riesman's notion of "other-directedness" in his 1950 book The Lonely Crowd seem quaint and the very idea of inner-directedness historically irrelevant.
In Hamlet's BlackBerry, William Powers puts it succinctly: "Digital busyness is the enemy of depth." He even suggests that in today's world deep reading sometimes "feels subversive."
I retired from Alaska's North Slope oil field in the fall of 2011. In the camp, it was not unusual to dine in the evening with people holding a fork in one hand and a gadget in the other, seldom taking their eyes off the latter. Many of these same people, even during work hours, could not seem to go more than a few minutes without checking or sending text messages, in dialogue so trivial in content as to amount to an inappropriate distraction and an egregious waste of company time.
Consider the public fascination with Facebook, Twitter, and LinkedIn. No doubt, there are positive things that can be and are being accomplished with these kinds of media, but right now polarization seems to be a major byproduct. Political echo chambers abound as group members share email assaults on out-groups, relentlessly making fun of their opposition while continuously upping their levels of contempt. It's hot buttons 24/7. Us, us, them, them. Then we wonder why we have become politically dysfunctional.
Beneath the surface of all this frenzied, nonsensical communication is the reality that literally millions of people are subtly coming on to us under the pretense of friendship, with the covert motivation of selling us something. I, too, have books and essays for sale on Amazon and other vendors, and like many authors I figure that if people like my web posts they might be interested in reading my books. I'm not by any means against commerce, but I can't help thinking there is something deeply disturbing about an exponentially growing market for books promising to tell sellers how to come on to customers without seeming to, so the seller can set the hook before the buyer recognizes the pretense. So much purposeful deception is as disappointing as it is disingenuous.
Add the scams and data phishing going on in cyberspace to the insincere dialogue and the vicious partisan politics underway, and it makes one wonder where we are headed. Today's gadgets will become obsolete and give way tomorrow to new ones. There is no telling exactly how we will use them, or whether they will compensate for our Stone Age minds or add further to the venomous political contempt we are witnessing as our current technology exacerbates our primitive political predispositions.
Moral psychologist Jonathan Haidt describes our human condition as a rider/elephant predicament in which the rider represents our conscious capacity to reason, while the elephant represents our emotions, which operate to a large extent at an unconscious level. I would add a self and a robot to this analogy. The self, which is neurologically a fuzzy concept, is home to both the rider and the elephant, while the robot represents our tools.
Long before the creation of cyberspace, Marshall McLuhan warned us that what enthralls us about technology is that it represents a narcissistic extension of ourselves. The existential danger in our enthusiasm for the latest gadgetry is in becoming so distracted that we let the robot take over, thus becoming lost in a maelstrom of confusion and subservient to our tools.