Saturday, December 19, 2009

Vulnerability

We are all vulnerable. We think it a curse. It leads to doubt and fear and error. It makes us weak and susceptible to all manner of bad choices and tragedies. A recent replay on Bill Moyers’ Journal, regarding Lyndon Johnson’s private telephone conversations during the period leading toward the Vietnam tragedy, reminded me of the enormous implications of one man’s vulnerabilities. In his phone conversations, LBJ affirmed what every anti-war protestor knew: that the Vietnam conflict was a war of national liberation; that further involvement in this war would create a quagmire from which it would be hard to escape; that the war was not winnable. So, why did LBJ persist? Because, as he admitted privately in several phone calls, he would look weak if he did not prosecute the war, and worse, to his mind, he would be pilloried by the Republicans for this perceived weakness. We all know where his vulnerabilities led us, and who and how many paid the price. In the end, LBJ had to withdraw from the 1968 presidential race; he had been defeated. If he had resisted his vulnerabilities and pulled out of Vietnam, he might have lost the election, but at least his historical honor would have been preserved.

Tiger Woods’ recent display of human vulnerability is not to be found in his libido as much as in his bad judgement of character (i.e., his own character and that of his co-respondents); the tawdry leaps into bed with every cocktail waitress and wannabe actress within sight (Maureen Dowd in the NY Times cautions that men should avoid liaisons with young women who have 8 x 10 glossies to hand out); and his far too easy (or non-existent?) conception of what constitutes virtue and fidelity. More than this, as a young man who demanded the privilege of privacy, who accepted the image of integrity necessary to represent a wide range of commercial products, and who claimed superiority to his colleagues in a game which demands individual honesty on the course, he failed to live up to the persona he had consciously created around himself. Raised by doting parents, he learned the value of keeping everything within the family -- tight to his chest -- and of tenaciously pursuing his own success, and that success alone. Now, he is facing the hardest contest of his entire life, one that none of us ever completely wins – admitting and exposing one’s biggest weaknesses and vulnerabilities to others.

President Barack Obama has his own weaknesses and vulnerabilities, which we are not yet able to portray exactly. “Hope” now seems remote; “audacity” has given way to “timidity”; “Yes We Can” is now “maybe we can’t.” Rhetoric and reality are drifting farther apart in “America-the-political,” and the President seems to have no more immunity from this drift than any average politician.

On the other hand, if we were not vulnerable, we would be monsters. In a culture like ours, with its emphasis on autonomy, individuality, and personal agency, invulnerability would be an invitation to Thomas Hobbes’s “war of all against all.” Madeleine L’Engle (an American writer for young adults who believed in universal salvation) said: “When we were children, we used to think that when we were grown-up we would no longer be vulnerable. But to grow up is to accept vulnerability... To be alive is to be vulnerable.” Vulnerability is what allows us to love one another. Marriages, partnerships, relationships, and friendships necessarily include identifying and accepting the vulnerabilities of those near us, and exposing the vulnerabilities we have to those we love. This is not easy. Most of us believe that we must pursue friendships without showing weakness. As Walter Anderson (an American painter with a lifelong struggle with mental illness) said, “We’re never so vulnerable than when we trust someone – but paradoxically, if we cannot trust, neither can we find love or joy.” Vulnerability equals humility, a humility that oddly leads in turn to human solidarity.

No season reminds me of our vulnerability as much as Christmas. For some, simple and pure Christian joy may push fears and doubts aside. For many, however, the holiday season raises expectations of happiness and community that are false, expectations that can never be fulfilled. It can easily become a season of failure and depression, as studies and statistics readily demonstrate.

If emotional vulnerability reigns among so many of us around the winter solstice, the poignancy of material vulnerability – of persons and sometimes whole societies having poor or no housing, of having little food and inadequate clothing, of receiving no education, and possessing no vision of a personal future – is exponentially greater. The Copenhagen summit on climate change illustrated just how central and profound world poverty is in the discussion of any topic. We would all be well served to remember this, and do something about this material poverty not only in the holiday season but all year long as well. We would be best served to recognize, address, and mitigate insofar as possible those personal, public, and political vulnerabilities – the primary vulnerabilities of feelings and emotions, of compromised thoughts and actions -- which now hinder hope for both the world’s material and non-material future.

Friday, December 11, 2009

Beyond Hope?

Recent events in U. S. politics and governance leave me feeling too defeated to write or speak or rant. So, I’ll let Henry Adams do it for me. Adams is writing about the period from the Gilded Age into the 1890s.

Henry Adams (1838-1918)

“We are here plunged in politics funnier than words can express. Very great issues are involved . . . . But the amusing thing is that no one talks about real interests. By common consent they agree to let these alone. We are afraid to discuss them. Instead of this the press is engaged in a most amusing dispute whether Mr. Cleveland had an illegitimate child and did or did not live with more than one mistress.” [just substitute Tiger Woods for Grover Cleveland]

“Politics, as a practice, whatever its professions, has always been the systematic organization of hatreds.” [The Republican Right]

“Practical politics consists in ignoring facts.” [All of the current Congress]

“Modern politics is, at bottom, a struggle not of men but of forces. The men become every year more and more creatures of force, massed about central powerhouses. The conflict is no longer between the men, but between the motors that drive the men, and the men tend to succumb to their own motive forces.” [lobbyists for corporations]

“The press is the hired agent of a monied system, and set up for no other purpose than to tell lies where their interests are involved. One can trust nobody and nothing.” [all television network news, including PBS]

“It is always good men who do the most harm in the world.” [Barack Obama?]

Reform in America is now a cause with such a remote chance of success as to make it unworthy of further discussion or consideration. We must drive for reforms elsewhere in the world, and hope that the best Americans will eventually be able to join necessary world-wide causes.

Monday, November 30, 2009

American Exceptionalism: History and Reality Denied

Ignoring American claims of “exceptionalism” is as impossible as ignoring the U. S. as a whole. Exceptionalism, nationalism, and identity are welded together in the American psyche to form a shield that seems impervious to the facts of comparative history or the conclusions of rational discourse or even the mere use of close observation.*

A recent opinion column by Thomas L. Friedman in the NY Times resurrects some of the distended delusions that can follow from the over-deployment of the exceptionalist claim. In “Advice from Grandma” (Nov. 11, 2009), Friedman, to his great credit, concludes with an excellent prescription regarding citizenship and politics in order that the U. S. avoid a “suboptimal” future in the new competitive world economy. (I have offered a similar prescription at various times on this blog.) But, before offering this prescription, Friedman adds to some of the old canards about American exceptionalism. He begins by noting disapprovingly that some would claim that while the 19th century was “owned” by Great Britain and the 20th century was “owned” by the U. S., the 21st century is to be owned by China. Unwilling to concede the last part of this equation, Friedman argues to the contrary that the U. S. will still maintain dominance as a great world economic power through the special genius of American “imagination.” Citing the Apple iPod as an example, Friedman proceeds to argue that America “is still the world’s greatest dream machine.” This is American exceptionalism à la Friedman today.

So, there we have it, the latest in a long line of characteristics that presumably make the U. S. not only “exceptional” but perhaps even outside the normal constraints of history as well. In 1629, it was John Winthrop’s claim of “a city upon a hill” for all to see (and admire) that established the first exceptionalist claim; that lasted less than a century. In the late 18th century it was the American Revolution, which was deemed exceptional in that, unlike the French Revolution, it led to “republican virtue.” In the early 19th century, with “republican virtue” all but invisible, it was Jacksonian democracy and individualism; and then it was American expansionism and “manifest destiny” or, in other words, the ability of the U. S. to steal lands formerly belonging to Mexico. In the latter half of the 19th century, Americans claimed they were exceptional in their brand of industrial expansion, and in their invention of the modern corporation. And, they crowed, the American “dream machine” was emblemized by the inventions of Thomas Edison -- who promised a major, life-changing invention every few months or so -- followed closely by the mass production genius of Henry Ford. In the early 20th century, it was the continental empire of vast resources, combined with the “democratization” of “lesser” peoples who had fallen under American military or economic suzerainty, that marked the rewards of American exceptionalism. By the mid-20th century we were all watching a fifteen-minute show on television named “Industry on Parade”; the industry was all American, all of the time, and endless in its promise of growth and prosperity (a laughable visual image today). By the late 20th century, we were told that American exceptionalism could be found in its ultra-advanced economy, which wedded finance, industry, service industries, and technology in a way the rest of the world could only envy. By the end of the 20th century, this exceptionalism had been pared back to claims of technological and educational superiority. Then came the tech stock collapse and the realization that Asians and Indians and “even” the Irish had developed far better educational and mathematical skills than Americans could match.

So, what is left? What is left are vague claims about the superiority of American character. As hard as this is to believe, American exceptionalism remains a substitute for history and historical understanding in the minds of even well-educated Americans (e.g., Friedman’s “imagination” claim). But the experience we have garnered from globalization and world trade is that all cultures have about the same abilities – given the chance for a level playing field – to succeed. If you want to see “imagination” at work, just watch any group of poor people around the world, who through their vast capacities for invention find ways to raise food, provide shelter to their families, and sometimes even advance the education of their children.

Globalization is here to stay, and with it new ideas about law and rights and the protection of all of the world’s citizens in fundamental ways – health care, decent shelter, clean water, good food, education, security, and employment – must be addressed and instituted. Exceptionalism is the last bastion of the 19th century nation-state. It is the moral equivalent of claims about racial superiority and inferiority. And, we can only avoid global ecological and environmental disasters if we abandon exceptionalisms of all types. If the world community can make any progress on these fronts, we will have all made ourselves more exceptional than any people who ever lived in the past.


* I have addressed this issue academically in “‘And We Burned Down the White House, Too’: American History, Canadian Undergraduates, and Nationalism,” The History Teacher 37, no. 3 (May 2004); re-published, in part, as “American History, Canadian Undergraduates, and Nationalism” in Carl Guarneri and James Davis, eds., Teaching American History in a Global Context (2008).

Wednesday, November 18, 2009

"Mac vs PC"

We have all seen the “I am a Mac, and I’m a PC” television ads. Justin Long (“Mac” – a skilled actor who retains a boyish appearance and cultivates a suave yet youthful demeanor) and John Hodgman (“PC” – a wildly inventive and always slightly overweight comedian) stand together against a blank white background. “PC” usually opens with seemingly limitless optimism about the product he represents, only to be disappointed, and then embarrassed, when his latest defense of the PC and its operating systems and its applications is utterly undermined. Of course, the ads’ claims of outright Apple superiority challenge credulity, and we are all encouraged to wink at some of the near falsehoods at the fuzzy edges of these ads – falsehoods somehow made unimportant in the general spirit of humor and entertainment.

It is these ads, in fact, that suggest the first of three observations I want to make about how Mac computers have come to be symbolically representative of important and controversial elements coursing through contemporary North American culture. In recent decades, the “hard sell” approach to advertising has been revitalized. Billy Mays’s irritating voice (now silenced by his early death) and carnival hucksters like Vince Shlomi (“Shamwow”) are just the grotesque edge of the hard sell approach. Sherwood Anderson’s “Winesburg, Ohio” suggested long ago that Americans were attracted to grotesques (although one thinks that Anderson did not really have these characters fully in mind). Fifty years ago, all of my teenaged friends and acquaintances sneered at the hokey lies of the hard sell. How could anyone succumb to the lies and rants of these men (yes, they were all men; watch the TV show “Mad Men”)? We all loved the VW ads, or those for any product that had even a hint of self-deprecation in its message.

The interesting thing about the “Mac” ads is the subtle soft sell underlying the blunt implication that Macs are better than PCs. This soft sell is founded on the underlying decency of both central characters – “Mac” and “PC.” They meet like well-meaning acquaintances, if not quite friends. In fact, they seem to genuinely like each other. There is no muscle flexing and fist-pumping. “Mac” cringes at “PC’s” humiliations and sympathizes with his failures even as “PC” seeks to convince his counterpart of at least some redeeming qualities to his product. True dialogue is attempted. “Mac” remains open to “PC’s” repeated entreaties. The message remains that a Mac is a better computer but that there is a place for PCs in this world as well. There is a hint of the old idea of “market share” as opposed to a Hobbesian war of “all against all.” One would almost think we were back in the 1960s. It is these Mac ads, through their contrast with standard hard-sell ads, that remind us of the relentless marketing barrage we are exposed to in contemporary times – a barrage that is often aggressive, visceral, and visually manipulative.

The continued existence of Mac computers leads to my second observation: this one is about the centripetal forces of what may start out as capitalism but become something else. No one can agree on the “market share” of Apple computers. Estimates range from 3% to over 20%. It all depends on what you are counting. Nevertheless, Apple computers will never dominate the computer market – not even the personal computer market. This is a consequence of clear forces (not truly “market” ones) that were applied early on in the history of computers. Microsoft created an early monopoly on operating system software. With no large competitors in the business market, it established the dominance of PCs in every medium-to-large workplace. All institutions had to follow. In my university, only two or three departments (my history department was one) used computers early on. We all used Apple computers. In the late 1980s, as computers became ubiquitous, institutions such as ours adopted Microsoft’s operating systems and IBM hardware. Those of us who clung to Macs (and that would be everyone who started with one) were marginalized in many ways. We had more trouble communicating through the university system. We had far, far less tech support. Our then president even told one unit manager that she needed to get rid of “that garbage” (Macs) that she had been using in her unit. Whether it wants to or not, Apple can never break into a market that is so interlinked. Much of what passes for capitalism is in fact collusion at best and monopoly at worst. There are no ways, not even anti-trust legislation, to stop this juggernaut, and the experience of Apple computers proves it.

Last week I attended a lecture on why we should still read Charles Darwin’s “On the Origin of Species.” The speaker pointed out that evolutionary theory has advanced substantially from its origins in the 19th century, and that many contingencies – genetic, molecular, cultural and human – intercede to modify evolutionary change. For some odd reason, I thought of Mac computers. If some kind of genetic fitness were the sole governing element, Mac computers would be the overwhelming favorite of all personal computer users. Their operating systems, especially from OS X through “Snow Leopard,” are superior in every way. They are more reliable, intuitive, and sophisticated. The Apple operating systems have better graphics than Microsoft systems, and always have had. The artistic design of Mac personal computers (even some of the retro models) has always been far in advance of the gray-flannel-suit appearance of PCs. So, why has Apple not dominated and driven out the inferior species? Instead, it is as if the Neanderthals had wiped out modern humans. Some of the answers are suggested above in regard to faux capitalism. Other answers are cultural. Apple designers – especially Steve Wozniak – just seemed too much on the cultural margins of North American life. Macs were too “artsy.” They had too little gravitas. Macs seemed to be the technological equivalent of youthful rebellion. For a highly – and I DO mean highly – conformist culture like that in the U. S. and Canada, Macs just seemed too trivial. [Apple iPods would be another matter since they were, initially, not part of an interconnected business culture. They did not do our work. iPods merely entertained. (Although all of that, along with iPhones and BlackBerrys, is going to change the notions of frivolity attached to MP3 players and iPods.)] Evolution, therefore, is a lot more complex, based on a lot more SHIFTING contingencies, than many scientists would like to believe.

Oh, and I guess it is clear that I have always used Apple computers.

Friday, November 6, 2009

In Defense of Youth

A recent article in the National Post (Oct. 17, 2009) by Robert Fulford, the celebrated literary editor and journalist, has been gnawing at me since I read it. Entitled “The Teenage-ification of Manhood,” it was the last in a series of articles by the Post regarding the tendency for modern young people – and too often not so young people -- to become adults at a later and later age, and often not to “grow up” (whatever that means) at all. It is a cruel column, and I dare say the preceding editorials on this subject were equally mean. There is no need to rebut Fulford’s and the Post’s claims about a long road to adulthood, but there is a need to challenge just about everything else.

As usual, there is a problem with his use of words. For example, Fulford identifies “teenagers” as a social group (now apparently permanent) that emerged after the 1940s. These teenagers, Fulford sneers, are made up of “self-important newcomers” who have constituted themselves as something other than “just adults-in-waiting.” Fulford nods at retailers and overly generous parents as culprits in the creation of this class but insinuates throughout that teenagers themselves are responsible for their continued shallowness and selfishness, for being insouciant slackers who refuse the responsibilities the world has thrust upon them. The truth, of course, is that consumer-capitalism almost single-handedly created, and single-handedly continues to maintain, the teenage condition. As can be seen by looking at developed and developing and so-called underdeveloped cultures, the crass consumerist and capitalist underpinnings of modernization and popular culture are all that support the continued existence of the sociological phenomenon called the “teenager.” Without these underpinnings, we might resurrect the less derogatory, more benign, more agreeable terms “young people” or “youth.”

Fulford also implies that to become and remain “adult” is the goal of human existence. It is the ultimate stage of accomplishment in one’s journey through life. An “adult” is superior to a child or an adolescent or a teenager. All conditions and stages of life other than adulthood, he and many other people unthinkingly suggest, are precedent to adulthood and are therefore necessarily incomplete and flawed stages when we consider them in isolation from the goal of adulthood. The calculus is clear: to be an adult is to be mature; to be mature is to be virtuous; to be virtuous is to be rational, emotionally composed, and willing to take responsibility for one’s actions.

This calculus for virtuous adulthood and deficient young people simply does not hold true. In fact, it might be stood on its head. We might say, without much exaggeration, that most adults are persons who have made up their minds about everything important. They are people who have fixed political and social and cultural and moral views. They have stopped growing intellectually and often morally. They have settled for their job, for their old opinions, for their old prejudices. They have given up and have often become cynical about most of the value-laden aspects of the world around them.

We might say, without much exaggeration, that most youth are persons who continue to explore different things in life. They are open to political, social, moral, and cultural change and improvement. They continue to grow intellectually and often morally. They have not settled on an occupation; they abandon old opinions for better new ones; they have not given up.

As for “maturity,” it strikes me that adults and youth (not infants and small children, of course) are about equal in the employment of rationality, emotional composure, and taking responsibility for their own actions. In my personal experience with university students, I believe that “youth” outscores “adults” in all of these categories.

But the condemnation of today’s adults must go further than that. People of my generation (I am on the cusp of being a “baby-boomer,” depending on which demographer you care to cite, and I was a “teenager”) have had many of the advantages of today’s youth, in regard to recreation and possessions and cultural opportunities. Robert Fulford and I have hardly suffered. In addition to that, we have prospered in our adulthood. We received excellent educations at no, or little, financial cost to ourselves. As “adults” we were able to buy houses and stereos and nice automobiles and sometimes even take comfortable vacations. Some of us are even secure in retirement.

Only those who are really old, those who no longer have any contact with vibrant youth, can have the gall to claim that young people are avoiding adulthood in order to continue their lives of play and irresponsibility. When I was first in graduate school I was sometimes accused (by individuals or by the press) of remaining in university to avoid “growing up.” Then, as Vietnam exploded, I was accused (by the same types of people) of remaining in university in order to avoid the draft (I was 1A through my grad school years and could have been called up at any time; I foolishly would have gone). Young people today have it worse. They are told to get good careers and to anchor themselves by establishing their own homes while at the same time society tells them they will never have “permanent” jobs but must continually re-tool themselves for ever-shifting workplace demands.

Incredibly, many young people attempt to conform to this contorted culture and to find their place in this near-impossible economic environment. I know of many students who have a university degree and also have acquired a practical craft skill. I know of many others who have one or two undergraduate degrees and usually a post-graduate degree. I even know some who have multiple graduate degrees. Many of these young people also have extensive volunteer experience. Almost all of them have enormous -- corruptly proffered and enforced, I might add -- student loans (for which they are blamed by “adults” who had to pay little or no tuition themselves). Remarkably, almost all have accommodated themselves – without anger – to having less hope for success and resources than those generations who preceded them (you know, those “adults” who have shifted the blame to the phantom character flaws of youth today). Many young people, of course, have been unable to overcome the ludicrous Sisyphean challenge placed before them and must eke out what satisfactions they can in life even if they must prey on their parents’ good will to do so.

So, if you want to retard the aging process, quit categorizing and criticizing all young people. Start spending some time with the young people around you. You will become more rational and emotionally composed, and, in the process, you might take more responsibility for yourself and the societal flaws you helped to bring about.

Saturday, October 24, 2009

Three Words to Banish

My friend Erin Phillips recently stated in her blog that she wished the word “closure . . . could be banned from the English language.” She is perfectly right. Unless a person is using the word closure to describe an archaic and arcane political maneuver employed to end a parliamentary debate (usually spelled “cloture”), it should have no place in the language of whole human beings. Why? Because of how the common usage and meaning of the word “closure” has evolved. As I remember it, “closure” flowed out of the lexicon of psychological counseling into mainstream usage a few decades ago as a recommendation for those unable to regain their footing in life because of some loss (death) or personal tragedy. It was directed – it seemed – at those who felt too much, those who were too sensitive. Now, it has too often come to mean: forget your troubling loss or tragedy; “move on”; or, simply and coldly, “get over it.” Original intent has been set on its head. Closure can be seen as serving the most simplistic and selfish purposes. Put the deaths of members of your family behind you – that is, forget them. Forget those events that have caused you hurt. In short, do not admit tragedy into your life. But to be human, of course, is to confront tragedy, to recognize its inevitable role in our lives. It is only through doing so that we acquire any emotional depth, any understanding of the complexities of existence, and any appreciation for the most interesting and unique characteristic of the human condition – irony.

While attending a meeting recently that dealt in part with how the City of Lethbridge might best decide the future use of a segment of its downtown “civic block,” a City official assured everyone that “all of the stakeholders” involved in this issue would be consulted. Stakeholder is another archaic word – this time from the early modern era (16th – 17th centuries, that is) – resurrected by capitalist ideologues over the last thirty years to supplant the term citizen. In keeping with neo-liberal economic theory (read: right-wing economic and political beliefs), “stakeholders” are those folks privileged, as John Locke argued, to have exclusive political interests and rights in society. Not only that, but Locke – to the great comfort of the modern right-wing – also encouraged “stakeholders” to seldom employ their political rights because “the least government was the best government.” Graduates of “management” schools or faculties love this word as a substitute for the messy business of democracy – which lets all of the riff-raff in on making public decisions. Thus, “stakeholder” pushes aside three centuries of democratic progress, and subverts words like “citizen,” “civic-mindedness,” and “community or public interest.”

Have you been to your dentist lately? You know, the one you have come to admire and trust. Has he begun a “procedure” with the words – “this may cause a little discomfort”? For me, discomfort is finding the blanket has slipped off of me during the night. Discomfort is discovering that the wine you are drinking has more the undertones of tannins than of berries. Discomfort is that sweater that doesn’t quite fit right. An injection of novocaine into your gums or a root canal is not “discomfort”; it is pain. As in our avoidance of tragedy (in relationship to “closure”) or our avoidance of the great-unwashed mass of citizens (in relationship to “stakeholding”), discomfort allows us to live the lie that we need no longer suffer pain. It is the soft, daily-life equivalent of more profoundly disturbing words like – “collateral damage.” It is not quite Orwell’s “doublespeak,” but discomfort is a word, along with its many kin, employed with a sleight-of-hand directed at a purpose that is fully and completely manipulative.

In my experience, everyone has several words that make them wince or protest. Computer language, with its catalogue of words now forced into the service of new meanings – e.g., input, download, boot – is a frequent villain among purists. And, ubiquitous phrases changed for no apparent reason – “step up to the plate” rather than the original “step up”; or “at that point in time” rather than the original “at that time” – can be just irritating to some of us (read: me). The point is: word usage usually says a great deal about the pretenses or the politics we embrace.

Friday, October 9, 2009

Nova Scotia

We are on the train leaving Nova Scotia. It has been some time since we took the train, and some of the contrasts with modern air travel are striking and ironic. During a flight, we see almost nothing, except endless sky, or a land or seascape that is hardly moving. Life is suspended; we are unengaged in almost every way. Narrow aisles, narrow seats, narrow airline management leave us feeling antisocial and trapped. Our privacy has vanished – often in the most embarrassing ways. Anxious for terra firma, our sole attention is directed toward counting off the minutes and hours before we land. The monotony, boredom, and lack of stimulation inside the hull of an airplane are akin to pre-civilized human existence, where we all huddle uncomfortably beside a weak and stinking fire waiting for the weather to clear so that we can move and hunt and eat. Aside from takeoffs and landings, air travel is visually and sensually flat and static. Flying, in short, stimulates few of our modern senses.

The length of time required to reach our destination on a long train trip discourages anxieties over time and compels us to accept that we are on a journey. We do not have the sense, as we do in air travel, of being a mere object thrown like a dart at a destination. We tend to remain social animals, even if we engage only with our own traveling companion(s). We retain a sense of privacy and self, even a sense of human agency, that is absent in commercial passenger flight. During daylight hours, at least, our modern brains and eyes -- educated by film and television – trace scenes rapidly flashing before us with familiar acceptance. The “big screen” windows on the train -- emulating modern movies and television -- contrast tellingly with the postage-stamp portals of the airplane, through which we see only lonely space or still picture images of distant, unreal cityscapes and landscapes.

What can be said about Nova Scotia that everyone, and all of the “literature” about Nova Scotia, has not already proclaimed? It has a charming coastline. Its landscapes and seascapes blend the wild and pastoral in perfect balance. It preserves its environment and its historic past. Its residential architecture seems placed just right for all passersby to admire and enjoy. (This is a play on June’s observation that all of the sheep in Devon, England, seem to have been positioned ideally for the perfect pastoral scene). Nova Scotia’s beauty is all packaged just right, in proportions manageable in ideal gradation to the foot or the eye or the automobile. And, if you need a change of scenery, a thirty-minute car ride pretty well guarantees any change you seek.

Its regions vary extensively enough to make us want to see it all (or most of it). It has small towns with character, and with characters. Its residents seem genuine and comfortable in their friendliness toward strangers. And, it has enough seafood to allow the formulation of meaningful gourmand comparisons regarding how every restaurant cooks and serves its chowder, scallops, and haddock.

But I am drawn to two other aspects of our holiday in Nova Scotia that return to my mind again and again. First, we are here in the fall. That is, we are in an area of North America where fall is a real and full season. I had forgotten all of the feelings that fall, with its colours and crisp air and mature beauty, evokes in me. At one time, it evoked a happy anticipation of a new school year – at least when I was very young, and then again when I was an undergraduate. As I sometimes reflect either joyfully or sadly, the return-of-the-school-year emotion no longer resonates in me. I suspect that for most persons, fall is mainly a harbinger of winter. It points toward ends, not hopeful beginnings. It is the last call at the pub.

But when I am in an area that has a real fall – not a region where summer’s plants are all ruthlessly and prematurely murdered in the first frosts, or where snow appears so suddenly that one feels embarrassed taking in one’s lawn furniture while still wearing shorts and sandals – I am filled with a sense of romantic languor and the satisfactory completion of things. It is the best of seasons for food. Summer menus and winter comfort foods are both appropriate, with fresh vegetables taking their rightful primary place on the table. It is a powerfully romantic season. Sentiments felt through the year are heightened and made more acute. Fall is the harvest, in many ways, of both what the land offers up and our best human empathy and emotions (what the 18th century called “sensibility”). It is fulfillment, not end.

Nova Scotia also reminds me of the 1950s. It must be admitted that as we retreat further and further from the 1950s, our appreciation of that long decade (really 1948 through 1962, in my historical calculation of periods) diminishes. What an ugly time the 1950s were in terms of public politics and affairs, and social relations (at least in North America, and especially the U. S., where I grew up). Cold War hysteria, the McCarthy era with its very real assault on decent people, political conservatism, the beginnings of modern consumerism and greed, overt racism and bigotry, the suppression of more than half of the population (women) in a patriarchy more powerful than at any time before in North American history, and many more issues are rightly subject to our disdain. The 1950s as historical antecedent to the second half of the 20th century and the early 21st century come across as truly irresponsible and reprehensible. So, when the right-wing seeks to return to the 1950s, we are easily repelled by the prospect.

I like to think, however, that even that 20% of Americans, and a far lower percentage of Canadians, who are right-wing, are imagining a different 1950s. The 1950s that Nova Scotia brings back to my mind is, first, one of modest expectations. Most folks in the 1950s, at least where I came from, experienced some sense of employment security, and some sense that they could clothe and educate the kids, fix the house, and drive an automobile that was reliable. I get that same sense of economic equilibrium between anxious poverty and excessive wealth in Nova Scotia. There is an underlying acceptance of things economic in this. No one expects a house with four bathrooms and a whirlpool bath or a Lexus in their driveway, and with those expectations out of mind, they can enjoy what they have. They can also live to a rhythm that is not frantic and distracting. The 1950s (at its best) and Nova Scotia today also seem to demonstrate a level of family and friendship interaction in which people were (are) more attentive to one another. While patriarchy may linger in the background (it did not in my family, even in the 1950s), people then (and today in Nova Scotia) seem to communicate better than we do now. Relationships seemed more complementary and not, as today, either formal or singular and private (i.e., relationships that are binary rather than communal). Anomie, although an emergent characteristic of the 1950s, was less pronounced than it is today. In the 1950s (and what little I have seen of Nova Scotia today), there was (is) a greater desire to live and act in a community of family and friends and acquaintances in a seamless way, rather than compartmentalize all of one’s relationships.

This is all wildly subjective and speculative on my part, of course, and that is why a blog is so much fun. This specific blog is also wistful. But I believe wistfulness is not solely negative. It incorporates a sense of what was or could be again as well. And, our capacity for wistfulness suggests that we need not think that alienation, estrangement, material greed, and consumerism are our only options.

Wednesday, September 2, 2009

Willful Ignorance and Its Consequences

While most of the people I know may allow others to make outrageous “truth claims” out of whimsy or sport, we do not accept “truth claims” based on clear and demonstrable ignorance. In consequential matters, we expect a quality of intellectual curiosity, informed opinion, and mature reflection. And, we assume that no one will fall back on so-called “received knowledge” or “blind faith” arguments as a means of ending all further debate.

This is not the nature of today’s culture, however, where a series of unique circumstances has led us to allow others not only to employ willful ignorance in debate but to take seriously the absurdities of willful ignorance as well. Here are a few brief examples of how some modern historical factors have led to this condition. Some of these are sinister in their calculation; some are the consequence of distensions or corruptions of well-intended cultural change.

1. Modern corporations have manufactured “truth sets” about themselves and their products that involve constructed “realities” in which societal truths are distorted in order to advantage the corporation’s profit motive. For example, auto companies, through the design of their products and their advertising, promote a sweeping ideal of the good and successful life – a life centered on the automobile.

2. Radical Christianity today has “dolled-up” traditional beliefs, added a guitar player and folksongs to create a glaze of modernity, converted sports arenas into churches to emphasize the popularity of their beliefs, re-done the message of “The Fundamentals” so that “name it and claim it” greed becomes God’s wish, re-invented Middle Eastern history (e.g., John Hagee), and trimmed the real sources of truth down to one -- The Bible. The consequence is a contrived alternate universe of belief that appears doubt free and complete in its simple answers to complex problems.

3. Modern politicians have lost almost all sense of public responsibility under the bright light of “celebrity” status and under the easy pressure of corporations with whom they are familiar and even chummy and to whom they are beholden. “Stakeholders,” they willingly admit, trump “citizens” in every important way.

4. Post-modernism, which has legitimately questioned the perspectives from which truths are often derived, has fallen under the pen of acolytes not up to the task of retaining the complexity and integrity of the best intellectual claims of post-modernism. As a result, all truth has become all too relative to all too many people.

5. The decent quest for ethnic, racial, and cultural equality over the past three-quarters of a century has been corrupted into an argument on behalf of a wide range of behaviors and truth-claims based on ethnicity or race or cultural distinction, despite the fact that some of these claims can be authentically challenged by arguments from perspectives other than group identity.

6. Modern education, in its noble attempt to democratize education and bring everyone inside the tent of knowledge, has succumbed to a bizarre reductionism in which “anyone’s opinion on anything is as good as anyone else’s.” Thus, we now have both a juvenile and an adult population who, several times a day at least, bring an end to a conversation or discussion with one word – “whatever.”

During the so-called “town meetings” in the United States this past summer, we saw some extraordinary examples of “willful ignorance.” Bald racist statements, threatened violence, and a shout-down-the-opposition style of free speech employed to deny intelligent discourse and the right of others to speak freely were standard fare. Unable to articulate an argument or use historical examples correctly, the ranters easily confused “socialism” with “totalitarianism” (a word too big for them to handle in any case) and the public interest with “communism.” Energized by some assumed “higher truth,” many of this right-wing crowd showed no timidity or embarrassment in telling the citizens of other countries (especially Great Britain and Canada) that they suffered under failed health care systems and that socialized health care was to blame (i.e., “universal, single-payer health care”; most of them seemed incapable of comprehending that complex concept well enough to delineate what they actually hated).

The media’s favorite example of opposition to the current Congress’s health care reform was voiced by a woman who – by chance, I think – began her peroration against health care reform assuming the role of an indignant sufferer who already had had something highly invasive thrust upon her. She ended with the usual “socialism” and “Russian dictatorship” references. Two things interested me in her brief “speech”: her crescendo of anger and her admission at the very beginning of her rant that she did not usually have much interest in, or knowledge of, politics or political affairs. No commentators seemed bothered by either her rage or her admission of ignorance of “political” matters. Journalists and ordinary citizens alike have no cultural language to initiate challenges to the heart-felt opinions of persons like her – no matter how grossly absurd and incorrect those opinions are. Emotion and theatre trump sanity and reason every time.

No realm of human knowledge so exposes willful ignorance as the extensive lack of political knowledge among the members of a democracy. Perhaps I read too much George Orwell when I was younger, because it was Orwell who convinced me that everything is political. He made that claim on several occasions in clear prose and implied it in almost everything he wrote. Like many others of his generation, he also got to see how willful ignorance of political matters and head-in-the-sand morality led to the horrors of WWII and its dress rehearsal – the Spanish Civil War. In my own experience, students (and others) used to tell me individually that they “were not political.” I usually tried to convince each of them that everyone, through either action or passivity, was political. In class, I would rant more damningly on the subject.

No society -- no matter how rich and secure – can afford the luxury of willful ignorance. Yet, that is pretty much where popular politics stands in the U. S. In an earlier blog, I compared the right-wing campaign against health care reform -- supported by willfully ignorant followers -- to the processes at work in Nazi Germany in WWII. Calling someone or something “Fascist” or “Nazi” has become a cliché, and almost all listeners and readers dismiss the speaker or author for employing mere hyperbole when those epithets are used. For that reason, and to avoid being willfully ignorant and irresponsibly angry myself, I propose to argue that a substantial number of people on the far right of American politics are indeed acting in a fascistic manner, a manner that would be at home in 1921 Italy and 1933 Germany. There are caveats, of course. Not all of what is happening in opposition to American health care reform fits that definition. But too much of it does, and that ranting opposition is not simply the result of the heartless economic interests of corporate America or the political interests of the Republican Party. This is not to say that some “putsch” has begun. But I am claiming that radical right-wing behavior has taken on the attributes of something more than colorful extremist speech – speech at which some may laugh (e.g., “The Daily Show”) and others simply weep.

Let me elaborate by referring to two specific essays that address the issues at hand: one by Umberto Eco, regarding Fascism; the other by Eric Hoffer, the very popular author of The True Believer: Thoughts on the Nature of Mass Movements (1951). Eco is a professor of semiotics – not political ideology. Hoffer, a longshoreman and not technically a philosopher, thought and wrote extensively on the psychology of adherents to mass movements of all sorts. I am drawn to both partly because they employ historical experience as a legitimate guide and partly because, in regard to the subject of blind believers and followers, they considered these issues from the viewpoint of public intellectuals, not as academics with some puny professional thesis to promote.

In a short article in the New York Review of Books (1995) entitled “Ur-Fascism,” Eco laid out fourteen specific “features” of fascism. To call someone or something fascist, Eco noted, was to apply Wittgenstein’s observation that games cannot be precisely defined by firm rules but must be understood as having broadly shared characteristics which constitute an ineffable “family resemblance” to one another. Fascism, for Eco, depends on a similar set of resemblances. It is politically, ideologically and philosophically “discombobulated,” he says, but it is firmly fixed in many of its emotional manifestations. Of the fourteen general points that Eco identified with eternal fascism, at least ten apply to the all-too-visible angry right-wing in the U. S. today. And, as Eco argued, of his fourteen points “it is enough that one of them be present to allow fascism to coagulate around it.”

Hoffer’s arguments complement Eco’s. No matter what their belief, Hoffer claimed, “true believers” are alike in believing there is “one and only one truth.” Eco also saw his eternal fascists as people for whom “truth has been spelled out once and for all.” Ironically, having been given the right to be taken seriously as a consequence of modern conditions of relativism, these people insist on absolute truth. True believers tend to follow a few leaders, Hoffer observed, and they are notable in their lack of humility.

In relationship to today’s angry right-wing extremists opposing health care reform (and many other “liberal” things), Eco and Hoffer offer some enlightenment.

1. The “town-meeting” extremists are completely uninterested in rational discourse or attempting to discover solutions to problems apparent to the rest of the population. They are not open to argument. They make absurd claims and charges. But, as Hoffer noted, “Crude absurdities, trivial nonsense and sublime truths are equally potent in readying people for self-sacrifice if they are accepted as the sole, eternal truth.” Eco more subtly defines this as attachment to a “syncretic” culture in which internal contradictions and incompatibilities may appear but in which the dominance of some “primeval truth” will always prevail.

2. The “town meeting” extremists and the vast array of right-wing radio talk show hosts are angry and filled with hate. Why? Because, as Eco points out, “disagreement is treason” in eternal fascism. Disagreement implies “diversity,” something they also oppose, as can be seen in their hatred of Muslims and Mexicans, and so on and on. Fascists have a “natural fear of difference,” Eco notes, and they may pretend that the mother country is under siege. Hoffer, too, observes that attempts to shame or belittle or poke fun at such extremists do not work and are simply “more likely to stir their arrogance and rouse in them a reckless aggressiveness.” Just think of Rush Limbaugh, who DOES speak for this minority, and apply the above.

3. Right-wing political and economic extremists proclaim that all they want is freedom. “Freedom” has been their mantra in all debates, despite the fact that the freedom they advocated in health care meant more dictatorial authority in the hands of their insurers, and likely more costs to everyone in society, including the economy in general. But the “freedom” they speak of is not one that involves toleration of speech or press or political association. They make no apology for opposing representative government, another of Eco’s conditions of “eternal fascism.” Why this apparent internal contradiction of being “for freedom” but against its practice? Hoffer contends that true believers find, in effect, that “freedom is an irksome burden . . . We join a mass movement to escape from individual responsibility, or, in the words of an ardent young Nazi, ‘to be free from freedom.’”

4. These right-wing adherents offer no complex solutions to complex issues because they are not interested in being autonomous human beings (truly free persons) struggling with difficult, sometimes irresolvable problems. Their “‘Leader’ pretends to be their interpreter,” Eco argues. So, all that these right-wing followers have to do is use a few words that they believe are so obnoxious as to turn people away from any reform – words like “communism” and “socialism” and, laughably, “liberalism.” “All of the Nazi or Fascist schoolbooks made use of an impoverished vocabulary, and an elementary syntax,” Eco claimed, “in order to limit the instruments for complex and critical reasoning.” The final word on this matter belongs to Hoffer in his description of the antithesis of the true believer:

“Free men are aware of the imperfection of human affairs, and they are willing to fight and die for that which is not perfect. They know that basic human problems can have no final solutions, that our freedom, justice, equality, etc. are far from absolute, and that the good life is compounded of half measures, compromises, lesser evils, and gropings toward the perfect. The rejection of approximations and the insistence on absolutes are the manifestation of a nihilism that loathes freedom, tolerance, and equity.”

The right-wing in the U. S. today should not be taken lightly. Allusions to them as “fascistic” are not entirely misplaced. The cruelties of Mussolini and the Nazis need not be repeated for fascist tendencies to be in play. And, we should all heed Hoffer’s understanding of the truly free person. In doing so, however, we must take “willful ignorance” as a serious threat and a social disease that must be cured insofar as possible.

As postscript, here are a couple of recent quotations peeled off the internet on the matters above:

“Hopefully, the attempt to restore 1953 America will not turn into an attempt to impose 1933 Germany.” (JohnG – Paul Krugman, “Comments of the Moment” list, Aug. 30, 2009, NYTimes)

“If ignorance is bliss, why are these people so angry?” (Len Kaminsky, cited in Paul Krugman, “Comments of the Moment” list, Aug. 30, 2009, NYTimes)

Tuesday, August 11, 2009

No More Good Will

Barack Obama came to the presidency of the U. S. not so much as the next liberal president or the next reforming president but as the “good will” president. His invitation to “good will” cooperation is ending as badly as possible. With his inability to find a “good will” purchase point from which to lever improvements in American society and culture, we may be seeing the end of the worthwhile remnants of American democracy.

This is not to say that “democracy” has been the most important thing to Americans in their short past, despite their wish to have-their-cake-and-eat-it-too by basking in the moral glow associated with democracy. Americans have always been more advocates of “freedom” than democracy (“Live Free or Die” reads the New Hampshire license plate; something one would expect to read as a tattoo on a Hells Angels biker rather than as a state motto). They have also been more “cowboy capitalists” than democrats (capitalism meaning only the right to make a lot of money and then restrict the free market in order to retain that wealth; yes, whether we are talking about persons or corporations, that is the American definition of capitalism; let Adam Smith turn over and over and over in his grave).

But now something more precious yet has been lost. What has been lost is “good will” -- the willingness to let policies be debated, for people to disagree, for one’s ideas to be defeated by the will of an informed majority, and above all, to trust one’s fellow citizens. The reasons for this failure are several. Destroyers of “good will” believe that there is only ONE truth. Enemies of “good will” do not believe that there is such a thing as an informed majority, and whether they are the current majority or the current minority they will not agree to any policy or action that defies their own particular views – in full. The rest of us expend little effort in dispelling their “willful ignorance.” Hate-mongering “citizens” wear their ignorance as a badge of honour and never appeal to reason while we willingly respect their right to vent their spleen rather than spit upon their offensive ideas and positions. In an earlier time, everyone would turn away from such lunatics with disgust and never, ever heed their claims. Because these creatures of hate promote certainties about life (and afterlife for that matter) that are based on “received knowledge” – that is, not knowledge at all but blind belief – trust of one’s fellow citizens is irrelevant. In fact, there no longer is such a thing as “citizen.” Now we have “stakeholders” (those who hold financial stock in the enterprise, not those who were born and live in a community).

The prospect could not be more appalling. Some like to see the problem as “Republicans” or “right-wingers” or “free-marketers” or “pro-lifers” or “evangelical rightists.” And, we apply labels like “fascists” to these people and groups, somehow trying to relieve ourselves of their presence and ugly brutishness by dismissing them as advocates of an antiquated and rejected way of thinking. We are wrong. They are worse than “fascists.” “Kristallnacht” has a close cousin in “town meetings” where gutless screamers shout down Congressmen/women who have organized these meetings to get “input” on the issue of health care. Goebbels has nothing on a radio announcer who thinks that a few liberals should be left alive so that we can remind ourselves of what we should avoid. National Socialism’s emphasis on intelligent, healthy Aryan youth has nothing on a society that does not believe that millions of their own citizens should receive preventative and rudimentary health care.

If some vigorous (not violent, not hateful) response is not made very, very soon, Americans are headed for something much worse than second-class world economic status. They are headed for a persistent culture of “hate,” something akin to what has existed in Northern Ireland and still exists in Israel/Palestine. The not-so-Holy Land will have a New World counterpart.

Friday, August 7, 2009

Chronology

A couple of weeks ago, I was reading Anthony Grafton’s Worlds Made By Words: Scholarship and Community in the Modern West. Among other themes, Grafton’s essays consider the wide intellectual interests of Renaissance figures, interests that they believed were not particularized and distinct but integrated and closely related. In one essay he discusses Johannes Kepler, the brilliant scientist who introduced the idea of elliptical orbits of the planets (pgs. 114-36). Kepler, it seems, had a passion for chronology, not quite what we today would call “history,” but rather the dull, plodding ordering of people and events. Why? Because Kepler knew that knowledge was constructed over time in a particular order and that that order was critical in discovering truths and dismissing myths about the natural order and the universe.

Kepler was battling what modern advocates of evolutionary science have to fight – opponents who find the hand of God in creating complex things out of nothing, or find God intervening capriciously in the evolution of knowledge, or those who think that the past is made up of quixotic jumps from one particular set of circumstances to an utterly different set of circumstances. Evolution is about the order of things, and getting that order of things right. Whether evolution is gradual or subject to surges (à la Stephen Jay Gould), it is a matter of chronological order.

My son, Ike, and I were discussing this on the phone the other day. When I told him that I was struggling with writing a little essay on this subject, he got enthusiastic, saying that he, too, was frequently troubled by the realization that most people have no idea about “what followed what” in history. Ike was an anthropology/archaeology major in university and thus may feel Kepler’s concern more intimately than even I do. (I say this because my interest in history is less with chronologies than with asking relevant questions about some period in the past so that the answers might enlighten us about who we are in the present). Chronology is more dogged (and I might say more boring) than that, even though it is an absolutely essential underpinning in order to address the things I think need to be considered in history.

After reflecting on this problem more, let me pose this issue of chronology in two ways – one way that is essentialist and the other way that is relativistic.

1. We need to understand how one thing necessarily follows another in time.

2. We need to realize the relative nature of time in regard to different aspects of the natural and social world.

By now you are saying, these are trite truisms – as perhaps they are. But, do you have a firm and extensive grasp about how things have evolved over time? Do you understand the necessity of Greek and Roman philosophy to the evolution of early Christianity, to the Renaissance, to the emergence of the modern world? Do you know when the modern world began, and what the chronological landmarks of increased modernization and modernism are? Do you understand the necessity of the long evolution of Catholic doctrine as a condition for the emergence of the Reformation? Do you understand that the Enlightenment was built on a broad foundation of ideas that emerged in the 16th and 17th centuries? Do you realize that the rise of a “national people” in the American and French Revolutions was not some idea that suddenly came to everyone as just common sense but was built on foundations that included, for example, Bolingbroke’s reflections about what made a “patriot king”? Do you remember that Marx called the rise of the capitalist “bourgeoisie” the most revolutionary thing that had happened in modern times, and then suggested communism was simply a logical and inevitable response to that radical development? And, so it goes – on and on – in a correct order.

Those who do not realize how things evolve chronologically from the past to the present allow themselves to believe several preposterous and false things:

1. They may believe in inexplicable divine creation and whimsical divine intervention.

2. They may believe in something historically akin to “immaculate conception,” “spontaneous combustion,” or fate and chance; things just happen, in no particular order or with no particular meaning. Thucydides wrote his History of the Peloponnesian War because he knew that when and how one thing followed another in time produced critical consequences.

3. They may also believe that “now” – the present – is the only reality, and that the past is “dead” and gone; the modernist (and false) corollary is that the present is superior to the past in all ways, the past being just a collage of inferiorities and failures. This latter belief is intimately tied to the rise of technology. People know that the discovery of electricity preceded their laptop computer, and they conclude that the present is always superior to the past because it is always technologically more advanced (many individuals deny this, but their behavior shows that they indeed embrace this false concept of past and present).

In other words, those with little respect for and knowledge of chronology can be characterized as either blissfully arrogant in their ignorance or suffering from a kind of subconscious vertigo when viewing the world outside their immediate home and locale.

But in fact, of course, all people do have some sense of time passing and of events following one another, and “home and locale” provide their understanding. They get their sense of time from the family and from the small events immediately relevant to their lives. This substitute for a broader knowledge of chronology prejudices everything in favor of the individual and the family. If we do not know anything about how our culture and society evolved over time, we discount the profound importance of that broader world, even if and when our well-being and very lives may depend on a knowledge of events in time.

That time is relativistic, and not meted out in even measures, can be seen in the false categorization of time that we all participate in and promote. This is seen most prominently in our depiction of decades – the 1920s, 30s, 40s, 50s, 60s and apparently on and on for the rest of the modern era (an era which may, in fact, be nearly over). But the 20s might be seen as 1922 to 1929, and the 50s as 1948 to 1963, and so on – all depending on understood perspectives. We historians commit the same error in regard to centuries. We glibly say the 18th century was this, and the 19th century that, and the 20th century something else. Chopping up the passage of time into decades and centuries ignores the fact that some time is measured in very long paces (geological time, for example); that some foundations of society and cultures are tied to a longue durée of time (which Marc Bloch, Fernand Braudel, Le Roy Ladurie and the Annales school of history have presented as more important than mere surface events [histoire évènementielle]); and, finally, that some historical events have sped up time (the French, Russian, and Chinese revolutions, the two World Wars, the introduction of moveable type and the rise of the internet).

Like serious music, historical time is not merely the matter of whole notes held over several bars but rather the interplay of whole through 64th notes (the speed of time) played in varying time measures (the different pace in the march of time depending on place). It involves melodies (prevailing historical tendencies over a long period of time – e.g., modernization) answered by counter-melodies (resurgent traditionalism – e.g., reborn evangelicalism or rediscovered ideological capitalism). And, it hosts new themes suddenly and dramatically introduced which throw earlier themes akimbo (revolutions -- quiet or violent).

Ike and I share a kind of quizzical interest when we meet a new person, quietly wondering just where this person places herself in time and the universe. The result can be like giving a lecture to a class of immigrants who have a limited knowledge of the language; just how much vocabulary can I use that makes sense to this person?

At its worst, in a democratic society, large numbers of people having no sense of historical time can lead to some bad consequences, as we have experienced again and again. A majority may subscribe to static religious answers to who and where we are, and force the rest of us to accede to their solutions to what they see as “our” problems. Not knowing the long evolution over time of human and civil rights, still others might re-institute restraints that the rest of us thought had been sorted out in an earlier era (or earlier century, more likely). The same may be said of those who have resurrected unlimited free-market capitalism, apparently oblivious to the fact that we already went through that cycle more than a century ago, and that over time we learned what was wrong with unyielding implementation of that ideology (did Milton Friedman know nothing about the past, nothing whatsoever?). We need to know what about the past is irrelevant or truly “past,” and we need to know what about the past persists and is relevant today (this is my nod to George Santayana). We can still respect the ordered events of the past as different from today, as a distinct “other time,” without considering the past to be a time inferior or irrelevant to our own. Insofar as possible, we need to know chronologically “where” we are; historians can add the ingredient of analyzing “how” we got here. No one can provide a full and satisfying answer to “why,” however.

Thursday, June 25, 2009

Fifteen books

My friend, Maren Wood (Dr. Maren Wood, that is), recently posted on "Facebook" a list of 15 favorite books that she could remember and list in 15 minutes. I thought it was quite interesting. I do not know if she believes these books reflect who she is, or who she has become, or just whether she found these books amusing or interesting or informative.
I was going to do the same thing, but then I began to reflect on this exercise a bit. I am 66 years old. I have read a few more books than Maren but, more importantly, I read many of them a VERY long time ago. What did influence me? And, were my influential books ones that I enjoyed? Just what would I be saying if I made such a list? I still do not know the answers to all of those questions, but I made the list anyway. To be fair, I probably took 30 minutes to put it together. And, in some ways I am not surprised by what I listed, while in other ways I am stunningly surprised. The books I listed, I should note at the start, are ones that profoundly affected my thinking or my perspective at the time that I read them, and for some time thereafter. I cannot, on the other hand, quote from any of these books. Nor can I even summarize the story or the ideas or, sometimes, even all of the subject matter. First, I will list them, and then maybe I can explain some things.

15 Books

(in order of publication date, not importance, I guess)

1. Michel de Montaigne, Essays (1580)

(Montaigne had human behavior and individual aspirations and human foibles sorted out long before Freud or modern psychology or modern sociology)

2. Thomas Paine, The Rights of Man (1791)

(Paine invented modern democracy. He invented a modern journalistic style of argumentation. He articulated the true nature of society and its needs in a positive, even optimistic way. And, what did he get for this? The refusal of the country he helped bring into being -- the U. S. -- to reclaim him as a citizen from a Jacobin prison cell. Admire George Washington all you want; he failed the morality and decency test on this one (as did a number of people). The only thing his critics then and now had right was that Paine could be a pain as a dinner or house guest.)

3. Ralph Waldo Emerson, Essays (1841, 1844)

(I would not want to have dinner with Emerson either. Stuffy, arrogant and oh so judgmental, in a way only New Englanders can be. But I would like to write with the care and precision and insight with which he wrote. His themes and ideas on life and nature are broad, universal, and inspirational, with just the right dash of east Asian religious philosophy.)

4. Henry David Thoreau, Civil Disobedience (1849)

(Emerson’s better half; and, to his credit, Emerson knew and admitted this. This is an essay for a young person. When you get older, you squirm uneasily in your seat as you read it, and realize you have not lived up to your human and civic responsibility. I could have included Walden (also written for the young), or Thoreau’s other naturalist stuff, which is just as important, but this essay lays our responsibilities on the line.)

5. George Eliot (Marian Evans), Middlemarch (1874)

(I read this later in life. It is a romantic novel, and its sentiments would be syrupy or cloying in the hands of a less skilled writer. But Evans makes her characters warmly human and sympathetic, and as the pages turn you cannot help but feel better about individual human beings – well, about the potential she found in human beings. Some call this the best novel ever -- I will not disagree.)

6. Emile Zola, Germinal (1885)

(My brother brought this book home during his first year of university, and I may have read it that summer. In any case, I read it when I was young and had hardly heard of Marx or the miseries of industrialization or the need for social justice or – well, you get the idea. This novel was like a slug in the stomach. One of the best works of realism (well, it was one of the novels that introduced realism), it is moving throughout, despite being a long book.)

7. George Orwell, Essays (1920-1950)

(Orwell was a journalist and essayist. Animal Farm and 1984, while brilliant in theme, are not especially well written and are heavy-handed in satire and argument. Orwell the essayist is always engaging as an observer, as a critic of those modern forces that have overwhelmed humanity and deeply harmed fragile human beings. I do not always agree with his politics but as a 20th century critic he was not bettered by anyone.)

8. Hannah Arendt, The Origins of Totalitarianism (1951)

(I read this book in graduate school, and it introduced me to a thinker and writer whom I have admired and recommended ever since. All of her works set out the limits of human achievement. Besides articulating the “banality of evil” in individual and collective human behavior, Arendt sets out the profound underpinnings of totalitarianism in a manner that suggests its ubiquitousness and its persistence. Totalitarianism is still with us, as is fascism.)

9. Kurt Vonnegut, Cat’s Cradle (1963)

(As a science fiction story that I cannot remember, this novel is a bit of an oddity on this list. But I remember it as the best of Vonnegut. Sometime in the 1960s I made Vonnegut my summer reading, and I loved his whimsy and his sense of the absurdity of modern life. Many saw him as a fun read, but I think he was a powerful thinker with a lot to say for his time. By the way, Vonnegut apparently pushed this novel as his dissertation in anthropology, and finally won out!)

10. Annie Dillard, Pilgrim at Tinker Creek (1974)

(Nature, ecology, philosophy, religion, and sensuality (yes, beyond the cat scene in the opening; there is always something sensual about Annie Dillard, matched only by a brilliant intellect), all laid out by a gifted young writer – well, she was young at that time – and articulated in the small compass of Tinker Creek in Virginia. I love her later stuff as well – especially For the Time Being.)

11. V. S. Naipaul, A Bend in the River (1979)

(The persistent power of tribalism over nationalism and modernism and democracy, and the suffocating grip of the past over the future make this a dismal but compelling novel about Africa in general and the Congo in particular. I once took up Naipaul as my summer reading, and when I displayed signs of persistent depression and melancholy, and June discovered what I was reading, she made me quit. Good thing. Naipaul is too often right about the worst prospects for the future.)

12. William McNeill, The Pursuit of Power: Technology, Armed Force, and Society since A.D. 1000 (1982)

(This is a brilliant work of big history. I made it the foundation for the latter portion of my History 1000 classes for the last 15 years of my teaching. The relationship of the economy and democracy and arms build-ups and war is frighteningly laid out by McNeill.)

13. Robert Darnton, The Great Cat Massacre and Other Episodes in French Cultural History (1984)

(While many other works might take the place of this one (some by Le Roy Ladurie, for example), this was one of the first and one of the best representations of cultural history and its promise – at least for me. As an old-fashioned political historian, I found that it released me from the dull statistics of social history and let me imagine a future for history that was vibrant and alive. It also encouraged another field to emerge – microhistory – a field that promises to be as important as cultural history.)

14. Charles Taylor, The Ethics of Authenticity (1991)

(A lot of the historical interpretation I have brought to my writing has centered on human agency, autonomy and authority. Taylor takes on the existential, relativist side of these qualities and, again, while I do not always reach the conclusions he does, no one has seized this issue more intelligently than Taylor.)

15. Louis Menand, The Metaphysical Club (2001)

(The best book I have read in the last ten years, again written by a journalist. Menand frames the real lives of the leading pragmatist thinkers – Oliver Wendell Holmes, Jr., Charles Peirce, and William James – in a fascinating and engaging way. Menand’s book is one of three – the other two being Wallace Stegner’s Wolf Willow (1962) and Norman Maclean’s A River Runs through It (1976) – that I have either given away or recommended highly to a lot of people.)

Conclusion:

What surprises me is how serious most of this stuff is: I am not a fun-lovin' guy. I was also struck by how little of this list consists of "official" history; sure, I list McNeill and Darnton -- who are academics. And, Menand is a popular historian; Arendt is a philosopher who knows her history (what a rarity!), and Naipaul is a novelist who knows history and much more. But secondary works of history have not often impressed me, partly because I find them a boring read and partly because I find the analysis in academic history to be weak or limited. I believe that "history matters more," but not that certified historians matter more. I am also surprised to see how many of the things that have influenced me have been essays -- sometimes by philosophers, often by journalists. Montaigne, Paine, Emerson, Thoreau, Orwell, Darnton and Taylor are all essentially essayists. Perhaps I am too lazy for longer books; perhaps I like arguments presented in a short space without all of the footnotes and documentation. Six on my list are Americans, although their ideas range well beyond national boundaries, in the main. Only Thoreau, Dillard, and Menand have written on themes local to America, and even they are looking at a more distant horizon.

Ingmar Bergman begins his film Fanny and Alexander (1982) by portraying his nostalgic memories of Christmas at his grandmother's home. I somehow recall (but cannot locate and quote) a scene early in the film where Bergman's fictional grandmother -- Helena Ekdahl (played by Gunn Wållgren) -- admits that she never knew what a mother was, and that she only acted out what she thought was the role of being a mother. Becoming and imitating and role-playing rather than essentialist being is, of course, pure existentialism and pure Bergman. Looking at my list of books, I wonder if my life has not played out as a contest between essentialism and existentialism. Many of these books shaped my views of society, politics, human agency, cultural interaction, manners, morals, and nature. Have I just lived a life imitating the arguments, ideas and persuasions of books? Or, did I choose those books to read -- and to be influenced by -- because of already established proclivities of mind and sentiment? It is a "chicken and egg" question, I know, but the degree to which I give weight to either side of this question is of interest to me.

Tuesday, June 16, 2009

Listening

     Our ability to say what we want, to express our opinions, to reveal our inner-most emotions, to let people know what we are thinking or feeling RIGHT NOW, has never been greater. Most people are not going to get the opportunity of journalists or editors or pundits to let the world know what they have on their minds or in their hearts. But who needs that? We all have the chance to say what we want, when we want to say it. Conversations have never been more free. Electronic media -- such as e-mail, blogs, "Skype," "Facebook," "Twitter," and a long list of devices from cell phones to photo phones to Blackberrys -- have produced FREE SPEECH (in most cultures and countries). Even repressive regimes cannot stop people from speaking their minds. (I am only stating the "bleeding obvious" here.)
     Post-modern analysts have elevated "discourse" above all other motive forces in human relations. Long ago, it became the catchword of Michel Foucault and deconstructionists like Jacques Derrida. Now it is the catchword of thousands of academics, and it is dismissed by its opponents as part of a culture of secular humanism. The material world, the world of institutions, the world of traditional culture, has given way to a world identified by, and defined by, discourse alone.
     Do not get me wrong. I love this spread of what we now call "discourse." I love being able to retrieve the opinions of loved ones, friends, and acquaintances about all manner of subjects. Two of my good friends are, for want of a better word, "talkers." I am happy with that. I learn much from them -- both in regard to rational thought and modern emotions. I delight in the opportunity for millions of people to express ideas and feelings to a wide audience. "Facebook," which I have criticized in another blog, allows my "friends" to let the rest of us know how they feel, what they are doing, what they feel is important -- now.
     But discourse is not just about making speeches. Discourse is not just about expressing one's own feelings and angst and outrage and opinions. Discourse is, by dictionary definition, also about "conversation," and for "conversation" to occur, there has to be a "listener." No, that is not quite right: there have to be at least two persons who are both speakers and listeners.
     In the last twenty or so years of my life, I have acquired the ability (although I have not always exercised it wisely) to listen to others speak at great length, with only an occasional comment or reflection on my part. The people with whom I have done this would probably deny it. They would claim that I spoke, interjected, interrupted, and generally dominated the "conversation" more than they did. I have been charged, sometimes rightly, with taking up all of the conversation time. But, lately, I have actually been timing how often I speak, and how often those in my company speak. In most cases, I have not exceeded my quota of time.
     This has been driven home to me in a concrete way by the fact that I have been conducting oral history interviews of first-generation members of the administration, faculty, and student body of my university. In some of these interviews, I have almost been an inanimate object. My subjects have narratives to tell, and they have not needed questions to propel them forward. In other interviews, I have commented on one or another subject, in the hopes of eliciting some response from my interviewee. But, in all of my interviews, I have begun to recognize again the importance of just being there, of making eye contact, of showing an interest, of smothering a smile or a laugh, of nodding in agreement or shaking my head in disagreement. I am the listener. Although my family and friends would not believe this, I truly enjoy just listening.
     Much of post-modern discourse literature and theory acknowledges the importance of listening. But, aside from "reader response" theories of literature, there has been only a modest concession given to "listening" as an important part of discourse. By this I mean true listening:  listening not just to the ideas and opinions of others but to the cadence of the speech of others, to the manner in which they express themselves, and to the modes of expression they employ. Unfortunately, listening has become much like manners -- something one can ignore with social impunity. But no one can be a whole person without listening; just as no one can be a whole person without expressing themselves in some way. 

Wednesday, June 10, 2009

[This letter to the editor was published in The Lethbridge Herald for June 10, 2009]

      Alberta Bill 44 provides that parents (or guardians) of students must be notified “where courses of study, educational programs or instructional materials . . . include subject-matter that deals explicitly with religion, sexuality or sexual orientation.” Safe teachers will interpret “explicitly” as “to address in ANY WAY matters regarding religion, sexuality or sexual orientation.” That is one big problem. Another is that our Legislative Assembly wrongly thinks that religion and sexuality and sexual orientation are discrete subjects that can be segregated, even surgically divided, from other subjects. Their understanding of “knowledge acquisition” (to use an ugly but revealing business term for education) is narrow, laughable, and ludicrous. Most real knowledge is integrated, and “religion” and “sex” are among those subjects that overlap, intersect and merge with other subjects in all sorts of ways.

     “Religion,” especially, is a subject so broad and so intimately involved with the basic elements of being human that it touches upon almost every other subject involving human beings. Being religious (or not) has to do with how we view our world; how we react to our world; how we make sense of our world. To place it under special status (totalitarian regimes over the last century have eagerly placed issues of import under this status) is to eliminate much of what we call philosophy, history, and the study of society (to say nothing of art).

    “Sexuality” and “sexual orientation” are just as problematic. While we KNOW what the legislature thinks it means by “sexual orientation,” they are babes-in-the-woods in regard to the broad subject-matter of “sexuality,” subject-matter intimately associated with many critical aspects of human development. The message is clear for any teacher, however:  stay away from anything that has to do with processes of biological reproduction in any form and, for good measure, anything dealing with human affection and intimacy. Who knows how fast those subjects might suddenly veer into the forbidden realm of human sexuality?

     If we lived in a province where “reasonable expectations” prevailed, good teachers with options would leave, and faculties of education would howl in protest. But we live in Alberta, an alternate-reality universe. Aside from a few courageous students and teachers, we will see little more than tighter lips, and young people poorly prepared for the world in which they live.

Monday, June 1, 2009

Halve Everything

     By now it is obvious that the hope aroused by the candidacy and then the election of President Obama has not caused a sea-change in the hearts and minds of his countrymen/women. Historians may find, fifty years from now, some important change among some young people who were inspired by the President's appeal to pragmatism and decency and reform. Right now, no such transformation is apparent. Why has the rocket fizzled? First, because, although the Republican Party and most of its federal representatives appear insane at best, congressional Democrats appear little better -- a hopeless motley crew of partisan hacks, intellectual light-weights, and visionless place-holders. They simply have no clue about moving forward in a clear, uncompromising direction toward anything. They consider halving every piece of legislation as bold and courageous reform. Secondly, and even worse, President Obama, partly because Congress has forced him to do so, has decided to halve everything as well:  half a stimulus program; half ownership of GM (actually more); assuming half the costs of an irresponsible banking industry; a half-assed health program that promises very little change; half-way measures in pursuing those responsible for introducing a widespread torture program into American "intelligence" work; a halved promise on closing the Guantanamo Bay "facility"; a half-and-half attitude toward the virtues of unregulated capitalism.
     But these are surface issues, ones that can be changed with the election of a better Congress, and the appointment of better justices to the federal courts, and perhaps evidence of more backbone in the current administration once some victories are posted. Deeper cultural currents and bigger problems cannot be so easily eradicated. Here are my seven deadly sins of American politics and society:
     1. The American public, as a body of responsible citizens, continues to lag behind every democratic, or democratically developing, country in terms of political acumen and activism. "I am not political" is a phrase worn as a badge of moral honor only in North America. When students began to use this "excuse" with me in Canada in the 1970s, I developed a standard response:  "If you are not political, you are immoral." By political, I mean something more than passively voting. I mean acquainting oneself with the political issues of the day; protesting policies one considers bad or wrong-headed through a variety of means; and, discussing politics with one's acquaintances. These are the minimums. Contributing to a political party or working for a campaign or signing petitions and supporting online political interest groups is a step further in the right direction.
     2. American journalism is immoral in the news it chooses to cover, in the manner in which it reports the news, and in the ways it chooses to analyze the news. Failures of omission and of commission are replete throughout all branches of the media. If we are not being addressed by vacuous air-heads of both sexes, whose hairdos alone tell you where they place their priorities, we are being assaulted and insulted by a parade of right-wing "experts" and subdued moderates in what journalism considers "balance" in analysis. There is no balance, and even if all sides were represented equally in these "debates," halving the views of two extremes does not result in truth and sensibility.
     3. "We live in the grip of the most powerful ideology the world has ever known -- capitalism." These are the words I used for over twenty years in my first year history classes whenever the issue of ideologies of the past became a topic of the course. Most older students thought I was going to end that sentence with the word -- "communism." The rest shrugged this sentence off as irrelevant, set against the power of pop culture (which is itself a partner in maintaining the myth of capitalist inevitability). But the pervasive and destructive influence of capitalism as an ideology seems to continue. And, it has emerged from our financial crisis virtually unscathed -- a remarkable feat for a set of ideas that should have been badly damaged by its advocates and extreme enthusiasts. Indeed, journalists make no objection when commentators -- or the "punditocracy," as Michael Moore correctly calls it -- sweepingly proclaim that the free market system is sacrosanct and must not be impeded. What utter nonsense. Some things must be nationalized (health care, we say today; roads and public utilities, so said Adam Smith in 1776; and why, by the way, don't right-wing ideologists read and cite him). Some things need regulation (uh, savings-and-loans, as proven by the early 1990s fiasco under Bush I, and the banks, as proven today). And, some things need to be driven by the market (our choices in what foods we want in our restaurants and what clothes styles we want to put on our backs).
     4. Paul Krugman, in a recent NY Times opinion piece, identifies the beginning of the current economic crisis with the Reagan administration. This is true. I recently came across a talk I gave when Reagan was re-elected in 1984, and was reminded again that I never could comprehend his election to either term. What were people thinking? He was not even the jolly person most people made him out to be. He was a vicious anti-communist, anti-unionist, and anti-government activist. He presented himself as some kind of lollipop libertarian; maybe that's why people think he was sweet. And then, just like the New England Puritans of the 17th century, the Republican presidential leadership proceeded to decline. Bush I (a seemingly good-hearted and courageous veteran), along with his country-club, pretty-boy running mate -- Dan Quayle, stumbled through a term. Newt Gingrich then steered the Republicans of the 1990s into an Alice-in-Wonderland vision of politics and economics and the future. And, then there was Bush II, a man almost as shocked as William Henry Harrison to be inhabiting the White House. We know the rest about the worst president in American history; Bush II was kind of the "Secretariat" (to use a horse-racing analogy unflattering to that great race horse) of bad and evil politics. Thirty years of wrong ideas, of "spend a lot but don't tax" policies, of anti-democratic politics, have left most of us with no memory of how politics might be practiced correctly.
     5. Only in the impoverished world do we see a middle and lower class as dispirited as we find them in the U.S. They have been down so long that just keeping a job, or keeping a paycheck that does not rise with inflation, is seen as a victory to be celebrated. Marx was only partly right in calling "religion the opiate of the masses"; sports, pop culture diversions, and, hey, real opiates, are also part of the "opiate[s] of the masses." Some say that ordinary folk have been "dumbed down." It is worse than that; they have been thoroughly anesthetized against hope and planning for the future. No hope and no planning are emblematic of societies of the poor throughout the world.
     6. How long have we put up with fighting the brush fires of idiotic right-wing political and religious groups and advocates? OK, abortion is not a good thing; but given sex education in the U.S. (and many other places) it is at least a necessary "evil." Plus, as a man, I expect to have authority over my body; women should too.  Darwinian evolution is right, insofar as it has withstood every credible scientific test applied against it. Schools are not places over which parents en masse should determine curriculum or how subjects should be taught. Parents must insist on the production of good teachers, and then get out of the way. Being "gay" or "lesbian" is natural; "homosexuality," for want of a better comprehensive word, has existed from ancient times to the present. The only debate is how many people are naturally gay or lesbian; and that, my friends, is a discussion just too, too boring for me to address. Stupid cultural and moral issues are exhausting, and they divert us from the real issues of how millions of real people are to live their real lives well.
     7. No one, from teenagers to the enfeebled elderly, is "entitled" to all that they claim. Yes, the young should be educated and protected. Yes, the elderly should be cared for in a humane and caring way. After that, it is all a matter of how far a society wants to go to enhance these protections without extending false expectations. If you are a lazy and not very bright teenager, you should expect the consequences of those twin failings -- one outside your control, the other supposedly within it. If you are a cranky, contentious, and poor senior, you should expect something less than luxury and fawning attention from those around you. There is no historical imperative that any age group should lead a life of sybaritic ease, or that ennui is the correct and expected response to unfulfilled expectations.
    So, with these 7 Deadly Sins still in full play, I am not anticipating seeing anything like the changes to politics and society that, only a few months ago, I thought might be possible in my lifetime.
[For those who think I am being harsh regarding the Obama administration, please read Kevin Baker's article, "Barack Hoover Obama:  The Best and the Brightest Blow It Again," Harper's Magazine, July 2009.]