Monday, November 30, 2009

American Exceptionalism: History and Reality Denied

Ignoring American claims of “exceptionalism” is as impossible as ignoring the U. S. as a whole. Exceptionalism, nationalism, and identity are welded together in the American psyche to form a shield that seems impervious to the facts of comparative history, the conclusions of rational discourse, or even simple close observation.*

A recent opinion column by Thomas L. Friedman in the NY Times resurrects some of the distended delusions that can follow from the over-deployment of the exceptionalist claim. In “Advice from Grandma” (Nov. 11, 2009), Friedman, to his great credit, concludes with an excellent prescription regarding citizenship and politics so that the U. S. might avoid a “suboptimal” future in the new competitive world economy. (I have offered a similar prescription at various times on this blog.) But before offering this prescription, Friedman rehearses some of the old canards about American exceptionalism. He begins by noting disapprovingly that some would claim that while the 19th century was “owned” by Great Britain and the 20th century was “owned” by the U. S., the 21st century is to be owned by China. Unwilling to concede the last part of this equation, Friedman argues to the contrary that the U. S. will still maintain dominance as a great world economic power through the special genius of American “imagination.” Citing the Apple iPod as an example, Friedman proceeds to argue that America “is still the world’s greatest dream machine.” This is American exceptionalism a la Friedman today.

So, there we have it, the latest in a long line of characteristics that presumably make the U. S. not only “exceptional” but perhaps even outside the normal constraints of history as well. In 1630, it was John Winthrop’s claim of “a city upon a hill” for all to see (and admire) that established the first exceptionalist claim; that lasted less than a century. In the late 18th century it was the American Revolution, which was deemed exceptional in that, unlike the French Revolution, it led to “republican virtue.” In the early 19th century, with “republican virtue” all but invisible, it was Jacksonian democracy and individualism; and then it was American expansionism and “manifest destiny” or, in other words, the ability of the U. S. to steal lands formerly belonging to Mexico. In the latter half of the 19th century, Americans claimed they were exceptional in their brand of industrial expansion and in their invention of the modern corporation. And, they crowed, the American “dream machine” was emblemized by the inventions of Thomas Edison -- who promised a major, life-changing invention every few months or so -- followed closely by the mass-production genius of Henry Ford. In the early 20th century, it was the continental empire of vast resources, combined with the “democratization” of “lesser” peoples who had fallen under American military or economic suzerainty, that marked the rewards of American exceptionalism. By the mid-20th century we were all watching a fifteen-minute television show named “Industry on Parade”; the industry was all American, all of the time, and endless in its promise of growth and prosperity (a laughable visual image today). By the late 20th century, we were told that American exceptionalism could be found in its ultra-advanced economy, which wedded finance, industry, service industries, and technology in a way the rest of the world could only envy. 
By the end of the 20th century, this exceptionalism had been pared back to claims of technological and educational superiority. Then came the tech stock collapse and the realization that Asians and Indians and “even” the Irish had developed far better educational and mathematical skills than Americans could match.

So, what is left? What is left are still-vague claims about the superiorities of American character. As hard as this is to believe, American exceptionalism remains a substitute for history and historical understanding in the minds of even well-educated Americans (e.g., Friedman’s “imagination” claim). But the experience we have garnered from globalization and world trade is that all cultures have about the same abilities – given the chance for a level playing field – to succeed. If you want to see “imagination” at work, just watch any group of poor people around the world who, through their vast capacities for invention, find ways to raise food, provide shelter for their families, and sometimes even advance the education of their children.

Globalization is here to stay, and with it come new ideas about law and rights and the protection of all of the world’s citizens in fundamental ways – health care, decent shelter, clean water, good food, education, security, and employment – all of which must be addressed and instituted. Exceptionalism is the last bastion of the 19th-century nation-state. It is the moral equivalent of claims about racial superiority and inferiority. And we can only avoid global ecological and environmental disasters if we abandon exceptionalisms of all types. If the world community can make any progress on these fronts, we will all have made ourselves more exceptional than any people who ever lived in the past.


* I have addressed this issue academically in, “ ‘And We Burned Down the White House, Too’: American History, Canadian Undergraduates, and Nationalism,” The History Teacher 37, no. 3 (May 2004); and, re-published, in part, as “American History, Canadian Undergraduates, and Nationalism” in Carl Guarneri and James Davis, eds., Teaching American History in a Global Context (2008).

Wednesday, November 18, 2009

"Mac vs PC"

We have all seen the “I am a Mac, and I’m a PC” television ads. Justin Long (“Mac” – a skilled actor who retains a boyish appearance and cultivates a suave yet youthful demeanor) and John Hodgman (“PC” – a wildly inventive and always slightly overweight comedian) stand together against a blank white background. “PC” usually opens with seemingly limitless optimism about the product he represents, only to be disappointed, and then embarrassed, when his latest defense of the PC and its operating systems and applications is utterly undermined. Of course, the claimed superiority of Apple computers strains credulity, and we are all encouraged to wink at some of the near falsehoods at the fuzzy edges of these ads – falsehoods somehow made unimportant in the general spirit of humor and entertainment.

It is these ads, in fact, that suggest the first of three observations I want to make about how Mac computers have come to be symbolically representative of important and controversial elements coursing through contemporary North American culture. In recent decades, the “hard sell” approach to advertising has been revitalized. Billy Mays’s irritating voice (now silenced by his early death) and carnival hucksters like Vince Shlomi (“ShamWow”) are just the grotesque edge of the hard-sell approach. Sherwood Anderson’s “Winesburg, Ohio” suggested long ago that Americans were attracted to grotesques (although one suspects that Anderson did not really have these characters in mind). Fifty years ago, all of my teenaged friends and acquaintances sneered at the hokey lies of the hard sell. How could anyone succumb to the lies and rants of these men (yes, they were all men; watch the TV show “Mad Men”)? We all loved the VW ads, or the ads for any product that had even a hint of self-deprecation in its message.

The interesting thing about the “Mac” ads is the subtle soft sell underlying the blunt implication that Macs are better than PCs. This soft sell is founded on the underlying decency of both central characters – “Mac” and “PC.” They meet like well-meaning acquaintances, if not quite friends. In fact, they seem to genuinely like each other. There is no muscle flexing and fist-pumping. “Mac” cringes at “PC’s” humiliations and sympathizes with his failures even as “PC” seeks to convince his counterpart of at least some redeeming qualities to his product. True dialogue is attempted. “Mac” remains open to “PC’s” repeated entreaties. The message remains that a Mac is a better computer but that there is a place for PCs in this world as well. There is a hint of the old idea of “market share” as opposed to a Hobbesian war of “all against all.” One would almost think we were back in the 1960s. It is these Mac ads, through their contrast with standard hard-sell ads, that remind us of the relentless marketing barrage we are exposed to in contemporary times – a barrage that is often aggressive, visceral, and visually manipulative.

The continued existence of Mac computers leads to my second observation: this one is about the centripetal forces of what may start out as capitalism but become something else. No one can agree on the “market share” of Apple computers. Estimates range from 3% to over 20%. It all depends on what you are counting. Nevertheless, Apple computers will never dominate the computer market – not even the personal computer market. This is a consequence of clear forces (not truly “market” ones) that were applied early on in the history of computers. Microsoft created an early monopoly on operating-system software. With no large competitors in the business market, it established the dominance of PCs in every medium-to-large workplace. All institutions had to follow. In my university, only two or three departments (my history department was one) used computers early on. We all used Apple computers. In the late 1980s, as computers became ubiquitous, institutions such as ours adopted Microsoft’s operating systems and IBM hardware. Those of us who clung to Macs (and that would be everyone who started with one) were marginalized in many ways. We had more trouble communicating through the university system. We had far, far less tech support. Our then president even told one unit manager that she needed to get rid of “that garbage” (Macs) that she had been using in her unit. Whether it wants to or not, Apple can never break into a market that is so interlinked. Much of what passes for capitalism is in fact collusion at best and monopoly at worst. There are no ways, not even anti-trust legislation, to stop this juggernaut, and the experience of Apple computers proves it.

Last week I attended a lecture on why we should still read Charles Darwin’s “On the Origin of Species.” The speaker pointed out that evolutionary theory has advanced substantially from its origins in the 19th century, and that many contingencies – genetic, molecular, cultural, and human – intercede to modify evolutionary change. For some odd reason, I thought of Mac computers. If some kind of genetic fitness were the sole governing element, Mac computers would be the overwhelming favorite of all personal computer users. Their operating systems, especially from OS X through “Snow Leopard,” are superior in every way. They are more reliable, intuitive, and sophisticated. The Apple operating systems have better graphics than Microsoft systems, and always have had. The artistic design of Mac personal computers (even some of the retro models) has always been far in advance of the gray-flannel-suit appearance of PCs. So, why has Apple not dominated and driven out the inferior species? Instead, it is as if the Neanderthals had wiped out the humans. Some of the answers are suggested above in regard to faux capitalism. Other answers are cultural. Apple designers – especially Steve Wozniak – just seemed too much on the cultural margins of North American life. Macs were too “artsy.” They had too little gravitas. Macs seemed to be the technological equivalent of youthful rebellion. For a highly – and I DO mean highly – conformist culture like that in the U. S. and Canada, Macs just seemed too trivial. [Apple iPods would be another matter since they were, initially, not part of an interconnected business culture. They did not do our work. iPods merely entertained. (Although all of that, along with iPhones and BlackBerrys, is going to change the notions of frivolity attached to MP3 players and iPods.)] Evolution, therefore, is a lot more complex, based on a lot more SHIFTING contingencies, than many scientists would like to believe.

Oh, and I guess it is clear that I have always used Apple computers.

Friday, November 6, 2009

In Defense of Youth

A recent article in the National Post (Oct. 17, 2009) by Robert Fulford, the celebrated literary editor and journalist, has been gnawing at me since I read it. Entitled “The Teenage-ification of Manhood,” it was the last in a series of articles by the Post regarding the tendency for modern young people – and too often not-so-young people -- to become adults at a later and later age, and often not to “grow up” (whatever that means) at all. It is a cruel column, and I dare say the preceding editorials on this subject were equally mean. There is no need to rebut Fulford’s and the Post’s claims about a long road to adulthood, but there is a need to challenge just about everything else.

As usual, there is a problem with his use of words. For example, Fulford identifies “teenagers” as a social group (now apparently permanent) that emerged after the 1940s. These teenagers, Fulford sneers, are made up of “self-important newcomers” who have constituted themselves as something other than “just adults-in-waiting.” Fulford nods at retailers and overly generous parents as culprits in the creation of this class but insinuates throughout that teenagers themselves are responsible for their continued shallowness and selfishness, for being insouciant slackers who refuse the responsibilities the world has thrust upon them. The truth, of course, is that consumer capitalism almost single-handedly created, and single-handedly continues to maintain, the teenage condition. As can be seen by looking at developed, developing, and so-called underdeveloped cultures, the crass consumerist and capitalist underpinnings of modernization and popular culture are all that support the continued existence of the sociological phenomenon called the “teenager.” Without these underpinnings, we might resurrect the less derogatory, more benign, more agreeable terms “young people” or “youth.”

Fulford also implies that to become and remain “adult” is the goal of human existence. It is the ultimate stage of accomplishment in one’s journey through life. An “adult” is superior to a child or an adolescent or a teenager. All conditions and stages of life other than adulthood, he and many other people unthinkingly suggest, are precedent to adulthood and are therefore necessarily incomplete and flawed stages when we consider them in isolation from the goal of adulthood. The calculus is clear: to be an adult is to be mature; to be mature is to be virtuous; to be virtuous is to be rational, emotionally composed, and willing to take responsibility for one’s actions.

This calculus for virtuous adulthood and deficient young people simply does not hold true. In fact, it might be stood on its head. We might say, without much exaggeration, that most adults are persons who have made up their minds about everything important. They are people who have fixed political and social and cultural and moral views. They have stopped growing intellectually and often morally. They have settled for their job, for their old opinions, for their old prejudices. They have given up and have often become cynical about most of the value-laden aspects of the world around them.

We might say, without much exaggeration, that most youth are persons who continue to explore different things in life. They are open to political, social, moral, and cultural change and improvement. They continue to grow intellectually and often morally. They have not settled on an occupation; they abandon old opinions for better new ones; they have not given up.

As for “maturity,” it strikes me that adults and youth (not infants and small children, of course) are about equal in the employment of rationality, emotional composure, and taking responsibility for their own actions. In my personal experience with university students, I believe that “youth” outscores “adults” in all of these categories.

But the condemnation of today’s adults must go further than that. People of my generation (I am on the cusp of being a “baby-boomer,” depending on which demographer you care to cite, and I was a “teenager”) have had many of the advantages of today’s youth, in regard to recreation and possessions and cultural opportunities. Robert Fulford and I have hardly suffered. In addition to that, we have prospered in our adulthood. We received excellent educations at no, or little, financial cost to ourselves. As “adults” we were able to buy houses and stereos and nice automobiles and sometimes even take comfortable vacations. Some of us are even secure in retirement.

Only those who are really old, those who no longer have any contact with vibrant youth, can have the gall to claim that young people are avoiding adulthood in order to continue their lives of play and irresponsibility. When I was first in graduate school I was sometimes accused (by individuals or by the press) of remaining in university to avoid “growing up.” Then, as Vietnam exploded, I was accused (by the same types of people) of remaining in university in order to avoid the draft (I was 1A through my grad school years and could have been called up at any time; I foolishly would have gone). Young people today have it worse. They are told to get good careers and to anchor themselves by establishing their own homes while at the same time society tells them they will never have “permanent” jobs but must continually re-tool themselves for ever-shifting workplace demands.

Incredibly, many young people attempt to conform to this contorted culture and to find their place in this near-impossible economic environment. I know of many students who have a university degree and have also acquired a practical craft skill. I know of many others who have one or two undergraduate degrees and usually a post-graduate degree. I even know some who have multiple graduate degrees. Many of these young people also have extensive volunteer experience. Almost all of them have enormous -- corruptly proffered and enforced, I might add -- student loans (for which they are blamed by “adults” who had to pay little or no tuition themselves). Remarkably, almost all have accommodated themselves – without anger – to having less hope for success and resources than the generations who preceded them (you know, those “adults” who have shifted the blame to the phantom character flaws of youth today). Many young people, of course, have been unable to overcome the ludicrous Sisyphean challenge placed before them and must eke out what satisfactions they can in life, even if they must prey on their parents’ good will to do so.

So, if you want to retard the aging process, quit categorizing and criticizing all young people. Start spending some time with the young people around you. You will become more rational and emotionally composed, and, in the process, you might take more responsibility for yourself and for the societal flaws you helped to bring about.