Wednesday, December 19, 2012

It's the Culture, Stupid


   The U. S. has once again proven that it is a culture in very bad shape, if not near collapse (taking the long historical perspective, in other words -- not tomorrow). The Newtown, Connecticut massacre reaffirms the “exceptionalism” of the U. S., especially in relation to the grotesque possibilities of civil violence. Tortured misreadings of the Second Amendment to the U. S. Constitution, coupled with a machismo vision of rugged individualism, have ensured the continuance of dramatic acts of killing that no other modern nation has experienced, or likely understands. Most of the rest of the world, embarrassed by but resigned to the bizarre culture of the U. S., can only mourn the consequences of a culture that no longer carries the flag of humanity in regard to firearms. But the rest of the world does not matter to the U. S., unless historical events in other nations can be used to buffer the perverted violence that is so prolific in the U. S. A. To his great credit, Mayor Bloomberg of N. Y. C. alone has spoken truth to power and ignorance, pointing out that the repeated massacres in the U. S. are not found in such repetition in other countries. Politicians and media folk are often too timid to make this stark comparison.
   While the President of the U. S. wants to open a dialogue on what can be done to change things – which, given his record, may mean bargaining all effective action down to ineffectualness – most other Americans likewise look to half measures to solve the problem of gun-based massacres. Restrictions on assault weapons and limitations on magazine capacities will lower the number of casualties – to a degree. Other measures designed to restrict the use of guns all pale in comparison to Chris Rock’s comic but oddly more effective argument that weapons should be freely allowed but bullets should cost $50,000 each. Sadly, Rock’s solution is better than some serious solutions we have heard and will hear.
   Part of the problem is easy access to weapons and ammunition, particularly in a country with 250 to 300 million firearms. A bigger part of the problem is a nation that misinterprets the Second Amendment to the Constitution. The Second Amendment was proposed and ratified only ten years after the American Revolution had ended. In most men’s minds, militias were an integral part of American success in the War for Independence, and English restrictions of all sorts were still fresh in everyone’s memory. In fact, however, George Washington, and every other intelligent participant in the war, realized that an organized, regular army was the most important element in victory. Militias had been “irregular” in almost every way during the Revolution, including in their stability and success. Still, national myth held militias in high esteem, and individual valor, actually uncommon in the Revolutionary War, was promoted as a means of self-congratulation, despite the fact that the winning of the war was largely a consequence of French intervention.
   No one in the early republic (1789-1815) that followed could imagine how “arms” would change in their nature and potency. Effective repeating rifles did not appear for another one hundred years. As an American historian I think I can say, and I think anyone can safely say, that none of the so-called “founding fathers” would condone Second Amendment protection for modern weapons, even pistols with clips of fewer than eight bullets. All would be appalled at the culture of weaponry protected by the Second Amendment today.
   The biggest problem, however, is a culture that has come to glorify a rare form of brutish individualism and an anarchistic definition of freedom. It is evident in everything from unrestricted capitalism to the banality of television to the celebration of violence in entertainment and sports. Kindness, gentleness, and a commitment to contentment are all but invisible in the U. S. Good young men – those who do not commit violence and resolutely resist the visceral culture of violence – have little status in this society. Instead, they are encouraged to “man up,” which in some places means: buy a weapon and use it for something.
   Bill Clinton’s and James Carville’s declaration in the 1990s that “It’s the economy, stupid” may have seemed a shrewdly focused political battle cry. It was, in fact, a narrow vision of what was needed then, and now. For thinking persons to declare, “It’s the culture, stupid” – a more accurate understanding of where matters rest – throws up a challenge that currently appears impossible for Americans to meet. As I have said before, twenty-five to thirty per cent of the American public know and understand America’s problems very well, including this problem of gun violence. They think critically and carefully, with an open generosity and kindness seen among few people on this earth. They are not the majority, however, in a society that too often lives by vaguely understood precepts, myths, and slogans.

Saturday, November 10, 2012

The U. S. Election of 2012


(The following is an extended and modified version of a talk given to the Southern Alberta Council on Public Affairs on Nov. 8, 2012)

Sound and Fury, Signifying What?: The Elusive Election Campaign of 2012 and the Fundamental Political Culture Behind It

The U. S. election of 2012 was very important – for what it prevented. It prevented the formation of a U. S. Supreme Court with a decided majority on the far right. That court would likely have wreaked havoc on women’s rights and affirmative action. That court almost certainly would have placed necessary government regulations beyond the reach of Congressional legislation. The election possibly prevented a mindless assault on entitlement programs like Medicare and Medicaid. And, it prevented a new president with reckless and idiotic views on taxation and deficit-debt reduction from doing something that might have caused Great Depression II.
Political vision and progressive reform were not victors in that election. Because Mr. Obama seemed to promise so much in 2008 (without specifying what grand visions he proposed to initiate), and because he stood in such sharp contrast to possibly the worst president in American history – George W. Bush – it was assumed that the differences in governance between Mr. Obama and the Democrats and the Republicans would be vast. He seemed to have a mandate, despite Republican control of the House of Representatives. Reform and reformers should have been in a position from 2009 forward that would have made Bill Clinton’s derisive mockery of the Republicans in 2012 – “We left him a total mess, he hasn't finished cleaning it up yet, so fire him and put us back in” – the first, last, and only words necessary to re-elect Mr. Obama and the Democrats. They were not. Mr. Obama and the Democrats instead squandered their moral authority with modest accomplishments, timid compromises, and half measures. Aside from a modest stimulus, a no-brainer salvaging of the auto industry, and a confusing Affordable Care Act – all of which merely represented shoring up private, consumer capitalism as it has been promoted by the Republicans during the last 30 years – Mr. Obama and the Democrats did not take on any massive re-working of the economy or initiate any substantial vision for the future.
Mr. Obama has been astoundingly vague and bland about the vision thing, allowing political lowlifes like Sarah Palin to taunt him with comments like, “How is that hopey, changey thing going for you, Mr. President?” Mr. Obama speaks with the glib, solemn voice and tone of nineteenth-century politicos (not with an academic voice, as he is sometimes accused of doing) about what Americans need and want and where the country should go. Bill Ivey, author of Handmaking America, a just-published book that addresses the sorry matter of how directionless Americans have become, quotes an address President Obama gave in 2010:
What has defined us as a nation since our founding is the capacity to shape our destiny – our determination to fight for the America we want for our children. Even if we’re unsure exactly what that looks like. Even if we don’t yet know precisely how we’re going to get there. We know we’ll get there.
How glib, how empty, how visionless, even how befuddled, are words like those. Ivey correctly calls this statement one of “vision vacuum” and “leaderly drift.”
The fact that Mr. Romney was even worse about vision and reform provides little comfort. His 2012 primary and general election campaign represented the most extreme use of delusion and flip-flopping that presidential politics has ever seen. He seemed to support the most reactionary Republican positions on abortion and women’s rights to their own bodies while hinting that he would not actively promote the radical Republican campaign platform on which he was nominated. He promised to overhaul entitlements but to protect Medicare and Medicaid in a manner that suggested a shell game. To further his “now-you-see-it-now-you-don’t” shell game, he promised to lower the deficit while not raising taxes – in fact lowering taxes by 20% on the middle class – although he offered no evidence as to how this would happen, despite the undisputed evidence of independent researchers that Mr. Romney’s numbers simply did not, and could never, add up. Chameleons would be embarrassed to be compared to Mr. Romney; they don’t change color that fast or often. It is no coincidence that “fact-checking” media, as re-reported by Michael Enright on the CBC program “Sunday Morning,” determined that pro-Romney advertising contained contextual lies, lies, and “pants on fire” lies 46% of the time; Mr. Obama’s campaign, it is sad to say, did so 28% of the time.
In short, neither Mr. Obama nor Mr. Romney offered anything approaching a vision for the future of the U. S.
It came as no real surprise, therefore, that they also dodged the issues. Because the economy and jobs are so important in the U. S. today, both candidates seemed to focus almost exclusively on economic matters in their campaigns. But did they? Mr. Obama talked about the past – saving the auto industry, and banks, and Wall Street. But what did he propose for the future? Almost nothing – no new stimulus, no new health care reform, no entitlement reform, etc. He promised, essentially, to “stay the course,” whatever that is. Paul Ryan, the Republican V.P. candidate, was right to challenge Mr. Obama by asking what would be different in the next four years if Mr. Obama were re-elected. He has not answered, and we do not know.
As I have already said, Mr. Romney dodged the issue of the economy even more, simply arguing by the end of the campaign that, as a successful businessman, his election would, in and of itself, instill confidence in American business and lead to the revival of the economy and the restoration of jobs in the old trickle-down fashion Republicans have argued since the 1870s. David Brooks of the NYTimes, a usually intelligent if misguided analyst, declared without a shred of evidence that he thought Mr. Romney better able to make “big changes.” This was blind hope at its worst.
Look at what issues the candidates did not address at all. First, despite the fact that the campaign ended with an enormous super storm – “Sandy” – neither Mr. Obama nor Mr. Romney raised the issue of climate change, even though storms like this one are predicted as a likely consequence of rising ocean temperatures. Secondly, while the infrastructure of the U. S. crumbles, neither candidate this time around suggested a program or programs to address the matter. Dwight Eisenhower was a staunch private enterprise Republican who hated federal government spending, but even he suggested, and helped bring into existence, a vast interstate highway system through a combined private-government effort. Thirdly, despite the blatant villainy of banks and Wall Street in initiating the economic crisis, and despite the fact that they were bailed out, neither campaign took up the public cry for new and necessary bank and Wall Street regulations. In the New Deal, some bankers and Wall Street criminals were at least prosecuted, and the public today would support such action if either party had courage enough to pursue those cases. Finally, both refused to mention the Supreme Court. The new president will be making swing-vote appointments to the Supreme Court, meaning that the court would have become radically right wing under Mr. Romney but will now probably remain moderately centrist under Mr. Obama. All of these, and other issues, are of immediate importance.
In short, both candidates either avoided the very big issues altogether or were glib about what they might do on the matters they did raise.
The candidates also campaigned locally to a very small sliver of the American electorate, those found in swing states. Among these, the states of Ohio, Florida, and Virginia received most of the attention, along with Colorado, and to a lesser extent Wisconsin. Why? Analysts tell us it is because all of the other regions of the U. S. were already decided, and they were already decided because like-minded voters have tended to clump together, either in large cities or gated communities, leaving whole states already secured in the pocket of one party or the other. Thus, presidential candidates have not even bothered to speak to the vast majority of American voters, except indirectly, and indirectly means glibly, without having to ponder regional and local matters of federal import.
The Electoral College, with its winner-take-all application in most states, is partly to blame. Remedies have been suggested but are unlikely to pass, since it is not in the interest of the dominant party in such states as California or Alabama to change the system.
A culture of ideology -- which forms when social discourse has broken down and no good, pragmatic, workable ideas are available -- is also to blame for these two political worlds that do not meet and do not speak to one another. The goal of unrestrained, unregulated corporate capitalism became the real religion of most of America with the election of Ronald Reagan, who declared government to be the problem, not the solution. So, from the late 1980s onward, all Republicans and then, slowly, most Democrats subscribed to this new version of how economies are superior to societies and promoted again – as they had in the 1870s through the 1890s, and in the 1920s – a kind of rugged individualistic, almost libertarian view of how life should be lived. Behind the scenes, modern corporate capitalism and consumerism, which have infected almost every nook and cranny of public life, were the real beneficiaries. The opposing ideology, of some Democrats, does little to challenge this Ayn Rand idea of laissez faire joined to hyper-individualism. The only “liberal” thing left about the Democrats has flowed out of the “New Left” movement of the 1960s and 1970s, which disdained the old left’s concern for economic issues and instead promoted cultural issues and social liberation, including racial equality, women’s rights and equality, pro-choice, and the rights of the marginalized.
In other words, democratic dialogue has ended in the U. S., and it was astounding to watch Barack Obama spend his first four years trying to restore it among people who have not even begun to think outside their ideological boxes and move toward cooperation and compromise.
Lack of vision or major issues, and a campaign that addressed only a small part of the electorate, did not lessen the ferocity of the campaign or its extraordinary cost. All of this reminded me of the politics of the Gilded Age (between the late 1870s and the late 1890s). In that period of poorly regulated industrial capitalism and unrestricted Wall Street power, the Democrats and the Republicans shared the same fundamental beliefs and interests. Both major parties supported the Robber Barons of industry. By the late 1880s, for example, John D. Rockefeller had bribed almost all of Pennsylvania’s legislators, including those of both parties. Pork barrel and earmarked legislation prevailed, as it has in our own time. Mr. Obama and Mr. Romney both have ties to Wall Street. Both parties are lobbied by the same financial interests, and both receive campaign financing from those interests. Therefore, Mr. Obama does not speak like Franklin Delano Roosevelt, who, upon election in 1932, immediately pushed bank reform, condemned the “economic royalists,” proposed and had passed a “Wealth Tax,” spent enormous sums to employ the unemployed, and attacked the Supreme Court for its backward ideas and malicious obstructionism. Mr. Obama has preferred to fashion himself after Abraham Lincoln (a Republican), in his constant quest to shape his own personal character, even though the urgent matter is the political, economic, and cultural condition of the U. S. as a whole. Mr. Romney, emulating the Rockefellers and Carnegies of the Gilded Age, has been more brazen and unapologetic, paying a mere 15% in taxes on his enormous wealth and sending some of his assets to safe tax havens offshore.
In the Gilded Age, the narrowness of the essential differences between the two leading parties meant that elections were closely contested, and thus more angry and violent. In 1876, Rutherford B. Hayes, the Republican, gained fewer votes than his rival, Samuel Tilden, yet became president after a tangled and corrupt recount of votes (remember G. W. Bush and Al Gore in 2000, when Gore got a half million more popular votes than Bush but was defeated by a Supreme Court that ordered an end to the recount in Florida?). In 1880, Winfield Scott Hancock (Dem.) lost to James A. Garfield by 0.1% of the vote. In 1884, Grover Cleveland (Dem.) beat James G. Blaine (Rep.) by 0.3%, after which Cleveland lost to Benjamin Harrison in 1888 despite beating Harrison by 0.8% in the popular vote. Then Cleveland came back in 1892 – through a split election among the Democrats, Republicans, and Populists – to become the only president to win a second term after losing the previous election. The 2012 election looks like a comfortable win for Mr. Obama when one considers the electoral vote, but any examination of the popular vote, especially in swing states, shows that Mr. Obama’s two-million-odd majority was in fact a slender victory, not a mandate.
The big issues of the Gilded Age were the currency (hard or paper) and the tariff (high or moderate). (I once had to take a course on the currency and the tariff in this period in grad school: at 8 a.m.!) In other words, despite starving farmers, militant workers, rich new economic and social reform ideas, and cities and towns in disrepair, the Gilded Age elevated only the currency and the tariff as central issues, because both parties and their leaders were wedded to banks and Wall Street. Elections were tight because the two parties were so similar, or at least appeared to be. From 2000 to the present, elections have also been tight, and the candidates have not appeared to be all that different in where they are coming from and the extent of the reforms and changes they propose.

So, neither presidential candidate offered a vision; both avoided some of the biggest issues of our time, campaigned narrowly, and, in kowtowing to the large financial interests of the U. S., presented themselves as latter-day Gilded Age politicians.

Why have we come to this state of affairs? What cultural underpinnings have brought us to this point of political stasis and compromised democracy? Let me offer three, sometimes linked, reasons.

I. The Tyranny of Corporatism
In the 1870s through the 1890s, and in the 1920s until the Great Depression, Americans embraced the rule of unregulated corporate capitalism instead of democratic choice. After the election of Ronald Reagan in 1980, first the Republicans and then many Democrats re-opened the Pandora’s box of corporatism. The New Corporatism first promoted the view that privatization was usually superior to public ownership or control, and has now built to the claim that privatization is always superior to public ownership or control – of anything. Now there is no acknowledgement that anything falls within the public interest. Secondly, corporatism has come to claim that democracy succeeds from a free market system; that is, democracy is born from the free market and is dependent upon it. Therefore, as John Ralston Saul put it in his 1995 book, The Unconscious Civilization, “the citizen is reduced to the status of a subject at the foot of the throne of the marketplace.” (p. 76) The marketplace is to be the eternal mother of democracy. All things flow from the economy, and all things are secondary to it. Thirdly, corporatism holds that individualism is the most important thing to be protected – in fact, almost the only thing to be protected. Again, as Saul has said, “There are those who talk about individualism as if it were a replacement for government.” (p. 73) The individual, unencumbered by society and government, is trumpeted as the hero of a mythical corporatist world. Corporatism even suggests, in contradiction of thousands of years of intellectual reflection on the nature of society, that individuals and their families are the only reality, and society a mere fiction.
This time around, corporatism has not meant economic corporations and businesses exclusively. This time, corporatism is more total, including big labor unions, whether of auto workers or teachers; the media, which seek truth second, if at all, to satisfying their advertisers and getting more of them; institutions of higher education, which seek the shaping of students into able citizens and members of society second, if at all, to selling credentials for employment; and organized radical religions, which sometimes proclaim that the only truth is the Bible (somehow made to comport with corporatist goals) and that “God’s Will” is the last retort to democratic action on any issue whatsoever. “We are – almost all of us – employees in some sort of corporation,” John Ralston Saul says, “public or private, . . . . [and as such our] primary obligation is loyalty to the corporation.” (p. 91).
The idea of a collective society, real and necessary, has largely been abandoned. How many times did you hear Mr. Obama or Mr. Romney use these phrases: “the general welfare,” “the public interest,” “the public good,” “the common good,” or “the improvement of society as a whole”? Corporatists have conducted a long smear campaign against these phrases, suggesting that the public good is code for “big government spending,” “liberalism,” or worse – “socialism” – or even worse than that – “communism” – and thereby have succeeded in repressing any viable notion of the public good.
Meanwhile, the U.S.A., outside of this corporatist framework, exists primarily as a shell, its individuals alienated from nature and society alike. The United States of America languishes in emblematic form – as the American flag or Uncle Sam or the pledge of allegiance; or as a slogan – as “the greatest nation ever”; or as a chant at sporting contests -- “USA, USA, USA”; or as the leading military power in the world, able to impose “shock and awe,” even if directionless and impotent in creating peace, justice, and liberty.

II. “Amusing Ourselves to Death”
   It is compulsory nowadays to first recognize every American election for what it has become: a day-after-day, month-after-month, sometimes year-after-year soap opera entertainment, or what media folks call, without apparent embarrassment, “horse race politics.”
“Horse race politics” is the consequence, in part, of fixed-date elections, which allow parties and candidates to maintain permanent campaigns between elections. The rise of American popular political journalism in the 1790s, made ever louder and more anxious over the last two centuries, has given us one political campaign after another of bombast, ballyhoo, and feigned importance. Modern factors have made matters worse. The rise of television, and even computer networking, has elevated things like a candidate’s “appearance” and “likeability” quotient into the most important elements of a campaign. In 1789, George Washington was elected the first president of the U. S. on the basis of his character; few would argue that he was likeable, personable, or attractive. The addition of PACs (Political Action Committees) in the 1970s and 1980s gave candidates access to “arm’s length” negative advertising whereby dubious “truths” taken out of context could be applied to one’s opponent. The U. S. Supreme Court’s decision in Citizens United v. Federal Election Commission in 2010 extended the recognition of corporations as citizens to include the 1st Amendment free speech rights intended for individual citizens, thereby allowing even more enormous funding by PACs – at even further “arm’s length” from candidates – to say almost whatever they wanted about candidates. Of late, this decision seems to have further distended the idea of free speech, and of who should have it, to allow employers to recommend strongly to their employees for whom they should vote (with all of the threats to advancement or employment that may imply).
The consequence of these developments has not been an electorate richly informed about candidates and politics, but a further distortion of ideologies, issues, and personal characteristics. In the main, they have furthered a “dumbing down” of politics, or what Neil Postman called “amusing ourselves to death” in his 1985 book of that title (Amusing Ourselves to Death: Public Discourse in the Age of Show Business). According to Postman, Americans have traded rights and responsibilities for medicated bliss. “Form” in the television age, he argued (and I’m certain he would agree, in the age of the internet and Twitter), “excludes the content,” or, as Marshall McLuhan put it, “the medium is the message.” Any new edition of his book would have to have the title: Amused and Now Dead. The public is now fully anesthetized insofar as they pretend to be citizens.


III. The Myth of Exceptionalism
During his first term as president, Mr. Obama was accused by his opposition of “apologizing” for America abroad, and, of not believing in American “exceptionalism.” He campaigned doggedly to reverse that impression, often alluding to “America as the greatest nation,” one imbued with “exceptional” qualities and having an “exceptional” future.
“Exceptionalism” is, of course, code for “we are better than everybody else”; were it otherwise, Americans would readily admit that other nations are also “exceptional” in their own specific ways. They do not.
“Exceptionalism” also suggests that Americans have shared a culture amongst themselves apart from the rest of the world. They live in, and want to live in, an isolated culture. It began with John Winthrop telling the Puritans before arrival in New England that they “would be as a City upon a Hill for all to see.” They were to be a religious example, a superior religious example. The American Revolution of 1776-1783 was fought by those who believed the U. S. to be more “virtuous” than England. Americans deluded themselves into believing that they were less corrupt and could be freer than any other people. Europe was their evil, degenerate other (as it still is), thereby allowing Americans to forget that they won the Revolution in 1781 because of French military and supply support. When Thomas Jefferson sent the Lewis and Clark expedition west, he envisioned an “empire of liberty,” as if it were unlike any other imperialist takeover of land in history. The “Empire of Liberty” was imperialist conquest, justified. Pretending that the west was uninhabited, or largely so, Americans later embraced the “Manifest Destiny” of their western settlement. Still later, during WWI, President Wilson proposed that only the U. S. could “make the world safe for democracy,” after which Americans proceeded to claim victory in WWII and the re-building of Europe as a consequence of their special genius and beneficence (which it partly was).
With the receding of America as a world imperial power, and with what I think is a retreat from military adventurism for the foreseeable future (as I argued upon Mr. Obama’s election in 2008, and repeat here again), American exceptionalism seems to me an excuse for further isolation and isolationism from the outer world. But the increasingly lavish displays of American exceptionalism, in all sorts of flag waving and other gestures, seem to me to be filled with fear as well as isolationism. Winthrop did not tell the Puritans on the ship Arabella that they were to be a “City upon a Hill” because he thought they were morally stronger than other humans, but because he feared they were not. And that unconscious fear seems to me to be the case with exceptionalism today.
What does this have to do with American politics? It means that image is vastly more important than reason, that politicians must always appeal to this now bizarre idea of exceptionalism, and that the scope for offering vision or new ideas is so limited as to strangle real democracy.

What do Americans need? They must restore some ideas about the public, collective good. Regulations and restraints will not be enough. Qualities of freedom joined to equality, of social community joined to opportunity, and of individual autonomy joined to public responsibility, must be established.
My next blog will attempt to address, in more particular fashion, the new progressive era that must be established if the U. S. is not to languish as a riven, angry, alienated, dangerous nation.

Friday, June 1, 2012

Reforming the University


           My recent critical appraisal of modern universities makes necessary this second essay outlining how universities may be improved. Given that the perfect university or college is impossible to achieve and maintain, improvement is all anyone can expect. It is impossible to imagine, for example, that the current corporate model of the university can be fully changed. To do so would require, first, that governments more fully fund public universities and colleges, and perhaps also that parents and others assume a substantial share of the cost of higher education, in order that students do not enter university thinking that, as purchasers of their whole higher education, they are clients who can demand an end product on their terms rather than receive a true education. Funding changes of that order are not likely to happen. The almost religious dominance of free market ideology, and the political systems and parties that operate under the constraints of that ideology, make the most fundamental and necessary reform – full public funding – all but impossible.
            It is just as unlikely that universities will abandon diversity in favor of more limited programs or a more tightly focused unity of programs. The comprehensive mega-university is here to stay. As my son notes, universities and colleges must, in fact, experience more diversity as long as knowledge continues to expand. He mentions neuroscience, but he could have added the vast possibilities for new fields in the life sciences and physics, which are in turn matched in less scientific fields like anthropology, psychology, and history, where the scope of study is enlarged by new subject matter and new theoretical and methodological possibilities. Some practical, applied programs that make up the diverse university will always remain somewhat remote from the grander, intellectually purer, abstract or over-arching goals of the university, although the number and influence of these narrow applied programs need not define the university as a whole.
            Given that we will see no revolutionary change in funding and no stopping the burgeoning diversity of the modern university, what is left? Here are a few ideas that I have pondered (to say nothing of having tried to implement) over the past 50 years:
1. Goals of Higher Education – Because universities and colleges have come to assume that their very existence is based on educating for employment, they have mistakenly drawn the conclusion that they must narrowly target the supposed needs of given professions and occupations. They have thereby pushed right past first principles, which are foundational and primary, to educate toward secondary and tertiary goals. That is, they have re-oriented themselves to educate by way of information rather than knowledge, set formulas rather than critical thinking, and narrow skill sets rather than a broad range of methodological abilities. This approach to higher education has its own built-in high costs and inevitable failures. Primarily, it educates to the past and not to advancement in the future. Alexis de Tocqueville, in Democracy in America, long ago noticed that Americans interpreted their individual freedom to mean that all avenues of success were open to them, only to discover that as they all individually converged on the latest, seemingly vast and open opportunity, that opportunity (read: specific job) was suddenly closed and no longer available. And, as Tocqueville noted, as young people swept, like flocks of birds, from one seeming chance to another, they were turned away again and again. I know many people who have a bachelor’s degree, multiple master’s degrees, and sometimes two or more PhDs, largely because of their necessary chase after employment niches that quickly disappear because the number of applicants far exceeds the number of jobs.
            Universities need not have capitulated so easily and so fast to the dead-end, failed approach that Jane Jacobs dubbed “credentialing.” Traditional universities are now beginning to experience the competition of for-profit, on-line universities, and if credentials are the be-all and end-all of education, those institutions that become the most lean (hiring the cheapest teachers), the most systematized (low overhead and most target-oriented), and the least concerned with a culture of learning will prevail. The idea that one needs to experience education on campus, among peers and professors, has an increasingly hollow ring among a clientele that sees the traditional university as wildly expensive (as it is), especially when their sole focus is to acquire skills that are marketable at the moment.
            What universities must do, boldly and transparently, is embrace the idea that their primary responsibility is no longer teaching technical skills or creating a factory-floor environment of credential production, but teaching and guaranteeing broad intellectual skills, including the ability to think critically, reason, read, write, understand and employ statistics, understand science and know how to use the scientific method, and appreciate and pursue creativity. If undergraduates leave universities with ability in all or most of these areas, with a few professional credentials added into the mix, they will be much more ready for ever-shifting job opportunities and cultural change. Again, this is not to say that these intellectual skills go unattended in universities today, but they do not exist as the primary mission of universities, even though some universities purport otherwise in the often unread and generally scorned romantic mission statements in their calendars.
            The evidence that reason, writing, thinking, and creativity are no longer at the forefront of universities is all around us. The humanities, for example, are increasingly seen as irrelevant, and in order to avoid defending this charge of irrelevancy, critics simply blame post-modernism and its attendant esoterica (go ahead, read Derrida, and you will understand the charge). The fact is, much of post-modernism is valuable (though not as ideology), and much activity and teaching in the humanities has nothing to do with post-modernism. Critical thinking is seen as more of a luxury or a bad joke by pundits. Witness Margaret Wente’s recent article in the Globe and Mail, in which she “disses” the humanities and social sciences (the “soft side” of higher education; she herself, by the way, has Arts degrees with majors in English) as useless pursuits with no employment future, hinting that they take up too much of the university footprint (see “Educated for Unemployment,” G and M, May 15, 2012). Individual intellectual advancement in universities has recently been pegged at 7% (how the statistics for this were derived boggles the imagination), and even if that number is ridiculous, experienced university professors would certainly testify that, in too many instances, this lack of intellectual growth at institutions that are supposed to be all about intellectual life is not far off the mark (see the recent study by Richard Arum and Josipa Roksa entitled Academically Adrift: Limited Learning on College Campuses).
            If intellectual skills and intellectual rigor are most important for higher education as well as for our public and personal well-being, how are this rigor and these skills to be achieved? Let me attempt a partial answer under two headings: “Things we do now but could do better” and “New things to do in the future.”
1. Things universities do now but could do better
a) We still offer degrees in the “arts” and “sciences,” not degrees in English or History or Physics or Chemistry. In other words, universities at least still pretend that they are educating students broadly, and they often do so with breadth requirements and other mechanisms meant to guarantee a true intellectual experience rather than narrow knowledge in a particular field. Universities must be more sincere and insistent about the need for students to have both breadth and depth of education. I, and friends of mine, still think the best formula for undergraduate curricula is to have one-third of a student’s undergraduate courses dedicated to the major, one-third to breadth, and one-third to the student’s discretion. We cannot afford to drift toward what appears to be the next logical step for the corporate-credential factory: degrees that would read “Bachelor of Retail Marketing” or “Bachelor of 20th Century American History.” (I have heard students describe themselves in such a manner, and even proudly proclaim themselves specialists in some narrow subdivision of knowledge which, by the way, they do not actually possess.)
b) Most universities offer first-year courses that have the potential, at least, to be exciting introductions to new ways of seeing or experiencing knowledge. The retention of good first-year courses is critical to intellectual development down the road; many persons will never again encounter the subjects they were required to take in their first two years of university. My first two years were among the most intellectually stimulating of my life. The very best teaching must take place in these first-year or introductory courses, which usually means employing older, more experienced professors in them. By taking first-year courses seriously, teaching itself may be taken more seriously. When push comes to shove, teaching must win out over every other interest of the university and the professor.
2. New things to do in the future
After a career of batting around different liberal education strategies for providing intellectual skills and intellectual rigor in undergraduate education (including interdisciplinary studies, independent studies, capstone courses, integrated studies for first-year students, and so on), I have settled on two large strategies that I believe all universities should promote at the undergraduate level above all else.
a) The development of good writing skills among all graduates is fundamental. Some universities would be embarrassed if they knew the number of their graduates who could barely write at all. When my wife and I were living in Maastricht (where I was looking at their “problem-based” education programs), we met a very bright woman from Indonesia who was trying to write her PhD dissertation in English, but who had very little knowledge of written English. I have since heard distressing stories about other foreign students with the same problem. Many students maneuver their choice of courses to avoid having to write much at all. Universities would be more embarrassed still if they knew how little their graduates read (almost nothing at all, on average). Knowing how to write well requires an extensive background in good reading, so the two matters are naturally joined. Universities will protest that they are already committed to “writing across the curriculum,” but these proclamations of virtue are usually just covers for sweeping the problem under the carpet, and under a cheap carpet at that. I believe that a senior essay requirement for all undergraduates (yes, including those in business school or health or mathematics or music) is the best solution to the writing issue, if it is taken as a priority and taken seriously by all members of the university community. I know that Ivy League and superior liberal arts colleges in the U. S. often have this requirement, but they have also lost their interest and intensity in administering it, or they have made it so routine as to be uninteresting. Other than suggesting that good education is getting too expensive, Anthony T. Grafton is not at all clear about his objections when he says: “Like a string quartet, too, the college cannot improve its productivity if it goes on doing what it has always done: for example, putting small groups of students into classes run by full members of the faculty, or requiring every senior to write a thesis based on original research, supervised by a professor.” (see Anthony T. Grafton, “Can the Colleges Be Saved?,” NYRB, May 24, 2012). In fact, those two things are essential to creating truly educated undergraduates. Democratic education need not be diluted education, as it has become.
b) A commitment to ensure that every student successfully completes a large (probably a semester-long credit) problem-based course strikes me as another way, along with the senior essay, to cap an undergraduate’s education and send her forward into the world with real credentials. Problem-based courses are not new; they have been employed at McMaster University and Maastricht University for some time. Advocates of this approach are sincerely enthusiastic, once they have seen how such courses can be implemented. Design is critical, of course, and this essay is too short to address specific design problems and resolutions. Suffice it to say here that problem-based education incorporates creativity in subject selection as well as in approaches to solving the problem. Problem-based approaches also foster the invention and enhancement of research skills, cooperation, reflection, critical thinking, and, perhaps above all, the explicit conjunction of intellectual life and real life (a natural conjunction which should be obvious to all of us but, alas, is not).

Many other specific things can be done to rescue universities from the grip of the econocentric, marketplace-driven, corporate model. A change in the overall culture of universities would help, but that will only come from the implementation of some smaller, targeted improvements. The true university has long existed, and it, or some facsimile of it, will always exist as long as there are people who believe, as did Cardinal John Henry Newman, that real education involves knowing and educating students one by one, not by employing some kind of factory output method (see Newman’s The Idea of a University).
  

Sunday, March 11, 2012

The Tragedy of Modern Universities


            Aside from the debilitating wars of the last half century, no issue of public affairs has been more tragic than the institutional evolution of North American universities (including some American liberal arts colleges). Having lived close to this evolving “tragedy” for almost my entire life, I consider my title nothing if not understated. Critics could, with justice, call it the “decline and fall” of the university or the “end” of the university, and the words “corrupt,” “criminal,” “ignorant,” “self-serving,” and even “vulgar” would not be out of place in describing many aspects of the modern university.
            It is important, before embarking on this indictment of modern universities, to remind myself as well as others that many good things happen in universities and liberal arts colleges. Many students encounter, and wrestle with, unfamiliar and challenging ideas. They meet persons with different beliefs from different cultures. Many exceptional professors still hold to the best academic ideals, and make every effort to guide students toward a true education. Some good research is accomplished, although not as much as many lay people and academics may assume. Modern universities have sometimes consciously, sometimes unconsciously, promoted these good qualities, and I would not be so ungenerous as to claim that these things happen, as we say, “in spite” of the university. I do mean to say that the main purposes and goals of today’s universities, as institutions, are something other than the pure advancement of knowledge and the proper education of students.
            A long list of negative things may be said about the institutional university, however, and I will try to compress some of these under a few short and general headings. Here they are:

1. Universities have no coherent idea about why they exist.
         Ask any five administrators or faculty about the over-arching purpose of the modern university, and aside from vague hand waving over the word “education,” you are likely to get five different answers. They dare not say that universities today exist primarily to remain in existence, to grow if they can, and to increase intake and output. If that sounds like the amoral goals of for-profit businesses or political parties, well then, you understand what universities today are largely about. After universities enthusiastically embraced the market economy and the corporate business model in the 1980s, they seldom questioned or examined their intent or their motives in doing so, choosing instead to increase their marketing strategies and efforts. The corporate model drives most of the energies (and much of the money) of today’s universities, leaving them with little time or motive to come together to re-evaluate who they are and why they are here (a few do, especially private schools with deep pockets, e. g. Harvard; this is not to say that Harvard, which is a chief mainstay of the elitist establishment, has really risen above the crowd). Universities exist to award degrees, not to educate, and what they produce, most of the time, are students with credentials, not knowledge.
            Evidence for this claim is everywhere. Because governments have decided to saddle students with enormous debt in obtaining their degrees (“credentials”), universities are in an often unspoken conspiracy with students to “process” them efficiently and swiftly. Good grades are much easier to obtain. Class sizes are inflated to take care of larger numbers of students in the corporate capitalist environment of today’s universities. Assignments are truncated to allow these larger numbers to flow through. And students, who have now become “clients,” demand that they get the credentials they paid for. Intellectual growth, knowledge, and skills are incidental byproducts of this industrial model. The corporate university has attempted to counter this stark reality, in part, by claiming that we live in a complex “information” age and that it holds the only key to unlock the sources of this “information”; it talks less often about knowledge. Any person with a library card and internet access can acquire “information.”

2. Today’s universities have become all things to all people.
         Universities are enormous bazaars, hawking a huge variety of goods (i.e., degrees, programs, and majors) to their customers. They are “comprehensive” universities made up of semi-independent parts. They are “diversities,” not universities. There are few or no core goals for students shared among the scattered faculties and schools and programs, from the arts and sciences to health science to business to oceanography to mortuary science (make up your own list; it will take a while). While some schools will parrot their support for such goals as critical thinking, quantitative and qualitative analytical skills, elemental scientific understanding, and good writing, you can look through calendars and documents and courses until your head hurts, and you will find little coordinated effort to implement these goals.
         In the late 1950s, my brother was enrolled in the Faculty of Engineering at the University of Michigan (a prestigious school). In his first year he had to take a course entitled “Honors English for Engineers.” What is that? Is there a way to read Crime and Punishment and Tess of the d’Urbervilles (both of which he had to read) in a way appropriate to “engineers”? Of course not, but the Faculty of Engineering at the U of M was a powerful independent state within the university that could duplicate in course form whatever subject it wanted.
         Every hawker’s stall in the university strives to become a self-sufficient, autonomous entity. Each offers its own statistics courses, its own philosophy and ethics courses, and its own history and sociology courses. If a stall lacks some basic element, it either ignores it, or it makes that basic element adjunct to one or more of its courses. I have read, and heard academics claim, that a “liberal education” can be got from many corners of the university, even from “Marketing” programs in a School of Business. It is hard for me to believe that “liberal education” is any more than a veneer in these cases.
         In order to satisfy the appetites and wants of students who have usually not encountered enough of the breadth of fundamental knowledge to make an informed decision, universities offer not just courses but programs in narrowly specialized subjects. Are you still interested in dinosaurs and swimming with dolphins? Do not worry, universities have fashioned programs that will appeal to you.
         In the process of becoming “credentialing factories” (as Jane Jacobs long ago labeled universities), and being all things to all people, universities have ventured even further into applied fields of study that used to be the domain of community colleges and trade schools. It is hard to tease out important elements of human knowledge, let alone intellectual rigor, in programs in “hospitality” or “equestrian” studies.
3. Universities treat their communities poorly.
     Students:  Student loans have already been addressed, and universities are not so culpable in this matter as we citizens who do not demand more public financial support for public higher education. Unfortunately, students get less and less for their money. Undergraduates are less important than graduate students and academic research. Entering undergraduates are often taught by less experienced graduate students rather than by senior professors, despite the fact that first-year courses are among the most difficult to teach well and are often the most important courses, especially for those taking them only to fulfill breadth requirements. Because most first-year courses are large, leaving students feeling anonymous, students become acculturated to invisibility and to the lack of dialogue in their learning. My informal queries also lead me to conclude that first-year students are acculturated to thinking that learning facts and information is the foundation of knowledge. For a much fuller discourse on the many, many ways in which we disappoint and disadvantage undergraduates, see Thomas C. Pocklington and Allan Tupper, No Place to Learn:  Why Universities Aren’t Working (2002).
         Graduate students fare little better than undergraduates. They are often a source of cheap labor for their research supervisors; they are always a cheap source of labor for teaching tutorials and classes. Some are very, very good at these necessary tasks, but the enormous profit margin they provide for the university as a corporate whole is the main reason they are in the classroom. In today’s economy, however, fewer and fewer are reaping any employment rewards from their period of indentured servitude. Supervisors and teachers are morally culpable (and this includes me) for not being more forthright and forceful in making clear that getting a master’s or Ph.D. degree is not an easy avenue to appropriate employment. Because graduate education provides the basis for hiring more faculty (in more and more esoteric fields), graduate student numbers cannot be decreased without reducing the size and profitability of the whole corporate university enterprise. This corrupt, inbred system of interlocking dependencies is not unlike the structure of society in France just before the French Revolution.

      Faculty:  Few outside the academic world understand just how difficult and competitive it is to get and keep academic employment, and then to advance within it. Fewer still, often including those inside the academy, realize what a large role luck has played in their fortunes (or misfortunes). In many fields, there are at least ten excellent candidates for every position available. If one is lucky enough to obtain a tenure-track position, there is no guarantee that tenure will follow. Publications (not necessarily good ones, in my experience) long ago replaced excellent teaching as the chief basis for getting tenure. Refusal of tenure is close to a death sentence; it means you have been fired, and you will not likely get a job elsewhere in the academy, while your knowledge and skills are not easily marketable anywhere else. Tenure is no protection from dismissal, however; if a university so decides, it can eliminate entire programs (as some southern U. S. schools have done recently with undersubscribed programs like computer science).
         If a faculty member gets tenure, they must then turn their attention to gaining promotion to Associate and then Full Professor. This never-ending exercise of nit-picking evaluation means more research and publishing. If one cannot accomplish that, she will often be assigned a fairly basic salary and extra “duties” that may do little for her or for the advancement of knowledge.
         Those who climb the ladder to Full Professor do very well indeed. They have considerable prestige in their university and often in their profession. They have very good salaries. A recent study of income inequality in the U. S. places university professors among a small group of persons who, while they may not be among the top 1%, are among a very elite group of income earners. The statistics, however, were taken from full professors at very elite institutions like Harvard and Princeton, and they ignore the fact that it is a long, hard climb to full professorship, leaving many full professors few years in which to enjoy a high income. Unfortunately, this latter condition has led to another problem with universities. Senior full professors are holding onto their positions well past the age of 65, thereby depriving a younger generation of the chance to move up. Senior professors also do less and less teaching, despite the fact that they have a much broader knowledge base than younger faculty, and presumably bring more “wisdom” to the subject.
      
         Administration and Staff:  There is no more vertical and rigid hierarchy in the annals of history than among the staff and administration of a university. Those at the bottom, maintenance staff for example, are like the invisible poor of the 17th century, despite the fact that most of them come into meaningful contact with students and faculty on a frequent basis. I know of one woman, now retired, whose employment was to make sandwiches and serve cafeteria meals, yet she had a big influence on students who were distressed or confused or who just needed some encouragement. Above this level of staff are those – counselors, advisors, and remedial studies providers, to name a very few – who often influence a student’s career in as profound a manner as any faculty member. They seldom receive meaningful credit for their accomplishments, and, more importantly, they are easy to eliminate, one by one or in whole sections, when economic hard times hit, given that neither faculty nor graduate students can be so easily eliminated.
         Then there are the university administrators, who differ from their business colleagues only in the degree of ambition and manipulation they bring to climbing the corporate ladder (some are more vicious than their private-economy counterparts; some are less so). The majority are recruited early in their faculty careers, after they have demonstrated “people skills”; those skills might better remain devoted to undergraduates in the classroom. Untutored in administration, they make mistakes that usually, ironically, have to do with interpersonal relations.
         Upper administrators are like titans, beyond the reach of ordinary staff and even faculty. In recent years, becoming a vice-president or provost, or even serving long as a dean, has meant acquiring rights to a “golden parachute” upon retirement. Even before retiring, they enjoy supplements to their income and sometimes unvouchered (and always vouchered) expense accounts. Upon retirement, many months and sometimes even years of paid leave provide a happy sendoff. Equally problematic in the corporate university is upper administrators’ clear lack of focus:  are they fund-raisers, marketers, public relations officers, crisis interveners, or just plain business administrators? One thing they are not, at least any longer, is educational visionaries and protectors of the intellectual life.

It is hard for me to review these elements of the modern university without despairing about the future of higher education. I believe that community colleges, despite sharing many of the problems outlined above, are at least more honest in their goals and purposes. Modern universities, however, are a long way from reform. Their expense has led in part to their quasi-private status, and thereby to their exploitation of students, staff, and faculty. It is no surprise that for-profit, distance-learning institutions have made large inroads on public universities. Once these institutions can establish a better claim to providing their “clients” with sounder “credentials,” public universities will find themselves in a difficult competitive environment. The decline and near disappearance of the idea of the public interest also militates strongly against any reform. So, in general, and in my field of the humanities and history in particular, much of what universities are assumed to provide – intellectual stimulation, critical thinking, problem-solving, integrative knowledge – will be gained by persons discovering things on their own, or in new institutions that will address the loss that has so clearly occurred in our mega-institutions.

Thursday, January 12, 2012

Fundamental Conditions for Living Well

            Over the holidays, I have been asking friends and acquaintances what is necessary to live well. I get many of the usual answers:  health, money, purpose in life, education, and political freedom, among others. It struck me that the tone of most responses suggested a causal relationship between one’s own initiative (i.e., one’s autonomy, agency, and authority) and the conditions that our specific culture or society provides. While I did not disagree with most of the observations offered to me, I was also struck that two very big factors were entirely ignored – the health of our one-and-only planet, and luck. It also seemed to me that a kind of democratic spirit in the responses elevated tertiary conditions (e. g., political freedom) and underestimated more critical ones (e. g., education).

1. The Health of Our Planet – It seems to me that lately we have responded to this absolutely fundamental condition of “well-being” in three ways. First, many persons feel exhausted and defeated after decades of outrage over the incapacity of our governments and societies to attempt even modest responses to climate change. Second, many remain in denial, for one bad reason or another. Third, and most absurdly, some seem to believe that we will find a new planet to colonize. These folks might as well be lumped in with the deniers.
            Even if we are alarmed by what is occurring, the health-of-our-planet issue clearly suggests just how limited human agency is, whether we are thinking of ourselves as individuals or in the collective sense of societies. Perhaps this is why we do so little, or make such small, token efforts (e. g., buying an electric car), in the face of possible extinction. We cannot face how limited our authority is, or how “un-special” we are as a species.

2. Luck – We used to speak of good fortune, or luck, far more frequently. The Greeks, after all, centered their whole conceptualization of the cosmos on “fate,” although they also felt that human beings should not invite bad fortune by acting with hubris or stupidity. Until very recent times, most of us were humbled by the chance good fortune we had received. In my own case, I would have to say that most of my good fortune (and I have had a lot of it) was the consequence of a convergence of many lucky circumstances. That is, I inherited good genes and health; I was lucky to be born at a time, and in a place, where democratic and relatively inexpensive education had reached its zenith; I was lucky to marry a person who supported and aided and encouraged me; I was lucky to get one of the last university teaching positions in my field; and so on.
            Luck is not a popular concept in our aggressive ideological age. To be humble when reflecting on one’s good fortune is anathema to an age that rewards “attitude,” bravado, brash self-assertion, and social and economic “bullying.” That one “makes one’s own luck” is the modern, and often false, mantra (especially, it appears, of some CEOs). To admit luck, good or bad, as fundamental to our well-being means that those of us who are wealthy have no way to justify our wealth. Charity is the buy-off for good luck; it is the action that reveals how, beneath it all, we know we are lucky in comparison to someone else. Unfortunately, charity can also act like Catholic confession; we often like to think it absolves us of the need for further social and political action and reform.

3. Education – A person can be poor or lack good health, but if she is educated to her abilities her well-being will be vastly improved. Obviously, if a person is terminally or perhaps even chronically ill, health may claim a place above education, at least until she is healthy again. In short, I do not agree with the slogan:  “if you have your health you have everything.” For most of us, however, education at all levels is the most important element that human beings can control and improve. There are corollaries to this axiom. First, education must be available and affordable to everyone. In other words, equality of opportunity in education is essential, and in our society this means public education. Second, vocational education, while a useful secondary consequence of real learning, is not real education for the larger, more important purposes of creating better human beings, better citizens, and persons who can think rationally and express themselves creatively. Third, education must be perceived, individually and collectively, as something that occurs lifelong. With good education for all, the richness of life is achieved, and things like good government, liberty, and social and human decency will follow.

4. A “Modest Competence” – Having a “modest competence” is, in large part, a consequence of living on a planet that remains beneficent, of the good luck inherent in one’s person or society, and of a culture in which equal education is secured. But it is also a consequence of public goals and private beliefs. I like to use the 18th-century phrase – “modest competence” – because it is so much more inclusive than saying “a good income” or “money.” A “modest competence” implies economic resources that keep one out of poverty. It is revealing that most countries measure economic poverty as falling below a percentage of median personal or family income. In other words, poverty is made relative, in recognition that the poor are not a fixed social class but persons who, through their circumstances, have been deprived of the full means to live life fully. By contrast, the U. S. measures poverty as an absolute number, and the right-wing Heritage Foundation goes further, measuring poverty by how many material “amenities” a person or family has. This latter approach suggests that poverty should be gauged in terms of material acquisition and consumption alone. (A small sketch of the difference between relative and absolute measures follows at the end of this section.)
            A “modest competence,” however, also implies a set of skills or abilities that frees one from the most demeaning human labor and from servitude. And it can be seen as a measure of personal autonomy and agency in general, as well as of the capacity to be a competent, contributing member of society.
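            For readers who like the contrast made concrete, here is a minimal sketch in Python of the two ways of drawing a poverty line. The 60%-of-median cutoff is an assumption borrowed from common European practice, and the flat dollar threshold is purely illustrative, not the official U. S. poverty line.

# A minimal sketch contrasting relative and absolute poverty measures.
# Both thresholds are illustrative assumptions, not official figures.
from statistics import median

incomes = [9_000, 14_000, 22_000, 31_000, 47_000, 68_000, 95_000]

# Relative measure: poor if income falls below a fraction of median income
# (here an assumed 60%-of-median cutoff, as in some European statistics).
relative_line = 0.6 * median(incomes)
relative_poor = [x for x in incomes if x < relative_line]

# Absolute measure: poor if income falls below a fixed dollar amount,
# no matter how the rest of society fares. The figure is hypothetical.
absolute_line = 13_000
absolute_poor = [x for x in incomes if x < absolute_line]

print(f"Relative line ${relative_line:,.0f}: poor = {relative_poor}")
print(f"Absolute line ${absolute_line:,.0f}: poor = {absolute_poor}")

            The difference matters: on the relative measure, a general rise in incomes raises the poverty line with it, while on the absolute measure the line stays put however prosperous the society becomes.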

Undoubtedly, almost all of you who read this will disagree with at least some of my list, and likely with some of my conclusions. Given my list, however, I am bothered first by how puny our attempts have been, and apparently will continue to be, in regard to really big issues like climate change. It makes me worry that we are simply not terribly competent as a species. Second, I am appalled at how little true humility and true charity we feel and express in consequence of the luck factor. This suggests to me that we are not terribly competent as a species but believe we are. Third, our abandonment of real education at almost every level draws me toward the conclusion that we are not terribly competent as a species, and do not give a damn that we are not. And, finally, our inattention to establishing a “modest competence” for all suggests to me that we are not terribly competent as a species, and are filled with disrespect for others and a general self-loathing of the human condition. Yet, despite all of my concerns about the limits of human intelligence and goodness, we can look at our current condition as so bad that there is no way to go but up. In fact, I do believe that we are at least on the cusp of addressing some of the matters that would improve our collective well-being.