NAS Online Forum
The National Association of Scholars (NAS) Online Forum provides concise and timely commentary on recent news and current issues in American higher education. Although the opinions of contributors are not necessarily those of the NAS, we welcome your reaction to these postings and this site.
Monday, August 04, 2003
Clinical Psychology: Another Totalitarianism of the Left?
Submitted by a Clinical Psychologist from the Midwest
I have been a clinical psychologist for the last 22 years, working with the severely mentally ill in government-operated institutions. Because these institutions lacked university affiliations, I was protected for several years from the liberal orthodoxy that was taking over academia. But no longer. The field of clinical psychology is increasingly focused on bringing about "equity of result" among the various cells of the periodic table of race, class, and gender. It seems unwilling to tolerate diversity of thought that might conflict with this preoccupation.
A discussion that took place a few months ago at a meeting of psychologists is a good example of what is happening. The topic concerned "competencies," the skill areas in which the American Psychological Association requires psychology interns to be trained. One of these is "diversity." A few people noted that they had been trained ten or more years ago, felt ill-informed about the area, and were at a loss when it came to training and assessing students in it. No one, myself included, pointed out that perhaps this was because it had very little to do with the actual practice of clinical psychology (otherwise, as practicing clinicians, they would have had to become informed about the topic), but everything to do with academic fashion and political correctness.
The discussion turned to the "complexity of diversity": the fact that diversity is racial, ethnic, and religious, and that this complexity contributes to the difficulty of training students. I pointed out that the field of psychology is lacking in political diversity, that psychologists are far more liberal than the population at large, whom we are responsible for treating. I heard someone remark that the ability to advocate on behalf of marginalized groups -- gays, lesbians, the transgendered, ethnic minorities -- was being included as one aspect of being judged competent to provide services to such groups, a competency now required of all those being trained to become clinical psychologists.
Suppressing a sense of outrage, I stated that one person's advocacy was not necessarily another person's, and that this appeared to be a rather thinly disguised way for certain groups to advance their political agenda. Two people acknowledged immediately that it was political, one of them seeming pleased, the other distressed. I said that it seemed quite unfair to require allegiance to any group's politics in order to be judged professionally competent. The person next to me turned and said, softly smiling, "If you don't agree and go along with the program, your internship won't be accredited."
We see psychology as a science and rely on it as a source of truth about the human experience, especially in matters concerning the socialization of the young. In many ways, it has supplanted the religious, cultural, and social traditions that used to be dominant in our society. But psychology is not a science, critically examining tentative hypotheses about human behavior in the light of new data. Psychology is becoming a totalitarian force, reshaping our society in the image and likeness of orthodox liberalism.
Why is there no awareness of this pervasive bias within the field of clinical psychology? Are we as clinicians so caught up in our work with patients that we give little thought to the broader implications of our thinking? Have we become so liberal that we are unaware of any other ways of understanding society aside from the race, class, gender triad? Do we see the human experience as a perpetual therapy session, in which being "judgmental" about anything is forbidden? Are we no longer educated in the true liberal sense, and thus lack the critical skills to challenge "politically correct" thinking?
Psychology needs to be challenged publicly and its biases exposed. Perhaps there should be a website focused on the "deconstruction of psychology." Any other ideas?
Tuesday, July 29, 2003
A Crucial Distinction in Writing and Teaching History
Neil Cameron, John Abbott College
The July 11 Frontpagemag.com carried an article by Greg Yardley called "Historians Against History." It is full of good intentions, but I'm afraid it badly misunderstands just what has gone wrong in a lot of recent historical scholarship. Yardley's ire is directed at the radical activist group within the American Historical Association that calls itself "Historians Against the War." He attacks these tenured propagandists in the name of historical "professionalism," a "noble dream" of "objectivity, balance, and evenhandedness."
Yardley holds that historians are free to engage in polemic journalism and political activism, but that their collective views on current issues should have no more credibility than, say, those of "Gardeners against the War." When teaching history himself, he tried to adopt an "objective" approach, behaving differently than he would as a journalist.
But his notion that there once was a world of such "objective" scholarship, until the radical and subjectivist serpent entered the garden, is just a fine romance, which could be easily shredded by the radicals that he deplores. Certainly no one could conceivably argue that before "the study of history became a profession," even the greatest histories were free of "presentist" political influences: think of Hume or Macaulay. There are also contemporary popular historians, Paul Johnson the best example, who still write very good history in this tradition.
Moreover, much "professional" history in the twentieth century showed obvious, and by no means indisputable, political inclinations. A whole school of "progressive" historians like Charles Beard and Carl Becker was followed by similar figures in later generations.
Even 1960s radicalism, while mostly producing more heat than light, brought a few substantial historical contributions from writers like Christopher Lasch and Eugene Genovese. Yardley and many other younger writers now seem unaware of an old scholarly distinction, that between partisanship and bias.
Partisanship is what historians, and scholars in other liberal disciplines, are bound to display as a simple feature of their individual character. The approach made to documents is bound to be different for the religious or the secular, the radical or the conservative. Some of the most intellectually and morally instructive history has been written by passionately partisan scholars.
But from Thucydides to Niall Ferguson, these writers have two other qualities as well. They display their parti pris openly to their readers, and they take it for granted that they still have an obligation to treat documentary historical evidence with a code of honesty, to give full weight to documents that tell against their preferences, and to engage seriously with intelligent criticism from other scholars with opposing views.
Bias literally means "slant," and what is typical of the awful stuff produced by contemporary academic radicals is that it is so slanted as, at a minimum, to suppress the whole truth, and in many cases propagate outright lies. In the Cold War years, this charge could sometimes be justifiably brought against those historians who were actual communists or fellow travelers, since it was impossible for them to give an honest account of the countless historical topics on which the Party had a "line." But the causes of appalling bias by the radicals of the last two decades have been somewhat different.
The New Left radicals of the 1960s recognized a real frailty in "professional" scholarship as such. While they, too, confused bias and partisanship, they accurately discerned plenty of the latter, and a bit of the former, in the "mainstream" and America-affirming liberal history of writers like Arthur Schlesinger Jr. They further confused "liberalism" as a political program and as a methodological principle of openness to different points of view. In this latter sense, academics of widely varied political views had formerly been able to say, "we are all liberals now." Radicals treated this as a mere convention of the hegemonic bourgeoisie, but initially had to play along.
But as the post-secondary educational system got bigger and bigger, quantitative changes started becoming qualitative ones. In university departments of only, say, half a dozen professors, there might be only one really outstanding scholar or teacher, but there would seldom be more than one complete dud or nutcase as well. But when these departments began to number in the dozens, both scholarly capacity and political opinion began to approach a normal curve distribution in each institution, so that two or three political lunatics could establish their own coteries of undergraduates and graduate students and campus lobby groups.
Many "middle-of-the-road" professors long underestimated the full implications of this change. They had been formerly able to assume safely that bad scholarship and teaching, since they generated nothing of lasting value, could just be treated as the irritating chaff that went along with the harvest of wheat. What they did not anticipate was that the chaff would set out to drive out the wheat altogether. Real scholars and teachers are correctly recognized as unbearable competition, dangerously attractive to bright and undeceived students, and producing articles and books that are actually read.
Yardley's plaint is not really against historical studies in particular, it is against the corrupt politicizing in all large formal organizations that bow to the insistent demands of the noisy and ambitious for the sake of a quiet life. Off the university campus, consider the National Education Association, largely dedicated to wrecking the American public school system for the last half century, or even the prestigious American Association for the Advancement of Science, increasingly being bullied into cheering on environmental arguments based on weak or non-existent scientific evidence.
The Cold War once did a favour for defenders of Western civilization that had little to do with the political struggle between the United States and the Soviet Union. It made it possible for academic liberals (in both the political and methodological sense) to recognize that there were at least some variants of radical thought that they were compelled to fight head on. In a "post-Marxist" world, few have any idea how to confront a subjectivist, nihilist, America-hating nitwit, especially if he/she/he-she also claims to speak on behalf of previously silenced lesbian Latinos in wheelchairs, or whatever.
Yardley actually ends his article by proposing that federal or state authorities intervene directly to straighten things out. This is a terrible idea, and a very naïve one. In a world like the present, state intervention would be far more likely to entrench the nutcases, not the scholars. The latter are just going to have to learn to fight, using the old weapons of logic, reason, and common sense. They need not be ashamed of being "partisan" in such causes as patriotism, reason, and intellectual honesty.
Thursday, July 24, 2003
Web Logs to Keep Watch on the Academy
King Banaian, St. Cloud State University
A recent article in the Chronicle of Higher Education, "Scholars Who Blog," discusses the use of web logs or "blogs" by academics. "Some have started blogging" (blog is both a noun and a verb on the Internet) "in order to muse aloud about their research. Others want to polish their chops at opinion-writing for nonacademic audiences. Still others have more urgent and personal reasons."
It notes the rise of watchdog blogs as well, including mention of one site, Erin O'Connor's Critical Mass, which "advances a generally conservative critique of grade inflation, sensitivity training, and speech codes."
Blogs have been around for a few years now. A blog is simply a webpage that one frequently updates with entries that give the flavor of a diary, or of a log of one's experiences, or a set of journal entries for one's future writing.
A company called Blogger (since acquired by Google) provided both the software and the servers that made creating and maintaining blogs quite easy. Other companies offer different software. Blogging service companies offer the ability to let others comment on entries or to track the diffusion of one's entries across other blogs. Bloggers tend either to provide many links to interesting reading or to write longer entries of commentary on what they have read.
The latter kind of blog can be used to advertise or to inform. Academics wishing to share their expertise with a wider public can use blogs to create a market for their talent. This has proved particularly useful for law professors and those in the social sciences.
Two of the most popular blogs, InstaPundit and the Volokh Conspiracy, are written by law professors (the Volokh Conspiracy is actually written by more than a dozen authors, but its namesake and most frequent contributor is a law professor.)
Economists such as Brad DeLong and Daniel Drezner also are able to advertise their skills.
Critical Mass, on the other hand, does not advertise Prof. O'Connor as an individual but serves as a source of information on troubles in the academy, told from the perspective of an English professor.
Invisible Adjunct and Accidental Administrator reflect on the academy from their perspectives as an adjunct history professor and a middle-level university administrator. I am coauthor of a site, SCSU Scholars, with (tenured) professors in different colleges, which allows us to disseminate information from across the campus.
The rise of blogs like these is important in sharing information. Academic institutions are large, and many administrations suffer from insularity. Lounges where faculty could discuss the institution informally and without fear of reprisal for unpopular views are disappearing (at least at St. Cloud).
When one or two faculty speak out against unhealthy turns in the academy, such as political correctness, their complaints are branded as isolated, unrepresentative incidents. A use of blogs, then, is to allow faculty at one school to see patterns of behavior repeated elsewhere and to obtain ideas of how others may have been able to stop that behavior.
Just as important as these have been the blogs created by students to describe poor behavior by professors inside and outside the classroom, as well as poor administrations and student governments. These are too numerous to name completely, but one site, Campus Nonsense, has acted as a clearinghouse for many student blogs and conservative student newspapers.
Another site, No Indoctrination, was created by a concerned parent whose student son felt he was being subjected to left-wing indoctrination in some of his classes. No Indoctrination is not a blog, but it represents the increasing use of the Internet to pass on information about professors who use their classrooms to express extracurricular political views.
As campuses continue to pursue speech codes or otherwise control the flow of information from administration or faculty union to the faculty, the Internet will continue to provide a means of working around those roadblocks. Blogs are particularly useful in that information can be provided quickly, in a public space, outside the domain of the university, and bring in valuable sunlight from the wider public to what goes on in universities today.
Wednesday, July 23, 2003
Taking Diversity Seriously
George C. Leef
The Pope Center for Higher Education Policy
The recent Supreme Court ruling that state universities may use racial preferences in order to obtain a "diverse" student body -- provided that they aren't too blatant about it -- has met with fulsome praise from most educational leaders. But now that they have been given the green light to engineer diversity, will universities actually do it?
While the Constitution requires the states to give "equal protection of the laws" to all citizens and the Civil Rights Act of 1964 forbids educational institutions that receive any federal funds from engaging in racial discrimination, in Grutter v. Bollinger the Court saw fit to give universities a pass on their policies of selective racial preference in admissions on the grounds that there were "educational benefits" to be derived from having a diverse student body and that a state has a "compelling interest" in attaining those benefits. Preferential treatment of applicants based on race is therefore legitimate.
There are strong reasons to doubt that those educational benefits are either real or significant. The Court just took the schools' word for it that more diversity means more harmony, understanding, etc., despite a considerable amount of evidence that it often has just the opposite result.
And if you're looking for proof that "diversity" enables students to learn their chemistry or history or economics better, forget it. The supposed benefits are not that students learn their subjects better, but that they learn about each other. Most university leaders have warmly embraced the multiculturalist conceit that if we turn our institutions into big Kumbaya sing-alongs, bad feelings will disappear. So let's assume for the sake of argument that diversity does really good things for a university and its students.
We then have to ask why our colleges and universities are doing so little to promote it. American universities public and private have for years been endeavoring to increase the numbers of black, Hispanic and American Indian students. Fine, but how do we know that this limited amount of racial and ethnic diversity is enough? Or even the best kind of diversity to have?
After all, human beings can be different in many aspects other than race. Consider religion, for example. The predominant religious affiliation in North Carolina, where I live, is Baptist. Does it contribute more to diversity on campus to add another black student who is Baptist, or to add another white student who is Catholic or Jewish - both denominations with far fewer adherents in the state? Or perhaps to add a religious nonbeliever of any background? If exposure to people with different racial backgrounds is educationally beneficial, wouldn't exposure to people of different religious convictions be at least as beneficial? Why aren't our universities working to make sure that they have a reasonable cross section of the range of religious belief?
What about political and economic philosophies? Students differ enormously in that regard. Although many students don't have any strong views on political and economic issues, a lot of them do. Within the country, you'll find everything from devotees of Ayn Rand to die hard believers in socialism. Democrats, Republicans, Libertarians, Green Party members. We know that the universities don't make any attempt at ideological balance among the faculty members, who are overwhelmingly leftist, but if it's important to have a diverse student body, shouldn't they make sure that there is at least a "critical mass" of students representing all points on the spectrum?
And which of these diversities is the most beneficial? I don't know, but shouldn't college administrators be studying that question? Does a black Baptist Democrat do more for diversity than a white agnostic Marxist? If we are really going to take diversity seriously, we shouldn't just assume that racial or ethnic characteristics are necessarily much more potent diversity boosters than are features based on the individual's thoughts.
We have only scratched the surface of diversity, though. People are very diverse in their socio-economic backgrounds. Some students come from struggling, single-parent households, while others grew up in households with two parents and no shortage of money. Some students come from families with small businesses, while in others, Mom, Dad, or perhaps both worked for large corporations.
Some students come from small, rural towns; others grew up in big cities and their suburbs. Those differences may have a strong impact on a person's outlook on life. Are our colleges trying to ensure that there's at least a "critical mass" of students from each of the many identifiable socio-economic groups in society? If not, why not?
Do we need to be certain that we have diversity with regard to preferences for sports and fitness versus couch potatoism? Diversity in musical tastes? Diversity in food preferences? Is diversity enhanced more with the admission of a guy whose lawyer father's ancestors came from Spain and who likes NASCAR racing, hard rock music and western Carolina barbeque, or a vegetarian white gal whose single mother taught her to love and play Bach's cello suites?
Don't we also need to ensure sufficient representation on campus of animal rights activists and hunters? Of home schooled kids? Of students who abstain from alcohol, drugs, and sex, and those who don't? Of believers in astrology and those who scoff at it?
Once the college admissions department has assembled the ideal mix of diverse students, whatever that might be, the job is hardly over. If we're going to take diversity seriously, we can't stop with the overall student body. To ensure a maximum of diversity contacts among students, administrators will carefully have to engineer dormitories for balance and "critical mass" along all the various diversity vectors. The same will have to be done for classes.
If diversity is truly a compelling state interest, we can't leave it to chance that we'll get enough of it by allowing students to live where they want and study what they want. We'll need housing diversity experts to tell us whether it's better to have a Hispanic Catholic hunter who likes beer room with the white Episcopalian PETA supporter who also likes beer, or with the black Methodist who is indifferent toward hunting but thinks that no one should let a drop of alcohol pass his lips. The research necessary for such determinations will undoubtedly lead to many new federal grants.
Doing diversity looks exceedingly hard. Or is it?
What if universities simply admitted the best qualified applicants? By "best qualified," I mean those with the strongest evidence of academic aptitude -- the highest SAT scores and grade averages. Those students would no doubt differ markedly in many ways. You would have some racial and ethnic diversity, some religious diversity, political diversity, socio-economic diversity, and so on.
Whether, and to what extent there would be any "educational benefits" from that mixture is questionable. But those students would be fairly homogeneous in one important respect. They'd all be almost equally capable of learning the material covered by their professors.
Isn't that what college is supposed to be about, anyway?