Monday, April 28, 2008

Were we prepared to teach?

I returned from the APA Central meeting in Chicago last week, and a conversation I had with several philosophers there has really stuck with me. The conversation included only young-ish philosophers (those who've received Ph.D.'s within the last ten years, along with some current grad students) and the discussion came around to how well graduate philosophy programs prepare students for their future careers. As we all observed, the Ph.D. is largely a research credential, but most programs at least make some effort to develop their students' teaching abilities. At the same time though, it was nearly unanimous (especially among those who had tenure-stream jobs) that few programs do enough to prepare students for specific pedagogical or vocational challenges: teaching large courses and/or large numbers of sections, designing courses, writing good examinations, serving on committees, etc.


What was clear to me afterward is that, for the overwhelming majority of academic philosophers, their jobs are quite different — in their day-to-day rhythms, patterns of work, even fundamental professional expectations — from what they were trained to do in graduate school. One participant in the discussion put it very nicely: "They trained us to replace them. But what else should we expect from graduate faculty? That's probably all they've ever known." In other words, graduate faculty train graduate students to do the work of graduate faculty at Research I institutions. Yet only a tiny minority of philosophers have that kind of academic position.

I've long wondered if our discipline takes the right approach to preparing future faculty. Needless to say, the evidence from my discussion at the APA is largely negative. Yet I thought we could take a modestly systematic approach to this question here at ISW: In the comments to this post, I'd like to hear whether those in the profession feel their graduate school training has prepared them (or is preparing them) for the jobs they have (or will have). I'd appreciate people being as specific as possible: I'm curious about teaching, obviously, but I'd also be interested in anything else you think is relevant to the performance of your professional duties. I'd also be interested in hearing from people at various career stages, to see if there's a larger evolution in how grad programs prepare people for work. Thanks -- I look forward to your input and reflections. (And since I can imagine people wanting to be anonymous in comments, I'd appreciate your helping keep everyone straight by commenting as Anon 1, Anon 2, etc.)


Thursday, April 24, 2008

Extra Credit Gone Wild?

There are a lot of ethics- and philosophy-related events in my geographical area (Atlanta). For my ethics classes especially, I let students know about select events and allow them to attend (and write up a detailed summary and reaction) for extra credit. My main reason for doing this is to encourage them to get out and be exposed to philosophical ideas, arguments, information, and people that most of them are not very familiar with. Many students take advantage of these opportunities, and they often seem to enjoy and benefit from them, but I am wondering about a possible problem here.

The possible problem is that in some semesters there are so many of these events that a student could significantly raise his or her grade by attending a lot of them. Thus, conceivably, a student could do rather poorly on the tests and papers -- so poorly that the student would otherwise fail -- but the extra credit results in a passing average. Or a student at a 'C' or so could do so much extra credit that he or she gets bumped up even to an 'A'. Since I suppose one's grade should really be (mostly?) based on the degree to which one has "mastered" the information and skills presented in the class, these extra-credit-enhanced grades don't reflect that competence.

Is this a problem? If so, what to do about it?

One idea is a "cap" on extra credit. But what would that cap be? And, more importantly, if there were a cap then fewer students would be getting out there to see these interesting presentations, which would be bad, given my goals for teaching.
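To make the worry concrete, here is a minimal, purely hypothetical sketch (the point values and the 100-point scale are made up for illustration, not drawn from my actual syllabus) of how uncapped extra credit can turn a failing average into a passing one, and how a cap limits the effect:

    # Hypothetical illustration: capped vs. uncapped extra credit,
    # with grades on a 100-point scale.
    def final_grade(exam_paper_avg, extra_credit, cap=None):
        """Add extra-credit points to the exam/paper average,
        optionally limiting the total extra credit allowed."""
        if cap is not None:
            extra_credit = min(extra_credit, cap)
        return exam_paper_avg + extra_credit

    # A student failing on tests and papers (55) who attends eight
    # events worth 2 points each:
    print(final_grade(55, 8 * 2))          # 71 -- passes with no cap
    print(final_grade(55, 8 * 2, cap=5))   # 60 -- a cap keeps the boost modest

Of course, whatever number one picks for the cap just restates the original question of how much of this kind of credit should be allowed to count toward the grade.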

Another idea is to revise the view that one's grade should be (mostly) based on the degree to which one has "mastered" the information and skills presented in the class. Another category could be added (e.g., "interest in the material," manifesting itself in going to see outside speakers?), but perhaps any such category would be bogus. Or maybe not: maybe cultivating students' interest in and motivation for engaging intellectual topics is a goal in itself, although one that can't easily be measured. I don't think interest in the material should be graded, but perhaps it can be justly rewarded?

Another option is to assimilate the extra credit activities into the legitimate "skills" category, since they are supposed to be events where students apply the critical thinking skills we are learning and practicing in class; perhaps by that route such radical grade improvements could be justified.

I wonder what thoughts people have on these issues. Maybe someone has already dealt with this problem and has a good response.

Wednesday, April 23, 2008

"It Makes no Difference Whether or Not I Do It"?

Here's a recent article by Michael Pollan from the NY Times, entitled "Why Bother?", that discusses a response that many people, including students, make to possible moral demands, namely, "It just doesn't matter what I do, so I don't need to do anything," "My efforts won't make any difference, or enough of a difference, so there's no obligation here for me," and so on. Perhaps this article will help people think about and respond to this kind of objection in class or in their own philosophizing (and, ideally, both).




April 20, 2008
The Way We Live Now
Why Bother?
By MICHAEL POLLAN

Why bother? That really is the big question facing us as individuals hoping to do something about climate change, and it’s not an easy one to answer. I don’t know about you, but for me the most upsetting moment in “An Inconvenient Truth” came long after Al Gore scared the hell out of me, constructing an utterly convincing case that the very survival of life on earth as we know it is threatened by climate change. No, the really dark moment came during the closing credits, when we are asked to . . . change our light bulbs. That’s when it got really depressing. The immense disproportion between the magnitude of the problem Gore had described and the puniness of what he was asking us to do about it was enough to sink your heart.

But the drop-in-the-bucket issue is not the only problem lurking behind the “why bother” question. Let’s say I do bother, big time. I turn my life upside-down, start biking to work, plant a big garden, turn down the thermostat so low I need the Jimmy Carter signature cardigan, forsake the clothes dryer for a laundry line across the yard, trade in the station wagon for a hybrid, get off the beef, go completely local. I could theoretically do all that, but what would be the point when I know full well that halfway around the world there lives my evil twin, some carbon-footprint doppelgänger in Shanghai or Chongqing who has just bought his first car (Chinese car ownership is where ours was back in 1918), is eager to swallow every bite of meat I forswear and who’s positively itching to replace every last pound of CO2 I’m struggling no longer to emit. So what exactly would I have to show for all my trouble?

A sense of personal virtue, you might suggest, somewhat sheepishly. But what good is that when virtue itself is quickly becoming a term of derision? And not just on the editorial pages of The Wall Street Journal or on the lips of the vice president, who famously dismissed energy conservation as a “sign of personal virtue.” No, even in the pages of The New York Times and The New Yorker, it seems the epithet “virtuous,” when applied to an act of personal environmental responsibility, may be used only ironically. Tell me: How did it come to pass that virtue — a quality that for most of history has generally been deemed, well, a virtue — became a mark of liberal softheadedness? How peculiar, that doing the right thing by the environment — buying the hybrid, eating like a locavore — should now set you up for the Ed Begley Jr. treatment.

And even if in the face of this derision I decide I am going to bother, there arises the whole vexed question of getting it right. Is eating local or walking to work really going to reduce my carbon footprint? According to one analysis, if walking to work increases your appetite and you consume more meat or milk as a result, walking might actually emit more carbon than driving. A handful of studies have recently suggested that in certain cases under certain conditions, produce from places as far away as New Zealand might account for less carbon than comparable domestic products. True, at least one of these studies was co-written by a representative of agribusiness interests in (surprise!) New Zealand, but even so, they make you wonder. If determining the carbon footprint of food is really this complicated, and I’ve got to consider not only “food miles” but also whether the food came by ship or truck and how lushly the grass grows in New Zealand, then maybe on second thought I’ll just buy the imported chops at Costco, at least until the experts get their footprints sorted out.

There are so many stories we can tell ourselves to justify doing nothing, but perhaps the most insidious is that, whatever we do manage to do, it will be too little too late. Climate change is upon us, and it has arrived well ahead of schedule. Scientists’ projections that seemed dire a decade ago turn out to have been unduly optimistic: the warming and the melting is occurring much faster than the models predicted. Now truly terrifying feedback loops threaten to boost the rate of change exponentially, as the shift from white ice to blue water in the Arctic absorbs more sunlight and warming soils everywhere become more biologically active, causing them to release their vast stores of carbon into the air. Have you looked into the eyes of a climate scientist recently? They look really scared.

So do you still want to talk about planting gardens?

I do.

Whatever we can do as individuals to change the way we live at this suddenly very late date does seem utterly inadequate to the challenge. It’s hard to argue with Michael Specter, in a recent New Yorker piece on carbon footprints, when he says: “Personal choices, no matter how virtuous [N.B.!], cannot do enough. It will also take laws and money.” So it will. Yet it is no less accurate or hardheaded to say that laws and money cannot do enough, either; that it will also take profound changes in the way we live. Why? Because the climate-change crisis is at its very bottom a crisis of lifestyle — of character, even. The Big Problem is nothing more or less than the sum total of countless little everyday choices, most of them made by us (consumer spending represents 70 percent of our economy), and most of the rest of them made in the name of our needs and desires and preferences.

For us to wait for legislation or technology to solve the problem of how we’re living our lives suggests we’re not really serious about changing — something our politicians cannot fail to notice. They will not move until we do. Indeed, to look to leaders and experts, to laws and money and grand schemes, to save us from our predicament represents precisely the sort of thinking — passive, delegated, dependent for solutions on specialists — that helped get us into this mess in the first place. It’s hard to believe that the same sort of thinking could now get us out of it.

Thirty years ago, Wendell Berry, the Kentucky farmer and writer, put forward a blunt analysis of precisely this mentality. He argued that the environmental crisis of the 1970s — an era innocent of climate change; what we would give to have back that environmental crisis! — was at its heart a crisis of character and would have to be addressed first at that level: at home, as it were. He was impatient with people who wrote checks to environmental organizations while thoughtlessly squandering fossil fuel in their everyday lives — the 1970s equivalent of people buying carbon offsets to atone for their Tahoes and Durangos. Nothing was likely to change until we healed the “split between what we think and what we do.” For Berry, the “why bother” question came down to a moral imperative: “Once our personal connection to what is wrong becomes clear, then we have to choose: we can go on as before, recognizing our dishonesty and living with it the best we can, or we can begin the effort to change the way we think and live.”

For Berry, the deep problem standing behind all the other problems of industrial civilization is “specialization,” which he regards as the “disease of the modern character.” Our society assigns us a tiny number of roles: we’re producers (of one thing) at work, consumers of a great many other things the rest of the time, and then once a year or so we vote as citizens. Virtually all of our needs and desires we delegate to specialists of one kind or another — our meals to agribusiness, health to the doctor, education to the teacher, entertainment to the media, care for the environment to the environmentalist, political action to the politician.

As Adam Smith and many others have pointed out, this division of labor has given us many of the blessings of civilization. Specialization is what allows me to sit at a computer thinking about climate change. Yet this same division of labor obscures the lines of connection — and responsibility — linking our everyday acts to their real-world consequences, making it easy for me to overlook the coal-fired power plant that is lighting my screen, or the mountaintop in Kentucky that had to be destroyed to provide the coal to that plant, or the streams running crimson with heavy metals as a result.

Of course, what made this sort of specialization possible in the first place was cheap energy. Cheap fossil fuel allows us to pay distant others to process our food for us, to entertain us and to (try to) solve our problems, with the result that there is very little we know how to accomplish for ourselves. Think for a moment of all the things you suddenly need to do for yourself when the power goes out — up to and including entertaining yourself. Think, too, about how a power failure causes your neighbors — your community — to suddenly loom so much larger in your life. Cheap energy allowed us to leapfrog community by making it possible to sell our specialty over great distances as well as summon into our lives the specialties of countless distant others.

Here’s the point: Cheap energy, which gives us climate change, fosters precisely the mentality that makes dealing with climate change in our own lives seem impossibly difficult. Specialists ourselves, we can no longer imagine anyone but an expert, or anything but a new technology or law, solving our problems. Al Gore asks us to change the light bulbs because he probably can’t imagine us doing anything much more challenging, like, say, growing some portion of our own food. We can’t imagine it, either, which is probably why we prefer to cross our fingers and talk about the promise of ethanol and nuclear power — new liquids and electrons to power the same old cars and houses and lives.

The “cheap-energy mind,” as Wendell Berry called it, is the mind that asks, “Why bother?” because it is helpless to imagine — much less attempt — a different sort of life, one less divided, less reliant. Since the cheap-energy mind translates everything into money, its proxy, it prefers to put its faith in market-based solutions — carbon taxes and pollution-trading schemes. If we could just get the incentives right, it believes, the economy will properly value everything that matters and nudge our self-interest down the proper channels. The best we can hope for is a greener version of the old invisible hand. Visible hands it has no use for.

But while some such grand scheme may well be necessary, it’s doubtful that it will be sufficient or that it will be politically sustainable before we’ve demonstrated to ourselves that change is possible. Merely to give, to spend, even to vote, is not to do, and there is so much that needs to be done — without further delay. In the judgment of James Hansen, the NASA climate scientist who began sounding the alarm on global warming 20 years ago, we have only 10 years left to start cutting — not just slowing — the amount of carbon we’re emitting or face a “different planet.” Hansen said this more than two years ago, however; two years have gone by, and nothing of consequence has been done. So: eight years left to go and a great deal left to do.

Which brings us back to the “why bother” question and how we might better answer it. The reasons not to bother are many and compelling, at least to the cheap-energy mind. But let me offer a few admittedly tentative reasons that we might put on the other side of the scale:

If you do bother, you will set an example for other people. If enough other people bother, each one influencing yet another in a chain reaction of behavioral change, markets for all manner of green products and alternative technologies will prosper and expand. (Just look at the market for hybrid cars.) Consciousness will be raised, perhaps even changed: new moral imperatives and new taboos might take root in the culture. Driving an S.U.V. or eating a 24-ounce steak or illuminating your McMansion like an airport runway at night might come to be regarded as outrages to human conscience. Not having things might become cooler than having them. And those who did change the way they live would acquire the moral standing to demand changes in behavior from others — from other people, other corporations, even other countries.

All of this could, theoretically, happen. What I’m describing (imagining would probably be more accurate) is a process of viral social change, and change of this kind, which is nonlinear, is never something anyone can plan or predict or count on. Who knows, maybe the virus will reach all the way to Chongqing and infect my Chinese evil twin. Or not. Maybe going green will prove a passing fad and will lose steam after a few years, just as it did in the 1980s, when Ronald Reagan took down Jimmy Carter’s solar panels from the roof of the White House.

Going personally green is a bet, nothing more or less, though it’s one we probably all should make, even if the odds of it paying off aren’t great. Sometimes you have to act as if acting will make a difference, even when you can’t prove that it will. That, after all, was precisely what happened in Communist Czechoslovakia and Poland, when a handful of individuals like Vaclav Havel and Adam Michnik resolved that they would simply conduct their lives “as if” they lived in a free society. That improbable bet created a tiny space of liberty that, in time, expanded to take in, and then help take down, the whole of the Eastern bloc.

So what would be a comparable bet that the individual might make in the case of the environmental crisis? Havel himself has suggested that people begin to “conduct themselves as if they were to live on this earth forever and be answerable for its condition one day.” Fair enough, but let me propose a slightly less abstract and daunting wager. The idea is to find one thing to do in your life that doesn’t involve spending or voting, that may or may not virally rock the world but is real and particular (as well as symbolic) and that, come what may, will offer its own rewards. Maybe you decide to give up meat, an act that would reduce your carbon footprint by as much as a quarter. Or you could try this: determine to observe the Sabbath. For one day a week, abstain completely from economic activity: no shopping, no driving, no electronics.

But the act I want to talk about is growing some — even just a little — of your own food. Rip out your lawn, if you have one, and if you don’t — if you live in a high-rise, or have a yard shrouded in shade — look into getting a plot in a community garden. Measured against the Problem We Face, planting a garden sounds pretty benign, I know, but in fact it’s one of the most powerful things an individual can do — to reduce your carbon footprint, sure, but more important, to reduce your sense of dependence and dividedness: to change the cheap-energy mind.

A great many things happen when you plant a vegetable garden, some of them directly related to climate change, others indirect but related nevertheless. Growing food, we forget, comprises the original solar technology: calories produced by means of photosynthesis. Years ago the cheap-energy mind discovered that more food could be produced with less effort by replacing sunlight with fossil-fuel fertilizers and pesticides, with a result that the typical calorie of food energy in your diet now requires about 10 calories of fossil-fuel energy to produce. It’s estimated that the way we feed ourselves (or rather, allow ourselves to be fed) accounts for about a fifth of the greenhouse gas for which each of us is responsible.

Yet the sun still shines down on your yard, and photosynthesis still works so abundantly that in a thoughtfully organized vegetable garden (one planted from seed, nourished by compost from the kitchen and involving not too many drives to the garden center), you can grow the proverbial free lunch — CO2-free and dollar-free. This is the most-local food you can possibly eat (not to mention the freshest, tastiest and most nutritious), with a carbon footprint so faint that even the New Zealand lamb council dares not challenge it. And while we’re counting carbon, consider too your compost pile, which shrinks the heap of garbage your household needs trucked away even as it feeds your vegetables and sequesters carbon in your soil. What else? Well, you will probably notice that you’re getting a pretty good workout there in your garden, burning calories without having to get into the car to drive to the gym. (It is one of the absurdities of the modern division of labor that, having replaced physical labor with fossil fuel, we now have to burn even more fossil fuel to keep our unemployed bodies in shape.) Also, by engaging both body and mind, time spent in the garden is time (and energy) subtracted from electronic forms of entertainment.

You begin to see that growing even a little of your own food is, as Wendell Berry pointed out 30 years ago, one of those solutions that, instead of begetting a new set of problems — the way “solutions” like ethanol or nuclear power inevitably do — actually beget other solutions, and not only of the kind that save carbon. Still more valuable are the habits of mind that growing a little of your own food can yield. You quickly learn that you need not be dependent on specialists to provide for yourself — that your body is still good for something and may actually be enlisted in its own support. If the experts are right, if both oil and time are running out, these are skills and habits of mind we’re all very soon going to need. We may also need the food. Could gardens provide it? Well, during World War II, victory gardens supplied as much as 40 percent of the produce Americans ate.

But there are sweeter reasons to plant that garden, to bother. At least in this one corner of your yard and life, you will have begun to heal the split between what you think and what you do, to commingle your identities as consumer and producer and citizen. Chances are, your garden will re-engage you with your neighbors, for you will have produce to give away and the need to borrow their tools. You will have reduced the power of the cheap-energy mind by personally overcoming its most debilitating weakness: its helplessness and the fact that it can’t do much of anything that doesn’t involve division or subtraction. The garden’s season-long transit from seed to ripe fruit — will you get a load of that zucchini?! — suggests that the operations of addition and multiplication still obtain, that the abundance of nature is not exhausted. The single greatest lesson the garden teaches is that our relationship to the planet need not be zero-sum, and that as long as the sun still shines and people still can plan and plant, think and do, we can, if we bother to try, find ways to provide for ourselves without diminishing the world.

Michael Pollan, a contributing writer for the magazine, is the author, most recently, of “In Defense of Food: An Eater’s Manifesto.”


Monday, April 21, 2008

Plagiarizing to avoid plagiarism

I came across a nice article about anti-plagiarism strategies at Inside Higher Ed. I really like what Kate Hagopian, a writing instructor at North Carolina State, does: She has the students intentionally plagiarize.



For one assignment, she gives her students a short writing passage and then a prompt for a standard short student essay. She asks her students to turn in two versions: in one they are told that they must plagiarize; in the second, they are told not to. The night before, the students are given an online tutorial on plagiarism, and Hagopian said she has become skeptical that having the students "parrot back what we've told them" accomplishes anything. Her hope is that this unusual assignment might change that.

After the students turn in their two responses to the essay prompt, Hagopian shares some with the class. Not surprisingly, the students do know how to plagiarize — but they are uncomfortable admitting as much. Hagopian said that the assignment is always greeted with "uncomfortable laughter," as the students must pretend that they never would have thought of plagiarizing on their own. Given the right to do so, they turn in essays with many direct quotes without attribution. Of course, in the essays that are supposed to be done without plagiarism, she still finds problems — not so much with passages repeated verbatim, but with paraphrasing or syntax so similar to the original that it requires attribution.

When she started giving the assignment, she sort of hoped, Hagopian said, to see students turn in "nuanced tricky demonstrations" of plagiarism, but she mostly gets garden-variety copying. But what she is doing is having detailed conversations with her students about what is and isn't plagiarism — and by turning everyone into a plagiarist (at least temporarily), she makes the conversation something that can take place openly.

Has anyone tried this? Might it work in a philosophy course?

Monday, April 14, 2008

A happy result with student writing

I've just finished grading a batch of papers for my Introduction to Philosophy course. This semester, over the three weeks before students began their papers, I assigned several (five) very short readings on how to write a philosophy paper. The readings were culled from various texts and websites about how to write philosophy papers. Here are the results:



Many of the sorts of errors that can distract us from assessing the progress of our students' skills in philosophy were wonderfully absent. Gone were the windy introductions, the awkward, high-falutin' language, the simple proofreading errors many of us feel we cannot leave unmarked, etc. It was freeing to be able to spend most of my time focusing on the areas of the papers that demonstrated success (philosophically speaking) and on the specific skills students ought to focus on in order to improve (philosophically speaking).

While it is true that clear thinking and clear writing cannot be divorced, placing the burden on students to work on these areas of their writing prior to submitting work is empowering and effective. It also prevents students from becoming demoralized by an overload of comments.

Of course, it could be a fluke. But I'll be doing this in all my 100- and 200-level philosophy courses from now on.

Saturday, April 12, 2008

The Philosophy Major Gains Popularity

Reposted from the APA webpage: A recent article in the New York Times (April 6, 2008) highlights the increasing number of undergraduate philosophy majors nationwide. Another article, from The Guardian (Nov. 20, 2007), shows evidence that philosophy degrees are in growing demand from employers.


Here they are, in case they become inaccessible at the original pages:

In a New Generation of College Students, Many Opt for the Life Examined

[Photo caption: Zachary Perry, a junior at Rutgers University, reasons out a position at a meeting of the university's philosophy club. Photo: Sylwia Kapuscinski for The New York Times]

Published: April 6, 2008, NY Times

NEW BRUNSWICK, N.J. — When a fellow student at Rutgers University urged Didi Onejeme to try Philosophy 101 two years ago, Ms. Onejeme, who was a pre-med sophomore, dismissed it as “frou-frou.”

[Photo caption: Rebecca Clipper, a senior in a philosophy class at Rutgers, which has 100 philosophy majors graduating this year. Photo: Sylwia Kapuscinski for The New York Times]

“People sitting under trees and talking about stupid stuff — I mean, who cares?” Ms. Onejeme recalled thinking at the time.

But Ms. Onejeme, now a senior applying to law school, ended up changing her major to philosophy, which she thinks has armed her with the skills to be successful. “My mother was like, what are you going to do with that?” said Ms. Onejeme, 22. “She wanted me to be a pharmacy major, but I persuaded her with my argumentative skills.”

Once scoffed at as a luxury major, philosophy is being embraced at Rutgers and other universities by a new generation of college students who are drawing modern-day lessons from the age-old discipline as they try to make sense of their world, from the morality of the war in Iraq to the latest political scandal. The economic downturn has done little, if anything, to dampen this enthusiasm among students, who say that what they learn in class can translate into practical skills and careers. On many campuses, debate over modern issues like war and technology is emphasized over the study of classic ancient texts.

Rutgers, which has long had a top-ranked philosophy department, is one of a number of universities where the number of undergraduate philosophy majors is ballooning; there are 100 in this year’s graduating class, up from 50 in 2002, even as overall enrollment on the main campus has declined by 4 percent.

At the City University of New York, where enrollment is up 18 percent over the past six years, there are 322 philosophy majors, a 51 percent increase since 2002.

“If I were to start again as an undergraduate, I would major in philosophy,” said Matthew Goldstein, the CUNY chancellor, who majored in mathematics and statistics. “I think that subject is really at the core of just about everything we do. If you study humanities or political systems or sciences in general, philosophy is really the mother ship from which all of these disciplines grow.”

Nationwide, there are more colleges offering undergraduate philosophy programs today than a decade ago (817, up from 765), according to the College Board. Some schools with established programs like Texas A&M, Notre Dame, the University of Pittsburgh and the University of Massachusetts at Amherst, now have twice as many philosophy majors as they did in the 1990s.

David E. Schrader, executive director of the American Philosophical Association, a professional organization with 11,000 members, said that in an era in which people change careers frequently, philosophy makes sense. “It’s a major that helps them become quick learners and gives them strong skills in writing, analysis and critical thinking,” he said.

Mr. Schrader, an adjunct professor at the University of Delaware, said that the demand for philosophy courses had outpaced the resources at some colleges, where students are often turned away. Some are enrolling in online courses instead, he said, describing it as “really very strange.”

“The discipline as we see it from the time of Socrates starts with people face to face, putting their positions on the table,” he said.

The Rutgers philosophy department is relatively large, with 27 professors, 60 graduate students, and more than 30 undergraduate offerings each semester. For those who cannot get enough of their Descartes in class, there is the Wednesday night philosophy club, where, last week, 11 students debated the metaphysics behind the movie “The Matrix” for more than an hour.

An undergraduate philosophy journal started this semester has drawn 36 submissions — about half from Rutgers students — on musings like “Is the extinction of a species always a bad thing?”

Barry Loewer, the department chairman, said that Rutgers started building its philosophy program in the late 1980s, when the field was branching into new research areas like cognitive science and becoming more interdisciplinary. He said that many students have double-majored in philosophy and, say, psychology or economics, in recent years, and go on to become doctors, lawyers, writers, investment bankers and even commodities traders.

As the approach has changed, philosophy has attracted students with little interest in contemplating the classical texts, or what is known as armchair philosophy. Some, like Ms. Onejeme, the pre-med-student-turned-philosopher, who is double majoring in political science, see it as a pre-law track because it emphasizes the verbal and logic skills prized by law schools — something the Rutgers department encourages by pointing out that their majors score high on the LSAT.

Other students said that studying philosophy, with its emphasis on the big questions and alternative points of view, provided good training for looking at larger societal questions, like globalization and technology.

“All of these things make the world a smaller place and force us to look beyond the bubble we grow up in,” said Christine Bullman, 20, a junior, who said art majors and others routinely took philosophy classes. “I think philosophy is a good base to look at a lot of issues.”

Frances Egan, a Rutgers philosophy professor who advises undergraduates, said that as it has become harder for students to predict what specialties might be in demand in an uncertain economy, some may be more apt to choose their major based simply on what they find interesting. “Philosophy is a lot of fun,” said Professor Egan, who graduated with a philosophy degree in the tough economic times of the 1970s. “A lot of students are in it because they find it intellectually rewarding.”

Max Bialek, 22, was majoring in math until his senior year, when he discovered philosophy. He decided to stay an extra year to complete the major (his parents needed reassurance, he said, but were supportive).

“I thought: Why weren’t all my other classes like that one?” he said, explaining that philosophy had taught him a way of studying that could be applied to any subject and enriched his life in unexpected ways. “You can talk about almost anything as long as you do it well.”

Jenna Schaal-O’Connor, a 20-year-old sophomore who is majoring in cognitive science and linguistics, said philosophy had other perks. She said she found many male philosophy majors interesting and sensitive.

“That whole deep existential torment,” she said. “It’s good for getting girlfriends.”




I think, therefore I earn



Philosophy graduates are suddenly all the rage with employers. What can they possibly have to offer?

Jessica Shepherd
Tuesday November 20, 2007
The Guardian


[Photo caption: Philosophy student Joe Cunningham, considering a future in medical ethics. Photograph: Graham Turner]


"A degree in philosophy? What are you going to do with that then?"

Philosophy students will tell you they've been asked this question more times than they care to remember.

"The response people seem to want is a cheery shrug and a jokey 'don't know'," says Joe Cunningham, 20, a final-year philosophy undergraduate at Heythrop College, University of London.

A more accurate comeback, according to the latest statistics, is "just about anything I want".

Figures from the Higher Education Statistics Agency show philosophy graduates, once derided as unemployable layabouts, are in growing demand from employers. The number of all graduates in full-time and part-time work six months after graduation has risen by 9% between 2002-03 and 2005-06; for philosophy graduates it has gone up by 13%.

It is in the fields of finance, property development, health, social work and the nebulous category of "business" that those versed in Plato and Kant are most sought after. In "business", property development, renting and research, 76% more philosophy graduates were employed in 2005-06 than in 2002-03. In health and social work, 9% more.

The Higher Education Careers Services Unit (Hecsu), which also collates data of this kind, agrees philosophers are finding it easier to secure work. Its figures show that, in 2001, 9.9% of philosophy graduates were unemployed six months after graduation. In 2006, just 6.7% were. On average, 6% of all graduates were unemployed six months after graduation.

In 2001, 9.3% of philosophy graduates were in business and finance roles six months after graduation. In 2006, 12.2% were. In 2001, 5.3% were in marketing and advertising six months after graduation. In 2006, 7.3% were.

It is particularly significant that the percentage finding full-time work six months after graduation has risen, since the number of philosophy graduates has more than doubled between 2001 and 2006. In 2001, UK universities produced 895 graduates with a first degree in the discipline; in 2006, they produced 2,040.

And it is so popular with its graduates that many go on to postgraduate study rather than join the workforce. Charlie Ball, who runs Hecsu's labour market analysis, says: "More philosophy graduates are being produced, and they are much less likely to be unemployed than five years ago."

Philosophers have always come in handy in the workplace with their grounding in analytical thinking. Why, only now, are they so prized by employers?

Open mind

Lucy Adams, human resources director of Serco, a services business and a consultancy firm, says: "Philosophy lies at the heart of our approach to recruiting and developing our leadership, and our leaders. We need people who have the ability to look for different approaches and take an open mind to issues. These skills are promoted by philosophical approaches."

Fiona Czerniawska, director of the Management Consultancies Association's think tank, says: "A philosophy degree has trained the individual's brain and given them the ability to provide management-consulting firms with the sort of skills that they require and clients demand. These skills can include the ability to be very analytical, provide clear and innovative thinking, and question assumptions."

Deborah Bowman, associate dean for widening participation at St George's, University of London, which offers medicine and health sciences courses, says philosophers are increasingly sought after by the NHS: "Graduates of philosophy who come in to graduate-entry medicine, or to nursing courses, are very useful. Growth areas in the NHS include clinical ethicists, who assist doctors and nurses. Medical ethics committees and ethics training courses for staff are also growing. More and more people are needed to comment on moral issues in healthcare, such as abortion."

Being on an ethics committee of the NHS is something Cunningham is looking into. "It would be a direct application of my skills," he says.

The popular philosopher Simon Blackburn, a professor at Cambridge University, sees the improving career prospects of philosophy graduates as part of a wider change of public perception. "I guess the public image of a philosopher has tended to concentrate on an ancient Greek in a toga, or some unwashed hippy lying around not doing very much," he says. "I do detect a change in the way the public sees philosophers. I have been pleasantly surprised by the number of people who come to philosophy events nowadays."

Blackburn can take some credit. The user-friendly books on philosophy that he and other philosophers such as AC Grayling, Stephen Law, Julian Baggini, Nigel Warburton and Alain de Botton write have made their way into the mainstream.

Course design

Those in charge of designing university courses have also become sensitive to claims that their subject has no relevance to the modern day.

Blackburn says: "In the years after the second world war, there was a sort of Wittgensteinian air about philosophy, which meant practitioners were proud of the fact that they appeared slightly esoteric and were not doing anything practical. There was very little political philosophy, and moral philosophy was disengaged from people's actual moral problems, and that did lead to the subject being marginalised. That has changed. Political philosophy is a central part of the Cambridge course."

Jonathan Lowe, professor of philosophy at Durham University, agrees that courses' concern with the real world has accelerated in the past five years.

"It's probably because of the new financial arrangements for students that courses have had to prove they are applicable to real world issues," he says. "And the teaching methods have changed. There are more student-led sessions. Students have to argue on their feet and give presentations. That probably shows at interviews."

News that employers and the public hold philosophers in higher regard should presumably be cause for celebration? Not entirely, says Blackburn. "It is also slightly worrying, because people turn to philosophers when they feel less confident and more insecure."


Thursday, April 10, 2008

The Problem of Privatization

I am participating in a professional development faculty learning group this semester. We're reading The Courage to Teach, by Parker Palmer. On the whole, I like the book. Even when I disagree, I find that Palmer gives food for thought. Palmer argues that a problem for teachers in the university is that we teach, generally, out of collegial sight.

Unlike surgeons and trial lawyers, who practice their professions before the eyes of others who are also experts in those professions, professors almost always teach out of the sight of their colleagues. As Palmer puts it, "Lawyers argue cases in front of other lawyers, where gaps in their skill and knowledge are clear for all to see. Surgeons operate under the gaze of specialists who notice if a hand trembles, making malpractice less likely. But teachers can lose sponges or amputate the wrong limb with no witnesses except the victims" (p. 146).

One result is that when teaching is evaluated, undue weight is placed upon student evaluations, the results of which can be manipulated (if we want to get rid of a faculty member, strong student evaluations are merely evidence of "popularity"; if we want to keep a faculty member, weak student evaluations are irrelevant because the instructor imparts rigorous scholarship).

Palmer's solution is to learn about teaching and learning in community with our colleagues. Though there are barriers to this, such as lack of time and lack of trust, Palmer argues that we can and should take the time to know enough about each other's teaching to ask real questions of each other and aid one another's professional development as teachers, along the following lines:

1. Does this person take teaching seriously, as signified by her involvement in conversations about it?
2. What kind of process does this person go through in designing a course?
3. How does this person identify and respond to the problems that arise as a course proceeds?
4. Does this person learn from past mistakes in designing and implementing future courses?
5. Does this person attempt to help colleagues with issues in their teaching?

These questions seem useful not only to ask of others but also for personal reflection, reflection that ultimately includes but goes beyond mere technique to reach what Palmer calls the heart or soul of the teacher.

Tuesday, April 8, 2008

Add infinitum?

Teaching is an essentially altruistic profession. Over time, I've grown to accept this. Indeed, the opportunity to help others (well, or at least not harm them) is one of the features of teaching that motivates me to teach well. On the other hand, I sometimes find that the altruistic dimension of teaching results in uncomfortable situations, situations that could be seen as ethical dilemmas. Today's example: Am I obligated to add students who wish to add my courses?


A little background is perhaps in order: Due to budget woes at my university, course sizes have increased this quarter. Courses that typically run with 35 students have enrollment limits of 40 students — and these courses are fully enrolled after regular registration is complete. I have discretion to enroll above that number, and as you might expect, many students have requested that I add them.

I find this a very uncomfortable situation, personally and ethically. I think that I'm obligated to enroll up to the official limits set by the university. (Indeed, I have no control over that.) But am I obligated to enroll students in numbers above those limits? On the one hand, I'd like to think that by taking my courses, students are getting something good and valuable, so I should be willing to provide it to anyone who requests it. Furthermore, some students need to enroll in my courses in order to graduate, either because the courses satisfy their unmet general education requirements or because the course is required for philosophy majors. But of course, each additional student is a little more work for me. This may not seem like much, but the university has already compelled me to admit five more students per section, and I teach three sections a quarter; that's 15 more students in total. If I agree to take, say, five more students per section, that's another 15. So whereas under the usual arrangement I teach about 105 students per quarter, I could find myself teaching 135 — not a trivial increase in my workload.
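For what it's worth, the arithmetic behind that jump is simple to sketch (the section counts and caps are just the numbers described above, nothing more):

    # Quarterly teaching load: the usual cap, the new mandated cap,
    # and the new cap plus voluntary adds (numbers from this post).
    sections = 3
    usual_cap = 35
    mandated_cap = 40      # set by the university this quarter
    voluntary_adds = 5     # per section, if I grant the requests

    print(sections * usual_cap)                        # 105 students
    print(sections * mandated_cap)                     # 120 students
    print(sections * (mandated_cap + voluntary_adds))  # 135 students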

I feel guilty about declining to add students, but at the same time, there are limits to what my altruistic role can reasonably ask of me, aren't there? If 100 students wanted to add my courses, I'd have no obligation to take them all (I'm pretty sure). But if 100 is more than I'm obligated to add, then why isn't 15 more than I'm obligated to add? On top of this, there's a larger institutional context to consider: Students need to demand that their instructional needs be met, and if I add them to my courses, I'm removing an incentive for them to, frankly, be pissed about the budgetary situation. As a constituency, students have some power (if not now, then later on, when they are voters deciding on how the state universities will be funded), but if I (and/or my colleagues) add students, then the impact of the budgetary situation — and the fact that they have legitimate grounds for complaint about that situation — may be lost on them.

As it turns out, I've added a few more per section. But I remain deeply unsatisfied with the situation.

Sunday, April 6, 2008

The Ethics of Course Design

I was wondering what everyone's thoughts were on the idea of modeling a course on a book -- say, an overview of a particular subject area -- but not requiring students to buy or read the book. Here's the situation I'm thinking of:


One of the courses I have taught in the recent past was a seminar on the "meaning of life." Originally I wanted to cover the recent analytic literature on the problem, but my students weren't quite ready for that. So we ended up reading a few books and selections from an anthology, tied together by connections I drew myself. It was actually my highest-rated course ever, but I wasn't satisfied with the balance of structure versus free-thinking, so I put it on the shelf until I could reorganize things.

Publishing little books on the meaning of life seems to have been all the rage in the recent past, so I was looking over a few when I found Julian Baggini's What's It All About. It's not written for the specialist, but for someone new to the debate. And though it doesn't go into much depth (and contains some misunderstandings of Buddhism), it's pretty good at highlighting passages from philosophy, psychology, and pop culture that apply very smoothly to the overall debate. I wouldn't want to use the actual book for the course: I think students would end up attacking the author rather than the ideas and see the book as hubristic as opposed to the fairly humble work it actually is. They also like to read "the greats" rather than secondary literature about the greats, according to informal surveys. But the sequence of the topics, the literature Baggini refers to, and the basic structure of the arguments would make for an excellent outline in my course.

So of course I would have to give Baggini credit for the structure of the course at some point, and I have no problem with pointing my students to the book if they want to check it out for themselves. But what do you all think about the idea of otherwise completely ripping off someone's carefully assembled reading list without assigning the book in which it appears?

Tuesday, April 1, 2008

Philosophers in the News

On today's (April 1) "release" of Gmail Custom Time, Google has a quote from a "philosophy professor":

"This feature allows people to manipulate and mislead people with falsified time data. Time is a sacred truth that should never be tampered with." -- Michael L., Epistemology Professor

Another philosophy teacher in the news is surely welcome, even if fictional!