
Friday, August 16, 2013

On the passing of Pauline Maier

Chris Beneke


Pauline Maier passed away Monday. Her academic title was William Kenan Jr. Professor of History at the Massachusetts Institute of Technology. It’s a distinguished position, but hardly does justice to the person who filled it.

I didn’t know Professor Maier was ill. Frankly, it’s hard to imagine her in a non-effervescent condition. Every other observation I’ve encountered since Monday affirms the testimony of my own experience: Maier was irrepressibly charming, ceaselessly brilliant, and blessed with a thunderous, seminar-shaking laugh.

Professor Maier published important books at a stately pace. There were four major monographs, one per decade. The first two changed our understanding of pre-Revolutionary politics; the third upended standard interpretations of the Declaration of Independence; the fourth provided the first full account of how the U.S. Constitution was ratified. Each was definitive, the sort of book that every early American historian needs to read at some point. Each was also a marvel of original documentary research and nonfiction storytelling.

Pauline Maier, the world-renowned scholar and teacher, also happened to be kind. She didn’t dispense saccharine praise to the minor figures who clamored for her attention. Instead, she offered genuine respect for their ideas and lots of tough, useful criticism. As a graduate student with no prior acquaintance, I invited her to comment on a conference panel. She agreed and, when the time came, cheerily directed me to think about many rudimentary things about which I should have already been thinking. A few years later, when I asked her to read part of my book manuscript—a punishingly dry and tedious tome—she agreed again, and then administered the thorough, friendly thrashing it needed.

These were the kinds of things she did for people, and they partly explain why she published four landmark books instead of eight. Maier treated her fellow historians with the same honest consideration that she tendered her historical subjects, remaining ever open to the possibility that the lowly might possess more insight than their better-appointed counterparts (as well as the possibility that they might be out of their senses). Ratification’s acknowledgements leave no doubt how much she appreciated the assistance that others offered in turn. She thanked them not merely in lists of names, or even sentences, but whole paragraphs. Word count be damned.

Maier was more irreverent in her work than big-name professors typically are. She had little use for unsubstantiated assumptions or pious orthodoxies. She had a keen eye for the workings of power and a still keener appreciation for the anxieties of those who confronted it. But Maier never succumbed to the vain illusion that doing history is the same thing as doing politics.

Professor Maier loved good argument but always approached the past with the kind of wonder and excitement that those burdened by mountains of historiography often find difficult to muster. Of course she knew the relevant scholarship inside and out. Maier just didn’t have much use for the things that we habitually say about those who lived before us. She wanted to know what they said themselves.

Scholars as celebrated as Pauline Maier are vulnerable to severe cases of self-absorption. That was not a malady from which she suffered. In addition to the innumerable kindnesses she extended to other historians, she was perpetually delighted by her children, grandchildren, and Rhode Island garden. And she loved her husband Charlie.

If, as planned, I had been able to attend her spring seminar at the Massachusetts Historical Society, I would have seen Professor Maier one last time. But I had kids to watch at home. As much as I regret that absence, I suspect she would have told me that the time was better spent with my boys. That’s one of the reasons I’ll miss her.

Wednesday, June 19, 2013

Mothers in the Academy: How to Do It All*

Heather Cox Richardson

Well, first you need a good household staff.

HAHAHAHAHAHAHAHAHA….

OK, now that we’ve got the hilarity out of the way, how can mothers really take on teaching, research and writing, and children—three incredibly labor-intensive jobs—at the same time?

Let’s start with teaching. Here are a few things I picked up along the way, largely by the seat of my pants as I jumped into a job when my first child (of three) was just shy of three months old. Nothing I learned was intentional, but some of it has stood me in good stead.

The key concept for enabling mothers to survive in the academy is efficiency. And here are some things that helped me to achieve it:

Teach big courses with a wide scope. That sounds counterintuitive, I know. Most people think it takes less energy for junior scholars—and most people with small children will be junior scholars—to teach smaller classes in their specialty. The problem with such specialized classes is that they tend to be underenrolled, which means you will constantly have to come up with new courses to keep your numbers up. Repeatedly writing courses from scratch is a huge time-sink.

Monday, June 10, 2013

Mothers in the Academy

Heather Cox Richardson

A recent study shows that having children hurts women in academia at every stage of the profession. This will not be news to any woman who has had to sneak out of a meeting to pick up a child before the daycare fine system kicks in, who has had to explain to an older male colleague that his insistence on scheduling his pet seminar from 4 to 6 guarantees she can never attend no matter how angry it makes him, who has worked all night in the office because search files could not leave the premises and there was no time during the day to read them all, and who has heard those chilling words: “You can have tenure or children, but not both.”

There is a push to change the mechanics of university life to address this problem, offering maternity leave to graduate students, for example, and extending tenure clocks for mothers. (More first-floor bathrooms wouldn’t come amiss either, by the way; two flights to a bathroom when you’re eight months pregnant is no picnic.) These steps are important, to be sure. But for historians it is even more important to remember that, by cutting more than half the population out of the study of humanity, we are skewing our scholarship so badly that it threatens to lose all meaning.

This is not a theoretical argument; it has real-world meaning for the study of history. Having a family—and nurturing it—is crucial for historians. (And this does not seem to me to have to be birth or adopted children, by the way. Investing in community does not require youngsters who actually live in your home.)

Monday, July 30, 2012

Rewriting History? The Case of Joe Paterno

Alan Bliss

As part of its sanctions against Penn State University, the NCAA last week "vacated" 111 of the Nittany Lions' football victories under their late coach, Joe Paterno. The order changes the official record of Penn State's teams from 1998 to 2011. Technically, then, Coach Paterno no longer holds the NCAA record as the winningest coach in Division I college football.

A non-academic friend, a lawyer by profession, complains that the NCAA is rewriting history. Professional historians like me, my friend argues, should be outraged. Surprisingly, my friend is hardly alone in reading this news as an intolerable assault on historical truth. In the July 24 New York Times, Northwestern University sociologist Gary Alan Fine published an op-ed ("George Orwell and the N.C.A.A.") objecting to the NCAA's records sanction against Penn State.

Professor Fine sees this as a disturbing attempt to re-write the past, or to create a false, "fantasized," history. "George Orwell would be amused," Fine believes. But neither Fine nor others who make this argument seem to be historians, who, as far as I know, are unconcerned by the NCAA's periodic fiddling with its own record books. One reason is that retroactive bookkeeping does little to alter any "history" other than the records of the institution doing the counting. And mind, we are talking here about Division I collegiate football, where even indisputable facts are disputed endlessly. Even if that weren't so, sports historians take pains to explicate the circumstances of athletic records. Future researchers looking up the Lions' football stats will be obliged to learn all about the University's miserable scandal. The NCAA's purpose in sanctioning Penn State will be lastingly served.

As I teach my students, the past is what happened, while history is how we explain and interpret the past. Denying or obscuring inconvenient facts throws historians off at times, and can indeed rise to the level of the Orwellian. But in the long run the practice often fails. For example, we now know that Woods Hole oceanographer Bob Ballard was not really engaged in a pure-science project to locate the wreck of the RMS Titanic. His 1985 expedition was financed by the U.S. Defense Dept., which sought his technology to examine the deep-sea wrecks of its two lost Cold War-era nuclear submarines, the USS Thresher and USS Scorpion. After obliging the Navy, Dr. Ballard carried out his "cover" mission of locating the Titanic. The success of that side-trip made Ballard an inspirational hero on the order of a winning college football coach. Among his many admirers, the new facts haven't seriously knocked him off his pedestal - they just complicate his story and that of the Titanic's rediscovery.

Historians understand better than most how little we sometimes know. We are alert to the risks that go with formulating historical understanding from data. Numbers can lie, whether they involve college football or voting. I teach students to be skeptical, critical, and open to new ideas, new sources, new data, and new interpretations of the evidence of the past. Some ideologues disdain that as "historical revisionism." But history is endlessly under revision, and we shouldn't want it any other way.

Joe Paterno was a hero. He will always hold a place in history, though the context is different now. The truth about his and Penn State's football program has badly dented his legacy. The NCAA's action on his win-loss record can't hurt the late Coach, whose troubles are over. No doubt, his family and partisans will grieve about this poisonous affair for the rest of their lives. Mainly, Penn State's vacated wins are a message to other coaches, players, administrators, fans, boosters, and just regular onlookers. The sanctions also help show that history has an annoying habit, which historians encourage, of outing lies.

Alan Bliss is a historian of the modern U.S. His research is on metropolitan political economy, especially in Sunbelt cities. He is presently a Visiting Assistant Professor at the University of North Florida.

Monday, October 10, 2011

“No More Plan B”—Apocalypse or Opportunity?

Dan Allosso

Graduate students in the humanities are well aware that, in the words of Inside Higher Ed this week, many of our disciplines have promoted alternate career paths outside the academy while at the same time encouraging us to hold onto the hope that although others might need them, we won’t. Now, however, the president of the American Historical Association (AHA) has apparently committed his organization to admitting to history grad students that there are not enough jobs to go around, and the situation is not getting better.

These sentiments appear in a statement issued by Anthony Grafton, president of the AHA, and James Grossman, its executive director. The essay, titled “No More Plan B” and posted on the AHA website on September 26th, criticizes the traditional department’s approach to grad students on the grounds that it “ignores the facts of academic employment . . . it pushes talented scholars into narrow channels, and makes it less likely that they will take schooled historical thinking with them into a wide range of employment sectors.”

Now it would be easy to blame faculty both for candy-coating the overall change in the academy (or at least in the humanities) and for making their programs seem like ones where these issues need not concern grad students. Would we be angry to find how few people our department has placed into significant, tenure-track positions in the last five years? But we’re all adults: why didn’t we know this going in?

Or—and this is where it gets interesting—if we really did suspect that the old center would not hold, why did we come anyway? Forgetting about the traditional academy and its appointment with oblivion, and remembering what we each individually love about our discipline and subjects, might be the key to personal solutions that will change not only our own outcomes, but the academy itself.

Yes, departments that can’t place PhDs should probably stop producing them. But what if this apocalypse for the academy liberates us, the grad students, and forces us to refocus? What do we hope to achieve by our work? What difference do we want to make in the world? Do we see ourselves teaching undergraduates in ten years, opening young people’s minds to creative, critical thinking; sharpening their analytical and interpretive skills; helping them learn to read, write, and speak effectively? If this is our core mission, does it matter whether the students are sitting in front of us in a lecture hall or convening in an online forum? On the other hand, if our main interest is research, or writing—either for expert audiences or for the general public—then perhaps the breakdown of the traditional professional model offers us a chance to focus on what we are really good at, and leave the rest behind.

The scary part is, we’ll have to really be good at it. The authors of “No More Plan B” hint that there’s something wrong with the idea that “the life of scholarship” protects us from “impure motives and bitter competition.” We shouldn’t see non-tenure track employment, they tell us, as a fall from “the light of humanistic inquiry into the darkness of grubby capitalism.” But it goes beyond simply embracing the market or awakening from a dream of the idealized, highly compensated academic life. The academy, after all, exists within society and the market, and responds—albeit slowly—to the needs and desires of students.

The rest of society has been struggling for a generation with many of the issues now facing the academy. Technology has been replacing humans on assembly lines, in service professions, and even in “Knowledge” work for decades. Globalization, outsourcing, and new media have changed or obsoleted entire industries. Along the way, the two questions that have been continually asked of each individual are, “what are your specific responsibilities?” and “what is your value-add?”

Steve Jobs was famous for promoting a corporate culture at Apple centered on the idea of the “DRI,” or directly responsible individual. Unlike many people at other companies (especially in Silicon Valley!) who rarely achieved anything from one staff meeting to the next, Apple workers got used to seeing a DRI name next to every task and action item. Individual responsibility helped the bottom line, of course; but it also gave people a way to say “I did that,” and know what they had contributed.

I’m not arguing that the academy should adopt direct individual responsibility—there are too many interests arrayed against it. I’m suggesting that each of us grad students can find a way out of the “Plan B” trap, by deciding what we do that benefits society (or the discipline, or the advance of useful knowledge, etc.), and then articulating it and doing it, regardless of whether we’re given an opportunity to do it in the institutional format we expected. What is our personal value-add? After all, whose “Plan B” was it, anyway?

Thursday, June 2, 2011

How Can Anyone Hope to Be a Successful Grad Student?

Heather Cox Richardson

Dan’s post earlier this week coincided nicely with a conversation I had recently about what makes a good graduate student. While there is no doubt that the academy is changing very rapidly right now, I would argue that it hasn’t been the “traditional” academy since at least 1980. The changes Dan identified have been underway for decades: the market for PhDs has been appalling, the role of adjuncts has been growing, the nature of college and its students has been changing. It’s just that we’re only now acknowledging these changes.

So how can anyone hope to be a successful graduate student?

A conversation I had last week with another academic might shed some light on that question. This scholar is in cognitive psychology while I am in history, but otherwise our experiences in the academy have been similar. My friend graduated from the same university I did back in the early 1990s (although we met only a decade ago when our daughters became friends). Like me, she had a fairly rocky start in the profession—we were both denied tenure at our first jobs—but worked her way back up to a position at a top-notch university, a more prominent place than the one that denied her tenure at the start of her career.

Our histories are significant because, although we are in very different disciplines, when we got talking about what we thought made a successful graduate student, we agreed completely. It seems likely that our agreement came in part from the fact that the “traditional” academy did not serve us terribly well, so our careers anticipated the crisis to which Dan has recently called our attention.

Neither one of us is much impressed by students who are what we called the “stars.” These are the students with stellar grades who can reel off all the established studies in the field, who usually write beautifully, and are enamored of Becoming Academics. (Most students know these people as the ones who intimidated everyone else in introductory courses.)

In our experience, those people rarely have a future in the modern academy for two simple reasons.

First, they are very good at figuring out what’s expected of them in school, and at performing it with excellence. The problem with that sort of successful experience is that such students rarely can think outside the box. They do brilliantly in classes that cover established material, but they cannot come up with big new ideas on their own. They’re rarely very interested in deep research, preferring to cover established studies and engage in only cursory investigations of primary material. Their class work is impressive; their own scholarship is not.

The second problem with the “stars” is they’re used to being at the top of everything. When they inevitably get sent back to the drawing board over something—and about 90% of what we do involves reworking our material—they simply fold. They have no resources to figure out how to beaver away at a project until they actually succeed. They’ve never had to.

My friend and I agreed that what we look for in students is passion. She told the story of one of her best students of all time, who came to her from a mediocre school where his grades had been up and down. But she took him because he had sought her out at a conference on her fairly rarefied scientific field when he was an undergraduate, and in their discussion, she discovered that he had paid his own way to the conference although it was a hardship for him. He loved the material so much he couldn’t be kept away. She accepted him into her lab, and he became the most productive and innovative scholar she has had. (She later learned the up-and-down grades had come from a family crisis.)

Students with passion can’t be discouraged. They’re in the profession not to Become An Academic, but because they cannot imagine life without studying their chosen field. When you hand back a dissertation prospectus for the fifth time covered with comments and criticism, they dig back in, not to please an advisor but because they really care about getting it right. When they do emerge with a final product, it’s new and exciting, saying something no one has said before. Because they’ve worked so hard on it, it’s also well executed and well written. It moves the field forward.

In the past such students might have been lost. They do not necessarily fit naturally into traditional departments. But now, the changing academy and the opening of the world with the internet mean that such students can build a community and find new opportunities outside traditional channels.

Academia seems to be becoming more entrepreneurial than it has been in the past. This certainly poses problems, but it also offers an enormously exciting opportunity to advance scholarship in new ways and to reintegrate scholarship into the world outside the academy.

For the right kind of graduate student, the glass is at least half full.

Wednesday, February 9, 2011

Questioning the Assumptions of Academic History

A lively forum in the January issue of Historically Speaking critiques some of the assumptions of academic history. Here are selections from the lead essay by Christopher Shannon and a comment by Elisabeth Lasch-Quinn.

"From Histories to Traditions: A New Paradigm of Pluralism in the Study of the Past"
Christopher Shannon

The last forty years have witnessed a tremendous expansion of the range of historical topics deemed fit subject matter for professional academic historians in America. Beginning in the late 1960s, social historians concerned to recover the experience of common people led a revolt against the perceived elitism of the then-dominant fields of political, diplomatic, and intellectual history. The pioneers of social history, particularly those rooted in the field of labor history, soon came under attack for focusing on white male historical actors. This critique gave birth to the flourishing of studies of women, racial and ethnic minorities, and most recently, sexual minorities. Even those sympathetic and supportive of these developments at times wonder if any principle of unity or synthesis remains in the wake of so much diversity, or if those seeking to make sense of the past must give up on History and rest content with the proliferation of histories.

This diversity of subject matter masks a fundamental uniformity of method. For all its openness to new subjects for inquiry, the historical profession in America has refused to accept any fundamental questioning of its basic assumptions about how we gain meaningful knowledge of the past. Despite various philosophical challenges over the past century, most historians remain committed to a common-sense empiricism rooted in the philosophical assumptions of the physical sciences that dominated the intellectual landscape of the West at the birth of the historical profession in the late 19th century. This consensus is, moreover, at once epistemological and political: common-sense empiricism must be defended because it is the epistemology most appropriate to liberal modernity. The epistemological alternatives are historicism and relativism; the political alternatives are fascism and totalitarian communism.>>>

"Comment on Christopher Shannon"
Elisabeth Lasch-Quinn

. . . . In his 2009 book God, Philosophy, Universities, Alasdair MacIntyre paints a portrait of the modern university as fragmented beyond coherence by specialization into the various disciplines. No overarching understanding of how the disciplines are connected exists. Whether “professedly secular or professedly Catholic,” universities today are no longer informed by “a notion of the nature and order of things.” It is the same with the subspecialties. Shannon may be assuming too much unity of perspective in suggesting that it is the liberal world-view that informs most historical writing today. Autonomy is a common enough theme, but it is not necessarily any longer part of a unified view of the world. Agency is more a mantra than a well-articulated intellectual framework or philosophical system. It appears adequate to conclude a study of this or that marginal group with the suggestion that the group, however oppressed, exhibited agency. Why that matters is assumed, not defended.

The pluralistic tolerance of all views as equally valid is difficult, if not impossible, to reconcile with either traditions or histories possessing value judgments of any kind. Herein lies the problem. If the welcoming of Catholic history is to bear any fruit, we will need to confront head-on the fundamental clash between a totalizing pluralism, which only results in nihilism, and any normative story. As Shannon writes, it is not enough to concede that “objectivity is not neutrality.” But neither is it enough to let all flowers bloom. We have to confront the huge intellectual obstacles that arise when traditions are incompatible. I think this is what Shannon intends in his worthy embrace of “a new norm for historical inquiry, one less geared toward the industrial production of information about the past and more directed toward a philosophical reflection on human nature through the study of the past.” Moving toward this kind of reflective activity might be foreclosed by the unbridled embrace of traditions. It matters what those traditions are, their content, practices, and quality. I’m not sure I’d like to see “the ability of Enlightenment institutions like the modern history profession to open themselves to distinctly Catholic interpretive traditions” made into “a good test of their ability to include other nonliberal traditions.”>>>

Friday, January 14, 2011

Rarely is the Question Asked: Is Our Professors Teaching? Part II

Heather Cox Richardson

Randall asked a good question in his post wondering whether or not college and university professors are encouraged to improve their teaching. He has inspired me to blog about teaching issues in a more systematic way than I have before.

Today the topic that is consuming me is assessment. This is not a new obsession, either on my part or on that of the profession. We’ve talked about assessment for years. . . but what have we learned?

What, exactly, do we want our students to learn in our classes? Long ago, I figured out I should design my courses backward, identifying one key theme and several key developments that would be students’ “takeaway” from a course. That seems to have worked (and I’ll write more on it in the future).

But I’m still trying to figure out how to use assessments, especially exams, more intelligently than I do now. My brother, himself an educator who specializes in assessments, recently showed me this video (below), which—aside from being entertaining—tears apart the idea that traditional midterms and finals do anything useful in today’s world.

Shortly after watching the video, I happened to talk separately with two professors who use collaborative assignments and collaborative, open-book, take-home exams. They do this to emphasize that students should be learning the real-world skills of research and cooperation just as much as—or more than—they learn facts. As one said,
facts in today’s world are at anyone’s fingertips . . . but people must know how to find them, and to use them intelligently. This is a skill we can teach more deliberately than we currently do.

These two people are from different universities and are in different fields, but both thought their experiment had generally worked well. One pointed out—as the video does—that the real world is not about isolation and memorization; it’s about cooperation to achieve a good result.

The other said she had had doubts about the exercise because she had worried that all the students would get an “A.” Then she realized that it would, in fact, be excellent news if all her students had mastered the skills she thought were important. When she actually gave the take-home, collaborative assignment, though, she was surprised—and chagrined—to discover the same grade spread she had always seen on traditional exams. She also saw that some of her student groups had no idea how to answer some very basic questions, and that she would have to go back over the idea that history was not just dates, but was about significance and change.

And that is maybe the most important lesson. The collaborative exam revealed that there were major concepts that a number of students simply weren’t getting. So she can now go back and reiterate them.

I’m still mulling this over, but I do think I’ll experiment with collaborative assessment techniques. Historians have some advantages doing this that teachers in other fields don’t. We can ask students to identify the significance of certain events, to write essays, and to analyze problems. With the huge amount of good—and bad—information on the web in our field, though, we could also ask students to research a topic, then judge their ability to distinguish between legitimate and illegitimate sources (something that might have helped Joy Masoff when she was writing her Virginia history textbook).

As I’ve been thinking this over, a third colleague has inadvertently weighed in on it. He discovered students had cheated on a take-home exam, working together and then slightly changing each essay to make them look original. At least an assigned collaboration would eliminate the problem of unapproved collaboration!

Monday, January 3, 2011

History Job Market Looks Bleak . . . Again

Randall Stephens

There is nothing like ringing in the new year with bad news . . . But, here goes.

The history job market is still bleak. (Not really news to anyone, I suppose. We're used to this. It's like watching the film Groundhog Day.) As it stands right now, the number of jobs listed through the American Historical Association is at a 25-year low.

Scott Jaschik reports at Inside Higher Ed: "The reality of radically differing job markets may be especially clear as 2011 begins with disciplinary associations gathering for job interviews at annual meetings and releasing data on the number of available positions." There will be many sad faces at this year's AHA meeting in Boston. (If you are on the market, and would like to improve your odds, see John Fea's interview advice at the Way of Improvement Leads Home and Claire B. Potter's suggestions at Tenured Radical.)

The number of new history PhDs rose to a 9-year high in 2009. You don't need any training in economic theory to know that there's something wrong with that picture. (Speaking of economics . . . the American Economic Association announced that its job listings have recovered from a 21% dip in 2008.)

Could it get worse? Maybe. The Inside Higher Ed piece draws from Robert Townsend's AHA report on the job market. (You may need to sign in to your AHA account to read this.) Townsend, assistant director of research and publications at the AHA, writes about long-term concerns in the new issue of Perspectives on History:

In addition to the chairs’ general concerns about what lies ahead for hiring in their departments, there are demographic reasons for viewing the coming decade with caution. First, the number of faculty approaching retirement age in the next 10 years is reaching the lowest level in 30 years. Currently, only 40 percent of the full-time faculty in history departments are 20 years or more from the time they earned their degrees.

Townsend wraps up his article with a note of caution. "Most history doctoral students are being trained for an academic job market that is now beset by crises," he observes. "Departments should begin to carefully reflect on the type of training they are providing their students and the number of students they are admitting to their programs."

See these related articles for more:

Eric Kelderman, "Colleges to Confront Deep Cutbacks. In states where new governors pledge no new taxes, higher-education budgets will suffer," Chronicle, January 2, 2011

Christopher Phelps, "A Move Abroad: Travels and Travails," Chronicle, January 2, 2011

Samuel Wren, "Rule Britannia. Being a job candidate in a British faculty search is a curiously different experience," Chronicle, April 10, 2010

Anthony Grafton, "History under Attack," Perspectives on History (January 2011)

Robert B. Townsend, "History under the Hammer: Department Chairs Report Effects of Economic Woes," Perspectives on History (January 2011)

Scott Jaschik, "No Entry," Inside Higher Ed, January 4, 2010

Hannah Fearn, "Shrinking job market sees nearly 70 applicants vie for every graduate job," THE, July 6, 2010

Thursday, December 30, 2010

Rarely is the Question Asked: Is Our Professors Teaching?

Randall Stephens

Academics are, by nature, hand wringers. We worry about the decline in the humanities. We worry about grade inflation. We worry about the troubles of academic presses. Once in a while we worry about the state of teaching. Or, to paraphrase our former president, "rarely is the question asked: is our professors teaching"?

Quite often the appraisal of teaching is negative, though academics and non-academics offer different points of view. In the popular imagination, the old stereotypes persist, as Anthony Grafton points out, with tongue firmly in cheek:

We don’t teach undergraduates at all, even though we shamelessly charge them hundreds of dollars for an hour of our time. Mostly we leave them to the graduate students and adjuncts. Yet that may not be such a bad thing. For on the rare occasions when we do enter a classroom, we don’t offer students close encounters with powerful forms of knowledge, new or old. Rather, we make them master our “theories”—systems of interpretation as complicated and mechanical as sausage machines. However rich and varied the ingredients that go in the hopper, what comes out looks and tastes the same: philosophy and poetry, history and oratory, each is deconstructed and revealed to be Eurocentric, logocentric and all the other centrics an academic mind might concoct.*

Across the water, historian and filmmaker Tariq Ali and Harvard historian and telly don Niall Ferguson speak to the BBC about what they see as the abysmal state of history teaching. (Hat tip to the AHA.) Students stop pursuing history in England at an early age, says Ferguson. And what history is taught is "too fragmentary." Ali agrees, saying that what is presented is, basically, "worthless," and hobbled by a chasing after so-called relevance. They both argue that the old anachronistic, triumphalist, island history of Britain should be avoided, but students need a larger narrative. "It could hardly be worse than what is going on in schools today," concludes Ferguson.

How does history teaching fare in America's colleges and universities? Are teaching awards more than a feather in the cap? Do promotion and tenure committees value persistently good evaluations and commend teaching effectiveness in the same way that they reward scholarship? Do peers sit in on classes and make assessments? Do departments do anything when a professor continues to receive poor teaching evaluations one semester after another?

Nearly ten years ago, writing in the Chronicle, Daniel Bernstein and Richard Edwards proposed that we need more peer review of teaching. "[I]f educators are going to sustain the progress made, we will need to move toward a more rigorous and objective form of review," they wrote. "The goal of peer review has been to provide the same level of support, consultation, and evaluation for teaching that universities now provide for research." I can't imagine what the results of such efforts have been. Certainly, peer evaluation can turn into a messy, political business.

Does graduate training in history prepare men and women for classroom success? Budding historians spend far more time in graduate school working on research, parsing theory, and getting the historiography down. Less time is devoted to developing teaching skills and, at least as it was in my case, there is not much mentoring on teaching. (Most grad students I encountered came prewired with an interest in teaching. So, that was a plus.) Could graduate training be better oriented to prepare good history teachers? What would that look like?

Thursday, December 23, 2010

Laughing at Us: Academic Novels

Randall Stephens

"Why is the academic novel my favorite genre?" asks American literary critic Elaine Showalter in Faculty Towers: The Academic Novel and Its Discontents (University of Pennsylvania Press, 2009). "Maybe it's just narcissistic pleasure. One theory about the rise of the novel argues that it developed because readers like to read about their own world, and indeed about themselves." Of the genre itself, Showalter writes that it "has arisen and flourished only since about 1950, when American universities were growing rapidly, first to absorb the returning veterans, and then to take in a larger and larger percentage of the baby-booming population" (Showalter, 1). In the academic novel one finds the "tribal rites" of the profession, the weird quirks of tweedy academics, and stories of professional dread. (I'm guessing it should be a boom time for academic novels, given all the Cassandras wailing about the decline of the humanities.) When Showalter was an undergad, such books filled "a novice's need to fit into the culture" (2).

I like academic novels mostly because they make me laugh.

I'm reading, for the first time, novelist Kingsley Amis's Lucky Jim (1954), a university sendup about a hapless history lecturer. At his provincial English university, James Dixon, an utterly uncommitted medievalist, weaves a web of ridiculous deceptions, while preparing to deliver a lecture on "Merrie England." (Let's just say the lecture does not go well.) Fretting about his love life and his teaching prospects for the next year, Jim schemes to make things right. Yet, no matter how hard he tries, this déclassé son of working-class parents just can't win.

The book fits into that classic English schadenfreude, black humor tradition, evident today in British TV shows like The Office and Worst Week of My Life.

A few fun history-related passages:

Jim rides in the car with his dry-as-dust, scatter-brained senior colleague, Welch, and frets over his work-in-progress article.

Dixon looked out of the window at the fields wheeling past, bright green after a wet April. It wasn't the double-exposure effect of the last half-minute's talk that had dumbfounded him, for such incidents formed the staple material of Welch colloquies; it was the prospect of reciting the title of the article he'd written. It was a perfect title, in that it crystallized the article's niggling mindlessness, its funereal parade of yawn-enforcing facts, the pseudo-light it threw upon non-problems. Dixon had read, or begun to read, dozens like it, but his own seemed worse than most in its air of being convinced of its own usefulness and significance. "In considering this strangely neglected topic," it began. This what neglected topic? This strangely what topic? This strangely neglected what? His thinking all this without having defiled and set fire to the typescript only made him appear to himself only more of a hypocrite and fool (14-15).

Jim prepares to proctor an exam and thinks about the hideousness of the Middle Ages.

The examinations were now in progress, and Dixon had nothing to do that morning but turn up at the Assembly Hall at twelve-thirty to collect some scripts. They would contain answers to questions he'd set about the Middle Ages. As he approached the Common Room he thought briefly about the Middle Ages. Those who professed themselves unable to believe in the reality of human progress ought to cheer themselves up, as the students under examination had conceivably been cheered up, by a short study of the Middle Ages. The hydrogen bomb, the South African Government, Chiang Kai-shek, Senator McCarthy himself, would then seem a light price to pay for no longer being in the Middle Ages. Had people ever been as nasty, as self-indulgent, as dull, as miserable, as cocksure, as bad at art, as dismally ludicrous, or as wrong as they'd been in the Middle Age - Margaret's way of referring to the Middle Ages? He grinned at this last thought, then stopped doing that on entering the Common Room . . . (87).

A real pleasure read. I'm now looking out for similar so-called campus novels. (Any suggestions? I've not read David Lodge, Vladimir Nabokov, or Zadie Smith's contributions to the genre.) The 2009 indie film Tenure, starring Luke Wilson, brings the genre back to the silver screen. (Watch it in full on Netflix.)

I'm still on the lookout for Ian McGuire's Incredible Bodies (2006). The Bloomsbury website describes McGuire's higher ed farce: "Coketown University, also known as the ‘plughole of England’, is where thirty-something Morris Gutman has achieved the mighty heights of temporary lecturer. . . . Now Morris is hoping to negotiate a permanent department job under the noses of smarter and better candidates by being obsequious, cheap and willing to do anything."

I can picture it clearly enough.

Sunday, September 5, 2010

Higher Ed Jeremiads

Randall Stephens

Read Christopher Shea's review essay in the NYT: "The End of Tenure?" Quite a few Americans outside the academy are mad as hell and not going to take it anymore. Rumors of pampered academics tooling around their college towns in Maseratis are utterly cartoonish. But, something like that vision dominates popular thinking about the professor as aristocrat. (Anyone know how many, say, history professors actually work at schools with a 2-2 load? I'd bet money they're in the smallish minority.)

Should academics be accountable to the broader public for the writing and teaching that they do? Perhaps something like the UK's Research Assessment Exercise could be in American higher ed's future.

Anyhow, Shea considers several books that offer up nightmare scenarios of privilege or suggestions for reform.

"The higher-ed jeremiads of the last generation came mainly from the right," says Shea. "But this time, it’s the tenured radicals — or at least the tenured liberals — who are leading the charge. [Andrew] Hacker is a longtime contributor to The New York Review of Books and the author of the acclaimed study 'Two Nations: Black and White, Separate, Hostile, Unequal,' while [Mark] Taylor, a religion scholar who recently moved to Columbia from Williams College, has taught courses that Allan Bloom would have gagged on ('Imagologies: Media Philosophy'). And these two books arrive at a time, unlike the early 1990s, when universities are, like many students, backed into a fiscal corner. Taylor writes of walking into a meeting one day and learning that Columbia’s endowment had dropped by 'at least' 30 percent. Simply brushing off calls for reform, however strident and scattershot, may no longer be an option.">>>

Tuesday, May 4, 2010

What is it Good for?

Randall Stephens

Standards, standards, impact, impact. In recent years historians in the UK have had the Research Assessment Exercise to contend with. (Sorry, your publications with Yea-oh University Press and Oxfort College Press don't pass muster.) Administrators and the public also push for disciplines in the humanities to prove their "usefulness" and "impact."

"As with philosophy," writes Ann Mroz in THE, "it is hard to show history's value beyond an intellectual pursuit. Any moves to make it demonstrate 'impact' risk pushing it down the heritage trail . . ." Your knowledge of Medieval tax law will help you to . . . ? Your study of child rearing in the Elizabethan Age equips you to . . . ? Start training to become a reenactor. Polish up your English Civil War "armour." Get that pike out of the closet.

Richard Overy's April 29 essay in THE, "The Historical Present," has created a stir. He throws down the gauntlet with these words:

Historians have always generated impact of diverse and rewarding kinds, and will continue to do so without the banal imperative to demonstrate added value. There is no real division between what historians can contribute and what the public may expect, but the second of these should by no means drive the first.

Nor should short-term public policy dictate what is researched, how history is taught or the priorities of its practitioners. If fashion, fad or political priority had dictated what history produced over the past century, British intellectual and cultural life would have been deeply impoverished. Not least, the many ways in which historical approaches have invigorated and informed other disciplines would have been lost.

Over at the NYRB, Anthony Grafton worries about the results of this utilitarian calculus. England's Slow Food academy has morphed into McDonald's. "Have it your way." Scholars working in fields that administrators deem useless--paleography, early modern, and premodern history, philosophy--have landed on the chopping block. "From the accession of Margaret Thatcher onward, the pressure has risen," writes Grafton. "Universities have had to prove that they matter. . . . Budgets have shrunk, and universities have tightened their belts to fit. Now they are facing huge further cuts for three years to come—unless, as is likely, the Conservatives take over the government, in which case the knife may go even deeper."

Historians working in America, too, struggle with the burdens of constrained budgets, reduction in full-time positions, eliminated raises, and the push for "relevant" curriculum. But, if the buzz in THE is any indication, what's happening in the UK is something else. Surely, the field of history won't vanish into thin air, as Overy imagines. (More doubtful are his comments on Canadian historian Margaret MacMillan, who "in her 2008 book, The Uses And Abuses of History, called on her peers to reduce their commitment to theory and to write shorter sentences. To do so would be to dumb down what history as a human science is doing." Really?) Still, history across the pond may suffer much in this new climate.

Monday, April 5, 2010

Liberal Arts, Humanities Roundup

The following appeared in recent days. Just when you thought there could not be any more essays or forums on the decline in liberal arts education or the crisis of the humanities. . .

Nancy Cook, "The Death of Liberal Arts," Newsweek, April 5, 2010
. . . . But there's no denying that the fight between the cerebral B.A. vs. the practical B.S. is heating up. For now, practicality is the frontrunner, especially as the recession continues to hack into the budgets of both students and the schools they attend.>>>

Richard A. Greenwald, "Graduate Education in the Humanities Faces a Crisis. Let's Not Waste It," Chronicle Review, April 4, 2010.
I was recently reading Dr. Seuss to my 2-year-old daughter, when, bored of The Cat in the Hat and The Lorax, I picked up a lesser book from the Seussian canon: I Had Trouble Getting to Solla Sollew. To my surprise, the plot of that little-known children's book reminded me a great deal of the current crisis of American higher education.>>>

"Graduate Humanities Education: What Should Be Done?" Chronicle Review, April 4, 2010.
Does graduate education in the humanities need reform? By nearly all indications, the answer is yes. The job picture is grim. The Modern Language Association is projecting a 25-percent drop in language-and-literature job ads for the 2009-10 academic year, while the American Historical Association announced that last year's listings were the lowest in a decade.>>>

Simon Jenkins, "Scientists may gloat, but an assault is under way against the arts" the Guardian, March 25, 2010.
Which is more important, science or the humanities? The right answer is not: what do you mean by important? The right answer is a question: Who is doing the asking?>>>

Elizabeth Toohey, "The Marketplace of Ideas: What’s wrong with the higher education system in the US and how can we fix it?" Christian Science Monitor, March 11, 2010.
The structure of the American university has long been a subject of contention, and now is no exception, especially given the current economic climate. Last year, Mark Taylor called for an end to tenure and traditional disciplines in The New York Times op-ed, “End of the University as We Know It,” and William Pannapacker’s column, “Graduate School in the Humanities: Just Don’t Go,” was among the most viewed links on the Chronicle of Higher Education’s website.>>>

Tuesday, January 12, 2010

Post-AHA Roundup

Scott Jaschik, "Historians, Sons, Daughters," Inside Higher Ed, January 12, 2010
SAN DIEGO -- When Adam Davis was growing up and wanted paper to draw on, his parents gave him the blank back sides of the first typed drafts of the books that established his father, David Brion Davis, as one of the preeminent historians of slavery.

Scott Jaschik, "Ph.D. Supply and Demand," Inside Higher Ed, January 11, 2010
SAN DIEGO -- As history graduate students arrived in the large table-filled ballroom here Friday to try to learn how to find a job, the room was seriously overheated. These would-be professors didn't need any more sweat or discomfort.

Scott Jaschik, "Is Google Good for History?" Inside Higher Ed, January 8, 2010
SAN DIEGO -- At a discussion of "Is Google Good for History?" here Thursday, there weren't really any firm "No" answers. Even the harshest critic here of Google's historic book digitization project confessed to using it for his research and making valuable finds with the tool.

Marc Bousquet, "At the AHA: Huh?" Chronicle of Higher Ed, January 08, 2010
A funny thing happened on the way to the AHA this year -- American Historical Association staffer Robert B. Townsend issued his annual report on tenure-track employment in the field.

Marc Bousquet, "Who's a Historian to the AHA?" Chronicle of Higher Ed, January 08, 2010
My piece questioning the supply-side bent to the American Historical Association's 2010 job report has gotten thoughtful replies by historiann, Alan Baumler, Jonathan Rees, Ellen Schrecker, Sandy Thatcher and others, both at my home blog and here at Brainstorm.

David Walsh, "Highlights of the 2010 Annual Convention of the American Historical Association in San Diego," HNN, January 7, 2010

Lauren Kientz, "Exciting New Pedagogy Based in the History of Ideas," January 12, 2010, U.S. Intellectual History Blog
A decade ago, several professors at Barnard College created a pedagogy based in the History of Ideas called "Reacting to the Past." I attended a session at the AHA discussing this pedagogy

Wednesday, January 6, 2010

John Fea on Interviewing for that History Job

Over at The Way of Improvement Leads Home, John Fea offers hints for all those interviewing for history jobs at the AHA, the ASCH, or over the phone. Along with giving general tips, Fea provides more specific advice for candidates who are interviewing for jobs at teaching institutions, research universities, and church-related schools.

Here's a particularly useful bit:

Once you find out who will be doing the interview, start researching. By this point you should have already familiarized yourself with the department web site, but now you want to go a bit deeper. Find out as much as you can about the people who will be seated on the other side of the table. What courses do they teach? (You do not want to propose a course that gets too close to the "turf" of another professor in the department). What are their research interests? (You may want to mention how your work has some theoretical connections to the work of a particular interviewer). All of this stuff is pretty straightforward and most good candidates do not need to be told any of this, but you might be surprised to learn just how many people come to an interview unprepared.

Read more here.

Wednesday, November 11, 2009

Richardson's Rules of Order, Part XI: A Note About Professors

Heather Cox Richardson

Please remember that your professors are human and it’s hard work to stand in front of a hundred pairs of eyes and talk for an hour. In the last decade, students seem more and more to regard us as if we’re behind a screen, and seem to think they can talk, read, sleep, or just stare at us glassy-eyed without it having any effect on our performance. This is a shared enterprise. It’s hard to lecture to an apparently uninterested sea of eyes. If you don’t think a lecture hall is intimidating, take a minute after class someday to stand behind the podium and look at all those seats. Then imagine holding the attention of everyone in those seats for an hour, two days a week. Wouldn’t it be easier if the people there seemed interested? You don’t have to act like you’re watching U2, but do try to make it clear your heart hasn’t actually stopped beating.

Please don’t let the anonymity of a large classroom make you feel like you can use an evaluation form to be vicious. While you can walk away from that form, remember that your teacher is going to live with whatever you say on it for the rest of his or her career. Your bile, spilled on a page, can devastate a junior professor, while even older scholars would rather not have the chair, the members of the personnel committee, and the dean (all of whom read our evaluations), read commentary on our personal attractiveness, our choice in clothing, or on what professions would suit us better. Criticize when it’s appropriate, yes, but do so constructively. It doesn’t hurt to mention things that have gone particularly well, too.

Remember that for many history professors their university jobs dictate that only about a third of their time and energy should go into teaching (although it always takes way more time and effort than that!). We have significant responsibilities outside of the classroom. We’re supposed to sit on the committees that keep the university running, as well as to manage national and international scholarly and educational projects. In addition to teaching and what is called “service,” we’re also supposed to maintain a prominent profile as scholars and writers. These three parts of our professional lives mean that we are usually trying to manage three different kinds of schedules, as well as three different kinds of work, all of which take place in widely different locations and settings. If we cannot meet you at a time you think is convenient, it is not because we’re being jerks, but because, for example, we have to be in another city that week to help evaluate a university. We will try to make things convenient for you, but please do remember that we have other professional commitments.

Finally, you might want to Google your professors to see what they do outside the classroom. You will probably see that your school has an extraordinary faculty. You might find that your school has national leaders in nanotechnology and sports medicine; or Pulitzer Prize winners and consultants to the State Department. Go meet these people, talk to them, work with them. When an extraordinarily famous professor agreed to work with a friend of ours on her undergraduate thesis, we were shocked. “How did you get HIM?” we demanded. “I just went and asked,” she answered. “He says no one ever asks him to do anything anymore because he’s too famous, and he misses students.” A professor can’t work with everyone who asks, but it’s certainly worth talking to someone whose work you admire.

Wednesday, June 10, 2009

Which History?

Randall J. Stephens

Patricia Cohen reports on the steady drop in college courses offered on diplomatic, economic, and intellectual history in "Great Caesar’s Ghost! Are Traditional History Courses Vanishing?" NYT, June 10, 2009. "To the pessimists evidence that the field of diplomatic history is on the decline is everywhere," writes Cohen. "Job openings on the nation’s college campuses are scarce, while bread-and-butter courses like the Origins of War and American Foreign Policy are dropping from history department postings."

Cohen's comments are sparked by a roundtable on the topic at the upcoming meeting of the Society for Historians of American Foreign Relations, June 25th - June 27th at the Fairview Park Marriott in Falls Church, Virginia. That session is titled "What’s in a Name?: Diplomatic History and the Future of the Field" and includes the following participants: Thomas Zeiler, University of Colorado at Boulder; Matthew Connelly, Columbia University; Christopher Endy, California State University, Los Angeles; Barbara Keys, University of Melbourne; Robert J. McMahon, Ohio State University; Lien-Hang T. Nguyen, University of Kentucky; Emily Rosenberg, University of California, Irvine.

According to Cohen the percentage of history departments with faculty who specialize in intellectual, diplomatic, or economic history has declined sharply since the 1970s. By contrast, greater numbers of faculty work in gender, women's, or cultural history. There's also been a slight increase in the number of faculty specializing in military history, oddly enough. (The latter bit contradicts what John Miller wrote in his 2006 National Review essay, "Sounding Taps: Why Military History is Being Retired.")

There are many reasons why some areas of history lose lines and why others gain them. Cohen cites David Kaiser, history professor at the Naval War College: “The boomer generation made a decision in the 1960s that history was starting over. It was an overreaction to a terrible mistake that was the Vietnam War.” To what extent has post-1960s identity politics shaped the profession?

As I read on I wondered what other fields could be added to those that Cohen mentions. I'm no longer certain that religious history would qualify as "neglected" or "underrepresented." There is much interest in religious history. (See this interesting thread at Religion in American History on a recent conference on religious history/American religion.) Though it’s debatable whether or not that interest has bubbled up into curriculum and/or publications. I conducted a little informal, crude, utterly unscientific study of my own a couple of years back. I went through journals like American Quarterly and the Journal of American History for the years 1997-2007. These rarely included religious history topics. (The American Historical Review was a little better.) Only 5% of the articles in the American Quarterly, the premier publication for American studies, covered religion. Only 4.3% of the essays in the Journal of American History dealt with religion over the same period. Thumb through most professional history conference programs and you'll find much the same. (See Jon Butler's now-classic 2004 essay in the JAH: "Jack-in-the-Box Faith: The Religion Problem in Modern American History.")

What else? Would political history count as underrepresented? What about the history of science? This could probably be extended to include periods that receive less attention, too. I’m looking at you, Early Bronze Age.