Monday, February 4, 2008
Objectification
When you get down to the nitty gritty, a common word like “object” can be a real poser (or should that be poseur?). Thus do I charge headlong into the sexual politics of the phrase “objectification of women.”
One very common interpretation of this phrase can be encapsulated in a similar utterance: “treating her as a piece of meat.” That, of course, also alludes to the phrase “piece of ass,” or its abbreviation, just “piece.” Carried further, a woman becomes a collection of piece parts: breasts, legs, ass, abs, or sternocleidomastoids (to use an anatomical part that I find particularly pleasing).
The attachment of sexual desire to inanimate or impersonal objects is actually a fetish, though I’ll agree that “objectification” is easier to pronounce than “fetishization.” Both Freudian and Behavioral psychology have a lot to say about the role of the fetish in sex, with Freudians holding it as an example of the projection of sexual desire, while behaviorists suggest that operant conditioning is the key to understanding. I have no quarrel with either mechanism and I’m willing to believe that both apply.
The psychologist Nathaniel Branden, Ayn Rand’s lover/collaborator (before their nasty breakup), told a story of one of his patients, a man with a full-fledged Lothario complex, who would speak of his conquests as “mere receptacles.” Branden suggested that he conduct a thought experiment. Suppose that one could construct a perfect female replica; this was pre-Stepford Wives, but that was the clear intent. Make a simulacrum of a woman out of plastic and rubber, totally lifelike, down to the genitalia, animated by motors and actuators. Would the Lothario find such a construct a desirable partner for sex?
“God, no!” was the reply.
Despite novelty “blowup dolls” (sold more often as gag gifts than as real sexual objects, I suspect), and other mechanisms, I believe that Branden’s patient’s response is typical. What is called “objectification” isn’t about reducing women to mere material objects; it is about using women as objects of fantasy, which is not the same thing at all.
In Peter O'Donnell's Modesty Blaise books, Modesty’s response to rape (and her history includes a number of such incidents) was to separate her consciousness from the event, thereby depriving the rapist of anything other than her physical presence. She refuses emotional connection, depriving the rapist of real domination. Within the context of the Blaise books, it is yet another indication of the primacy of the heroine’s will, her power over self. It also illustrates a thwarting of rape, and what that implies. Fetishization and the preference for a fantasy object is certainly depersonalizing insofar as it ignores the reality of the Other. In a sense, it denies the objective reality of someone else’s subjective experience. It is another pathological adherence to an internal model, a fixed idea about the external world.
Recognizing that we are dealing with the elevation of fantasy over reality in such cases also allows the realization that this is not a problem confined to men alone. Women crave the fantasy ideal as surely as do men; their fantasies tend to differ, however. It’s an open question as to what degree these differences are learned or innate. What is indisputable is that 1) they vary from individual to individual and 2) they are malleable.
The late comedian Richard Jeni had a bit where he suggested that the standard porn film is most men’s idea of a romantic film with all the boring parts left out. Compare and contrast that with the notion of the “chick flick,” which supposedly is nothing but the (for men) boring bits.
The clear implication is that romance is collaboration, and collaboration is hard, no matter what the circumstances. It’s hard to tell whether the fantasies of men and women are converging or diverging at this time; that’s a project that’s well beyond my own capabilities, and, for that matter, my interests. But simple observation and personal experience suggests that success is possible at the level of individuals, and that’s where my sympathies lie, in this as in so many other things.
Saturday, February 2, 2008
Poly
“Well!” said the woman in a huff. “Isn’t that just the way a man would think!”
________________________
Strictly speaking, polygamy is divided into two categories, polygyny and polyandry, the former being one male with multiple mates and the other being one female with multiple mates. In practice, polygyny is so seldom used as a label that it doesn’t even show up in my spell checker, and polygamy is generally taken to mean the multiple wives thing.
The predominance of polygyny over polyandry is pretty typical of mammals where there is substantial sexual dimorphism, i.e. where males are larger than females. In species where it’s the other way around (large females and small males), things aren’t quite so phallocentric, with the extreme cases being those insectoid suicide matings (bees, black widow spiders, etc.), that have served as the basis of many horror stories (or, contrariwise, female revenge fantasies, point-of-view being a factor in nomenclature).
Sociobiology and ethnology offer a lot of speculative theories on the nature of polygamy, most of them controversial (well, duh). It’s pretty easy to see how a shortage of males can lead to polygyny. Indeed, one hard fact in population demographics is that the birth rate in any given region depends on the number of women of childbearing age—period. It’s almost impossible to reduce the number of men to a level where it affects the number of children being born.
In the state of perpetual warfare that sometimes exists in some societies, a shortage of men is almost inevitable, and some sort of polygyny often results. Given long enough, this becomes institutionalized. It’s not necessary to invoke perpetual warfare, either; hunting large game is dangerous, and hunter/gatherer societies can easily develop male shortages and the whole “alpha male” structure, almost by accident.
A suggested countervailing influence in stone age societies is female infanticide. This is the dark underbelly of Eden, the crude population control measure that allowed the human population to remain stable for millennia. Some anthropologists have suggested that this sometimes led to polyandry, due to a shortage of females. It’s interesting to speculate about the future outcome of the gender imbalances that are being set up in some Asian countries as a result of pre-natal screening and selective abortion.
Female infanticide as a population control measure has been suggested as the origin of the form of institutionalized polyandry that exists in Tibet. One difficulty with this argument is that the custom is confined to a property owning class (which suggests that privation isn’t the primary origin), and that the woman’s spouses are fraternal, i.e. she marries the “family” as it were, and one brother is dominant, with the rest merely enjoying spousal privileges. That suggests that in this case, the custom is more akin to primogeniture, with the multiple husbands simply as insurance against infertility in the primary “alpha” male. It may be noted that this looks similar to the commonly noted phenomenon of infidelity on the part of the mates of the alpha males in various primate societies.
Substantial gender imbalances were the norm in the expansion of Europeans into the Americas, and history and folklore abound in unconventional modes of co-habitation in the Old West. The Mormons dealt with their substantial gender imbalance in the early church with a “revelation” of God’s blessing for polygyny. By contrast, Wyoming Territory responded to an extreme shortage of women by giving them voting rights in 1869, as an attempt to get more women to move to the territory.
A careful examination of social behavior in the Old West suggests that there is a form of polyandry that is seldom noted as such: prostitution. The legal system does its best to deny that the prostitute/client relationship is legitimate, as do practically all religious doctrines. But any honest analysis suggests otherwise. What does a polyandrous relationship have that does not appear in prostitution? Certainly emotional relationships form; the client who wants to “make an honest woman of her” is so common as to be a stereotype. Children? Frequently children are the reason why women turn to prostitution. While it’s true that the anonymous sex of a street hooker doesn’t much look like marriage, it’s easy to find more domesticated arrangements upscale in the sex trade, while contrariwise, it’s not that difficult to find legal marriages that make a street hooker and her john look positively loving and healthy.
There’s no doubt that human relationships rapidly increase in complexity as the number of players increases. Same-sex monogamy will inevitably be even less complicated than the sort of opposite-sex serial monogamy that has become normal in the modern world. By the same token, divorces in same-sex marriages will certainly be at least as complicated as opposite-sex divorces. Having both spouses “cheat” with the same individual is relatively uncommon in opposite-sex couples, though it does happen, and, yes, that’s yet another kind of gossip that will probably never appear in these essays.
But group arrangements, even when the group is so small as three, become so very complicated so very quickly that I doubt that they will ever be common enough for the law and mores to take much note of them. It should go without saying that such “outlaw” behavior is just the sort of thing that young people do as a way of testing limits, their own and others’, just another set of behaviors that Seem Like a Good Idea At the Time.
Sunday, July 22, 2007
The Vacant Lot
[Cross-posted to WAAGNFNP]
Our house on Ironwood Drive, in Donelson, Tennessee, in the 1950s, was a typical example of post-War construction: cinder block walls, asbestos exterior shingles, and shoddy workmanship. My parents discovered years after purchase that the overflow pipe from the attic water heater didn’t actually exist; there was a short pipe in the attic and another short pipe beneath the house and nothing in-between. “Shoddy” doesn’t actually cover something like that, since it was pure fraud to fool the building inspector. There was something similar with the septic tank, which, after we’d left, turned out to be covered only with plywood that finally rotted through, much to the distress of subsequent inhabitants.
It was an “all-electric” house, electric stove and electric “radiant” heaters that were nothing but wire wound around ceramic cores. The heating and cooling expansion made little clicking noises whenever they turned on or off. The electricity was cheap, though, courtesy of the TVA, a fact that made Goldwater’s loss of Tennessee in 1964 inevitable. He’d gone on record as wanting to privatize TVA, even saying he’d “sell it for a dollar” if he could. The voters of Tennessee thought that the fight against socialism could maybe be first started in another state, for example, Arizona, where there were plenty of Federal water projects to privatize first. The Senator from Arizona never quite grasped that logic.
Before tossing up the masses of houses, the developers had done some landscaping, which is to say that they’d bulldozed the tops off the hills and used them to fill in the gullies. One of these landfills was in our back yard, where some trees had been half buried, but still managed to grow. So the soil on the downslope was rich (for that part of Tennessee), while the soil around the house proper was not.
The slope started about halfway to the property line in the back yard, then leveled off in what we always called “The Vacant Lot.” The nearby roads were twisty turny, and the vacant lot was an orphan plot, surrounded by hastily built homes on their hastily graded lots, but it had no direct access to any road. I have no idea who owned it; possibly the power company, since we also were graced with a nice, high transmission tower only a few hundred yards away.
The lot was where the debris from the construction had been dumped, as I vaguely recall. I recall more vividly the lot clearing operation that the neighborhood mounted sometime after we moved in. It culminated in an enormous bonfire, fueled by the leftovers, some of it entire tree trunks, one of which burned for days after the bonfire was over and which still wasn’t completely consumed. It left a charcoal husk that seemed huge at the time, though I’ll guess it was chest high to someone who is only four feet tall. Nevertheless, it was there for years afterwards, on a little ridge-let behind some houses that were behind us. The little ridge was cool because there was a full ‘dozer cut in it, so I can say with authority that our particular area had layers of sandstone under it, with clay sometimes sandwiched between those.
After the clearing, the vacant lot went through what I now know as ecological succession, first weeds, then short bushes, finally small trees, though we kids tended to cut down the small trees, using the ever popular “machete,” which I believe was actually a WWII vintage bayonet. I recall it as being Bill’s property, Bill being the alpha male of our particular group, one year older than me, and notably larger and more physical. As the weeds grew, we used the machete and other cutting tools to create paths, then sometimes tunnels through the weeds, culminating in hidey holes of various kinds that appeal to children before the age of reason.
I think there was another weed clearing, years later, after I’d started school, because my memories of the vacant lot later show a less jungle-like terrain, though some of it is probably also just physical growth, with us kids “growing like weeds” and, if not outpacing the actual weeds, holding our own.
One gray day in winter, my sister and I were playing out in the back yard, then down into the vacant lot, since we could go pretty far that way and not disobey the dictum of her not crossing any streets. Given our ages, it was probably a matter of me playing and her tagging along. Or maybe she was out exploring and I was being a good brother and making sure she didn’t get into trouble. I’d guess that I was somewhere around 7-9 and she would have been 5-7, so either of those was plausible.
I don’t remember how it was that we came to look into the tool shed of the people who lived all the way on the other side of the vacant lot, at the corner of Cottonwood and Sinbad. We were definitely trespassing, though without felonious intent. In any event, the thing that trumps all other memories of the day was the monkey.
It was young; I’m pretty sure of that. I don’t know what kind it was, but I can say that its arms were very long. I don’t remember if it had a tail, so it could even have been a chimpanzee, though I doubt it.
It was shivering from the cold, and it climbed onto my back, no doubt trying to get a little warmth, and maybe also because young primates ride their mothers’ backs. I heard its breathing, because it wheezed. I imagine that it had a respiratory infection and I doubt it lived much longer after that.
I now know, of course, that the way monkeys were captured in the wild was to shoot their mothers. They were then loaded into cages, shipped to foreign lands, then sold, as “pets,” often to owners who had no more knowledge of how to care for them than did the owner of the unfortunate simian that I met briefly that day. Maybe it was an impulse buy, later repented, but without an exit strategy.
It’s a long chain of accountability, and it’s ever so easy for everyone in it to shift the blame. The pet owners don’t know how the system works. The store owners are only meeting the demand. The hunters are just trying to make a living, and besides, they’re only animals.
The monkey in the tool shed, of course, immediately peed on my back, and I peeled him off of me and we put him back into the shed and closed the door. I was pretty anxious to get back home and clean up, after all. We didn’t tell anyone about it because we were snooping where we had no business being. And I rarely think about the way the monkey looked at me, or how human his eyes looked, and how much misery was in them, or that we might have done something for him if we hadn’t been afraid of the consequences.
Sunday, May 13, 2007
Cowboys and Indians
–Temporarily Humboldt County, Firesign Theatre
In my recent essay False Positives, I mentioned several racial/ethnic groups for which I have a “slight positive bias,” a tendency to look upon favorably, give benefit of doubt, and so forth. It might be said that this comes at the expense of my own ethnic group, i.e. middle-class white guys, but the fact is that I know my own group well enough so that other factors come into play almost immediately. Notice, for example, that there would be separate entries for “female” and “non-middle class,” if I’d just started with the group, “white folk,” but that barely begins to cover it. We all automatically respond to such things as accents, use of language, height, weight, perceived intelligence, and all the rest, so the “white guy” part gets lost in the mix pretty quickly. Truth to tell, the “slight positive bias” doesn’t go very far either if I’m confronted with someone who is pushing some of my other buttons. And yes, I have plenty of buttons, some of which are still unknown to me, I’m sure.
The racial/ethnic groups I mentioned were African-Americans, Asian-Americans, Hispanics, and Jews. There was another one that I considered, but left off the list: Native Americans. The “racial bias” test is actually one for skin tone, so there’s some probability that my bias would extend to them. For that matter, Hispanics, primarily Mexicans and other Latin Americans, have a high percentage of persons having pre-Columbian ancestry, and the dark-skinned thing may explain some of the good will there.
So why didn’t I include Native Americans in my bias group? Let’s see where this introspection trail leads.
Notice that, in the above paragraph, I used the phrase “pre-Columbian ancestry.” There’s one of the problems. I have, in fact, some problem with the term “Native American,” and it’s related to the same problem that I have with “Indian,” which was the term used when I was young.
I generally use “Native American” because that seems to be the preferred term used by those who “self-identify,” as the jargon goes, and I like to use the names that people wish to go by. I will call a guy “her” if she insists on it (unless he pisses me off in the right sort of way). African-American, black, or “person of color,” yes, I’ll go along, fair enough. I might have a mild objection to the term “Native American” because it’s a bit jargonesque, since, by ordinary usage, anyone born here is a “native American.” Then, if you dig deeper, it gets worse. The word “America” isn’t itself native; it’s a word applied originally by Europeans to a newly discovered (by them) continent.
I’d like to use the word the pre-Columbian natives used for the continent, but they didn’t have such a word. Indeed, they had no idea that they were living on a “continent” because that’s another external invention.
So, as you can see, what I’d like to do is to use the original names of self-identification, but there we run into a wall. There is no group name, because the grouping itself is a racial group, imposed by outsiders. The actual pre-Columbian tribes and nations had separate names for themselves, Cherokee, Inca, Mohawk, Seneca, Shoshone, and yes, I know that these are imperfect transliterations, but it’s a step towards politeness, and back from racialism, so I take it when I can.
And this is just the trouble I have with the names. What the hell do you do about the stereotypes?
There have always been both positive and negative stereotypes about the native American peoples. Noble savage. Blood-thirsty redskin. Pocahontas, Sacagawea, Crazy Horse and Sitting Bull. Tonto and Kemo Sabe. Cowboys and Indians. Ah, Jeez, I could keep this up for far too long. Davy Crockett, “Indian Fighter,” but he broke with Jackson over the Trail of Tears.
There are very few ways of repelling the stereotypes. From the very beginning, there were stories of “Good Indians” and “Bad Indians,” with the difference between good and bad always being a matter of how they related to white immigrants. Ultimately, however, both kinds mostly wound up as “dead Indians,” in one of the greatest population declines in human history. There are libraries of scholarly arguments about the size of the native population of North America pre-Columbus, with estimates ranging from less than ten million to over one hundred million. There are debates as to whether the European occupation amounted to “genocide” or “democide.”
What is not really open to debate is that the native cultures were virtually obliterated. Sociology is now history and archeology. Anything other than scholarship becomes stereotype.
Where I grew up, there were more than a few people who claimed at least some Native American ancestry. My mother’s family claims some, and there’s at least some evidence of the truth of it.
But that’s just genetics. It’s essentially racialism to hold that somehow the survival of the genes negates the destruction of cultures and peoples. I can take a little comfort in the belief that, over the centuries, some cultural diffusion occurred, that Philip Rahv’s distinction of writers as “redskins” or “palefaces” might have some deeper taproot into the American psyche. Certainly there were hundreds of years of cultural contact, before the final—and largely successful—attempt to herd all remaining tribes onto reservations, teach them English, and inculcate them with the self-loathing that can only exist in someone who has been told, and shown, from birth that they are second-rate, not even citizens really, but some lower form of life. So God knows I’d like to think that some of the original native cultural influences still survive, if only to hold the hope that it wasn’t all lost or reduced to pop culture crap.
I’ve known several “professional Native Americans” over the years, individuals who found a way to make a living by playing on all that was really left to them, stereotypes of their history and nature. There’s some money to be made from liberal guilt and I won’t scorn anyone who chooses to scoop up some of it.
But I’m stuck with the pity of it, and there’s not a lot of money in pity, not mine anyway, and besides, they don’t want or need that sort of thing from the likes of me. I like the fact that some tribes have figured out how to get some of that paleface gambling cash, and good on them. I hope they manage to keep more of it than was the case with the Oklahoma oil money. I expect they will, as they aren’t dumb, and not quite as many are out to take it away from them this time around, or so I hope.
I don’t really have a conclusion here. That’s the reason for the omission in the previous essay: confusion and bewilderment. I hope that this is largely a product of my own ignorance; that somewhere there are native tribes that maintain a deep culture, or who have managed to re-invent themselves for the modern world, the way that so many others have done. I imagine that there are some such, and I have absolutely no doubt that there are admirable and amazing individuals who self-identify as Native American. That I have none as personal acquaintances is my failing and my loss, yes, absolutely.
So hey, man. Got any peyote?
Wednesday, May 2, 2007
A Moral Equivalent of Socialism
A couple of jobs ago, in another example of cubicle hell, I visited the desk of one of my co-workers, to get something or other, I forget what. She was originally from Afghanistan, and on her wall was a map of that country and its surrounding neighbors, and each country had a little legend giving population, birth rate, mortality rate, literacy, life expectancy and so forth. It was very, very apparent which countries had been a part of the old Soviet Union. Those were the countries with the high literacy rates, while the other countries, our allies in the Cold War, had much lower literacy rates, particularly among women.
In my earlier bit of research on mortality rates of professions and countries, I happened to check on a statistic that I’d heard some while back and found it to be true: Cuba has the lowest infant mortality rate in Latin America and the Caribbean (with a few low population exceptions such as Aruba). In fact, Cuba’s infant mortality rate is lower than that of the U.S., although this slight difference is probably because the U.S. has more low-weight births, i.e. babies that would not have come to term in Cuba do so in the U.S. and many of these die. Nevertheless, the low infant mortality in Cuba is a sizable achievement; if it were the same as the surrounding countries, there would be over 2,000 additional infant deaths per year in Cuba.
In the analysis in which I found the explanation of the U.S. low-weight birth phenomenon, there appeared a disclaimer about how the author certainly was no fan of Castro or Cuba, and I should probably echo that disclaimer here. McCarthy may be long dead, and the Soviet Union is no more, but if one can be accused of being a communist, or a socialist, or a liberal, a sizable section of the populace breathes a sigh of relief and ceases to listen. One must, of course, stipulate that authoritarian states such as the Soviet Union and Cuba are Bad Things. Certainly there are many countries that manage to avoid authoritarian governments and nevertheless achieve high literacy and low infant mortality. There are also many authoritarian states with low literacy and high infant mortality that nevertheless manage to remain “allies” of the U.S. and other western powers.
Still, the idealists who turned to socialism in the 1930s and at other times had some things in mind and I’ll go out on a limb here and suggest that public education and public health services were probably very high on that list. In fact, I’ll suggest that countries that fail in those areas have failed “socialism,” however else one chooses to identify socialism. Under that schema, North Korea is not socialist, but rather an authoritarian monarchy. China seems to be slowly losing whatever socialist credentials it once had, at least insofar as there has been a reported decline in its public health services, though avoiding such horrors as the Cultural Revolution probably still counts for something.
The idea of state ownership of the means of production has always struck me as a pretty bad idea, though the current U.S. situation of having the means of production owning the state does not strike me as being a much better idea. Nevertheless, I don’t see anything in the idea of respecting private property as automatically implying that all organizations must benefit some select private individuals. Surely there is a place for public education and public health services in the idea of a nation. Why then the current hostility towards those institutions?
Let me just say in closing that my own idea of “winning” does not require that there be losers who are sick and ignorant. If that’s what it takes to be a winner, I don’t think that winning is a good thing.
Friday, April 20, 2007
Relativism vs Absolutism
The original version of this brief followup to "What Moral Relativism Means to Me" came about because I ran across a set of blog postings that were valiantly trying to keep a discussion going, with multiple posts cross-linked across multiple blogs. The discussion was about moral relativism. Naturally the whole thing became very confused, but that’s not entirely due to the blog medium. Part of it is that academic philosophy itself has confused the issue with a plethora of nomenclature about all possible permutations of moral philosophy. Yet there seems very little comment concerning what seems to me to be the central issue (noted in my original essay): that people have a certain agenda when they decry “moral relativism” and that agenda certainly looks like it’s designed to excuse a refusal to consider differing points of view.
While it’s usually dangerous to use scientific analogies in moral and ethical inquiries, I’m going to take the risk and use Special Relativity as an example. What’s interesting about Relativity in the scientific sense is that it is not at all in opposition to objectivity; Special Relativity is objectively testable and has passed every such test. Similarly, there is nothing in the idea of Moral Relativism that requires it to conflict with objective reality. It is true that someone’s subjective viewpoint of what is good or bad (for them) is central to moral relativism (or at least my version of it), but it is often pretty easy to objectively determine whether or not a particular event is good or bad for someone who isn’t you. Give a hungry man a meal: probably good. Hit him over the head with a hammer: probably bad. The fact that there are exceptions to both of these general rules only underscores my point.
What Special Relativity does is overthrow the ideas of Absolute Space and Absolute Time, or more technically, the idea that there is an Absolute Reference Frame. Other sorts of Absolutes are not absent from science, though there are often some interesting caveats. Absolute Zero, for example, is a perfectly respectable scientific concept, it just happens to be unattainable as a practical matter. Nothing wrong with that.
So I’m holding that “Moral Relativism” does not stand in opposition to whatever one would mean by “Objective Morality” (though I think that the only meaning the latter can have is that one would judge an action by its objective consequences, and not, for example, what one intended those consequences to be). Rather, “Relativism” would stand opposed to “Absolutism.”
That evens the odds, I think. I mean, how many followers would Ayn Rand have gotten if she’d called her philosophy “Absolutism”?
Wednesday, April 18, 2007
The Crime of Thomas Jefferson
The most recent furor about Thomas Jefferson is a pretty good experimental smashup in that regard, though the first result is one I’ve mentioned a while ago: when sex enters a narrative, the narrative becomes all about the sex. In Jefferson’s case, the burning issue of the day was whether or not he had a child by his slave Sally Hemings. DNA testing of Hemings’ descendants put the debate into the realm of the truly bizarre. The testing was “inconclusive” in that it could only say that someone in Jefferson’s immediate family was the progenitor, so it could have been either Thomas or his brother Randolph. Naturally, a lot of Jefferson scholars immediately set out to prove that it was Randolph, because otherwise, Thomas Jefferson would have had to have had sex with his slave Sally, who, it should be noted, was his deceased wife’s half-sister. That would have made Jefferson no better than…his father-in-law.
Sally was also Jefferson’s property, and it’s hard for us to really grasp what that means. A southern slave owner could certainly legally have sex with his slaves. He could also beat them, mutilate them, force them to mate with anyone he chose, or kill them for any or no reason, all without any repercussion other than financial. Do you own a dog or a cat? Your pets have more legal rights now than did a slave in the Old South.
On the other hand, for Jefferson’s brother to have had sex with Sally Hemings without Thomas’ permission would have been a serious breach of manners and ethics. If it were done with Jefferson’s permission, then, well, which is worse, sleeping with your wife’s half-sister or pimping her out to your brother?
As I say, sex in the narrative tends to muddy the waters and muddle the thinking. In any case, Annette Gordon-Reed, in Thomas Jefferson and Sally Hemings: An American Controversy pretty clearly demonstrates that Thomas Jefferson was the only possible father of Hemings' seven children. More importantly Gordon-Reed addresses the issue of how it is that the fairly clear and compelling evidence of the relationship was ignored or explained away by scholars who were basically devaluing the evidence provided by historical sources who were black. Not to put too fine a point on it, implicit racist assumptions led to false conclusions.
Jefferson, of course, was a paragon of the American founding. His was the language of the Declaration of Independence. As President, he arranged the Louisiana Purchase. He was the prime mover behind the founding of the University of Virginia, the first secular university in America. His is the spirit behind the First Amendment, and his aphorisms in favor of freedom of speech, press, and religion are part of the discourse to this day. He was also a major scholar and scientist. As John Kennedy once quipped at a White House dinner for Nobel Prize winners, “I think this is the most extraordinary collection of talent, of human knowledge, that has ever been gathered at the White House, with the possible exception of when Thomas Jefferson dined alone.” Jefferson’s library formed the basis of the Library of Congress. He was probably, after Benjamin Franklin and Benjamin Thompson (later Count Rumford), the most internationally famous scientist and intellectual in America at the time.
So then, how to react when confronted by something like this:
I advance it therefore as a suspicion only, that the blacks, whether originally a distinct race, or made distinct by time and circumstances, are inferior to the whites in the endowments of both body and mind. It is not against experience to suppose, that different species of the same genus, or varieties of the same species, may possess different qualifications. Will not a lover of natural history then, one who views the gradations in all the races of animals with the eye of philosophy, excuse an effort to keep those in the department of man as distinct as nature has formed them? This unfortunate difference of color, and perhaps of faculty, is a powerful obstacle to the emancipation of these people. --Thomas Jefferson, “Notes on the State of Virginia”
Jefferson’s ownership of slaves made him a part of his culture, and his racist views were also part of that culture. This is that “cultural relativism” we hear so much about. Those who decry cultural relativism must then decide whether Jefferson was an evil man, or whether slavery wasn’t so bad as all that. Since I have no quarrel with cultural relativism per se, I’m willing to give Jefferson a pass on the slave owning, though not a full pardon. Washington gets a full pardon; he freed his slaves at his death. Jefferson supposedly wanted to do the same, but he’d run up so many debts that his estate couldn’t afford the gesture, so his slaves got sold off, families split, the whole horror show, all because Jefferson just had to add that extra staircase onto Monticello and import a few more varieties of plants for his experiments.
But those sorts of moral transgressions are transient, personal, and local. The same cannot be said for the scientific racism that he expounded as a whitewash of his own personal good fortune in having been born white and rich in a society whose wealth depended upon slave labor. It may be asking a lot for someone to give up all those benefits of position and privilege. But to use one of the finest minds of his era to rationalize that situation, that is a crime that continues to this day. Certainly scientific racism would have existed without Jefferson, but he was one of its originators in this country. And that was a crime against both free society and against science.
Thursday, February 15, 2007
What Moral Relativism Means To Me
I’ll start with the concept of “personal good.” I would hope that this is not controversial (dream on), since it is easily verified and exists in the language in such phrases as cui bono (“who benefits?”) and “whose ox is being gored.” Simply put, one’s notion of what is good or bad is often purely individual. If I am hungry, a can of peanuts is good, because I can eat them and satisfy my hunger. However, for someone with a peanut allergy, peanuts are bad, because eating them could be fatal.
As an aside, let me note that poisons are often called “vile” or “evil” (similar words with possibly differing roots). This tends toward personification, ascribing human attributes to inanimate matter. The personification may be showing something important. While Hurricane Katrina killed more people than the Oklahoma City bombing, one rarely hears the hurricane itself called evil, not the way (for example) illegal drugs are called evil. I’ll suggest that there is a tendency to ascribe “evilness” to things that deceive. A poison may lurk in a tasty food, so it is evil. A drug may convey pleasure but corrode the spirit, also evil. And so forth. In any case, I’ll argue that the concept of “evil” carries with it an assumed human content, which may be important, given that we are talking about the difference between subjective and objective.
Good and bad are, therefore, at least at one level of discourse, subjective phenomena. Their attribution depends upon a point of view. However, human beings tend to try to objectify their subjective experiences, and the impulse is reasonable, assuming that there is an objective reality, and I’m not here to dispute that. The question here is, “Can one objectify the concepts of good and bad to the point where they become absolute?” This constitutes no problem for religion; morality then becomes a matter of divine judgment as determined by revelation and adherence is a matter of faith. However, this is an assertion rather than an argument and I’ll not deal with it here.
The strongest argument against the idea of purely objective, logically derived, absolute morality is that a number of Very Smart People have tried to devise such a system and have failed. Kant’s Categorical Imperative, for example, leads to the result that one must always be truthful, and so must tell Nazis where the Jews are hidden if they ask. John Rawls’ “A Theory of Justice” contains an interesting thought experiment (what society would you design if you knew you would be a member of it, but could not choose which role you would have?), but it is vulnerable to the inclusion/exclusion problem. Are we only talking about human beings? How about domestic animals? Wild animals? Trees? And if it is only human roles, what do you do about someone who denies the humanity of slaves, the vegetative, the French, fetuses, or terrorists? In any case, any attempt to design a society ab initio is suspect.
(The argument against the Categorical Imperative used above is common in discussions of moral philosophy. It’s often mistaken for reductio ad absurdum, but the result is more horrific (turning people over to be murdered) than absurd. I’d call it “reductio ad nauseam,” but “ad nauseam” is usually taken to mean “massive repetition,” and that does not apply, except insofar as this is yet another use of Nazis in an argument, which is seldom a good sign.)
The Nazi example is often used as an argument against moral relativism generally, or cultural relativism specifically, but it’s not a fair cop. National Socialism failed on its own terms: it destroyed the German Volk, got its leaders killed, and never achieved anything like its goal of a purified “master race.” That the last goal was impossible should be an indication that “relativism” has real limits, as opposed to the straw man version.
The fundamental principle of morality is that actions have consequences, and some of those consequences are better than others. Moreover, a single action will have multiple consequences, again with varying value. Even from a solitary point of view that admits no others (the alone-on-a-desert-island example), consequences will vary over time, such that one would need to organize one’s behavior to maximize the good results – over time – and minimize the bad. It does not take much analysis to recognize that, to our hypothetical desert islander, using the morphine in the emergency kit to get high is more likely to have bad consequences than good. Having someone curse someone else’s lack of proper behavior is a good indicator that something has gone wrong in the morals/ethics department, even when the curser and the cursed are the same person separated by an intervening gap of time.
A philosophy of moral relativism, with its emphasis on point of view, is hardly a prescription for moral laxity or license. In actual fact, it recognizes the difficulties of human action more completely than do absolutist doctrines. Moral absolutism carries within it powerful temptations such as sanctimony and grievance. If some injury is done to me by someone, I consider their actions to be “bad,” which they certainly are, from my point of view. If morals are absolute, then the actions that led to that injury must also be bad, and the people who did them are also bad. If morals are absolute, if they are objective, then it is not necessary for me to consider the point of view of someone who has harmed me. That, after all, is merely their “relative” judgment, while I am backed by an absolute judgment. Add only the doctrine that it is okay to do bad things to bad people (I’m trying to think of some philosophy that does not allow crimes to be punished and I’m not coming up with much), and you have a prescription for war.
In short, absent a deity that speaks in a clear and unambiguous voice, a doctrine of absolute morality appears to be an invitation to projection: making one’s own judgments (and prejudices) the center of the universe, the law which all must obey. Moral relativism, at the very least, carries within it the idea that the moral universe has more than one center. I consider that to be an improvement.