
Unemployed White Males

Newsweek recently ran an article about the plight of the formerly great white male. The article reveals that, as of early 2011, 600,000 college-educated white males in the 35-64 age group were without jobs, a 5% unemployment rate. The gist of the article seems to be that the white male is in dire straits. However, this claim does not seem to be supported by the available evidence. That said, it would not be incorrect to be concerned about the plight of people in that demographic.

While the 5% unemployment rate is twice what it was prior to the economic meltdown, it is still far better than the rates for other demographics. This is not to say that the men who are unemployed are not suffering; they surely are. However, this hardly seems to be a clear sign that educated white males do not have a “freaking prayer.” Rather, it shows that the economic mess hit very hard, hard enough to impact even those in the upper tiers.

That said, it would also be a mistake to simply dismiss concerns about this demographic as being groundless. After all, to dismiss the plight of the unemployed white men because they are white and male would be comparable to dismissing the plight of any group based on the gender or ethnicity of its members. As such, it seems right to be concerned about these people because they are, after all, people.

It might be argued that even if these white males are worse off than before, this should not be a matter of concern. After all, white males have been doing very well at the expense of others for quite some time. As such, they certainly deserve to pay for these past injustices.

While this does have a certain appeal, there is the obvious concern about what is actually just. If those individuals who oppressed minorities and women are now paying for their misdeeds, then that could be seen as just. However, it would hardly be just if all white men were treated as interchangeable, so that the men losing their jobs now are somehow justly paying for the actions of their predecessors based on an inheritable white guilt.

It might also be argued that the plight of the unemployed white men should not be a matter of concern because the wealthiest people are still white males. As such, the white male hardly deserves any sympathy.

While it is true that most of the very wealthy in America are white males, it is not true that most white males are very wealthy. If it were reasonable to claim that because some people of type X are wealthy we need not be concerned about people of type X being unemployed, then it would follow that we would not need to be concerned about anyone. For example, Oprah is very rich, yet it should not be inferred that we should not be concerned about black women. Likewise, the mere fact that Trump is white, male and rich (maybe) does not entail that we should not be concerned about the white men who are unemployed.

I, of course, am well aware that white, educated men are still very well off relative to everyone else. However, this does not entail that all white men are well off or that it is foolish to be concerned about those people who are unemployed, but also happen to be white men. After all, the fact that most wealthy people in the US are white males is hardly a big help to the white guy who cannot find a job.

My point is, of course, not that special attention should be paid to the white male. Rather, my point is that the white males who are not doing well should not be ignored simply because some white males are still doing very well indeed.


Yes, But…


I have been a consistent supporter of the idea that women should be regarded as the moral, legal, and political equals of men. In general, I have based this support on the principle of relevant difference: people can (morally) be treated differently only on the basis of a (morally) relevant difference between them. So, while it would be acceptable to pay someone who has more education more than another person, it would not be acceptable to pay someone less simply because she happens to be a woman (or he happens to be a man). I first learned of this principle as an undergraduate during a class on feminism. That class had a lasting impact on me, including an interest in gender issues that persists to this day.

As time marched on from my undergraduate days, I was pleased to see various unjust aspects of American society change. Women had ever increasing opportunities in business, education, sports and many other areas as well. This trend continued, with the occasional specific setback, until some feminists went so far as to claim that feminism had grown stale or even that there was no longer a need for feminism in America.

While I was pleased with the trend towards equality, another trend that has stood out is what could be called the “yes, but…” trend. I first noticed this when I was doing research for some essays on men, women, and higher education (which appeared in my book). I found that although the majority of undergraduates were women, there seemed to be almost no concern about this new gender inequality. This initially struck me as odd. After all, feminists and their allies had always been very quick to point out gender disparities that were not in favor of women and endeavored to rectify such imbalances. When I would bring up my concerns about the male decline in higher education, I would most often be met with the phrase “yes, but…”, where the “but” would be followed by some area where men still exceeded women, such as physics or the highest levels of the corporate world. Watching the occasional news report that mentioned gender issues, I noticed a similar pattern: it would be pointed out that women exceeded men in some area, but this would be followed by pointing out some area (like income) where women were said to lag behind men.

I most recently noticed this in a Newsweek article, “Born Again Feminism,” by Kathleen Parker. She writes:

As a group, we are worse at some things, but better at others—the very “others,” it also turns out, that happen to be driving today’s economy and that of the future.

Consequently, in the U.S. today, women hold a majority of the jobs, and dominate in colleges and professional schools. They also hold a majority of managerial and professional positions, and about half of all accounting, banking, and insurance jobs.

These socioeconomic facts don’t mean that women have achieved perfect parity with men, who still dominate at the highest levels of business.

As a side point before getting back to the main issue, it is interesting to note that Parker also makes use of a common device in today’s discussion of gender issues: men and women are different, but women are better than men in terms of what is needed today. This, in many ways, is a distorted echo of what might be called the old sexism, in which men and women were seen as different, but men were regarded as being better than women in the ways that mattered economically, politically and so on. Given this similarity, this sort of thing should be a point of concern among those who are worried about sexism.

Getting back to the main point, this nicely illustrates the “yes, but…” approach. Parker notes that women hold the majority of American jobs, dominate in colleges and professional schools, hold most managerial and professional positions, and hold about half of accounting, banking and insurance jobs. But, they have not “achieved perfect parity with men.”

One obvious response is that she is quite right. In America, women have not achieved perfect parity because they are the majority in the areas she mentioned. Perfect parity would require that no gender dominates in any area, even if the dominant gender is female.

I always find it interesting how quickly certain people can transition from saying how women dominate in so many areas to criticizing the fact that there are still areas dominated by men. What is most interesting about this is that the arguments used to argue for equality in the areas still dominated by men would certainly seem to apply to the areas that are now dominated by women. As such, it would seem that the concern about the remaining male dominated areas should also apply to those areas where women now dominate. After all, if gender inequality is unjust when it favors men over women, it would seem to be unjust when it favors women over men. However, this concern often seems to be lacking and it might be suspected that there is a certain moral inconsistency at play in some cases.

This is not to say that there are no areas where inequality unjustly favors men, nor is it to say that there are no longer any valid problems left in the area of women’s rights. When people use the “yes, but…” approach they often do point out legitimate problems that need to be addressed. However, they all too often seem to miss the legitimate concerns regarding areas in which women dominate.

Naturally, I am open to the idea that cases of gender inequality need not be cases of injustice. For example, in my book I consider that the gender disparities in higher education might be due to free choices on the part of men and women and not the result of any form of sexism. However, I am also careful to consider (as I learned from the feminists) that gender disparities could be the result of injustice. Those who use the “yes, but…” approach should be careful to apply a consistent set of principles to both sorts of situations, those in which men dominate and those in which women dominate. After all, we surely do not want to trade one form of sexism for another.


Beauty & Discrimination


Dahlia Lithwick wrote an interesting essay in the June 14th issue of Newsweek about the law and beauty bias. This got me thinking about the issues she raised.

It is reasonably well established that attractive people generally have an advantage over people who are less attractive. It is also reasonably well established that some businesses discriminate against people who fail to meet their standards of attractiveness. For example, Hooters famously fired a waitress for being too heavy.

Currently, there is little legal protection against discrimination based on appearance. Of course, there is the obvious question of whether there should be such protection.

On one hand, it could be argued that there is no need for such laws. First, such laws could be seen as intuitively absurd. A law against not liking ugly people? How absurd! Of course, this might simply be an appeal to ridicule: the mere fact that something can be laughed at or seems silly hardly proves that it is absurd.

Second, there is the reasonable concern that such laws would set a legal precedent for even more laws that would lead to either real legal harms or at least to an undesirable degree of absurdity. For example, what if laws were passed to prevent “discrimination” against people for being foolish? Of course, this could be seen as a slippery slope argument, unless reasons can be given showing that these negative results would actually follow.

Third, there is also the reasonable concern that people are naturally biased in favor of attractive people and also biased against people they regard as unattractive. This can be seen as being analogous to the fact that people tend to be biased in favor of people who are pleasant, friendly or entertaining while they tend to be biased against people who are unpleasant, unfriendly or boring. It would seem absurd to pass laws that attempt to compensate for the bias people have in favor of such people. If the analogy holds between looks and personality, then it would seem absurd to pass laws against discriminating based on appearance.

Despite these points, there is a rather significant reason to favor such a law. This reason has nothing to do with unattractiveness but rather with the notion of relevance. From a moral standpoint, to fire or otherwise mistreat a person in a professional context (for the law to cover personal relationships would be rather absurd) based on an appearance that is irrelevant to the job would seem to be unacceptable. To use a specific example, if a Hooters Girl is doing her job as a waitress, then to fire her because she weighs “too much” would be unjust. After all, as long as she is physically able to perform her job, her weight would not be relevant.

One possible reply to this is that there are certain jobs in which attractiveness would be a relevant factor. To use an obvious example, supermodels are supposed to be beautiful and it would be rather odd for someone of average or lesser appearance to argue that they are a victim of discrimination if they were not chosen to be a supermodel. To use an analogy, if a job required a great deal of physical strength or a high degree of intelligence or creativity, it would hardly be discrimination if people who lacked those attributes were not hired for such jobs.

That, it might be said, can be seen as a crucial part of the matter. If appearance is a legitimate asset and actually part of certain jobs, then to hire (or not hire) people based on appearance would not be discriminatory in these cases.

Of course, there is the concern that there should not be jobs that are based on physical appearance. Such jobs, it might be argued, are inherently discriminatory and also serve to create various problems. Feminists, for example, often present such arguments. However, it could also be argued that there should not be jobs based on other natural assets such as wit, humor, intelligence or creativity. After all, if valuing beauty is somehow wrong, then it would seem that valuing these other qualities would also be wrong.


Pleasures of Bodiless Souls


Lisa Miller recently wrote an article for Newsweek about life after death. In this article (and her book Heaven) she runs through various theories and views on this matter. One view she considers is the notion of an immortal soul. She writes:

After death, the soul—unique and indestructible—ascends to heaven to be with God while the corpse, the locus of our senses and all our low human desires, stays behind to rot. This more reasonable view, perhaps, has a serious defect: a disembodied soul attaching itself to God in heaven offers no more comfort or inspiration than an escaped balloon. Consolation was not the goal of Plato’s afterlife. Without sight or hearing, taste or touch, a soul in heaven can no more enjoy the “green, green pastures” of the Muslim paradise, or the God light of Dante’s cantos, than it can play a Bach cello suite or hit a home run. Rationalistic visions of heaven fail to satisfy.

The crux of this problem is that a bodiless soul cannot have the same experiences that an embodied soul can.

One important assumption in her assessment is that a disembodied soul cannot (or at least will not) have the same sort of experiences it had while it was embodied. However, it is easy enough to make a case that such a soul could have experiences comparable to bodily experiences. To do this, I simply need to borrow some skeptical arguments. For example, take Descartes’ classic Meditations on First Philosophy. In the first meditation he presents the dream argument, arguing that everything he experiences could be a dream and uncaused by external objects. Even more relevant to this issue, he considers that God might be causing his ideas of his physical body and an external world even though there are no such entities. This is a coherent scenario, and God could presumably do this for a bodiless soul “in” heaven. Since God would be doing this for the benefit of the soul, this would not be immoral on God’s part. In fact, God (or His agents) could make it clear what is being done. Interestingly enough, for philosophers like George Berkeley life on earth is just like this: there is naught but minds and the ideas in them. In short, on such views we are already bodiless souls. So, life in heaven as a bodiless soul could be every bit as satisfying as life on earth.

A second assumption on her part is that a bodiless soul cannot have experiences that differ from bodily experiences yet are as satisfying as (or even more satisfying than) the bodily experiences of life. While Plato’s goal was not to comfort people with his idea of the Platonic heaven, in his theory the souls are not “escaped balloons.” The souls “commune” with the forms and engage in intellectual activities. Even better, the souls are fully in the presence of the good. In Plato’s theory this seems to be a rather interesting and satisfying existence. In contrast, life on earth is far less satisfying.

Turning to a more religious view, it could be argued that being in the “presence” of God and the other souls would be immensely rewarding and satisfying. This experience would not be like what we experience on earth. It would, one might argue, be vastly different and vastly better. After all, to assume that the mere physical pleasures are all that a human can experience and enjoy is a bit like assuming that a human can only enjoy the pleasures of animals such as pigs.

True, when most people think of pleasure they do think of bodily pleasures and would, naturally enough, see heaven in that light. For example, people enjoy sex so it is natural that some folks would claim that there will be a number of virgins waiting to have sex with them in heaven. However, the fact that most people think in terms of bodily pleasures simply shows the limits of their imagination and conception of what is fine (as Aristotle would say). The fact that the many would prefer bodily pleasures on earth and in heaven hardly shows that these are the best pleasures. As Plato and others have argued, there are better pleasures than these and these pleasures could very well be enjoyed by a bodiless soul.

Of course, one might wonder whether there are souls or not and if there is a heaven.


Hitchens on Sports


While I am a professional philosopher, I am also an amateur athlete and, as such, found Hitchens’ recent article on sports to be rather…interesting.

Hitchens does make some reasonable and valid criticisms of international sports. To be specific, he does point out that international sporting events have led to serious conflicts and some rather reprehensible behavior. However, he does not stop there. He moves on to attack sportsmanship itself by pointing out bad behavior on the part of athletes and fans. He also attacks the overuse of sports metaphors in politics, complains about the coverage afforded sports, and takes the usual shots at the overemphasis of sports in major universities.

His criticism of sports does have some merit. After all, the incidents and behavior he points to are quite real. Like him, I find the excessive coverage of sports a bit tedious, and I also have been critical of how sports is often handled at the university level. However, Hitchens’ sweeping attack has a rather serious flaw, namely that he is engaging in a relentless straw man attack.

His specific form of a straw man is one that I point out to my students in my critical thinking class: one way to make a straw man of something is to focus entirely on the negative aspects of the target, while conveniently ignoring or underplaying the positive aspects. To fairly assess something, such as sports, it is important to consider the positive aspects as well. After all, focusing merely on the negatives will produce a rather distorted assessment (as would focusing only on the positive). Naturally enough, such a balanced assessment can lead to the conclusion that something is rather negative. But, at least such a conclusion would be properly justified.

This tactic is standard for Hitchens and one he routinely employs against religion. Perhaps he honestly sees the world this way and is psychologically incapable of presenting a fair assessment. Perhaps he merely uses this tool because it works as a persuasive device (while failing as a logical method). However, his motivations are (obviously enough) irrelevant to assessing his case.

To begin my reply, I am obligated to say once more that I am an athlete so as to allow people to be aware of this as a possible biasing factor in my views. I competed in high school and college and still compete today. Of course, the merit of my case has no connection to my status as an athlete; to think otherwise would be to fall victim to an ad hominem fallacy.

My main contention against his case is, as noted above, that he seems to simply ignore any positive evidence in favor of sports. While my view of sports is based on my own experiences, these still count as evidence for the positive aspects of sports.

First, my own experience as an athlete has made me a better person. My coaches always emphasized fairness, good sportsmanship and character, and they took all this very seriously. Through their guidance and through the lessons of competition I learned the importance of competing fairly, of maintaining integrity and of showing respect to my fellow athletes. I can honestly say that sports helped shape my moral character and much of what is best about me has come through sports. I am not claiming to be a saint or exceptionally good. But, I do know that my experience in sports has, as Aristotle would say, developed my virtues.

Second, my observations of my fellow athletes have shown that most of them have also benefited from sports. With some notable exceptions, the people I have competed with and against have shown good character. To see this for yourself, go to a local road race or even a large race and observe how people behave. To use just a few examples, runners will share water with their competitors, tell people they are racing against the right way to go, and even stop to help an injured competitor. People also volunteer to work at such races, often getting up very early and sometimes enduring rather tough conditions. This is hardly a sign of bad character or poor behavior. Yes, there are some people who are jerks (I’ve taken a few needless elbows to the chest, for example). But, what I have observed has generally been rather positive in character. Lest I be accused of presenting a small or biased sample, I have competed in hundreds of races ranging from the Beach to Beacon 10K to the Columbus Marathon to high school meets to college meets and so on. As such, I have a fairly broad sample to work with.

Of course, I can still be accused of presenting a biased sample. After all, my experience has been primarily with runners and often with runners I know. Perhaps running is different from other sports in significant ways. Also, there is the obvious concern about extending my experiences from this one sport to other sports. However, even if running is unusual, it does serve as a counterexample to Hitchens’ attacks on sports. Also, Hitchens can himself be accused of using a biased sample: he focuses only on the negative while ignoring the positive.

To finish up, I do agree that Hitchens makes some points well worth considering. Sports can lead to rather bad behavior and serious problems. However, this does not seem to be a quality inherent to sports. Rather, it is a problem with how people react to sports and how people behave. The fact that some athletes act badly and that some fans are true fanatics who engage in violence over sports merely serves to show their failings rather than the failings of sports. As noted above, my experiences with running have been very positive and show that sports can be something very positive. Like everything else in life, sports is largely what we make of it. Those who bring vice to sports will find it there. Those who bring virtue will find that there as well.


Vegetarians Who Eat Meat?


I recently ran across an article in Newsweek entitled “Vegetarians Who Eat Meat”, which got me thinking about two issues. The first is whether a person can be a vegetarian and also eat meat. The second is whether the way the meat animal is raised impacts the morality of eating it.

On the face of it, a vegetarian cannot eat meat and remain a vegetarian. To use an analogy, just as a bachelor cannot be married, a vegetarian cannot be a meat eater. Of course, some folks might wish to be able to call themselves “vegetarians” yet have the occasional cheeseburger. A conversation with such a person at a party might go like this:

Vegetarian: (loudly) “Does this have meat in it? I’m a vegetarian, so I want to avoid eating any meat.”

Me: “Yes, that ham salad has ham in it. That’s meat, you know. But, I’ve seen you eat meat recently, like that cheeseburger you had the other day.”

Vegetarian: “Well, I do have a little meat now and then. But I’m still a vegetarian.”

Me: “Ah. I know some people who practice abstinence that way: they only have a little sex now and then.”

But perhaps being a vegetarian is not like being abstinent, but rather like being honest. An honest person does not stop being honest just because they tell a fib now and then. What matters is that such a person is mostly honest. As such, perhaps being a vegetarian is like being honest: vegetarians do not have to avoid meat at all times to justly keep the label; they just have to do so the majority of the time.

Also, there are many variations on the vegetarian theme, so a person could (with a suitable category choice) be a vegetarian and still consume meat. This, of course, does lead to some questions about what it means to be a vegetarian if people can claim that title despite consuming meat. But, as I see it, as long as they are not too self-righteous about it there is no harm in letting them enjoy their self-applied title.

I’ll address the second issue in my next blog post.


Emotion and Ethics

Paging through last week’s Newsweek, I came across Sharon Begley’s article “Adventures in Good and Evil.” I found the article rather interesting and, shockingly enough, have some things to say about it.

Begley accepts a current popular view of ethics: it is rooted in evolution and grounded in emotions.  She briefly runs through the stock argument for the claim that morality is an evolved behavior. Roughly put, the argument is that our primate relatives show what we would consider altruistic behavior (like helping each other or enduring hardship to avoid harming others of their kind). Naturally, the primates are more altruistic with their relatives. It is assumed that our primate ancestors had this same sort of behavior and it helped them survive, thus leading to us and our ethical behavior.

Perhaps this “just so” story is true.  Let us allow that it is.

Begley then turns to the second assumption, that ethics is more a matter of “gut emotion” than “rational, analytic thought.” Using a stock Philosophy 101 example, she writes:

“If people are asked whether they would be willing to throw a switch to redirect deadly fumes from a room with five children to a room with one, most say yes, and neuroimaging shows that their brain’s rational, analytical regions had swung into action to make the requisite calculation. But few people say they would kill a healthy man in order to distribute his organs to five patients who will otherwise die, even though the logic—kill one, save five—is identical: a region in our emotional brain rebels at the act of directly and actively taking a man’s life, something that feels immeasurably worse than the impersonal act of throwing a switch in an air duct. We have gut feelings of what is right and what is wrong.”

Begley’s reasoning is, of course, that since the logic is identical, it follows that the different judgments in the cases must be based in emotion rather than reason. While her view is reasonable, I disagree with her on two points:  I believe that the logic is not actually identical and that her explanation of the distinction between the two cases is mistaken. Obviously enough, I need to make a case for this.

While the logic of the two cases is similar, the logic only becomes identical if the cases are considered in a rather abstract manner. To be specific, the logic is identical if we only consider that the agent is choosing between five deaths or one. If this fact were the only morally relevant fact about the situations, then the logic would indeed be identical (because the situations would be identical). However, there certainly seem to be morally relevant distinctions between the two cases.

One obvious distinction is the oft-discussed one between killing and letting die. In the first case, the agent has a role to play in who dies. However, the agent is not killing the children. Rather, s/he is deciding who the gas will kill. In the second case, if the agent does nothing, then s/he lets five people die. If s/he acts, then s/he kills one person in order to save the five. Since this distinction has been discussed at great length by other philosophers I will not go beyond saying that it is reasonable to take this to be a morally relevant distinction. Hence, it is reasonable to consider the possibility that the cases are not identical, and hence that the logic is not identical. If this is the case, then the distinction in the positions need not be explained in terms of a gut reaction; it could be the result of a rational assessment of the moral distinction between killing and letting die.

Another matter worth considering in regards to the logic is that of moral theories. When I teach my ethics class, I use the same sort of examples that Begley employs: I contrast a case in which the agent must choose who dies with a case in which the agent must choose between killing and letting die. Naturally enough, I use a case like Begley’s first case to illustrate how our moral intuitions match utilitarianism: if we cannot save everyone, then we are inclined to choose more over less. However, I do not use the second case to illustrate that ethics is a matter of a gut reaction. Rather, I use it to show that we also have moral intuitions that in some cases it is not the consequences that matter. Rather, we have intuitions that certain actions “just aren’t right.” Naturally, I use this sort of example in the context of discussing deontology in general and Kant’s moral theory in particular. In the case at hand, it need not be a gut reaction that causes the agent to balk at killing an innocent person so as to scrap him for parts. On Kant’s view, reason would inform the agent that he must treat rational beings as ends and not simply as means. To murder a man for his organs, even to save five people, would be to treat him as a means and not an end. Hence, it would be an immoral action. There is, obviously enough, no appeal to the gut here, and the logic of the cases would be different.

Other moral approaches would also ground the distinction without an appeal to the gut. For example, my religious students often point out that murdering someone would be an evil act because it violates God’s law. In this case, the appeal is not to the gut but to God’s law. As another example, a rule-utilitarian approach would also ground the distinction. After all, the practice of murdering people to use as parts would create more unhappiness than happiness, since people would worry that they would be the next person being cut to pieces. In both of these examples the logic of the two cases is not identical and there is no appeal to the gut.

Naturally, it is reasonable to consider the role of emotions in moral decision making. Obviously, most people feel bad about murder and this no doubt plays a role in their view of the second case. However, to simply assume that the distinction is exhausted by the emotional explanation is clearly a mistake. After all, a person can clearly regard murdering one person to save five as immoral without relying on a gut reaction. It could, in fact, be a rational assessment of the situation.

The Leadership Lid?

Anna Quindlen recently wrote an article about the leadership lid. Her thesis is that America is making an error in not using her greatest natural resources: women leaders. I found much to agree with in her article, but I also found much with which I wish to contend. Naturally, the purpose of this blog is to assess her case.

Quindlen begins by addressing one obvious reply to the claim that America is not making use of women leaders: women seem to be doing quite well. In fact, Quindlen has her own regular feature in Newsweek and Sarah Palin is the Republican VP candidate. In reply to this view, Quindlen asserts that she and women like Palin are show ponies who are “trotted out” to send the message that women are doing well. She regards this as a deception.

Second, she addresses three apparent signs that women are doing well: most Americans accept the idea of women leading, leadership positions are open to women and many more women are entering professional fields (and hence will rise to the top over the years). She regards these signs as being deceptive as well.

To back up her claim that these alleged signs are mere deceit, she relies on data from the White House Project’s Corporate Council. According to this source, there is a leadership lid: on average, women make up 20% of the leaders in America (political, business, military, etc.). Since women are (about) 51% of the population, this is taken to indicate a problem.

It should be noted that the White House Project is dedicated to advancing women’s leadership. While it would be a fallacy to assert that this makes their claims false, it does provide grounds for being skeptical. After all, any group with an agenda should be subject to an extra degree of scrutiny. I am not asserting that their numbers are mistaken, just that they have a strong potential bias when assembling and assessing data.

Despite this concern, let it be granted that women make up only 20% of the leadership in America. While this might seem problematic, this need not be a matter of concern. After all, if women are freely making life choices that do not lead them to be 50% of the leadership, then there would seem to be no problem to be worried about (unless, of course, you think that women should be choosing to be leaders).

To use an analogy, consider my gaming group. The group is, coincidentally enough, 20% female. However, it should not be inferred that there is any sexism or unfairness involved in this. The group is open to all gamers regardless of gender. However, the fact is that most women do not find such games (like Dungeons & Dragons and Call of Cthulhu) very appealing and hence most women do not play them. Since this is a matter of free choice, there is nothing wrong with the fact that my group is only 20% female. Obviously, if I and the other male gamers took steps to limit the involvement of women simply because they are women, then sexism would be in play.

Quindlen, well aware of this fact, turns to considering the factors that keep women out of leadership roles. She makes use of an article in the Harvard Business Review by Alice Eagly and Linda Carli. According to them, women face a “labyrinth of leadership full of twists and turns.” Of course, getting into leadership positions is not easy. While most women are not in leadership positions, neither are most men. Naturally enough, the question arises as to whether everyone faces roughly the same labyrinth or whether women are confronted with one that is more dire and difficult than the one that stands between men and leadership positions.

In considering this, Quindlen brings up the stock problems that women face: women are supposed to face more burdens at home and in the family. Of course, this would not apply to single women who do not have children and the same challenges would apply to men who are single parents or have to bear the majority of the burdens in the home. Perhaps it is the case that women still, on average, bear a greater burden than men. If this is the case, then women must accept some of the blame: they are allowing men to put this burden on them. As such, I would suggest to single women that they steer clear of men who will burden them in this manner. Women who are already burdened should do what it takes to get a more equitable division of labor. When I was married and my (now ex) wife was finishing her PhD, I did all the cleaning and housework with the exception of her laundry (I can’t be trusted with complex laundry) and grocery shopping (carnivores cannot be trusted shopping for vegetarians). While I might be unusual, most of my married male friends do a large amount of housework and child care. Hence, it seems evident that men can do their share (and more, in some cases). So, women need to show leadership in getting men to do more.

Quindlen next turns to a standard maneuver in the gender dispute: women are actually better than men, but men somehow twist things so they are in charge. She begins by considering the results of a Pew Research Center survey about leadership traits. Naturally, the survey ranked women higher than men in almost all these traits. Oddly enough, the majority of respondents ranked men and women as equally qualified to lead.

There is a fairly obvious explanation for this disparity: most Americans have been trained to say that men and women are equal. However, Quindlen presents an alternative explanation: men are judged by male standards (control and strength), while female leaders are judged both by the male standards and by the stereotypical standard applied to women (mostly involving social skills).

Her reply does raise an excellent question: how should potential leaders be judged and selected? Men seem to be better at getting into leadership positions, which might itself be a sign of leadership. After all, a clear mark of leadership is that people accept you as a leader. Laying that aside, if she is right, should women be judged the same as men by removing that third standard? Or should a different set be selected? If so, should it be based on traits that women are supposed to excel in over men? In short, should the standards be switched from an allegedly male-biased set to a female-biased set? My thought is that we should try to find the qualities that would objectively make for a good leader and use those. Obviously enough, the way leader selection really works would almost always ruin that approach, but starting with high standards means that corruption will drag things a bit less low.

Quindlen then turns to the case of Sarah Palin. While people point to Palin as a sign that women are in positions of leadership, Quindlen takes this as just another piece of evidence that there is a lack of women leaders. Palin, she contends, was simply dragged in “to fill a vacuum for the convenience of men.”

Given her view, it is not clear what would count as evidence that women are moving into positions of leadership. After all, if each example can be dismissed as yet another ploy on the part of the patriarchy, then how can we tell if any improvement is being made?

Quindlen then finishes with another standard argument: if women occupied more positions of leadership, things might be better and perhaps we would not be facing the problems we are facing now. She asks: “if women made up half the leadership of that industry, half the members of Congress, half the overseers in government agencies, might it have ended differently? If women led in proportion to their numbers, would things be better?”

Obviously, she most likely thinks the answer to these questions would be “yes.” Fortunately, these questions can be empirically examined. While women are 20% of the leadership, we can examine the current female leaders and see how they have dealt with problems. If they averaged better than comparable men, then there would be a good case for the superiority of female leadership. On the face of it, the female leaders do not seem to have done a better job on average. For example, Nancy Pelosi has been Speaker of the House and the House has done dismally. As another example, Rice has been Secretary of State and does not seem to have done a superior job. I think the evidence is that women can be just as good and just as bad as men when it comes to leadership. As such, I do not endorse merely getting more women into leadership positions out of a hope that their being women will make things better.

I do, however, agree with her point that excluding women from leadership can be a terrible waste of talent. If less competent men are occupying leadership positions that could be occupied by more competent women, then things are worse than they should be. If leadership positions should be assigned based on merit, then this situation would also be morally wrong in that regard. In general, we would be better off if the best people were able to become leaders. If competent women are being unfairly kept out of such positions, then action should and must be taken, if only out of the selfish desire to get better leaders working on the problems we face. As such, less competent men should be removed to make way for more competent replacements: male or female.

Human Studies & Experiments

Newsweek’s Sharon Begley recently wrote an article lamenting the red tape and paternalism interfering with research on humans.

In her article, she presents three main factors that impede such experiments.

First, she notes that the university panels that oversee human experimentation tend to be overprotective. As an example, she cites the restrictions placed on Scott Atran’s research regarding why people become terrorists.  He was not permitted to ask captured terrorists personal questions because this was regarded as violating their right to privacy.

Second, she points out that human studies and experiments do not have the “sex” appeal of basic science because they are not cutting edge or innovative enough. Of course, scientists are generally not permitted to do anything cutting edge with human subjects and this ensures that human research will be less “sexy.”

Third, she finishes with a common problem in academics: people in one field sometimes fail to see the value of research in another field and hence can be inclined to deny requests for experiments.

Not surprisingly, Begley sees these problems as interfering with important research. Her article also raises points that are philosophically interesting.

While the problems she presents are matters of concern, review panels for research are extremely important from a moral perspective. From a utilitarian standpoint, they can be justified because they serve to protect people from various harms. Sadly, there are numerous examples in which research was conducted on human beings without such oversight. In the United States, one of the more infamous examples is the Tuskegee syphilis experiment, in which 399 men were deceived and left untreated. Cases such as this one show the clear need for careful review and regulation of human research.

Naturally, it might be argued that allowing unrestricted experimentation on humans would create more good than harm in general terms. After all, without such restrictions medical experimentation could be conducted more rapidly and perhaps more effectively. This would lead to more and better cures, thus outweighing any damage done to the test subjects. However, history seems to show that unrestricted research tends to have the opposite effect: it often harms human subjects with little or no positive return. The infamous Japanese experiments in the Second World War provide disturbing examples of this.

A more reasonable approach would be to retain the review of human research while taking steps to ensure that only legitimate moral concerns are taken into account.

Take, for example, the research on why terrorists become terrorists. While there should be clear moral limits regarding what can be done to captured terrorists, it does not seem reasonable to be overly concerned with their right to privacy in personal matters. Presumably Atran had no intent to threaten or torture the prisoners if they failed to answer his questions. Hence, if they wished to maintain their privacy, they could do so by simply not answering his questions. Further, the potential benefits of his research seem to outweigh concerns about the privacy rights of imprisoned terrorists. If asking them personal questions could help reduce terrorism in the world, then this seems to be a reasonable research request to grant.

Of course, there is the concern that prisoners cannot provide informed consent and that they might say things that would result in additional prosecution. The first concern is legitimate. After all, if a person cannot provide such informed consent, then making them research subjects seems morally suspect at best. Imprisoned terrorists might believe that they have no choice but to participate or might mistake the researcher for someone who has been sent to interrogate them. While these are legitimate concerns, there seem to be clear ways around them. The prisoners could be clearly informed as to what the researcher intends and assured that their involvement is completely voluntary. Provided that such steps are taken, it would seem that informed consent could be provided even by imprisoned terrorists.

The second concern has some legitimacy in that the researcher could indirectly harm the subjects should the subjects reveal things that would result in further prosecution. This is based on the legitimate concern that potential harms to the subjects should always be taken into account when research is being conducted.

In the case of imprisoned terrorists it is tempting to say that it would be good if they revealed new information to the researcher. After all, if plans for an attack were revealed, then the attack could be thwarted and lives saved. Or, if the subjects were revealed to be involved in other crimes, then they should be punished for them. While there are good grounds to believe in a right to privacy, this right does not seem to extend to concealing past or planned misdeeds.

Obviously, not every case will be like the imprisoned terrorists case. However, the review of any case can benefit from the proper application of practical and moral reasoning. The challenge is developing the ethical and practical guides in such a way that the results are correct more often than not. This is, of course, part of the general challenge of getting through life. This, obviously enough, assumes that there are better and worse results.

In regards to the fact that human research is not as “sexy” as basic research, it seems that something should be done to change that perspective. While basic research is important, human research is also critical as well. Perhaps Justin Timberlake can help researchers with this.

More seriously, the importance of human research can be stressed and this might help improve its image among those who dole out grants.

Finally, there is the matter of how academics in one field sometimes fail to see the value of research in other fields. While I have not researched this matter rigorously, I have been a professor for quite some time and have had opportunities to observe and discuss this matter.

In some cases, academics are reasonably well informed about other fields and do their homework when they are called upon to make judgments about such fields. In other cases, academic folks are woefully ignorant of other fields and can even regard them with unconcealed contempt.

While professors and other academic types are often rather busy, it seems reasonable to expect people on such panels to take the time to gain at least some basic grasp of the research they are assessing. Also, steps can and should be taken to build understanding across the disciplines. While a person cannot be expected to master yet another field, it is possible to develop some basic understanding of and sympathy for other fields. This would be beneficial in the context of these review panels and in general.

With such improved understanding, panel members can better review proposals. This will make it more likely that beneficial research projects will gain approval and this would be a gain for humanity. Obviously, the panels will still reject some proposals but at least it will be more likely that the rejection will be for legitimate reasons.

Umbrage & The Web

Jonathan Alter of Newsweek recently wrote a column on umbrage and the web. While I agree with some of his claims, the article does require a response. As such, I will reply to his main points and offer both commentary and criticism.

Alter begins with a common theme: the umbrage that is present on the web. As Alter notes, the web provides an anonymous vehicle for lies, crudeness and degradation. Of course, the use of the written (or typed) word as a vehicle of umbrage is nothing new. While I am not a professional historian, as a philosophy professor I research the times and backgrounds of many philosophers. Based on what I have learned over the years, I can assure you that umbrage has been with humanity since we started writing things down. Interestingly, after I read Alter’s article this morning, I saw a show on the History Channel about two rival Chinese gangs who wrote slurs against each other in the American newspapers during the 1800s. I later read an article in the June 2008 Smithsonian about Darwin (Richard Conniff, “On the Origin of a Theory”, 86-93). The article noted some of the written sniping between various people regarding the concept of evolution. Before Darwin published his work, Robert Chambers wrote Vestiges of the Natural History of Creation in 1845. One geologist replied to the work by expressing his desire to stamp “with an iron heel upon the head of the filthy abortion, and put an end to its crawlings” (page 90). That is an eloquent bit of umbrage, every bit as venomous as the comments inflicted on the web today. Of course, it does not quite match the concise wit of “boitch u r teh suckz.”

If one turns to politics, examples of venom throughout history are far too numerous to list. For those who wish to search for examples, I suggest beginning with political cartoons from the 1700s and 1800s. You will find that the poison pens of old crafted many venomous cartoons. Other excellent sources are the various anonymous political tracts from the same time period. As such, umbrage and venom in print are nothing new.

Like Alter, I believe that the umbrage and venom are negative and undesirable. Such venom adds nothing to the quality of discussions and simply serves to inflame emotions to no good end. It also encourages intellectual sloppiness because people feel that they have made an adequate reply when they have merely vented their spleens (to use the old phrase).

Alter next turns to a matter of significant concern: while bloggers offer a great deal of commentary, they rarely provide people with news in the true sense. While some blogs do post the news, it is (as Alter points out) generally taken from some traditional media source. Newspapers and other traditional media sources, as he notes, are currently laying off reporters due to financial problems. This means that there will be less original investigation and reporting. Fortunately, some bloggers are stepping in and doing their own investigations. I suspect that this might lead to the more substantial blogging sites gradually stepping into the openings created by the decline of traditional print media. Of course, there is the obvious question of whether a web based organization can afford to do robust investigation and reporting. In principle, however, there seems to be no reason why they cannot partially replace traditional print media.

A third point made by Alter is that print media is moving towards the web’s style of writing. To be specific, there is a push towards short articles like those in blogs. Presumably this is to match the alleged shorter attention span of the modern audience. I do agree with Alter that there can be a negative side to taking this approach. While a short piece can be fine, there is still a clear need for depth and details and this requires more than a blog entry sized block of text. As you can see from most of my own blogs, I tend to go on at considerable length. Hence, it is hardly shocking that I would support him in this matter.

A fourth point that Alter makes is the very common criticism that people exploit the anonymity of the web to launch attacks and spew venom. This is, of course, a concern. However, this is nothing new. History is full of examples of anonymous writings that are quite critical and venom filled. The web merely makes it easier to make such works public and to avoid being identified. After all, if I have to print and distribute an anonymous tract, there will be a fairly clear trail leading back to me. But, on the web I can easily make use of a free service that ensures my identity will remain unknown by making my posting effectively untraceable.

As Alter points out, the “web culture” tolerates anonymity. However, many writers do identify themselves and people are often quite critical of those who hide behind anonymity when they spew forth venom. While there can be good reasons to hide one’s identity (such as fear of reprisals from oppressive governments), most people lack a legitimate reason to remain hidden. My view is that if someone believes what she is typing, then she should have enough courage to actually claim her own words. There is also the matter of courtesy. Anonymous posting is like talking to people while wearing a mask. That is a bit rude. Unless, of course, you happen to be a superhero.

His fifth point is that people often prefer rumors to facts. As he points out, some people believe the emails about Obama being a Muslim and similar such things. What is new here is not that people often prefer rumors, but the delivery mechanism of the rumors. In the past, people had to rely on newspapers, gossip, and public broadsheets in order to learn of rumors. Today, rumors can be sent via email. As such, we have the same sort of rumors using a different medium.

Since I teach critical thinking, I am well aware that people prefer a rumor that matches their biases over truth that goes against them. I am also well aware that people generally prefer something dirty, juicy, or titillating over dull facts. Hence, the appeal of rumors is hardly surprising. Obviously, people should have better rumor filters so as to avoid believing false things (or even true things on the basis of inadequate evidence). The internet has just changed the medium and not the basic problem: most people are poor critical thinkers. Fixing this requires what philosophers have been arguing for since before Socrates: people need to learn to think in a critical manner.

Alter’s sixth point is about a commonly remarked upon phenomenon: the internet (email and web comments) seems to be especially awash in venom. As noted above, this is nothing new. However, as Alter points out, the web and email lead to disinhibition. While he does not explore the reasons for this, there are three plausible causes. First, email and web comments are effectively instant. With a written letter, you have time to think about it as you put it in the envelope and go to mail it. During this time you might think better of what you said. With an email or web comment, you just push a button and it is done. Second, email and web comments are generally not edited. Professional newspapers and magazines are edited, and hence venomous comments generally do not get into print. Since people know that what they type on the web will appear unedited, they are less inclined to be restrained; hence, the web seems like a more venomous place. Naturally, this feeds the beast: when people see the first venomous remark, they are (like someone who sees trash already on the ground) more inclined to follow suit. Third, the web allows for anonymous posting and emailing, so people can (as noted above) spew from behind a mask. This, naturally enough, encourages people to be less nice.

Some web sites deal with this problem by reviewing comments before publishing them. On the plus side, this does help filter out some of the venom. On the minus side, such editing does tend to interfere with the freedom of expression. It is, obviously enough, very tempting for an editor to delete comments because she disagrees with the contents. Of course, this approach does not deal with the main causes of the problem: poor impulse control, poor ethics and poor reasoning skills.

Philosophers have been trying to deal with these problems for centuries. Aristotle provided some of the best advice on how to deal with poor impulse control and poor ethics in his Nicomachean Ethics. Of course, most people do not seem very inclined to follow that advice. Almost all philosophers have tried to encourage people to work on their reasoning skills. However, this has not met with great success. Until more people have better impulse control, better ethics and better reasoning skills, the deluge of venom can be expected to continue.

Alter’s seventh point is the usual lamentation about how the web was supposed to bring us breadth in coverage but did not live up to the dream. As he notes, bloggers tend to mainly follow right along with the cable networks. For example, as the American financial system was taking serious hits, most bloggers and the cable news focused mainly on the “satirical” Obama cover on the New Yorker.

Obviously, this behavior is hardly shocking. Bloggers do the same thing the traditional media does: they focus on the stories they think people will want to hear about. While they can be criticized for pandering to the masses, the masses should also be criticized for wanting such things. When my students ask me why the media focuses on the sensational over the substantive, I provide the easy and obvious answer: the media gives people what they want. Thus, in order to have more substantial coverage, people would need to switch their desire from what is sensational to what is substantive. Good luck with that.

That said, there is actually significant breadth in the realm of blogs. If you leave the mainstream blogs and search around a bit, you will easily find blogs on a vast array of topics. For example, there are many blogs devoted to philosophical issues (such as this one). As another example, there are blogs devoted to science. These bloggers do not blindly follow the main media. This, obviously, means that they do not get as much attention as the bloggers who stick with the mainstream. As such, much of the perceived lack of breadth is merely a lack of looking.