
A Shooting in South Carolina

While the police are supposed to protect and serve, recent incidents have raised grave concerns about policing in America. I am, of course, referring to the killing of unarmed black men by white police officers. In the most recent incident, Patrolman Michael Thomas Slager shot Walter Lamer Scott to death after what should have been a routine traffic stop. What makes this case unusual is that there is video of the shooting. While the video does not show what happened before Scott started to flee, it clearly shows that Scott is no threat to Slager: he is unarmed and running away. Police are not allowed to shoot a suspect merely for fleeing. The video also shows Slager dropping an object by Scott’s body—it appears to be Slager’s Taser. When Slager called in the incident, he described it as a justifiable shooting: Scott grabbed his Taser and he had to use his service weapon. Obviously, Slager was unaware that he was being recorded as he shot the fleeing Scott.

Since I am friends with people who are ex-law enforcement (retired or moved on to other careers) I have reason to believe that the majority of officers would not engage in such behavior. As such, I will not engage in a sweeping condemnation of police—this would be both unjust and unfounded. However, this incident does raise many concerns about policing in the United States.

As noted above, what makes this incident unusual is not that a situation involving a black man and white officer escalated. It is also not very unusual that a black man was shot by a police officer. What is unusual is that the incident was videotaped and this allowed the public to see what really happened—as opposed to what was claimed by the officer. If the incident had not been recorded, this most likely would have gone down as the all-too-common scenario of a suspect attacking a police officer and being shot in self-defense. The tape, however, has transformed it from the usual to the unusual: a police officer being charged with murder for shooting a suspect.

Since I teach critical thinking, I am well aware that the story of one incident, however vivid, is but an anecdote. I am also well aware that to generalize broadly from one such incident is to commit the fallacy of hasty generalization. That said, the videotape does provide legitimate grounds for being suspicious of other incidents in which suspects have been shot while (allegedly) trying to attack an officer. Since we know that it has happened, we clearly know that it can happen. The obvious and rather important concern is the extent to which this sort of thing has happened. That is, what needs to be determined is the extent to which officers have engaged in legitimate self-defense and the extent to which officers have engaged in murder.

This videotape shows, rather dramatically, that requiring police to use body cameras is a good idea—at least from the standpoint of those who believe in justice. People are, obviously enough, somewhat less likely to act badly if they know they are being recorded. There is also the fact that there would be clear evidence of any misdeeds. The cameras would also benefit officers: such video evidence would show when the use of force was legitimate, thus helping to reduce suspicions. As it stands, we know that at least one police officer shot down a fleeing suspect who presented no threat. This, naturally enough, motivates suspicion about all shootings (and rightly so). The regular use of body cameras could be one small contribution to addressing legitimate questions about use of force incidents.

What is also usual about this incident is that there has been a focus on the fact that Scott had a criminal record and legal troubles involving child support. This is presumably intended to show that Scott was no angel and perhaps to suggest that the shooting was, in some manner, justified or, at the very least, not as bad as one might think. After all, the person killed was a criminal, right? However, Scott’s background has no relevance in this incident: his having legal troubles in the past in no manner justifies the shooting.

What was also usual was the reaction of Bill O’Reilly and some of the other fine folks at Fox, which I learned about from Professor Don Hubin’s reaction and criticism. Rather than focusing on the awfulness of the killing and what it suggests about other similar incidents, O’Reilly’s main worry seems to be that some people might use the killing to “further inflame racial tensions” and he adds that “there doesn’t seem to be, as some would have you believe, that police are trying to hunt down black men and take their lives.” While this is not a claim that has been seriously put forth, O’Reilly endeavors to “prove” his claim by engaging in a clever misleading comparison. He notes that “In 2012, last stats available, 123 blacks were killed by police 326 whites were killed.” While this shows that police kill more whites than blacks, the comparison is misleading because O’Reilly leaves out a critical piece of information: the population is about 77% white and about 13% black. This, obviously enough, sheds a rather different light on O’Reilly’s statistics: they are accurate, yet misleading.
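The misleading nature of O’Reilly’s raw counts becomes clear once they are converted into per-capita rates. The following sketch does that arithmetic using the figures above (123 and 326 killings, roughly 13% and 77% population shares); the assumed 2012 U.S. population of about 313 million is my own approximation, not a figure from the article:

```python
# Per-capita comparison of the 2012 police-killing figures cited above.
# The population shares (~77% white, ~13% black) and a 2012 U.S.
# population of roughly 313 million are approximations.

US_POPULATION_2012 = 313_000_000

killed = {"white": 326, "black": 123}
share = {"white": 0.77, "black": 0.13}

# Killings per million people in each group.
rate_per_million = {
    group: killed[group] / (share[group] * US_POPULATION_2012) * 1_000_000
    for group in killed
}

print(rate_per_million)
# The black rate works out to more than twice the white rate,
# which is why the raw counts alone are misleading.
```

Under these assumptions the per-capita rate for blacks is roughly 3 per million versus roughly 1.4 per million for whites, so the raw counts invert the picture the proportions actually show.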

Naturally, it might be countered that blacks commit more crimes than whites and thus it is no surprise that they get shot more often (when adjusting for population proportion) than whites. After all, one might point out, Scott did have a criminal record. This reply has a certain irony to it. After all, people who claim that blacks are arrested (and shot) at a disproportionate level claim that the police are more likely to arrest blacks than whites and focus more on policing blacks. As evidence that blacks commit more crimes, people point to the fact that blacks are more likely (adjusting for proportions) than whites to be arrested. While one would obviously expect more blacks to be arrested if they committed more crimes (proportionally), to assume what is in doubt (that policing is fair) as evidence that it should not be doubted seems to involve reasoning in a circle.

O’Reilly also raised a stock defense for when bad things are done: “You can’t … you can’t be a perfect system. There are going to be bad police officers; they’re going to make mistakes; um .. and then the mistakes are going to be on national television.” O’Reilly engages in what seems to be a perfectionist fallacy: the system cannot be perfect (which is true), therefore (it seems) we should not be overly concerned that this could be evidence of systematic problems. Or perhaps he just means that in an imperfect system one must expect mistakes such as an officer shooting a fleeing suspect to death. O’Reilly also seems rather concerned that the mistakes will be on television—perhaps his concern is, as I myself noted, that people will fall victim to a hasty generalization from the misleading vividness of the incident. That would be a fair point. However, the message O’Reilly seems to be conveying is that this incident is (as per the usual Fox line) an isolated one that does not indicate a systemic problem—despite the fact that these “isolated” incidents happen with terrible regularity.

I will close by noting that my objective is not to attack the police. Rather, my concern is that the justice system is just—that is rather important to me. It should also be important to all Americans—after all, most of us pledged allegiance to a nation that offers liberty and justice to all.

My Amazon Author Page

My Paizo Page

My DriveThru RPG Page

Follow Me on Twitter

Better than the Truth

Fountain of Youth Park. (Photo credit: Wikipedia)

While my adopted state of Florida has many interesting tales, perhaps the most famous is the story of Juan Ponce de León’s quest to find the fountain of youth. As the name suggests, this enchanted fountain was supposed to grant eternal life to those who drank of (or bathed in) its waters.

While the fountain of youth is regarded as a mere myth, it turns out that the story about Juan Ponce de León’s quest is also a fiction. And not just a fiction—a slander.

In 1511, or so the new history goes, Ponce was forced to resign his post as governor of Puerto Rico. King Ferdinand offered Ponce an opportunity: if he could find Bimini, it would be his. That, and not the fountain of youth, was the object of his quest. In support of this, J. Michael Francis of the University of South Florida claims that the documents of the time make no mention of a fountain of youth. According to Francis, a fellow named Gonzalo Fernández de Oviedo y Valdés disliked Ponce, most likely because of the political struggle in Puerto Rico. Oviedo wrote a tale in his Historia general y natural de las Indias claiming that Ponce was tricked by the natives into searching for the fountain of youth.

This fictional “history” stuck (rather like the arrow that killed Ponce) and has become a world-wide legend. Not surprisingly, my adopted state is happy to cash in on this tale—there is even a well at St. Augustine’s Fountain of Youth Archaeological Park that is rather popular with tourists. There is considerable irony in the fact that a tale intended to slander Ponce as a fool has given him a form of ongoing life—his fame is due mostly to this fiction. Given the success of the story, it might be suspected that this is a case where the fiction is better than the truth. While this is but one example, it does raise a general philosophical matter regarding truth and fiction.

From a moral and historical standpoint, the easy and obvious answer to the general question of whether a good fiction is better than a truth is “no.”  After all, a fiction of this sort is a lie and there are the usual stock moral arguments as to why lying is generally wrong. In this specific case, there is also the fact (if the story is true) that Oviedo slandered Ponce from malice—which certainly seems morally wrong.

In the case of history, the proper end is the truth—as Aristotle said, it is the function of the historian to relate what happened. In contrast, it is the function of the poet to relate what may happen. As such, for the moral philosopher and the honest historian, no fiction is better than the truth. But, of course, these are not the only legitimate perspectives on the matter.

Since the story of Ponce and the fountain of youth is a fiction, it is not unreasonable to also consider it in the context of aesthetics—that is, its value as a story. While Oviedo intended for his story to be taken as true, he can be considered an artist (in this case, a writer of fiction and the father of the myth). Looked at as a work of fiction, the story does relate what may happen—after all, it certainly seems possible for a person to quest for something that does not exist. To use an example from the same time, Orellana and Pizarro went searching for the legendary city of El Dorado (unless, of course, this is just another fiction).

While it might seem a bit odd to take a lie as art, the connection between the untrue and art is well-established. In the Poetics, Aristotle notes how “Homer has chiefly taught other poets the art of telling lies skillfully” and he regards such skillful lies as a legitimate part of art. Oscar Wilde, in his “New Aesthetics,” presents as his fourth doctrine that “Lying, the telling of beautiful untrue things is the proper aim of Art.” A little reflection does show that they are correct—at least in the case of fiction. After all, fiction is untrue by definition, yet is clearly a form of art. When an actor plays Hamlet and says the lines, he pours forth lie after lie. The Chronicles of Narnia are also untrue—there is no Narnia, there is no Aslan and the characters are made up. Likewise for even mundane fiction, such as Moby Dick. As such, being untrue, or even a lie in the strict sense of the term, does not disqualify a work from being art.

Looked at as a work of art, the story of the fountain of youth certainly seems better than the truth. While the true story of Ponce is certainly not a bad tale (a journey of exploration ending in death from a wound suffered in battle), the story of a quest for the fountain of youth has certainly proven to be the better tale. This is not to say that the truth of the matter should be ignored, just that the fiction would seem to be quite acceptable as a beautiful, untrue thing.

 


Who Decides Who is Muslim?

Faithful praying towards Makkah; Umay... (Photo credit: Wikipedia)

When discussing ISIS, President Obama refuses to label its members as “Islamic extremists” and has stressed that the United States is not at war with Islam. Not surprisingly, some of his critics and political opponents have taken issue with this and often insist on labeling the members of ISIS as Islamic extremists or Islamic terrorists.  Graeme Wood has, rather famously, argued that ISIS is an Islamic group and is, in fact, adhering very closely to its interpretations of the sacred text.

Laying aside the political machinations, there is a rather interesting philosophical and theological question here: who decides who is a Muslim? Since I am not a Muslim or a scholar of Islam, I will not be examining this question from a theological or religious perspective. I will certainly not be making any assertions about which specific religious authorities have the right to say who is and who is not a true Muslim. Rather, I am looking at the philosophical matter of the foundation of legitimate group identity. This is, of course, a variation on one aspect of the classic problem of universals: in virtue of what (if anything) is a particular (such as a person) of a type (such as being a Muslim)?

Since I am a metaphysician, I will begin with the rather obvious metaphysical starting point. As Pascal noted in his famous wager, God exists or God does not.

If God does not exist, then Islam (like all religions that are based on a belief in God) would have an incorrect metaphysics. In this case, being or not being a Muslim would be a social matter. It would be comparable to being or not being a member of Rotary, being a Republican, a member of Gulf Winds Track Club or a citizen of Canada. That is, it would be a matter of the conventions, traditions, rules and such that are made up by people. People do, of course, often take this made up stuff very seriously and sometimes are quite willing to kill over these social fictions.

If God does exist, then there is yet another dilemma: God is either the God claimed (in general) in Islamic metaphysics or God is not. One interesting problem with sorting out this dilemma is that in order to know if God is as Islam claims, one would need to know the true definition of Islam—and thus what it would be to be a true Muslim. Fortunately, the challenge here is metaphysical rather than epistemic. If God does exist and is not the God of Islam (whatever it is), then there would be no “true” Muslims, since Islam would have things wrong. In this case, being a Muslim would be a matter of social convention—belonging to a religion that was right about God existing, but wrong about the rest. There is, obviously, the epistemic challenge of knowing this—and everyone thinks he is right about his religion (or lack of religion).

Now, if God exists and is the God of Islam (whatever it is), then being a “true” member of a faith that accepts God, but has God wrong (that is, all the non-Islam monotheistic faiths), would be a matter of social convention. For example, being a Christian would thus be a matter of the social traditions, rules and such. There would, of course, be the consolation prize of getting something right (that God exists).

In this scenario, Islam (whatever it is) would be the true religion (that is, the one that got it right). From this it would follow that the Muslim who has it right (believes in the true Islam) is a true Muslim. There is, however, the obvious epistemic challenge: which version and interpretation of Islam is the right one? After all, there are many versions and even more interpretations—and even assuming that Islam is the one true religion, only the one true version can be right. Unless, of course, God is very flexible about this sort of thing. In this case, there could be many varieties of true Muslims, much like there can be many versions of “true” runners.

If God is not flexible, then most Muslims would be wrong—they are not true Muslims. This then leads to the obvious epistemic problem: even if it is assumed that Islam is the true religion, then how does one know which version has it right? Naturally, each person thinks he (or she) has it right. Obviously enough, intensity of belief and sincerity will not do. After all, the ancients had intense belief and sincerity in regard to what are now believed to be made up gods (like Thor and Athena). Going through books and writings will also not help—after all, the ancient pagans had plenty of books and writings about what we regard as their make-believe deities.

What is needed, then, is some sort of sure sign—clear and indisputable proof of the one true view. Naturally, each person thinks he has that—and everyone cannot be right. God, sadly, has not provided any means of sorting this out—no glowing divine auras around those who have it right. Because of this, it seems best to leave this to God. Would it not be truly awful to go around murdering people for being “wrong” when it turns out that one is also wrong?

 


Guns on Campus

As I write this, the Florida state legislature is considering a law that would allow concealed carry permit holders to bring their guns to college campuses. As is to be expected, some opponents and some proponents are engaging in poor reasoning, hyperbole and other such unhelpful means of addressing the issue. As a professor and a generally pro-gun person, I have more than an academic interest in this matter. My goal, as always, is to consider this issue rationally, although I do recognize the role of emotions in this matter.

From an emotional standpoint, I am divided in my heart. On the pro-gun feeling side, all of my gun experiences have been positive. I learned to shoot as a young man and have many fond memories of shooting and hunting with my father. Though I now live in Florida, we still talk about guns from time to time. As a graduate student, I had little time outside of school, but once I was a professor I was able to get in the occasional trip to the range. I have, perhaps, been very lucky: the people I have been shooting with and hunting with have all been competent and responsible people. No one ever got hurt. I have never been a victim of gun crime.

On the anti-gun side, like any sane human I am deeply saddened when I hear of people being shot down. While I have not seen gun violence in person, Florida State University (which is just across the tracks from my university) recently had a shooter on campus. I have spoken with people who have experienced gun violence and, not being callous, I can understand their pain. Roughly put, I can feel the two main sides in the debate. But, feeling is not a rational way to settle a legal and moral issue.

Those opposed to guns on campus are concerned that the presence of guns carried by permit holders would result in an increase in injuries and deaths. Some of these injuries and deaths would be intentional, such as suicide, fights escalating to the use of guns, and so on. Some of these injuries and deaths, it is claimed, would be the result of an accidental discharge. From a moral standpoint, this is obviously a legitimate concern. However, it is also a matter for empirical investigation: would allowing concealed carry on campus increase the likelihood of death or injury to a degree that would justify banning guns?

Some states already allow licensed concealed carry on campus and there is, of course, considerable data available about concealed carry in general. The statistical data would seem to indicate that allowing concealed carry on campus would not result in an increase in injuries and deaths on campus. This is hardly surprising: getting a permit requires providing proof of competence with a firearm as well as a thorough background check—considerably more thorough than the background check to purchase a firearm. Such permits are also issued at the discretion of the state. As such, people who have such licenses are not likely to engage in random violence on campus.

This is, of course, an empirical matter. If it could be shown that allowing licensed concealed carry on campus would result in an increase in deaths and injuries, then this would certainly impact the ethics of allowing concealed carry.

Those who are opposed to guns on campus are also rightfully concerned that someone other than the license holder will get the gun and use it. After all, theft is not uncommon on college campuses and someone could grab a gun from a licensed holder.

While these concerns are not unreasonable, someone interested in engaging in gun violence can easily acquire a gun without stealing it from a permit holder on campus. She could buy one or steal one from somewhere else. As far as grabbing a gun from a person carrying it legally, attacking an armed person is generally not a good idea—and, of course, someone who is prone to gun grabbing would presumably also try to grab a gun from a police officer. In general, these do not seem to be compelling reasons to ban concealed carry on campus.

Opponents of allowing guns on campus also point to psychological concerns: people will feel unsafe knowing that people around them might be legally carrying guns. This might, it is sometimes claimed, result in a suppression of discussion in classes and cause professors to hand out better grades—all from fear that a student is legally carrying a gun.

I do know people who are actually very afraid of this—they are staunchly anti-gun and are very worried that students and other faculty will be “armed to the teeth” on campus and “ready to shoot at the least provocation.” The obvious reply is that someone who is dangerously unstable enough to shoot students and faculty over such disagreements would certainly not balk at illegally bringing a gun to campus. Allowing legal concealed carry by permit holders would, I suspect, not increase the odds of such incidents. But, of course, this is a matter of emotions and fear is rarely, if ever, held at bay by reason.

Opponents of legal carry on campus also advance a reasonable argument: there is really no reason for people to be carrying guns on campus. After all, campuses are generally safe, typically have their own police forces and are places of learning and not shooting ranges.

This does have considerable appeal. When I lived in Maine, I had a concealed weapon permit but generally did not go around armed. My main reason for having it was convenience—I could wear my gun under my jacket when going someplace to shoot. I must admit, of course, that as a young man there was an appeal in being able to go around armed like James Bond—but that wore off quickly and I never succumbed to gun machismo. I did not wear a gun while running (too cumbersome) or while socializing (too…weird). I have never felt the need to be armed with a gun on campus, through all the years I have been a student and professor. So, I certainly get this view.

The obvious weak point for this argument is that the lack of a reason to have a gun on campus (granting this for the sake of argument) is not a reason to ban people with permits from legally carrying on campus. After all, the permit grants the person the right to carry the weapon legally and more is needed to deny the exercise of that right than just the lack of need.

Another obvious weak point is that a person might need a gun on campus for legitimate self-defense. While this is not likely, that is true in most places. After all, a person going to work or out for a walk in the woods is not likely to need her gun. I have, for example, never needed one for self-defense. As such, there would seem to be as much need to have a gun on campus as many other places where it is legal to carry. Of course, this argument could be turned around to argue that there is no reason to allow concealed carry at all.

Proponents of legal concealed carry on campus often argue that “criminals and terrorists” go to college campuses in order to commit their crimes, since they know no one will be armed. There are two main problems with this. The first is that college campuses are, relative to most areas, very safe. So, criminals and terrorists do not seem to be going to them that often. As opponents of legal carry on campus note, while campus shootings make the news, they are actually very rare.

The second is that large campuses have their own police forces—in the shooting incident at FSU, the police arrived rapidly and shot the shooter. As such, I do not think that allowing concealed carry will scare away criminals and terrorists. Especially since they do not visit campuses that often already.

Proponents of concealed carry also sometimes claim that the people carrying legally on campus will serve as the “good guy with guns” to shoot the “bad guys with guns.” While there is a chance that a good guy will be able to shoot a bad guy, there is the obvious concern that the police will not be able to tell the good guy from the bad guy and the good guy will be shot. In general, the claims that concealed carry permit holders will be righteous and effective vigilantes on campus are more ideology and hyperbole than fact. Not surprisingly, most reasonable pro-gun people do not use that line of argumentation. Rather, they focus on more plausible scenarios of self-defense and not wild-west vigilante style shoot-outs.

My conclusion is that there is not a sufficiently compelling reason to ban permit holders from carrying their guns on campus. But, there does not seem to be a very compelling reason to carry a gun on campus.

 


Should You Attend a For-Profit College?

The rise of for-profit universities has given students increased choices when it comes to picking schools. Since college is rather expensive and schools vary in regard to the success of their graduates, it is wise to carefully consider the options before writing those checks. Or, more likely these days, going into debt.

While there is a popular view that the for-profit free-market will consistently create better goods and services at ever lower prices, it is wisest to accept facts over ideological theory. As such, when picking between public, non-profit, and for-profit schools one should look at the numbers. Fortunately, ProPublica has been engaged in crunching the numbers.

Today most people go to college in order to have better job prospects. As such, one rather important consideration is the likelihood of getting a job after graduation and the likely salary. While for-profit schools spent about $4.2 billion in 2009 on recruiting and marketing and paid their college presidents an average of $7.3 million per year, the typical graduate does rather poorly. According to the U.S. Department of Education, 74% of the programs at for-profit colleges produced graduates whose average pay is less than that of high-school dropouts. In contrast, graduates of non-profit and public colleges do better financially than high school graduates.

Another important consideration is the cost of education. While the free-market is supposed to result in higher quality services at lower prices and the myth of public education is that it creates low quality services at high prices, the for-profit schools are considerably more expensive than their non-profit and public competition. A two-year degree costs, on average, $35,000 at a for-profit school. The average community college offers that degree at a mere $8,300. In the case of four year degrees, the average is $63,000 at a for-profit and $52,000 for a “flagship” state college. For certificate programs, public colleges will set a student back $4,250 while a for-profit school will cost the student $19,806 on average. By these numbers, the public schools offer a better “product” at a much lower price—thus making public education the rational choice over the for-profit option.

Student debt and loans, which have been getting considerable attention in the media, are also a matter of consideration. The median debt of students at for-profit colleges is $32,700, and 96% of the students at such schools take out loans. At non-profit private colleges, the figures are $24,600 and 57%. For public colleges, the median debt is $20,000 and 48% of students take out loans. Only 13% of community college students take out loans (thanks, no doubt, to the relatively low cost of community college).

For those who are taxpayers, another point of concern is how much taxpayer money gets funneled into for-profit schools. In a typical year, the federal government provides $6 billion in Pell Grants and $16 billion in student loans to students attending for-profit colleges. In 2010 there were 2.4 million students enrolled in these schools. It is instructive to look at the breakdown of how the for-profits expend their money.

As noted above, the average salary of the president of a for-profit college was $7.3 million in 2009. The five highest paid presidents of non-profit colleges averaged $3 million and the five highest paid presidents at public colleges were paid $1 million.

The for-profit colleges also spent heavily on marketing, spending $4.2 billion on recruiting, marketing and admissions staffing in 2009. In 2009, thirty for-profit colleges hired 35,202 recruiters, which is about 1 recruiter per 49 students. As might be suspected, public schools do not spend that sort of money. My experience with recruiting at public schools is that a considerable amount of recruiting commonly falls to faculty—who do not, in general, receive extra compensation for this extra work.

In terms of what is spent per student, for-profit schools average $2,050 per student per year. Public colleges spend, on average, $7,239 per student per year. Private non-profit schools spend the most, averaging $15,321 per student per year. This spending does seem to yield results: at for-profit schools only 20% of students complete a bachelor’s degree within four years. Public schools do somewhat better at 31% and private non-profits do best at 52%. As such, a public or non-profit school would be the better choice over the for-profit school.

Because so much public money gets funneled into for-profit, public and private schools, there has been a push for “gainful employment” regulation. The gist of this regulation is that schools will be graded based on the annual student loan payments of their graduates relative to their earnings. A school will be graded as failing if its graduates have annual student loan payments that exceed 12% of total earnings or 30% of discretionary earnings. The “danger zone” is 8-12% of total earnings or 20-30% of discretionary earnings. Currently, there are about 1,400 programs with about 840,000 enrolled students in the “danger zone” or worse. 99% of them are, shockingly enough, at for-profit schools.
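The grading rule described above can be sketched as a simple function. The thresholds (failing above 12% of total earnings or 30% of discretionary earnings, with a danger zone at 8-12% or 20-30%) come from the description above; the function name and its inputs are my own illustration, not the regulation's actual terminology:

```python
# Sketch of the "gainful employment" grading rule described above.
# Thresholds are from the article; names are illustrative only.

def grade_program(annual_loan_payment, total_earnings, discretionary_earnings):
    """Return 'pass', 'danger zone', or 'fail' for a program's graduates."""
    total_ratio = annual_loan_payment / total_earnings
    disc_ratio = annual_loan_payment / discretionary_earnings

    # Failing: loan payments exceed 12% of total or 30% of discretionary earnings.
    if total_ratio > 0.12 or disc_ratio > 0.30:
        return "fail"
    # Danger zone: 8-12% of total or 20-30% of discretionary earnings.
    if total_ratio >= 0.08 or disc_ratio >= 0.20:
        return "danger zone"
    return "pass"

# A graduate earning $30,000 total ($15,000 discretionary) who pays
# $4,000 a year in loans is at 13.3% of total earnings.
print(grade_program(4000, 30000, 15000))  # prints "fail"
```

Note that a program fails if either ratio crosses its threshold, so high debt cannot be hidden behind either measure of earnings alone.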

For those who speak of accountability, these regulations should seem quite reasonable. For those who like the free-market, the regulation’s target is the federal government: the goal is to prevent the government from dumping more taxpayer money into failing programs. Schools will need to earn this money by success.

However, this is not the first time that there has been an attempt to link federal money to success. In 2010, regulations were put in place that included a requirement that a school have at least 35% of its students actively repaying student loans. As might be guessed, for-profit schools are the leaders in loan defaults. In 2012, lobbyists for the for-profit schools brought a lawsuit to federal court. The judge agreed with them and struck down the requirement.

In November of 2014 an association of for-profit colleges brought a lawsuit against the current gainful employment requirements, presumably on the principle that it is better to pay lawyers and lobbyists than to address the problems with their educational model. If this lawsuit succeeds, which is likely, for-profits will be rather less accountable and this will serve to make things worse for their students.

Based on the numbers, you should definitely not attend the typical for-profit college. On average, it will cost you more, you will have more debt, and you will make less money. For the most education at the least cost, the two-year community college is the best deal. For the four-year degree, the public school will cost less, but private non-profits generally have more successful results. But, of course, much depends on you.

 

My Amazon Author Page

My Paizo Page

My DriveThru RPG Page

Follow Me on Twitter

Augmented Soldier Ethics III: Pharmaceuticals

Steve Rogers’ physical transformation, from a reprint of Captain America Comics #1 (May 1941). Art by Joe Simon and Jack Kirby. (Photo credit: Wikipedia)

Humans have many limitations that make them less than ideal as weapons of war. For example, we get tired and need sleep. As such, it is no surprise that militaries have sought various ways to augment humans to counter these weaknesses. For example, militaries routinely make use of caffeine and amphetamines to keep their soldiers awake and alert. There have also been experiments with more sophisticated enhancers.

In science fiction, militaries go far beyond these sorts of drugs and develop far more potent pharmaceuticals. These chemicals tend to split into two broad categories. The first consists of short-term enhancements (what gamers refer to as “buffs”) that address a human weakness or provide augmented abilities. In the real world, the above-mentioned caffeine and amphetamines are short-term drugs. In fiction, the classic sci-fi role-playing game Traveller featured the aptly (though generically) named combat drug. This drug would boost the user’s strength and endurance for about ten minutes. Other fictional drugs have far more dramatic effects, such as the Venom drug used by the super villain Bane. Given that militaries already use short-term enhancers, it is certainly reasonable to think they are and will be interested in more advanced enhancers of the sort considered in science fiction.

The second category is that of the long-term enhancers. These are chemicals that enable or provide long-lasting effects. An obvious real-world example is steroids: these allow the user to develop greater muscle mass and increased strength. In fiction, the most famous example is probably the super-soldier serum that was used to transform Steve Rogers into Captain America.

Since the advantages of improved soldiers are obvious, it seems reasonable to think that militaries would be rather interested in the development of effective (and safe) long-term enhancers. It does, of course, seem unlikely that there will be a super-soldier serum in the near future, but chemicals aimed at improving attention span, alertness, memory, intelligence, endurance, pain tolerance and such would be of great interest to militaries.

As might be suspected, these chemical enhancers do raise moral concerns that are certainly worth considering. While some might see discussing enhancers that do not yet (as far as we know) exist as a waste of time, there does seem to be a real advantage in considering ethical issues in advance—this is analogous to planning for a problem before it happens rather than waiting for it to occur and then dealing with it.

One obvious point of concern, especially given the record of unethical experimentation, is that enhancers will be used on soldiers without their informed consent. Since this is a general issue, I addressed it in its own essay and reached the obvious conclusion: in general, informed consent is morally required. As such, the following discussion assumes that the soldiers using the enhancers have been honestly informed of the nature of the enhancers and have given their consent.

When discussing the ethics of enhancers, it might be useful to consider real world cases in which enhancers are used. One obvious example is that of professional sports. While Major League Baseball has seen many cases of athletes using such enhancers, they are used worldwide and in many sports, from running to gymnastics. In the case of sports, one of the main reasons certain enhancers, such as steroids, are considered unethical is that they provide the athlete with an unfair advantage.

While this is a legitimate concern in sports, it does not apply to war. After all, there is no moral requirement for a fair competition in battle. Rather, one important goal is to gain every advantage over the enemy in order to win. As such, the fact that enhancers would provide an “unfair” advantage in war does not make them immoral. One can, of course, discuss the relative morality of the sides involved in the war, but this is another matter.

A second reason why the use of enhancers is regarded as wrong in sports is that they typically have rather harmful side effects. Steroids, for example, do rather awful things to the human body and brain. Given that even aspirin has potentially harmful side effects, it seems rather likely that military-grade enhancers will have various harmful side effects. These might include addiction, psychological issues, organ damage, death, and perhaps even new side effects yet to be observed in medicine. Given the potential for harm, a rather obvious way to approach the ethics of this matter is utilitarianism. That is, the benefits of the enhancers would need to be weighed against the harm caused by their use.

This assessment could be done with a narrow limit: the harms of the enhancer could be weighed against the benefits provided to the soldier. For example, an enhancer that boosted a combat pilot’s alertness and significantly increased her reaction speed while having the potential to cause short-term insomnia and diarrhea would seem to be morally (and pragmatically) fine given the relatively low harms for significant gains. As another example, a drug that greatly boosted a soldier’s long-term endurance while creating a significant risk of a stroke or heart attack would seem to be morally and pragmatically problematic.

The assessment could also be done more broadly by taking into account ever-wider considerations. For example, the harms of an enhancer could be weighed against the importance of a specific mission and the contribution the enhancer would make to the success of the mission. So, if a powerful drug with terrible side-effects was critical to an important mission, its use could be morally justified in the same way that taking any risk for such an objective can be justified. As another example, the harms of an enhancer could be weighed against the contribution its general use would make to the war. So, a drug that increased the effectiveness of soldiers, yet cut their life expectancy, could be justified by its ability to shorten a war. As a final example, there is also the broader moral concern about the ethics of the conflict itself. So, the use of a dangerous enhancer by soldiers fighting for a morally good cause could be justified by that cause (using the notion that the consequences justify the means).

There are, of course, those who reject using utilitarian calculations as the basis for moral assessment. For example, there are those who believe (often on religious grounds) that the use of pharmaceuticals is always wrong (be they used for enhancement, recreation or treatment). Obviously enough, if the use of pharmaceuticals is wrong in general, then their specific application in the military context would also be wrong. The challenge is, of course, to show that the use of pharmaceuticals is simply wrong, regardless of the consequences.

In general, it would seem that the military use of enhancers should be assessed morally on utilitarian grounds, weighing the benefits of the enhancers against the harm done to the soldiers.

 


Augmented Soldier Ethics II: Informed Consent

One general moral subject that is relevant to the augmentation of soldiers by such things as pharmaceuticals, biologicals or cybernetics is the matter of informed consent. While fiction abounds with tales of involuntary augmentation, real soldiers and citizens of the United States have been coerced or deceived into participating in experiments. As such, there do seem to be legitimate grounds for being concerned that soldiers and citizens could be involuntarily augmented as part of experiments or actual “weapon deployment.”

Assuming the context of a Western democratic state, it seems reasonable to hold that augmenting a soldier without her informed consent would be immoral. After all, the individual has rights against the democratic state and these include the right not to be unjustly coerced or deceived. Socrates, in the Crito, also advanced reasonable arguments that a citizen’s obligation of obedience requires that the state not coerce or deceive the citizen into the social contract, and this would certainly apply to soldiers in a democratic state.

It is certainly tempting to rush to the position that informed consent would make the augmentation of soldiers morally acceptable. After all, the soldier would know what she was getting into and would volunteer to undergo the process in question. In popular fiction, one example of this would be Steve Rogers volunteering for the super soldier conversion. Given his consent, such an augmentation would seem morally acceptable.

There are, of course, some cases where informed consent makes a critical difference in ethics. One obvious example is the moral difference between sex and rape—the difference is a matter of informed and competent consent. If Sam agrees to have sex with Sally, then Sally is not raping Sam. But if Sally drugs Sam and has her way with him, then that would be rape. Another obvious example is the difference between theft and receiving a gift—this is also a matter of informed consent. If Sam gives Sally a diamond ring, that is not theft. If Sally takes the ring by force or coercion, then that is theft—and presumably wrong.

Even when informed consent is rather important, there are still cases in which the consent does not make the action morally acceptable. For example, Sam and Sally might engage in consensual sex, but if they are siblings or one is the parent of the other, the activity could still be immoral. As another example, Sam might consent to give Sally an heirloom ring that has been in the family for untold generations, but it might still be the wrong thing to do—especially when Sally hocks the ring to buy heroin.

There are also cases in which informed consent is not relevant because of the morality of the action itself. For example, Sam might consent to join in Sally’s plot to murder Ashley (rather than being coerced or tricked) but this would not be relevant to the ethics of the murder. At best it could be said that Sally did not add to her misdeed by coercing or tricking her accomplices, but this would not make the murder itself less bad.

Turning back to the main subject of augmentation, even if the soldiers gave their informed consent, the above considerations show that there would still be the question of whether the augmentation itself is moral. For example, there are reasonable moral arguments against genetically modifying human beings. If these arguments hold up, then even if a soldier consented to genetic modification, the modification itself would be immoral. I will be addressing the ethics of pharmaceutical, biological and cybernetic augmentation in later essays.

While informed consent does seem to be a moral necessity, this position can be countered. One stock way to do this is to make use of a utilitarian argument: if the benefits gained from augmenting soldiers without their informed consent outweighed the harms, then the augmentation would be morally acceptable. For example, imagine that a war against a wicked enemy is going rather badly and that an augmentation method has been developed that could turn the war around. The augmentation is dangerous and has awful long term side-effects that would deter most soldiers from volunteering. However, losing to the wicked enemy would be worse—so it could thus be argued that the soldiers should be deceived so that the war could be won. As another example, a wicked enemy is not needed—it could simply be argued that the use of augmented soldiers would end the war faster, thus saving lives, albeit at the cost of those terrible side-effects.

Another stock approach is to appeal to the arguments used by democracies to justify conscription in time of war. If the state (or, rather, those who expect people to do what they say) can coerce citizens into killing and dying in war, then the state can surely coerce citizens to undergo augmentation. It is easy to imagine a legislature passing something called “the conscription and augmentation act” that legalizes coercing citizens into being augmented to serve in the military. Of course, there are those who are suspicious of democratic states so blatantly violating the rights of life and liberty. However, not all states are democratic.

While democratic states would seem to face some moral limits when it comes to involuntary augmentation, non-democratic states appear to have more options. For example, under fascism the individual exists to serve the state (that is, the bastards that think everyone else should do what they say). If this political system is morally correct, then the state would have every right to coerce or deceive the citizens for the good of the state. In fiction, these states tend to be the ones to crank out involuntary augmented soldiers (that still manage to lose to the good guys).

Naturally, even if the state has the right to coerce or deceive soldiers into becoming augmented, it does not automatically follow that the augmentation itself is morally acceptable—this would depend on the specific augmentations. These matters will be addressed in upcoming essays.

 

 


Augmented Soldier Ethics I: Exoskeletons

US-Army exoskeleton (Photo credit: Wikipedia)

One common element of military science fiction is the powered exoskeleton, also known as an exoframe, exosuit or powered armor. The basic exoskeleton is a powered framework that serves to provide the wearer with enhanced strength. In movies such as Edge of Tomorrow and video games such as Call of Duty Advanced Warfare the exoskeletons provide improved mobility and carrying capacity (which can include the ability to carry heavier weapons) but do not provide much in the way of armor. In contrast, the powered armor of science fiction provides the benefits of an exoskeleton while also providing a degree of protection. The powered armor of Starship Troopers, The Forever War, Armor and Iron Man all serve as classic examples of this sort of gear.

Because the exoskeletons of fiction provide soldiers with enhanced strength, mobility and carrying capacity, it is no surprise that militaries are very interested in exoskeletons in the real world. While exoskeletons have yet to be deployed, there are some ethical concerns about the augmentation of soldiers.

On the face of it, the use of exoskeletons in warfare seems to be morally unproblematic. The main reason is that an exoskeleton is analogous to any other vehicle, with the exception that it is worn rather than driven. A normal car provides the driver with enhanced mobility and carrying capacity and this is presumably not immoral. In terms of the military context, the exoskeleton would be comparable to a Humvee or a tank, both of which seem morally unproblematic as well.

It might be objected that the use of exoskeletons would give wealthier nations an unfair advantage in war. The easy and obvious response to this is that, unlike in sports and games, gaining an “unfair” advantage in war is not immoral. After all, there is not a moral expectation that combatants will engage in a fair fight rather than making use of advantages in such things as technology and numbers.

It might be objected that the advantage provided by exoskeletons would encourage countries that had them to engage in aggressions that they would not otherwise engage in. The easy reply to this is that despite the hype of video games and movies, any exoskeleton available in the near future would most likely not provide a truly spectacular advantage to infantry. This advantage would, presumably, be on par with existing advantages such as those the United States enjoys over almost everyone else in the world. As such, the use of exoskeletons would not seem morally problematic in this regard.

One point of possible concern is what might be called the “Iron Man Syndrome” (to totally make something up). The idea is that soldiers equipped with exoskeletons might become overconfident (seeing themselves as being like the superhero Iron Man) and thus put themselves and others at risk. After all, unless there are some amazing advances in armor technology that are unmatched by weapon technology, soldiers in powered armor will still be vulnerable to weapons capable of taking on light vehicle armor (which exist in abundance). However, this could be easily addressed by training. And experience.

A second point of possible concern is what could be called the “ogre complex” (also totally made up). An exoskeleton that dramatically boosts a soldier’s strength might encourage some people to act as bullies and abuse civilians or prisoners. While this might be a legitimate concern, it can easily be addressed by proper training and discipline.

There are, of course, the usual peripheral issues associated with new weapons technology that could have moral relevance. For example, it is easy to imagine a nation wastefully spending money on exoskeletons, perhaps due to corruption. However, such matters are not specific to exoskeletons and would not be moral problems for the technology as such.

Given the above, it would seem that augmenting soldiers with exoskeletons poses no new moral concerns and is morally comparable to providing soldiers with Humvees, tanks and planes.


Should Two Year Colleges Be Free?

Tallahassee County Community College Seal (Photo credit: Wikipedia)

While Germany has embraced free four-year college education for its citizens, President Obama has made a more modest proposal to make community college free for Americans. He is modeling his plan on that of Republican Governor Bill Haslam. Haslam has made community college free for citizens of Tennessee, regardless of need or merit. Not surprisingly, Obama’s proposal has been attacked by both Democrats and Republicans. Having some experience in education, I will endeavor to assess this proposal in a rational way.

First, there is no such thing as a free college education (in this context). Rather, free education for a student means that the cost is shifted from the student to others. After all, the staff, faculty and administrators will not work for free. The facilities of the schools will not be maintained, improved and constructed for free. And so on, for all the costs of education.

One proposed way to make education free for students is to shift the cost onto “the rich”, a group which is easy to target but somewhat harder to define. As might be suspected, I think this is a good idea. One reason is that I believe that education is the best investment a person can make in herself and in society. This is why I am fine with paying property taxes that go to education, although I have no children of my own. In addition to my moral commitment to education, I also look at it pragmatically: money spent on education (which helps people advance) means having to spend less on prisons and social safety nets. Of course, there is still the question of why the cost should be shifted to the rich.

One obvious answer is that they, unlike the poor and what is left of the middle class, have the money. As economists have noted, an ongoing trend in the economy is that wages are staying stagnant while capital is doing well. This is manifested in the fact that while the stock market has rebounded from the crash, workers are, in general, doing worse than before the crash.

There is also the need to address the problem of income inequality. While one might reject arguments grounded in compassion or fairness, there are some purely practical reasons to shift the cost. One is that the rich need the rest of us to keep the wealth, goods and services flowing to them (they actually need us way more than we need them). Another is the matter of social stability. Maintaining a stable state requires that the citizens believe that they are better off with the way things are than they would be if they engaged in a revolution. While deceit and force can keep citizens in line for quite some time, there does come a point at which these fail. To be blunt, it is in the interest of the rich to help restore the faith of the middle class. One of the nastier alternatives is being put against the wall after the revolution.

Second, the reality of education has changed over the years. In the not so distant past, a high-school education was sufficient to get a decent job. I am from a small town in Maine and remember well that people could get decent jobs with just that high school degree (or even without one). While there are still some decent jobs like that, they are increasingly rare.

While it might be a slight exaggeration, the two-year college degree is now the equivalent of the old high school degree. That is, it is roughly the minimum education needed to have a shot at a decent job. As such, the reasons that justify free (for students) public K-12 education would now justify free (for students) K-14 public education. And, of course, arguments against free (for the student) K-12 education would also apply.

While some might claim that the reason the two-year degree is the new high school degree is that education has been in decline, there is also the obvious reason that the world has changed. While I grew up during the decline of the manufacturing economy, we are now in the information economy (even manufacturing is high tech now) and more education is needed to operate in this new economy.

It could, of course, be argued that a better solution would be to improve K-12 education so that a high school degree would be sufficient for a decent job in the information economy. This would, obviously enough, remove the need to have free two-year college. This is certainly an option worth considering, though it does seem unlikely that it would prove viable.

Third, the cost of college has grown absurdly since I was a student. Rest assured, though, that this has not been because of increased pay for professors. This cost has been addressed by a complicated and sometimes bewildering system of financial aid and loans. However, free two-year college would certainly address this problem in a simple way.

That said, a rather obvious concern is that this would not actually reduce the cost of college—as noted above, it would merely shift the cost. A case can certainly be made that this will actually increase the cost of college (for those who are paying). After all, schools would have less incentive to keep their costs down if the state was paying the bill.

It can be argued that it would be better to focus on reducing the cost of public education in a rational way that focuses on the core mission of colleges, namely education. One major reason for the increase in college tuition is the massive administrative overhead that vastly exceeds what is actually needed to effectively run a school. Unfortunately, since the administrators are the ones who make the financial choices it seems unlikely that they will thin their own numbers. While state legislatures have often applied magnifying glasses to the academic aspects of schools, the administrative aspects seem to somehow get less attention—perhaps because of some interesting connections between the state legislatures and school administrations.

Fourth, while conservative politicians have been critical of the general idea of the state giving away free stuff to regular people rather than corporations and politicians, liberals have also been critical of the proposal. While liberals tend to favor the idea of the state giving people free stuff, some have taken issue with free stuff being given to everyone. After all, the proposal is not to make two-year college free for those who cannot afford it, but to make it free for everyone.

It is certainly tempting to be critical of this aspect of the proposal. While it would make sense to assist those in need, it seems unreasonable to expend resources on people who can pay for college on their own. That money, it could be argued, could be used to help people in need pay for four-year colleges. It can also be objected that the well-off would exploit the system.

One easy and obvious reply is that the same could be said of free (for the student) K-12 education. As such, the reasons that exist for free public K-12 education (even for the well-off) would apply to the two-year college plan.

In regards to the well-off, they can already elect to go to lower cost state schools. However, the wealthy tend to pick the more expensive schools and usually opt for four-year colleges. As such, I suspect that there would not be an influx of rich students into two-year programs trying to “game the system.” Rather, they will tend to continue to go to the most prestigious four year schools their money can buy.

Finally, while the proposal is for the rich to bear the cost of “free” college, it should be looked at as an investment. The rich “job creators” will benefit from having educated “job fillers.” Also, the college educated will tend to get better jobs which will grow the economy (most of which will go to the rich) and increase tax-revenues (which can help offset the taxes on the rich). As such, the rich might find that their involuntary investment will provide an excellent return.

Overall, the proposal for “free” two-year college seems to be a good idea, although one that will require proper implementation (which will be very easy to screw up).

 


A Bubble of Digits

A look back at the American (and world) economy shows a “pastscape” of exploded economic bubbles. The most recent was the housing bubble, but the less recent .com bubble serves as a relevant reminder that bubbles can be technological. This is a reminder well worth keeping in mind for we are, perhaps, blowing up a new bubble.

In “The End of Economic Growth?” Oxford’s Carl Frey discusses the new digital economy and presents some rather interesting numbers regarding the value of certain digital companies relative to the number of people they employ. One example is Twitch, which streams videos of people playing games (and people commenting on people playing games). Twitch was purchased by Amazon for $970 million. Twitch has 170 employees. The multi-billion dollar company Facebook had 8,348 employees as of September 2014. Facebook bought WhatsApp for $19 billion. WhatsApp employed 55 people at the time of this acquisition. In an interesting contrast, IBM employed 431,212 people in 2013.
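To make the contrast vivid, the acquisition prices and headcounts cited above can be reduced to a rough value-per-employee figure. This is only a back-of-the-envelope sketch using the numbers in the text, not a serious valuation method:

```python
# Back-of-the-envelope value-per-employee figures, using the acquisition
# prices and headcounts cited in the text.

companies = {
    "Twitch": {"value": 970_000_000, "employees": 170},
    "WhatsApp": {"value": 19_000_000_000, "employees": 55},
}

for name, c in companies.items():
    per_head = c["value"] / c["employees"]
    print(f"{name}: ${per_head:,.0f} per employee")
```

Twitch works out to roughly $5.7 million per employee and WhatsApp to roughly $345 million per employee, which is exactly the sort of lopsided ratio that invites talk of a bubble.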

While it is tempting to explain the impressive value to employee ratio in terms of grotesque over-valuation (which does have its merits as a criticism), there are other factors involved. One, as Frey notes, is that the (relatively) new sort of digital businesses require relatively little capital. The above-mentioned WhatsApp started out with $250,000 and this was actually rather high for an app—the average cost to develop one is $6,453. As such, a relatively small investment can create a huge return.

Another factor is an old one, namely the efficiency of technology in replacing human labor. The development of the plow reduced the number of people required to grow food, the development of the tractor reduced it even more, and the refinement of mechanized farming has enabled the number of people required in agriculture to be reduced dramatically. While it is true that people have to do work to create such digital companies (writing the code, for example), much of the “labor” is automated and done by computers rather than people.

A third factor, which is rather critical, is the digital aspect. Companies like Facebook, Twitch and WhatsApp do not make physical objects that need to be manufactured, shipped and sold. As such, they do not (directly) create jobs in these areas. These companies do make use of existing infrastructure: Facebook does need companies like Comcast to provide the internet connection and companies like Apple to make the devices. But, rather importantly, they do not employ the people who work for Comcast and Apple (and even these companies employ relatively few people).

One of the most important components of the digital aspect is the multiplier effect. To illustrate this, consider two imaginary businesses in the health field. One is a walk-in clinic which I will call Nurse Tent. The other is a health app called RoboNurse. If a patient goes to Nurse Tent, the nurse can only tend to one patient at a time and he can only work so many hours per day. As such, Nurse Tent will need to employ multiple nurses (as well as the support staff). In contrast, the RoboNurse app can be sold to billions of people and does not require the sort of infrastructure required by Nurse Tent. If RoboNurse takes off as a hot app, the developer could sell it for millions or even billions.

Nurse Tent could, of course, become a franchise (the McDonald’s of medicine). But, being very labor intensive and requiring considerable material outlay, it will not be able to have the value to employee ratio of a digital company like WhatsApp or Facebook. It would, however, employ more people. However, the odds are that most of the employees would not be well paid—while the digital economy is producing millionaire and billionaires, wages for labor are rather lacking. This helps to explain why the overall economy is doing great, while the majority of workers are worse off than before the last bubble.

It might be wondered why this matters. There are, of course, the usual concerns about the terrible inequality of the economy. However, there is also the concern that a new bubble is being inflated, a bubble filled with digits. There are some good reasons to be concerned.

First, as noted above, the digital companies seem to be grotesquely overvalued. While the situation is not exactly like the housing bubble, overvaluation should be a matter of concern. After all, if the value of these companies is effectively just “hot digits” inflating a thin skin, then a bubble burst seems likely.

This can be countered by arguing that the valuation is accurate or even that all valuation is essentially a matter of belief and as long as we believe, all will be fine. Until, of course, it is no longer fine.

Second, the current digital economy increases the income inequality mentioned above, widening the gap between the rich and the poor. Laying aside the fact that such a gap historically leads to social unrest and revolution, there is the more immediate concern that the gap will cause the bubble to burst—the economy cannot, one would presume, endure without a solid middle and base to help sustain the top of the pyramid.

This can be countered by arguing that the new digital economy will eventually spread the wealth. Anyone can make an app, anyone can create a startup, and anyone can be a millionaire. While this does have an appeal to it, there is the obvious fact that while it is true that (almost) anyone can do these things, it is also true that most people will fail. One just needs to consider all the failed startups and the millions of apps that are not successful.

There is also the obvious fact that civilization requires more than WhatsApp, Twitch and Facebook and people need to work outside of the digital economy (which lives atop the non-digital economy). Perhaps this can be handled by an underclass of people beneath the digital (and financial) elite, who toil away at low wages to buy smartphones so they can update their status on Facebook and watch people play games via Twitch. This is, of course, just a digital variant on a standard sci-fi dystopian scenario.
