Author Archives: Miranda Nell

Defining Our Gods

The philosopher Alvin Plantinga was interviewed for The Stone this weekend, making the claim that atheism is irrational. His conclusion, however, seems to allow that agnosticism is pretty reasonable, and his reasoning rests mostly on the absurdity of the universe and the hope that some kind of God will provide an explanation for whatever we cannot make sense of. These attitudes seem to me to require that we clarify a few things.

There are a variety of different intended meanings behind the word “atheist” as well as the word “God”. I generally make the point that I am atheistic when it comes to personal or specific gods like Zeus, Jehovah, Jesus, Odin, Allah, and so on, but agnostic if we’re talking about deism, that is, when it comes to an unnamed, unknowable, impersonal, original or universal intelligence or source of some kind. If this second force or being were to be referred to as “god” or even spoken of through more specific stories in an attempt to poetically understand some greater meaning, I would have no trouble calling myself agnostic as Plantinga suggests. But if the stories or expectations for afterlife or instructions for communications are meant to be considered as concrete as everyday reality, then I simply think they are as unlikely as Bigfoot or a faked moon landing – in other words, I am atheistic.

There are atheists who like to point out that atheism is ultimately a lack of belief, and therefore as long as you don’t have belief, you are atheistic – basically, those who have traditionally been called agnostics are just as much atheists. The purpose of this seems to be to expand the group of people who will identify more strongly as non-believers, and to avoid nuance – or what might be seen as hesitation – in self-description.

However, this allows for confusion and unnecessary disagreement at times. I think in fact that there are a fair number of people who are atheistic when it comes to very literal gods, like the one Ken Ham was espousing in his debate with Bill Nye. Some people believe, as Ken Ham does, that without a literal creation, the whole idea of God doesn’t make sense, and so believe in creationism because they believe in God. Some share this starting point, but are convinced by science and conclude there is no god. But others reject the premise and don’t connect their religious positions with their understandings of science. It’s a popular jab among atheists that “everyone is atheistic when it comes to someone else’s gods”, but it’s also a useful description of reality. We do all choose to not believe certain things, even if we would not claim absolute certainty.

Plenty of us would concede that only math or closed systems can be certain, so it’s technically possible that any conspiracy theory or mythology at issue is actually true – but still in general it can be considered reasonable not to believe conspiracy theories or mythologies. And if one includes mainstream religious mythologies with the smaller, less popular, less currently practiced ones, being atheistic about Jesus (as a literal, supernatural persona) is not that surprising from standard philosophical perspectives. The key here is that the stories are being looked at from a materialistic point of view – as Hegel pointed out, once spirituality is asked to compete in an empirical domain, it has no chance. It came about to provide insight, meaning, love and hope – not facts, proof, and evidence.

The more deeply debatable issue would be a broadly construed and non-specific deistic entity responsible for life, intelligence or being. An argument can be made that a force of this kind provides a kind of unity to existence that helps to make sense of it. It does seem rather absurd that the universe simply happened, although I am somewhat inclined to the notion that the universe is just absurd. On the other hand, perhaps there is a greater order that is not always evident. I would happily use the word agnostic to describe my opinion about this, and the philosophical discussion regarding whether there is an originating source or natural intelligence to being seems a useful one. However, it should not be considered to be relevant to one’s opinion about supernatural personas who talk to earthlings and interfere in their lives.

There are people who identify as believers who really could be categorized as atheistic in the same way I am about the literal versions of their gods. They understand the stories of their religions as pathways to a closer understanding of a great unspecified deity, but take them no more literally than Platonists take the story of the Cave, which is to say, the stories are meant to be meaningful and the concrete fact-based aspect is basically irrelevant. It’s not a question of history or science: it’s metaphysics. Let’s not pretend any of us know the answer to this one.

The Meaning of Life

Primordial Soup, via Professor James Brown, NC State

Where do we even begin?

A writer on Scientific American’s blog, Ferris Jabr, has posted a piece on vitalism, siding with what is a fairly common contemporary position in philosophy: that life cannot be truly distinguished from non-life. It is often assumed that to consider life a separate category, one has to believe in a religious or supernatural component special to living things. In fact, the way Jabr represents Aristotle, it sounds like the ancient philosopher would agree to such a claim:

Aristotle believed that, unlike the inanimate, all living things have one of three kinds of souls: vegetative souls, animal souls and rational souls, the last of which belonged exclusively to humans.

But this is a misrepresentation of Aristotle’s theory. The exploration of life in Aristotelian philosophy is extensive and, like much of his work, not always fully cohesive, but at least it is clear that “soul” for Aristotle is unhindered by the connotations it has gained through centuries of association with the church. Jabr links to Dan Dennett, who does provide a little more clarity, if you read far enough (it’s not that far, but it’s past the first couple of mentions of Aristotle’s soul):

“Yes, we have a soul, but it’s made of lots of tiny robots!” … The “tiny robots” in question are cells (such as neurons) and even tinier robots (such as motor proteins and neurotransmitter molecules) that have evolved to form amazingly ingenious armies of operatives, uniting to form an organization – as Aristotle said – that sustains not just life, like the vegetative soul, and not just locomotion and perception, like the animal soul, but imaginative, rational, conscious thought.

Soul for Aristotle is organization – “the imprint in the wax,” in one analogy – and the different kinds of souls he discusses are important aspects of organization in a world he defines as made of substances. So his first distinction is between substances and elements: those things that are organized toward a certain form or purpose (substances), and those that are, basically, “heaps.” The distinction is obvious at higher levels: if you break a rock, or a fire, it becomes (or can become) multiple rocks, multiple patches of fire. But if you break a donkey it will never become multiple donkeys. An element is the same throughout, and a portion of the element is just a lesser amount of the same.

The question can arise of where plants fit. Sometimes you can split one tree into two, for instance – does that mean trees are elements? Aristotle argues that a plant grows toward a certain form, and that this is specific to living things. A tree has parts and is organized to maintain its form – roots which draw in nutrition, fruits which allow for reproduction, and so on. Of course, we create artificial things which are organized (and it is useful to notice the word organ-ized here – this doesn’t mean simply having a pattern, but rather having specialized parts which work together toward a shared end). Things which human beings make may be organized, but they don’t grow or reproduce themselves. Aristotle’s famous example here is that if you were to bury a bedstead, it might sprout the start of a tree, but it would never grow another bedstead.

We process our artificial creations so much more now that sometimes people are surprised to remember that everything we use began as natural elements, but for the Ancients the difference between natural and artificial was pretty obvious – creatures, trees and rocks in the wild, versus those humans have shaped into usable or desirable items. Has the complexity of technology changed the definition? Well, the fundamental difference is that in nature, the substance is self-organizing, whereas in artifice, we’re the ones putting it together. That seems impossible to get around, but in theory an artificial life could be created and let loose so that our input was no longer necessary – a growing, self-moving, reproducing robot, a car that fixed itself and had baby cars. Without this, a robot that speaks or acts is not lifelike, but only mimicking a few chosen behaviors according to mechanical rules. For Aristotle, life is not simply mechanical. But that doesn’t mean it’s supernatural.

Patricia Churchland is well known for a mechanical view of consciousness, a view which stems from a simplistic dualism that English-language philosophers are a little too likely to embrace: either there must be a magical, inexplicable spirit that inhabits a body, or the body must be a machine just like a car or a computer. What they forget is that artificial machines are our secondary creations, and they are not representative of the full complexity of a living thing. Not surprisingly, this mechanical attitude toward life and consciousness became especially popular in the industrial and technological ages, as humans became more familiar with machines and spent less time observing nature.

Returning to the ancient view is useful, and Aristotle’s three levels of soul point out what is specific to life. The starting point is a growing substance: something actively organizing itself toward a unity, able to gather nutrients and dispel waste in the achievement of that formation. The next stage is animal life, which adds perception and self-movement. The third level is temporal awareness and the ability to compare, or rationality. When Churchland suggests that “you are your brain” she is making both an obvious statement and an incorrect one – in fact, you are you, which is to say, the whole thing, although yes, your brain is generally speaking more fundamental than most other parts of you. You could lose both your arms and legs, have many organs replaced by artificial ones, have terrible facial scarring, and still sort of be who you are. But in many very real ways you would lose a lot of yourself. And in truth you could suffer various kinds of brain injury and still sort of be who you are, too. Still, with no brain at all, there would be no starting point, and you could certainly lose a kidney or a finger without its having all that enormous an impact, so comparatively speaking, it’s fair to focus on the brain.

More important is an understanding of what it means to say you are something. When Churchland claims that you are your brain, she almost makes it sound as if, actually, you’re not. Instead, you’re just controlled by it, and it is some mechanical thing that follows its own rules and forces you to behave according to them. But if you are your brain, if it is a full equation, then your brain is you, too. That is, neurons aren’t causing you to act a certain way. Neurons are just the physical manifestation of you. You are causing you to act a certain way.

Of course, psychology is complex and this does not mean you make every choice on a conscious level. Some of “you” is unconscious. Some of “you” is just biological. Often there is conflict, influence, confusion and habit intertwined. But it is still all you in the broadest sense, by being part of one living substance. And this is key to the idea of life – a human being is alive as a substance, and as a rational animal, which is to say that the fundamental activity of nature is recognizable at each level. This has been referred to in a variety of ways, as desire, will, survival of the fittest, emergence, or spark of life, but we do notice that while rocks endure, life thrives. Life isn’t built by external causes but builds itself – grows toward the completion of an intended design, and produces offspring to repeat the plan – and this stage of existence is definitely notable.

Jabr ends by saying that life is “a notion, not a reality,” which will sound a little naive to any philosopher, given that the epistemological stance on other categories is never addressed. Is the difference between a chair and a desk a notion or a reality? What about between a man and a woman, or a fruit and a vegetable? In other words, is there something special about life that makes it less distinct from non-life, or is the point of the article that categories can have edge cases, and are determined by social use and agreement? If it’s the latter, that doesn’t mean our categorizations are useless or don’t point toward important differences. Perhaps as a scientist he is hoping for more quantifiable categories like the elements, and has concluded that life refers to a broader group. But this is not unusual in our referents, nor does it erase the accuracy of the distinction. Just because something can’t be counted does not mean it isn’t perceived and judged. It may be difficult sometimes to decide if a color is orange or yellow, but those occasional tough calls do not lead us to think there’s no way to see colors at all.

In truth, most people are pretty satisfied with the agreed determinations of what we define as life. Crystals do not grow toward a unified form, but are caused by the attachment of multiple parts to one another. Viruses are distinct from parasites in their simplicity and fundamentally secondary nature – one theory of their origin has been that they regressed from parasites, but in current form they lack the substantial nature of biology, as they do not feed or maintain an organic identity but rather use the cells they hijack. (Bacteria which are obligate parasites still have their own metabolic processes, even though they are reliant on environmental factors.) Nonetheless, even with disagreements continuing on an edge case like viruses, life is a meaningful category. It isn’t narcissistic to notice that the movement of living things is different from the movement of dead things. It’s just observant, which is one of the qualities of animal life.

Why you are, categorically, racist (or sexist)

Given the discussion surrounding the Zimmerman verdict, and the recent controversy over Colin McGinn’s resignation due to sexual harassment charges, I thought I would make a brief comment on the larger issue these cases exemplify. In both, there are arguments to be made about the specific incidents, and those who defend the men involved do not think they are being racist or sexist—they’re just concerned about details. The problem is that people generally tend to be less concerned about those details when the incidents affect white men, or male students.

If you’re not sure that’s true, watch this ABC experiment which shows a white male, a black male, and a white female all performing the same action of stealing a bike in broad daylight. The results are both not all that surprising and a very solid reminder that small prejudices add up and have enormous impact. The white man is more or less left alone to his business. Some people are curious about what he is doing, but no one really actively interferes. The black man is immediately questioned and people call authorities very quickly. The white woman is approached by men, and they go out of their way to help, even with full knowledge that she is trying to steal the bike.

Obviously it sounds worst for the black man, and it is easy to shrug off the reaction for the woman as really more of a benefit – even when trying to do something illegal, she can get help from strangers. But does she want help? And do these sudden assistants expect anything in return? Even if it is no more than a friendly smile and flirtatious banter, the key to these stories is always how single interactions can add up. If a black man deals with just slightly more suspicion, but deals with it constantly, his life is radically different from the white man’s. Likewise, if a woman faces prurient interest, even if it is meant in fun, and not intended to lead to a sexual relationship, if she faces it from every direction it changes the world she lives in.

These effects are due to a common way that human beings think. It is a claim often made by philosophers that people think in categories — in fact according to some philosophers it is what makes the human mind human. I would argue that things are more complex, and that our ability to conceptualize is a skill and a habit that we develop. It makes it easier for us to hold multiple thoughts together at one time, but at the cost of detail and fine distinction. However, that fine-tuned capacity is still available; it just has to be brought into focus.

But categorical thinking is not the only way that humans parse the world, nor is it unique to human beings. Animals understand categories, just to less complex degrees, when they respond to “fetch” and “trot” and “cracker.” Dogs learn tricks, horses understand a series of different movements, birds and chimps can even communicate with people to a limited extent using words that people invented. More importantly, concepts are not static—they can be altered through imagination, and they are not absolutes but, to be meta about it, simply another concept we have come up with to explain the way we organize our reactions and ideas.

And the human mind responds to the world in non-categorical ways as well. For example, when responding to music, people generally do not think in categories, and yet they can make extremely complicated patterns and connections. It is a form of thought probably more complex in humanity than in animals although not unique to our species. Many other examples could be suggested but I’ll save that for another time.

More important here is the idea of recognition of individuals. Though we may at times reduce people to a concept of themselves, we still recognize something unique by a personal name. Such referencing applies to buildings, places, monuments, dates, royal babies and countless other aspects of life as well. The claim of certain schools of thought, like the language philosophers associated with post-Hegelian, Sellarsian, or Wittgensteinian thinkers, is that it is impossible for a human being to think without thinking in concepts: any time a word is used, it refers to a group or type of thing, as well as the unique referent. This is what it means to make a concept, and from Plato through Kant this capacity has been touted as a monumental human achievement.

While it is an important aspect of how we organize and stack our thinking, it is central to remember the unique component as much as the categorical. If we think in terms of the individual, it becomes clear that the conceptual aspects are choices we make to significant degrees. Levinas speaks of the importance of the recognition of “the face of the other” in an ethical interaction, and I think it is possible to apply this to our broader interaction with the world. Everything experienced is unique. It may be comparable to other substances or moments, but it is only in laziness – and, after industrialization, a strong habituation – that we equate distinct things. We still experience the individual.

Our conceptualizing tendencies overall should be recognized as tools that can both help and harm our understanding. This is undeniable when applied to human beings. The fact that we can make faster decisions by applying broad categories, but that it can result in gross misunderstandings is true of smaller parts of life as well. Being more patient, more nuanced and more observant of the individual case allows a kind of knowledge with fewer assumptions, even if it may allow for less immediate utility.

Some will push for the division between people and other cases (Sartre would argue a free consciousness changes everything, for instance) but even if we were to grant this the problem of thinking in categories remains. The very idea that individuals of any kind can be “exact expressions of one soul” paves the way for a certain habit of thinking. Because we use the same word, we assume the same essence, and come to understand an equivalence as soon as something has been identified. A black man in a hoodie, or a young blonde woman, can face certain presumptions just by belonging to a category, and in time these attitudes can affect the way they understand themselves and behave as well, encouraging the stereotypes.

But if we are able to understand categories as just tentative judgments that help us clarify the world, though sometimes at the cost of complexity, our thought can be more developed. A reflective interplay of incomplete categorization and non-categorical consideration can allow for creativity, originality, and a better chance at reaching something like truth. On the other hand, if we think categories simply reveal essential natures, and we understand races and genders as categories that define people, it becomes a social norm to call the cops on certain bike thieves, leave some alone, and try to flirt with others.

Mine, Ours, or Mr Burns’?

Last year, I was staying at the house of a friend who is a self-described anglophile, and one afternoon she gave me a cup of coffee in a mug with a joke on it. It was the classic British “Keep calm and carry on” insignia, but with the crown inverted and the words changed to “Now panic and freak out.” I laughed—out loud, even—as it seemed a perfect expression of our shared East Coast neuroticism in the form of the mother country’s cultural superiority.

But not long after visiting her, I saw that image somewhere on Facebook, and soon after that I saw about one hundred different adaptations of it in a hundred different places. Within the course of weeks – certainly months – something that had seemed unique, personal and appealing became ubiquitous and cliche. This is the nature of contemporary culture. Nothing lasts very long.

If we add to this fast-paced cultural language two more aspects of modern life, the planned obsolescence of technology, and environmental awareness, it does not seem too surprising that the idea of a sharing economy might emerge. Things go out of style quickly, and they actually stop being usable as well. Given that, buying them just means you will have to throw them away before they wear out, which is wasteful.

One solution is to fight the trend—go back to old-fashioned, longer-lasting methods, avoid acquiring material things when it’s not necessary, and generally opt out of the modern economy. But an easier method, especially given the capacity to organize that information technology provides, is to share access. Instead of each of us buying an item that will be obsolete or out of style, we can buy access to a collection and constantly trade in for something we haven’t used yet. We can create a public library of power tools, clothes, furniture and electronics. For transport, there’s Zipcar; for lodging, Airbnb. Both for cutting costs and for increasing convenience, the dream of this sharing economy appeals to many.

But it has skeptics as well. The two most common forms of criticism are from different angles, but boil down to the same issue: there are business people who say it won’t be possible to make money, and there are idealists who say that it is nothing but a cover for squeezing pennies from the consumer for every use.

The worry, then, is that it must be something capitalists are simply going to exploit – and this is all but confirmed by capitalists worrying over whether they’ll be able to exploit it. Understood as what it essentially is, a form of micro-renting, this so-called sharing can take away the lower class’s power by denying them ownership. Seen that way, the business world is perfectly happy to embrace what amounts to a new form of feudalism.

So is anything changing? Certainly the spread of information is far greater than in the past. Additionally, travel and movement are becoming more common, if not to the same degree in different social circles. These factors mean that there may be a shift in the importance placed on ownership. While at one time the idea of property was rather directly identified with the owner (think of how the word is also used to refer to qualities), in a world so full of duplicates, what we are concerned with now is access.

When I leave my house, I make sure I have my phone, my wallet and my keys—all items that allow access to other things. Very little of it is unique property. If I lost these things, the annoyance would be getting my access back—new pieces of plastic or metal that would once again allow me to obtain or provide various information. A shared economy seems a natural enough extension of that. Rather than carry a bike around, the modern consumer expects a card in the wallet or an app on the phone that allows access to bikes as needed.

Clearly most of us are not so nomadic as to create a new home every night. At the same time, the deepest roots will probably be wherever the most social activity takes place, and if that is in texts and posts as much as in living rooms, it may become easy to feel at home anywhere one can get to Facebook or Twitter, with less need for a physical space that represents community. Perhaps what matters is who controls those domains. The classic concerns persist—that public ownership is incompetent while private ownership is too greedy and self-centered—and it is clear that the idea of a sharing economy isn’t going to erase the issues that come with property. But the relationships might be changing nonetheless.

What Is Performance Philosophy?

Last weekend I attended a conference of philosophers, artists, and various people with ideas called “Performance Philosophy: Staging a New Field.” The aim was to mark out an area of concentration that could be distinguished from studies of performance arts, as well as from the focus on the performative within philosophy, but which would link the two and even take seriously the possibility that performance is a kind of philosophy, and philosophy is a kind of performance. As someone who works on the multiplicity of knowledge, and therefore non-discursive forms of knowing and thinking, this interests me, but really my connection to the topic goes further than that.

I’ve always thought the rise of theatre and philosophy in the same era of ancient Greece was no coincidence – they are two sides of a coin, extroverted and introverted methods of human self-reflection. Life as a self-reflective creature is performative, and like the actor, we might accept a role, seek out a better one, sink our teeth into a part or ‘strut and fret our hour upon the stage’. The theatre mimics while philosophy wonders, but both are triggered by and concerned with the duplicitous nature of human experience: the ability to think one thing and do another (for instance), the separability of the mind.

As technology increases, the overlap is only more pervasive – documentaries, mockumentaries, reality television, and all forms of social media find new shades on the performative-introspective scale, and while the intended topic is obviously not always existential, it is a continuous undercurrent to any observation of life. The aesthetic has seemed like the modern world’s answer when faced with a search for meaning, but life itself as aesthetic brings us back full circle.

The conference included many points of view and approaches, and there was clearly interest from a range of different backgrounds. One plenary speaker warned against fusing philosophy and performance, suggesting that it is only in their distinction that we gain from the discussion. Others presented as practitioners with philosophical interests – a musician exploring time theory, a dancer interested in the body as a cartographic machine, a map of history – and part of the purpose of the conference was to work out how broad the area is, and whether it is distinct from, or more a bringing together of, work already under way in various other fields. In any case, it was certainly a place full of ideas and discussion, which is the key component of a good conference, and I look forward to seeing what comes next.

Time for Biology, or Must We Burn Nagel?

NYU Philosopher Thomas Nagel’s new book Mind and Cosmos has faced quite a bit of criticism from reviewers so far. And perhaps that’s simply to be expected, as the book is clearly an attempt to poke holes in a standard mechanistic view of life, rather than lay out any other fully formed vision. The strength seems to lie in the possibility of starting up a conversation. The weakness, unfortunately, seems to be in the recycling of some unconvincing arguments that make that unlikely.

The key issue that I think deserves closer inspection is the concept of teleology. Nagel reaches too far into mystical territory in his attempt to incorporate a kind of final cause, but some of his critics are too quick to reject the benefit of interpreting physics with a broader scope. While functionalists, or systemic or emergence theorists, may be more aware of the larger meaning of causality, it is still the case that many philosophers express a simplistic view of matter.

The word teleology has become associated with medieval religious beliefs, and much like the word virtue, this has overshadowed the original Aristotelian meaning. Teleology, in its classic sense, does not represent God’s intention, or call for “thinking raindrops.” Instead, it is a way to look at systems rather than billiard balls. Efficient causes are those individual balls knocking into each other, the immediate chain of events that Hume so adeptly tore apart. Final causes are the overall organization of events. The heart beats because an electrical impulse occurs in your atria, but it also beats because there is a specific set of genetic codes that sets up a circulatory system. No one imagines it is mere probability that an electrical impulse happens to occur each second.

Likewise, the rain falls because the water vapor has condensed, but it also falls because it is part of a larger weather system that has a certain amount of CO2 due to the amount of greenery in the area. It falls in order to water the grass not in the sense that it intends to water the grass, but in the sense that it is part of a larger meteorological relationship, and it has become organized to water the grass which will grow to produce the right atmosphere to allow it to rain, so the grass can grow, so the rain can fall. These larger systemic views are what determine teleological causes, because they provide causes within systems, or goals that each part must play. This is distinct from the simple random movement that results from probability. It is obvious in some situations that systems exist, but sometimes we can’t see the larger system, and sometimes even when we do, we can’t explain its interdependence or unified behavior from individuated perspectives. Relying on efficient causality is thinking in terms of those interactions we see directly. Final causality means figuring out what the larger relationships are.

Now, those larger relationships may build out of smaller and more direct relationships, but a final cause is the assumption of an underlying holistic system. And if this were not the case, Zeno would be right and Einstein would be wrong; Hume’s skepticism would be validated and we truly would live in randomness – or really, we wouldn’t, as nothing would sustain itself in such a world. The primary thing about a world like this is that it is static, based only on matter but not on movement, which is to say, based only on a very abstracted and unreal form of matter that does not persist through time. Instead, the classic formation requires a final system that joins the activity of the world.

What this system is or how it works is not easily answered, but it must involve the awareness that temporality and interconnectedness are not the same as mysticism or magic. To boil all science down to a series of probabilistic events misunderstands the essential philosophical interest in understanding the bigger picture, or why the relation of cause and effect is reliable. The primary options are a metaphysics like Aristotle’s that unites being, a Humean skepticism about causality, or a Kantian idealism that attributes it to human perspective. Contemporary philosophers often run from the metaphysical picture, preferring to accept the skeptic’s outlook with a shrug (anything’s possible, but, back to what we’ve actually seen…) or work with some kind of neo-Kantian framework (nature only looks organized to us because we’re the result of it).

But attempts to think about the unified nature of being – as seen in the history of philosophy everywhere from the ancients through thinkers as diverse as Schopenhauer, Emerson, or Heidegger – should not be dismissed as incompatible with science. Too often it is a political split instead of a truly thoughtful one that leads to the rejection of holistic accounts. What I appreciate about Nagel’s attempt here is that he is honestly thinking rather than assuming that experts have worked things out. Philosophers tend to defer to scientists in contemporary discussions, which means physicists have been doing most of the metaphysics (which has hardly made it less speculative). It seems that exploring the meaning of scientific assumptions and paradigms is exactly the area we should be in.

Questioning a mechanistic abiogenesis or natural selection may be untenable in current biological journals, but philosophy’s purview is the bigger picture, and it is healthy for us to reach beyond the curtain, not feeling constrained by what’s already been accepted. While my questions are not the same as Nagel’s (and I won’t review his case here), I am glad at least to see the connection made coherently. Writers in philosophy of mind often make arguments that seem incompatible with certain scientistic assumptions but simply do not address the issue. There are options beyond ignoring the natural sciences or demanding a boiled-down, mechanical, deterministic view of life. Scientific research has inched toward more dynamic or creative ideas of natural change (like emergence, complexity theory, or neuroplasticity) and theories of holism (at least in physics), so challenges should not be associated with a rejection of investigation or an embracing of mythology. We all know philosophy is meant to begin in wonder – but perhaps that’s become too much of a cliché and not enough of a mission statement.

Violence, Fantasy and Civilization: Django Unchained

The current debate over reducing gun violence in America has cleaved along two basic premises: that there is a problem of violence, and that there is a problem of technology. Both are rich areas of discussion, but in this post I’m going to focus on the role of violence in America.

One of the strangest parts of the Sandy Hook massacre was its growing familiarity. The details were new and horrific – reading the names of 20 tiny children brought President Obama close to tears, and many who saw his speech besides – but the lone gunman killing helpless targets en masse, not for specific reasons but out of spite, rage, entitlement or power, seems to be a cultural pattern, and stories have already followed that confirm it. Yet looking to history, it’s hard to think anything has changed about humanity—the opening of Foucault’s Discipline and Punish, for instance, famously describes a man being torn apart by horses. The notion that violence could be specific to the modern American landscape seems laughable to anyone who’s dipped into the archives at all.

So can the enormous disparity of gun deaths be attributed directly to which deadly technologies are available to the consumer in the United States? Before going into this question too far, we have to consider why there would be such an interest in those technologies among a population to begin with, and while there are a number of potential arguments, one of them is certainly aesthetic. This returns us to the question of the American relationship to violence. The issue here is that danger, excitement, risk and power are symbolized by weapons, and associated with an ideal of freedom. In other words, violence of a certain kind is associated with the aesthetic of freedom.

Quentin Tarantino got a little upset the other day at being asked whether he thought the violence in his movies was socially destructive. His reaction is understandable given how often he has been asked to explain himself, although it’s too bad he wasn’t willing to have the discussion about Django Unchained in particular, as the movie is an interesting reflection on the American psyche. The line that sums it up is delivered during a tense moment: “He’s just not as used to Americans,” Django says, referring to his German friend’s discomfort at a brutal scene. Given that Tarantino just made a movie that was explicit about German brutality it’s clear this is not a simple claim that America is full of worse people than other parts of the world. But there is something worth thinking about in this quote.

In general, Tarantino’s movies depict a hero within a corrupt world of some kind, and show the complications of belonging to a moral subculture (accentuated by connecting the characters to the broader popular culture otherwise—like jewel thieves talking about Madonna or hit men discussing burgers). In Django Unchained, the corrupt world is America. Actually, it’s a hybrid of two Americas, the antebellum South and the Wild West—and each of these plays an important role in the larger concept of America.

Both the South and the Wild West have been romanticized in American cultural history. Looked at from the perspective of the powerless, there is very little to find charming about the South, and in Django Unchained Tarantino puts on display torture methods and practices commonly brushed under the carpet in other depictions—things like neck irons, hot boxes, angry dogs and brandings.

The constant, oppressive violence of the South is systemic, and it can be seen as based on the acceptance of the economy that perpetuates it. The traffic in human beings is woven into the market and so into the culture, and post-hoc defenses are easily created and believed by those in charge. Slaves have the choice of seeking a relative level of comfort within the system or risking torture and death, and are labeled by their oppressors as naturally incapable of self-determination when they choose the more prudent if less admirable route. The system does not provide a method by which to change the system.

The Wild West, however, is a storybook form of America—the individual cowboy who can bring about justice on his own. The violence that occurs here is understood differently, as it is not the result of an oppressive system, but of an individual taking a stand. Tarantino divides the two types of violence clearly in the way that they’re presented, and there is a gloss of fantasy over the actions of the cowboys. Nonetheless the complications peek through.

There is a scene in Django Unchained when the surviving plantation owners are returning from a funeral, thinking Django is on his way to a punitive fate, and we see them entering in their finery to a home whose walls are literally stained with blood. It’s humorous somehow, and visually evocative, but it is also deeply tragic and metaphorically apt. Even if we root for Django to bring chaos to the lives of hateful masters, the impotence of this devastation is ultimately evident. The violence of the Wild West is optimistic whereas the Southern violence is repressive, and Tarantino’s presentation of a cowboy as a counterpoint to a slave-master is perfectly intuitive. But revenge is not as simple as some (perhaps even the director, at times) imagine it to be: although the violence is cathartic and exciting, it still brings baggage with it, and the emotional weight of causing death and destruction doesn’t result in a clean slate. The Wild West violence may be a positive when compared with the authoritarian, systemic violence it is reacting to, but in a larger sense it is hardly more than a further part of the structure, the other and more risky option initially provided (quietly enforce a corrupted system, or become a corrupted individual in response to it). The movie is not just a revenge fantasy; it’s an epic tragedy.

Freedom in its deeper sense comes with its own burdens. Sartre addressed this on an individual level through the idea of personal anguish, claiming that once you truly understand what it is to make a choice, you recognize the weight you bear. Viktor Frankl put it in broader social terms, and suggested that a “Statue of Responsibility” should be built on America’s West Coast to balance the Statue of Liberty on the East Coast. In Man’s Search for Meaning he wrote: “Freedom is but the negative aspect of the whole phenomenon whose positive aspect is responsibility. In fact, freedom is in danger of degenerating into mere arbitrariness unless it is lived in terms of responsibility.”

Technology and Freedom [Freedom, part II]

In my earlier post, I suggested that we could look at freedom from three perspectives, and I will get back to that at the end of this post. But I want to also look at the way that the ideal of freedom has been affected by technological shifts.

The environment of nature has always put limitations on freedom, in that it requires certain behaviors and disallows others: there have always been “laws” in nature that we do not have the freedom to surpass. The environment demands a certain amount of food, air, water, work and rest, regardless of how those things are achieved. Nonetheless, so long as no person interferes, the natural difficulties that arise are shrugged off as amoral, merely luck and not much to account for. By this understanding, freedom as an ideal is only limited when human laws get in the way, not when disaster, illness, accident or other natural causes do. This classic American vision of freedom at first seems to contain a Rousseauian assumption that a social contract is unnecessary, that life without a social contract consists of individuals who leave one another alone and seek out what they need in relative peace.

However, such a viewpoint is radically at odds with a world of business. In order for industry and technology to grow, for capitalism to achieve its goals, it is vital that networks and groups – companies and corporations – are formed, compete and grow as well. In fact it seems that the 19th century assumption is more Hobbesian in its premise but just draws a different conclusion: life without a social contract is nasty, brutish, short – and totally awesome. The fewer rules prescribed, the more battles must be fought, but this is a benefit rather than a cost, and the “collateral damage” of those lost in the fight is worth the rise of empire.

But all of this becomes more complicated as technology expands. While nature provided limitations that could not be denied, the freedoms of individuals allow for the alteration of nature and new rules are put into play. In other words, the environment of a contemporary person is less limited by natural factors than by the structure of society. Unless born into specific circumstances, a person cannot simply start hiking, foraging, farming or hunting to survive. Instead, to afford food, shelter and transportation it’s necessary to take part in the economy, and this is thanks to the revolutionary changes put into place by businessmen. Thus the freedom to do anything leads, through technology, to particular limitations for the citizen. It is not the forces of government that put those rules into place, but the forces of invention; even Amish communities allow themselves limited use of certain technologies just to be able to survive (once local resources like lumber get used up and trading becomes necessary).

In other words, society takes over for nature as the primary environmental setting in which people live, and the needs and options are determined according to social rules. The very basics – a job and a place to live – come with various strings attached, and many other aspects will seem necessary to the majority as well, things like the right sort of clothing, cable TV, household appliances, a diamond ring, a nice car, or an iPhone. Conveniences and achievable luxuries in life change expectations until it is assumed that everyone ought to be taking advantage of their availability, and they become simply “the norm”. The more such social roles become defined, not just according to gender or family but also generation, musical preferences, political parties, brands or stores, and all manner of interests, the more identity is socially secured, and freedom is harder to reach. (While one may be free to break social norms, it is always easier for those with resources than those without, as social approval is usually needed to get a job, and in any case social acceptance is a constant component of life choices.)

To return to the three aspects of freedom I discussed in part one of this post, we can link back to a classic trichotomy: one could think of these forms of freedom as elements of the true, the good and the beautiful. The first form, freedom as what you are physically able to do, describes what is actually possible and factual—but truth as potential, through the lens of technology, is an active and relative descriptor. What is possible is always becoming, not a final determination. As technology grows, even nails in coffins are looked upon like puzzles that might unlock.

The second, the choice an individual can make, is clearly in keeping with the history of the good, the right, or the legal. This too is entangled with the changing options of a world with new identities and roles. Goodness has always been perspectival in practice given the necessity of conflicting interests, even if certain thinkers have maintained belief in an ultimate form, but here it takes on a Sartrean component—what is good is whatever you are willing to live with. The individual bears the burden of complete freedom to make moral decisions, as even those who claim absolute answers can at best be “one absolute answer among many.”

Finally, the notion of what is most beautiful or appealing to the soul includes freedom in another way. Here it is the feeling of freedom as an emotion being connected to the feeling of beauty. Kant’s theory of beauty speaks of aesthetic judgment, or the mental sensation of recognizing something as beautiful, as a “free play” between imagination and understanding. Since the understanding is the ability to conceptualize or see things as belonging to categories, beauty is the ability to go beyond that and experience the item in a way that breaks free from rules or standards. Although it is merely concerned with a direct experience of the environment, and not the meaning of one’s larger social role or way of life, there is something analogous about beauty and freedom in an anarchic sense.

Altogether, then, the larger idea of freedom seems to combine an awareness of an unknown future, the weight of responsibility, and the sense of excitement of breaking out of routines. Which aspects are people worried about? It is probable that when spoken of in theoretical terms, it is the second one, a moral freedom to determine one’s own values, that is cited most, but when referred to simply as a broad worry, there are aspects of the other two as well—a sense of fear that opportunities just won’t be available or social constrictions will hold us all hostage.

In fact, I think a strong case could be made that it is that third one, the aesthetic of freedom, that drives concerns about losing freedom. And of course, the more determinations are made to assure factual freedoms, the less the aesthetic of freedom has any place. In reality, the aesthetic of freedom includes tragedy, pain, and risk – it includes competition and even violence – but the volatility inherent to this sensory freedom is at odds with the stability and reliability expected from guarantees and laws, even those that protect freedoms. Freedom writ large cannot be simply defended, but has to be understood as a whole variety of different issues and desires that can be taken in turn.

If the post-Industrial age has brought with it new problems of freedom, they are not tied to certain policies but a much more complex series of historical and technological changes that has produced roles not of family members or craftsmen, but of consumers and servers – roles heavily tied into an economy rather than a community.

The Multiplicity of Freedom [Freedom, part I]

There is a claim made by a portion of Americans—especially among those who lost the most recent election—that they defend the ideal of “freedom” and that it is in danger of slipping away, either under the current administration or just in contemporary culture generally. But the idea of freedom is both vague and complex. Although this is an enormous topic, there are a couple points I’d like to make, one regarding the multiple angles of the concept to begin with, and one regarding how history and technology have had an effect. Today, I’ll look at three ways that the concept of freedom may be grasped: as ability, as choice, and as feeling. In my next post, I’ll follow up with what this means in context.

The first version of freedom is the simple capacity to do something. This is originally inhibited only by the laws of nature—I can walk but I can’t fly, and though I am free to be lazy I still must find food if I wish to stay alive. However, as history progresses this aspect of freedom is impacted by technology and society. For instance, my first example is now false in everyday parlance – modern human beings fly all the time. Donna Haraway’s theory of cyborgs exploits this use of freedom: ultimately, what we are able to do is what makes us free, so technology is a beneficent force. For Haraway, women in particular suffer when reduced to that which nature intends—or demands—and not allowed the creativity of the artificial. Once intertwined with technological possibilities, embracing a “cyborg” nature as she calls it, women can actuate a new level of freedom. This goes against tradition and any idea of natural law, of course, in which freedom is met by clear boundaries.

The second concept is the idea of free will or autonomy, which is not the physical possibility of performing a particular action, but the process of choosing intentionally to do so. (This is the kind of freedom that usually gets tied up in theories of determinism, which I am not going to address here.) Nonetheless, autonomy is always complicated by secondary pressures and forces. That is, the individual may define this notion of freedom externally by some form of law or moral boundary that is not identical across the population. It is easy to say we should all be free, but harder to agree on whether that freedom includes certain choices—and as it turns out, much of what is considered taking away freedom by one group is seen as a way to save or protect freedom by another. It is an argument of definitions as much as policy: Is it the freedom of the mother or the fetus that should be under consideration when discussing abortion? Is it freedom of speech to be able to demean someone for their belief, or freedom of religion to be able to practice that religion without persecution? The autonomy of multiple parties has to be accounted for, and is commonly in conflict. The most libertarian approach, where existence and action always win over persecution and impediment, runs into trouble when trying to explain why people can’t be watched, used, and generally exploited, since it’s the freedom of the big guys to keep expanding their enterprises. Limitations that recognize protecting freedoms to, for instance, pursue happiness and not just maintain one’s existence, complicate definitions and also leave the edges of each person’s liberty rubbing against each other.

The third is a less specific ideal and one that permeates the American psyche. It is the fantasy of a new beginning, of wild horses and open land on an uncharted continent allowing for anything to happen. This notion can change as time passes, and history begins to settle in. America is a young country, but no longer adolescent. When Emerson wondered what the “new American Scholar” would be like, the Civil War had not even taken place yet. He advised members of the childlike country to stick closer to Nature and Action than Books, to explore things anew instead of being weighed down by history, but now Americans are bound to the traditions of our own books, quoting Emerson instead of following his advice. Even so, the feeling of excitement towards free, open space, a sense of boundlessness and lawlessness, is clearly universal, and there are multiple ways that this desire manifests. The question may be how it is related to the more distilled forms of freedom mentioned earlier.

In our most everyday use, we might say freedom is the ability to do as you choose. This definition could be thought to include both capacity and self-rule. One might presume it to be boundless unless directly challenged, but on closer inspection neither component requires there to be an immediate enemy in order to be reduced. Both the potential avenues a person can travel, as well as their own awareness and determination in making active choices, can face severe erosion due to social and environmental factors alone. In other words, a person’s freedom can be limited by the chance experiences they undergo in life, so that they are stuck in a situation where there truly is no other choice, or in terms of our definition, where they have no freedom.

Does such a situation count as a society taking away freedom? I will look into how this multiplicity of freedom can clarify the nature of the concept, as well as discuss the historical arc of technological change, next.

Our Father vs Big Brother

The tape of Mitt Romney speaking to his cohorts in what could be described as a proverbial back-room seems to have had a lasting effect – we’ll see if it turns out to make all the difference, but it certainly brought into focus the image of Romney as oblivious aristocrat.

But even more interesting to me than the specifics of this candidate’s attitudes was the evidence of a change in certain social and technological expectations. Many people responded to Romney’s comments by shaking their heads at the fact that he would say those things out loud, that he would speak so candidly. Sure, he was at a fundraiser with other super-rich political puppeteers, but he must have known the information could get out…

Of course, a couple decades ago, it probably would not have. Even if a member of the staff could afford a hidden camera, it would have taken a lot of planning and setup to get the material, and once it was on tape it would have taken a lot of work to get it nationally aired. That may not seem like much commitment, but it’s definitely active and organized: hide tiny, expensive specialty technology beforehand, then transfer incriminating material to a standard medium, and try to get a national news outlet’s attention without being dismissed as some kind of conspirator (in fact, many journalists back then might have rejected the tape as unethical just because Romney clearly didn’t realize he was being taped).

Today, a person does not even have to really care about the consequences – sometimes people will record things just because they can. In a room with a famous person and some number of non-guests with iPhones, it is not at all surprising that someone recorded Romney speaking and then put a portion of it on YouTube—there did not even need to be intent behind it. The ease of catching a person in the act has increased so monumentally that the very idea of a backroom deal is in trouble.* Anyone can tape the conversation and show it to a potential audience of millions, and they don’t even need to dislike you or want to cause harm. It’s just information sharing—the connotations or potential impact of the information are not always considered (this happens on Facebook all the time: a photo posted in fun in one context is evidence of a promise broken in another, for instance).

The idea that we are losing privacy, and even losing the desire for privacy, has been argued over since technology, and the internet especially, first began allowing for these new methods of disclosure. An angle I want to focus on is the concurrence this has with a rise in atheism. There are plenty of other reasons that the idea of God is not as popular as it once was, and technology and the internet contribute to the phenomenon in other ways as well. But there’s a social, pragmatic level at which God is becoming obsolete that could be a factor.

One of the classic reasons to have a concept of God, from society’s point of view, is the same as a reason to have Santa: “he knows when you’ve been bad or good, so be good for goodness’ sake.” From an intellectual standpoint this may not be convincing – Plato, for instance, attempts to show why we can’t use God as a referee when discussing the question of ethics in The Republic. The story of the Ring of Gyges, a ring which allows its wearer to become invisible and thus get away with any sort of immoral behavior she chooses with no repercussions, illustrates the argument that people only act ethically when they are being watched. The natural comeback is that even if the wearer is invisible, surely the gods still know and can still judge – you are always being watched, so the point is moot. God serves as an external conscience.
But in The Republic this idea is debunked—the gods are unreliable, and can be appeased by gifts or pleas for forgiveness. If you do something wrong, you can always get back on their good side. In other words, your conscience may know you were unethical this once, but do something extra-nice next week, and you’ll feel it’s been evened out.

In that way, Big Brother is more effective. If a person wants to steal something in a store, but thinks “No, God will know what I’ve done,” they might stop themselves. But they may also imagine that they can bargain with the big guy and promise to never do something like this ever again. On the other hand, if they believe there is a camera pointed at them from every direction, it will be harder to make that kind of deal. Our increasingly panoptic forms of life make it easy to see this particular utility of God being overshadowed, since people with cameras are a lot more direct and aggressive.

I am not suggesting that this would consciously affect beliefs, but if the fear of moral oversight were to shift realistically toward peers, one of God’s greatest strengths would be made irrelevant. Sure, no video can see into your heart; but if it becomes widely expected that everything that happens in a public or semi-public space could be broadcast, that knowledge could play the part of an external conscience just as well as religion.

It’s true that God was famously described as dead over a century ago by Nietzsche, and he too was concerned with moral issues. However, his focus was on the lack of cohesion or agreement in beliefs, whereas I am addressing the much more mundane but perhaps more convincing issue of the cohesion of facts. That is, Nietzsche thought the concept of God was coextensive with the idea of absolute truth, and as that became untenable, religion would die. It’s arguable to what degree that happened, but the issue here is not what is right, but whether the right thing has to be done. God as an externalized conscience becomes less effective when society is doing the job in a more obvious and graspable way (which doesn’t require that God isn’t real, just that His methods are less convincing).

It could easily be coincidence that secularism is on the rise at the same time as surveillance and general recording become the norm, but I’m suggesting that it is part of a larger cultural shift, and that the notion of God just fits less easily into a world where we can already picture a very ordinary kind of “all-seeing, all-knowing” presence. What was once supernatural is now merely artificial.

*I wouldn’t want to imply that therefore people will start being ethical, however. There are always adaptations and ways around – the idea is just that a fear of being seen is becoming much more real.