The Ethics of Cyberbugs

[Photo: Parthenium beetles. Credit: Wikipedia]

In science fiction, a cyborg (“cybernetic organism”) is a combination of organic and technological components. Daleks, Cybermen, Terminators, the Bionic Man and the Borg are well known examples of fictional cyborgs. While there are real-life human cyborgs, these tend to be extremely limited. For example, a person might have a pacemaker or other such implant. However, cyborg insects are a reality.

Michael M. Maharbiz and Hirotaka Sato developed an interesting, if disturbing, system for creating cyborg beetles. The gist of the method is to equip a beetle with a “backpack” containing electronics that are linked into the beetle’s muscles and nervous system, allowing the beetle to be driven about (more or less) like a remote-control vehicle. The main reason for using cyborgs rather than purely mechanical drones is that beetles are far more effective and efficient flyers than our mechanical creations. As such, it makes practical sense to convert a beetle to a cyborg rather than trying to build a better mechanical beetle.

As far as the uses of such cyborgs, they tend to involve using the beetles much as purely mechanical (and vastly larger) drones are used: to gather information. For example, in some future battle swarms of cyberbeetles might be deployed to look for enemy soldiers within a city. As a more peaceful example, swarms of cyberbeetles might be released into the rubble after a natural disaster to locate survivors. While such cyberbeetles could prove useful, there are still moral questions regarding their creation and use.

One obvious moral concern is that creating the cyberbeetles requires modifying a living organism with technology and effectively enslaving it to serve as a drone. This, of course, actually involves two points of concern, namely the modification and the enslavement.

In terms of the modification, the main concern is that such tampering with living creatures is morally dubious, perhaps because it is unnatural. The challenge is, of course, to develop an account of the natural in which such alterations would be wrong. I will not endeavor to do so here.

In terms of enslavement, the obvious concern is that the beetles are being treated the way that the science fiction monsters the Cybermen and the Borg treat their victims: they simply take control of them with technology and rob them of their own lives. On the face of it, such technological enslavement is wrong whether it involves robbing a human or a beetle of whatever freedom they possess.

The obvious reply is, of course, that the “victims” in this case are just beetles. They do not have much of a life (or lifespan) even in the natural world and hence they are not being wronged. In fact, it could be argued that as valuable tools they would have a better life than in the wild. After all, they would be fed and protected. Presumably the Cybermen would advance similar arguments, should they ever consider the ethics of their actions. That is, the same arguments that are used to justify the enslavement of beetles could be used to justify converting a significant number of humans into human versions of the beetles. This, of course, leads to another moral concern.

While there is obviously a considerable distance between cyborg beetles and creating comparable cyborg humans (basically Cybermen), allowing beetles to be converted into cyborgs is a beetle-sized step towards converting higher organisms. After all, if a beetle would make a good flying spy, a bird would make an even better one. Also, imagine the usefulness of converted rats, cats, and dogs. From there it is a much smaller step to creating human cyborgs that are controlled by implants to engage in spying or combat. Enslaving humans in this manner is clearly wrong, and the path to this begins, obviously enough, with these beetles.

That said, it is obviously possible to stop before we get to humans—I do not, of course, want to throw out a fallacious slippery slope argument here. However, before going on a journey it is generally wise to consider where it might end.



  1. Mike,

    “… is morally dubious, perhaps because it is unnatural.”

    The concept of what is ‘natural’ went out of the (unnatural) window the moment a human or any other animal did something with an object that wasn’t part of its own ‘natural’ equipment: using tools. A bird that uses a rock to break open a shell is using unnatural means.

    “The challenge is, of course, to develop an account of the natural in which such alterations would be wrong. I will not endeavour to do so here.”

    Wise move.

    As for enslaving animals, there are many naturally parasitic relationships that seem quite freaky to us, especially when the parasite alters the behaviour of the host by altering its brain. Don’t miss this post: And for a quick example of freaky, cyborg-like control of one animal by another, try this short video:

    But we don’t seem too concerned when we alter the behaviour of horses or dogs by infecting their brains with training, sticking on a saddle or a harness. In what way are these animals then not cyborgs?

    “… allowing beetles to be converted into cyborgs is a beetle sized step towards converting higher organisms.”

    I thought we’d overcome that problem to a large extent. We do enslave and train horses, dogs, dolphins and other animals. But we’ve stopped doing that to humans. We’ve even restricted what we can do with other apes, and we’ve stopped much of the harmful exploitation of animals such as dogs in the testing of cigarettes and beauty products. Sure there are still cases that can be debated, about the use of animals in drugs trials for example. But basically the slippery slope argument made in the above quote doesn’t really hold.

    But if there is really any concern about the moral, and maybe legal, rights of these beetles, then the whole of New York should be put on trial for its discrimination against cockroaches. Live and let live.

  2. I would find it hard to object to this kind of usage of insects. They are still doing what they normally do, except they are being focused in another area. They are certainly not rational animals.

    I have more of a concern with genetic manipulation, because of the unknown paths that this might cause nature to follow.

    Humans do not have a very good history of containing science projects.

    We rush in where angels fear to tread.

    Our desire to win prizes, financial awards and academic accolades sometimes causes us to cast prudent caution to the wind.

    I am not opposed to prudent and well-planned genetic research; the potential for treatment of diseases is enormous.

    I am ok with animal laboratories that focus on developing treatments for those who suffer.

    Regardless of what the movie Star Trek said about whales, I believe that more than 99% of all species have disappeared.

    (Yes, I have two cats and a dog).

  3. Re Timrford Dec 7th

    “I am ok with animal laboratories that focus on developing treatments for those who suffer.”
    I agree, and would add that animals themselves also benefit, by way of veterinary science. I think this is so often overlooked.

  4. There is more here than a symbiotic relationship between living creatures. The article questions whether humans are acting as parasites when controlling the dynamic behaviour of living creatures. The question not asked: are humans being controlled by a parasite without knowing it?

  5. Interesting,

    I believe there is a parasitic, possibly a symbiotic relationship. Both species can benefit.

    However, I regard myself as a predatory member of the highest hierarchy of animal genus, because of my ability to think, reflect, use language, and use my hands for tools.

    Your question of whether we are being controlled by parasites needs more explanation for me. I sure think I am in the controlling or driver’s seat as regards the usage of other animals.

  6. TimrFord,
    The human body hosts a variety of digestive bacteria, viruses and dermal organisms. There is a microbiological war going on inside every human, and the prize of that war is control of the host body. The epistemic question is: how can we know if a zombie parasite is controlling our thinking?

  7. Thanks, makes some sense.

    If they do control us, then it must be as the result of either a natural response or a reasoned response. Obviously the “gamy” shrimp I had the other night won out in the battle for control of my intestines. It sure controlled my life for a while.

    I have to rule out a reasoned action or response.

    I think what you describe is a purely natural phenomenon, part of nature. Eventually our bodies, and occasionally our minds, wear out. In nature, we are at the mercy of many factors: bacteria, weather, despots, etc. Nothing is guaranteed in life. Natural selection, a phrase many of my friends do not like, has ensured that the one stability in this world is a continual state of change. Keeps it interesting, I suppose.

    Since electronics continue to be miniaturized, I suspect that one day we will have the ability to treat disease and suffering through micro-robotics, although I think that right now gene therapy has some very interesting potential.

  8. Ron Murphy

    The use of what is natural as a basis for ethics can be defended in a sophisticated way. For example, Aristotle argues that the function of humans (as humans) is what defines virtue for us. Going against our function would be to fail in our excellence. The Taoists also present what could be considered an account based on nature (in this case the “way” that is the Tao).

    There seems to be a moral difference between training and cybernetic control. When I train my dog, I am teaching my dog and she can elect to refuse to learn. Some dogs do that and dogs often refuse to learn certain things. For example, my husky refuses to learn to fetch, although she has learned many skills (like opening doors, working gate latches with her paws and so on). In contrast, being wired up is direct control and the animal has no choice. Teaching my dog is rather like the raising of a child while wiring up an animal is enslaving it.

    Now, I do agree when animals are trained with cruel compulsion it is a wrongful thing.

    As far as the slippery slope argument, I do close by noting that I am aware of the fallacy and that the slide need not occur. As you note, we have changed our treatment of humans and animals, at least in the laws of many nations. Of course, human slavery is still widely practiced.

  9. Dennis,

    It depends on how the parasite works. In sci-fi, the usual deal is that the parasite takes control of the nervous system, but the person retains their own mind, rather like a helpless passenger in a hijacked vehicle. Heinlein’s Puppet Masters and the Goa’uld of Stargate work this way.

    Of course, a parasite might also work another way, by directing the host’s feelings and decisions so that the host believes it is making the choices. In that case, the host would not know if s/he was being controlled (barring detection of the parasite, of course).

  10. The ethical question of cybernetic augmentation is already with us; note, for example, the possibly questionable use of Paralympic technology.

    I note that you examine the question of control of the will and the use of technology in this respect. It might be argued that we are already engaging in this through the psychological conditioning of desires by marketing media. But for me this is not the biggest risk in the ethical sense.

    For me the issue is a transhumanist ethical question, about the ethical use of emerging technology (cybernetic as well as genetic) to augment our living experience. At the level of reducing suffering this is easier to handle than, say, extending the functionality and longevity of human life into something other than what can be called “human”: creating the human 2.0, or the transhumanist future that many expect to follow the technological singularity within the next 30 years or so.

    Given this, for me, the more pressing ethical challenge that the advent of sophisticated cyborg technology brings is… “for whom?”

    Assuming it is not just an augmentation for the reduction of physical suffering but more of an enhancement of the opportunity for a more fulfilling life, the question is: shall we create a society of haves and have-nots?

    Will we create a race of cosmists and terrans? A kind of Wellsian Time Machine future of Morlocks and Eloi…


  11. Mike,

    “There seems to be a moral difference between training and cybernetic control. When I train my dog, I am teaching my dog and she can elect to refuse to learn. Some dogs do that and dogs often refuse to learn certain things.”

    There is a moral issue in that hopefully you are participating in a relationship with your dog where it enjoys the training. The moral issue changes when your dog doesn’t want to be trained; if you are a police dog trainer, for example, what moral responsibility is taken towards rejected dogs? But that still leaves unanswered the moral issue of ‘owning’ animals, and so using them for our ends, even if, in the case of dogs and horses, that early use is lost in time. Just as dogs have (artificially) adapted to ‘like’ their relationship with humans, we could foresee a time when bugs would be changed to ‘like’ their new role as cyborgs.

    The difference you are identifying is about the current status of the relationships between humans and dogs on the one hand and humans and bugs on the other.

    “In contrast, being wired up is direct control and the animal has no choice. ”

    What choice do dogs really have? What if a dog chooses to have no owner and acts in a way that we consider wild and dangerous to us? Does he stand trial in any realistic sense? We have merely made the choice easier for them to live with by forcing them, through breeding, to adapt. While I see the difference you are making, it seems like a moral smokescreen. Many women in the Church of England think the church should not ordain women bishops. They clearly make that choice because of what they have been taught about their own faith. Do they have a choice in coming to that belief? Have they, like your dog, been adapted to be trained to obey their faith’s teachings?

    I think the ‘parasitic’ perspective is interesting and it does open up a wider context in which to view all these different types of relationships, but it also brings into question other points about how we want to apply our morals now, as opposed to the traditional perspectives we have held for some time; so I’ll make a separate comment.

    “Aristotle argues that the function of humans (as humans) is what defines virtue for us. Going against our function would be to fail in our excellence. The Taoists also present…”

    But I see this as ancient outmoded philosophical woo. The ‘function of humans is what defines virtue for us’? Well, yes, in an entirely human context. But our ‘excellence’? Isn’t this just so much bunk? Again, I need to explain my objection further.

  12. The specific term ‘parasitic’ can be viewed in broader terms as the interaction of systems, with no teleology implied – except that we traditionally apply teleological descriptions when it comes to what humans do. But then if we take into account the illusion of free-will even teleological intention is ultimately a cause and effect system that is centred on the uncontrolled behaviour of the brain. In a sense, our intention to harness and use a horse is a trick that we have learned to perform that is simply far more complex in its origins and development than the way some bug harnesses a snail for the conveyance of its offspring. We don’t recognise any teleological intention on the part of the bug – we class it as an unthinking unplanned acquired adaptation. But the harnessing of horses is a behavioural outcome of the adaptation of thinking, when applied to how we use our environment, of which the horse is a part. We sort of couldn’t help ourselves resort to using animals intentionally when we acquired the cognitive capacity to do so; no more than a parasitic bug infects a snail.

    The reason it gets interesting and confusing now is because we use that same cognitive adaptation to start wondering if we should stop parasitically using horses for racing, or monkeys to test drugs. Or we wonder if perhaps we should not use bugs as cyborgs. We have invented morals for our own interaction, and now feel morally obliged to apply them everywhere – especially if the subject seems to be sentient to any degree whatsoever; i.e. it has some neurons.

    Our moral urges are driving our application of our morals. It’s tempting to think in terms of memes, which are only a descriptive label for some complex replicating or recursive behaviour, but instructive nevertheless. I think our recent (last century or so) attitude towards the wider application of morality is probably one of the many behaviours that we enjoy which had some evolutionary benefit in its origins, but which has become a system of behaviour in its own right, like playing games. We’ve stumbled into developing our moral behaviour; we get some satisfaction from it, and it helps things go more smoothly, most of the time. We stumbled into using animals.

    I feel that we are now trying to find our place in the world in a way that is not traditional. We are becoming more conscious about how we are making stuff up about morals as we go along. Traditionally we were all God’s creatures and part of his plan; or we were a part of some grand natural non-designed cosmological system where we turned out to be the pinnacle (as far as we could tell). But always we are at the top, as if custodians of the whole shebang (the earthly if not the heavenly, for those that believe in that stuff), and so having some sort of moral obligation to the rest of the subordinate life and the planet we find around us. It may have been complex in detail, but we all sort of knew our place, or had intuitively decided what it was. Appearing to be at the top of the earthly food chain, we assumed that responsibility, and concocted philosophical, political and religious systems that enforced that principle. Or, “Going against our function would be to fail in our excellence”, as Mike echoed Aristotle. But, as I said, we stumbled into this state of affairs, and all the religion and philosophy is an attempt to explain that state of affairs, the way we are.

    But as we learn more about human biology and behaviour and our place in a wider physical context it seems to be turning out that we are just one more arbitrary product of an unfolding system, and one that we don’t fully understand. We unconsciously use other animals as they use us, and it’s not always clear who is the greater beneficiary in a particular relationship. But there is no morality out there to which we can refer to decide whether any of this is good or bad.

    All our morals are contextually related to our human inheritance (“the function of humans (as humans) is what defines virtue for us”), but totally arbitrary in relation to anything to which we are not related.

    So ideas about the moral responsibility we have towards the planet, for example, are entirely based on what the planet means to us. There is no rational sense in which we owe any moral responsibility towards the rocky planet earth.

    Any morality we choose to consider in relation to other animals is somewhat both contextual and arbitrary. It’s contextual in that we are related to other animals to different degrees. But it’s arbitrary in that our morals developed about us, not other animals, and though many other animals have similar in-species biological relationships, there are many that are quite different to us and so to the moral standards we set for ourselves, such as eating one’s spouse after mating. How would we have adapted that particular biological behaviour into a moral code had it been a bit closer to us in our ancestry? Outside our context our morals are arbitrary. Eating your spouse, or parasitically using another animal (as a bug does a snail, or as we might cyborg a bug) – where’s the morality in this? We seem to apply it in some relation to the extent to which our biological empathy drives us, and we don’t have much control over that – we feel pain when cute kittens are harmed.

    There have been times when there would have been no reason, beyond any personal empathy for a particular animal, to consider the wellbeing of animals generally. There have been times when we haven’t been particularly equitable with our morals when it comes to members of our own species: races, groups, children, women; and most religions still aren’t, at least for the last. Is it more morally wrong for the C of E to rule out women bishops, or for humans to use bugs as cyborgs? Is it more wrong to saddle a horse, or to saddle a woman with childbirth instead of abortion if she would choose the latter? What’s the parasitic relationship between a mother and a fetus?

    Until we’re a lot clearer about what we think morals are, and until we have a greater agreement about that, some of the specific questions, such as the use of bugs as cyborgs, don’t have a foundation upon which to build a consistent rational argument. We are merely left to vote on it: John thinks all life is sacred and wouldn’t knowingly step on a bug; Jack thinks only ‘higher’ animals count (and he’ll draw his own line), so bug cyborgs are fine; Jim has no problem drowning kittens, flogging his horse, or hunting foxes, so thinks considering a bug’s status is pointless; Joe thinks only human life is sacred, but though bug cyborgs go against God’s intention (how does he know that?) still women can’t be bishops; Julian thinks the cosmic consciousness feels pain every time we dig up a field of daisies; …

    Personally I’m a materialist physicalist (using convenient labels). Some systems have a physical response that in human and some animal physical systems is classified as pain and suffering. Even as a physical system I acknowledge the subjective experience of pain and suffering and empathetically acknowledge it in many other species, and suspect it in yet others. And out of empathy I would choose to avoid it where I could. But specifically there is too much advantage to humans and too little knowledge of the pain and suffering a bug can experience, so I don’t have too much of a problem cyborging bugs. Or poisoning cockroaches and rats in our cities. The benefits to us outweigh my awareness of suffering in them; or, for cockroaches and rats, the harm they cause to us is greater (they hold the beneficial hand, parasitically).

    Of course I also acknowledge that ignorance is no permanent excuse. What do we do if we find cockroaches suffer great pain and anguish as they die from our poisons, or that a cyborg bug struggles more psychologically with his lot than Sisyphus? Change our behaviour again, I suppose?
