[PHILOSOPHY] Against Transhumanism

Transhumanism is basically the idea that biotechnology and information technology will merge and eventually improve the human body and mind. More intelligence. Cancer-eating nanobots in the blood vessels. Increased longevity and perhaps even eternal life. Simply everything we could wish for - after we've lost the ability to ask for anything other than what the techno-commercial masters claim to be able to give us, that is. But will we still be human if we do all this to ourselves? Transhumanists reply: "No! And the sooner the better! We must evolve beyond the human."

Transhumanism and post-humanism are about learning, in advance, to accept a hypothetical technology whose main function is to be a retroactive compulsion. A technology that will force us to say: "This technology is now a reality whether we like it or not. We have become dependent on it, so let's just adapt." This is the bitter little triumph that transhumanism has to offer, and its representatives are already celebrating. We are supposed to invest so much time and so many unfortunate fantasies in this biotechnological concept of the future that we don't have any options left - so that we have to actualize it. We're not there yet, but we are made to believe that it's too late to change course toward any other possible future.

You say you want to make man better. What do you mean by "better"? Better for what purpose? After a few such questions, peeling off layer after layer of ill-conceived notions, this "better" will prove to be completely hollow.

If you want to make a transhumanist miserable, you should respond to every future improvement he lists with Socratic questions:

Transhumanist: "We will be able to live 150 years - at least!" 

You: "Yes, but why is it important?"

Soon, you'll both discover that transhumanism has nothing other than acceleration to offer. An acceleration that quickly becomes a desperate end in itself - and to drown out the looming realization that the acceleration never arrives anywhere, the pace is turned up faster and faster.

Behind these visions of a hyper-advanced technology lies a handful of simplistic values. This is especially apparent in Ray Kurzweil, one of the future visionaries who have gone furthest in this direction. He's hoping for something called "the singularity" - a technological big bang expected to occur when accelerating artificial intelligence takes a leap beyond anything any human can imagine. An exciting thought, but what does he mean by intelligence? His conception of intelligence most of all resembles a five-year-old's idea of horsepower: a million horsepower must be better than 250! All Kurzweil wishes for is more.

People who sell this kind of vision of the future are remarkably fond of biological metaphors. Transhumanism, artificial intelligence and "the singularity" are described as a continuation of the 'explosion' of life from unicellular organisms to humans. The biological metaphors make their vision of the future seem natural and inevitable. By implication: saying no to their future is just as stupid as a bacterium three billion years ago refusing evolution. Another implication is that evolution somehow stagnated after reaching humanity - and that a hard technological kick in the ass is what it needs to get to the next level.

Transhumanism is a fantasy of power. Just as much, it's a fantasy of our total submission to technology. Therein lies the double allure that makes it so appealing. Man is as strongly attracted to submission as he is to power. We so much want to believe that "development" is a power we have to submit to, just as a stone age farmer was a slave to the weather. But development is a tapestry of human decisions - economic, political, everyday and scientific. Therefore, development is our responsibility. The qualities we should be practicing for the future are not submission and acceptance, but rather judgment and a sense of responsibility.

The greatest challenge now is to get people to lose interest in transhumanism. How will that come about? If we want to find something that attracts us more than transhumanism, we must drop the idea that development and evolution are a narrow line along which we can only move in two directions, forward or backward.

From the point where we are now, an incalculable number of possible futures radiates in all directions of the compass. "Forward" is not a direction; forward is 360 degrees. The proponents of transhumanism want to make theirs appear to be the only way forward. The other possible futures that await in the remaining 359 directions we have not even begun to imagine.
