
The Death of Humanity

Posted: Sun, December 30, 2012 | By: Dirk Bruere



Something I wrote on Facebook seems to have sparked a controversy, not least with the editor of H+ Magazine (the voice of HumanityPlus), Peter Rothman. The exact post is this:

“Well, something that might well have contributed to me being kicked from the WTA list is that I view H+ is essentially a project to murder the Human race. If we get our way then in a few centuries tops there won’t be any Humans about, unless its a few scraps on a reservation or zoo somewhere.”

What do I mean by this? Well, for a start, “murder” is a word I use rhetorically, although a project whose core meme includes creating a vastly superior “PostHuman” species that will inevitably out-compete us in all spheres might reasonably lead to that word. The problem is that although we may merge with, be absorbed by, or voluntarily become that new species, it will inevitably lead to the extinction of the current Human genome and phenotype, not to mention the vaguer “Human mentality”. To illustrate, let’s take a step back and consider a hypothetical “TransNeanderthalism”, or N+ for short.

Here we have a group of Neanderthals dedicated to creating a better species to be called “PostNeanderthals”, or more succinctly, “Humans”. Now, the N+ people have decided that the new creation will probably incorporate some Neanderthal DNA but otherwise it will be smarter, faster, more dextrous, better at tool creation and use, breed faster and require less energy to live. The predictable downside may well be that the new Humanity out-competes Neanderthals and drives them to extinction, not deliberately but simply as a consequence of always getting the resources that the Neanderthals need to flourish.

So, what if someone said to the N+ group that they were plotting to murder all Neanderthals and essentially commit genocide? Would that be a valid accusation? With hindsight, one might argue “Yes”.

Which brings us to modern Transhumanism (H+). We are engaged in a variety of projects that may well result in a PostHuman species. It may be something similar to what we already are if it’s merely genetic engineering, or it may be something utterly alien if it is one of a variety of potential Artificial General Intelligences. Intermediate candidates would be brain simulations, or even uploaded Human or animal minds that have been enhanced beyond Human capabilities.

There is much discussion about how such creations may impact Humanity in the short term, but almost everyone agrees that old-style Homo Sapiens will not have a long-term future as masters of their own destiny. The best we might hope for is pampered pet status, or simply to be left alone on reservations (which might include the entire Earth). The worst scenario would be to be treated as we treat the natural world and other animal species now – driven to extinction as “collateral damage” because PostHuman wants and needs outweigh the value of our lives.

Can we stop all this? No – not without wrecking the modern world and crashing the population to hunter-gatherer levels. That would mean a ninety-five percent death rate over the next few decades. A cure worse than the disease. What we can do is try to look ahead and choose the best path available. Nevertheless, my claim that we are plotting the extinction of Humankind stands.

Of course, this is not popular from a public relations and fundraising point of view, which is why some organizations like to dress up H+ in rather bland terms and keep quiet on these issues. Indeed, I have even been told that a poll question I would like to address to Transhumanists is meaningless.

It is this: If you knew that the price for creating a finer and greater intelligence on Earth was ultimately the extinction of Humanity, would you still do it?

I suspect the answer would shock non-Transhumanists – but I do not know, since the question has never been posed to a representative sample. Oh… and my personal answer is “Yes”. Since in the natural way of things I am going to die anyway, I would rather roll the dice and have a shot at Godhood.

My last Facebook post on that topic:

I don’t think it unreasonable that if a video were sent back from the future showing a world filled with alien creatures, unimaginable technology and no Human Beings, most people would assume we were extinct. Nor that, if they were told all of this was planned by H+ people now alive, many would consider it genocide.

 

___________

Dirk

Full Spectrum Praxis : ZERO STATE : http://zerostate.net

 



Comments:

No, what I objected to was the support for the writings of Vladimir Frolov, who calls for the murder of the human race as a solution to the world’s problems.

Then you wrote:

“I view H+ is [sic] essentially a project to murder the Human race.”

“I think it is quite a legitimate position to hold that H+ will result in the death of our species”

“‘Murder’ is a matter of perception and rhetoric”

Lots of murderers would agree with that, I suppose, but I don’t.

By Peter Rothman on Dec 30, 2012 at 8:24am

Transhuman short stories that address some of this:

I have posted them free.

The story of the hard start is on Kindle: “the quantum sausage machine”.

By bob hanson on Dec 30, 2012 at 8:38am

This is a problem when an abstract form of “we” is used. You define humanity as a certain genotype and phenotype. This is the same as defining me as my body. I am more than the physical form that shows up on a recording. I exist within an environment of energies as a living being with memories and personality.

Transhuman thought seeks to make me self-directed towards being better than what I am, and this does not destroy me or humanity, but rather allows me to create a better me of my choosing.

By Sean Byrne on Dec 30, 2012 at 9:54am

Ever watch the new Doctor Who episodes?

The “death of humanity” is a meme constantly addressed and overturned. The Doctor is constantly pointing out that humans still exist even though the “human” of our age has changed and evolved and interbred and moved beyond the petty definitions of humanity that we cling to so violently today.

I’ve addressed this so many times. Our “form” and our “limitations” are not what define us as human. But for those with limited worldviews, I can see where you’re coming from.

By Valkyrie Ice on Dec 31, 2012 at 7:54am

Of course, there is also a far from zero probability that the word “murder” might be perfectly accurate if we end up with a hostile or even indifferent AGI as the PostHuman species. It is far from clear that any such species will contain anything at all of Humanity in any form, apart from maybe a bit of Linux code.

By Dirk Bruere on Dec 31, 2012 at 9:08am

If I plan to educate a 5-year-old child, to teach him what the world is like, to teach him to think, to develop his brain, I am starting a course of events that will result in dramatically changing who that child is over the next decade or so. Is that “murder”? And if I plan to educate myself to make myself into something smarter, to teach myself to think better, is that the equivalent of suicide?

To live is to change. I don’t think we’re going to “murder” anyone; I think that most of the human race will join the transhuman project in one way or another. Advancing ourselves isn’t “murdering the human race”; we can’t stay who we are and stay where we are, freeze ourselves in time, any more than that 5-year-old child can. Where we are is simply not a sustainable long-term place. Either we adapt and change, one way or another, or we die.

By Yosarian on Dec 31, 2012 at 10:48pm

It is also worth noting that the view I am espousing is not one commonly held by H+ people. However, “ordinary” people react strongly even to the threatened loss of cultural identity, let alone elimination of genome and phenotype. The latter is likely to go far beyond issues like nationalism, gender bias, homophobia etc., all of which have resulted in mass murder.

By Dirk Bruere on Jan 01, 2013 at 5:35am

Hi Dirk. I hope you get to put your question to a sample properly. The response would be very interesting. Of course, the exact wording of the question will strongly influence the answer, so it might make sense to have a few alternatives and to put them to different, representative groups. An expensive business, but a worthwhile one, I think. It will be useful in writing the public-facing communications of the future.

On the other hand, I do think it appropriate that a well-managed organisation should properly control its communications. This is nothing to do with freedom of speech. A nutter can always post his opinions in a blog or through letterboxes. Organisations with members and a controlling board, like H+, are under no obligation to share whatever articles get posted to them. That isn’t censorship; it’s sound management. (Or, at least, it can be sound management if it’s done right.)

Fear seems to be the problem inherent in the substantive issue here. Once one reaches the point of recognising that each human being is already a part of something much larger* and that none of us really matter, the fear tends to fade, because there is less to lose. Subjectively I find such recognition to be a very warm, welcoming feeling, rather like going home. One can easily imagine a sci-fi writer turning this into a story about sheeple rescued at the last moment by an action hero from the mindless extinction wrought by the manipulative H+ AI; yet I think you are right, despite that. It’s a very tough sell.

*Incidentally, I learned yesterday that current leading astrophysical theory is that the ratio between the sizes of an atom and the observable universe (warp drives aside) is the same as the ratio between the sizes of the observable universe and the whole universe. Woah. That’s without going anywhere near the many-worlds interpretation. That brings me to the ‘shot at godhood’ comment in your article. The scale is past fathomable. It’s not easy to see how even God could fathom it. Mental.

By Darren Reynolds on Jan 02, 2013 at 6:30am

Hi Darren. What I have been trying to do with this article, and the one on the Borg, is dig into what seems to be the unwritten rule of Transhumanism, namely that some things should not be spoken of. Or, if they are, they must be worded “correctly” for fear of frightening the sheep. It’s a very elitist view, IMHO.

Which brings me to the possible ramifications of such a poll. Imagine this being picked up by the popular mass media and this headline appearing:
“X% of Transhumanists would kill you and your family to make their sick dreams come true”

By Dirk Bruere on Jan 02, 2013 at 10:45am

This should be completely obvious to anyone who has studied all pertinent details.

By Khannea on Jun 27, 2013 at 9:25am

