Transhumanist Declaration

A video montage presenting the Transhumanist Declaration (July 16, 2018)

The Transhumanist Declaration was originally crafted in 1998, at the time of the formation of the World Transhumanist Association (WTA), by members of the WTA, Extropy Institute, and other transhumanist groups.

Overview

The Transhumanist Declaration has been modified over the years by several authors and organizations.[1] It was adopted in its present form by the Humanity+ Board in March 2009.[2][3] The first version was created at the time of the formation of the World Transhumanist Association by the following international group of contributors:

Doug Baily, Anders Sandberg, Gustavo Alves, Max More, Holger Wagner, Natasha Vita-More, Eugene Leitl, Bernie Staring, David Pearce, Bill Fantegrossi, den Otter, Ralf Fletcher, Kathryn Aegis, Tom Morrow, Alexander Chislenko, Lee Daniel Crocker, Darren Reynolds, Keith Elis, Thom Quinn, Mikhail Sverdlov, Arjen Kamphuis, Shane Spaulding, and Nick Bostrom. Of this group of contributors, more than 70% were members of Extropy Institute and supported extropianism.

It is influenced by the earlier transhumanist manifesto by Natasha Vita-More.

Versions

Latest 2009 version (wikified)

The latest version on the new Humanity+ site:[4]

1. Humanity stands to be profoundly affected by science and technology in the future. We envision the possibility of broadening human potential by overcoming aging, cognitive shortcomings, involuntary suffering, and our confinement to planet Earth.

2. We believe that humanity’s potential is still mostly unrealized. There are possible scenarios that lead to wonderful and exceedingly worthwhile enhanced human conditions.

3. We recognize that humanity faces serious risks, especially from the misuse of new technologies. There are possible realistic scenarios that lead to the loss of most, or even all, of what we hold valuable. Some of these scenarios are drastic, others are subtle. Although all progress is change, not all change is progress.

4. Research effort needs to be invested into understanding these prospects. We need to carefully deliberate how best to reduce risks and expedite beneficial applications. We also need forums where people can constructively discuss what should be done, and a social order where responsible decisions can be implemented.

5. Reduction of existential risks, and development of means for the preservation of life and health, the alleviation of grave suffering, and the improvement of human foresight and wisdom should be pursued as urgent priorities, and heavily funded.

6. Policy making ought to be guided by responsible and inclusive moral vision, taking seriously both opportunities and risks, respecting autonomy and individual rights, and showing solidarity with and concern for the interests and dignity of all people around the globe. We must also consider our moral responsibilities towards generations that will exist in the future.

7. We advocate the well-being of all sentience, including humans, non-human animals, and any future artificial intellects, modified life forms, or other intelligences to which technological and scientific advance may give rise.

8. We favour allowing individuals wide personal choice over how they enable their lives. This includes use of techniques that may be developed to assist memory, concentration, and mental energy; life extension therapies; reproductive choice technologies; cryonics procedures; and many other possible human modification and enhancement technologies.

2002 version

The last version hosted on the old WTA site:[5]

(1) Humanity will be radically changed by technology in the future. We foresee the feasibility of redesigning the human condition, including such parameters as the inevitability of aging, limitations on human and artificial intellects, unchosen psychology, suffering, and our confinement to the planet earth.

(2) Systematic research should be put into understanding these coming developments and their long-term consequences.

(3) Transhumanists think that by being generally open and embracing of new technology we have a better chance of turning it to our advantage than if we try to ban or prohibit it.

(4) Transhumanists advocate the moral right for those who so wish to use technology to extend their mental and physical (including reproductive) capacities and to improve their control over their own lives. We seek personal growth beyond our current biological limitations.

(5) In planning for the future, it is mandatory to take into account the prospect of dramatic progress in technological capabilities. It would be tragic if the potential benefits failed to materialize because of technophobia and unnecessary prohibitions. On the other hand, it would also be tragic if intelligent life went extinct because of some disaster or war involving advanced technologies.

(6) We need to create forums where people can rationally debate what needs to be done, and a social order where responsible decisions can be implemented.

(7) Transhumanism advocates the well-being of all sentience (whether in artificial intellects, humans, posthumans, or non-human animals) and encompasses many principles of modern humanism. Transhumanism does not support any particular party, politician or political platform.

This version carries the explanatory note:

The Declaration was modified and re-adopted by vote of the WTA membership on March 4, 2002, and December 1, 2002.

Note that the changes from the 1998 version (below) are relatively few.

July 1998 version (2.4)

This is the earliest known published version of the document.[6]

(1) Humanity will be radically changed by technology in the future. We foresee the feasibility of redesigning the human condition, including such parameters as the inevitability of ageing, limitations on human and artificial intellects, unchosen psychology, suffering, and our confinement to the planet earth.

(2) Systematic research should be put into understanding these coming developments and their long-term consequences.

(3) Transhumanists think that by being generally open and embracing of new technology we have a better chance of turning it to our advantage than if we try to ban or prohibit it.

(4) Transhumanists advocate the moral right for those who so wish to use technology to extend their mental and physical capacities and to improve their control over their own lives. We seek personal growth beyond our current biological limitations.

(5) In planning for the future, it is mandatory to take into account the prospect of dramatic technological progress. It would be tragic if the potential benefits failed to materialize because of ill-motivated technophobia and unnecessary prohibitions. On the other hand, it would also be tragic if intelligent life went extinct because of some disaster or war involving advanced technologies.

(6) We need to create forums where people can rationally debate what needs to be done, and a social order where responsible decisions can be implemented.

(7) Transhumanism advocates the well-being of all sentience (whether in artificial intellects, humans, non-human animals, or possible extraterrestrial species) and encompasses many principles of modern secular humanism. Transhumanism does not support any particular party, politician or political platform.

March 1998 version (2.1)

This version, named "Transhumanist Principles 2.1", was published for comment on the EXI mailing list on 27 Mar 1998 by Nick Bostrom.[7]

Hello everybody! On the WTA action list we have been discussing an updated formulation of a set of transhumanist principles. The latest version is as follows. Comments and suggestions are welcome...

(1) Humanity will be radically changed by technology in the future. We foresee the feasibility of redesigning the human condition, including such parameters as the inevitability of ageing, limitations on human and artificial intellects, unchosen psychology, suffering, and our confinement to the planet earth.

(2) Serious effort should be put into understanding these coming developments and their long-term consequences.

(3) Transhumanists believe that by being generally open and embracing of new technology we have a better chance of turning it to our advantage than if we try to ban or prohibit it.

(4) Transhumanists advocate the moral right for those who so wish to use technology to extend their mental and physical capacities and to improve their control over nature. We seek personal growth beyond our current biological limitations.

(5) In planning for the future, it is mandatory to take into account the prospect of dramatic technological progress. It would be tragic if the enormous potential benefits failed to materialize because of ill-motivated technophobia and unnecessary prohibitions. On the other hand, it would also be tragic if intelligent life went extinct because of some disaster or war involving advanced technologies.

(6) We need to create forums where people can rationally debate what needs to be done, and a social order where responsible decisions can be implemented.

(7) Transhumanists support scientific humanism. Though as a matter of fact most transhumanists believe in individual freedom, transhumanism in itself is politically neutral.

The following people have contributed comments: Doug Bailey, Anders Sandberg, Gustavo Alves, Max More, Holger Wagner, Natasha More, Eugene Leitl, Doug Baily Jr., Berrie Staring, David Pearce, Bill Fantegrossi, den Otter, Ralf Fletcher

Transhumanist Principles 1.0a

Another important input to the Transhumanist Declaration was the "Transhumanist Principles 1.0a". The following version, hosted on Anders Sandberg's website, dates from 1996:

1. Transcend!

Strive to remove the evolved limits of our biological and intellectual inheritance, the physical limits of our environment, and the cultural and historical limits of society that constrain individual and collective progress.

2. Pragmatism.

Use whatever tools prove effective toward this goal. Technology, and the intellectual disciplines used to develop it, are currently among the most effective such tools.

3. Memetic propagation.

Support the proliferation of transhumanist principles and goals, consciously setting an example that others may follow or promoting the principles of transhumanism directly. Spread awareness of the dangers of technophobia, coercion, anti-humanism and other destructive ideologies.

4. Achievement.

Whether seeking health, fitness, intellectual goals, or financial or social success or political accomplishment, strive to achieve your individual ambitions. Cooperate with other innovators and optimists to reach goals both personal and global.

5. Diversity.

Promote human efforts to grow and adapt to an ever-changing universe. Tolerate people of all schools of thought that do not seek to limit the extent or variety of your achievement. Discourage any attempts to impose will or ideas through coercion.

6. Evolution.

These principles should evolve, in order to address the needs of future Transhumanity; but resist any change in the principles that limits transhuman activity.

These principles were preceded by the following note:

About this document - by Alexander Chislenko

The following is a draft Transhumanist principles that has been put forward and discussed by a group of Transhuman list members. The idea of writing the Principles belongs to Alex Bokov, who has also been a major driving force in the discussions. Other participants include Anders Sandberg, Rich Artym, Nancie Clark, Romana Machado, Sasha Chislenko, Mark A. Plus, and Christopher T. Brown.

The purpose of these Principles is to define a "consensus platform" of Transhumanism that would allow us to see what ideas and goals we have in common as a group, and to present them to people trying to understand what this transhumanism is all about.

This purpose apparently cannot be achieved by a small group of people, so the following document is just a draft that as we hope will be discussed and modified on the list, after which we can agree on the first "official" version of Principles.

The point of preliminary discussion was to spare the list members from a flood of technical details. The complete archive of the discussion is currently available at http://www.lucifer.com/~sasha/refs/Principles_Archive.html

Proposed alternative versions

In 2013, Transhumanity.net ran a competition asking for new versions of the Transhumanist Declaration. The winning submission was by Dirk Bruere. Entries by Samantha Atkins and Jason Xu tied for second place. The entries are available on an archive of Transhumanity.net.[8]

A "redux" version proposed by Nikola Danaylov is available on Singularity Weblog, in both text and video format.[9]

Related declarations and manifestos

Zoltan Istvan's novel The Transhumanist Wager featured the concept of "Teleological Egocentric Functionalism" and the "3 laws of transhumanism", arguably the principles of what has been characterised as survivalist transhumanism:[10]

1) A transhumanist must safeguard one's own existence above all else.

2) A transhumanist must strive to achieve omnipotence as expediently as possible--so long as one's actions do not conflict with the First Law.

3) A transhumanist must safeguard value in the universe--so long as one's actions do not conflict with the First and Second Laws.

The Technoprogressive Declaration was issued on the IEET website in November 2014.[11] It was drafted at a side meeting of TransVision 2014 in Paris.

Ben Goertzel and Giulio Prisco published Ten Cosmist Convictions in 2009.[12]

The Brussels Declaration for Radical Healthspan Extension was adopted on 1 October 2016 at the Eurosymposium on Healthy Ageing, held in Brussels.[13]

See also

References