== The view that transhumanism will lead to 'gigadeath' ==

''Main: [[Existential risks]]''

As a variant of the previous idea, it may prove to be the case that humans - even if biologically enhanced - are unable to compete with robotic lifeforms. Again, the threat of a fundamental division opens up. Some writers foresee this division developing into an existential clash, similar to what happened in human prehistory, when the Neanderthals became extinct in the face of competition from ''Homo sapiens''. For example, [http://www.forbes.com/2009/06/18/cosmist-terran-cyborgist-opinions-contributors-artificial-intelligence-09-hugo-de-garis.html Hugo de Garis] refers to a forthcoming "Artilect War":

<blockquote>
The issue of species dominance will dictate our global politics this century. Given the rate at which technologies are developing that enable “artilects”–artificial intellects–it is likely that humanity will be able to build artilects with mental capacities that are literally trillions upon trillions of times above the human level. Humanity will then have to choose whether to become the No. 2 species on the planet or not...

In about a decade there will be a thriving artificial brain industry, and nearly everyone will have a home robot, which will be upgraded every two or three years. Each new home robot generation will be smarter and more useful than the previous generation, so that as the gap between the human intelligence level and the artificial intelligence level gets smaller every year, the species dominance debate will heat up. Millions of people will be asking such questions as: "Can the machines become smarter than humans? Is that a good thing? Should there be a legislated upper limit to machine intelligence? Can the rise of machine intelligence be stopped? What if China’s soldier robots are smarter than America’s soldier robots?" And so on and so forth.

Considering all this, I predict that humanity will split into three major philosophical, ideological, political groups, which I label as follows.

–The Cosmists (based on the word “cosmos”) will be in favor of building these godlike machines (the artilects), who would be immortal, think a million times faster than humans, have unlimited memory, go anywhere, do anything and take any shape...

–The Terrans (based on the word “terra,” meaning the earth) will be opposed to the construction of artilects, fearing that in a highly advanced form, the artilects may decide to wipe us out. To ensure that the probability that this might happen is zero, the Terrans will insist that the artilects are never built in the first place. But this strategy runs utterly contrary to what the Cosmists want. The Terrans will be prepared to go to war against the Cosmists to ensure the survival of the human species.

–The Cyborgists (based on the word “cyborg,” meaning cybernetic organism that is part machine, part human) will want to become artilect gods themselves by adding artilectual components to their own brains, thus avoiding the bitter conflict between the Cosmists and the Terrans.
</blockquote>

De Garis contrasts his own forecast of the future with a view held by Ray Kurzweil, among others, which he describes as an "over-optimistic prediction that the rise of the artilect this century will be a positive development for humanity". De Garis continues as follows:

<blockquote>
I think it will be a catastrophe. I see a war coming, the “Artilect War,” not between the artilects and human beings, as in the movie ''Terminator'', but between the Terrans, Cosmists and Cyborgists. This will be the worst, most passionate war that humanity has ever known, because the stakes – the survival of our species – have never been so high. Given the period in which this war will occur, the late 21st century, with late 21st century weapons, the scale of the killing will not be in the millions, as in the 20th century (the bloodiest in history, with 200-300 million people killed in wars, purges, holocausts and genocides) but in the billions. There will be gigadeath.
</blockquote>

This threat of impending war is made worse by the following consideration:

<blockquote>
Kurzweil claims that if ever a war occurred between the Terrans and the other groups, it would be a quick no-contest battle. The vastly superior intelligence of the artilect group would quickly overcome the Terrans. Therefore I claim that the Terrans will have to strike first while they can, during the “window of opportunity,” when they have comparable intelligence levels.
</blockquote>

Transhumanists can reply as follows:
* The potential for growing divergence needs to be acknowledged in advance.
* As stated in the [http://humanityplus.org/philosophy/transhumanist-declaration/ Transhumanist Declaration], "Research effort needs to be invested into understanding these prospects. We need to carefully deliberate how best to reduce risks and expedite beneficial applications".
* It is a core part of the transhumanist project to support initiatives by organisations that address [[Existential risk]]s.