LessWrong/en

{{languages|LessWrong}} '''LessWrong''' (aka '''Less Wrong''') is a discussion forum founded by Eliezer Yudkowsky focused on rationality and futurist thinking.


== History ==
According to the LessWrong FAQ,{{cn}} the site developed out of Overcoming Bias, an earlier group blog focused on human rationality. Overcoming Bias originated in November 2006, with artificial intelligence (AI) theorist Eliezer Yudkowsky and economist [[Robin Hanson]] as the principal contributors. In February 2009, Yudkowsky's posts were used as the seed material to create the community blog LessWrong, and Overcoming Bias became Hanson's personal blog.{{cn}}


LessWrong has been closely associated with the [[effective altruism]] movement. Effective-altruism-focused charity evaluator GiveWell has benefited from outreach to LessWrong.{{cn}}


== Roko's basilisk ==
In July 2010, LessWrong contributor Roko posted a thought experiment to the site in which an otherwise benevolent future AI system tortures anyone who does not work to bring the system into existence. This idea came to be known as "Roko's basilisk," based on Roko's idea that merely hearing about the idea would give the hypothetical AI system stronger incentives to employ blackmail. Yudkowsky deleted Roko's posts on the topic, later writing that he did so because although Roko's reasoning was mistaken, the topic shouldn't be publicly discussed in case some version of the argument could be made to work. Discussion of Roko's basilisk was banned on LessWrong for several years thereafter.{{cn}}


== Media coverage ==
LessWrong has been covered in Business Insider and Slate. Core concepts from LessWrong have been referenced in columns in The Guardian.{{cn}}


LessWrong has been mentioned briefly in articles related to the technological singularity and the work of the [[Machine Intelligence Research Institute]] (formerly called the Singularity Institute). It has also been mentioned in a positive light in articles about [[Neo-reactionary movement|online monarchists and neo-reactionaries]].<ref>http://techcrunch.com/2013/11/22/geeks-for-monarchy/</ref>