====== Müller and Bostrom AI Progress Poll ======

// Published 29 December, 2014; last updated 10 December, 2020 //

<HTML>
<p>Vincent Müller and Nick Bostrom of FHI conducted a <a href="http://www.nickbostrom.com/papers/survey.pdf">poll of four groups of AI experts</a> in 2012–13. Across all groups combined, the median date by which respondents gave a 10% chance of human-level AI was 2022, and the median date by which they gave a 50% chance was 2040.</p>
</HTML>


===== Details =====


<HTML>
<p>According to <a href="http://www.amazon.com/Superintelligence-Dangers-Strategies-Nick-Bostrom/dp/0199678111">Bostrom</a>, the participants were asked when they expect “human-level machine intelligence” to be developed, defined as “one that can carry out most human professions at least as well as a typical human”. The results were as follows; the groups surveyed are described below.</p>
</HTML>
<HTML>
<table border="1" cellspacing="0">
<tbody>
<tr>
<td>Group</td>
<td>Response rate</td>
<td>10%</td>
<td>50%</td>
<td>90%</td>
</tr>
<tr>
<td>PT-AI</td>
<td>43%</td>
<td>2023</td>
<td>2048</td>
<td>2080</td>
</tr>
<tr>
<td>AGI</td>
<td>65%</td>
<td>2022</td>
<td>2040</td>
<td>2065</td>
</tr>
<tr>
<td>EETN</td>
<td>10%</td>
<td>2020</td>
<td>2050</td>
<td>2093</td>
</tr>
<tr>
<td>TOP100</td>
<td>29%</td>
<td>2022</td>
<td>2040</td>
<td>2075</td>
</tr>
<tr>
<td>Combined</td>
<td>31%</td>
<td>2022</td>
<td>2040</td>
<td>2075</td>
</tr>
</tbody>
</table>
</HTML>


<HTML>
<p><b><em>Figure 1: Median dates for different confidence levels for human-level AI, given by different groups of surveyed experts (from <a href="http://www.amazon.com/Superintelligence-Dangers-Strategies-Nick-Bostrom/dp/0199678111">Bostrom, 2014</a>).</em></b></p>
</HTML>
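The three columns can be read as three points on each group's implied cumulative probability curve for human-level AI. As a rough illustration only (a piecewise-linear interpolation we introduce here, not a method used by Müller and Bostrom), the combined figures of 10% by 2022, 50% by 2040, and 90% by 2075 can be interpolated to sketch an implied probability for intermediate years:

```python
# Illustrative sketch: piecewise-linear interpolation between the combined
# group's reported percentile years. The interpolation is our assumption,
# not part of the original survey analysis.

percentiles = [(2022, 0.10), (2040, 0.50), (2075, 0.90)]

def implied_probability(year):
    """Interpolated cumulative probability of human-level AI by `year`."""
    if year <= percentiles[0][0]:
        return percentiles[0][1]
    if year >= percentiles[-1][0]:
        return percentiles[-1][1]
    for (y0, p0), (y1, p1) in zip(percentiles, percentiles[1:]):
        if y0 <= year <= y1:
            return p0 + (p1 - p0) * (year - y0) / (y1 - y0)

print(round(implied_probability(2030), 3))  # 0.278: 8/18 of the way from 10% to 50%
```

Outside the reported 10%–90% range the curve is unknown, so the sketch simply clamps to the endpoint values there.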


<HTML>
<p>Surveyed groups:</p>
</HTML>


<HTML>
<p>PT-AI: Participants in the <a href="http://www.pt-ai.org/2011">2011 Philosophy and Theory of AI</a> conference (88 total). Judging from the list of speakers, this group appears to have contained a fairly even mixture of philosophers, computer scientists, and others (e.g. cognitive scientists). According to the paper, they tend to be interested in theory, not to do technical AI work, and to be skeptical that AI progress will be easy.</p>
</HTML>


<HTML>
<p>AGI: Participants in the 2012 AGI-12 and AGI Impacts conferences (111 total). These people mostly do technical AI work.</p>
</HTML>


<HTML>
<p>EETN: Members of the <a href="http://www.eetn.gr/">Greek Association for Artificial Intelligence</a>, which accepts only published AI researchers (250 total).</p>
</HTML>


<HTML>
<p>TOP100: The 100 most-cited authors in artificial intelligence, across all years, according to <a href="http://academic.research.microsoft.com/RankList?entitytype=2&amp;topdomainid=2&amp;subdomainid=5&amp;last=0&amp;orderby=1">Microsoft Academic Search</a> in May 2013. These people mostly do technical AI work, and tend to be relatively old and based in the US.</p>
</HTML>
  
ai_timelines/predictions_of_human-level_ai_timelines/ai_timeline_surveys/muller_and_bostrom_ai_progress_poll.txt · Last modified: 2022/09/21 07:37 (external edit)