arguments_for_ai_risk:is_ai_an_existential_threat_to_humanity:argument_for_ai_x-risk_from_large_impacts

Differences

This shows you the differences between two versions of the page.


arguments_for_ai_risk:is_ai_an_existential_threat_to_humanity:argument_for_ai_x-risk_from_large_impacts [2023/02/12 04:31]
katjagrace ↷ Page moved from arguments_for_ai_risk:argument_for_ai_x-risk_from_large_impacts to arguments_for_ai_risk:is_ai_an_existential_threat_to_humanity:argument_for_ai_x-risk_from_large_impacts
arguments_for_ai_risk:is_ai_an_existential_threat_to_humanity:argument_for_ai_x-risk_from_large_impacts [2023/06/08 21:44] (current)
jeffreyheninger A few sentences were in a different font.
Line 32:
 <li><div class="li">The creation of advanced AI is more likely to impact humanity’s long term trajectory than other developments (from 1)</div></li>
 <li><div class="li">Developments with large impacts on the future are more likely to be worth influencing, all things equal, than developments with smaller impacts</div></li>
-<li><div class="li"><span style='color: initial; font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, Oxygen-Sans, Ubuntu, Cantarell, "Helvetica Neue", sans-serif;'>One main way things might not be equal is that some developments may be easier to influence than others</span>, but AI currently looks tractable to influence.</div></li>
+<li><div class="li">One main way things might not be equal is that some developments may be easier to influence than others, but AI currently looks tractable to influence.</div></li>
 <li><div class="li">The creation of advanced AI is especially likely to be worth trying to influence, among other developments</div></li>
 </ol>
Line 94:
  
 <HTML>
-<p>(We treat 2 as a separate argument, ‘<em style='font-size: revert; color: initial; font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, Oxygen-Sans, Ubuntu, Cantarell, "Helvetica Neue", sans-serif;'><a href="https://aiimpacts.org/argument-from-most-intelligent-species/">Argument from most intelligent species</a></em>‘.)</p>
+<p>(We treat 2 as a separate argument, ‘<em><a href="https://wiki.aiimpacts.org/doku.php?id=arguments_for_ai_risk:is_ai_an_existential_threat_to_humanity:will_malign_ai_agents_control_the_future:argument_for_ai_x-risk_from_most_intelligent_species">Argument from most intelligent species</a></em>‘.)</p>
 </HTML>
  
arguments_for_ai_risk/is_ai_an_existential_threat_to_humanity/argument_for_ai_x-risk_from_large_impacts.1676176302.txt.gz · Last modified: 2023/02/12 04:31 by katjagrace