Think Tank

Prominent Scientists, Faith Leaders, Policymakers and Artists Call for a Prohibition on Superintelligence, as Poll Shows Americans Don’t Want It

Initial signatories include AI pioneers Yoshua Bengio and Geoffrey Hinton, leading media voices Steve Bannon and Glenn Beck, Obama's National Security Advisor Susan Rice, business trailblazers Steve Wozniak and Richard Branson, five Nobel Laureates, former Irish President Mary Robinson, actors Stephen Fry and Joseph Gordon-Levitt, and hundreds of others.

Chase Hardin · 1 min read

Executive Summary

A coalition of prominent scientists, faith leaders, policymakers, and artists has called for a prohibition on superintelligence, citing concerns over its potential risks and unintended consequences. The initiative has garnered significant attention, with initial signatories including influential figures from AI research, media, politics, and entertainment. A concurrent poll suggests that Americans are skeptical of superintelligence development, with many expressing reservations about its potential implications. While the proposal may be well-intentioned, it raises questions about the feasibility and practicality of restricting superintelligence research. Furthermore, the initiative's unusually broad coalition of supporters may dilute its credibility and effectiveness as a call to action.

Key Points

  • Prominent coalition calls for prohibition on superintelligence
  • Signatories include influential figures from AI research, media, politics, and entertainment
  • Poll suggests Americans are skeptical about superintelligence development

Merits

Raises awareness about superintelligence risks

The initiative highlights the potential risks and unintended consequences associated with superintelligence, sparking a necessary conversation about its development and implications.

Broadens the discussion on superintelligence

The coalition's diverse membership brings together individuals from various fields, expanding the scope of the discussion and encouraging a more comprehensive understanding of superintelligence's complexities.

Demerits

May be overly broad or unrealistic

The call for a prohibition on superintelligence may be overly ambitious, given the complexity of AI research and development, and the potential for unintended consequences of restricting its progress.

Lacks clear definition or boundaries

The initiative's scope and goals may be unclear, leading to confusion and diluting its impact as a call to action.

Expert Commentary

While the initiative is well-intentioned, the call for a prohibition on superintelligence should be approached with caution and nuance. Superintelligence is a complex and multifaceted concept, and its development and implications cannot be addressed by a simple prohibition. A more effective approach may involve developing and promoting safety-oriented practices and guidelines within the AI research community, rather than attempting to restrict its progress outright. Furthermore, the initiative's broad coalition of supporters would benefit from a more focused and coordinated effort to address the challenges and opportunities that superintelligence presents.

Recommendations

  • Develop and promote safety-oriented practices and guidelines within the AI research community
  • Establish a more focused and coordinated effort to address the challenges and opportunities associated with superintelligence

Sources

Original: Future of Life Institute