avturchin
Organizations, sites and people involved in x-risks prevention
Risk categories: AI risks, general x-risks, nuclear, global warming, bio-risks. Entries are arranged by size and level of influence.
MIRI (former Singularity Institute) - E. Yudkowsky, link
FHI - Future of Humanity Institute, Oxford - Nick Bostrom, link
CSER - Cambridge Centre for the Study of Existential Risk - Martin Rees, link
FLI - Future of Life Institute - Elon Musk, link
GCRI - Global Catastrophic Risk Institute - Seth Baum. Interesting newsletter, many articles in scientific journals. link
X-Risks Institute - Phil Torres. Interesting articles by its main author; focus on existential terrorism and religion. link
Convergence - Justin Shovelain. Collective think tank concentrated on mathematical modeling of x-risks. link
Lifeboat Foundation. Very large scientific boards which don't actually do anything, but some useful discussion goes on in its mailing list. site; mailing list (a good one)
X-risks on Reddit: the ExistentialRisk and ControlProblem subreddits
Willard Wells - author of "Apocalypse When?" and a prevention plan
Bulletin of the Atomic Scientists - famous Doomsday Clock, link
Arctic News - Sam Carana. Irreversible global warming because of methane hydrate eruption. link
World Health Organization (WHO) - includes a division called Global Alert and Response (GAR), which monitors and responds to global epidemic crises. GAR helps member states with training and coordination of response to epidemics. link
Global Challenges Foundation - Laszlo Szombatfalvy. Works to raise awareness of global catastrophic risks. Primarily focused on climate change, other environmental degradation and politically motivated violence, as well as how these threats are linked to poverty and rapid population growth. link
Known very well, and a large amount of work is done
Large and interesting research has been created, but not many people know about it
Open places for discussion
Longecity subforum, link
Foundational Research Institute - "Currently, our research focuses on reducing risks of dystopian futures in the context of emerging technologies." Interesting work on AI safety. link
Club of Rome - still exists! Famous in the 1970s, when it produced "The Limits to Growth". link
CISAC - "The Center for International Security and Cooperation is Stanford University's hub for researchers tackling some of the world's most pressing security and international cooperation problems." Nuclear, cybersecurity, bio, anti-terrorism. link
EA - Effective Altruism
EA forum
Facebook groups:
• Existential Risks (Adam Ford)
• Global Catastrophic Risks Research and Discussion (Evan Gaensbauer)
• Global Catastrophic Risks
• Stop Existential Risks
Nuclear Threat Initiative, link
Saving Humanity from Homo Sapiens - small one-person organisation without any actual work. link
Jaan Tallinn - investor in x-risk-related projects, wiki
Skoll Global Threats Fund - "To safeguard humanity from global threats." Climate, water security, pandemics, nuclear proliferation. link
Stimson Center - "The Stimson Center is a nonpartisan policy research center working to solve the world's greatest threats to security and prosperity." Non-proliferation. link
Investors and important figures
Peter Thiel - invested in MIRI
Elon Musk - wants AI safety through OpenAI, and humans on Mars as a backup plan
Scientists and researchers
LessWrong
Adrian Kent - LHC risks
Toby Ord - site, "existential hope"
Impact risks
Robin Hanson - blog; societal collapse risks
Katja Grace - Fermi paradox and Doomsday Argument; blog; AI Impacts
A. Sandberg - works at FHI and has co-authored papers
R. Freitas - nanotech risks
Ploughshares Fund - "Supports the smartest minds and most effective organizations to reduce nuclear stockpiles, prevent new nuclear states, and increase global security." link
David Denkenberger - agricultural risks
Alexander Kononov - coined the term "indestructibility of civilization"
Milan Ćirković - Stevenson probe, anthropic shadow, Fermi paradox. Site
Bruce Tonn - editor and writer, link
A large group of people working on AI safety, including but not limited to: Steve Omohundro, Luke Muehlhauser, Stuart Armstrong, Roman Yampolskiy, Nate Soares, Vladimir Nesov, Kaj Sotala, Benja Fallenstein, Riva Melissa Tez, Jason Gaverick Matheny (wiki), Andrew Critch (blog), Paul Christiano, Carl Shulman, Anna Salamon
Bill Gates - has his own foundation and vision of global risks
flutrackers.com - forum about risks of a flu pandemic
OpenAI - Elon Musk, wiki
Stephen Hawking - warned about risks of aliens and AI
Dennis Meadows - resource depletion risks
David Brin - writer, "Existence"
Vernor Vinge - writer who originated the Singularity idea
Writers
IPCC - Intergovernmental Panel on Climate Change
Zoltan Istvan - presidential candidate from the Transhumanist Party; wrote about x-risks
Greg Egan - writer, "Permutation City"
Leverage Research - the site is almost empty now. link
Global Priorities Project - created the Global Catastrophic Risks 2016 report; collaborates with the UK government; Dr. Toby Ord is a member; connected with the EA movement. link
X-risks net - Alexei Turchin. Creating a full database on x-risks and a prevention plan. link
Wiki resources: LW wiki
Nano risks: Foresight Institute, link
Holocene Impact Working Group - estimates risks of recent impacts. link
Defusing the Nuclear Threat, link
Norwegian transhumanists
Oxford Martin Programme on the Impacts of Future Technology, link
NASA, link
The United States Agency for International Development (USAID) has its Emerging Pandemic Threats Program, which aims to prevent and contain naturally generated pandemics at their source.
The Lawrence Livermore National Laboratory has a division called the Global Security Principal Directorate, which researches on behalf of the government issues such as bio-security, counter-terrorism, etc. link
Public figures
John Barnes - "Mother of Storms"
Arnon Dar - risks of supernovae
Bill Napier - risks of dark comets
Sam Altman - Y Combinator; co-founded OpenAI
R. Carrigan - risks of SETI
Bill Joy - wrote a famous article but now seems to have lost interest
Max Tegmark - wrote articles together with Bostrom
EA forum, link
Intelligent agents forum - technical discussion on AI safety, link
Discussion in comments
IEET, Future of Life