Cambridge-based Group Develops Civilization V Mod to Educate Public About the Dangers of Superintelligent AI

The Centre for the Study of Existential Risk (CSER) at the University of Cambridge isn’t known for gaming, but the threat of superintelligent AI has led it to develop a mod for Civilization V, a popular strategy game. Focused on global catastrophes that could lead to the total annihilation of humanity, CSER has studied climate change as well as technological and biological risks since 2012. Its goal is to study these existential risks and develop multi-stakeholder strategies so that humanity can take advantage of new technologies while minimizing the risks of technological progress.

When it comes to AI, CSER believes that superintelligence poses a real threat. Through the Civilization mod, it hopes to educate the public about superintelligence and give players a sense of what might happen to their civilizations if the technology is not managed properly. The mod also offers players a way to counter a malevolent superintelligence: research.

If you choose to go down the AI path, you need to make sure you have more safe AI research than rogue AI research.

Shahar Avin, Research Associate, Centre for the Study of Existential Risk

Source: https://kotaku.com/civilization-mod-adds-a-super-ai-that-can-destroy-human-1821860158
