AI existential risk (or 'x-risk') refers to the prospect of AI causing the extinction of humanity. This is the outcome that some predict could occur if we manage to create artificial superintelligence (ASI): machine intelligence that surpasses human intelligence.
In Human Compatible: AI and the Problem of Control, Stuart Russell explains the …


