It’s still far off, but the concept is this: IF we ever create an artificial general intelligence that is smarter than we are, that intelligence could in turn create an even smarter intelligence, or improve itself. Eventually it could leave us so far behind intellectually that we couldn’t do much if it one day decided to wipe out humanity to limit global warming, or for some other reason.