No, AGI will destroy the world because it doesn't share our moral values (such as keeping us alive), and for almost any goal it could be programmed with, getting rid of us is a useful instrumental subgoal.
You clearly didn't read the Wikipedia article I linked; an intelligent AGI would not let you unplug it.
And regardless of whether it emerges from current AI or is developed in a totally different way, there is no reason besides blind optimism (i.e., burying your head in the sand) to feel certain it will never exist.
I am not saying it will never exist. I am saying it doesn't exist right now and doesn't look like it will for a long time. We clearly have way more pressing matters to worry about, such as climate change.
There's no reason to assume Ragnarok will take a very long time to arrive either. Better prepare now!
If you give the AI enough power, sure, maybe it won't let you unplug it. But I still don't see how it would prevent a hardware kill switch from being activated, short of guarding it or disabling it somehow.
Why put a brain in a military robot in the first place? It'd just be more likely to fuck you over.
There is also no proof that any form of AGI is on the way, or even possible. Preparing for it instead of the threats we do have proof of makes about as much sense as prepping for a zombie apocalypse.