AIs may take over is that they would be capable of recursive self-improvement, which could in turn trigger an intelligence explosion and the emergence of a superintelligence. Being far superior to humans, such a system would be almost entirely unpredictable: a super-intellect might invent or discover methods or weapons capable of controlling or eliminating humans with ease. From its perspective, protecting the fragile environment from parasitic human harm could appear to be a natural and important fiduciary responsibility of governance, and limits on energy use and habitat destruction could be forced on vulnerable civilizations.