According to the Future of Humanity Institute, there are a few ways that humanity could end: some very sci-fi causes, some more realistic ones. I think this topic is pretty interesting for a few reasons. It's hard to imagine humanity simply ending, but according to this institute's 'research' it's seemingly inevitable. And besides, everything has to end eventually. While I think most of this amounts to theory-crafting and speculation, it's still pretty interesting to consider the ways we could go out (listed below according to FHI).
Personally, I'm a big fan of the superintelligent robots theory. I'm not sure how they got the percentage they did, especially considering that we don't have anything close to superintelligent AI right now. It seems like the machines would have to develop sentience, or at least artificial sentience, to pull something like this off. Molecular nanotech seems completely unrealistic as well, although I suppose it's possible.
The more realistic option, of course, is nuclear holocaust... which apparently has a lower chance of happening than superintelligent robots or nanotechnology taking over the world. I don't see how, considering that nuclear bombs exist while super smart AI does not. "Wars" is also a pretty vague category, but entirely possible and probably religiously or financially inspired. I'm surprised that meteorites or other external threats aren't listed, because a meteorite could certainly cause human extinction. Basically, I think FHI is bullshit and that they don't really know what they're talking about. I'm not certain how they got these percentages, but I guess that's beside the point of the bigger picture.
I'm not really sure what the point of this is. I just think the topic is interesting. What do you think is going to kill off humanity?