In defence of that chart, those probabilities are presumably intended to represent the likelihood of something both happening and wiping us all out if it does. While a nuclear war or a natural pandemic may be more likely to happen than a superintelligent AI, they are also much more likely to leave survivors.
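To make that explicit (this decomposition is my reading, not something the chart states): for each risk $X$,

$$P(\text{extinction via } X) = P(X \text{ occurs}) \times P(\text{extinction} \mid X \text{ occurs}).$$

Nuclear war scores high on the first factor but low on the second, while a superintelligent AI is arguably the reverse, which is how the ordering can come out the way it does.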
Now, I do think the 20% chance of extinction by 2100 implied by those figures is a bit off, but I can understand why those items are in that order.
As for the objection that superintelligent AI doesn't exist yet: consider that we are currently closer to the invention of the computer (the first general-purpose electronic machines appeared in the mid-1940s) than we are to 2100. Give it time.