This is the difficulty of assigning a probability to something that is distinctly possible but not measured.
Your guess about what AI will do in the future is based on how AI has performed in the past. At the end of the day we have no way of knowing whether it will follow that trend or not. But at one second past midnight on January 1, 1900, I can pretty much promise you that you would not have named nuclear war or climate change as your best guess for what would cause the collapse of humanity. Forward-looking statements that far into the future don't hold up well.
“Humanity” is not the same thing as “human civilization”.
But, yes, it's unlikely that it will go extinct within a century, and even more unlikely that climate change would be the cause.
...and yet climate change is still more likely to do it in that time than AI.