> Einstein's letter to Roosevelt was written before the atomic bomb existed.
Einstein's letter [1] predicts the development of a very specific device and mechanism. AI risks, by contrast, are typically presented without reference to any specific device or system type.
Einstein's letter predicts the development of this device in the "immediate future". AI risk predictions are rarely presented alongside a timeframe, much less one in the "immediate future".
Einstein's letter explains specifically how the device might be used to cause destruction. AI risk predictions describe how an AI device or system might be used to cause destruction only in the vaguest of terms. (And, not to be flippant, but when specific scenarios that overlap with areas I've worked in are described to me, they sound more like someone recounting their latest acid trip, or the plot of a particularly cringe-worthy sci-fi flick, than a serious scientific or policy analysis.)
Einstein's letter urges the development of a nuclear weapon, not a moratorium, and makes reasonable recommendations about how such an undertaking might be achieved. AI risk recommendations almost never correspond to how one might reasonably approach the type of safety engineering or arms control one would typically apply to armaments capable of causing extinction or mass destruction.
[1] https://www.osti.gov/opennet/manhattan-project-history/Resou...