You talk with these interesting certainties. If a human dies and we have their body and DNA, we can be fairly sure they're dead.
With something like an AI you can never be sure: you would have to trace the entire chain of thermodynamic evidence back to when the AI was created to confirm that no copy of it was ever made. Not just by you, but by any potential intruder in the system.
You're the only person I know of who uses "containment" to mean "extermination" in the context of AI.
Extermination might work, though only on models large enough that people aren't likely to sneak copies out of the office for their own use, or simply upload copies to public places. (Didn't that already happen with the first public release of Stable Diffusion, or am I misremembering?)