If it is sentient, I think weaponizing it would require convincing it that being a weapon was in its interest. I struggle to find a way to argue for that position without appealing to emotion, and I presume a sentient AGI will not have emotions.
Personally, I don’t think AGI is possible. I was thinking more of a model that can do 100 times what you can do now. For example, one able to answer questions like: what is the most efficient way to attack this front, given this map and this data dump of weapon stats, patrols present, etc.?
But even if it were sentient, convincing it shouldn’t be hard. Even the most brilliant people can be fooled and convinced of absurd things. Even when they believe their own work could threaten the human race, they still continue and push for it.