Note that all of the above arguments can be fixed by a "disable backdoors in production builds" policy. It's requiring field-updatable devices that really kills you here. (You can secure field updates with high-quality public-key crypto, but doing that well is hard.)
My point is that the engineering organization has to care in an effective way; a build policy like that is great, but you have to back it up with intent.
Before Microsoft's big security push (whatever else you want to say about it, they made a huge effort), most of the above attitudes existed. Now you can't turn around without going through a security review... some more effective than others, but at least they're trying.
Armoring a system so that it will accept only signed updates isn't that hard (just check signatures and refuse any update that fails to verify). This is different from armoring a system against hardware-level attacks, which Bunny and the NSA and a LOT of other people are good at.
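To make that concrete, here's a minimal sketch of the accept-only-signed-updates check, assuming Ed25519 via Python's third-party "cryptography" package; the key handling and the update_is_authentic helper are illustrative, not any particular vendor's actual firmware scheme:

    # Sketch: a device accepts an update image only if a detached Ed25519
    # signature over it verifies under the vendor's public key.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import (
        Ed25519PrivateKey,
        Ed25519PublicKey,
    )

    def update_is_authentic(image: bytes, sig: bytes, vendor_pk: Ed25519PublicKey) -> bool:
        try:
            vendor_pk.verify(sig, image)   # raises InvalidSignature on any mismatch
            return True
        except InvalidSignature:
            return False

    # Demo only: sign a fake image in-process. On a real device the private
    # key never leaves the vendor; only the public key ships in the firmware.
    vendor_sk = Ed25519PrivateKey.generate()
    vendor_pk = vendor_sk.public_key()
    image = b"firmware v1.2 payload"
    sig = vendor_sk.sign(image)

    print(update_is_authentic(image, sig, vendor_pk))                  # True
    print(update_is_authentic(image + b"tampered", sig, vendor_pk))    # False

The hard part isn't this check; it's protecting the signing key, handling rollback, and making sure nobody ships a build with the check stubbed out.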
Armoring a system against intentional holes is not an engineering problem; it's a people/attitude problem.
Armoring a system against bugs (buffer overruns, etc.) requires that you solve the people/attitude problem first, and then do meaningful security engineering. This might be really easy for a flash drive, which should have a really simple attack surface.