Then do it insecurely and warn the user. 'We can't figure out how to do this securely' is their BS excuse for every bad design decision or unimplemented feature.
You might say 'but they don't want to make people less secure, and people will get the wrong idea!' But they already give people the wrong idea, in ways that are much worse than allowing the user to make a security decision for themselves.
You can change a setting to prevent screenshots inside the Signal phone app. Your conversations are now secure, right? Nobody can take pictures of your disappearing messages! WRONG. That setting only applies to your own device: you can turn it on, and I can still take screenshots all day, including of the disappearing messages you send to me.
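(For the curious: on Android this kind of setting is typically implemented with FLAG_SECURE, which only affects the window on the device that sets it; nothing about it reaches the other end of the conversation. A minimal sketch, where `screenSecurityEnabled()` is an invented stand-in for the user's preference:)

```kotlin
import android.os.Bundle
import android.view.WindowManager
import androidx.appcompat.app.AppCompatActivity

// Sketch of how an Android app typically blocks screenshots.
// FLAG_SECURE only affects *this* window on *this* device; the
// person you are messaging can still screenshot on their end.
class ConversationActivity : AppCompatActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        if (screenSecurityEnabled()) { // hypothetical user setting
            window.setFlags(
                WindowManager.LayoutParams.FLAG_SECURE,
                WindowManager.LayoutParams.FLAG_SECURE
            )
        }
    }

    private fun screenSecurityEnabled(): Boolean = true // placeholder
}
```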
Likewise, Signal can't tell if you're downloading pictures or copying text I send to you. You could be backing up everything - my only 'assurance' is that you probably aren't doing it because it's inconvenient.
You can change the disappearing-messages timer to anything you want! Great! But the timer change is itself a message. So if we are arrested and police get into one person's Signal, they can see when disappearing messages were turned on and when the timer was lengthened or shortened. Sure, the messages disappeared, but what were you doing on August 23rd at 7:39pm that made you set the timer to 10 minutes for 3 hours? We know where you were because of your phone's IMEI; I guess we'll tell the court you were trying to cover something up during those 3 hours and charge you with obstruction of justice.
I have asked them to change the latter behavior repeatedly, explaining why it could be a problem for users, and all I ever hear is 'good point, we'll look into it' even though there's no reason that information should be stored.
Your former examples are things that quite simply can't be mitigated in any case. If you want to send a message to someone, there is no way to prevent them from storing it: their device is outside your control.
Your latter example is also a security concern they can't fully address. A jurisdiction that allows a message about a settings change to be used as the basis for an obstruction charge could just as easily treat the mere use of Signal the same way (though I do agree the former is problematic on its face).
I don't know the ins and outs of the backup problem, but it doesn't take a PhD in cryptography to envision a case where your backup settings open all your contacts to automated dragnet surveillance. In that case it doesn't make sense for a single user to be able to downgrade everyone else's security.
I'm not saying they can be mitigated; I'm saying that casual users get the illusion of security through settings that seem to mitigate security concerns but don't.
The disappearing-message timer history could absolutely be mitigated by simply not retaining or timestamping that information.
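To make the distinction concrete, here's a hypothetical sketch (not Signal's actual schema; all names are invented) of the two behaviors: keeping a timestamped record of every timer change versus applying the change and keeping only the current value:

```kotlin
// Hypothetical sketch: two ways a client could handle a
// disappearing-messages timer change. Not Signal's real schema.

data class TimerChangeEvent(
    val conversationId: String,
    val newTimerSeconds: Int,
    val timestampMillis: Long // <- the incriminating part
)

class ConversationSettings(var timerSeconds: Int = 0)

// Problematic: keep a timestamped audit trail of every change,
// giving anyone with device access a timeline of your behavior.
fun applyAndRecord(
    settings: ConversationSettings,
    event: TimerChangeEvent,
    history: MutableList<TimerChangeEvent>
) {
    settings.timerSeconds = event.newTimerSeconds
    history.add(event)
}

// The mitigation being asked for: apply the change, keep only the
// current value, retain no timestamped record of when it happened.
fun applyAndForget(settings: ConversationSettings, newTimerSeconds: Int) {
    settings.timerSeconds = newTimerSeconds
}
```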
If you could export or back up single conversations, you would have much more granularity than exporting or backing up your entire message database. The other people in the conversation could even get a message saying it had been exported. There are lots of cases where you might want to do this by mutual agreement, but it isn't possible.
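A purely hypothetical sketch of what that could look like; no such API exists in Signal, and every name here is invented:

```kotlin
// Hypothetical mutual-consent, per-conversation export. The export
// is surfaced to everyone in the conversation the same way timer
// changes are surfaced today.

data class Message(val sender: String, val sentAtMillis: Long, val body: String)

interface ConversationTransport {
    // Post a visible system message into the conversation.
    fun postSystemNotice(conversationId: String, notice: String)
}

fun exportConversation(
    conversationId: String,
    messages: List<Message>,
    transport: ConversationTransport
): String {
    // Everyone in the conversation sees that an export happened.
    transport.postSystemNotice(conversationId, "This conversation was exported.")
    // Serialize just this one conversation, not the whole database.
    return messages.joinToString("\n") { "${it.sender}: ${it.body}" }
}
```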
What I don't understand here (though admittedly I haven't been following the iOS discussions closely) is why backups are possible on Android but not on iOS.
The backups on Android are near useless as well: they expect users to remember and save a massively long string of digits (pre-generated, so users can't even choose a password they'd remember), backups run only when triggered manually, and they go onto device storage, where they'll be lost along with everything else if the device breaks or dies.
Getting that backup off the device is yet another manual process most users have to remember to do.
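Roughly, that flow looks like the sketch below: the app generates a long numeric passphrase for you (around 30 digits) and derives the backup encryption key from it. The PBKDF2 derivation here is my assumption for illustration, not Signal's actual scheme:

```kotlin
import java.security.SecureRandom
import javax.crypto.SecretKeyFactory
import javax.crypto.spec.PBEKeySpec

// Sketch of the Android-style backup passphrase flow. The app
// generates the digits; the user can't pick something memorable
// and is expected to write the string down somewhere safe.

fun generateBackupPassphrase(digits: Int = 30): String {
    val rng = SecureRandom()
    return (1..digits).joinToString("") { rng.nextInt(10).toString() }
}

// Assumed derivation for illustration only; not Signal's scheme.
fun deriveBackupKey(passphrase: String, salt: ByteArray): ByteArray {
    val spec = PBEKeySpec(passphrase.toCharArray(), salt, 100_000, 256)
    return SecretKeyFactory.getInstance("PBKDF2WithHmacSHA256")
        .generateSecret(spec).encoded
}

fun main() {
    val passphrase = generateBackupPassphrase()
    println("Your backup passphrase: $passphrase") // user must save this
}
```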
Compare this to Telegram: user doesn't have to do anything.
Compare this to iMessage: user doesn't have to do anything.
Compare this to WhatsApp: user just needs to click agree.
The last two even save backups in an E2E encrypted fashion unreadable by servers.
As I understand it, iOS backups normally go to iCloud, where they're stored encrypted but the keys are held by Apple (i.e. not end-to-end encrypted, and not a zero-knowledge system by any stretch). This makes iCloud-stored iOS backups susceptible to subpoenas, malicious employees, and good-enough hackers.
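The difference comes down to where the decryption key lives. A minimal sketch of the device-only-key approach an E2E backup scheme needs (standard JCA calls; nonce management and key backup are simplified away):

```kotlin
import java.security.SecureRandom
import javax.crypto.Cipher
import javax.crypto.KeyGenerator
import javax.crypto.SecretKey
import javax.crypto.spec.GCMParameterSpec

// Why "E2E encrypted backup" matters: the backup is encrypted with
// a key that only ever exists on the device, so the storage provider
// sees only ciphertext. With standard iCloud backups, by contrast,
// Apple holds the keys and can decrypt server-side.

fun generateDeviceOnlyKey(): SecretKey =
    KeyGenerator.getInstance("AES").apply { init(256) }.generateKey()

fun encryptBackup(plaintext: ByteArray, key: SecretKey): Pair<ByteArray, ByteArray> {
    val iv = ByteArray(12).also { SecureRandom().nextBytes(it) }
    val cipher = Cipher.getInstance("AES/GCM/NoPadding")
    cipher.init(Cipher.ENCRYPT_MODE, key, GCMParameterSpec(128, iv))
    return iv to cipher.doFinal(plaintext)
}

// Only the (iv, ciphertext) pair gets uploaded; the key never leaves
// the device, so subpoenas and server-side attackers get ciphertext.
```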