Show HN: HiddenVM – Use any desktop OS without leaving a trace (github.com/aforensics)
429 points by aforensics on March 5, 2020 | 141 comments



Maybe good for hiding activity when you're already below the radar. If you're a person of interest to a large enough state, they can and will use all manner of dirty tactics to nail you, and simply encrypting is not enough. You may have to flee like Snowden did. And once they bring in legislation that says a govt agent can demand your decryption keys on reasonable suspicion, everyone is in the soup, since encrypted data is easy enough to detect as such. One may have to shift to steganography of increasing sophistication. Basically, this fight has to be won politically. While technology can help, it can't ensure absolute privacy/security against an all-powerful state. The key question is whether a state should be all-powerful in the first place...


> everyone is in soup since encrypted data is easy enough to detect...

This is only half-true. Any secure encryption is going to result in ciphertext that is indistinguishable from random data.

In cases where the ciphertext is designated by a header or file format, then it's trivial to know that something is encrypted. Then there are cases where we can try to forensically determine that there's encrypted data via the existence of an encryption tool (e.g. VeraCrypt).

If you wipe a disk with random data, for example, then it would be relatively difficult to determine whether or not the disk is encrypted (implying that there are no headers on it). In fact, one method of wiping disks is to generate a random encryption key and encrypt a stream from /dev/zero to fill the disk (https://wiki.archlinux.org/index.php/Dm-crypt/Drive_preparat...).
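As an illustrative sketch (not any particular forensic tool), a byte-entropy check shows why a factory-blank disk and a randomly wiped or encrypted one look so different to an examiner:

```python
import math
import os
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte: 0.0 for constant data, ~8.0 for random."""
    if not data:
        return 0.0
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in Counter(data).values())

# A factory-blank sector is all zeros; a wiped or encrypted sector looks random.
blank = b"\x00" * 4096
wiped = os.urandom(4096)  # stand-in for dm-crypt output or a random overwrite
print(shannon_entropy(blank))  # 0.0
print(shannon_entropy(wiped))  # close to 8.0
```

Ciphertext from a good cipher and the output of a random overwrite both sit near the 8.0 ceiling, which is exactly why the two are indistinguishable after the fact.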

This tool makes use of a VeraCrypt hidden volume, which is a really interesting application of plausible deniability in cryptography. Essentially, this lets you have two volumes where both are encrypted, but each has a different key. In this setup, you'd put some files on one of the volumes to make it appear that it's your "used" volume. On the other, "hidden" volume, you'd place the real files you want to keep safe.

In a case where the government is demanding that you release your encryption keys, you would give up the keys to the "fake" volume. Unless you divulge the keys to the "real" volume, the attackers wouldn't necessarily know that it exists.

Unless there's evidence of you using one (maybe chat logs or google searches asking for help on using it, for example), there's no reason for anyone to suspect you use it.

The VeraCrypt documentation explains the technical details (https://www.veracrypt.fr/en/Hidden%20Volume.html) well enough.
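A toy sketch of the two-header trick (not VeraCrypt's real format, which uses PBKDF2-derived keys and XTS-mode encryption; only the 65536-byte hidden-header offset matches VeraCrypt's actual layout). The mount routine tries the supplied password against both header locations, so which volume opens depends only on which password you give:

```python
import hashlib
import os

HIDDEN_HEADER_OFFSET = 65536  # where VeraCrypt keeps the hidden volume header
MAGIC = b"VERA"               # decryption "succeeded" if this appears

def _stream(password: str, salt: bytes, n: int) -> bytes:
    # toy stand-in for real key derivation + XTS; illustrative only
    return hashlib.pbkdf2_hmac("sha512", password.encode(), salt, 1000, dklen=n)

def _make_header(password: str) -> bytes:
    salt = os.urandom(64)
    body = MAGIC + os.urandom(60)  # magic marker + fake key material
    ks = _stream(password, salt, len(body))
    return salt + bytes(a ^ b for a, b in zip(body, ks))

def make_disk(outer_pw: str, hidden_pw: str, size: int = 1 << 20) -> bytes:
    disk = bytearray(os.urandom(size))  # every byte looks random
    disk[0:128] = _make_header(outer_pw)
    disk[HIDDEN_HEADER_OFFSET:HIDDEN_HEADER_OFFSET + 128] = _make_header(hidden_pw)
    return bytes(disk)

def mount(disk: bytes, password: str):
    # try the same password at both header locations, as VeraCrypt does
    for name, off in (("outer", 0), ("hidden", HIDDEN_HEADER_OFFSET)):
        salt, blob = disk[off:off + 64], disk[off + 64:off + 128]
        ks = _stream(password, salt, len(blob))
        if bytes(a ^ b for a, b in zip(blob, ks)).startswith(MAGIC):
            return name
    return None
```

The key property: without the right password, a failed header decryption is indistinguishable from decrypting random bytes, so nothing on the disk proves a second header exists.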


> This is only half-true. Any secure encryption is going to result in ciphertext that is indistinguishable from random data.

A new SSD with very little data in the filesystem isn't going to have many, many sectors filled with random bytes. They're going to be blank instead.

A used drive will have free sectors (not used by the filesystem) containing the unencrypted contents of old files that have since been deleted. This is also not random data. Chunks of movies, pictures, applications, and music will be easily identifiable.
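A sketch of how trivially such chunks are spotted: forensic carvers just scan raw bytes for well-known magic numbers (the signature list here is a small illustrative subset):

```python
# A few well-known file signatures ("magic numbers").
SIGNATURES = {
    b"\xff\xd8\xff": "jpeg",
    b"\x89PNG\r\n\x1a\n": "png",
    b"PK\x03\x04": "zip (also docx/apk/jar)",
    b"%PDF-": "pdf",
}

def carve(raw: bytes) -> list:
    """Scan a raw byte blob (e.g. drive free space) for recognizable file headers."""
    hits = []
    for magic, kind in SIGNATURES.items():
        start = 0
        while (i := raw.find(magic, start)) != -1:
            hits.append((i, kind))
            start = i + 1
    return sorted(hits)

# "Free space" containing remnants of a deleted PNG and PDF:
free_space = b"\x00" * 100 + b"\x89PNG\r\n\x1a\n" + b"\x00" * 50 + b"%PDF-1.4"
print(carve(free_space))  # [(100, 'png'), (158, 'pdf')]
```

Real tools go further (entropy, structure validation, fragment reassembly), but even this naive scan finds deleted-file remnants that a merely "empty" filesystem leaves behind.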


A previously-used disk, wiped to NIST standards, will be filled with random data - that's exactly the point of the wipe.


> A previously-used disk, wiped to NIST standards, will be filled with random data - that's exactly the point of the wipe.

Yes, and that is suspicious. Random data is suspicious.


Exactly. Something being suspicious is about small differences from what's expected, differences that correlate with something bigger. Whether the disk is full of random data because it's encrypted or because it was securely wiped, either way it correlates with somebody having something they're working to hide.


Or a second-hand computer: I do not wish to carry previous owner's use history into my usage, and neither should anyone else. Do not conflate "unusual" with "therefore hiding stuff", and don't even try "hiding stuff, therefore bad".

There are legitimate reasons for wiping data...can't believe we're having this discussion, here of all places.


I agree there are legitimate reasons for doing so. None of which will matter to some official busybody rummaging through your drive. To them, a drive filled with random noise (instead of, say, being zeroed out) is going to be unusual in a way that correlates with bigger things they worry about. Or, in short, suspicious.


In no way am I arguing that you shouldn't wipe your data, use encrypted filesystems, or anything like that. That is a totally legitimate thing to do.

We're just talking about drawing attention to yourself from governmental agencies that probably don't have your well-being as their highest concern.

Using state-of-the-art encryption to keep your files safe is good. But if it leaves any evidence that you are indeed using encryption, you are potentially drawing attention to yourself. And people should be aware of that.


Suspicious, true. But in such case, you could be in trouble for refusing to provide the password to a suspected hidden drive which doesn't exist. How does that even make sense? (Rhetorical question)


From your point of view, it doesn't of course. You know that you don't have an encrypted partition.

But from the authorities' point of view, they will beat you until they're convinced you don't have anything of interest you can give up. That could last quite a while...


Which means that you can rot away for life, based on a LEO's hunch. And that's not The Sovereign People's Republic of Elbonia: that's Australia and the UK.


Okay, so why not wipe to NIST standards (multiple random overwrites or whatever is the latest best practice) and then zero it out to an "unused" state afterward?


If all you're going to do is just securely delete data without drawing suspicion, that's a good plan.

If you wanted to hide data in an encrypted partition that looks like random data, that's not going to work.


> Any secure encryption is going to result in ciphertext that is indistinguishable from random data.

While that's technically true it feels a bit like a moot point because if you have random data that cannot be attributed to any other application (such as large volumes of randomness) then it's a reasonable conclusion that you've just detected an encrypted volume.

> In a case where the government is demanding that you release your encryption keys, you would give up the keys to the "fake" volume. Unless you divulge the keys to the "real" volume, the attackers wouldn't necessarily know that it exists.

Unless they inspect the storage properties (either physically or how it registers itself on the host) and see that it's a 1TB drive with only a 500GB mountable volume. Again, it wouldn't be a foregone conclusion that the individual has other hidden volumes, but it would be suspicious enough to warrant further investigation/interrogation.

As always though, it really depends on the risk level you're trying to protect yourself against.


I would also assume the 'dummy' operating system wouldn't have much activity, since the user would be using the hidden OS. That, coupled with the unaccounted-for space, would raise more flags.


Indeed. Software is a supplement to the physical world. But we do what we can, and at least in the realm of software, we can have freedom.

It's possible Tor and Tails is dangerous software to use in certain states. But if they can safely use it, it's here for them.


"... since encrypted data is easy enough to detect as such ..."

I am not sure what you mean here - properly encrypted data is indistinguishable from random data ...


I forget where I heard it or if it was my own idea (the shame, I know, I know!) but... can't you have an unknown number of username/password pairs that decrypt/unpack the same chunk of data into different things? Say you have the same OS 51 times; as clean installs, the data shouldn't have to be all that much larger than one copy. You install some games on one, some office apps on the next, put some downloaded movies on the third. You could give them "all" 50 passwords and they could never find OS #51.


It's not possible to encrypt 51GB of real-world data in 1GB of space, for the same reason that compression algorithms can't achieve 51x compression ratios. Given that, such a scheme presents some challenges if you want to maintain plausibility. Either:

1. Each filesystem lives within an allocated area and knows not to overwrite its neighbors' data.

2. Some filesystems (the real ones) are privileged and know their actual allocated area. Others (the decoys) think they own areas of the storage volume that contain hidden data and therefore have the potential to overwrite the hidden filesystems if they are written to.

In the case of (1), you need to be able to explain why your computer has unallocated areas filled with pseudorandom data. That is never going to pass the plausibility test, imo.

In the case of (2), a lot of effort needs to be put into making the decoys look normal while not letting them overwrite the hidden data. There are a number of strategies you could use here that would work, but it will never be as convenient or simple as dual-booting and the more convenient you try to make it, the less innocent a hard drive will appear under close inspection.


I think the original commenter was going for an encrypted copy-on-write setup, not some magical compressed fs. Just a base image (eg 50GB) with various encrypted delta images (1GB each) that are assigned to each user.
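A minimal sketch of that copy-on-write idea (block-level and in-memory; real systems use qcow2 images or device-mapper snapshots, and each delta would itself be encrypted): reads fall through to the shared base image unless a block has been rewritten in that user's delta.

```python
class CowImage:
    """Copy-on-write overlay: one shared base image, a per-user delta of
    changed blocks. Fifty OS variants share one base; each delta stays small."""

    def __init__(self, base: bytes, block: int = 4096):
        self.base, self.block = base, block
        self.delta = {}  # block index -> new contents for this user only

    def read(self, idx: int) -> bytes:
        if idx in self.delta:
            return self.delta[idx]
        off = idx * self.block
        return self.base[off:off + self.block]

    def write(self, idx: int, buf: bytes) -> None:
        self.delta[idx] = buf  # the base stays untouched and shared
```

The space math works out because only the deltas differ per "user": 50 decoy variants cost roughly 50 small deltas plus one base, not 50 full installs.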



I wonder why this wasn't adopted and maintained?


Nice idea, now wondering how / if I could have an alternate password that would load a decoy environment while hiding all the private stuff of my main account...

Does such a thing already exist for an Ubuntu setup?


Safest way is to hack the SSD/HDD firmware and make it report half its size. Depending on some condition, make it use the selected half (e.g. some byte in the first sector, or some ATA command).
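As a toy model of what that firmware hack would do (entirely hypothetical; modifying real drive firmware is far messier), the drive advertises half its capacity, and a made-up vendor command switches which physical half backs the visible sectors:

```python
class DeceptiveDrive:
    """Toy drive that reports half its real capacity. A vendor command
    (standing in for the 'secret ATA command' idea) selects which half
    actually backs the visible LBAs."""
    SECTOR = 512

    def __init__(self, real_sectors: int):
        self._data = bytearray(real_sectors * self.SECTOR)
        self._real_sectors = real_sectors
        self._hidden = False

    @property
    def reported_sectors(self) -> int:
        return self._real_sectors // 2  # lie to the host

    def vendor_command(self, code: int) -> None:
        # hypothetical unlock code; anything else reverts to the decoy half
        self._hidden = (code == 0x1337)

    def _offset(self, lba: int) -> int:
        base = self.reported_sectors if self._hidden else 0
        return (base + lba) * self.SECTOR

    def read(self, lba: int) -> bytes:
        assert 0 <= lba < self.reported_sectors
        off = self._offset(lba)
        return bytes(self._data[off:off + self.SECTOR])

    def write(self, lba: int, buf: bytes) -> None:
        assert 0 <= lba < self.reported_sectors and len(buf) == self.SECTOR
        off = self._offset(lba)
        self._data[off:off + self.SECTOR] = buf
```

The decoy and hidden halves never alias each other, which is what keeps the visible OS from clobbering the hidden one.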


Any entity big enough to seize your laptop for analysis is also going to be able to look up the specification for any particular part in your laptop, and eventually this portion of the cat and mouse game will end.


That's predicated on an all knowing adversary with unlimited time and budget. In other words it is a largely fictional problem.

Most of the people we're talking about just run off-the-shelf forensics software and have minimal actual expertise (the government doesn't pay well enough for legitimate experts to do it by hand).

But then again, very few people are crazy enough to modify computer hardware to protect their information. So both sides of this coin might be largely fictional.


I heard a story of an SD card purchased from China that did not have the disk space specified. In fact, the card would just keep overwriting its contents without ever reporting full.


Are most people getting their laptops seized?


At international borders? It isn't at all uncommon. I've had my electronics searched before.


Out of curiosity, if you’re comfortable saying, what country was it and were you a citizen there?

I’m always curious what the breakdown is.


Entering the USA you can be required to hand over details of social media accounts etc on your ESTA. Pretty horrid stuff.


That's definitely not true, the article mentioned people traveling between countries - TSA/border patrol/airport police aren't going to send your laptop over to the NSA/KGB to have it cracked by an expert. That being said, actually encrypting the data is way more secure than fucking with HD self-reporting.


If you go so far as to modify the hard drive firmware, you probably already use full-disk encryption.


And when they pull the hdd and realize it says 512gb when you're only showing 256gb?


Replacing an SSD's sticker doesn't seem particularly challenging relative to modifying the firmware to misreport its size while keeping the hidden storage accessible.


I wonder if it isn't easier to buy a laptop with two drives. Install a regular OS on the first, hide the second in the BIOS and nobody will notice.

The people doing the cloning / data theft would have to know about your particular model. Obviously, you encrypt the second drive, that in itself contains a hidden partition in case they do discover it.


The project here is quite neat. But I wonder if your idea would also work, and I ask myself how competent the forensic teams of airport security really are. Even if they are competent, they certainly don't have a lot of time per device.

IT specialists are expensive, and it would be a shame if we wasted them on something as benign as airport security, which was mainly established out of paranoia and the wish to save face.

And what exactly are they targeting? Are they looking for howToBlowUpAnAirplane.txt? Some industrial espionage? Just a display of authority? I don't really get what would prompt these measures.

Was there ever anything they found on a device someone took on a plane?


> And what exactly are they targeting?

journalists, I heard


That doesn't seem like it would increase airport security much.


But it definitely increases the security of the people who control airport security.


Wouldn't they see that the laptop has two drives on their X-ray scanner?


If you're only trying to slip by a cursory inspection, mount the second drive in your computer but don't attach any cables to it. Could be trickier depending on how drives are mounted in your laptop.


So does Tails run as the root OS but display a separate OS in a VM? For example, does it look like you're booting into Windows when really it's just a VM inside Linux? If so, does that Windows VM (or whatever you choose) act like what is essentially a read-only OS?


Yes and yes. No, the VM is not read-only. (But you can run a VM as read-only.)


Interesting, thank you for sharing.


Hello HN,

We're finally sharing our GitHub with the world. This post is the first announcement of our project apart from our thus-far unpopulated subreddit. No one's discovered us yet; we've only told one person in the world before now. We're new to developing, and we're very humble and willing to learn, so any suggestions and help are welcome.

What we aren't as humble about is the potential we think this application has. HiddenVM allows full-scale anti-forensic use of any desktop OS. (No longer just Tails.) If you place your installed files inside good deniable encryption like VeraCrypt, it means that no digital trace of your chosen OS is left on your hard drive or can be forensically proven to exist. That is significant.

There are many reasons why you may want to use HiddenVM. Some use cases include:

- You're a spy protecting national security and you need to leave no digital trace on the hard drive of the computer you just used.

- Law enforcement agents conducting sensitive investigations.

- Diplomats, politicians, and military personnel.

- Whistle-blowers needing to safely carry their information in any situation.

- Activists, dissidents, political asylum seekers, and journalists in need of stronger protection of their information from corrupt governments when their equipment is forcibly seized. (We know that the risk of the rubber hose remains a complex problem and limitation of encryption.) Now that you can use Windows once you set it up inside Tails, keeping your data private could become easier for you.

Border agents forcibly invade our privacy and potentially steal our secrets with no respect to who we are or what our rights are. We need tech solutions to protect our data. More use cases include:

- Lawyers carrying sensitive client information.

- People in business protecting their IP or trade secrets.

- Tactics in fighting against corporate espionage. It could be expensive or impossible to sue for someone's unlawful intrusion into your data. Easier to technologically prevent them in the first place.

- Protect your basic privacy and dignity for any of the one thousand other reasons why privacy matters.

- You travel a lot and you want to use Windows/macOS/Linux in a way that prevents malware code from being forcibly installed inside your operating system simply because you entered a country.

- Digital currency: store a more private Bitcoin wallet. Secure your assets against unwanted and unwarranted access. When data literally is money you have a lot to lose.

- Domestic violence victims, and people in other dangerous situations in life.

Data privacy is a human right. If you don't want someone searching your naked body and violating your dignity in that way, why should your data be any different? Airport border agents not only perform a full digital strip search, but they're also potentially stealing your data or implanting spyware and malware without you knowing. It is a devastating act.

Using Tails should never be reason to suspect you are a criminal or a spy. It also protects basic data privacy and democracy. Tails should become a standard USB that anyone who values their digital safety carries around in their briefcase, bag, purse or wallet. We hope our application increases the size of the Tails user base.

Thank you for your interest. We invite you to rip apart our assertions and code (but with courtesy), try out HiddenVM, and contribute to our project.

Sincerely, aforensics


Copy notes:

1. Pictures.

> What we aren't as humble about is the potential we think this application has.

So you're not humble? Ditch the marketing goofiness. You think it has major potential. Be humble or don't. It's inessential to conveying what HiddenVM is.

> Like Tor, Tails, or Whonix, HiddenVM can be used for bad purposes

Unnecessary. You're already on the back foot.

> - You're a spy

This isn't a normatively 'good' reason.

> Activists, dissidents, political asylum seekers, and journalists (like Laura Poitras)

Don't cite a specific person unless that person is endorsing the product.

> Using Tails should never be reason to suspect you are a criminal or a spy. It protects basic data privacy and democracy.

Don't lead with a user story that exactly matches the stereotype, then. You're walking right into it.


Thank you for the feedback. Some of it made sense and I've updated the parent comment.


Don't forget private bankers trying to evade the tax authorities:

"According to one former UBS banker, managers gave the private-wealth team specially encrypted laptops that could be easily deleted in case U.S. authorities barged in. “They told us about the computers, ‘if ever you run into problems in the U.S. with the IRS, just push button X twice, and everything will be deleted,’ ” said the banker. “It was like James Bond.”"

https://www.cnbc.com/2015/04/30/why-did-the-us-pay-this-form...


I may be misunderstanding how it works, but aren't the presence of Tails, the drive full of random-looking data, and the absence of a visible consumer OS all massive red flags? It seems like it would be completely undeniable that you're trying to hide something.


Red flags might not actually matter in many use cases. But where it does, setting up a decoy OS that boots on the computer by default when turned on may be one good strategy.

For a VeraCrypt volume, setting up an outer volume with convincing files and providing the decoy password may be effective.

Whether the mere possession of a Tails USB adversely affects your situation is a matter that remains to be discussed at length. There is clearly no one situation that applies to everyone.

HiddenVM's potential to provide deniability is about cryptographic deniability, not human deniability. Software can only do so much. If humans are suspicious, software alone cannot change their minds.


The thing is, if you are outside the norm, you are raising suspicion. Using this kinda setup will prevent you from flying under the radar, instead, you are painting a nice target on your back. This is gonna bite you in particular if you are already a person that your adversaries are keeping an eye on.


Right. So all we can do at our end is better document the risks and limitations, and then work on the political advocacy side of things to promote diversity and nonconformity.

Such is the spirit of Linux. Is every Linux user automatically suspicious to various enemies? If so, the work to be done is not in our code repositories.

If we could incorporate some steganography in the future that could also help. Open to ideas.


Hi, very cool project! I'm getting more into security, so apologies if this is a stupid question. Is the VeraCrypt drive, and therefore the HVM, linked to my specific instance of Tails, or can it be accessed by anyone with a Tails USB and my VeraCrypt authentication details? Also, is it possible to have Tails on one USB and a VeraCrypt drive on a separate USB drive? How would that affect deniability at, say, a border?


> Is the veracrypt drive, and therefore the HVM, linked to my specific instance of tails or can it be accessed by anyone with a tails usb and my veracrypt authentication details?

If someone has your VeraCrypt volume password, they can unlock the volume via any Tails stick, or potentially any other operating system. What HiddenVM does is fundamentally reduce the digital forensic evidence of using that volume.

> Also, is it possible to have tails on one usb and a veracrypt drive on a seperate USB drive?

Yes. It may be faster if you make your computer's internal SSD one entire partitionless hidden VeraCrypt volume.

> How would that effect deniability at, say, a border?

We want to be careful about making claims about deniability; it's still a field we at HiddenVM have a lot to learn about. Someone more knowledgeable might dare to answer this. We already give two examples on the GitHub page. Your situation is unique, and only you can know what deniability strategy works best.


It is kind of ridiculous to still use md5sum to check software for integrity.


Normally I would give people the benefit of doubt and say “it doesn’t matter for most practical cases”, but given the purpose of HiddenVM you really would expect them to do better than md5. They’re expected to be on the forefront of these technologies and should set the example, even for something as innocuous as zip file integrity checks.


Thanks, we'll look into it.


And now fixed, moved to SHA512.


What's wrong with MD5?


It's too easy to generate hash collisions.
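For reference, switching the release check to SHA-512 is a one-liner with Python's hashlib; here's a streamed sketch (the filename and published digest in the usage example would come from the release page):

```python
import hashlib
import hmac

def file_digest_hex(path: str, algo: str = "sha512", chunk: int = 1 << 16) -> str:
    """Hash a file in chunks so large downloads never need to fit in RAM."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def verify(path: str, published_hex: str) -> bool:
    # constant-time comparison; avoids leaking how many leading chars match
    return hmac.compare_digest(file_digest_hex(path), published_hex.lower())
```

Usage would look like `verify("release.zip", "<digest from the release page>")`. MD5's weakness here is collisions: an attacker can craft two different files with the same MD5, which is exactly the property an integrity check must rule out; no practical collision attacks are known for SHA-512.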


If you want to use Tor in another country then come through a border, what's the advantage of HiddenVM over putting Tor on a bootable thumb drive, using it, then throwing away the drive before crossing the border? Just the persistence?


Or, or... hear me out: swap your HDD with a gaming one, so when a smart guy takes a look at your HDD he'll find only benign games. You think customs agencies don't have smart people who can look past a simple boot screen? Think again.


Put your sensitive data on a microSD card and put the card in your mouth while in the checkpoint, the detector won’t be sensitive enough to pick it up (it doesn’t detect fillings and they are around the same size).

If you get in trouble, just swallow.


Or have VeraCrypt volumes uploaded to some service and use them from anywhere in the world when you need them? Much simpler.


VeraCrypt is source-available, but proprietary. A better solution would be LUKS-encrypted partitions. For single files, gpg-based encryption should work as well.


Having the source available, and most importantly being able to compile from it, means it can be scrutinized for vulnerabilities by security experts, yes? That sounds good enough to me.


Security wise, yes. But secure or not, proprietary software can be a turn-off for many.


Or just wipe the HDD and install a fresh OS? I have to admit that I'm uncertain exactly what the goal is.


GitHub says the goal is to hide your private data from customs agents. That's the honorable use of this. But it's a tool, and like any tool it can be used for both bad and good. A scammer or spammer will find a use for this faster than a person facing customs agents.


A fresh OS is pretty suspicious.


It takes almost no time to reinstall an OS and install a few games from Steam.

If you have a bit more time to spend, you don't even need to wipe your install: you can dual boot and temporarily remove the other boot option from GRUB/the MBR, or whatever the Windows equivalent is.

Or fetch a few GitHub repos if you don't want to install Steam games. :)


I'm not sure I understand your response. Am I missing something in my original comment?


My reply was to say that it's relatively easy to make a clean install not look 'fresh' :P


Fair.


A zeroed drive and a stock OS install, surely unlikely to have any hidden data.


> The VM will even connect to full-speed pre-Tor Internet by default, while leaving the Tor connection in Tails undisturbed.

This doesn't strike me as a selling point? Surely the default should be to have the VM traffic all go over Tor?

Cool project though.


Well, the idea is that you don't have to be limited by Tor's speed or handicaps like being IP blocked when web browsing, if you don't want that by default. We love Tails' amnesia for anti-forensics, but we prefer Whonix's more secure Tor anonymization, if you want to be anonymous. Now you can easily combine both benefits.


I think the point is to make a decoy OS that you can boot into if forced to unlock your laptop. Running on Tor would be highly suspicious.

The point of running this on Tails is to prevent the use of forensic tools inside the decoy OS to unearth what's underneath.


If using Tor is suspicious, then having Tails on your computer is also going to be suspicious. I'm certain that no border agent will be swayed by your "but I don't actually use this Tor I have installed" arguments.


I assumed that Tails is normally invisible and must be logged into using some secret handshake at boot time. Otherwise it's pretty silly, even if the partition is wholly encrypted.


Please consider an acronym other than "HVM", which already has meaning in virtualization context (Hardware Virtualized Machine).


> The VM will even connect to full-speed pre-Tor Internet by default

Snatching defeat from the jaws of victory.


Will be interesting to also have macOS as guest with Vera Crypt & encrypted volume etc.


Doesn't downloading additional software defeat the purpose of inspecting the code?


How about running from RAMdisk? Feels like that would be the safest


It would be, but how do you go through the whole install every time you need it?


If your laptop is getting inspected at the border of an actual authoritarian police state:

https://en.wikipedia.org/wiki/Rubber-hose_cryptanalysis


That's why TrueCrypt, and I think VeraCrypt, have support for a volume that can be decrypted with two keys: one yields a decoy volume whose free space is actually the real volume you want to protect, which gets decoded only with the second key.

I'm not sure whether entropy analysis of that free space can suggest there's something funky about it or not. Usually free space is either actual data just marked as deleted, or zeros from proactive wiping of the free space. So having a bunch of wacky data that doesn't look like any kind of file could probably be used as a telltale sign, no?


In most instances I think torture would yield all of your knowledge including the secondary unlock.

Unless you're thinking your attacker would exclude you from torture for "yielding" the password.

I would think that if they are capable of torturing you, then they wouldn't stop at a polite confession.


Border agents inspect and copy many more digital devices than just those of people they actively suspect of something, or are willing to torture.


VeraCrypt hidden volumes are supposed to be 100% deniable. There's no way to prove they exist.

The idea is that they’ll stop torturing you because they don’t know of the existence of the hidden volume.

(In practice I suspect it leaves some subtle clues, but maybe it's perfect for a border crossing.)


My understanding is that people who torture you don't know what you don't know; so they don't know when to stop. As such, they'll keep torturing you way past the point where you've admitted to everything you know. This is why information obtained under torture is considered unreliable: eventually you'll just say anything to stop the torture; further admissions will support the use of torture as an information extraction tactic, and then lead to more torture.


Yes but, under what pretext would you torture someone who has complied with all of your requests?

~"Do you have any encrypted data we can't see?"

"Yes. The entire drive is encrypted"

~"What is the key?"

"iloveapplesauce6969"

~"Well, that worked and I see your data here. What a lovely family.. is that Disney world?"

"Yes it was Timmy's 5th birthday"

~"I'm going to waterboard you"


Them: "You're still hiding something"

You: "This was everything!"

Them: "We don't believe you" -rubber hose-

You: "Stop it! I planned to blow up the world trade center!"

Them: "We knew you were a liar."

To themselves: "Wow, torture works."

Rinse, repeat.


In theory, you could just keep adding n+1 layers of fake passwords (maybe with realistic fake data), in the hope that after n attempts they think they've broken you and hit the jackpot.

But as sibling commenters describe, if sufficiently motivated, there's no reason that an authoritarian state wouldn't just keep torturing you anyway. :(


> if sufficiently motivated, there's no reason that an authoritarian state wouldn't just keep torturing you anyway.

Sure, but it helps to make it look like you're someone not worth torturing in the first place. The same look would happen when you decrypt your TrueCrypt partition.

Large unused sections on the laptop with random data is a bad look for someone trying to say they're not a spy.

If it were me, I would do something more like bring a laptop with a bunch of biblical research and ask everyone in the checkpoint, if they've taken Jesus Christ into their heart.

This of course assumes that in this instance the authoritarian regime just finds these sorts of religious people annoying and not dangerous. I wouldn't do this coming into Iran, say.


It's also why, IMO, obscurity is a valid component to security.


When cryptographers warn against "security through obscurity" they're talking about cryptographic algorithms and protocols: the mechanism itself shouldn't need to be secret. So even systems that aim to resist "rubber-hose" attacks benefit from using openly analyzed algorithms (like AES), whose security doesn't depend on obscurity, even if other parts of the system are obscured.


Only circumstantially, IMHO. Obscurity can be a semi-decent security tool in some situations, and in others completely and utterly useless. It depends on what you're trying to secure.


Yes, but defence in depth/layering is the over-arching, higher-order concept in the security game.


I agree, not least because I can't see how to define 'obscurity' without it also being a basic explanation of encryption.

(Good) Encryption is a (secure) mechanism for obscuring data, surely?


It seems like there's a way to measure different mechanisms by how inherently decoupled they are from their surroundings. The fact that you have to send messages to my server in a particular format is one type of obscurity, but it's highly non-incidental: it's linked to many other parts of the world (i.e. it's some common network protocol) and is more easily investigated (you could get interestingly different responses by varying what string of bits you send).

In comparison, which particular password I use can be very highly decoupled from the rest of the world and my architecture, which makes it vastly more (reliably) obscure.

Somewhere in between "you have to know my server exists to send 'login:admin password:pass' to it" and "the volume's encrypted with a 2048-bit cipher generated from atmospheric entropy" is, maybe, a useful middle ground.

Hidden volumes seem like more of a defensive meta-obscurity, in that they obscure your metadata (your ownership of a particular piece of encrypted data).



I'm aware. But Kerckhoffs's principle just says that your mechanism of encryption shouldn't be obscured. It doesn't change the fact that I can't define 'obscure' in a way that doesn't make encryption a mechanism (itself not obscure) of obscuring data.

Also, there are plenty of historical ciphers that fall foul of Kerckhoffs's principle; I don't think we can say retrospectively that they weren't used for security, and in many cases they were probably totally adequate for some time, if not their whole lifespan.


As I've started putting it: Make your infrastructure and systems hard. Then don't tell anyone the details.


I think the authoritarian regime will just torture anyone with any VeraCrypt or TrueCrypt volume and the plausible deniability will come back to bite you as you can't prove that there are no other hidden volumes.


The point of a decoy is to satisfy scrutiny. Truecrypt WITHOUT decoy will escalate the scrutiny, truecrypt WITH decoy will avoid further scrutiny.


The problem is the level of scrutiny. Against some attackers, decoys have very very nasty game-theoretic failure cases.

Specifically- there's no limit to the number of decoys that could be on a disk. So you can get into the situation where you've decrypted every volume that exists, under coercion, but your adversary believes there are more volumes remaining.

By design, you have no way to prove that there isn't more hidden data on that disk. This is unlikely to end very well for you.


> So you can get into the situation where you've decrypted every volume that exists, under coercion, but your adversary believes there are more volumes remaining.

This is intentional.

If you could prove that there wasn't more hidden data, then the incentives would be to torture you until you did that.

Since you can't, there is no incentive to reveal a further hidden volume, since the attackers will either keep torturing you or not, and revealing more will most likely not help you.

This exact topic was discussed in the Rubber Hose documentation that I read 20 years ago. I think this is an archive of the document I'm thinking of: https://web.archive.org/web/20100820175505/http://iq.org/~pr...


When the sum of non-free data on all volumes reaches the capacity of the drive, there is no more space for hidden data.


If I remember correctly, the decoy volume treats all the hidden space as available disk space. TrueCrypt used to have a warning that booting into the decoy volume could scramble the hidden volume when the OS wrote files to disk if it happened to choose some space that overlapped with your data.

If your decoy only lists 5GB of space on a 5TB drive, then it isn't a very good decoy.
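The overlap risk described in these comments can be sketched with a toy model. This is a hypothetical block layout for illustration, not VeraCrypt's real on-disk format: the decoy filesystem sees the hidden region as free space, so its writes can land on top of the hidden volume.

```python
# Toy model: a decoy volume that treats the hidden volume's
# region as free space, so decoy writes can clobber hidden data.
import random

DISK_BLOCKS = 1000          # whole container, in blocks
HIDDEN_START = 700          # hidden volume occupies the tail

random.seed(42)             # deterministic for the example
disk = ["free"] * DISK_BLOCKS
for b in range(HIDDEN_START, DISK_BLOCKS):
    disk[b] = "hidden"      # hidden volume's ciphertext lives here

# Mount the decoy and write 400 blocks of files. The decoy has no
# idea the tail is occupied - to it, random ciphertext == free space.
clobbered = 0
for b in random.sample(range(DISK_BLOCKS), 400):
    if disk[b] == "hidden":
        clobbered += 1
    disk[b] = "decoy"

print(f"hidden blocks destroyed by decoy writes: {clobbered} / {DISK_BLOCKS - HIDDEN_START}")
```

This is why TrueCrypt offered a "protect hidden volume" mount option that required entering both passwords, so the outer mount knew which blocks to avoid.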


If that‘s the operation mode of your foe, it‘s not going to end very well for you anyways.


It's more complex than that, right?

If you have a VeraCrypt partition that they can detect, it makes you look like a spy. Lots of random data in unused sectors on your hard drive is a bad look if you're trying to convince the border agent you're not a spy.

If you have a plain old laptop with some mildly embarrassing information on it that's not encrypted, you might still be a spy, but they wouldn't be able to tell from the laptop itself.


An encrypted volume (of fixed size) should even hide the free space. After all, knowing the size of a file contained within could leak information about its contents.

I imagine the only way to detect a hidden volume would be to have the outer one decrypted (enforced by law enforcement), take the supposed volume type and files within, re-encrypt the same data, and compare. If your volume and the supposed clone differ, it would suggest that you have hidden another volume within.


I think the defense is that (assuming IVs, nonces, etc. are held constant) the files would encrypt identically. And the excuse for the rest of the disk is "it gets filled with random bytes to obscure how much disk space is actually being used."
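The re-encrypt-and-compare check being discussed can be sketched with a toy deterministic cipher. This uses a hash-derived keystream as a stand-in (not real crypto): with key and nonce held fixed, identical plaintext yields identical ciphertext, so the outer volume's contents are fully "explained", while the tail - random fill or a hidden volume, indistinguishably - is not.

```python
# Toy "re-encrypt and diff" check. Hash-based keystream stands in
# for a real cipher; key/nonce are fixed so encryption is deterministic.
import hashlib, secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    out, counter = bytearray(), 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:length])

def encrypt(key, nonce, plaintext):
    ks = keystream(key, nonce, len(plaintext))
    return bytes(a ^ b for a, b in zip(plaintext, ks))

key, nonce = secrets.token_bytes(32), b"fixed-nonce"

outer_files = b"ordinary decoy documents" + b"\x00" * 40
hidden_data = secrets.token_bytes(32)   # hidden volume's ciphertext at the tail

# What's actually on disk: encrypted outer files, then the hidden region.
disk = encrypt(key, nonce, outer_files) + hidden_data

# Examiner: decrypt the outer volume, re-encrypt what they saw, and diff.
reencrypted = encrypt(key, nonce, outer_files)
mismatch = [i for i in range(len(disk))
            if i >= len(reencrypted) or disk[i] != reencrypted[i]]
print("bytes not explained by the outer volume:", len(mismatch))  # -> 32
```

The unexplained tail is exactly where the "it's just random fill" excuse has to do the work: the examiner can prove those bytes aren't the outer volume's files, but can't prove they're anything at all.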


The same tradeoff as with any security measure applies to border officials. They may or may not go beyond scanning for low-hanging fruit, and in a typical scenario probably won't.


What countries do this? I saw a mention of Australia.


Any, if they really want the data.


No borders have privacy-protection laws? Hard to believe.


That just raises the bar for how much they need to want it. It doesn’t eliminate it.


As far as I know, every country asserts the right to thoroughly inspect anything that crosses its border. There may be a few exceptions, and it may not matter in a practical sense for situations like (e.g.) within the EU, where you don't actually have to go through customs when you cross the border, but in the general case, it's true.


Depends on the border / country.


All things are fungible


Are there any protection mechanisms against that? Something like: even if you are being tortured, you cannot provide access to anyone else?


Don't take the laptop with you.

At this point setting up a secure connection to a device in a secure location is way easier than trying to protect your data against someone with physical access.

You can also get your collaborators to revoke access if you fear you might be 'compromised', although ultimately it's hard to protect a system against yourself.


I believe TrueCrypt supported a feature where different passwords would unlock different partitions in a volume. So someone could ask you to input the password for BadBoy.tc, and if you enter password1, you get, say, the data they actually want. But if you enter password2, it mounts a different part of the file, which gives the appearance that you unlocked the whole thing. So you could stage a dummy partition that has false but convincing data and hopefully fool any captors.
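The two-password idea can be sketched as one container with two sealed regions, where the password you type determines which region "mounts". This is a hypothetical toy (PBKDF2-derived XOR pad, a made-up magic marker), not TrueCrypt's actual header format.

```python
# Toy two-password container: each password decrypts a different
# region; a magic marker tells us which key slot the password opened.
import hashlib

SALT = b"container-salt"
MAGIC = b"VOL!"

def pad(password: str, n: int) -> bytes:
    # Derive an n-byte XOR pad from the password (toy cipher).
    return hashlib.pbkdf2_hmac("sha256", password.encode(), SALT, 10_000, dklen=n)

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def seal(password: str, payload: bytes) -> bytes:
    data = MAGIC + payload
    return xor(data, pad(password, len(data)))

container = {
    "outer":  seal("password1", b"mildly embarrassing decoy files"),
    "hidden": seal("password2", b"the real secrets"),
}

def mount(password: str):
    # Try the password against every region; the magic marker
    # only survives decryption under the matching key.
    for name, blob in container.items():
        plain = xor(blob, pad(password, len(blob)))
        if plain.startswith(MAGIC):
            return name, plain[len(MAGIC):]
    return None, None

print(mount("password1"))   # mounts the outer (decoy) region
print(mount("password2"))   # mounts the hidden region
```

The key property is that nothing in the container reveals how many sealed regions exist: without a matching password, every region is indistinguishable from random bytes.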


Yes, but:

1) they might not believe you, and

2) that’s still true even if the reason you don’t have a key is because you don’t actually have a secret encrypted partition — or whatever — to supply a decryption key for

So the best thing to do is avoid being in a situation where someone is allowed to do that in the first place.


This is interesting. This also means that using encryption or anything that can plausibly make someone even slightly suspect you're using encryption (even if you are not) can make your situation worse, with certain classes of enemies.

I'm sure advanced configurations with well-crafted decoys and steganography can help combat that, but as we can see, encryption can only take you so far and it's only one element of the picture.


Plausible deniability, like hidden containers in TrueCrypt?

That's a double edged sword though - imagine you give up, surrender the password and are then being asked to unlock a hidden volume, which you don't have.


Encrypt with two keys: one that you know, and one that a trusted third party knows. When you reach your destination, establish secure contact with the third party and have them share their key.


I think one method would be to ensure you don't have the full key, i.e. you have some select friends, each of whom has part of the key (with some redundancy) - all unaware of one another, and potentially all unaware that they even have part of the key.

Then you position your friends over multiple jurisdictions so that they cannot legally compel all of them to play along.


Sharded secrets... you only have one part of a key or keys needed to decrypt some data so even extracting that from you by torture will not suffice. Of course this isn't always practical.
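A minimal sketch of that kind of key sharding is an n-of-n XOR split: every share is needed to rebuild the key, so no single holder, even under coercion, can recover it. A real deployment with the redundancy mentioned above would more likely use a k-of-n scheme such as Shamir's secret sharing, so lost shares don't destroy the key.

```python
# n-of-n XOR secret splitting: all shares together reconstruct
# the key; any proper subset is statistically independent of it.
import secrets

def split(key: bytes, n: int) -> list[bytes]:
    shares = [secrets.token_bytes(len(key)) for _ in range(n - 1)]
    last = key
    for s in shares:                    # last share = key XOR all others
        last = bytes(a ^ b for a, b in zip(last, s))
    return shares + [last]

def combine(shares: list[bytes]) -> bytes:
    key = bytes(len(shares[0]))         # all-zero accumulator
    for s in shares:
        key = bytes(a ^ b for a, b in zip(key, s))
    return key

key = secrets.token_bytes(32)
shares = split(key, 3)                  # hand one share to each friend
assert combine(shares) == key           # all three together recover it
assert combine(shares[:2]) != key       # a subset reveals nothing
```

Each share on its own is uniformly random, which is also convenient here: a share is as deniable as any other blob of random bytes.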


“They” (whoever they are) can torture more than one person at a time.


The trick is to ensure you are never near all the people who know the secret when there is the possibility of trouble. That makes kidnapping everyone harder.

Of course if you really worry about such things you shouldn't be trusting the other people you are working with either...


Presumably the other key holders are in a different jurisdiction.


If you're talking about protection for the people that may be captured with the help of the data you may have on your computer, yes.

Otherwise, no. When they have you they can just torture you to death for whatever reason or no reason at all.


I was at the Sydney launch event for Assange's implementation[0] in 1997. It is amazing that, after all of this time, it is still apparently such a major theoretical touchstone within the open source cryptography landscape[1].

[0] https://en.wikipedia.org/wiki/Rubberhose_(file_system) [1] https://en.wikipedia.org/wiki/Deniable_encryption


Reminds me of this xkcd: https://www.xkcd.com/538/


If I had reason to be this paranoid about doing something on a computer, I probably wouldn't do it on a computer... see what Ron Minnich and Bunnie Huang have to say about the state of modern hardware and BIOS.

That, and I believe Intel AMT is still a thing.



