CipherShed, the Truecrypt fork (ciphershed.org)
83 points by JoachimS on Aug 17, 2014 | 58 comments



How close is this fork to the original source? The folks behind the Open Crypto Audit Project[1] have done a fantastic job auditing the security of Truecrypt and they're now doing the cryptanalysis. While an actively maintained fork is desirable, changes to the core dependencies will eliminate the benefits of the audit.

[1]: https://opencryptoaudit.org/


That's very true, but if the Open Crypto Audit Project finds any TrueCrypt vulnerabilities, there will be no one around to fix them.


IIRC, it's forked from the audited version, both because that's the last one before TC got Lavabit'd, and because they can then get straight to work fixing any vulns the audit finds.


Interestingly, the FUSE element is going to get much more difficult come Yosemite. Apple has implemented a strict ban on unsigned kexts in Yosemite, and FUSE obviously relies on kernel extensions, so building things like FUSE from source is going to get a lot more difficult for end users.

The FUSE element will still work fine if you take a signed OSXFUSE binary, for example, because those binaries and the kexts within them are signed, but distributing source packages on OS X at least is going to become more painful.
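
For anyone who wants to check what they're actually loading: a quick sketch (mine, not from anyone above) of asking the system codesign tool whether a kext bundle carries a valid signature. The OSXFUSE path is an assumption; adjust it to wherever your install actually lives.

  # Hypothetical sketch: `codesign --verify` exits 0 only if the bundle's
  # signature checks out. The kext path below is an assumption about where
  # an OSXFUSE install puts its kext.
  import subprocess

  KEXT_PATH = "/Library/Filesystems/osxfusefs.fs/Support/osxfusefs.kext"

  def kext_signature_ok(path):
      rc = subprocess.call(["codesign", "--verify", "--deep", path])
      return rc == 0

  print("signed" if kext_signature_ok(KEXT_PATH) else "unsigned or invalid")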


I was under the impression it couldn't be forked. Incorrect?


The TrueCrypt author is anonymous and would have to reveal their identity in order to enforce the license, which they would be very unlikely to do.

According to CipherShed's forum the main thing they have to do is remove all references to TrueCrypt from the code: https://forum.ciphershed.org/viewtopic.php?f=12&t=48

I'm not sure if anyone on this site is actually a TrueCrypt developer but there is a new forum: https://forum.truecrypt.ch/

They have some discussion about CipherShed and some other TrueCrypt forks: https://forum.truecrypt.ch/t/working-with-ciphershed/22


Shouldn't, but could:

> On 16 June 2014, the only alleged TrueCrypt developer still answering emails, replied to an email by Matthew Green about the licensing situation. He is not willing to change the license to an open source one, believes that Truecrypt should not be forked, and that if someone wants to create a new version they should start from scratch.[1]

(Copy of the email on pastebin[2])

[1]: http://en.wikipedia.org/wiki/TrueCrypt#End_of_life_and_licen... [2]: http://pastebin.com/RS0f8gwn


The letter of copyright law might allow a prohibition of forking and relicensing (to reiterate, the TC author would have to reveal him or herself in order to enforce that prohibition), but philosophically...

"I don't feel that forking truecrypt would be a good idea..."

This is a straight-up abuse of copyright law. The author has no intent of profiting from the project, no intent of continuing to work on it, and still waves around the hammer of copyright as if it were a fundamental right absent any connection to the constitutional qualification of promoting progress in science and the useful arts.

To say there's a moral imperative against forking and relicensing in this scenario is stretching the rationale for the copyright monopoly quite far.


Interesting. To whom else's work can you write a nerd message board post explaining your intrinsic entitlement?

It's not at all shocking that the creator and maintainer of a free encrypted filesystem would eventually abandon the effort, given that comments like these are how the effort is ultimately repaid.

Is there a name for the hammer you swing, the hammer not of abusing copyright but instead of not writing simulated hardware disk encryption and donating it for free? It seems to me that's a far mightier hammer than the one you described.


From each according to their ability and all that. Oh, you can write crypto software? Therefore you must write it for me.


Uh oh, it looks like @tptacek's calling people message board nerds again.


I'm @tqbf, not @tptacek. @tptacek is some other person on Twitter.


I'm still wondering whether this is an elaborate way to tell people that Truecrypt is completely compromised and shouldn't be used.


Of course you are. That's a fun thought experiment. Details of licensure, burned out developers, and Truecrypt's modern place among all the other encrypted storage schemes are much more boring than the cloak and dagger fiction of a Truecrypt warrant canary.


Since the developer(s) are anonymous, why would they need any elaborate way of telling people this? They could just state that they consider the code compromised and advise not to use it anymore.


Actually incorrect. (Not legal advice, but read the licences yourself.)

The previous version of TrueCrypt (7.1a - the one you'd actually want to fork because it could still encrypt things) was still under a licence that'd allow you to fork it as long as you didn't call it TrueCrypt (or anything resembling it).

That was basically inherited from E4M. It's an ugly licence, however.

https://diskcryptor.net/ might be worth looking into (this is not a recommendation; I have not audited it). It's certainly cleaner - TC's kinda ugly inside, with a decade of maintenance cruft.


I used Truecrypt heavily for years, but since all the kerfuffle happened I've switched to other stuff. I like Truecrypt and would like to move back, but I'm not sure when I'll be able to trust a fork. I can't audit the code myself, so I have to rely, I guess, just on mind share and the opinions of security people who are smarter than me.

I'm curious about what other users are doing in this situation. What are your criteria for trusting a fork? How will you know when something (not necessarily CipherShed) is mature and safe enough to use?


I'm going to trust it because it's using Truecrypt code. I still use the last good version of Truecrypt, but will probably give it a year or so to settle down and get reviewed before I move over to Ciphershed (any truecrypt vulnerability notwithstanding, which might force earlier migration).


What alternatives have you used?


For disk encryption on Windows, I switched to Bitlocker for my personal use. At my workplace we use Symantec Encryption and Sophos Safeguard; my understanding is they are both pretty okay.

For file encryption, now I just do everything on my Ubuntu workstation and use GPG + tarballs. This is sort of a pain in the ass, though, and it's not as secure as a TC container, of course. It's kind of just a stopgap until I can think of something better.
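
For what it's worth, here's a minimal sketch of that workflow in Python (the gpg flags are the standard ones for symmetric encryption; the file names are just examples):

  # Sketch of the tarball + GPG workflow: tar a directory, then
  # symmetrically encrypt it with AES-256. gpg prompts for a passphrase
  # and writes tar_path + ".gpg".
  import os
  import subprocess
  import tarfile

  def encrypt_dir(src_dir, tar_path):
      with tarfile.open(tar_path, "w:gz") as tar:
          tar.add(src_dir)
      subprocess.check_call(
          ["gpg", "--symmetric", "--cipher-algo", "AES256", tar_path])
      os.remove(tar_path)  # don't leave the plaintext tarball around

  encrypt_dir("secrets/", "secrets.tar.gz")
  # decrypt later with: gpg secrets.tar.gz.gpg  (recreates secrets.tar.gz)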


You think BitLocker, closed source from Microsoft, is more secure and trustworthy than the open-source fork CipherShed? I would take an open-source project over a closed-source one any day. Linux Action Show recently talked about "Tomb"; it looks like a nice alternative. https://www.dyne.org/software/tomb/


It's possible for a closed-source product to be written by experts, while the open-source competitor is only written by hobbyists. When it comes to crypto, you probably want the experts.

Also, someone needs to actually audit the source code for the 'open-source' part to really come into play (other than from the discontinuation of support angle). Even with open source code, it's only really been recently that TrueCrypt itself ever got any sort of in-depth audit.


In the case of Bitlocker we know that Niels Ferguson is behind it. The name might ring a bell.


http://en.m.wikipedia.org/wiki/Niels_Ferguson

Niels T. Ferguson (born 10 December 1965, Eindhoven) is a Dutch cryptographer and consultant who currently works for Microsoft. He has worked with others, including Bruce Schneier, designing cryptographic algorithms, testing algorithms and protocols, and writing papers and books. Among the designs Ferguson has contributed to is the AES finalist block cipher algorithm Twofish as well as the stream cipher Helix and the Skein hash function.

...

In 2001, he claimed to have broken the HDCP system that is incorporated into HD DVD and Blu-ray Disc players, similar to DVD's Content Scramble System, but has not published his research, citing the Digital Millennium Copyright Act of 1998, which would make such publication illegal.

At the CRYPTO 2007 conference ... Niels Ferguson and Dan Shumow presented an informal paper describing a potential backdoor in the NIST specified Dual_EC_DRBG cryptographically secure pseudorandom number generator. The backdoor was confirmed to be real in 2013 as part of the Edward Snowden leaks.


Do you think the closed-source crypto did get any sort of audit?

From what I gather, crypto experts are strongly opposed to closed-source crypto.


You gather that from exactly which crypto experts?

Yes: Bitlocker was audited. No company in the world spends more on audits, and is more sophisticated in sourcing them, than Microsoft.


What information have Microsoft released about the Bitlocker audit(s)? I couldn't find any, although my search wasn't particularly thorough. It seems to me that the value of an audit, for third parties, is that the auditor puts their reputation behind it.


That happens in a statistically negligible number of audits, and most especially in the best cryptographic audits. Which systems has Cryptography Research audited? Answer: you have no idea.


I don't doubt that, but no doubt a great deal of cryptographic audits are produced entirely for internal consumption at the procuring party, and another large chunk of them would be for bespoke software development gigs where the audit is for the benefit of the single customer.

Audits (and here I use the word in its expansive sense) that are intended to build confidence in a large or public audience do tend to be made public.


If that's true, it should be easy to cite audits of important software conducted by well-known cryptography engineering firms. So, tell me: where's the audit of OpenSSL, or SChannel, or NSS, done by Cryptography Research or Riscure? Where's the PGP audit? The LUKS audit?

Can I ask where you came by these opinions of how security audits work? I know where I came by mine.


When I say "the expansive sense" I am not referring to the specific case of security audits. For an example of what I mean, in terms of an audit intended to build confidence in a large audience, this was published in last year's annual report for News Corporation:

  The Board of Directors and Shareholders of News Corporation:

  We have audited the accompanying consolidated and combined balance
  sheets of News Corporation as of June 30, 2013 and 2012, and the
  related consolidated and combined statements of operations, 
  comprehensive (loss) income, equity, and cash flows for each of
  the three years in the period ended June 30, 2013. These financial
  statements are the responsibility of the Company’s management. Our
  responsibility is to express an opinion on these financial 
  statements based on our audits.

  We conducted our audits in accordance with the standards of the 
  Public Company Accounting Oversight Board (United States). Those 
  standards require that we plan and perform the audit to obtain 
  reasonable assurance about whether the financial statements are 
  free of material misstatement. We were not engaged to perform an
  audit of the Company’s internal control over financial reporting.
  Our audits included consideration of internal control over 
  financial reporting as a basis for designing audit procedures that
  are appropriate in the circumstances, but not for the purpose of
  expressing an opinion on the effectiveness of the Company’s 
  internal control over financial reporting. Accordingly, we 
  express no such opinion. An audit also includes examining, on a 
  test basis, evidence supporting the amounts and disclosures in the
  financial statements, assessing the accounting principles used and
  significant estimates made by management, and evaluating the 
  overall financial statement presentation. We believe that our 
  audits provide a reasonable basis for our opinion.

  In our opinion, the financial statements referred to above present
  fairly, in all material respects, the consolidated and combined 
  financial position of News Corporation at June 30, 2013 and 2012,
  and the consolidated and combined results of its operations and 
  its cash flows for each of the three years in the period ended 
  June 30, 2013, in conformity with U.S. generally accepted 
  accounting principles.

  /s/    Ernst & Young LLP

  New York, New York

  September 20, 2013 

I do not believe the lack of a public security audit of OpenSSL, SChannel, NSS, PGP or LUKS indicates anything other than that either no-one cares enough about building public confidence in those projects to fund such an audit, or that anyone who has is sitting on the results because they weren't good.


I thought this was the essence of Kerckhoffs' principle. Bruce Schneier talks about open source a fair amount. Here's an example I found. https://www.schneier.com/crypto-gram-0205.html#1

The reason I have trouble trusting closed-source crypto is that users don't know when it fails, so they can't judge good from bad. Companies that write closed source crypto exist to make money, and there's good money in backdooring your system for the government. Users are happy (since they don't know), the government is happy (since they get the information), and the company is happy (since it's making money on both sides of the deal). In short: the incentives simply don't align in favor of the user.

Weren't there allegations precisely to this effect that RSA took millions from the government to make Dual_EC_DRBG the default in BSafe? I don't know if it's true, but the fact that it's plausible is a problem for me.


Your suggestion here is that users do know when open-source crypto fails? They manifestly, obviously do not.

It turns out, the kinds of people who are qualified to detect when open-source crypto fails tend also to have the means of detecting failures in closed-source crypto.

The problem with discussions about closed-source crypto is that a whole giant cohort of participants mythologize closed-source code. They imbue it with all sorts of magic powers and handwave away arguments by suggesting that the code is itself unknowable. But nobody who really works in my industry engages with code that way. Nothing Microsoft ships is unknowable. No company on Earth is more scrupulously and aggressively reverse engineered than Microsoft's.

Unfortunately, there's lag in learning about crypto failures in Microsoft's code, and it's the exact same lag as we experience for open-source software. It comes of people not actually understanding a fucking thing about how crypto actually works, and it's a problem not just for generalist engineers but for software security experts as well.

Hence: cryptopals.com.


Aside: I've been working through cryptopals over the weekend and I'm enjoying it immensely. Thanks!

I completely agree that both open- and closed-source crypto can fail in ways users do not detect.

The gist of the point I was attempting to make was that the incentives for open-source projects are often ideological, rather than monetary, which reduces the incentive for authors to incorporate weaknesses in exchange for money.


Thank you! I'm happy it's been pleasant for you.

I think the incentive problems with commercial providers are misconstrued; the market --- at least to sophisticated buyers --- is unkind to people who sell their trustworthiness. So commercial providers in fact do have a lot to lose.

To the extent that open source has an edge over commercial software in motivation and ideology, it's cancelled out by the immaturity of the code itself. Cryptography is extraordinarily unforgiving.


It's just as possible for a closed-source product to be written by hobbyists of the field. You're placing your trust in Microsoft requiring proper credentials from its engineers, and not just taking whoever was free and had been involved in the Word 2007 document encryption.


Bitlocker wasn't written by hobbyists. Ironically, Truecrypt probably was. But if you'd like a more certain comparison, look at the Linux encrypted filesystems to see how amateurish mainstream open-source crypto can be.


I take it you mean LUKS/dm-crypt, which is what has any real-world use. Could you elaborate? Amateurish in a good way, or a bad way?


eCryptfs has a lot of use right now. It's what (e.g.) Ubuntu uses to implement encrypted home directories. It's not a block filesystem, but it is a filesystem. (You mount it on a directory; it's just backed by a directory of encrypted files.)

IIRC, there has been criticism of the implementation, and possibly the design. This may be what is referred to.

There are also things like EncFS, too. I'm not sure how some of these other filesystems are viewed from a cryptographer's point of view.


Pedantic, but important: there's no such thing as a "block filesystem". There is encryption performed at the level of a filesystem, and encryption performed at the level of a hardware device (schemes like Truecrypt are, in fact, simulating hardware disk encryption).

It's a little ironic that encrypted filesystem implementations on Linux are so bad, because the filesystem is a much better layer at which to perform encryption than the device itself.
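
To make the "simulating hardware disk encryption" point concrete, here's a rough sketch of what a TrueCrypt-style scheme does at the device layer: encrypt fixed-size sectors with AES-XTS, tweaked by the sector number. (Illustrative only, using the pyca/cryptography package; this is the general shape of the technique, not TrueCrypt's actual code, and the constants are assumptions.)

  # Illustrative: sector-level AES-XTS, the general shape of simulated
  # hardware disk encryption. Sector size and key handling are assumptions.
  import os
  from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

  SECTOR_SIZE = 512        # bytes; typical disk sector
  key = os.urandom(64)     # XTS takes a double-length (2 x 256-bit) key

  def crypt_sector(data, sector_number, encrypt):
      # The tweak is the sector number, so identical plaintext stored in
      # two different sectors produces different ciphertext.
      tweak = sector_number.to_bytes(16, "little")
      cipher = Cipher(algorithms.AES(key), modes.XTS(tweak))
      op = cipher.encryptor() if encrypt else cipher.decryptor()
      return op.update(data) + op.finalize()

  sector = os.urandom(SECTOR_SIZE)
  ct = crypt_sector(sector, 42, encrypt=True)
  assert crypt_sector(ct, 42, encrypt=False) == sector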


This is true. I was pointing out that eCryptfs is reading files off of another filesystem, and presenting the decrypted versions of the files as another filesystem vs. something else that can read directly from a block device.


http://blogs.msdn.com/b/si_team/archive/2006/03/02/542590.as...

Is that absolute proof? No.

Is it better than some unknown guy claiming whatever? I think yes. YMMV.


All crypto software is guilty until proven innocent, but BitLocker is fine, that guy who wrote the blog post is sure about it.

Made by Microsoft and closed source, how could I not trust it.


Given that you're running a closed-source OS, you are pretty much out of luck anyway.



This is a blog post that points out that you can back up your keys in a Microsoft online account, among several other options. You've managed to condense it to "Bitlocker is a bad idea". Which leaves me to wonder whether you actually read the piece.


Seems the source has been 1984'd, but a while ago there were leaked Microsoft internal slides about how they store everyone's bitlocker key.


They store the keys of people who give them their keys.

Microsoft's code is the most heavily reverse engineered in the industry. It's 2014, not 1995; you can't summon a "closed source" bogeyman to win arguments about what they do and don't do.


Using Bitlocker over open-source software makes zero sense. Thanks to Snowden, we know for a fact that Microsoft backdoored many of their products and gave the NSA access.

I would be shocked if Bitlocker doesn't have a backdoor.


I also shudder at the thought of Bitlocker being the de facto standard for the average American's crypto needs. Microsoft is consistently the first to kneel before any and every demand of the NSA at the behest of their board members. Skype is a prime example: they were willfully deceptive about their ability to collect and record calls and metadata on the Skype network. Microsoft started working on integrating the NSA's PRISM into Skype eight months before they even purchased it. Well, that or they bought it with full knowledge of the PRISM integration, actively deceived, and never planned to mention it to the public until the Snowden leaks forced their hand.

Of course they say they only ever took action in accordance with their legal obligations and only after careful review, but their actions would appear to speak for themselves IMO. Bitlocker is no different from Skype. It's probably fine if you're trying to protect yourself from the prying eyes of an average citizen, but it is also probably entirely useless against someone with enough influence and/or connection to the government's secret ops.

http://www.theguardian.com/world/2013/jul/11/microsoft-nsa-c...



tl;dr:

The crypto component of Windows was discovered to have a 1024-bit public key embedded within it whose symbol name is _NSAKEY. The "obvious" assumption was that this permits the NSA to read, sign, or authenticate anything for Windows. Microsoft denies this and says that "we have not shared this key with the NSA or any other party".

I remember when this story first broke. I thought it would lead to major embarrassment and repercussions for Microsoft. Boy was I wrong. There was no news coverage; other than a small subset of techies, nobody was concerned; and as far as I know, Microsoft never gave a technical explanation of how it was intended to be used.


There was no news coverage because virtually no expert in the field believes it to have been an NSA backdoor. The arguments suggesting it is were debunked by Bruce Schneier more than a decade ago. Among them, the obvious: Microsoft held the other key, the "non-NSAKEY" key, and could do anything that the NSAKEY can do. If NSA can coerce Microsoft into adding a whole new signing key, it could just as easily have coerced Microsoft into signing things with Microsoft's own key. Given that, an "NSAKEY" backdoor is irrational.

In promoting the notion that "NSAKEY" is a backdoor, you're lining up against the likes of Bruce Schneier and lining up alongside the likes of WorldNetDaily, which is indeed among the top Google search results for [NSAKEY backdoor].


Anybody use this yet? What's the legitimacy of this compared to Truecrypt's fallen domain?

I see they actually list the developers, which is nice. But unfortunately I'm not too familiar with the big names in cryptography software.


Reading the "news" section, they actually still haven't made any changes, they just managed to compile it!

Another project that forked the TrueCrypt sources into its own repo (VeraCrypt) managed to change a few constants and make the containers incompatible. At least they addressed a known TC weakness: the low number of PBKDF2 passes.
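
(A quick illustration of why the pass count matters, using Python's hashlib. The iteration counts are mine, not any project's real values; TrueCrypt used on the order of 1,000-2,000 passes depending on the hash.)

  # Each extra PBKDF2 pass multiplies the work an attacker must do per
  # password guess. Counts below are illustrative only.
  import hashlib, os, time

  password = b"correct horse battery staple"
  salt = os.urandom(64)

  for iterations in (1000, 500000):
      t0 = time.perf_counter()
      hashlib.pbkdf2_hmac("sha512", password, salt, iterations, dklen=64)
      print("%7d passes: %.1f ms per guess"
            % (iterations, (time.perf_counter() - t0) * 1000))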

Finally, the contribution of TrueCrypt.ch (TCNext) is that they put the original sources and binaries behind a new domain name and wrote "TrueCrypt must not die."

In short, it's easy to put others' work behind a new domain or in another repo; it's hard and costly to do the real development.


The fact that the developers are known and mostly US-based means you shouldn't trust their product. Any of them can be silenced by a secret gag order and forced to introduce a backdoor/weakness, and there isn't much they would be able to do about it other than move to Russia and reveal it.


From what I understand, the author(s) explicitly do not want this forked. The license has not been changed, and whatever fork this is will be from an older Linux repo.


Is this the same project as the TrueCryptNext one, or a different one?



