> Writing software that may run in a potentially hostile environment should be done with great care, and a thorough analysis of what you are actually protecting against.
The first section of his documentation states what the purpose of the module is and what it is trying to protect against. The use of RSA there is pretty reasonable, so I don't know why you're being so hard on him. It also talks about how this project is a variation on another project, so it's not like he is going off half-cocked and implementing something crazy.
> This is a job for people who take programming seriously, not amateurs
Everyone starts out as an amateur. You can either be an internet know-it-all who wants to show off how little you know, or you can provide constructive feedback.
> who can't be bothered to read the documentation of the modules they are using, e.g. http://stuvel.eu/files/python-rsa-doc/usage.html
> Note especially the bits about signing RSA, and how to use it to encrypt files. Doing raw RSA on strings longer than 245 bytes is unwise, and inventing new protocols for RSA is unwise. That the author does these things is strong evidence that the author is not demonstrating "great care".
Did you bother to read the documentation yourself? It directly talks about encrypting large files. Yes, it's true that trying to encrypt more data than is supported for a key size will leak information, which is why implementations throw an error if you try to do that. The usage documentation you pointed to discusses that and provides options to cope with it. But, in this case, he is not even encrypting the whole file, just the values of certain properties, which are probably going to be small enough to not require any extra measures.
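To make that concrete, here's a rough sketch using the python-rsa package from the documentation linked above (the key size and property value are made up for illustration): a short property value encrypts fine, and an oversized payload is simply refused with an OverflowError rather than silently producing weak output.

    # Sketch only; assumes the python-rsa package ("import rsa") linked above.
    import rsa

    # 2048-bit key: PKCS#1 v1.5 padding leaves room for at most 245 bytes of payload.
    (pub_key, priv_key) = rsa.newkeys(2048)

    # A short property value fits comfortably.
    secret = b"db_password=hunter2"
    ciphertext = rsa.encrypt(secret, pub_key)
    assert rsa.decrypt(ciphertext, priv_key) == secret

    # Anything larger than the key allows is rejected outright.
    try:
        rsa.encrypt(b"x" * 500, pub_key)
    except OverflowError as exc:
        print("refused:", exc)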
And, again, what does signing have to do with how RSA is being used in greybox?
> If the author believes forgery/tampering is orthogonal and/or protection from in-process attacks, then the author should state that. That's part of the analysis step.
They did state the scope of the project and there is really no need for them to go into deep detail about deployment. Are you going to criticize him for not pointing out that the whole system needs to be secured from tampering? Of course not, so stop acting like a douchebag.
>> Of course, the key can be recovered if you have the ability to ptrace. What's your point? That's going to be true of any solution out there.
> Nonsense. If you delete the key from memory, it is obvious that someone cannot use ptrace to recover it afterwards. Detailing what is at risk, and what is not at risk is part of the analysis step.
You have to ask, where did the key come from in the first place? The private key file has to exist somewhere that is accessible on the box so that the process can read it in the first place. If someone has ptrace, they can probably read a file as well.
Ultimately, it's about the level of risk people are willing to live with. For what greybox seems targeted at, it's encrypting properties in files that are committed to a repo and then using a private key that is only available on certain machines to read that property. For some use cases, that is probably a perfectly fine level of security and better than what is being done in some cases anyways.
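For what it's worth, that workflow doesn't need to be anything exotic. Here's a rough sketch of the general idea using python-rsa; the helper names and file handling are mine, not greybox's actual API.

    # Hypothetical helpers illustrating the workflow described above,
    # not greybox's real interface.
    import rsa

    def encrypt_property(value: str, pub_key_path: str) -> bytes:
        """Encrypt one property value with the public key kept alongside the repo."""
        with open(pub_key_path, "rb") as f:
            pub_key = rsa.PublicKey.load_pkcs1(f.read())
        return rsa.encrypt(value.encode("utf-8"), pub_key)

    def decrypt_property(ciphertext: bytes, priv_key_path: str) -> str:
        """Decrypt on a host that actually holds the private key (e.g. a deploy target)."""
        with open(priv_key_path, "rb") as f:
            priv_key = rsa.PrivateKey.load_pkcs1(f.read())
        return rsa.decrypt(ciphertext, priv_key).decode("utf-8")

The point being: only the hosts that have the private key file can recover the property values, and anything committed to the repo stays opaque.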
> You have to ask, where did the key come from in the first place?
It is sensible to type it in while the system is still in single-user mode.
If you left the private key in a file that is right next to your encrypted configuration file, then it remains as I stated previously: No security except false security.
> Everyone starts out as an amateur.
This is not the forum for an education on programming. A library that states it was written as a learning exercise with a request for criticism will be treated that way. A library proposed to solve problems recognised in the linked article will not.
> Ultimately, it's about the level of risk people are willing to live with.
People are notoriously bad at recognising and evaluating risk.
> If you left the private key in a file that is right next to your encrypted configuration file, then it remains as I stated previously: No security except false security.
Shut up and read the goal of the project already. It is trying to protect secrets that are stored in a repo and it achieves that goal.
Secrets on a deployed host will always be in the clear on that host, it's just a fact of life. Many barriers can be put in place, but at the end of the day, a program will always need access to the plaintext version of the secret at some point.
> It is sensible to type it in while the system is still in single-user mode.
No it isn't. Services have to restart all the time; saying that a human has to be ready to type in a password at any moment is not practical.
> This is not the forum for an education on programming.
But it is a forum for you to post invalid criticisms of a project and act like a dick? That's bullshit. This is "hacker news", it's a perfectly fine place for a technical discussion. If you thought there were problems with the project and this wasn't the right place to discuss it, then file issues on the github project.
> A library that states it was written as a learning exercise with a request for criticism will be treated that way. A library proposed to solve problems recognised in the linked article will not.
There is no difference, you're just trying to justify your bad behavior.