
The traffic between the attacker and the private network would have been properly encrypted.

Assuming they chose a sensible cipher suite, where is the "crypto flaw"?

Maybe the problem is not the crypto but the strange system devised for authentication of an endpoint.

I call this x509. In my opinion this bizarre^1 "trust model" dating back to a long-past era in computing is one of the reasons why MITM on SSL/TLS^2 in today's world of computing is so easy. The other reason is the trust placed in domain names over IP addresses. Both are systems that delegate "authority" to third parties.

1. https://www.cs.auckland.ac.nz/~pgut001/pubs/x509guide.txt

2. Or perhaps an attack like the one here.

OpenSSH can use x509 for authentication but by default it does not. It uses host keys. Maybe it's not even a "default" but it is the traditional usage.

CurveCP also uses keys instead of certificates.

The big difference in my view is that x509 relies on the idea of a "certificate authority" or "CA" that "issues" certificates for users.

Host keys require no such authority -- users generate their own.

With x509, there is a reliance on third parties that somehow are deemed "trusted". Host keys require no third parties and therefore no added layers of trust.
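To illustrate the point about users generating their own keys, here is a minimal sketch, assuming a POSIX shell with OpenSSH installed (the file name and comment are arbitrary examples):

```shell
# Generate an Ed25519 host key pair locally -- no CA or third party involved
ssh-keygen -t ed25519 -N "" -C "example host key" -f ./example_host_key

# Print the fingerprint a connecting user would verify out of band
# (e.g. over the phone) before trusting the host on first connection
ssh-keygen -l -f ./example_host_key.pub
```

The only "trust step" is the user comparing that fingerprint against one obtained through some independent channel; no issuing authority ever enters the picture.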

Certificates are a business^3. Host keys, to my knowledge, are not.

3. This is starting to change so that certificates can be "free" like host keys. I'd argue it is still a business however, due to the third party involvement in the process.

The widespread use of "self-signed certificates" seems to be an illustration of the desire by users to use certificates like host keys -- i.e., without needing a third party "authority" to "issue" them.
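For instance, assuming OpenSSL is available, producing a self-signed certificate takes one command and involves no third party at all -- exactly the host-key-like usage described above (the subject name "localhost" and the file names are just examples):

```shell
# Create a key and self-signed certificate in one step -- no CA "issues" anything
openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
    -subj "/CN=localhost" -keyout ./selfsigned_key.pem -out ./selfsigned_cert.pem

# Subject and issuer are the same entity: the user signed their own cert
openssl x509 -in ./selfsigned_cert.pem -noout -subject -issuer
```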


"... breaks the trust model of SSL/TLS,"

Certainly some of the encryption one can get via SSL/TLS is worth something. (But then one could use that encryption outside of TLS, too.)

And maybe some elements of the protocol are worth something.

But on the open internet is the "trust model" really worth anything?

It is so ridiculously easy to subvert. Cloudflare does it on a mass scale.

But one does not need to be Cloudflare to do it. The "inconvenience" of subverting SSL/TLS is minimal.

Any website who is delegating their DNS to some third party is potentially vulnerable, not to mention any user who is delegating their DNS lookups to a third party. Those are very large numbers.

Note I said open internet. I am not referring to internal networks.

Also - Question for the author: Was the archiving of dnshistory.org successful? Did they recently shut down and use Cloudflare to block ArchiveTeam?


Not sure I understand what you are saying. If you are saying that "Any website who is delegating their DNS to some third party is potentially vulnerable" to subverting SSL/TLS, then you are absolutely wrong. Malicious DNS can help the attacker to insert her servers between the user and the web service the user is trying to access, but it doesn't subvert TLS/SSL man-in-the-middle protection in any way.


Malicious DNS can request a cert for the domain via e.g. Let's Encrypt, then it can do whatever it wants.


My understanding is that it doesn't apply at least to EV certificates. Also, the parent says that "any user who is delegating their DNS lookups to a third party", but that can't apply to such users either.


> But on the open internet is the "trust model" really worth anything?

It is. I'd be the first to admit that the CA model is absolutely not a solution that works well overall[1], but regardless of that, it's very hard to get away with a non-targeted attack on TLS (e.g. by compromising a CA). Only targeted attacks are really viable; dragnet surveillance is not.

The problem with the way CloudFlare breaks the trust model is that it's broken for everybody - not just high-risk individuals in a targeted attack, but every single person that talks to a site going through CF. It's completely viable to do dragnet surveillance or modification without anybody realizing it, and this makes it a much bigger breach than the CA model.

> Any website who is delegating their DNS to some third party is potentially vulnerable not to mention any user who is delegating their DNS lookups to a third party. Those are very large numbers.

Not without making a lot of noise. In the context of not having a good way to establish trust for previously unknown entities (Web-of-Trust doesn't really work there), the best we can do - at least, until we find a better solution - is making tampering as public and noisy as possible, so that it becomes risky for a malicious actor to carry out large-scale attacks.

Keep in mind that DNS requests are not directly done by clients, but rather through hierarchical caching resolvers - assuming that CAs used something like Google's DNS servers, an attacker on the DNS provider's network would have to spoof the DNS responses to Google, and as such have a very large portion of the internet end up on the wrong DNS record.

With the amount of DNS history services and security companies monitoring DNS discrepancies, it'd be pretty much impossible to get away with this quietly. Any attempt at subverting the verification process by changing DNS records would immediately show up everywhere.

> Also - Question for the author: Was the archiving of dnshistory.org successful? Did they recently shut down and use Cloudflare to block ArchiveTeam?

Unfortunately, our archival effort was interrupted by the operators of dnshistory.org enabling "I'm Under Attack" mode. We did not have enough time to implement the bypass before the service shut down (although it is what caused me to write the bypass code linked from the article).

I have to say it was a rather strange case anyway. We'd contacted them well in advance - multiple times, I believe - to ask about obtaining a copy of their data (which would mean we didn't have to scrape their servers), and they'd completely ignored the messages.

Only after we'd contacted them to ask about the block did they reply with a biting message about "causing issues for other users on the site". Why they thought the impending shutdown and removal wouldn't cause issues for their users, I don't know.

[1]: http://cryto.net/~joepie91/blog/2015/05/01/on-mozillas-force...


I agree with the distinction you make between targeted and non-targeted. But I think being able to easily accomplish targeted attacks on SSL/TLS is a cause for concern -- and indeed that's what I'm thinking of. My thought is that it should not be possible for users to place such trust in something that is so easily subverted. As for DNS, I see no reason why one cannot encrypt DNS packets to prevent tampering. If users ignorantly want to use third party caches (which opens up more problems than just the one you mentioned), even when it's so easy to run a local cache, then we see arguments for another "trust model", e.g., DNSSEC, etc. Same problems.


Does RT still use mrsync? Great program.


Boots from SD card. Uses U-Boot for bootloader. Hardware support in two BSD projects as well as Linux.

Godspeed.


Why do they say "duckduckgo-owned"?

It appears they are using AWS. Who really owns the server?

   echo 50.18.192.251 duckduckgo.com >> /etc/hosts
Avoids needless DNS lookups; saves DNS logs from your footprints.


... and will break as soon as that instance is shut down.


That's why I'd prefer a search engine that was not using AWS, or Yahoo. Too many needless dependencies.

Out of curiosity I'm starting a counter today. Will post something when this IP addr fails. Feel free to take a guess how long it will be used by DDG.

And when it does change, if ever, I have a one-line shell script that uses ed to remove entries from files, so all it takes to add a new IP addr to HOSTS is typing

    ed-script-name domain-name /etc/hosts
    echo aa.bb.cc.dd domain-name >> /etc/hosts
In my opinion, the usefulness of a website is inversely proportional to the frequency with which it changes IP addresses.

For example, the IP address for HN rarely changes.

But feel free to keep looking it up in DNS every day. Just in case.

If there are problems with DNS, most users will probably not remember the IP address for HN and will not be able to read the stories.

Meanwhile, the user that stores IP addresses as a backup will have no such problems.


So it's not possible to log on without enabling Javascript?

I guess that's one way to coerce the user into enabling Javascript, at least temporarily.


This is the change-password form, not the login form.


"We started out collecting this information by accident, as part of our project to automate everything, but soon realized that it had economic value."

Is he saying the end results of the project to [insert stated project purpose here], e.g., "to automate everything", "to organize the world's information", etc., did not have enough economic value to sustain the project... and hence founders were "forced" to collect data as a means to generate value?

Here's another version: pre-Google search engines realized they could sell ad space, i.e., paid placements, e.g., to auto manufacturers.

Once the advertising industry became involved, then collecting data about the network's users, if one could do it, was a no-brainer.

"... really it's just the regular old world from before, with a bunch of microphones and keyboards and flat screens sticking out of it."

Not sure that young people who worship Silicon Valley want to believe this, and why should they?

The author always makes a good case for the potential long term consequences of the so-called "changes to the world" that many programmers are adamantly pursuing.

Maybe these programmers are not changing the world. Maybe they're just doing what others already did in the past, on a smaller scale, without the benefit of cheap electronics.


Is he saying the end results of the project [...] did not have enough economic value to sustain the project... and hence founders were "forced" to collect data as a means to generate value?

I don't see any reason to read that into his speech. We're used to the "you pay or are the product" dichotomy being propagandized at us as the reason for accepting surveillance business models. I don't see that here. I see it more like describing the realization that there is an(other) revenue stream staring the creator in the face.


Stallman's writing is reminiscent of The Jargon File or FOLDOC. There are many "silly" terms like the ones you mention in those collections. (More on textfiles.com.) Stallman was a part of that era. Maybe he's just an original 1970's "hacker" trying to stay true to his ideals. They liked to create their own counter culture vocabulary.

Then you have Zuckerberg who puts some blurb in Facebook's first SEC filings that his website^W company will follow the "hacker" ethos. What was he doing there, using that term? He sure has a lot of corporate sponsorship for a "hacker".

Imagine Stallman and Zuckerberg in a certain American game show that ran for several decades. Hint: The game's title relates to honesty.

"Will the real 'hacker' please stand up?"


I'll bet Zuckerberg and countless Facebook employees have used software written by Stallman.

At the risk of being wrong, I'd even go so far as to say they need this software.

But does Stallman ever need to use software written by Zuckerberg or Facebook employees?

Personalities and errors in judgment aside, give respect and credit where it is due.

I have never needed Facebook's software. And I doubt I ever will.

From a purely utilitarian viewpoint, my gratitude goes to Stallman and the people who brought us the internet. I'm not sure what I could thank Facebook for.

If Zuckerberg had failed to be in the right place at the right time, there would always be a substitute.

But if there was no Stallman back in the 70's and 80's, would we still have gcc, gdb and so much free, open source software?


Nobody needs GNU software; they can use FreeBSD which is just as good. They can also use clang and lldb instead of gcc and gdb.

You can debate about whether those would have existed in their present form without the leadership of the FSF, but then you'd be making a different argument.

By the way, I'm not sure if you are trying to imply that Stallman brought us the Internet but I can't see how that could possibly be true.


"...I'm not sure if you are trying to imply that Stallman brought us the Internet..."

No. The opposite. I'm trying to draw attention to the fact that besides free open source software, improvements in the network allow companies like Facebook to grow as big as they are.

Before clang, FreeBSD used gcc. I'm actually not a GNU/Linux user; I wish BSD did not have to use gcc, but for as long as I've been a user, release engineering has always used gcc. I would personally prefer to use something like pcc as opposed to clang.

Today, lots of companies use GNU software, when they could just as easily use BSD. And many BSD users still need gcc to compile their OS. Not how I expected things to turn out but attacking Stallman makes little sense at this point.

Instead, maybe we should be thanking him.

Companies like Facebook offer very little while taking all our personal information for their commercial use, while open source projects like BSD and GNU continue to offer a great deal, without requiring so much as an account or password, much less tracking our every move.


Needed. LLVM hasn't been around as long as Facebook, and for a long time you couldn't (or wouldn't want to) build the Linux kernel with it.


The BSD licence only exists because Stallman lobbied for it IIRC.


But the free (as in libre) software movement has done something socially that I don't think could have been accomplished with BSD.


"... linux laptops were being reformatted to run windows."

Microsoft/Windows is a cancer. Ballmer once said Linux was a cancer. He later retracted, after it was well-known Microsoft itself uses Linux.

There's nothing more pathetic than when you see Microsoft's businesspeople or lawyers at conferences all faithfully using Windows, as if there was no other choice, and rambling on about how their products can solve any problem. These are not stupid people, but they are blinded to independent reasoning about computer software.

And then there are the people at Microsoft Research. What a waste (not for them -- they probably get paid handsomely). It is like MS is keeping these minds locked away, so the zombie-like adherence to Windows can persist. Keep the monopoly going.

Microsoft is a cancer on the brain. It creates a zombie-like, tunnel vision of computing. Everything must pass through Redmond.

Microsoft continues to remain dangerous to the future of computing, because they continue to work diligently to effectively quell all independent thought from being implemented and made accessible to users.

Intent, malice, etc. is irrelevant. Regardless of why they do it, the end result is suppression of non-Microsoft software.

And now hardware.


This comment breaks the HN guidelines by calling names. Please don't do that on this site. The old-fashioned technology flamewar is one of the things we're trying to avoid here, regardless of how one feels about Microsoft etc.

We detached this subthread from https://news.ycombinator.com/item?id=11770051 and marked it off-topic.


OP: "... cemented them as assholes in my mind."

That's "name-calling".

But I understand the need to detach the silly hyperbolic responses my comment triggered. I was not expecting those.

Please accept my apologies for any inconvenience I may have caused.


What do you make of all the Windows-unrelated stuff that comes out of Microsoft Research? Seems like a lot of money and effort they're spending to cover their real mission of infecting the world with their zombie cancer.

I recently saw a Microsoft Research paper about a memory mapping prototype the authors implemented in Linux. I now realize they must have been personally executed by Satya "Kim Jong Un" Nadella when he learned of their non-Windows treachery.

We can only pray that one day these minds will be set free, and we can finally enter the golden age of AI that would surely have arrived 30 years ago if not for the scourge of Microsoft.


I'm surprised at the negative attitude you have of MSR. People at MSR seem to have it pretty good and are free to publish their results to the open, so I don't see anything wrong with what they do. It actually seems nice that Microsoft is willing to fund researchers on things that are not directly related to products like Theoretical CS.


There is nothing more PATHETIC than a designer using Windows. If only he could taste the wonderful FREEDOM of GIMP and throw down the shackles of Microsoft oppression!!! The CANCER would be gone from his brain and he would soar free like a majestic eagle on the winds of unZOMBIEfied morning forever. Yet he toils forever blinded to independent reasoning about computer software! The work of the blind designer cannot be true! Everything else is irrelevant.

Yet there is NOTHING more pathetic than a gamer using Windows. This is a cancer on his game-brain, nothing more than a zombie-like tunnel-vision of gaming. The gamers are not stupid people yet Microsoft keeps their capable mouse-hands LOCKED away so their skills will never develop beyond what Redmond envisions. Everything MUST BE CONTROLLED through One Microsoft Way, and so Microsoft continues to remain dangerous to the FUTURE of gaming! Lo and behold, the end result is SUPPRESSION OF ALL non-Windows games.

And now - hardware.


Is your argument that Microsoft is evil because not all people who use computers are actually interested in how they work?

It's not apparent to me why business people or lawyers should be also computer science aficionados. Should I be a telecoms engineer to use a phone or a combustion specialist to drive a car?


I think the idea is MS employees are hamstrung by having to use an inferior OS because it's the one their employer makes. It's like being a Honda employee who needs to tow a heavy load for some company project, and while a big Ford diesel F-350 would be perfect for the job, you're not allowed to use that and have to use a Honda Ridgeline instead, but the load is beyond the Ridgeline's towing capacity so you either have to use two Ridgelines (which is probably illegal and technically difficult), or just overload one Ridgeline (which is both illegal and very dangerous).

Or, you're an employee of Freightliner, and you need to go buy something from a local business, but you have to use a company vehicle. But the company vehicles are all Freightliner semi-tractors, and they don't have any regular cars in their fleet because they don't make cars. So you're expected to drive a semi-tractor to some local business to pick up something, and the local business doesn't even have a parking lot large enough for you to get the truck in.

Best tool for the job, and all that. Now arguably, Windows is indeed the best tool for business-type people working at MS, due to the application software, the network environment they're in, etc. However, for researchers doing heavy computing, I would say it's definitely not; it's quite inferior. There's a reason most of the supercomputers, render farms, etc. now run Linux, and it's not just license cost.


They are simply following incentives set by the economic environment they operate in.

