
I've always wondered just how much DDoS traffic a network run by someone like Google or Facebook, or one of the other absolute top-tier providers like AWS or Azure, might actually be able to handle.

Presumably these giants can easily absorb such traffic as long as someone is willing to pay for the privilege? 665 Gbps seems tiny in comparison to the capacity someone like Google might have at its disposal, but I'm speculating, as I haven't seen anything detailing their network stats.

To give this waffle some sort of conclusion: I respect Google for running this public-service style of protection for sites that have a strong enough 'public good' element.




Right, we don't talk (in detail) about the "capacity" of our network. However, you can see that we just added a crazy new submarine cable to Japan (FASTER, https://plus.google.com/+UrsHölzle/posts/c6xP4PGmTAz) with 60 Tbps of capacity (I think we've reported somewhere that Google will use 10 Tbps of that). And in many ways it's a lot easier and cheaper to do networking over land, say within the US or Europe, than between the west coast and Japan ;).

So yes, 665 Gbps is well within our network capability.

Disclaimer: while I work on Google Cloud, I'm no networking expert.


Very interesting, thanks. I had somehow not seen Google's involvement in this project.

Reading between the lines of what you said, it's clear just how big your network could be!


For what it is worth: my GCP-hosted site (https://cloud.sagemath.com) was hit by a DDoS attack in April (a WordPress amplification attack), which was about 5 GB/s at its peak. The GCP network had no problem handling the traffic, but the Linux network stack in my cluster of GCE VMs -- which were running nginx -- simply couldn't handle the load. I now use CloudFlare.
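
For anyone hitting the same thing: WordPress pingback amplification traffic usually identifies itself with a "WordPress/x.y; ...; verifying pingback" User-Agent, so a fair amount of it can be rejected at the nginx layer before it touches the application. A rough sketch only, not the actual sagemath config; the port, backend address, and rate numbers are placeholders:

    # Hypothetical mitigation sketch, not the actual sagemath config.
    # Assumes the flood is WordPress pingback traffic, which identifies
    # itself via its User-Agent header.

    # (inside the http {} block)
    # One shared 10 MB zone of per-client-IP counters, limited to 10 req/s.
    limit_req_zone $binary_remote_addr zone=perip:10m rate=10r/s;

    server {
        listen 80;

        # Reject pingback amplification traffic outright; 444 makes nginx
        # close the connection without sending a response.
        if ($http_user_agent ~* "WordPress") {
            return 444;
        }

        location / {
            # Allow short bursts, then answer 429 instead of queueing requests.
            limit_req zone=perip burst=20 nodelay;
            limit_req_status 429;

            proxy_pass http://127.0.0.1:8080;  # placeholder app backend
        }
    }

The caveat is that at ~5 GB/s the kernel may already be dropping packets before nginx sees anything, which matches the "Linux network stack couldn't handle the load" observation; past that point the filtering has to happen upstream, which is what a service like CloudFlare provides.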


Sorry for your troubles! Why'd you leave us though, Bill? (Also, did Support help you out with this or not? You can email me your case number if so.)

Disclosure: I work on Google Cloud.


It sounds like too much abusive traffic reached the VMs. I guess Google doesn't have a product similar to CloudFlare for normal sites, one that can throw up captchas, enable caching, etc. in response to an attack.
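
The caching half of that can at least be approximated directly in nginx with a short-lived "microcache", so a flood of identical GETs only reaches the backend roughly once per second per URL. A sketch under that assumption; the cache path, zone name, and backend address are placeholders:

    # Hypothetical microcache sketch; paths, zone name and backend are placeholders.

    # (inside the http {} block)
    proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=micro:10m max_size=1g;

    server {
        listen 80;

        location / {
            proxy_cache micro;
            proxy_cache_valid 200 1s;                    # cache OK responses for one second
            proxy_cache_lock on;                         # collapse concurrent misses into one upstream fetch
            proxy_cache_use_stale updating error timeout;
            proxy_pass http://127.0.0.1:8080;            # placeholder app backend
        }
    }

Captcha-style challenges are the part that genuinely needs a fronting service like CloudFlare, since they have to sit in front of your whole edge.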


I recall that Google was taken down by a botnet in 2003, but I'm not able to find any articles about it.

Today it's different: not only do they have a massive network, they probably also have firewalls that drop such attacks from the start.


That wasn't Google; that was Yahoo!, Amazon, CNN, and a few others, taken down back in 2000 by a script kiddie who went by MafiaBoy, using someone else's botnet.

https://en.wikipedia.org/wiki/MafiaBoy



