How about expecting that the DB doesn't bind to 0.0.0.0 by default, and that passwords are forced to be set during installation? Is that such an unreasonable thing to expect?
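Some databases do ship that way already; PostgreSQL's stock postgresql.conf, for instance, only listens on loopback until you change it:

```
# postgresql.conf (shipped default)
listen_addresses = 'localhost'   # loopback only; set to '*' to expose on all interfaces
```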
The more things change, the more they stay the same. I'm thinking of the people who, back in the 90s/early noughties, didn't want to bother with a middle-tier application and instead talked directly to the database from their fat clients(1) because it was easier.
(1) This, btw, is what many SPAs are: fat client apps running within a browser. When I came back to web development four years ago, one of the biggest surprises was how much like desktop development it had become in many ways.
This is Hacker News not Fox News. At least be intellectually honest.
It's not insecure by default. It just binds to all interfaces. Apache and Nginx both do this, and we don't consider them insecure. Should a database be doing it? That's debatable, since it's a tradeoff between security and ease of use.
That said, if you're running an internet-facing server without a firewall, you have bigger problems than just your database.
I've always thought that the biggest service OpenBSD did us was teaching people to remove unneeded stuff and turn off unused services. Remember when people used to sneer that X years without a remote root hole in the default install was no wonder, because the base install didn't do anything useful?
I also don't get this "firewall" idea. Why make something listen on everything, and then place a system outside it to restrict access? Why not just whitelist what you want to listen on in the first place?
Note, I do get binding an application to localhost and then letting a dedicated proxy (e.g. stunnel or HAProxy) do the heavy lifting of linking up with other systems. But what does packet-level filtering really gain you on top of that?
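To make the distinction concrete, here's a minimal Python sketch (ephemeral ports, no real service behind it): a socket bound to loopback simply isn't reachable from other hosts, no firewall needed, while a 0.0.0.0 bind is what creates the exposure a firewall then has to paper over.

```python
import socket

# Bound to 127.0.0.1: only processes on this machine can connect.
# No external packet filter is needed to keep outsiders away.
local_srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
local_srv.bind(("127.0.0.1", 0))   # port 0 = let the OS pick one
local_srv.listen()

# Bound to 0.0.0.0: listens on every interface, so any host that can
# route to this machine can reach it. This is the case a firewall
# exists to restrict after the fact.
open_srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
open_srv.bind(("0.0.0.0", 0))
open_srv.listen()

print(local_srv.getsockname()[0])  # 127.0.0.1
print(open_srv.getsockname()[0])   # 0.0.0.0

local_srv.close()
open_srv.close()
```

The whitelist-first argument is just: make the first bind the default, and the second one a deliberate choice.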
In general I see firewalls as just adding complexity: one more source of bugs and potential mis-configuration. (See the fun when IPv6 exposes the soft inner network that everyone thought was "firewalled", when in fact it just had broken connectivity due to crappy NAT born from the scarcity of routable addresses.)
If you don't necessarily know what's going to be running on a machine, a firewall gives you control over what's allowed in or out. If a lazy dev installs some tool that listens on everything, on a machine that's on the internet, a firewall will protect you from their laziness.
In an ideal world, everyone would care about this stuff (and have time to properly set these things), but we're not in an ideal world.
Right. I would rather fix the broken developer once than paper over systems with a firewall. Perhaps I'm too idealistic. (IMHO proper devops does this: it gives devs a proper view of system administration by sharing knowledge and responsibilities.)
I'm also pessimistic enough that I think allowing development to install back doors (eh, "useful helper daemons") willy-nilly in production systems is a bad idea ;-)
Who said the devs have access to the production systems? :) You can still lose valuable information with the loss of a testing server.
But you're acting as if you know which dev is the one who is going to do 'it'. I'd rather have a firewall that is largely set-and-forget than keep tabs on teams of devs that go through hiring cycles. There's already enough for ops folks to do without having to psychologically evaluate developers... besides, I've been through a few devs who agree sincerely not to do $bad_thing, and then caught them a few days/weeks/months later doing it again.
It's sad, really. I'm not even a security zealot, but I have overheard the folks in my small company tell each other not to let me know that they've signed up to a SaaS with a weak password (company name + digit).
Oh, I suspect all devs, I just think a sound process around small, cross-disciplinary devops teams is the preferred approach.
I get that a firewall can sometimes help fight broken practices (e.g. binding on all interfaces, no password by default). But if your devs end up deploying password auth in general (rather than key/cert-based auth), and weak passwords in particular, your firewall is unlikely to help in the case where a service is supposed to be exposed.
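For SSH at least, enforcing that policy is a couple of lines in sshd_config; other services vary:

```
# /etc/ssh/sshd_config
PasswordAuthentication no          # keys/certs only
PermitRootLogin prohibit-password  # root can never log in with a password
```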