It depends a bit on circumstance, but I'd start with "way too much software binds to 0.0.0.0 by default", "way too much software lacks decent authn/z out of the box, or has no authn/z at all out of the box", and "developers are too lazy to change the defaults".
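To make the 0.0.0.0 point concrete, here's a minimal sketch of the difference between the two bind addresses (the port numbers are ephemeral; nothing here is from the original comment beyond the addresses themselves):

```python
import socket

# Bind only to loopback: the service is reachable from this host only.
loopback = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
loopback.bind(("127.0.0.1", 0))          # port 0 picks a free ephemeral port
loopback_addr = loopback.getsockname()[0]

# Bind to the wildcard address: reachable on every interface the host has.
# This is the risky default so much software ships with.
wildcard = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
wildcard.bind(("0.0.0.0", 0))
wildcard_addr = wildcard.getsockname()[0]

print(loopback_addr, wildcard_addr)

loopback.close()
wildcard.close()
```

A service that defaults to the first form needs a deliberate config change to be exposed; one that defaults to the second is exposed the moment it starts.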
Do you mean "why is running a firewall on an individual host useful"? Single-application hosts are quite common, and sadly some applications do not have adequate authentication built-in.
Do you mean "why does Linux allow firewalling based on the source host"? Linux has a flexible routing and filtering policy system that can be used to implement useful controls; the source host is just one of the available fields, and it's not meant to be used for trusting on a per-host basis.
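For illustration, here's a hedged nftables sketch where source address is just one match field among several (the interface name, subnet, and addresses are all made-up assumptions, not anything from the thread):

```
# /etc/nftables.conf fragment (illustrative only)
table inet filter {
    chain input {
        type filter hook input priority 0; policy drop;
        ct state established,related accept
        iifname "lo" accept
        # Source host/subnet is one matchable field, combined with others:
        ip saddr 192.0.2.10 tcp dport 5432 accept   # one admin box may reach Postgres
        ip saddr 10.0.0.0/24 tcp dport 22 accept    # SSH from a management subnet only
    }
}
```

The point is that source-address matching is a building block for rules like these, not a mechanism for declaring whole hosts trusted.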
It's a catch-all in case any single service is badly configured. This often happens while people are fiddling around trying to configure a new service, which is exactly when they are at their most vulnerable.
There's always an edge case. You have to know the various security controls well enough to aim for the target risk outcome, rather than treating the target outcome as one specific implementation. The security hires who make challenging employees are the latter type.
An edge case that matches your answer in spirit: a public-facing server where you can't put a hardware firewall in-line, can't do ACLs for some reason, and can't run EDR on it... at least put a host-level firewall on the Linux box and hope for the best.