It might be because on the server side people usually don't care much about energy or RAM. The StackOverflow dev team has an interesting blog post somewhere where they explain that at one point they found C#'s GC to be the bottleneck and had to do a lot of optimizations, at the expense of extra code complexity, to minimize the GC overhead.
It's actually quite rare for companies to think about their infrastructure costs; they're usually just taken for granted. Besides, there aren't many ARC languages around.
Anyway, I'm now rewriting one of my server projects from PHP to Swift (on Linux), and there's already a world of difference in performance. For multiple reasons of course, not just ARC vs. GC, but still.
With all due respect, (big) servers care about energy costs a lot, at least as much as mobile phones do. By the way, of the managed languages, Java has the lowest energy consumption. And RAM draws the same energy whether it's full or empty.
Just because GC can be a bottleneck doesn't mean it is bad, or that the alternatives wouldn't have an analogous bottleneck. Of course one should try to decrease the number of allocations (the same way you have to with RC), but there are certain kinds of allocations that simply have to be managed, and for those a modern GC is the best choice in most use cases.
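To make the allocation point concrete, here's a minimal Java sketch (the class name, method names, and input sizes are made up for illustration) of the same work done with and without per-element heap allocations. The boxed variant churns out a short-lived object on almost every step, while the primitive variant allocates nothing in the hot loop; that kind of rewrite pays off whether the heap is managed by a GC or by reference counting.

```java
import java.util.concurrent.ThreadLocalRandom;

public class AllocationPressure {

    // Boxed version: each element is copied into a heap-allocated Integer
    // (the values are outside the small-integer cache), and the accumulator
    // is re-boxed on every iteration, so a large input produces millions of
    // short-lived objects for the collector to deal with.
    static long sumBoxed(int[] values) {
        Integer[] boxed = new Integer[values.length];
        for (int i = 0; i < values.length; i++) {
            boxed[i] = values[i];            // autoboxing: one allocation per element
        }
        Long total = 0L;                     // boxed accumulator
        for (Integer v : boxed) {
            total += v;                      // unbox, add, re-box
        }
        return total;
    }

    // Primitive version: same result, no allocations in the hot loop.
    static long sumPrimitive(int[] values) {
        long total = 0;
        for (int v : values) {
            total += v;
        }
        return total;
    }

    public static void main(String[] args) {
        int[] data = ThreadLocalRandom.current()
                .ints(1_000_000, 1_000, 1_000_000)
                .toArray();
        System.out.println(sumBoxed(data));
        System.out.println(sumPrimitive(data));
    }
}
```

The trade-off is extra care and bookkeeping in the code versus allocation pressure at runtime, which is the same tension the comments above describe.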