This article starts by saying that there is something badly wrong with modern programmers.

It then details and critiques the way a typical one-man, one-site 'startup' is using discrete 'cogs' to build his system, presumably whilst learning how to market, build customer relationships and develop a beautiful and compelling product that makes enough money to keep him afloat.

I think the author may be missing the point. An elegant and sustainable back-end does not directly correlate with an elegant and sustainable business.




I don't think so.

Is installing and configuring Redis, installing and configuring a client library, and integrating the usage of that client library with your system really a better use of a one-man startup developer's time than writing

  new HashMap
and going back and fixing it later if it matters?

I had a three-hour argument the other night with a developer trying to implement some sort of complex caching logic for a fairly small blacklist file. I eventually had to make him go away and benchmark it ... at which point he realised what my original point was - loading that file off disk and parsing it on every request that needed it was actually faster than talking to a cache to get a pre-parsed version.
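A minimal sketch of that kind of benchmark. The file name and format (one entry per line, `#` for comments) are assumptions for illustration, not details from the thread:

```java
import java.nio.file.*;
import java.util.*;

public class BlacklistBench {
    // Parse a blacklist: one entry per line, '#' starts a comment (assumed format).
    static Set<String> parse(List<String> lines) {
        Set<String> entries = new HashSet<>();
        for (String line : lines) {
            String trimmed = line.trim();
            if (!trimmed.isEmpty() && !trimmed.startsWith("#")) {
                entries.add(trimmed);
            }
        }
        return entries;
    }

    public static void main(String[] args) throws Exception {
        Path file = Files.createTempFile("blacklist", ".txt");
        Files.write(file, Arrays.asList("# blocked hosts", "bad.example.com", "worse.example.net"));

        long start = System.nanoTime();
        Set<String> blacklist = parse(Files.readAllLines(file));
        long micros = (System.nanoTime() - start) / 1_000;

        // For a file this size, load-and-parse typically takes on the order of
        // microseconds -- compare that against a network round trip to a cache.
        System.out.println("parsed " + blacklist.size() + " entries in " + micros + " us");
    }
}
```

The point of measuring rather than assuming: the OS page cache means the "disk read" is usually a memory copy anyway.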

Overengineering is still overengineering, even when most of the components you use to do it are provided by somebody else.


I see your point, and if all you need is a HashMap then use a HashMap.

But if that data in the HashMap must be available to another process, or you need more "features", Redis suddenly looks like a very easy solution.


The point is that you don't need multiple processes and Redis most of the time. Your site can easily be served from one box and one process.


That depends on what you're creating and how. For better or worse[1], not everything follows the one-enormous-process Java way of doing things.

Btw, Redis is extremely fast to set up, but as I said, if all you need is a HashMap..

[1] - Almost always for the better if you ask me. That Java "enterprisy" way of development is the worst thing I have ever experienced in computing ;)


Even multiple threads - any sort of concurrent access/modification on the same dataset becomes problematic if you just use a HashMap.

Sites that have any kind of download/upload component, even without much traffic, will need to serve multiple requests concurrently using threads or processes.

That is where we normally use a relational database. However, Redis can be faster than conventional databases by storing everything in memory. This is as close to a concurrency-safe HashMap that you can get with the least amount of effort.


But Java comes with several concurrency-safe hash maps and trees and such - where have you been?

If you think that blocking TCP calls to a redis server that serialises everything is going to be faster than a `new HashSplayTree<String>()` you're absolutely bonkers.
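The concurrency-safe maps this comment alludes to live in `java.util.concurrent`. A minimal sketch using `ConcurrentHashMap` (the class names and counts here are illustrative, not from the thread):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.CountDownLatch;

public class SharedCounter {
    // In-process, thread-safe map: no server, no serialisation, no network hop.
    static final Map<String, Integer> hits = new ConcurrentHashMap<>();

    public static void main(String[] args) throws InterruptedException {
        int threads = 8, perThread = 10_000;
        CountDownLatch done = new CountDownLatch(threads);
        for (int i = 0; i < threads; i++) {
            new Thread(() -> {
                for (int j = 0; j < perThread; j++) {
                    // merge() is atomic, so concurrent increments are never lost
                    hits.merge("page", 1, Integer::sum);
                }
                done.countDown();
            }).start();
        }
        done.await();
        System.out.println(hits.get("page")); // 80000
    }
}
```

With a plain `HashMap` the same code would lose updates (or worse, corrupt the table); the atomic `merge` is what makes the in-process version safe without any external store.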


"This is as close to a concurrency-safe HashMap that you can get with the least amount of effort."

No, it's more effort than using a concurrency-safe HashMap in a JVM process.

