
Redis is the best data store I've had the fortune to work with: no-nonsense data structures, extremely reliable performance, a beautiful API, and it's very easy to manage.

It also earned me a lot of praise for using redis-cli --pipe to bulk insert data, which brought a data import job down from 4 hours to a few minutes. I eventually built a wrapper around redis-cli --pipe to use it with a cluster.
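For context, redis-cli --pipe reads commands pre-encoded in the Redis wire protocol (RESP) from stdin and streams them to the server without a round trip per command, which is where the speedup comes from. A minimal sketch of a protocol generator (file name and key scheme are made up for illustration):

```python
def gen_redis_proto(*args):
    """Encode one command in the Redis protocol (RESP),
    the format redis-cli --pipe expects on stdin."""
    proto = f"*{len(args)}\r\n"
    for arg in args:
        arg = str(arg)
        proto += f"${len(arg.encode())}\r\n{arg}\r\n"
    return proto

# Write one SET per record, then feed the file to redis-cli:
#   cat proto.txt | redis-cli --pipe
with open("proto.txt", "w", newline="") as f:
    for i in range(3):
        f.write(gen_redis_proto("SET", f"key:{i}", f"value:{i}"))
```

The newline="" matters on Windows, since RESP requires literal \r\n terminators that must not be rewritten by text-mode translation.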




You clearly haven't used it at scale with sorted sets.


Can you elaborate on that please?


Try inserting more than 30K elements into a single sorted-set key and watch the insertion time, memory and CPU usage. Now try doing this to millions of keys simultaneously.


We run several million sorted sets, but they are all short (hundreds of elements), and we do thousands of writes/sorts per second without issue.
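Keeping each set small, as described above, is often achieved by sharding one logical sorted set across many keys. A hypothetical helper (key scheme, shard count, and names are assumptions, not anything from the thread):

```python
import hashlib

NUM_SHARDS = 1024  # assumption: tuned so each shard stays in the hundreds

def shard_key(base: str, member: str, shards: int = NUM_SHARDS) -> str:
    """Map a member to one of `shards` sorted-set keys so that no
    single key grows large enough to hit the pathological case."""
    h = int(hashlib.md5(member.encode()).hexdigest(), 16)
    return f"{base}:{h % shards}"

# With a redis-py client (assumed), a write would target the shard:
#   r.zadd(shard_key("leaderboard", user_id), {user_id: score})
```

The trade-off is that global range queries (e.g. a top-N across all shards) now require merging results from every shard client-side.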

From memory - there was a setting to turn on/off gzip compression for a list once it went beyond a certain size - do you have this enabled?
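If I'm remembering the same knob: for sorted sets it isn't gzip but the compact-encoding thresholds. At or below them Redis stores the set as a ziplist (a listpack in Redis 7.0+, where the options are named zset-max-listpack-*); above them it converts to a skiplist plus hash table, which uses far more memory per element. The defaults, as a redis.conf fragment:

```
# Sorted sets at or below both thresholds use the compact
# ziplist encoding (listpack in Redis 7.0+); larger ones are
# converted to a skiplist + hash table.
zset-max-ziplist-entries 128
zset-max-ziplist-value 64
```

(The setting that actually compresses list nodes with LZF is list-compress-depth, which applies to lists, not sorted sets.)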



