Hacker News

I guess it might be very useful in a cloud-like environment where you use this setup to pipe data across scaled-up servers.



I use it as a poor man's job queue.

cat jobs.txt | redis-pipe jobs

And then, on several workers: redis-pipe --count 10 jobs | python do-work.py | redis-pipe results

And then, at the end: redis-pipe results > results.txt

Or you can use it as a simple logging mechanism: tail -f /var/log/syslog | ./redis-pipe logs
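The fan-out/fan-in flow above can be sketched in Python. This is a hedged sketch of the list semantics the pattern relies on, not redis-pipe's actual implementation: a deque stands in for the Redis list, and `do_work` is a hypothetical placeholder for do-work.py. A real setup would use redis-py (`r.lpush("jobs", line)` on the producer, `r.rpop("jobs")` on the workers).

```python
from collections import deque

# Stand-in for the Redis list; real code would use redis-py's
# lpush/rpop against a running Redis server.
jobs = deque()

def push(queue, lines):
    """Producer side: `cat jobs.txt | redis-pipe jobs`."""
    for line in lines:
        queue.appendleft(line)  # LPUSH semantics: newest on the left

def pop(queue, count):
    """Consumer side: `redis-pipe --count 10 jobs` drains up to `count` items."""
    batch = []
    while queue and len(batch) < count:
        batch.append(queue.pop())  # RPOP semantics: oldest job first (FIFO)
    return batch

def do_work(line):
    """Hypothetical worker, standing in for do-work.py."""
    return line.upper()

push(jobs, ["job-1", "job-2", "job-3"])
results = [do_work(j) for j in pop(jobs, count=10)]
print(results)  # FIFO order: ['JOB-1', 'JOB-2', 'JOB-3']
```

Because LPUSH/RPOP gives FIFO ordering across the list, any number of workers can pop from the same queue without coordinating with each other.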


The poor man's job queue is Celery with a Redis backend, I guess.


Mine is even poorer :)



