
I heartily recommend using concurrent.futures. It has been part of the standard library since Python 3.2 - http://docs.python.org/dev/library/concurrent.futures.html - and a backport is available for earlier Python versions - https://pypi.python.org/pypi/futures

Behind the scenes it uses multiprocessing and/or threading plus queues. I have one function that adds the relevant command line arguments (number of workers, threads vs. processes, debug mode) and another that returns an executor configured from those arguments.

The debug mode forces a single thread and is intended for when you want to run pdb, since multiple threads/processes make debugging considerably more complicated. Roughly, the pattern looks like the sketch below.
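(A minimal sketch under my own assumptions - the option names and the two helpers, add_worker_args and get_executor, are made up for illustration, not the exact code:)

    import argparse
    from concurrent import futures

    def add_worker_args(parser):
        # Hypothetical helper: attach the worker-related options.
        parser.add_argument("--workers", type=int, default=4,
                            help="number of worker threads/processes")
        parser.add_argument("--processes", action="store_true",
                            help="use processes instead of threads")
        parser.add_argument("--debug", action="store_true",
                            help="force a single worker so pdb is usable")

    def get_executor(args):
        # Hypothetical helper: return an executor matching the arguments.
        if args.debug:
            return futures.ThreadPoolExecutor(max_workers=1)
        if args.processes:
            return futures.ProcessPoolExecutor(max_workers=args.workers)
        return futures.ThreadPoolExecutor(max_workers=args.workers)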

I've been programming with import Queue and import threading for years now (probably 8 or 9). I don't want to drop all my completely fine multithreading code for Yet Another Threading API.
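(For context, a bare-bones version of that classic pattern - Python 2 spelling to match the imports, with a made-up worker function:)

    import Queue
    import threading

    def worker(q):
        # Pull items until a None sentinel tells this thread to stop.
        while True:
            item = q.get()
            if item is None:
                break
            print("processing %r" % (item,))

    q = Queue.Queue()
    threads = [threading.Thread(target=worker, args=(q,)) for _ in range(4)]
    for t in threads:
        t.start()
    for item in range(10):
        q.put(item)
    for _ in threads:
        q.put(None)  # one sentinel per worker thread
    for t in threads:
        t.join()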

Personally I think multiprocessing is just dopey. Why force me to program around two different address spaces when I can just have one?


I'd also done the Queue/threading thing for the longest time, and I'd use multiprocessing some of the time. But the choice was arbitrary, and I had to make sure that shutdown, exceptions, etc. were handled correctly in each case.

> Yet Another Threading API

concurrent.futures is not yet another threading API. A future is a standard async work/result wrapper, and the process and thread executors are standard executor APIs.
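For example, plain concurrent.futures usage (nothing here is specific to anyone's codebase):

    from concurrent import futures

    def square(x):
        return x * x

    with futures.ThreadPoolExecutor(max_workers=4) as executor:
        # submit() returns a Future right away; result() blocks until the
        # work finishes and re-raises any exception from the worker.
        future = executor.submit(square, 7)
        print(future.result())  # 49

        # map() is like the builtin, spread across the pool.
        print(list(executor.map(square, range(5))))  # [0, 1, 4, 9, 16]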

> Personally I think multiprocessing is just dopey

Then don't use it. But be aware that it works well for many people. Some of my code scales linearly with multiprocessing, and all I had to do was use a process executor instead of a thread executor.
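A sketch of what that swap looks like (the CPU-bound function is made up; both executor classes implement the same Executor interface):

    from concurrent import futures

    def crunch(n):
        # Stand-in for CPU-bound work that the GIL serializes under threads.
        return sum(i * i for i in range(n))

    if __name__ == "__main__":
        # The only change needed was this one line:
        # executor = futures.ThreadPoolExecutor(max_workers=4)
        executor = futures.ProcessPoolExecutor(max_workers=4)
        with executor:
            print(list(executor.map(crunch, [10**6] * 4)))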
