
That just pushes the task of optimising the workload up to you, complete with opportunities to forget about it & do it badly.

I don't relish the idea of splitting a file up into N chunks and running N greps in parallel, and would much rather that kind of "smarts" be in the grep tool itself.




It has no choice but to read file data in chunks or exhaust memory.

If you need to do N parallel searches, what better arrangement do you propose?


I propose the search tool decide how to split up the region I want searched, rather than me trying to compose simpler tools to try to achieve the same result.
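For context, the manual composition being objected to looks roughly like the sketch below: split a file into N byte ranges snapped to line boundaries, then scan each range concurrently. This is an illustrative sketch, not any real tool's implementation; all function names (`chunk_ranges`, `search_range`, `parallel_search`) are made up for the example, and it's exactly the boundary-handling bookkeeping that the commenter would rather have the search tool own.

```python
# Illustrative sketch of a manual chunked parallel search.
# Names are hypothetical; a real tool (e.g. ripgrep) handles this internally.
import os
from concurrent.futures import ThreadPoolExecutor


def chunk_ranges(path, n):
    """Split a file into n byte ranges, snapping each boundary
    forward to the next newline so no line straddles two chunks."""
    size = os.path.getsize(path)
    bounds = [0]
    with open(path, "rb") as f:
        for i in range(1, n):
            f.seek(size * i // n)
            f.readline()  # advance to the next newline boundary
            # keep boundaries monotonic even if a long line overruns
            bounds.append(max(f.tell(), bounds[-1]))
    bounds.append(size)
    return list(zip(bounds, bounds[1:]))


def search_range(path, start, end, needle):
    """Scan one byte range for lines containing the needle."""
    hits = []
    with open(path, "rb") as f:
        f.seek(start)
        while f.tell() < end:
            line = f.readline()
            if needle in line:
                hits.append(line)
    return hits


def parallel_search(path, needle, n=4):
    """Search the whole file by farming byte ranges out to workers."""
    ranges = chunk_ranges(path, n)
    with ThreadPoolExecutor(max_workers=n) as pool:
        futures = [pool.submit(search_range, path, s, e, needle)
                   for s, e in ranges]
        return [hit for fut in futures for hit in fut.result()]
```

Even this small sketch has to get the boundary snapping and degenerate-chunk cases right, which is the kind of thing that's easy to do badly when every caller reimplements it.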



