
Typically the issue is not the commands themselves but the shell. Running "something *" on the command line expands the glob, and if the resulting argument list is too long (it exceeds the kernel's ARG_MAX limit), exec fails with "Argument list too long" — and a badly written shell may do anything from crashing to silently not running the command instead of giving a useful error message.
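For reference, a quick way to see the limit in question (a sketch; the exact value varies by system, and on Linux the effective per-run limit also depends on stack size):

```shell
# ARG_MAX is the kernel's cap on the combined size (in bytes) of the
# argument list plus environment passed to execve(); exceeding it makes
# exec fail with E2BIG, which shells report as "Argument list too long".
getconf ARG_MAX
```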



Not claiming to be an expert here but I typically do this when confronted with a large number of files.

Instead of: command *

    for i in [someregex]*
    do
        echo "$i"       # echo each file so I can see progress
        command "$i"
    done

I know I could also do command [someregex]* but I like the comfort of having each item echoed back to the terminal so I can see the progress.


That still relies on the shell expanding a glob of millions of files. Another method is to use 'find' and 'xargs' to avoid specifying the files as arguments explicitly.
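A minimal sketch of that approach (the '*.log' pattern and gzip are placeholders, not from the thread): find emits matching names itself, and xargs batches them into argument lists that stay under ARG_MAX, so no single exec ever gets millions of arguments.

```shell
# Stream filenames from find to xargs, NUL-delimited so filenames with
# spaces or newlines survive; xargs splits them into safely sized batches.
find . -maxdepth 1 -name '*.log' -print0 | xargs -0 gzip
```

With GNU find you can go one step further and skip xargs entirely via `find ... -exec gzip {} +`, which batches arguments the same way.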



