Not only inefficient; some Linux commands fail outright when invoked on more than a few million files at a time - there is a maximum size for the argument list they can be handed.
EDIT: bzbarsky's explanation below is more accurate.
Typically the issue is not the commands themselves but the shell. Trying to run "something *" on the command line makes the shell expand the glob, and if the resulting argument list is too long (it can exceed the kernel's limit, or the shell can simply fail to allocate memory for it), the result ranges from the shell crashing to it refusing to run the command and printing a useful error ("Argument list too long").
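For context, the kernel limit in question is ARG_MAX, the maximum combined size of the argument list and environment passed to exec(). A quick sketch of how to inspect it on a POSIX system:

```shell
# ARG_MAX is the kernel's cap on the total bytes of argv + environment
# that exec() will accept; a glob that expands past it makes exec fail
# with E2BIG, which shows up as "Argument list too long".
getconf ARG_MAX
```

On modern Linux this is typically on the order of a couple of megabytes, which millions of filenames can easily exceed.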
That still relies on the shell expanding a glob of millions of filenames. Another method is to use 'find' and 'xargs', which avoids passing the files as explicit arguments in the first place.
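A minimal sketch of the find + xargs approach (the scratch directory and file names are just illustrative; with millions of real files the glob version would hit the argument-list limit, while this one streams names in batches):

```shell
# Scratch directory with a handful of files standing in for millions:
dir=$(mktemp -d)
touch "$dir"/file1 "$dir"/file2 "$dir"/file3

# 'rm "$dir"/*' expands the glob in the shell and can exceed ARG_MAX.
# find prints the names instead, and xargs batches them so each rm
# invocation stays under the limit; -print0 / -0 handle odd filenames.
find "$dir" -type f -print0 | xargs -0 rm

rmdir "$dir"
```

xargs runs rm as many times as needed, so the total number of files never matters; only the size of each batch does.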