That runs perl multiple times, often in invocations that are effectively no-ops. To reduce the number of perl invocations, you can use xargs (with -0).
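A sketch of what that looks like (assuming GNU or BSD find and xargs; -print0/-0 keep filenames containing spaces or newlines intact):

    find . -type f -print0 | xargs -0 perl -p -i -e 's#foo#bar#g'

Compared with running perl once per file, xargs packs as many file names as fit into each command line, so perl starts far fewer times.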




No need for `xargs` in this case; `find` has been able to take care of this for quite some time now, using `+` instead of `;`:

    find . -type f -exec perl -p -i -e 's#foo#bar#g' {} +

xargs constructs a command line from the find results, so if **/* exceeds the max command line length, so will xargs.

xargs was written to avoid that problem, so no, it won’t. https://man7.org/linux/man-pages/man1/xargs.1.html:

“The command line for command is built up until it reaches a system-defined limit (unless the -n and -L options are used). The specified command will be invoked as many times as necessary to use up the list of input items. In general, there will be many fewer invocations of command than there were items in the input. This will normally have significant performance benefits.”

Your only risk is that it won’t handle a single input that is, on its own, too long.
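A quick way to watch that batching happen (a sketch assuming GNU coreutils and xargs; the exact count depends on your system’s limits):

    seq 200000 | xargs echo | wc -l

Each echo invocation prints a single line, so a result greater than 1 means xargs split the arguments across several command lines rather than failing. GNU xargs also has a --show-limits option that reports the command-line buffer size it will use.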



