Considering PHP's scope and its limitations (it lacks things like threads, for example), I would say that is an extremely rare scenario. If you're storing 100,000 integers in a PHP data structure, you're most likely doing it wrong.
Isn't PHP-FPM a FastCGI process manager? I don't see any references to threading in its documentation. I also recall the PHP documentation saying PHP is unsafe with multiple threads due to a large number of libraries that are not coded to be thread-safe.
Wow. I always thought it was threaded. My mistake. It seems it launches multiple child processes. There is some form of memory sharing going on, though. Looks like it's some sort of a hybrid. Now I'm just thoroughly confused ...
How would you invoke such code? Squeezing it into an HTTP reply? Calling it from the PHP CLI?
If you're doing it in your webapp, don't! You're doing it wrong. Otherwise, why would you need PHP specifically? I mean, I guess you could, but what advantage does it pose compared to other languages available in pretty much every *nix system nowadays (Perl, Python)?
For the record, I have a nice collection of downvotes for defending PHP on point-and-laugh-at-PHP threads here at HN. I just don't see how the use case you're presenting is realistic.
Firstly, I think the point of the demo of loading that many integers into the array was to get a good measure of the memory usage per array item; by making the array big, small one-time memory overhead is washed out in the average.
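That measurement technique can be sketched in a few lines, amortizing the interpreter's fixed overhead across a large array (the array size of 100,000 matches the figure discussed above; the exact bytes-per-element result will vary by PHP version and platform):

```php
<?php
// Measure approximate memory cost per array element by taking
// memory_get_usage() before and after building a large array,
// then dividing the delta by the element count. A big array
// washes out one-time allocation overhead in the average.
$n = 100000;
$before = memory_get_usage();

$a = [];
for ($i = 0; $i < $n; $i++) {
    $a[] = $i;
}

$after = memory_get_usage();
printf("bytes per element: %.1f\n", ($after - $before) / $n);
```

Running it with `php measure.php` prints a single bytes-per-element figure; on older PHP 5.x builds this number was famously large, which is what threads like this one were poking at.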
As to why you would want to do that in real life -- if you have written a large system with accompanying libraries, objects, or other infrastructure, then even if you pass off some sort of calculation to background or crontab tasks, you still might want to use your PHP code. I do this for some of the Drupal sites I work on: I set up drush commands that run intensive calculations (like finding the most-related article) behind the scenes and cache the results.
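The crontab-plus-drush pattern described above looks roughly like this (the site path and the custom command name `related-recalc` are made-up placeholders; a real setup would use whatever custom drush command the site defines):

```shell
# Hypothetical crontab entry: every night at 3am, change into the
# Drupal site root and run a custom drush command that precomputes
# most-related-article scores and caches the results, so page loads
# never pay for the calculation.
0 3 * * * cd /var/www/mysite && drush related-recalc --quiet
```

The key point is that the heavy lifting reuses the site's existing PHP code and libraries, just outside the request/response cycle.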
The best solution would be to improve PHP's memory usage so that the benefits may be had on page loads as well. I suspect (but have little hard evidence) that the size of PHP code and data structures impedes performance as threads and processes get pulled in and out of the CPU.
A secondary solution would be to take some of the libraries out there for doing specific operations, like scientific computing and numerical libraries, and make them available to your code as a PHP extension. Obviously that doesn't solve as many problems as fixing PHP does.
That's the only place it's useful. Once the request comes in, I would split it off to a separate PHP CLI process. When the analysis is complete, I can transfer it to the user via web sockets.
So they would request a report. I would tell them great, it'll take a bit, keep working on something else and we'll let you know when it's ready. When it's ready, a notice shows up telling them to check it out.
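A minimal sketch of that hand-off, assuming a worker script named `generate_report.php` and a job-id convention (both are illustrative inventions, not anything from the thread; the web-socket notification side is omitted):

```php
<?php
// Hypothetical sketch: hand a long-running report off to a separate
// PHP CLI process so the web request can return immediately. The
// worker script name and job-id scheme are assumptions for the demo.
$jobId = uniqid('report_', true);

// Build the command safely and background it, detaching stdout/stderr
// so exec() returns right away instead of waiting for the worker.
$cmd = sprintf(
    'php generate_report.php %s > /dev/null 2>&1 &',
    escapeshellarg($jobId)
);
exec($cmd);

// Tell the user the report is queued; a later notification (e.g. over
// web sockets) would announce when it is ready.
echo "queued job $jobId\n";
```

The web process stays fast because it only forks off the work; the CLI process can take as long (and as much memory) as it needs.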
> I mean, I guess you could, but what advantage does it pose compared to other languages available in pretty much every *nix system nowadays (Perl, Python)?
For something like this, the advantage would only be that you're sticking to the same language as the rest of your code base. It would make things simpler. So if you're using PHP, stick with that. If Perl, Python, etc., stick with that. It doesn't really matter. They can all do the work.