

PHP Dark Arts: Shared Memory Segments (IPC) - rotarycoptor
http://www.re-cycledair.com/php-dark-arts-shared-memory-segments-ipc

======
duskwuff
PHP: Bringing you yesterday's IPC today!

    
    
      *: SysV shared memory is incredibly clumsy for implementing anything more complex than a cache, and memcached still does a better job of that. There are really very few reasons you would ever want to use it in production code.

~~~
vital101
In fairness, PHP provides other methods of IPC as well.
<http://www.php.net/manual/en/book.sem.php>
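For instance, the SysV message-queue functions in that extension family can round-trip a PHP value between processes. A minimal sketch, assuming the sysvmsg extension is loaded (the key 0x1234 is an arbitrary example; real code would derive one with ftok()):

```php
<?php
// Round-trip a PHP value through a SysV message queue (sysvmsg extension).
$queue = msg_get_queue(0x1234, 0666);

// msg_send() serializes the value for us by default.
msg_send($queue, 1, ['job' => 'resize', 'id' => 42]);

// Receive it back (same process here, just to show the round trip).
msg_receive($queue, 1, $msgtype, 4096, $payload);
echo $payload['id'], "\n";   // 42

// Clean up; SysV queues outlive the script otherwise.
msg_remove_queue($queue);
```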

~~~
duskwuff
Those are all still cruddy SysV IPC methods you're pointing to. They're
marginally better than just throwing data into shared memory and hoping it
sticks, but you're still much better off sending messages across a UNIX domain
socket.
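A minimal sketch of what that looks like in PHP, using stream_socket_pair() so the example is self-contained (a real setup would use stream_socket_server() with a unix:///path address and two separate processes):

```php
<?php
// Two connected UNIX-domain endpoints; in real code these would sit in
// separate processes on either end of a socket file.
list($a, $b) = stream_socket_pair(
    STREAM_PF_UNIX, STREAM_SOCK_STREAM, STREAM_IPPROTO_IP
);

// Send a newline-delimited serialized message.
fwrite($b, serialize(['status' => 'done']) . "\n");

// Receive and decode it on the other end.
$message = unserialize(trim(fgets($a)));
echo $message['status'], "\n";   // done

fclose($a);
fclose($b);
```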

------
lukefabish
Kinda cool, but can anyone imagine why you'd do this with PHP?

~~~
dhotson
I had a go at using this a little while ago to implement (erlang style)
message passing concurrency in PHP:

<http://github.com/dhotson/phork>

The main use case is something like: Let's say you've got a mysql query that's
going to take a few seconds. In the meantime, you can do some other useful
work in PHP while the query runs.
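The shape of that pattern, sketched with a shell command standing in for the slow query (with mysqli you'd use MYSQLI_ASYNC and mysqli_poll() instead; the command and numbers here are made up):

```php
<?php
// Kick off the slow "query" without blocking on it.
$proc = proc_open('sleep 1; echo query-result', [1 => ['pipe', 'w']], $pipes);

// Do other useful work in PHP while it runs.
$work = array_sum(range(1, 1000));

// Block only when the result is actually needed.
$result = trim(stream_get_contents($pipes[1]));
fclose($pipes[1]);
proc_close($proc);

echo $work, " ", $result, "\n";   // 500500 query-result
```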

~~~
atomical
The amount of complexity it could introduce to an application doesn't seem
worth the minimal amount of gain.

~~~
dhotson
Probably true. But it depends. Let's say you've got 5 queries that take 1
second each.. There's a potential 5x speedup if you can run them in parallel.

I think some basic concurrency support in PHP (such as Futures) would have a
pretty huge benefit. The other alternative is to have async APIs for libraries
like mysql.
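A hypothetical sketch of what a Future could look like, again with a shell command standing in for a real query (the Future class and commands are made up for illustration):

```php
<?php
// A toy future: the work starts immediately in a child process, and
// get() blocks only when the caller needs the result.
class Future
{
    private $proc;
    private $pipe;

    public function __construct(string $cmd)
    {
        $this->proc = proc_open($cmd, [1 => ['pipe', 'w']], $pipes);
        $this->pipe = $pipes[1];
    }

    public function get(): string
    {
        $out = trim(stream_get_contents($this->pipe));
        fclose($this->pipe);
        proc_close($this->proc);
        return $out;
    }
}

// Both "queries" run in parallel, so this takes ~1 second, not 2.
$a = new Future('sleep 1; echo one');
$b = new Future('sleep 1; echo two');
$first  = $a->get();
$second = $b->get();
echo $first, " ", $second, "\n";   // one two
```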

You could argue that anything that takes so long it needs parallelizing is
probably a candidate for a background queue + frontend polling (or similar).

~~~
atomical
It still doesn't make sense to me. I can either use my time to fix the queries
or use my time to play around with this disaster waiting to happen.

~~~
dhotson
It's not just db queries though; it's also things like 3rd party API calls
(S3 for example).

Basically anything where you're blocked waiting for a response you could be
busy doing something else.

Synchronous APIs are kind of wasteful in a lot of cases. Node.js does a good
job of this by using async APIs.

It's a tradeoff. But the extra complexity can be worth it if it means your
pages load 2x faster.

