

Faster PHP fo shizzle - teoruiz
http://terrychay.com/article/hiphop-for-faster-php.shtml

======
cmallen
Am I the only one wondering why Facebook hasn't implemented a compression
backend in memcached, much like Reiser4 and ZFS have done?

They've made it very clear that they're RAM-limited (in particular with
respect to capacity), so why not just have the processor compress/decompress
memcache values on the way in and out with an efficient, relatively
lightweight compression algorithm?
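The idea amounts to a thin wrapper around the cache client. A minimal sketch, assuming a plain dict as a stand-in for a real memcached connection (the class and names here are hypothetical, not anything Facebook has described):

```python
import zlib

# Hypothetical sketch: transparently compress values on set and decompress
# on get, trading a little CPU for memcached RAM. The in-memory dict stands
# in for a real memcached client.
class CompressingCache:
    def __init__(self, level=1):   # level 1 = fastest, lowest-ratio setting
        self.level = level
        self.store = {}            # stand-in for a memcached connection

    def set(self, key, value):
        self.store[key] = zlib.compress(value.encode("utf-8"), self.level)

    def get(self, key):
        raw = self.store.get(key)
        if raw is None:
            return None
        return zlib.decompress(raw).decode("utf-8")
```

Repetitive payloads like HTML tend to compress well even at the fastest zlib level, which is why the CPU-versus-RAM trade can be attractive for a RAM-bound cache.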

It's not even like you couldn't tune the algorithm to detect duplicate/similar
data and create atomic globs of data that represent multiple informational
objects.
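The duplicate-detection part of that idea can be sketched as content-hash deduplication: store each unique blob once under its digest and have keys point at the digest. This is a toy illustration of the concept, not anything from Facebook's stack:

```python
import hashlib

# Hypothetical sketch: deduplicate identical cache values. Each unique blob
# is stored once, keyed by its content hash; logical keys map to the hash.
class DedupCache:
    def __init__(self):
        self.key_to_digest = {}   # logical key -> content digest
        self.blobs = {}           # content digest -> value (stored once)

    def set(self, key, value):
        digest = hashlib.sha1(value).hexdigest()
        self.key_to_digest[key] = digest
        self.blobs.setdefault(digest, value)   # only first copy is kept

    def get(self, key):
        digest = self.key_to_digest.get(key)
        return None if digest is None else self.blobs[digest]
```

A real implementation would also need reference counting for eviction, but the sketch shows why many keys holding the same value need only one copy in RAM.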

It seems like their big cost is putting together machines with tons of RAM for
their memcache clusters, so why not bring that cost down?

~~~
stephenjudkins
[http://highscalability.com/blog/2009/10/26/facebooks-
memcach...](http://highscalability.com/blog/2009/10/26/facebooks-memcached-
multiget-hole-more-machines-more-capacit.html) would indicate that Facebook's
memcached instances are sometimes CPU-bound. They've also submitted several
patches to memcached to improve its performance, which would seem to back
that up.

I've wondered the same thing--compression would be enormously helpful to us
since we're RAM-bound (even with tons of RAM) and store a lot of easily
compressible HTML. Further, our memcached instances show almost no CPU load.

~~~
cmallen
Well, I guess for their scale, CPU-limiting factors would make compression not
worthwhile.

That said, for more common scaling issues, I think compression would be a huge
win, especially with Redis-style backends.

------
Zak
_If you are making an argument to recode your entire site from PHP to some
other language, the answer is you just lost that argument._

This only works if execution time was a major part of the argument, and the
site meets the conditions for benefiting from HipHop discussed in the article.

------
agazso
I was afraid that the usefulness of HipHop would be this limited, given that
it's no easy feat to create a PHP-to-C++ compiler that handles C library
dependencies (of which PHP has a lot!) well.

BTW, it was the second time in a week that a product created incredible buzz
in the HN community without anyone being able to try it out (the other was
the iPad, of course), and I was amazed at the amount of well-informed opinion
based on so little information.

PS. This blog is good reading if you're interested in Facebook architecture,
scaling, and design issues in general.

~~~
pbiggar
They aren't trying to reuse the Zend C libraries. However, phc does, and it is
tricky and requires sacrifices.

------
ojbyrne
Some day they'll post the code.

~~~
teoruiz
It seems the code will be out "soon":

 _We're in the process of opening the list and approving members to the group.
Code will follow soon after any recent changes have been merged to the branch
and we're sure that anything Facebook specific is removed._

More in the HipHop mailing list: [http://groups.google.com/group/hiphop-php-
dev/browse_thread/...](http://groups.google.com/group/hiphop-php-
dev/browse_thread/thread/8e2c858d79a3ada6)

------
seldo
Terry's earlier article about the future of PHP (linked in the first paragraph
of this one) is also very good reading:

<http://phpadvent.org/2009/1500-lines-of-code-by-terry-chay>

------
leej
Considering that PHP itself is written in C, why didn't they do php->c
instead of php->c++?

~~~
pbiggar
They aren't reusing any of the Zend PHP implementation. But that's exactly
what phc (<http://www.phpcompiler.org>) does.

~~~
leej
That's not what I'm asking. Is there that much of a performance difference
between C and C++, or what were the reasons for the choice? For starters,
C++ is sexier, that's for sure.

~~~
jared314
From the presentation, it sounded like it generates a C++ object for each
user-created PHP object (1-to-1). We'll all know when they release the
source.

