
PHP is not particularly known for being fast, and no, you wouldn't expect PHP to move petabytes of data. The usual approach to moving data with PHP is to read the whole file into a string in memory with file_get_contents() and echo() it to the user; doing something chunked, incremental, and seekable is probably just as difficult in PHP as in Perl. It also isn't legacy -- as they say, they switched from Perl to PHP. (Although it might be: they may have switched to PHP just for the built-in MySQL functions, or perhaps they wanted to switch to Nginx and couldn't get it at the time to run exactly how they wanted with Perl -- either way, it could be that they're now staying with PHP for legacy reasons.)
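For illustration, a chunked, Range-aware handler isn't that much code in PHP; here's a rough, untested sketch (the path, MIME type, and chunk size are made-up placeholders, and real code would need validation and error handling):

    <?php
    // Sketch only: stream a file in chunks and honor a simple byte-range
    // request so a video player can seek. Path and type are illustrative.
    $path  = '/var/media/clip.mp4';
    $size  = filesize($path);
    $start = 0;
    $end   = $size - 1;

    if (isset($_SERVER['HTTP_RANGE']) &&
        preg_match('/bytes=(\d+)-(\d*)/', $_SERVER['HTTP_RANGE'], $m)) {
        $start = (int)$m[1];
        if ($m[2] !== '') {
            $end = (int)$m[2];
        }
        header('HTTP/1.1 206 Partial Content');
        header("Content-Range: bytes $start-$end/$size");
    }

    header('Content-Type: video/mp4');
    header('Accept-Ranges: bytes');
    header('Content-Length: ' . ($end - $start + 1));

    // Read and flush fixed-size chunks instead of buffering the whole file.
    $fh = fopen($path, 'rb');
    fseek($fh, $start);
    $left = $end - $start + 1;
    while ($left > 0 && !feof($fh)) {
        $chunk = fread($fh, min(8192, $left));
        echo $chunk;
        $left -= strlen($chunk);
        flush();
    }
    fclose($fh);

The Perl version would be roughly the same read/print loop, which is sort of the point.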

I can think of some special cases where PHP would be better, especially in a porn site's case -- the most common clicks are front-page links, and there are probably a bunch of common keywords and clicks to links off the first page of those searches, which means that caching whole pages is probably economical. As far as I know, Perl and PHP are equally suited to talking to upstream caching proxies, but PHP might have felt more natural for day-to-day feature development.
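Whole-page caching is mostly a matter of the headers the application sends -- something like this (the numbers are arbitrary and render_front_page() is a hypothetical stand-in) lets Varnish or an nginx proxy_cache upstream serve repeat hits without touching PHP at all:

    <?php
    // Sketch: mark a rendered front page as cacheable by an upstream proxy.
    // s-maxage is what a shared cache honors; values here are illustrative.
    header('Cache-Control: public, s-maxage=300, max-age=60');
    header('Vary: Accept-Encoding');
    echo render_front_page(); // hypothetical function that builds the HTML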




These days you just issue an X-Sendfile header and PHP washes its hands of the request.
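For anyone who hasn't used it, it looks roughly like this (paths are made up; Apache's mod_xsendfile reads X-Sendfile, while nginx uses X-Accel-Redirect pointed at an internal location):

    <?php
    // Sketch: do auth/logging in PHP, then hand the actual file transfer
    // off to the web server.
    header('Content-Type: video/mp4');
    header('X-Sendfile: /var/media/clip.mp4');          // Apache + mod_xsendfile
    // header('X-Accel-Redirect: /protected/clip.mp4'); // nginx equivalent
    exit;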


That sounds pretty nice; I hope you wouldn't have to fuss too much with it to implement video seeking.


I think nginx can be configured to handle that (byte-range requests and seeking) itself.
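If I remember right, the relevant pieces are an internal location for X-Accel-Redirect plus the mp4 module, which handles byte ranges and ?start= pseudo-streaming. Roughly (paths and buffer sizes are illustrative):

    # nginx sketch -- only reachable via X-Accel-Redirect from the app
    location /protected/ {
        internal;
        alias /var/media/;
        mp4;
        mp4_buffer_size     1m;
        mp4_max_buffer_size 5m;
    }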



