

PHP 5.3.9 released - yuri41
http://www.php.net/ChangeLog-5.php#5.3.9

======
pilif
So. There is a known denial-of-service issue caused by the specific hashing
algorithm that everybody seems to use.

Everybody else fixes it by randomizing the seed used by said algorithm at
initialization time.

PHP "fixes" it by not touching the hash algorithm and adding a max_input_vars
configuration setting, thereby reducing functionality while not really
fixing the underlying security issue. This also means that if max_input_vars
is set reasonably high (or has to be set as such), an attacker can still mount
the exact same DoS attack - albeit using more concurrent connections.

I can't even believe that the desire to keep backwards compatibility at all
costs is a valid reason for a decision like this: older versions supported an
arbitrary number of input fields, the current version does not, so this is a
clear BC break.

Especially considering that PHP is getting better with its release process
(i.e. not having hundreds of failing tests, so that the 101st failure would go
unnoticed), I really think this half-assed solution is way too cautious -
especially as other projects had the exact same issue and solved it more
correctly, without causing (apparent) regressions so far.

PHP could just - again - copy their updated algorithm.

(related discussion on their mailing list:
<http://news.php.net/php.internals/57291>)

~~~
sootzoo
Not endemic to PHP, of course, as it seems MSFT took the same approach with a
recent ASP.NET security update. The workaround described in their KB
(<http://support.microsoft.com/kb/2661403>) is to add a similar configuration
override and they've set the limit arbitrarily at 1000 form keys. Which
vendors did not dodge the hash algorithm problem and what's the updated
algorithm you're referencing? Who's doing this right?

~~~
pilif
Perl fixed this back in 2003, when it was first discussed
(<http://www.perlmonks.org/?node_id=945526> also contains a link to a PDF
detailing the issue).

Ruby 1.9 was already randomizing its hashes, and the current rediscovery of
the problem prompted them to add randomization to 1.8 as well. Same goes for
Python.

The best link to the algorithm I found on a quick search is
<http://www.hardened-php.net/hphp/zend_hash_del_key_or_index_vulnerability.html#djbx33a_-_daniel_j._bernstein_times_33_with_addition>
\- where the problem was discussed again in 2006.

By randomizing I mean randomizing the initial value of h for however long you
need your hashes to be consistent (until the script ends, in PHP's case).

While you are still vulnerable to attackers who know that random initial
value, finding it by just randomly trying to create collisions is impractical
and easily detectable.

------
Loic
I highly recommend phpfarm for testing your code against a given version of
PHP: <http://cweiske.de/tagebuch/Introducing%20phpfarm.htm>

This allows you to have several versions of PHP side by side, each with its
own php.ini, its own list of extensions, and its own PEAR install.

~~~
asm89
If you have open-source projects on GitHub, you should really check out Travis
(<http://travis-ci.org/>). Travis can also run tests in different PHP
environments and can be hooked into GitHub to run your tests whenever you
commit! :)

------
ck2
Argh I just finally compiled 5.3.8 last week - oh well.

 _90 bug fixes, some of which are security related_

Hopefully APC has no issues with it, gave up on eaccelerator.

Interesting they took the time to do the 5.4 backports in there.

------
darda
People still use this shit?

~~~
kingofspain
Indeed! I have been quite pleasantly surprised lately by the more
constructive discussions around PHP-related items. The LOLPHP posts have been
at a minimum (I don't mind genuine criticism).

