To those who think it isn't a big deal: when GET requests are made public, you can snoop "password reset links" and the like to get access to somebody else's account. Even when developers follow best practices, GET request paths can leak sensitive information.
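For a concrete (and entirely hypothetical) illustration, the request column of a public status page might show an entry like:

GET /account/reset?token=2b7e1516abcd HTTP/1.1

The path and token here are made up, but that is the shape of the leak: anyone watching the page can visit the URL before the legitimate user does.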
Attacking a system is not just guessing passwords. You need to gather as much information as you can about your target and devise an attack using what you know.
Maybe it's a DNS entry or network addresses that would have stayed secret if not for appearing in server-status (there's a reason allowing AXFR from random internet clients is a bad idea). Maybe it's a client IP that accesses an admin panel, and attacking that client machine will give you the keys to the kingdom. Maybe it's a guessable PID that gives insight into what a new temporary file or directory will be called, exposing a race condition. Maybe it's very precise timing information that makes timing attacks easier. Maybe it's the number of workers configured, which lets an attacker tune an attack to exhaust resources. Maybe it's the system utilization, which helps them figure out whether to attack CPU, memory or I/O. Perhaps it's the server version and OS, telling them what software is running and thus which exploits to try first.
Or maybe it's the fact that the requests can be linked to client IPs to build a profile on specific users, violating their privacy.
Whatever the reason, it's stupid to keep this information public.
I believe the consensus was that it was not a big deal for www.php.net to have its server-status page up. The original link for that news item was http://php.net/server-status. However, not all sites are like php.net, which (it would appear) is used mainly for documentation. That said, a status page leaking IPs or session IDs (PHPSESSID in the URL, say) for a private forum, a government agency, etc., could be a different story.
You've got to watch out for this. Apache typically limits it to localhost, but if you use squid as a reverse proxy (quite common) it can easily end up exposed, since all requests will appear to come from the local machine.
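If you are fronting Apache with squid, one option (a sketch against a standard squid.conf; adjust the path if you've moved the handler) is to refuse to forward the status URL at the proxy itself:

# Never pass requests for the Apache status page to the backend,
# regardless of the backend's own access rules.
# (Place this before the general http_access allow rules.)
acl apache_status urlpath_regex ^/server-status
http_access deny apache_status

That way the backend's localhost-only restriction can't be bypassed via the proxy's own address.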
Even a cursory scan of the http://urlfind.org/?server-status list reveals scads of porn sites exposing their visitors' IP addresses:
(Note: these links went to Apache server-status pages at the time of linking; that may change if the server admins wise up. To be on the safe side, consider them NSFW):
http://black-tgirls.com/server-status
http://badexgfs.com/server-status
http://tubepornx.com/server-status
http://lesbianvalley.net/server-status
..... and many more .....
Personally, I don't care what consenting adults do with their genitals. But I think it's safe to assume that the visitors to these sites expect a certain level of privacy that's not being met.
This can also lead to DoS issues: as I understand it, Apache server-status pages are fairly expensive to produce, since generating one means polling every child worker.
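If you want to keep the handler but make it cheaper (and strip most of the sensitive detail at the same time), mod_status's ExtendedStatus directive is worth knowing about; a minimal sketch:

# With this off, Apache stops recording per-request details (request
# line, client address, timings) for the status page, which makes the
# page cheaper to produce and far less revealing.
ExtendedStatus Off

(If I remember right, Apache 2.4 switches ExtendedStatus on automatically whenever mod_status is loaded, so there you have to turn it off explicitly.)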
Something like
<Location /server-status>
SetHandler server-status
Order Deny,Allow
Deny from all
Allow from 10.0.0.0/24
</Location>
(where 10.0.0.0/24 is your local network range) will prevent external requests. This is mentioned in the linked Apache documentation.
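Note that Order/Deny/Allow is the Apache 2.2 syntax. On 2.4 the equivalent (again, substitute your own network for 10.0.0.0/24) uses mod_authz_core:

<Location /server-status>
SetHandler server-status
# Only the internal network may view the page; everyone else gets a 403.
Require ip 10.0.0.0/24
</Location>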
According to other commenters, this is only enabled for localhost by default, but if one is using a reverse proxy on localhost, all requests will appear to come from there. So be careful with this approach.
Sure, for a skeezy site. But the parent was talking specifically about php.net; I can't imagine any real risk for a site like that exposing its visitor IP log.
It's not just that. You can get an idea of the traffic to the site if you watch for a while. If that information is commercially sensitive, this could be a genuine issue.
For example, nba.com has been averaging about 3 connections and 1021 idle workers while I've been watching it. That's perhaps less traffic than you might expect? I don't know, but if I were paying for ad space I might be interested.
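If you want to track that over time, mod_status also has a machine-readable view at /server-status?auto that is easy to poll and chart. Its output looks roughly like this (the values here are purely illustrative):

Total Accesses: 1298471
Total kBytes: 8731250
Uptime: 86400
ReqPerSec: 15.0
BusyWorkers: 3
IdleWorkers: 1021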
AOL's de-identified search data still allowed some people to be identified. When a name gets tied to medical-condition-related search terms, it gets embarrassing...
Around 13 years ago, I believe, the default was changed so this is no longer enabled. That said, a lot of sites carried on leaking that way; ft.com was one, even after it was pointed out to them. Eventually they changed things when I mentioned it to an IBM rep who also handled FT's account. Nice rep.
I can see how it can end up enabled and left open, but it's that level of administration that opens you up to other, more concerning issues, and this one is concerning for many reasons. A firewall that blocked remote access by default to everything not explicitly allowed (with sensible wildcarding on subdirectories where needed) beyond the main public site would have caught it. Proper access controls would have caught it too.
The only way some companies will learn is by getting hacked or being prosecuted under the laws that cover leaking private data. So if you come across a site like that, tell their admin they are in breach of whatever data protection/privacy laws apply. If they don't fix it, cash in on their stupidity and sue them: you get paid for your time, they pay for their mistake, and they learn the only way some ever do. Don't hack them, there's no need, just use the law. Or get a patent on bad administration and use it to claim royalties. A crazy approach, but if you have the money to indulge such whims, let us all know how it pans out; it would be profitable, and educational for the patent system. Who would contest it and claim prior art on the stupidity of administering computers? You'd get your money's worth in laughs if nothing else.
Short version: this is an old issue, and leaking data this way may well put you in breach of data protection/privacy laws, so be warned. If you see it, warn them, and feel free to educate them via the legal cash machine.
Another issue we identified is that you can find "hidden" admin panels or URLs that shouldn't be known to the outside, just by refreshing the page a few times and checking all the requests listed.
Relying on obscurity isn't a best practice, but some companies do it, and this makes those URLs easy to find.
Someone from TweetDeck must frequent this site, because it's fixed there now. I'm just curious how busy these sites are. From my quick look, Ford and Staples were the busiest.
Why do you think Disney has access to high-quality admins? Do you know any high-quality sysadmins who want to work for Disney, or companies like it?
Just because a company is a big name doesn't mean it attracts big talent. Disney is still fishing in the same ocean as everyone else, and all the best engineers went to sexier places.