As someone who goes to Stack Overflow regularly I see this situation all the time. It goes like this...
1. Person asks a question (e.g. "I need to build a service that will serve school districts in Los Angeles County. How do I go about doing that?")
2. Someone comes along and answers the question with something that seems right (e.g. "Here are a bunch of links on building REST services...")
3. That answer gets upvoted and the answerer gets his reputation boosted
4. Later someone comes along with the right answer (e.g. "LA Unified requires all services to be WSDL discoverable so you need to look at SOAP"). But it's too late because the question's already scrolled into the abyss.
So the guy who gave the wrong answer gets a better reputation and the guy who gave the right one gets nothing (and the person who asked the question likely took the upvoted answer and didn't check back, so they waste untold hours on the wrong thing).
And who doesn't read the top few answers, if not all of the answers, anyways? Even if you expect the top answer to be the right answer, subsequent answers often help you understand the problem space a lot better.
But: As an SO-consumer this particular problem doesn't faze me.
The better answers usually have more votes and sometimes the green checkmark is even on the best one - but when it's not, who cares?
To me as a SO-consumer a bigger problem recently has been the duplication. For any given question there may be dozens of threads which makes it harder to track down the good one(s). If SO wants to improve they should imho work on their deduplication efforts.
This might even be automatable to a point; let users vote on questions to be merged, and perform the merge when either the author agrees or a vote-threshold is met.
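The merge rule described above could be sketched in a few lines. This is purely hypothetical: the threshold value and function names are made up for illustration, not anything Stack Overflow actually implements.

```python
# Hypothetical sketch of the proposed merge rule: merge two duplicate
# questions when either the author agrees or enough users have voted.
MERGE_VOTE_THRESHOLD = 5  # assumed value, not a real SO constant


def should_merge(merge_votes: int, author_agreed: bool) -> bool:
    """Author consent short-circuits; otherwise require a vote quorum."""
    return author_agreed or merge_votes >= MERGE_VOTE_THRESHOLD


print(should_merge(2, True))   # True: author agreed
print(should_merge(6, False))  # True: threshold met
print(should_merge(2, False))  # False: neither condition holds
```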
They've taken some steps to alleviate that (badges for getting an older answer upvoted to +5), but it's still an issue.
Moreover, the person asking the question has the most interest but the least capacity to be patient. And likewise, the initial status-seeking answerers can get edgy for a quick tick.
edit: however, I do not minify on-the-fly; I keep the source HTML, CSS and JS files space- and line-break-compressed (and variable-name-compressed). The speed increase may be due to less going down the wire, or it may just be faster for the browser to process due to some other mechanism.
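The whitespace and line-break compression described above amounts to something like this rough sketch. A real minifier must also respect string literals, comments, and CSS/JS grammar, so treat this as an illustration of the idea only:

```python
import re


def minify(text: str) -> str:
    """Collapse runs of whitespace (including line breaks) into single
    spaces. A naive build-time pass; real minifiers parse the language
    so they don't mangle string literals or significant whitespace."""
    return re.sub(r"\s+", " ", text).strip()


print(minify("body {\n    margin: 0;\n    padding: 0;\n}"))
# -> body { margin: 0; padding: 0; }
```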
Also, having CSS and JS code within the HTML itself has sped things up too - no external files.
Have a look at the source of http://twitya.com/index.php - I also Gzip that, and this speeds it up as well. I keep a few line-breaks in for readability.
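To get a feel for how much Gzip buys you on repetitive markup, here's a small standalone demo (the HTML is a made-up stand-in for a real page; on a server you'd enable this via something like Apache's mod_deflate or PHP's ob_gzhandler rather than calling gzip yourself):

```python
import gzip

# Hypothetical page body: repetitive markup, like most HTML, compresses well.
html = ("<html><head><style>body{margin:0}</style></head>"
        "<body><p>hello world</p></body></html>") * 50

raw = html.encode("utf-8")
packed = gzip.compress(raw)

print(f"raw: {len(raw)} bytes, gzipped: {len(packed)} bytes")
```

The compressed size is a small fraction of the original, which is why Gzipping text-heavy pages gives such a noticeable win.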
Takeaway: what matters most is having as little data go down the pipe as possible in a compressed fashion, and not worrying about server load - that could be offset by the improvements on the browser.
Also, Gzipping http://hackerbra.in has led to a marked improvement, with its large slabs of text.