
The Economist's US college rankings - mrsaint
http://www.economist.com/blogs/graphicdetail/2015/10/value-university
======
6stringmerc
The concept of studying in Higher Education for more opportunities, job
stability, and income during one's lifetime is not to be dismissed.

That said, if the predominant value of education is in the pursuit of money,
then I think using the Brookings Institution model (which The Economist
references) that includes two-year and vocational education entities is a much
more pragmatic and practical approach. That is, if money is the goal, only
focus on money as the outcome. Conflating money with prestige is, to me,
foolish. A University is a prestige degree, insofar as there's a pursuit of
knowledge, in theory, to produce a well-rounded, educated person...in theory.

I say this as a graduate of two generally highly ranked Universities, who
occasionally gets the feeling that I'd be making more money if I'd taken my
education budget, got trained in HVAC service/repair/sales, and started my own
company.

~~~
btilly
Even if the predominant value of education is the pursuit of money, the
Economist still has the more useful model. It allows you to differentiate
between schools whose graduates make money because they are full of people who
are going to try to make money, and schools whose graduates make money because
the school brings something to the table that helps them above other options.

Look at CalTech for example. It does well in the Brookings model because it
attracts a lot of people who want to be engineers. It does poorly in The
Economist's model because its engineers are not making as much in 10 years as
would be expected given a variety of factors that they control for. (Now that
said, CalTech may be hurt because they graduate a lot of people who take side
tracks through grad school and presumably would make more 15-20 years out. But
you have to work with the data you have, and all they had was 10 years.)

Now if I'm a student planning to go into engineering, the Brookings model
merely confirms that I'm making an excellent choice. The Economist's model
suggests that if I'm capable of going to CalTech, maybe that is not the best
school for me.

~~~
6stringmerc
After reading through your response, I believe you missed my point. What I was
referring to is that for a large population of potential students - those who
simply wish to make more than they would without higher education - going to a
two-year or vocational school is exceptionally smart.
Leaving that out of consideration is...not smart...highly biased...as in, how
many readers of The Economist would consider sending their kid to become a
diesel mechanic?

From the research I've absorbed over the years, students that can afford to go
to prestige universities often come from well-to-do, college educated families
- these are high indicators of future success. A much larger swath of the
population trying to advance their stake in life are first-generation
students, often not well-served by taking on debt and attending a large
University environment. Thus, many Universities are, practically
speaking, very bad at ROI for a large contingent of students who simply want
to make more money - vocational training? An excellent prospect.

What remains to be seen is how much more earning potential there is in
professions that require vocational training to be licensed or successful
(i.e. welding, machining, air traffic control, plumbing, electrical
contracting) as the generation currently working retires / dies from the
workforce, thereby spiking demand, and in fields where a "traditional"
University degree is, well, useless.

~~~
btilly
Read the article again. The Economist controlled for a variety of factors.
Family socioeconomic status was among them.

That said, one of the shortcomings of their metric is that they _only_ had
data on students who took out federal loans. How predictive this result is for
others is open to question.

~~~
6stringmerc
Well, then, it's even more glaringly bad form to leave out two-year and
vocational schools from their approach if socioeconomic status is a factor,
now isn't it? My perspective is yes, it's bad. It seems we'll just
keep talking around each other, but I'll stand by my thumbs up toward the
Brookings version and give the hairy eye to The Economist's method.

------
adenverd
This tool would be a lot more useful if it allowed for filtering by chosen
course of study. For example, University of Washington has median earnings
slightly below expected, but I can pretty much guarantee that their computer
science graduates are earning well above the median.

~~~
digikata
That was my thought too. To make a simplistic example: if a university had
both, say, an engineering school and an art school, it might presumably do
worse than a university with only an engineering school. So this metric might
favor smaller, focused schools which happen to concentrate on education areas
with high median salaries...

~~~
_delirium
> So this metric might favor smaller, focused schools which happen to
> concentrate on education areas with high median salaries...

I don't think that part's necessarily true. If a school focuses on an area
with high median salaries, the model will take that into account in the
predicted salaries, so the school will have to have even _higher_ actual
salaries than typical for the field (and its input demographics, SAT scores,
etc.) to get a positive value-add. See Caltech for an example of a STEM-
focused school that does badly by this measure: from its SAT scores,
demographics, and heavy concentration of STEM majors, the regression analysis
predicts that it should produce graduates with a median salary of $82k. But
the actual median is $74k, so its value-add is taken to be -$8k.

Some of the schools that do well are in areas with poor salaries, but score
highly because they do better than you'd expect (or than the model would
expect, anyway) for that area and student demographics. Otis College of Art
and Design has a predicted salary of $29k from the regression analysis, but
actual median is $42k, so implied value-add +$13k.

~~~
carpdiem
Interestingly, having gone to Caltech, I suspect that its extreme focus on
STEM research actually hurt it here. Mostly because that focus results in a
very large portion of undergrads going on to grad school (much larger % than
any other university), and grad students don't earn very much.

~~~
selimthegrim
Hi there, Mike! Care to expound on any solutions to make Tech more industry
oriented? Are there things you'd rather have been exposed to more in your
undergrad education given where you're at now?

------
alistairSH
Something feels "off" about a methodology that basically ranks all of the
"top" state schools as poor values.

Take the top-3 state schools in Virginia (by most other rankings, that's UVA,
W&M, and VT). All three are nationally recognized. And all three are
competitive enough that you need very good grades and a solid set of
extra-curricular activities to be accepted.

UVA ranks the lowest of the three, yet has the highest actual earnings. VT is
ranked significantly higher with the middle earnings value.

What does a student do with this? Cost of attendance at the three is similar.
Should they attend VT with its higher ranking, despite lower average
earnings?

Similar comments can be made about UNC-CH, UW-Seattle, UT-Austin.

Also lacking from this analysis seems to be the loan burden borne by students.
Georgetown and Villanova both rank very high in this list. But, both are
insanely expensive to attend. Even with high actual earnings, it could take a
decade or more for many students to pay off a potential six-figure loan.

~~~
tpudlik
I don't think the Economist's ranking can be used for choosing which school to
attend. It tells you how good a job schools do at boosting the earnings of the
people who attend there---conditioned on who attends. In other words, they
tell you how much the school does for a typical member of its student body.
But if you're not representative of that student body (and you certainly won't
be representative at many of the schools, especially the outliers!), this will
not tell you how much the school would do for you.

~~~
alistairSH
I guess I just find it extremely hard to believe almost all of the nation's
"top" state schools have a negative impact (actual earnings less than
expected) on the students that attend.

Maybe I need to dig into the source of the expected earning figures. I'm
definitely biased to some degree, being a graduate of one of the state schools
with a poor ranking - I'm sitting here wondering where else I could have gone
to get a better value.

------
hackuser
Income is a poor measure of a college education. The problem is that the
valuable benefits of college are difficult to quantify; income is important
and quantifiable, so I understand the temptation to use it as a metric, but
it doesn't represent the value of a college. It's like using the number of
lines of code as a metric to represent the value of a piece of code - it's
quantifiable, significant in some ways, but not representative of the code's
value.

I much prefer the Times Higher Education model, especially their reputation
survey. They survey 10,000 tenured and published academics worldwide, using
what looks like well-designed methodology [1]. These are people in a position
to have expert knowledge about the qualities of various universities. Yes,
it's imperfect and those people have biases too, but I can't think of a better
model:

[https://www.timeshighereducation.com/world-university-
rankin...](https://www.timeshighereducation.com/world-university-
rankings/2015/reputation-ranking)

One excellent alternative is Washington Monthly's rankings. Their approach:

 _We rate schools based on their contribution to the public good in three
broad categories: Social Mobility (recruiting and graduating low-income
students), Research (producing cutting-edge scholarship and PhDs), and Service
(encouraging students to give something back to their country)._ More here:

[http://www.washingtonmonthly.com/college_guide/rankings-2015...](http://www.washingtonmonthly.com/college_guide/rankings-2015/national-
universities-rank.php)

\----

[1] [https://www.timeshighereducation.com/world-reputation-
rankin...](https://www.timeshighereducation.com/world-reputation-
rankings-2015-methodology)

------
noelsusman
The methodology feels odd. The results are essentially just errors in their
prediction model. They're assuming that the model errors represent the
school's contribution to median income. They're also assuming that those
contributions are normally distributed around zero with constant variance.

I understand why they made those assumptions but they're almost certainly not
true, which isn't exactly ideal. Then again I can't quickly think of a better
way to do it without more granular data. Maybe use a mixed effects model with
multiple years of data from each school....
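The assumption being questioned here, that regression residuals recover the school's contribution, can be sketched in a toy simulation (all numbers invented, one made-up covariate standing in for the Economist's many):

```python
import random

random.seed(0)

# Toy model: median earnings depend on one observable (mean SAT) plus an
# unobserved per-school effect. The ranking treats the OLS residual as
# that effect; here we check how well residuals track the true effects.
schools = []
for _ in range(50):
    sat = random.uniform(1000, 1500)        # observable covariate
    effect = random.gauss(0, 3000)          # unobserved school contribution
    earnings = 20_000 + 40 * sat + effect   # invented data-generating process
    schools.append((sat, earnings, effect))

# Closed-form OLS fit for earnings = a * sat + b.
n = len(schools)
mx = sum(x for x, _, _ in schools) / n
my = sum(y for _, y, _ in schools) / n
a = sum((x - mx) * (y - my) for x, y, _ in schools) / \
    sum((x - mx) ** 2 for x, _, _ in schools)
b = my - a * mx

# "Value-add" = residual = actual - predicted.
residuals = [y - (a * x + b) for x, y, _ in schools]
effects = [e for _, _, e in schools]

# Correlation between estimated and true effects.
mr, me = sum(residuals) / n, sum(effects) / n
corr = sum((r - mr) * (e - me) for r, e in zip(residuals, effects)) / (
    sum((r - mr) ** 2 for r in residuals) ** 0.5
    * sum((e - me) ** 2 for e in effects) ** 0.5
)
print(round(corr, 3))  # close to 1 when the model is correctly specified
```

When the linear model is correctly specified, as here by construction, residuals do track the true effects; misspecification (the worry above) would leak model error into the "value-add" with no way to tell the two apart.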

------
fiatmoney
There seems to be a major issue with cost-of-living adjustments as well.
Several rather obscure California colleges are highly ranked, and I assume
this is because both wages and costs are higher in California, where many of
their graduates stay after college.

~~~
rconti
I'm not so sure. My generally-highly-regarded college in California (bay area,
no less!) is almost bang-on average ($-33) despite being in a very high cost
of living area.

~~~
rconti
To reply to myself, the comments include this mention from "DR" of The
Economist:

\----- Geography (both city and state) are variables included in our model.
Colleges in the Bay Area are not rewarded for the high salaries available
there--they have to surpass a higher "bar" of expected earnings. That is why
our eighth-ranked college is in West Virginia, and the ninth in Laredo, Texas.
\------

Now, I don't know how they compensate for this. Cost of living at the place
where the student ends up? Or where they study?

If you study in a high cost of living area but move to Podunk, Iowa, your
return will be far lower than expected, if they're only controlling for CoL at
the college's geographical location.

If they control for CoL at the student's new home, then okay.

------
hackuser
The Brookings study [1] mentioned by The Economist offers this interesting
analysis:

 _1\. Graduates of some colleges enjoy much more economic success than their
characteristics at time of admission would suggest. Colleges with high value-
added in terms of alumni earnings are often focused on training for high-
paying careers in technical subjects. ...

2\. Four college quality factors are strongly associated with higher earnings
for alumni:

Curriculum value: The amount earned by people in the workforce who hold
degrees in a field of study offered by the college, averaged across all the
degrees the college awards;

STEM orientation: The share of graduates prepared to work in STEM occupations;

Completion rates: The percentage of students finishing their award within at
least 1.5 times the normal time (three years for a two-year college, six years
for a four-year college);

Faculty salaries: The average monthly compensation of all teaching staff

3\. Value-added measures are fairly reliable over time. ..._

(Personally, I think the most valuable things gained in college or in any
education don't happen to have much monetary value.)

[1]
[http://www.brookings.edu/research/reports2/2015/10/29-earnin...](http://www.brookings.edu/research/reports2/2015/10/29-earnings-
data-college-scorecard-rothwell)

~~~
jldugger
> (Personally, I think the most valuable things gained in college or in any
> education don't happen to have much monetary value.)

Well, someone does, because they're charging for it.

~~~
hackuser
> Well, someone does, because they're charging for it.

Good point; I should have said, I don't think they are things that earn you
money.

------
mynameishere
Well, I'm glad my shitty school beat Yale (almost last) so handily. Really, I
don't think they could have come up with a worse metric. If you simply took
each school's equivalent of this:

[https://en.wikipedia.org/wiki/List_of_Yale_University_people](https://en.wikipedia.org/wiki/List_of_Yale_University_people)

...and just sorted it by # of entries, you'd have a far superior list.

------
leroy_masochist
So basically, the methodology is that they do a very complex analysis of the
student body to predict what their average earnings would be if they went to
"college in general" and then compare that with what the graduates actually
make.

There's one thing that jumps out at me when I see the actual rankings:

At least three of the eight schools that received a perfect 100 score -- W&L,
Babson, and Bentley -- have a very, very high number of students who go to
work for lucrative family businesses immediately upon graduation. In the case
of W&L (which provides a great liberal arts education mind you) it's mostly
southern good ol' boy/girl types. At Babson / Bentley (which provide great
undergrad business educations), a huge chunk of the student body are the
scions of the economic elite in developing countries, who are getting schooled
up so they can be ready to be put in charge of something at a young age.

My hunch is that the Economist's "expected earnings" methodology wasn't
granular enough to take these idiosyncratic attributes into account, and that
its r-squared of .85 might not be a rigorous number.

~~~
btilly
Note that the data set they had only includes students who got federal loans.
I strongly doubt that students who go to work for lucrative family businesses
immediately upon graduation will be in the data set.

But having classmates who look like that seems to be a really good economic
choice! Which is a factor that is not apparent in normal college rankings.

------
et2o
This is hilariously bad. Schools that produce academics as opposed to those
who go into high-paying professions are penalized, when in fact (at many of
the top institutions) producing academics is one of the main goals.

~~~
stagger87
This is not bad. It is simply one of many possible ways to rank colleges.

~~~
BookmarkSaver
But it is a serious flaw in the methodology. It says nothing about the quality
of education or the boost in earnings that a college will provide, _if you are
specifically going to a school to do so_. At the very least they need to
somehow account for individuals that go to the top schools but aren't looking
to go into lucrative careers.

CalTech is a common example being cited here. It is basically impossible to
realistically deny that going to CalTech is a great way to make a lot of
money. But it is near the bottom of these rankings, likely because so many
CalTech alumni go into academia.

The metric is supposedly showing which colleges can potentially increase your
earnings the most, but it just isn't doing that. It's a great idea, it just
hasn't been implemented fully.

~~~
selimthegrim
I think the money thing is paralleling something else entirely - given the
supposed caliber of student that comes into Caltech, is Caltech really making
them better scientists than if they had gone elsewhere (or at least better
scientists than they were turning out in the 1970s)? That to me is an open
question.

~~~
BookmarkSaver
Yes, but Caltech students have some of the highest rates of going into
academia. Academia pays well on a general scale, but compared to what
engineers and developers go on to earn, it drags the stats down.

They do admit that the metric only gauges financial success and is imperfect
when it comes to accounting for alternative priorities. But do you seriously
believe that CalTech provides one of the worst "financial bumps" even
accounting for the quality of their admissions? I find it incredibly hard to
believe.

~~~
selimthegrim
It's not at all clear that every Caltech student is going to be an engineer or
developer. Recall Jim Simons getting booted out from Princeton because he
sucked at programming?

------
obastani
This analysis doesn't make any sense to me. Even by their own intent (i.e.,
best added value based on salary), their methodology is nonsensical.

For example: Why should CalTech get hurt for being near L.A.? They're
basically arguing that you're better off going to a school in the middle of
nowhere because "hey, for being in such a crappy location, you did pretty
well!". In an _absolute_ sense you are better off going to CalTech, it's just
that they might not leverage their advantage as well as some other schools.

Not that they even show the last point -- it seems unlikely that the true
model is linear (I'm guessing they used linear regression). For example, if
the true model is closer to a sigmoid, then schools at the high end suddenly
get unfairly penalized and schools near the low end get unfairly boosted.

Finally, the statistical indicators are equally misleading. I can obtain an
R^2 of 1.0 just by including indicators I[is COLLEGE_NAME] for each college.
While that might not give you significance, the point is that getting good
prediction is meaningless.
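The R^2 point in miniature: a "saturated" model with one indicator per college predicts each college's earnings exactly, so R^2 = 1 and every residual (the would-be value-add) is zero. The college names here are made up:

```python
# Made-up data: one observation (median earnings) per college.
earnings = {"Alpha College": 55_000, "Beta College": 48_000, "Gamma College": 71_000}

# With an indicator I[is COLLEGE_NAME] per college, the least-squares
# fitted value for each college is simply its own observed value.
predictions = dict(earnings)

residuals = {c: earnings[c] - predictions[c] for c in earnings}
ss_res = sum(r * r for r in residuals.values())
mean_y = sum(earnings.values()) / len(earnings)
ss_tot = sum((y - mean_y) ** 2 for y in earnings.values())
r_squared = 1 - ss_res / ss_tot

print(r_squared)                # 1.0
print(set(residuals.values()))  # {0} -- no "value-add" left to measure
```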

I think what they really want is to restrict to predictors about the students.
So, given that you're a straight A student with a 2400 SAT score, what would
you expect to make coming out of each school? This at least tells me something
about the added value to _me_ of going to a certain school. (This approach is
still prone to bias, but in the opposite direction -- there's a chance that
the straight A student with a 2400 SAT score going to community college may
have been smart but unmotivated, which might correlate with lower salary.)

Edit: Here's another concern. They're claiming to have a model for "expected"
earnings:

earnings = A * (college covariates) + b + error

but they can't distinguish between _model error_ (i.e., error because their
model is misspecified) vs. the school variation that they are trying to
capture.

~~~
krstck
> They're basically arguing that you're better off going to a school in the
> middle of nowhere because "hey, for being in such a crappy location, you did
> pretty well!".

Well, no, not exactly. It's a subtle distinction, but what it's actually
ranking is how well that school _exceeds expectations_ , not _best outcomes_.
This is not necessarily a list that will give a student the best school to go
to, but rather (what it says on the tin) a scorecard for how well those
schools are doing, given their resources.

~~~
obastani
That's my point -- who decides what _expectations_ are? Their results are
incredibly dependent on the model specification. I imagine if they changed
which indicators they used, the results would vary widely.

Here's another way to see my concern. Suppose you had a classifier that
achieves 1.0 R^2; then since it _perfectly_ predicts each school's expected
value, it'll assign each school a score of 0. I'm pretty suspicious of an
approach where the results get worse with better predictive power.

Even if you want to do "exceeds expectations", I think you shouldn't include
variables that are school specific, only variables that are student specific.
In other words, for _my_ expected outcomes, which school is best?

~~~
haberman
> Suppose you had a classifier that achieves 1.0 R^2; then since it perfectly
> predicts each school's expected value, it'll assign each school a score of
> 0. I'm pretty suspicious of an approach where the results get worse with
> better predictive power.

If I'm understanding correctly, that result would indicate a world where the
college you attend has no effect on your earning power. ie. choose any college
you want, because you'll earn the same amount regardless of which one you
choose.

This would only apply to colleges that people in your demographic group
actually attended though. If the dataset doesn't contain any information about
people like you who went to Harvard, then maybe Harvard would indeed increase
your earning potential if there was a way for you to actually go there.

~~~
obastani
I'm not saying that each college you attend has no effect on earning power.
It's just that I can perfectly predict the effect of each college on your
earning power. Does that make sense? If I have an oracle that tells you

"if you go to Harvard, you will make $80,000, if you go to MIT, you will make
$86,000",

and the oracle is exactly correct, then under this model, The Economist
assigns every college a score of 0.

~~~
amscanne
I think you are missing the key ingredient in the analysis.

The Economist is attempting to build such an oracle via statistical
regression. HOWEVER, the Oracle is intentionally limited in _input_ to a
specific list of things: SAT scores, sex ratio, race breakdown, size, public
or private, earning power in the city where it is located, etc.

The things that are _omitted_ constitute the actual value the University
brings to the table: quality of teachers, instruction, organizations on
campus, etc. (1)

So however far off the model is for a given University, the difference must be
explained by all the missing inputs, i.e. largely how "good" the University is.

If the Oracle was able to perfectly predict your earning power given that
limited set of inputs, then it would basically mean that a University is
completely defined by SAT scores of students, sex ratio, race, etc. and
there's absolutely no value they add or subtract beyond that. That would be a
very, very interesting result. But you can see why it's unlikely.

Hopefully this makes sense?

(1) Of course it's possible that there are factors like "how many trees on
campus" or "how many vowels are in the name" which might also affect earnings.
But we can probably agree that it's less likely to be important than the
aforementioned ones ("quality of instruction", etc.).

------
cassieramen
Does anyone know of a good DIY college ranking system? One where you can pick
the factors that actually matter to you. I'm always happy to see options
other than U.S. News but I'd love to be able to tweak the algorithm myself.

I think these one size fits all rankings are all flawed by their inherent
nature.

~~~
brensudol
I made a simple DIY ranking system using the same College Scorecard data last
week. Here you go: [http://bsudol.com/1PUPohk](http://bsudol.com/1PUPohk)

You can select which variables are important and how much, and it generates a
top/bottom 50 list.

This was inspired by this article from NPR
([http://www.npr.org/sections/ed/2015/09/21/441417608/the-
new-...](http://www.npr.org/sections/ed/2015/09/21/441417608/the-new-college-
scorecard-npr-does-some-math))

~~~
Amorymeltzer
Submitted:
[https://news.ycombinator.com/item?id=10480851](https://news.ycombinator.com/item?id=10480851)

------
nilkn
If I were a prospective student interested in making a lot of money out of
college, surely I'd still just follow this very simple model:

Take all the schools I could get into, look up their actual median earnings,
and go to the one which is highest (let's ignore issues of cost of
attendance).

If it turns out that the one which is highest isn't ranked high on this
particular list, I don't see why that would suggest I shouldn't still pick
that school. Imagine these are my options, for instance:

(A) A school with expected earnings of $90k and actual earnings of $75k.

(B) A school with expected earnings of $50k and actual earnings of $55k.

(B) will rank vastly higher than (A) in this study, but I'd pick (A) over (B)
every time if I just wanted money.

In short, I don't actually see the value add here from this list. How am I
supposed to act on these rankings? How are these rankings supposed to change
any idea I might have about which school I should attend?

It seems if you want to know which school to attend based on earnings we
already have much more reliable data for that: actual earnings data.

~~~
TheCoelacanth
The problem with your method (which this analysis attempts, though perhaps
fails, to correct) is that the difference in actual earnings between the
schools may be due to the quality of the students admitted rather than any
increase in earnings that the school causes. For example, if your options are

(A) A school that admits 1000 people with IQ of 200 who go on to earn an
average of $100,000 and 100 people with IQ of 50 who go on to earn an average
of $1,000

(B) A school that admits 100 people with IQ of 200 who go on to earn an
average of $1,000,000 and 1000 people with IQ of 50 who go on to earn an
average of $10,000

(A) has median earnings of $100,000 while (B) has median earnings of $10,000,
so by your criteria (A) is much better. However, (B) has much better outcomes
for both groups of students.
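The two hypothetical schools above, checked directly: the median favors (A) even though both ability groups fare better at (B):

```python
import statistics

# School (A): 1000 students earning $100,000 and 100 earning $1,000.
school_a = [100_000] * 1000 + [1_000] * 100
# School (B): 100 students earning $1,000,000 and 1000 earning $10,000.
school_b = [1_000_000] * 100 + [10_000] * 1000

print(statistics.median(school_a))  # 100000.0
print(statistics.median(school_b))  # 10000.0
```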

------
cgearhart
> _" The government generated the numbers by matching individuals’ student-
> loan applications to their subsequent tax returns, making it possible to
> compare pupils’ qualifications and demographic characteristics when they
> entered college with their salaries ten years later."_

So the dataset excludes students who do not apply for loans? (i.e., this
analysis penalizes schools who admit the folks most likely to make lots of
money, and the schools that have the lowest expected student contribution.)

> _"...based on a simple, if debatable, premise: the economic value of a
> university is equal to the gap between how much money its graduates earn,
> and how much they might have made had they studied elsewhere."_

If we are only to look at financial incentives, a more reasonable analysis
would be to compare the expected future earnings _distribution_ as opposed to
just a central tendency statistic like the median.
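A quick illustration of why the distribution matters (all salary figures invented): two schools can share a median while offering radically different upside:

```python
import statistics

# Hypothetical graduate salaries: identical medians, different distributions.
school_x = [40_000, 50_000, 60_000]
school_y = [20_000, 50_000, 500_000]

print(statistics.median(school_x))  # 50000
print(statistics.median(school_y))  # 50000
# Same median, but school_y's upper tail is radically different.
print(max(school_y) - max(school_x))  # 440000
```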

~~~
dragonwriter
> So the dataset excludes students who do not apply for loans? (i.e., this
> analysis penalizes schools who admit the folks most likely to make lots of
> money, and the schools that have the lowest expected student contribution.)

It penalizes (or at a minimum, undersamples in a nonrepresentative way)
schools with their own, non-loan, aid programs.

This probably hurts schools like Caltech very badly, since it means that
(unless Caltech no longer has the essentially full-coverage need-based aid
they had when I went -- unsuccessfully, I graduated elsewhere) they are _only_
counting people who are making the unwise choice of taking loans when they
have no need given their current resources.

That this will skew the data badly should be obvious.

~~~
selimthegrim
Axlines were suspended some time ago. I think they are still proactive with
need-based aid, but with their tuition skyrocketing they can't possibly be as
gracious as they once were.

------
impendia
I am a proud alumnus of Rice University and I see that it is on page 64, fifth
from the bottom. Also, even more surprisingly, very near the bottom is
Caltech.

This is obviously wrong. If their methodology says that Caltech is in the
bottom 2% of US colleges, then one concludes that their methodology is
worthless, or at least that what it's predicting is not very closely related
to the quality of the education provided. I suspect that many Caltech
undergraduates decide to pursue grad school and academia, which is not an
especially lucrative career, and which is probably not highly correlated with
political leftism or "reefer madness".

I can also tell you that my friends at Rice who were looking to make a lot of
money after they graduated, by and large succeeded.

In summary: Bullshit.

~~~
forgetsusername
> _This is obviously wrong._

"The analysis goes against my gut feeling, therefore it's wrong."

Did you bother to read the article?

 _The Economist’s first-ever college rankings are based on a simple, if
debatable, premise: the economic value of a university is equal to the gap
between how much money its graduates earn, and how much they might have made
had they studied elsewhere._

That's what they did, and those were the results. They even say it's
debatable. Using a different analysis or metric, the results would be
different.

> _In summary: Bullshit._

As opposed to the anecdotes regarding your Alma Mater and rich friends.

~~~
nilkn
I think he has a point, though. This study doesn't seem to control for an
incredibly important variable: student interests.

If one school tends to attract students interested in making a lot of money,
and another school tends to attract students interested more in academics and
graduate school, can one really conclude that the former adds more value?
Perhaps, but certainly not without taking into account the different student
interests and inclinations towards money-making.

They do control for field of study, but I don't think this is sufficient. A
school that attracts a lot of STEM majors who go to graduate school instead of
industry is pretty much going to get destroyed in this study.

------
hacknat
It would be very cool if you could sort by percentage differential between the
over/under and expected earnings. I think sorting by that parameter would
produce some really interesting results.

~~~
ryandrake
Agreed. Ranking by raw dollar values means that schools in very expensive
areas (or that feed into very expensive areas) will have exaggerated scores
and end up either really highly ranked or really low.

Another interesting study would be to see whether a child's college attendance
correlates with a better or worse standard of living from their parents'.
Maybe look at Father's salary at age 30 and child's salary at age 30, and see
if the attended college is significant. The question being: if you're born
rich, you tend to stay rich, and if you're born poor, you tend to stay poor.
Does college choice matter?

------
kelukelugames
Going to plug Professor Lauren Rivera's book again. [1]

It stresses how school rankings are part of a system that reduces economic
mobility.

1\.
[http://www.amazon.com/gp/product/0691155623/ref=as_li_tl?ie=...](http://www.amazon.com/gp/product/0691155623/ref=as_li_tl?ie=UTF8&camp=1789&creative=9325&creativeASIN=0691155623&linkCode=as2&tag=wwwocfberkele-20&linkId=3TSKTACWO3KICFOR)

------
TACIXAT
My university asked for my earnings only one time. The categories maxed out at
70k+. They might be higher on that list if they took a more precise survey.

~~~
tpudlik
The data used here does not come from the universities, but from the
government. Specifically, from matching federal student loan applications
(from the Department of Education) to tax returns (from the IRS).

------
ronyeh
The data makes more sense if you sort by expected earnings or median earnings
and then compare within localized clusters of universities. For example, MIT
and Caltech have similar expected earnings, but MIT wins on median earnings.

------
jpatel3
Would love to get hold of a CSV or API to see state-wide university data.

------
seibelj
If a college has a huge communications program and a smaller engineering
program, it would be ranked lower because of pay disparities. This methodology
is simply inaccurate.

------
thefastlane
i tried to look up several four-year U.S. institutions just now, including one
major research university, and none were in the list.

from the article, apparently they excluded universities when the scorecard
dataset was missing one or more factors; it would be useful if the tool at
least showed the university anyway as a placeholder and indicated what data
was missing.

