
Overconfident Students, Dubious Employers - sylvainkalache
https://www.insidehighered.com/news/2018/02/23/study-students-believe-they-are-prepared-workplace-employers-disagree
======
lr4444lr
_But Brandon Busteed, executive director of Gallup's higher education
division, which also conducts research related to graduates and careers, said
these sorts of definitions can vary.

For instance, Gallup has found that generally an employer believes that
"critical thinking" is coming up with new, original thought. But in an
academic sense, it can mean more picking apart ideas in depth, he said._

Given that their definition of "critical thinking" (which is incorrect)
differs from the academic one, employers are probably responding on a
completely different wavelength. That throws this whole survey into question.

~~~
flukus
"new, original thought" to the employers probably just means that they can
deal with a problem that doesn't fit into the general workflow or one that
they haven't been specifically trained to do.

As an example, the payment system at my pub went down a few weeks ago and the
staff were clueless about what to do. The accepted short-term solution (at my
insistence) was to write down who bought what (we all have accounts) so the
purchases could be entered into the system later. This is a simple, trivial
example, but in jobs like that it makes you manager material.

~~~
Kluny
That's creative thinking, but not critical thinking.

~~~
flukus
The solution might be creative (in this case it might be more memory of how
things used to work than creativity), but first you have to isolate the issue,
theorize on how long it will be out and what the consequences of that are and
determine if you should do something or just wait. Then you have to weigh
several solutions which could include nothing or multiple solutions for
different cases. There are many critical thinking steps involved and some very
incomplete data to base them on.

------
ryandrake
> The AAC&U looked at some of the same measures as the association.
> Specifically around oral communication, students ranked themselves highly --
> about 62 percent of students believed they did well in this area compared to
> 28 percent of employers. That and written communication showed the biggest
> gaps in the AAC&U report (27 percent of employers versus 65 percent of
> students).

Kind of difficult to disagree with this one. I know it's passé these days to
worry about trivial things like correct grammar, spelling, and usage. I've had
fruitless discussions on this very site where people will argue that it
doesn't matter how you use the language if, ultimately, the message somehow
manages to get conveyed. Terrible language skills are not just found among new
grads, though; I've seen them across the workforce, including appalling
spelling in communications from the SVP level. There's a non-zero cost, too:
the cost of having to get people to repeat themselves, the cost of ambiguity
when your manager poorly communicates their wishes, the cost of misunderstood
information bubbling up the chain to decision makers. It adds up.

~~~
sotojuan
> I've had fruitless discussions on this very site where people will argue
> that it doesn't matter how you use the language if, ultimately, the message
> somehow manages to get conveyed.

Most of the time terrible language does manage to hinder the message! But the
writer doesn't realize this, because for them it's obvious. I had a coworker
(who is almost 30 and a college graduate) with this very attitude. A lot of
his code comments and documentation had grammar mistakes and quirks (Random
periods. In the middle of phrases.) that made it very hard to read and thus
understand.

~~~
robbick
Agreed - I think the "it's how the message gets conveyed" argument only
applies to really trivial things, like 'expresso', or whom vs. who, where the
incorrect version is so obviously a slip that it doesn't affect readability.

~~~
ryandrake
Here's the thing about the "expressos," "irregardless," "for all intensive
purposes," and "mute points" out there: Even if the meaning is clear and the
communication is successful, these convey a carelessness and sloppiness that,
as a great engineer, _you don't deserve!_ As someone who knows what you're
doing and takes pride in your craft, things like bad spelling and usage
needlessly undermine your credibility.

~~~
solidsnack9000
They do undermine one’s credibility.

At the same time, a cursory study of German, French, Hebrew, even Japanese
would suggest that English spelling is needlessly complicated and ambiguous.
Part of addressing this problem for future generations is a push for spelling
reform.

~~~
wavegeek
I've learned a lot of languages and they all have their problems: e.g. the
horribly complicated Kanji characters in Japanese and Chinese, or the almost
completely pointless genders in German and the resulting requirement for other
words to change, in regular and irregular ways, in agreement. German verb
tenses are insanely over-complicated too. English is reasonably easy to learn
compared to many other languages.

It is harder than it looks to fix spelling because in the written language
different words that sound the same are spelled differently. Making the
written language phonetic would create ambiguity.

~~~
solidsnack9000
Some things about English are easier than in other languages, but its spelling
is uniquely complicated. There are roughly 55 phonemes in English and 1,500
distinct spellings for them. In most languages the ratio is closer to 1:2, not
1:30.

Phonetic writing could increase ambiguity but in many cases will not because:

(a) Many of the weird redundant spellings only show up in multi-syllabic words
that are quite distinctive, as for example _ti_ for _ʃ_ in _caution_.

(b) People can often tell apart the shorter homophones, which are frequently
mis-spelled by being substituted for each other: _road_ / _rode_ , or _their_
/ _they're_ / _there_. The fact that they are frequently substituted for one
another is a clue to the degree to which context disambiguates them.

------
nostrademons
I'm always a little skeptical of these sorts of surveys because it's hard to
tease out what people believe about themselves because it's _true_ vs. what
people believe about themselves because it's _useful_.

I remember that when I was a new grad, there was a very large part of myself
that held a realistic appraisal of my abilities and was therefore scared
shitless about my ability to make it in the working world. I was very careful
to never let that part of me out in interviews - or, for that matter, to
_anyone_. Confidence only works if you keep up the illusion so thoroughly that
it ceases to be an illusion.

And it worked. I got a job at a financial software startup, and then was put
in charge of projects that no new grad should ever have been put in charge of.
I grew into the role. I left to go found a startup, which is also something
that someone with 2 years of work experience had no business doing. That
worked too - I may not have been qualified to found a startup, but when I
folded it up, I was a lot _more_ qualified as an engineer than most of my
other peers with 4 years of work experience. So Google hired me to work on the
front page of the search engine, and I grew into that role too.

The majority of my classmates let their accurate perceptions of what they were
actually qualified to do govern what they applied to do, and as a result, many
were still struggling to get into a career 10 years later. By that point, your
self-perception has become reality, and it's much harder to convince potential
employers to take the risk that you'll grow into the position. Then they wake
up and realize that _everybody's_ faking it and their new manager isn't
actually all that much more skilled than them, but (barring a career reset
like going to grad school) it's difficult to reset people's perceptions.

~~~
ericmcer
"What is your greatest weakness?". Everyone is expected to lie or warp reality
a bit on this question. Interviewing is just a game and people learn pretty
quickly to give the answers interviewers want to hear. The alternative to
overconfidence is not great.

~~~
bpchaps
I answered this in one of my first interviews with, "My greatest weakness is
my inability to focus on things for an extended period of time." The
interviewer told me never to do that, but appreciated my candor and extended
an offer, though I didn't take the job.

Sometimes it actually works to be honest instead of treating it as a game. :)

~~~
dvtv75
This leaves me conflicted.

I've noted here (quite) a number of times that I have a disability that
impacts my ability to absorb chunks of written material. I've had an
occupational therapist tell me that I shouldn't work in CS or IT (I have a CS
degree in spite of that), and several doctors have told me I should stay away
from the field, while others have told me to just go with whatever works and
not to tell employers I have this disability.

I prefer to be honest, but at the same time I also like to eat most days.

I have an interview for a job in a few days, and no clue what I should do. If
they don't ask, do I bring it up? What do I tell them? Most employers in the
city are highly discriminatory against people with a disability, even when it
means they're not hiring the most qualified candidate.

(I'm not actually seeking advice here, just musing online.)

~~~
bpchaps
If the companies that you're interviewing for aren't accepting of the way you
are, then it might be useful to question whether those companies are for you.
Yes, it's difficult to think this way when the specifics of your circumstances
are difficult, but it still might be a lifestyle worth pursuing.
Lentils/rice/chickpeas are a super cheap way of living on a budget ;).

It's exceptionally difficult having disabilities, especially when trying to
relate it in a professional environment. Though, I've found my own
disabilities have been very useful in figuring out what it is that I really
want out of a career and life.

Cheers and good luck.

------
ergothus
Others have mentioned the vagueness and low value of the questions, but I'm
particularly struck by the professionalism/work ethic one.

Professionalism is tough, because we define it (roughly) as seeking the common
expectations for the role, but one of those expectations is "professionalism".

And "work ethic"... I'm a Gen Xer and I feel like millennials are getting
dumped on more than the normal generational strife would account for, to a
very unfair degree. (Example: I see a lot more snowflakes and need for
"participation trophies" among those complaining about such things than among
the millennials themselves.)

Which leads me to wonder, particularly with work ethic: is it the students or
the employers who have the unrealistic expectations?

~~~
mjevans
I have a feeling that both have unrealistic expectations.

Graduates: I won't say all, but I do think many modern graduates allow
themselves to be too easily distracted on a mobile computer they own.

Employers: high expectations of engagement and 'loyalty', but low pay, low
benefits, and a lack of support in the work environment. Employees are somehow
expected to remain productive in a highly distracting environment with no
expectation of actual tranquility for focus. Is 'masochist' the kind of
professional they're describing?

Ultimately this is one of those issues where a clear set of expectations and
some way of measuring them should be set; then an independent party should
review whether people were merely gaming the metric, or doing the work
normally with the metric met as a result.

~~~
dasmoth
_Graduates: I won't say all, but I do think many modern graduates allow
themselves to be too easily distracted on a mobile computer they own._

While I think there’s some truth in this (and, ahem, not-so-modern grads...),
it seems like measuring the wrong thing. If someone posts 50 times a day to
whatever the current friendfaceinstabook might be _and_ delivers useful code
(or whatever) on a regular basis, isn’t that fine? Focus on the latter, don’t
stress about the former.

------
araes
Implied in finishing university is the expectation that you have been taught
what you need to make it in the workplace. It's part of the social contract.
It's what I paid you for, right? Naturally, they answer with high confidence
in their readiness.

People who have worked a long time are in part answering for themselves. They
look back at how much they've learned, changed, and developed. How much they
had to learn on the job that was never taught in school. Were they actually
unprepared, or are they just judging new grads based on a viewpoint that's
colored by several years?

When I was a hiring manager, this could be one of the most frustrating things
in a room: trying to push to take a chance on a new grad, and having several
others tell me they'd prefer the industry veteran who's lateraling. Even for
"entry level" positions. It's amazing how many people with really good CVs
were applying for "entry level" jobs.

On a last thought, hiring folks also seem to have a strong lack of faith that
people can learn on the job. Maybe it's just the above-mentioned flood of
high-CV talent. Software, for example: you know C, C++, Java, JavaScript,
Haskell, and Perl. Ah, but you don't know Ruby, and we're really looking for
Ruby. But, I know 6 other languages, I can learn one more...

~~~
rootlocus
> But, I know 6 other languages, I can learn one more...

I doubt you professionally know 6 languages. There's a lot more to learn about
the runtime, frameworks and tools of a language than there is syntax.

~~~
ekidd
OK, I might have a couple more languages than the average mid-career
developer, but I don't think I'm that much of an outlier. So here goes.

I have, in the past 10 years, professionally and successfully worked with C++,
Scheme, Ruby, a couple generations of JavaScript and Rust. And I've written a
whole pile of shell scripts. These were all "first tier" languages in which I
was up-to-date at the time. I've also, as a consultant, delivered projects in
C# and Java. In these cases, I didn't know all the tricks, but my code met
specifications, it had tests, and it was reasonably clear.

Oh, and I also wrote a bunch of Haskell on my own time, and it was stronger
than a couple of the languages that I got paid to use.

But no, I'm not professionally up-to-date in all these languages. My C++ is
out of date, JavaScript frameworks change every year, etc.

But if you dropped me in a new job with a new language, I could be delivering
business value and clean code inside of two weeks, and I'd be reasonably
current on whatever framework I was using within a few months. Yes, it would
take slightly longer for Elixir or Erlang, and longer still for something like
Coq, because there would be new paradigms involved.

But there really does come a time when genuinely unfamiliar ideas in
mainstream tools become rare and special. I'm always happy to learn something
new like React's virtual DOM or Rust's borrow checker.

And I know plenty of people who are far, far better than I am.

And this is why I'm generally happy to assume that competent developers can
learn on the job if they show any interest in doing so. You really don't need
more than one person on a team who knows the deeper trivia of a framework.

~~~
bbarn
> But if you dropped me in a new job with a new language, I could be
> delivering business value and clean code inside of two weeks

This is one of the things I consider to be a trait of a senior developer, when
making hiring decisions. I understand why places post jobs that want X years
experience with Y stack, because they don't want to feel like they have to
teach "the basics" to someone at a high level, but I've found "the basics" is
just what you've described in that sentence. Good professional software
developers are language agnostic - even if they've spent the last 10 years on
a single language in practice.

------
walshemj
Or, the short version: employers want to train new graduates even less than
they did a generation ago.

The increase in degree-required jobs means that organisations that used to
take school leavers at 16 or 18 and train them now think that university
should do all the vocational training - and they are not getting the top
quartile of grads either.

------
thedudemabry
While I agree that a problem is exposed by these findings, let's not judge
college grads by their ability to guess how well they fit employers'
privately held expectations of an entry-level employee. Companies suck at
training, balk at its costs, yet lament the effect this has on the supply of
young employees.

------
austincheney
My current job is the first time I have worked with young college graduates.
It is interesting to see that some of the developers are humble and completely
ready for the corporate world while others are the opposite.

It really comes down to personality and whether a person's personality has
allowed them to practice more and work harder to learn a skill. These people
tend to be much better prepared without the delusions of their own
awesomeness.

A college new hire is never the rockstar their social reference group has told
them they are, and the more humble personalities didn't need magic to figure
this out. Yes, I have actually encountered college new hires who thought of
themselves as rockstars when in reality they seriously sucked. They
cognitively knew they sucked, but somehow the arrogant, boastful rockstar
self-persona was still there, in very much a stereotypical millennial fashion.

~~~
bbarn
I don't think the "rockstar" act is a millennial thing. There have always been
people like that in any field, or social group. Some people will always
compensate for their flaws in ability with their strength of personality, and
with our industry that just means painting the "rockstar coder" persona,
because there are some people that seek that out.

That's why good hiring practices have to be used, even when you feel like it's
a waste. Give everyone the same critical evaluation and don't allow yourself
to skip questions in interviews because you like the person, etc. (I'm not
accusing you, just saying as a rule)

~~~
ryandrake
I like using the word "rockstar" to describe coders, because the word has
similar negative connotations as in the music context. An actual rock star
often doesn't (but sometimes does) have: classical music training, knowledge
of the music literature, vision beyond their area of expertise, humility in
the face of their equals or betters, a desire to learn and ability to change
with the times, the temperament to fit well with the business side, etc.

If that's what you want to hire, go hire your rockstar coders.

------
smallnamespace
The pressure to be overconfident sometimes comes directly from employers.

One time, as I walked into an interview for an entry-level position, the
interviewer ignored me completely and just stared at my resume for two
minutes.

The first words out of his mouth were, "At Lehman Brothers, we only hire
winners. Are you a winner!?"

In retrospect, I'm glad that interaction happened and revealed what the
company's ethos was; otherwise, I might have been tempted to work there.

------
jadedhacker
"The easy solution: set students up in a more professional environment,
Busteed said -- this could be internships or co-op programs. If students can't
go to an actual office, then the environment should be brought to them so they
have a better sense of how a workplace runs."

Because education is only about job training, so that workers are pliant and
ready to be directed by the system of hierarchy. Thus, turn higher education
into the same factory training that high school is. This, in an era when
profits are already soaring.

------
borntyping
This means almost nothing without a baseline to compare it to. I'd suggest
it's very likely that you'd get similar results if you surveyed any other age
group - that individuals give higher estimates of their ability than someone
else would is hardly surprising.

~~~
austincheney
I disagree. There are personality traits unique to the millennial stereotype
that are not present in older age groups. Young people will always believe
they are ready for the world if all their elders tell them so, no matter how
baseless and inaccurate that messaging is.

The problem is a level of personal fragility and a lack of a contrary frame of
reference. Fortunately there is a minority in that demographic who either
studied appropriately or had the personality traits to see through, or utterly
disregard, the doting bullshit.

~~~
__s
Kids these days, amirite?
[https://mentalfloss.com/article/52209/15-historical-complaints-about-young-people-ruining-everything](https://mentalfloss.com/article/52209/15-historical-complaints-about-young-people-ruining-everything)

~~~
austincheney
Millennials -
[https://www.youtube.com/watch?v=hER0Qp6QJNU](https://www.youtube.com/watch?v=hER0Qp6QJNU)

------
hashkb
It's silly to ask a college graduate of 20-25 years old anything about
professional conduct. Can't we just appreciate that they're young and excited?
Anyone with a career a few years long knows exactly how professional to expect
a recent graduate to act at their first job; it's everyone else's job to
mentor them and set a good example of professional behavior.

~~~
tylergetsay
As somebody younger (and more naive) who is only 2-3 years into my career, I
do my best to remain professional. What are some things to look out for?

~~~
sedachv
Your local public library will have a lot of quality career advice books.
Nancy Barry's _When Reality Hits_ is a good one. Go to the shelf it is on and
browse around.

------
crazygringo
Is this anything more than the general human trait where most people think
more highly of themselves than others do of them? E.g. where 90% of people
tend to consider themselves above-average, when mathematically it can only be
50%? Colloquially called the "Lake Wobegon effect" after the fictional lake
area where "all the children are above-average". See [1].

There doesn't appear to be any evidence that this has anything to do with
students and employers specifically.

[1]
[https://en.wikipedia.org/wiki/Illusory_superiority](https://en.wikipedia.org/wiki/Illusory_superiority)

~~~
autokad
That ("90% of people tend to consider themselves above-average") can
mathematically be true if you have a skewed distribution.

Suppose that 90% of drivers have no accidents at all, while the other 10% have
10 accidents per year. Then the average (mean) is 1 accident per year, and
fully 90% of drivers are better than average!
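
The arithmetic checks out; a quick sketch in Python (the accident counts are the hypothetical numbers from the comment, not real data):

```python
# 90 drivers with 0 accidents/year, 10 drivers with 10 accidents/year
# (hypothetical numbers from the comment above).
accidents = [0] * 90 + [10] * 10

mean = sum(accidents) / len(accidents)                 # (90*0 + 10*10) / 100
better_than_mean = sum(1 for a in accidents if a < mean)

print(mean)              # 1.0 accident per year on average
print(better_than_mean)  # 90 of the 100 drivers beat the average
```

So a skewed distribution really does let 90% of drivers be "above average" - the mean is dragged up by the small accident-prone minority, while the median driver has zero accidents.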

------
greggarious
Maybe it's because employers want >2 years experience from people straight out
of college, and no one wants to train. Companies claim they don't want to
train because you could just leave (at will), but it's relatively simple to
draw up a contract under which employees must pay back training costs if they
don't stay at least a year or two.

~~~
sotojuan
Companies don't "want" to train because employees "don't stay" but companies
also don't provide incentive for employees to stay for more than a few years.

------
bbarn
I find it interesting that the one outlier in this is "digital technology". Is
that the age difference in action?

------
daphneokeefe
I'm not sure the word "dubious" means what you think it means. Are the
employers dubious?

------
Rexxar
"Proficient" doesn't mean the same thing for the two categories. For example,
it probably means:

-> for the student: "I will manage to get a job and do it"

-> for the employer: "does the job as well as other employees"

So the comparison is meaningless.

------
StanislavPetrov
When you give every kid a trophy, they all think that they are winners. We
have raised a generation of children who have been infantilized and sheltered
throughout their childhood and college years. The inevitable result,
unfortunately, is a generation of adults that lack the ability to tackle
adversity, think critically, or make valid self-assessments (something that is
difficult even for the most enlightened). We have a system of social and
academic promotion that has virtually no relation to objective metrics. Many
universities (and law schools), even the most prestigious, are eliminating
objective admission standards because so many applicants are unable to meet
them. It's no surprise that those who have been conditioned since birth to
believe that they are entitled to "achievements" have an inflated view of
their abilities. The steady decline in literacy, critical thinking, and
overall competence (from already low historical levels) mirrors the steady
decline and atrophy of our society as a whole.

[http://college.usatoday.com/2016/07/18/columbia-and-barnard-are-the-latest-schools-to-drop-sat-and-act-requirements/](http://college.usatoday.com/2016/07/18/columbia-and-barnard-are-the-latest-schools-to-drop-sat-and-act-requirements/)

[https://www.washingtonpost.com/news/grade-point/wp/2017/03/08/harvard-law-school-will-no-longer-require-the-lsat-for-admission/?utm_term=.cdf318624bbf](https://www.washingtonpost.com/news/grade-point/wp/2017/03/08/harvard-law-school-will-no-longer-require-the-lsat-for-admission/?utm_term=.cdf318624bbf)

[https://www.fairtest.org/schools-do-not-use-sat-or-act-scores-admitting-substantial-numbers-students-bachelor-degree-programs](https://www.fairtest.org/schools-do-not-use-sat-or-act-scores-admitting-substantial-numbers-students-bachelor-degree-programs)

~~~
tomtoise
Of course, if your stereotype is correct, this does raise the question of who
decided it was a good idea to _GIVE_ 'the kids' all these trophies.

~~~
watwut
The reality is, all of them know the difference between a participation trophy
and winning. Maybe trophies smooth over feeling bad about not being the
winner, maybe not, I dunno. Obsessing over them is ridiculous in its own
right.

Top schools are more competitive than ever. High schoolers need to adjust
their whole lives to have a chance to get in. Competitions are harder than
ever.

Kids spend way more time in organized activities to get every bit of
performance out of them - the only exception is likely football, which is
suspected of destroying their brains.

On average, kids have little time to just play around without working on
something.

