Overconfident Students, Dubious Employers (insidehighered.com)
111 points by sylvainkalache on Feb 25, 2018 | 121 comments



But Brandon Busteed, executive director of Gallup's higher education division, which also conducts research related to graduates and careers, said these sorts of definitions can vary.

For instance, Gallup has found that employers generally believe "critical thinking" means coming up with new, original thought. But in an academic sense, it can mean something more like picking apart ideas in depth, he said.

Given that their definition of "critical thinking" (which is incorrect) differs from the academic one, employers are probably responding on a completely different wavelength. That throws the whole survey into question.


"new, original thought" to the employers probably just means that they can deal with a problem that doesn't fit into the general workflow or one that they haven't been specifically trained to do.

As an example, the payment system at my pub went down a few weeks ago and the staff were clueless about what to do. The accepted short-term solution (at my insistence) was to write down who bought what (we all have accounts) so the purchases could be entered into the system later. This is a simple, trivial example, but in jobs like that it makes you manager material.


That's creative thinking, but not critical thinking.


The solution might be creative (in this case it might owe more to memory of how things used to work than to creativity), but first you have to isolate the issue, theorize about how long it will be out and what the consequences of that are, and determine whether you should do something or just wait. Then you have to weigh several solutions, which could include doing nothing, or multiple solutions for different cases. There are many critical thinking steps involved, and some very incomplete data to base them on.


It's fairly normal for terms to have different academic and non-academic usage. There is a reason so many surveys define terms.


Critical thinking isn't some kind of academic jargon. It literally means applying criticism to the subject of thought.


And? It's normal for words to drift in meaning over time. You have to deal with the reality of language, not "how it's supposed to be".


I agree with you in the general sense, but the employers (whoever they are) are simply wrong by any reasonable standard of language to equate critical thinking with original thinking.

For example, the skill of top lawyers is inseparably wed to critical thinking. They need to parse arguments and pore over monotonous tomes of legislation, regulatory paperwork, contracts, and case law to find areas of weakness or inconsistency to exploit/tighten up. They do this through slow, methodical analysis, consultation with other experts, and then have to distill and apply the essence of these points into language defendable against industry peers who are working for the other side. This work requires deeply critical thinking, but has nothing to do with "new, original thought".


I can agree with you all I want, but it doesn't change anything. They will still believe it means whatever they think it means.

"Awful" used to mean awe-inspiring, which is clearly what it says, but now means terrible. I'm sure people pointed out that the new meaning was wrong at the time.


Yet communicating in a clear, consistent, and accurate fashion is crucial.

Many years ago, I worked in a photographic store. We developed film and printed photos, primarily. One of the women took in a film for development, wrote the customer's surname in the name box, then wrote "Matt" over the name.

It's fortunate that we would hand out the top slip so we could match the packet numbers, because when the customer came back into the shop we couldn't find her package. We did find one for a Matt Sullivan (or whatever the surname was). It turned out she was quite annoyed with us for not printing it in matte as she'd asked, so we reprinted it at no charge and she got to keep both sets.

The staff member argued that she had written matte ("See? It says Matt right there." "That's the name, not the paper type. The paper type has an -e at the end."), and that language changes so we had to accept and adapt to it. Didn't matter what her argument was, though, because she screwed up and it cost the business money.


Yeah, but that's still not what critical thinking means.


I agree, and I can't imagine how they asked the employers for the definition of critical thinking and got that response. The answer is so ridiculously wrong that they must have asked a loaded question to get it.


> The AAC&U looked at some of the same measures as the association. Specifically around oral communication, students ranked themselves highly -- about 62 percent of students believed they did well in this area compared to 28 percent of employers. That and written communication showed the biggest gaps in the AAC&U report (27 percent of employers versus 65 percent of students).

Kind of difficult to disagree with this one. I know it's passé these days to worry about trivial things like correct grammar, spelling, and usage. I've had fruitless discussions on this very site where people will argue that it doesn't matter how you use the language if, ultimately, the message somehow manages to get conveyed. Terrible language skills are not just found among new grads, though; I've seen them across the workforce, including appalling spelling in communications from the SVP level. There's a non-zero cost, too: the cost of having to get people to repeat themselves, the cost of ambiguity when your manager poorly communicates their wishes, the cost of misunderstood information bubbling up the chain to decision makers. It adds up.


> I've had fruitless discussions on this very site where people will argue that it doesn't matter how you use the language if, ultimately, the message somehow manages to get conveyed.

Most of the time terrible language does manage to hinder the message! But the writer doesn't realize this, because for them it's obvious. I had a coworker (almost 30 and a college graduate) with this very attitude. A lot of his code comments and documentation had grammar mistakes and quirks (Random periods. In the middle of phrases.) that made them very hard to read and thus understand.


Agreed - I think the "it's how the message gets conveyed" defense only applies to really trivial things, like "expresso", or whom vs. who, where the intended meaning is so clear that the error doesn't affect readability.


Here's the thing about the "expressos," "irregardless," "for all intensive purposes," and "mute points" out there: Even if the meaning is clear and the communication is successful, these convey a carelessness and sloppiness that, as a great engineer, you don't deserve! As someone who knows what you're doing and takes pride in your craft, things like bad spelling and usage needlessly undermine your credibility.


They do undermine one’s credibility.

At the same time, a cursory study of German, French, Hebrew, even Japanese would suggest that English spelling is needlessly complicated and ambiguous. Part of addressing this problem for future generations is a push for spelling reform.


I've learned a lot of languages and they all have their problems: e.g. the horribly complicated Kanji characters in Japanese and Chinese, or the almost completely pointless genders in German and the resulting requirement for other words to change, in regular and irregular ways, in agreement. Similarly, German verb tenses are insanely over-complicated. English is reasonably easy to learn compared to many other languages.

It is harder than it looks to fix spelling because in the written language different words that sound the same are spelled differently. Making the written language phonetic would create ambiguity.


Some things about English are easier than in other languages, but its spelling is uniquely complicated. There are roughly 55 phonemes in English and about 1,500 distinct spellings for them. In most languages the ratio is closer to 1:2, not 1:30.

Phonetic writing could increase ambiguity but in many cases will not because:

(a) Many of the weird redundant spellings only show up in multi-syllabic words that are quite distinctive, as for example ti for ʃ in caution.

(b) People can often tell apart many shorter homophones, which are frequently misspelled by being substituted for each other: road/rode, or their/they're/there. The fact that they are frequently substituted for one another is a clue to the degree to which context disambiguates them.


> It is harder than it looks to fix spelling because ...

Because of what you say, and lots of other things too. Much of the existing mess was caused by spelling "reform" in ages past, such as when Norman scribes half-replaced the conventions evolved by Anglo-Saxon scribes with French ones. It's not very different from https://xkcd.com/927/.


The "who" versus "whom" thing is a dialectal difference, only worse, in that the dialect which uses "whom" is entirely artificial at this point, and only serves as a class marker. Saying it's "more correct" is a rather nasty bit of classism.


@ryandrak "irregardless" is a word


That's one of those gotcha words, though, like "inflammable" (contrast with "indestructible") and "denuded".


Do you mean that "irregardless" would have to be one of those gotcha words if it did indeed exist?

"Inflammable" is confusing because it sounds like the opposite of "flammable"; but the sad fact of history is that "inflammable" has meant "burnable" for centuries while the synonym "flammable" was made up recently -- so we are stuck with both.

But as far as I know, the word "regardless" has been around for ages, and that's fine because it sounds like just what it means: the opposite of "with regard". Whereas "irregardless" should be chucked in the bin as a confusing neologism that sounds like the opposite of what the speaker intends.


The word "irregardless" exists, whether you/I/we want it to or not.[0,1] All I can say is that it is indeed surprising how quickly confusing neologisms that should be chucked in the bin soon become sad facts of history.

It's not in the same category as "for all intensive purposes". But that's just the way language evolves. "Intensive" in that case should be "intents and", which, I believe, has the same heritage as "clear and present danger", "crimes and misdemeanors", and "cease and desist". All these phrases are redundant lawyer-speak that came about when the law was being translated from Latin to English in, what, the 14th century in England?

So, yeah, there are people who want to _sound_ educated, thinking they'll use a high-falutin' phrase like "for all intents and purposes", they get it wrong and say "for all intensive purposes", and instantly reveal themselves as the uneducated person they actually are. (But only if their present company includes an actually educated person, of course.)

But rather than passing judgment on uneducated folks for using eggcorns[2], it would be more compassionate to reserve that judgment for the actual content and meaning of their speech, and not the superficial form it takes.

[0] https://en.wikipedia.org/wiki/Irregardless [1] http://www.businessinsider.com/irregardless-real-word-regard... [2] http://eggcorns.lascribe.net/


I did a code review for my manager where I reminded him to punctuate and capitalize every code comment. I wonder what review score I'm going to get this quarter.


But you could just as well say it's six of one or half a dozen of the other.

I am somewhat prone to that as well, as a dyslexic, but at 10 I had a reading age of 18+.

However, you could turn that around and say maybe your reading comprehension skills are not as good as mine.

https://www.davidsongifted.org/Search-Database/entry/A10435


You can indeed say all of that, but both statements are merely misdirection.

Saying it's "six of one, half a dozen of the other" is a colorful way of saying mistakes are irrelevant: that sotojuan is wrong and they don't actually detract from the communication. But you provide no reason.

As for your reading skills: the subject here is written communication, but your own reading skill should let you see the value of writing quality.


Err, that's not what "six of one, half a dozen of the other" means.


In this context, yes, it means that writing it right or wrong makes no difference.


when a manager complains about oral/written communication they are NOT talking about how you spell things. it's things like properly alerting them when issues arise, writing emails that are neither too terse nor overly wordy, knowing what topics are appropriate on a conference call and which can be talked about offline, etc.

i myself am not the best at communication, but those problems do not lie in the fact i didn't bother to use caps in a hn post.


I think that if a developer has problems communicating in their written/spoken language of choice, it might indicate a similar issue with their programming language of choice.

Some managers do care about spelling, grammar, and punctuation. I have a 2nd VP who cares about the colors on the charts and graphs delivered to him.


well, charts/graphs delivered to management is an entirely different skill set/topic. slack/emails/comments is a different story. i myself would be happy to see any documentation, and something more than:

    # this function calculates x
    def calculatesX():
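For contrast, a comment earns its keep when it records intent or a non-obvious decision rather than restating the name. A minimal sketch (the function, its data, and its behavior are entirely hypothetical):

```python
def calculate_x(readings):
    """Return the calibrated X value for a list of sensor readings.

    Raises ValueError on an empty list, because the downstream report
    treats a missing X as "sensor offline" rather than zero.
    """
    if not readings:
        raise ValueError("no readings: X is undefined, not zero")
    # Drop the highest and lowest reading to dampen sensor spikes
    # (unless that would leave nothing to average).
    trimmed = sorted(readings)[1:-1] or readings
    return sum(trimmed) / len(trimmed)
```

The docstring and the inline comment each answer a "why" that the code alone cannot; "this function calculates x" answers nothing.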


I'm not sure if you are just going off on a tangent or just completely missed greedo's point.

If you can't write in your mother tongue properly you probably can't code properly either.


they are two completely different skill sets, and i think on some level you must know that, or have no idea what coding is. Not just different, but orthogonal. When one is good at one, they tend to be worse at the other.


disagree.

writing creatively or with nuance is definitely an orthogonal skill to coding, but understanding and following grammar rules requires a very similar syntax-parsing ability to that which is required to write code that will compile. unless you struggle to get simple programs to compile, you very likely have the ability to write with proper grammar, assuming you are willing to make the effort.

I would even argue that the skill of technical writing, where precision and compactness are valued over creative flourish, maps very closely onto the same skills needed for good program design and clean coding style.


I think you hit an extremely important point: communication skills (written, oral, and even nonverbal) are EXTREMELY important for most engineers, even though most software folks tend not to concern themselves with that too much. I understood this really early and it has helped me progress a lot, not just in my career but in personal life as well (turns out good communication skills lead to healthy relationships... who knew? :) )

One experience that really changed my life in many ways was writing a Master's thesis under an advisor who was anal about explaining the ideas well and about eliminating typos and such. I found it very frustrating at first, but I think overall it trained me to pay attention to what I say and write, to try to be as clear as possible.


While there is a subset of people who are quite bothered by minor typos and grammar errors, most people read right over them. Which is fortunate, given that I type into this site on my phone.

Poor reading comprehension is much more of an issue. Even that reading happens at all is a generous assumption, given how often people clearly read only the headlines of stories.


I'm always a little skeptical of these sorts of surveys because it's hard to tease out what people believe about themselves because it's true vs. what people believe about themselves because it's useful.

I remember that when I was a new grad, there was a very large part of myself that held a realistic appraisal of my abilities and was therefore scared shitless about my ability to make it in the working world. I was very careful to never let that part of me out in interviews - or, for that matter, to anyone. Confidence only works if you keep up the illusion so thoroughly that it ceases to be an illusion.

And it worked. I got a job at a financial software startup, and then was put in charge of projects that no new grad should ever have been put in charge of. I grew into the role. I left to go found a startup, which is also something that someone with 2 years of work experience had no business doing. That worked too - I may not have been qualified to found a startup, but when I folded it up, I was a lot more qualified as an engineer than most of my other peers with 4 years of work experience. So Google hired me to work on the front page of the search engine, and I grew into that role too.

The majority of my classmates let their accurate perceptions of what they were actually qualified to do govern what they applied to do, and as a result, many were still struggling to get into a career 10 years later. By that point, your self-perception has become reality, and it's much harder to convince potential employers to take the risk that you'll grow into the position. Then they wake up and realize that everybody's faking it and their new manager isn't actually all that much more skilled than them, but (barring a career reset like going to grad school) it's difficult to reset people's perceptions.


"What is your greatest weakness?". Everyone is expected to lie or warp reality a bit on this question. Interviewing is just a game and people learn pretty quickly to give the answers interviewers want to hear. The alternative to overconfidence is not great.


I answered this in one of my first interviews with, "My greatest weakness is my inability to focus on things for an extended period of time." The interviewer told me never to do that, but appreciated my candor and extended an offer, though I didn't take the job.

Sometimes it actually works to be honest instead of treating it as a game. :)


This leaves me conflicted.

I've noted here (quite) a number of times that I have a disability that impacts my ability to absorb chunks of written material. I've had an occupational therapist tell me that I shouldn't work in CS or IT (I have a CS degree in spite of that), and several doctors have told me I should stay away from the field, while others have told me that just go with whatever works and don't tell employers I have this disability.

I prefer to be honest, but at the same time I also like to eat most days.

I have an interview for a job in a few days, and no clue what I should do. If they don't ask, do I bring it up? What do I tell them? Most employers in the city are highly discriminatory against people with a disability, even if it means they're not hiring the most qualified one.

(I'm not actually seeking advice here, just musing online.)


If the companies that you're interviewing for aren't accepting of the way you are, then it might be useful to question whether those companies are for you. Yes, it's difficult to think this way when the specifics of your circumstances are difficult, but it still might be a lifestyle worth pursuing. Lentils/rice/chick peas are a super cheap way of living on a budget ;).

It's exceptionally difficult having disabilities, especially when trying to relate it in a professional environment. Though, I've found my own disabilities have been very useful in figuring out what it is that I really want out of a career and life.

Cheers and good luck.


If they don't ask, do I bring it up?

Most people without a disability would focus on their positive attributes; I can't see why you should be any different.


"Open-plan offices."


I’m stealing that.


Related: these aren't even the competencies that employers have selected as most important, only the ones that the firm doing the survey selected. So how important are they really? Where do, for instance, "Career Management" and "Global/Intercultural Fluency" stack up against "Can actually do the job we're hiring them to do, e.g. program a computer"?


Intercultural fluency is code for people who are less likely to say the stupid things, either to customers or coworkers, that result in lawsuits.

Career management is code for people who can maintain professional qualifications without supervision, who book the course for themselves before something expires.


Code words for sussing out leaders vs those who need direction.

Know how to grow, know when to bite your tongue.


Have you considered that perhaps your self-appraisal was unrealistically negative?

Your post suggests that either most people are capable of far more than work situations demand of them, or that you were more capable in practice than you appreciated at the time.

It's either that, or you were very, very lucky - which is possible, but not necessarily probable.


> Confidence only works if you keep up the illusion so thoroughly that it ceases to be an illusion.

This was the same for me as a new manager. I was less confident than I let interviewers see. But in the end I got that confidence for real. "Fake it till you make it."


Had the exact same thought coming here, couldn't have said it better.


Others have mentioned the vagueness and low value of the questions, but I'm particularly struck by the professionalism/work ethic one.

Professionalism is tough, because we define it (roughly) as meeting the common expectations for the role, but one of those expectations is "professionalism".

And "work ethic"... I'm a Gen Xer and I feel like millennials are getting dumped on beyond the normal generational strife, to a very unfair level. (Example: I see a lot more snowflakes and need for "participation trophies" among those complaining about such things than among the millennials themselves.)

Which leads me to wonder, particularly with work ethic: is it the students or the employers that have the unrealistic expectation?


I have a feeling that both have unrealistic expectations.

Graduates: I won't say all, but I do think many modern graduates allow themselves to be too easily distracted on a mobile computer they own.

Employers: High expectations of engagement and "loyalty", but low pay, low benefits, and lacking support in the work environment. Graduates are somehow expected to remain productive in a highly distracting environment with no expectation of actual tranquility for focus. Is "masochist" the kind of professional they're describing?

Ultimately this is one of those issues where a clear set of expectations and some metric for measuring them should be set; then an independent party should review whether the expectations were genuinely being met, or whether only the metric was.


Graduates: I won't say all, but I do think many modern graduates allow themselves to be too easily distracted on a mobile computer they own.

While I think there’s some truth in this (and, ahem, not-so-modern grads...), it seems like measuring the wrong thing. If someone posts 50 times a day to whatever the current friendfaceinstabook might be and delivers useful code (or whatever) on a regular basis, isn’t that fine? Focus on the latter, don’t stress about the former.


Students may be interpreting professionalism / work ethic as consistency in studying and turning in assignments on time in the context of coursework. But the scale of problems that they need to solve in the workplace has a much greater scope in terms of time, complexity, and number of people involved. Employees need to integrate with the organization, not just do the equivalent of turning in assignments. Often the initial plan of work isn't complete or the right thing to do. Some fresh graduates have a tendency to latch on to some passing statement from a team meeting, work hard on that without doing any follow-up, and end up surprised when no one has a use for what they did. A professional keeps the big picture in mind and follows up with people so that course corrections are made and genuinely useful work gets done.

Colleges try to teach teamwork, but few team projects in coursework are sufficiently complex that they genuinely require teamwork. Since everyone in a course project is taking the same course, no one has irreplaceable expertise. The consequence is that students never have to learn how to work as an organization instead of individuals. The exceptions are club activities like (inter)national engineering design team competitions in which, as in business, genuine failure is possible and success depends on everyone's contributions.


I have a feeling that many of these numbers would have looked similar 10, 20, 30 years ago.

Is the news here that these numbers are much lower than they used to be? Or just that we now know them?


Implied in finishing university is the expectation that you have been taught what you need to make it in the workplace. It's part of the social contract. It's what I paid you for, right? Naturally, they answer with high confidence in their readiness.

People who have worked a long time are in part answering for themselves. They look back at how much they've learned, changed, and developed. How much they had to learn on the job that was never taught in school. Were they actually unprepared, or are they just judging new grads based on a viewpoint that's colored by several years?

When I was a hiring manager, this could be one of the most frustrating things in a room: trying to push to take a chance on a new grad, and having several others tell me they'd prefer the industry veteran who's lateraling. Even for "entry level" positions. Amazing how many people with really good CVs were applying for "entry level" jobs.

On a last thought, there also seems to be a strong lack of faith among hiring folks that people can learn on the job. Maybe it's just the above-mentioned flood of high-CV talent. Software, for example: you know C, C++, Java, JavaScript, Haskell, and Perl. Ah, but you don't know Ruby, and we're really looking for Ruby. But I know six other languages, I can learn one more...


There is a lack of faith that people can learn even another Angular version. You have experience with 1.6?! We do TS + Angular 4 over here, sorry mate! Such companies are simply not hiring - bragging, teasing, probing the market perhaps - but certainly not hiring. Not only is the framework landscape broken, but people are also extremely anal about their preferences and choices.


And what is the half-life of the latest and greatest framework anyway? :-)


> Implied in finishing university is the expectation that you have been taught what you need to make it in the work place. Its part of the social contract. Its what I paid you for right?

Then the person misunderstood what education is, and reality is about to slap them in the face. I started a career before graduating college and now have two simultaneous careers. The benefit of having multiple simultaneous careers is that you can see the comparative strengths and weaknesses of each (things other people don't see) and adjust yourself accordingly.


> Then the person misunderstood what education is and reality is about to slap them in the face.

Sure, but that doesn't change the fact that what the OP said is heavily implied by the education industry and by society as you grow up. If your entire life someone's been telling you an orange is an apple, then when asked "what's that?" you'll answer "an apple".


Perhaps I am fortunate to have grown up around technology. I knew network engineers who went to work straight out of high school, without any college, making $80k per year. There is no college major for configuring routers and switches. It is just something you learn by gaining access to the hardware, reading books, and practicing.

I knew this because I was interested in technology and I would find people who were already making it in technology. I didn't wait around for somebody to simply gift me a job. I know not everybody has this. If you think jobs are just something you happily pull off a store shelf, like shopping in a grocery store, merely because you got some college, you are in for a world of hurt.

I could feel sorry for people who had absolutely no way of knowing better. But whether or not I feel sorry for them, the reality of the situation remains the same. Bottom line: don't waste your time feeling sorry for people with first-world problems.


Recognising the issue doesn’t require you to feel sorry for them.


> Implied in finishing university is the expectation that you have been taught what you need to make it in the work place. Its part of the social contract.

No. That's not a contract, it's a dumb expectation, though one shared by many. Vocational school is for training for a job. Universities have no sensible way to teach how industry operates, while industry does have the capability to teach that.


Agreed, and I think it's harmful to universities to expect that. We need universities to excel at what they are good at - preparing students to push the frontier of their field. They're not job factories for industry and shouldn't have to try to be.


IME it's self-inflicted harm; most of the university advertisements I see feed that expectation. Slogans like "education for the real world" and "industry focused" get thrown around all the time. Hard to blame students/society for expecting universities to work as advertised.


> But, I know 6 other languages, I can learn one more...

I doubt you professionally know 6 languages. There's a lot more to learn about the runtime, frameworks and tools of a language than there is syntax.


OK, I might have a couple more languages than the average mid-career developer, but I don't think I'm that much of an outlier. So here goes.

I have, in the past 10 years, professionally and successfully worked with C++, Scheme, Ruby, a couple generations of JavaScript and Rust. And I've written a whole pile of shell scripts. These were all "first tier" languages in which I was up-to-date at the time. I've also, as a consultant, delivered projects in C# and Java. In these cases, I didn't know all the tricks, but my code met specifications, it had tests, and it was reasonably clear.

Oh, and I also wrote a bunch of Haskell on my own time, and it was stronger than a couple of the languages that I got paid to use.

But no, I'm not professionally up-to-date in all these languages. My C++ is out of date, JavaScript frameworks change every year, etc.

But if you dropped me in a new job with a new language, I could be delivering business value and clean code inside of two weeks, and I'd be reasonably current on whatever framework I was using within a few months. Yes, it would take slightly longer for Elixir or Erlang, and longer still for something like Coq, because there would be new paradigms involved.

But there really does come a time when genuinely unfamiliar ideas in mainstream tools become rare and special. I'm always happy to learn something new like React's virtual DOM or Rust's borrow checker.

And I know plenty of people who are far, far better than I am.

And this is why I'm generally happy to assume that competent developers can learn on the job if they show any interest in doing so. You really don't need more than one person on a team who knows the deeper trivia of a framework.


> But if you dropped me in a new job with a new language, I could be delivering business value and clean code inside of two weeks

This is one of the things I consider to be a trait of a senior developer, when making hiring decisions. I understand why places post jobs that want X years experience with Y stack, because they don't want to feel like they have to teach "the basics" to someone at a high level, but I've found "the basics" is just what you've described in that sentence. Good professional software developers are language agnostic - even if they've spent the last 10 years on a single language in practice.


Given your breadth of experience, I actually bet you could get very comfortable with Elixir in a week's time. Especially so if you've worked with Scheme, Ruby, and Haskell. Elixir's syntax is very simple, and 90% of your code is data transformation steps strung together with pipes, which really isn't any different from other (functional) languages.
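Elixir itself isn't shown in this thread, but the "data transformation steps strung together with pipes" style can be approximated in Python with a small hypothetical `pipe` helper that threads a value through a sequence of transformation functions:

```python
from functools import reduce

def pipe(value, *steps):
    # Hypothetical stand-in for Elixir's |> operator: feed `value`
    # through each function in turn, top to bottom.
    return reduce(lambda acc, fn: fn(acc), steps, value)

result = pipe(
    "  Hello, Pipes  ",
    str.strip,                     # "Hello, Pipes"
    str.lower,                     # "hello, pipes"
    lambda s: s.replace(",", ""),  # "hello pipes"
    str.split,                     # ["hello", "pipes"]
)
print(result)  # ['hello', 'pipes']
```

The pipeline reads in execution order, which is most of what makes the idiom pleasant regardless of language.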


There's a lot more to learn about the runtime, frameworks and tools of a language than there is syntax

That is true, but on the other hand, vast swathes of commercial work boil down to using a language ultimately derived from ALGOL to process data structures that were in textbooks in the '70s.

All the libraries are ultimately written by people, and people who themselves have used other languages, so variations are pretty minor. The function you need to connect to a DB might be different but I bet the parameters you pass to it are in the order of username then password, in any language. Even Ruby, whose community delights in obscure, meaningless names, has patterns in common with everyone else...


I thought ALGOL was a '60s language


I would feel comfortable delivering production code in:

PHP, Java, Ruby, Python, JavaScript, SQL, Perl, PL/SQL, and Bash.

And have been paid to write them all over my career.

Not saying I wouldn't need a day or two to brush up on some of them (perl was pretty early in my career), but 6 languages isn't that large a number once you've been around for a decade or two, especially if you consult.


Used to use 6 languages in the course of a day, in the before times, before JavaScript took over.


You're in for a surprise. It's not really that difficult to learn six languages to a level of using them professionally. I've seen it happen.

And you overestimate the differences between the runtimes and frameworks. Most of them work in a very similar way to one another.


[flagged]


The differences between GC or JIT implementations are completely irrelevant to the majority of professional jobs. There are some where it matters, but again, it is not that difficult to learn about a new one. There are some conceptual difficulties when jumping from a typed language to an untyped one, but that is about the hardest thing.

Also, compare that to the quite usual career advice to fake it till you make it, which is present in this thread too (from someone successful).


mm ok, I'll bite

Fortran, BASIC, Perl, PHP, ASP, PL1/G, plus MySQL and the procedural languages T-SQL and PL/SQL.


I said I doubt it, not that it's impossible. And by professionally I meant you've read most of the canon books about these languages, you can explain what happens under the hood for any piece of code and you can debug, profile and write high quality idiomatic code in each and every language you listed.

And now, as per GP post "But, I know 6 other languages, I can learn one more..." you should have no problem learning C++ on the job after being hired for a senior C++ position.


Can you please stop? It's a pointless flamewar and going around in circles.


That's a somewhat unrealistic definition of 'professionally' but it depends on exactly what level of language-lawyering prowess you have in mind. Regardless, your original statement strikes me as a strange assertion to make to a diverse group of strangers. For many people with long careers, reaching this level in 6 languages is practically inevitable. I'm only 20 years in, and already up to 4 or 5, due solely to experience at professional day-jobs.


If the framework solves a common problem, then the structure will be familiar. Stackoverflow to fill in the gaps.


> Stackoverflow to fill in the gaps.

That's what juniors do. Please don't use Stackoverflow professionally like that. You should consider reading a book on the subject which provides more context and better practices.


Or, the short version: employers want to train new graduates even less than they did a generation ago.

The increase in degree-required jobs means that organisations that used to take high school leavers at 16 or 18 and train them now think that university should do all the vocational training - and they are not getting the top quartile of grads, either.


While I agree that a problem is exposed by these findings, let's not judge college grads by their ability to judge how well they fit employers' privately-held expectations of an entry-level employee. Companies suck at training, balk at its costs, but lament its effects on the sourcing of young employees.


My current job is the first time I have worked with young college graduates. It is interesting to see that some of the developers are humble and completely ready for the corporate world while others are the opposite.

It really comes down to personality and whether a person's personality has allowed them to practice more and work harder to learn a skill. These people tend to be much better prepared without the delusions of their own awesomeness.

A college new hire isn't ever the rockstar their social reference group has indicated, and the more humble personalities didn't need magic to figure this out. Yes, I have actually encountered college new hires who thought of themselves as rockstars when in reality they seriously sucked. They cognitively knew they sucked, but somehow the arrogantly boastful rockstar self-persona was still there, in very much a stereotypical millennial fashion.


I don't think the "rockstar" act is a millennial thing. There have always been people like that in any field, or social group. Some people will always compensate for their flaws in ability with their strength of personality, and with our industry that just means painting the "rockstar coder" persona, because there are some people that seek that out.

That's why good hiring practices have to be used, even when you feel like it's a waste. Give everyone the same critical evaluation and don't allow yourself to skip questions in interviews because you like the person, etc. (I'm not accusing you, just saying as a rule)


I like using the word "rockstar" to describe coders, because the word has similar negative connotations as in the music context. An actual rock star often doesn't (but sometimes does) have: classical music training, knowledge of the music literature, vision beyond their area of expertise, humility in the face of their equals or betters, a desire to learn and ability to change with the times, the temperament to fit well with the business side, etc.

If that's what you want to hire, go hire your rockstar coders.


The hiring process spends all its time evaluating whether a candidate meets minimum viable functionality. Instead, my recommendation is to evaluate all candidates as senior developers, set them up for anxiety-inducing failure, and watch how they handle the result.

I can train somebody to be a good coder, but I will not be their parent.


Sad prediction: the overly confident will have better careers and will be seen as geniuses by people who don't work directly with them. They will get more responsibility sooner, and even as they make mistakes initially (and some fail entirely), they will be excused and seen as leaders more than their humble peers.

Incentives.


This only works in big or medium companies; fortunately, being over-confident without actually delivering stuff gets caught pretty soon in small companies (because it directly and adversely influences the bottom line). In other words, small-company bosses/owners don't really care about how confident you are, they only care about actual work that makes the clients happy, so that the bills and the salaries can be paid at the end of the month. That's one of the main reasons why I've worked only for small software shops in my career: I can't stand the wannabe people who always brag about themselves without actually doing anything of value.


This also works great in startups, where nobody really knows what they are doing and no one has a firm grasp on what people produce or should produce beyond impressions. Knowing who really produces requires good management that has time to look at the details. That requires stability, a non-panic mode, and sane politics.

Overworked manager will take overconfident person over humble cautious one, because the illusion feels good.

Overconfident people who do nothing and are completely incapable are rare; that is basically a strawman situation. Oftentimes they eventually succeed, although it takes more time, is overall somewhat less effective, their code is harder to maintain, or something of that sort.


Marissa Mayer - https://en.wikipedia.org/wiki/Marissa_Mayer

By all accounts she seemed to be a great product manager, but product management isn't organizational leadership.

Likewise there is Hugh Jones - https://www.sabre.com/insights/releases/hugh-jones-to-head-t...

Hugh Jones was known for tenacious leadership in large volume business operations, but experience in retail and product was limited or absent.


There are many, many companies outside of SV where a decent fresh grad (maybe with some co-op experience in SV) will be better than the "senior" developers at those companies.

Not that the fresh grads are awesome in any absolute sense, but there are so many awful developers out there who made it to "senior" level at their clueless company just because they stuck around for more than a year or two (really, because they were unable to find better work).


I wouldn't use the incompetence of others as a safe validation for one's own incompetence starting out.

That is taking a horrid situation, lowering the baseline to compensate, and then filling to the lowered baseline. The result will be worse than the incompetence you already have. The positive resolution is to fire (or move) the incompetent people and change the culture of the team.


Let me give an example if you don't believe me: I once argued with a senior dev for an hour that his code had a race condition and when he finally saw it he merged it anyways because it was "unlikely to happen". He was also "too busy" to write tests.

He probably thought I was an arrogant millennial too.
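For what it's worth, the bug pattern in that anecdote (a lost update, the classic "unlikely to happen" race) can be shown deterministically by writing out one bad interleaving by hand; a minimal Python sketch, with plain statements standing in for two real threads:

```python
# The lost-update race: both "threads" read the shared counter before
# either one writes back, so one increment silently disappears.
counter = 0

# Thread A and thread B each intend to run: counter = counter + 1
read_a = counter      # A reads 0
read_b = counter      # B reads 0, before A has written back
counter = read_a + 1  # A writes 1
counter = read_b + 1  # B also writes 1 -- A's increment is lost

print(counter)  # 1, not the 2 that two increments should produce
```

Real threads hit this same interleaving only occasionally, which is exactly why "unlikely to happen" bugs survive code review.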



First thing I thought of too - the only category where students ranked themselves as less proficient than they were perceived to be is technology - the one area they are probably actually proficient in.


Did they go to private (public in the UK sense) schools? Those types of schools inculcate their students with a master-of-the-universe vibe.


The pressure to be overconfident sometimes comes directly from employers.

One time, walking into an interview for an entry-level position, the interviewer completely ignored me and continued to stare at my resume for two minutes.

The first words out of his mouth were, "At Lehman Brothers, we only hire winners. Are you a winner!?"

In retrospect, I'm glad that interaction happened and revealed what the company's ethos was; otherwise, I might have been tempted to work there.


"The easy solution: set students up in a more professional environment, Busteed said -- this could be internships or co-op programs. If students can't go to an actual office, then the environment should be brought to them so they have a better sense of how a workplace runs."

Because education is only about job training, so that workers are pliant and ready to be directed by the system of hierarchy. Thus, turn higher education into the same factory training that high school is. This, in an era when profits are already soaring.


This means almost nothing without a baseline to compare it to. I'd suggest it's very likely that you'd get similar results if you surveyed any other age group - that individuals give higher estimates of their ability than someone else would is hardly surprising.


I disagree. There are unique personality traits that qualify the millennial stereotypes which are not present in older age groups. Young people will always believe they are ready for the world if all their elders tell them such no matter that this messaging is baseless and inaccurate.

The problem is a level of personal fragility and a lack of any contrary frame of reference. Fortunately there is a minority population in that demographic that either studied appropriately or has the correct personality traits to see through, or utterly disregard, the doting bullshit.




It's silly to ask a college graduate of 20-25 years old anything about professional conduct. Can't we just appreciate that they're young and excited? Anyone with a career a few years long knows exactly how professional to expect a recent graduate to act at their first job; it's everyone else's job to mentor them and set a good example of professional behavior.


As somebody younger (and more naive) who is only 2-3 years into my career, I do my best to remain professional. What are some things to look out for?


Your local public library will have a lot of quality career advice books. Nancy Barry's When Reality Hits is a good one. Go to the shelf it is on and browse around.


Is this anything more than the general human trait where most people think more highly of themselves than others do of them? E.g. where 90% of people tend to consider themselves above-average, when mathematically it can only be 50%? Colloquially called the "Lake Wobegon effect" after the fictional lake area where "all the children are above-average". See [1].

There doesn't appear to be any evidence that this has anything to do with students and employers specifically.

[1] https://en.wikipedia.org/wiki/Illusory_superiority


That ("90% of people tend to consider themselves above-average") can mathematically be true if you have a skewed distribution.

Suppose that 90% of drivers have no accidents at all, while the other 10% have 10 accidents per year. Then the average (mean) is 1 accident per year, and fully 90% of drivers are better than average!
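The arithmetic in that driver example is easy to check directly; a quick Python sketch:

```python
# 90 drivers with 0 accidents/year and 10 drivers with 10 accidents/year:
# the mean is 1, so 90% of drivers are strictly better than average.
drivers = [0] * 90 + [10] * 10

mean = sum(drivers) / len(drivers)
better_than_mean = sum(1 for d in drivers if d < mean)

print(mean)                             # 1.0
print(better_than_mean / len(drivers))  # 0.9
```

The same trick fails for the median, of course: at most 50% of people can beat the median.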


Maybe it's because employers want >2 years of experience from people straight out of college, and no one wants to train. Companies claim they don't want to train because you could just leave (at will), but it's relatively simple to draw up a contract stipulating that employees must pay back training costs if they don't stay at least a year or two.


Companies don't "want" to train because employees "don't stay" but companies also don't provide incentive for employees to stay for more than a few years.


Actually, that's harder to do even in the USA - employers don't want to pay the market rate, aka how capitalism works.


I find it interesting the one outlier in this is in "digital technology". Is that the age difference in action?


I'm not sure the word "dubious" means what you think it means. Are the employers dubious?


"Proficient" doesn't mean the same thing for the two categories. For example, it probably means:

-> for student: "I will manage to get a job and do it"

-> for employer: "does the job as well as other employees"

So the comparison is meaningless.


When you give every kid a trophy, they all think that they are winners. We have raised a generation of children who have been infantilized and sheltered throughout their childhood and college years. The inevitable result, unfortunately, is a generation of adults that lack the ability to tackle adversity, think critically, or make valid self-assessments (something that is difficult even for the most enlightened). We have a system of social and academic promotion that has virtually no relation to objective metrics. Many universities (and law schools), even the most prestigious, are eliminating objective admission standards because so many applicants are unable to meet them. It's no surprise that those who have been conditioned since birth to believe that they are entitled to "achievements" have an inflated view of their abilities. The steady decline in literacy, critical thinking, and overall competence (from already low historical levels) mirrors the steady decline and atrophy of our society as a whole.

http://college.usatoday.com/2016/07/18/columbia-and-barnard-...

https://www.washingtonpost.com/news/grade-point/wp/2017/03/0...

https://www.fairtest.org/schools-do-not-use-sat-or-act-score...


Oh fuck off with the trophies. I never wanted them. I got a 6th place ribbon for high jump, a "most improved shooter" trophy in the range club, and a "participant" ribbon in an elementary school basketball league where they explicitly didn't keep score during games. I knew I was a terrible athlete and the ridiculous trophies and ribbons were humiliating. I certainly never got the idea I was a winner.


Of course, if your stereotype is correct, this does raise the question of who decided it was a good idea to GIVE 'the kids' all these trophies.


The reality is, all of them know the difference between a participation trophy and winning. Maybe trophies smooth over feeling bad about not being the winner, maybe not, I dunno. Obsessing over them is ridiculous in its own right.

Top schools are more competitive than ever. High schoolers need to adjust their whole lives to have a chance to get in. Competitions are harder than ever.

Kids spend way more time in organized activities to get every bit of performance out of them - the only exception is likely football, which is suspected of destroying their brains.

On average, kids have little time to just play around without working on something.


This is what happens when the educational system concentrates on emotional safety vs. actually exposing students to what is happening in the real world. In real life, you also have to come to the realization that there are winners and losers, that it is not fair, etc. When a system focuses on preventing someone from experiencing any discomfort, this is a natural result.



