This one. Every job I see when I go looking is 40+ hrs a week, likely with plenty of idle time and projects I don't believe in. Part of this is regional... my area is packed full of marketing jobs that I'm actively uninterested in. Anyway, I'm starting to get the hang of selling myself to contract shops, and I think I've found some twinkles of hope.
But today I was told straight up that my resume wasn't conventional enough, that it was too well-rounded. After quite a few messages back and forth, I'm not even sure they looked at my portfolio. They told me there was no money for me, because their funders were looking for someone more professional. So I took this as a sign that maybe this wasn't the job for me. No need to go barking up the wrong tree.
In the game dev world, I'd probably be a "technical artist", but even that doesn't quite fit for me. I'm like a "fool of all trades, jack of some, maybe a master of one or two". But it's hard to find a job that motivates me and leaves me with adequate space/time/energy to keep myself sharp and alive.
If anyone knows of funding for a guy that makes creative, reusable web components, let me know... edit: I just saw that Vue.js is primarily funded by Patreon. That's awesome!
If you have 3 primary skill sets you should have 3 focused resumes.
If you want a position that requires a generalist, the only/best fit would be founder. In almost all other circumstances, companies hire for a problem and not general optimizations. Even if they need a generalist.
I would posit that you have a marketing problem and not a skills-mismatch problem. The noise-to-signal ratio on your resume may be making it seem like it doesn't match any position.
I think you're very right about my marketing problem :)
On the surface, my website is kinda out there. I built it from the ground up using Rake and some gems as a static site generator. I think it's very cool, and I don't expect that many people will appreciate it, but I'm hoping that one day someone comes along and says "wow, that's pretty cool, wanna help us with this Ruby thing?"
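For anyone curious what a Rake-driven static site build can look like, here's a minimal sketch. The `pages/` and `build/` directory names and the ERB templating choice are my assumptions for illustration, not necessarily how OP's site is laid out:

```ruby
require "erb"
require "fileutils"
require "rake"

# Pull Rake's DSL (task, etc.) into this script's top level.
extend Rake::DSL

# Hypothetical layout: ERB templates in pages/, rendered HTML in build/.
task :build do
  FileUtils.mkdir_p("build")
  Dir.glob("pages/*.erb").each do |src|
    html = ERB.new(File.read(src)).result
    # "index.html.erb" -> "build/index.html"
    File.write(File.join("build", File.basename(src, ".erb")), html)
  end
end
```

From there you layer on whatever gems you like (Markdown rendering, asset pipelines, and so on); the appeal is that it's just plain Ruby tasks rather than a framework.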
In a deeper way, this whole discussion is very helpful... it's giving me a fresh perspective on how to frame conversations with future potential employers.
And actually the other day I had an epiphany about my marketing. Sure, I chose an odd path, but I've already done so much of the work! I just need to get my random little open source tools in decent, usable shape so that people who'd find them useful might actually get a chance to use them. I need to get my half done work finished and make tutorials about it and get myself out there.
This came after I got a call from a contracting company who told me they were bidding on an interesting job that might be a good fit for me... I was so excited that I stayed up all night and finished like three old random projects at the same time, just so I could put more stuff on my portfolio. It was a wakeup call, affirming that there actually just might be some cool work out there after all... so get your shit together!
I have the 3 primary skillset problem you describe and have a decade of "founder" experience. I thought the breadth of experience and technologies would be impressive, but I think it just confuses most employers. It seems counter-intuitive to intentionally not mention your skills on your resume, and admittedly a little ego-deflating... but this actually makes sense.
I was heading the direction of very focused resumes. I've also landed a low pressure founder role too, so that's nice.
I'm not a web or mobile developer. I'm not a "devops engineer", and I'm not a "backend engineer." I'm a problem-solver. But to many (most?) companies, this means that I need to be classified into a track. I need to be an X or a Y or a Z. I can't adopt those roles when necessary--I need to be slotted in and just go.
That doesn't really work for me. Which is a major reason why I'm a software consultant now. I'd gladly take a full-time job that respected my complete disinterest in being pegged to a particular role and gave me the latitude to solve cross-cutting problems at scale. But that's a hard role to define, a hard role to hire for, and a hard role to evaluate--which is why they, and other generalist-focused roles, don't often exist.
It's easy to get a job. I can walk into almost anywhere in Boston if they're hiring and get an offer; I interview extremely well. I can't get a job that actually leverages my skills.
When you sell your time, you need to fit into a role that someone can comprehend and already knows they need.
When you sell your ability to deliver results, nobody asks about your skillset.
I don't enjoy entrepreneurship.
I keep wondering if I should take the rate cut for a year or just study mathematics at the local university (which is free here) ...
Having a math PhD could also help cut through the bs of people thinking you're not useful because you're a generalist. Suddenly, they'd be asking you to help them with weird little problems. Which is what I'm pretty good at and like to do!
Either you fit yourself into the structure of someone else’s business, or you’re in business. If you’re in business, there’s no lack of diversity in what you spend your time doing. That’s what I mean by entrepreneurship, not “SV startup entrepreneurship”.
Something to think about. Thank you.
Yes, companies have complex problems to solve, that require more than a few days or weeks of work at a time. If you want to just wake up every day and decide that you need to work on a totally new project, you're not going to be very useful.
Lots of people have a wide range of interests, but you pick one of the many areas that seems interesting and that you can contribute to, and you work on that for a while and try to make a difference. "Adopting a role when necessary" is not that different from slotting in to a team, since you can often change teams every few months if you want to, or even work across teams.
What you describe is a hard role to define and hire for, but it's a role that many engineers create for themselves by being both great problem solvers and willing to do what is most valuable for the company. Consider that maybe you haven't gone in with the right mindset to make it happen for yourself.
I also happen to think that I'd be good at it, because it's kind of what I already do, and I wish it were more common, because that'd be great. Heaven forfend. I mean, you can try to pull the "well, actually, it's labor that's bad" move by turning this into "my mindset is wrong" rather than "let's talk about executive and managerial allocation of resources and whether we do a good job of it across the board," but the thing speaks for itself.
And you're right, that post is braggy. It's also marketing: from this thread I've gotten two emails inquiring as to the state of my pipeline and if I'd have time for a chat. It can be both self-marketing and a reasonable observation of reality. (Email's in my profile.)
It's incredibly hard to get a job. But your versatility will ensure job security for as long as you care once you're in the door, thanks to your ability to get into the critical path of different departments' operations.
I'm a competent, high performing generalist who can work autonomously and self-sufficiently with enough confidence in my capabilities to have a high risk tolerance. In my current company I got tossed around from starting under a marketing manager, to getting shifted under the CTO after stepping on IT's toes too often, to getting pushed into a newly formed Operations group, to finally just reporting directly to the owner. While officially I manage a data management and BI team, in practice I'm the guy you escalate random problems and needs to. Creating a BI team was a byproduct of being a generalist and being the first person that needed a consolidated view of everything (and having the ability to execute on that need).
Looking for a new job is a pain. Companies don't explicitly hire for it, politics can come into play if the generalist doesn't know how to dance around the company in a way that doesn't threaten others' fiefdoms, there's no easy way to suss out competence from arrogance during an interview, no manager wants to frontload your salary on their budget when they won't be getting the full benefit of it, and a host of other things.
Generalists are valuable to a company, holistically. They're oil to the machine and fit in between all the specialized cogs to smooth things out and let you run the engine hotter without breaking down. But managers only manage their own cogs, hiring only looks for the cogs they're told to find, and at the end of the day whether there's oil in the machine is Someone Else's Problem. Very few teams exist which have the prerogative and budget to hire for that holistic need, so most companies just go without oil and don't think about it ¯\_(ツ)_/¯.
Your job and experience sound very similar to my current role. I don't have an accurate job title anymore, and I very much see myself as the oil that makes the engine of the company run more smoothly. I wear about four or five different hats, all depending on what fires I need to put out. Good to meet someone doing something similar.
A firm without This Guy means that more small problems/needs get ignored, since they don't clearly fall into anyone's scope of work or warrant the cost associated with a specialist. And if a problem gets large enough to not ignore, a specialist will consistently have to context-shift to something outside of their specialty when something tangentially related pops up. This decreases their productivity considerably, decreases their satisfaction as they can't focus on what they truly enjoy doing, and increases business risk as things get overcomplicated or slip through due to a more biased view on the strategy to perform the work (based on whatever the individual's specialty is).
Practice-wise, it astounds me how many people/institutions expect different results from everyone else while hewing to the same practices as everyone else. Why do you think you'll be the exception? Why can't you look at the reality of what is working and what isn't?
This applies to so many areas of life, but it is extremely pronounced in business, where everyone cargo cults every little practice of AmaGooFaceSoft and believes that is what made them who they are. It's like there's an extreme deficit of original thought.
Parent showed them a generalist skill set and they saw a lack of professionalism.
The hard part is giving in to a job. I think my hesitancy in this area straight up reeks to the point H.R. people/recruiters can pretty much smell it. It's just not about the money for me. If I was doing something I enjoyed and had time to work on creative projects (a.k.a. "my life's work"), I'd be fine on ~ $1000/mo (say at ~20-40 hrs/mo). But this presents difficult hiring/management challenges, the main one being the challenge of using my time effectively. When someone is always on call, management doesn't have to worry about this as much.
Honestly, I think companies could get a lot more bang for their buck if they put more money into hiring people to maintain and improve on open source projects that they use. And I know that there are jobs like this out there. I just need to keep looking for one that fits me :)
Back when I quit my 9-5 job in 2006 my favorite quote was by the creator of Winamp... "For me, coding is a form of self-expression. The company controls the most effective means of self-expression I have. This is unacceptable to me as an individual, therefore I must leave."
No company deserved me because I was a creative innovator, and even though working retail for $10/hr wasn't using my CS degree, I'd rather do that than make $70k getting someone else rich. I thought I deserved $250k since I was obviously going to get them rich - so taking the $10/hr job doing mindless work was less of a sell-out.
The next 8 years I tried a lot of things, and went $20k in debt. When I reached my 30s, I realized a few things:
1. You can be a mercenary and get the skills you need for your own business while working for someone else. You learn and get faster while doing tasks for your employer, so even though the code you wrote is owned by them... the code you write for your own business will be better and done faster.
2. If you're worried about getting them rich... just do what they say instead of innovating. Most companies don't innovate, they just maintain the status-quo and have you build systems that meet existing needs and save a little bit of money. You aren't making the next Snapchat at a 9-5... and if you are, you should have equity.
3. Freedom is not just about having the time to do things.... I had all the time in the world in my twenties but no money. Being broke basically put me in a prison. I couldn't go on trips with friends, couldn't take girls out, couldn't do what I wanted because I'd have no money for it. Now that I have a full-time job and work on side projects, I have the opposite problem where I have money but no time.
Overall I think having a 9-5 job works better for my lifestyle. Maybe I just needed more discipline and I could have succeeded as an Indie developer making games, marketing software, apps... and whatever other shiny thing caught my eye... but now I do that stuff on the side. I still hope I succeed as an entrepreneur and reach a point where I have complete freedom (time & money)... but now that I'm older I realize that was always a long-shot.
Take Tumblr for example - they sold to Yahoo. Verizon bought Yahoo. If I was an engineer at Tumblr I'd be pissed that an evil monopoly like Verizon that's fighting against net neutrality is suddenly using my code.
I'm more humble now than I was in my 20s, and I don't think my code is anything special, but I still fear that the private company I work for will sell everything I built to someone I consider evil. If I have no control and no equity I'm going to act like a mercenary... I'll do the job efficiently and according to specifications, but they don't get anything extra. If they have an idea and have me build something that gets them rich that's great - but I'm not going to be like the guy who built GMail on the side at Google... I just do the job they tell me.
So what I am saying is at this point, do not worry about all these things. Keep up the side-projects, and if you do that over the long-term, you may one day find yourself in a situation where you have the freedom to dedicate your efforts to what you find meaningful.
But wouldn't you be happy that all your fellow coworkers also probably got to benefit from the sale? If you or your coworkers got rich enough then you could all just focus on doing things that you think matter or better the world!
Yeah, that's part of the pattern. Generalists often don't appear as professional as specialists (which doesn't mean that everybody who looks unprofessional is a generalist).
Case in point - I'm a generalist who can very credibly claim very significant expertise at two disconnected areas (compilers and distributed systems), plus some startup/ product management expertise. I saw a job listing here for a startup that needed both of these and claimed they're looking for remote work - so, even though I'm not actively looking for a job, I wrote them; the match between my skills and their needs seemed too good to be true.
They wrote back the standard rejection mail - didn't even try to talk to me. I thought that maybe they found someone else, but no, they still advertise the position. I know it sounds arrogant, but there's no chance in hell they can find many better candidates (I doubt that they can find even one, but well, good luck).
Are their recruiters uniquely ignorant? Unlikely. I think many companies do a lousy effort on recruiting, and then wonder why they can't fill the positions. One of the reasons "internal referrals" work so well is that the candidates are at least seriously considered.
I have seen three approaches that limit this. The first is to stay super field-specific but implementation-general, without being keyword-void. The second is to go super rabbit-hole on every keyword/tool/language/platform you have ever worked with inside the field.
The third approach is to pair a type-one resume with a type-two cover letter. That approach has usually been the most consistent for callbacks in my circles, and it lets your rabbit hole be of the same type as the job description.
The more skills you are trying to get filled for one position, the less likely a candidate will match all of those skills.
Take any job posting with 20-30 lines of skills or requirements and it's really no wonder they can't find someone that meets all or even half of that.
I think that hiring practices haven’t really caught up to pace of reality yet. Nobody does the same thing for more than 5 years anymore, even if you’ve been at a company that long what you’re doing today does not look like what you were doing then. The chance that a new hire already knows all the parts of your stack is basically zero. What you want is someone who soaks up frameworks like a sponge.
> "If companies find it hard to reliably staff generalist positions"
The process of hiring for this is hard. Finding people like this is also hard.
~Someone with little resume-driven experience but good grokking skills
The way they throw you into the cold water is with whiteboard interviews. So if you don't like those... sorry.
If you've got someone who is largely autonomous and can wear multiple hats, steering them in a single direction may be more tricky than a specialist that doesn't seek work outside their remit.
Assuming your skillset is good, run, don't walk, away from those idiots.
* "Have Fun at Work" by William L. Livingston -- Livingston provide examples of how organizations are often incapable of hiring the people who might help them most.
* "The Murdering of My Years: Artists and Activists Making Ends Meet" by Mickey Z. -- This book is more in the category of seeing how bad so many creative people have it (until perhaps we get a basic income some day, Star Trek replicators for subsistence, a gift economy, and/or better government planning).
* "Drive: The Surprising Truth About What Motivates Us" by Dan Pink: https://en.wikipedia.org/wiki/Drive:_The_Surprising_Truth_Ab...
* "Punished by Rewards" by Alfie Kohn: http://www.alfiekohn.org/punished-rewards/
* "The Abolition of Work" by Bob Black (which reflects your point in another comment on "giving in to a job"): https://web.archive.org/web/20161031034600/http://whywork.or...
"Only a small and diminishing fraction of work serves any useful purpose independent of the defense and reproduction of the work-system and its political and legal appendages. Twenty years ago, Paul and Percival Goodman estimated that just five percent of the work then being done -- presumably the figure, if accurate, is lower now -- would satisfy our minimal needs for food, clothing and shelter. Theirs was only an educated guess but the main point is quite clear: directly or indirectly, most work serves the unproductive purposes of commerce or social control. Right off the bat we can liberate tens of millions of salesmen, soldiers, managers, cops, stockbrokers, clergymen, bankers, lawyers, teachers, landlords, security guards, ad-men and everyone who works for them. There is a snowball effect since every time you idle some bigshot you liberate his flunkies and underlings also. Thus the economy implodes."
Contrast with notions of "full employment". Or, sadly, also most of what Silicon Valley is up to these days...
About a decade ago, I sent that last link to an Apple recruiter who contacted me -- never heard back. :-) Not that I am a "Steve Jobs" or would want to be one, but, as with Livingston's point, would Apple hire a Steve Jobs as an employee? Almost certainly not. See: http://careerfuel.net/2013/06/if-apple-wouldnt-hire-steve-jo...
Though for balance, on the value of working together with others even in difficult circumstances, see "Buddhist Economics" by E.F. Schumacher: http://www.centerforneweconomics.org/buddhist-economics
"The Buddhist point of view takes the function of work to be at least threefold: to give man a chance to utilise and develop his faculties; to enable him to overcome his ego-centredness by joining with other people in a common task; and to bring forth the goods and services needed for a becoming existence. Again, the consequences that flow from this view are endless. To organise work in such a manner that it becomes meaningless, boring, stultifying, or nerve-racking for the worker would be little short of criminal; it would indicate a greater concern with goods than with people, an evil lack of compassion and a soul-destroying degree of attachment to the most primitive side of this worldly existence. Equally, to strive for leisure as an alternative to work would be considered a complete misunderstanding of one of the basic truths of human existence, namely that work and leisure are complementary parts of the same living process and cannot be separated without destroying the joy of work and the bliss of leisure."
I've collected some more ideas on making workplaces better here: https://github.com/pdfernhout/High-Performance-Organizations...
From what you describe, if you can't find a direct match to what you would like to do, you may want to consider being frugal and then earning what money you need to live in a low-stress part-time occupation with the rest of your time then available for what you really want to do. Have you ever considered, say, cleaning carpets to make ends meet? https://web.archive.org/web/20030807105050/http://www.unconv...
"More than a few people agree the best career would be one which provides challenge, intellectual stimulation, and rewards for quality work. Many however, would be surprised to discover they can have all of those benefits and more in some of the unlikeliest of careers. Case in point: I'm a professional carpet cleaner. Some people think this is a second-rate career. I don't agree with them. Carpet cleaning gives me challenges, intellectual stimulation, and many other rewards. To prove this, permit me to walk you through one of my work days. ..."
If instead you focus on "selling [yourself] to contract shops", you may find this book by Aaron Erickson useful: "The Nomadic Developer: Surviving and Thriving in the World of Technology Consulting".
I think one of my best bets right now is to lean hard into teaching, especially piano teaching. I have one committed student and a few more excited but broke students who take lessons from time to time. It's very rewarding, and there's lots of mutual respect.
You are not just a good frontend engineer; you are a good frontend engineer who is competing with other good frontend engineers. You lack professionalism (it sounds like), so you lose against those guys.
Good luck. At some point you either find the place that works for you or you learn to expand your skills. It seems like you need to improve your tolerance for downtime. It could be worse.
Have you ever done freelance web stuff for small businesses? There are some people who want you to be their webmaster... You give them a CMS and they want you to go fix their typos. So you say "Okay, sure, but it'll be $50/hr" and then they get frustrated because they don't have someone doing that kind of work on their end. They think it's your job to "make their website", but don't want to take ownership of the thing.
This is an analogy for the kind of thing I'm talking about. I want to help solve problems, and I want to provide value for my employers, but I'm unwilling to mindlessly give up my time to people who don't value it outside of their myopic needs. A lot of times, that's what's expected to be "a professional". Well, I don't want that. I'd rather be an amateur who's also a respectful and savvy business partner, who's definitely interested in helping you with your project but also keeps healthy boundaries for myself.
I'm definitely more of "an amateur" (lover of) than a professional, but I don't see why that should make me automatically "unprofessional" in my business relationships. On the contrary, I think it's fully possible to "be professional" without providing unrelenting 40+ hr/week service for a year. If that's not what you're looking for then with all due respect that's fine! But please don't blanket assume I'm "less professional".
Well, that's because CS programs are designed to teach CS, not to be a vocational school for programmers.
I suspect the answer is that our culture is confused, we want people who are smart and well rounded to also be useful. And the ugly side of it is, we want clear class divisions. One would never refer to law school or medical school as "vocational".
Sure you would. They are 100% vocational schools. Their cost, acceptance rates, time commitment, and difficulty level may be higher than other vocational schools, but that doesn't change what they are -- 100% focused on giving you the knowledge and skills to do a specific job.
Hell, they aren't even focused on teaching you to pass the bar, which is why most people pay for separate cram schools.
Arguably, "a school one must go to in order to practice a vocation" can be referred to as a "vocational school." I understand that vocational schools are often thought to be purely practical and not theoretical, but I think this definition does a disservice to the reality of entering certain vocations.
You don't have to have an engineering degree to get a co-op job. The student gets paid well over minimum wage (and learns on the job), and the employer gets financial incentives from the gov't to hire students. It's a win-win for both the student and the employer.
Co-op programs are optional, and take place in the summer or in a gap year. The affiliation of companies hosting co-ops is loose, and the schools charge large fees for participating in these programs (with the exception of Waterloo).
Speaking from personal experience, results may vary.
Note that my degree is from a well-reputed Canadian school that has a "software engineering specialization", and as far as I can tell (though I did not take it) that specialization is indistinguishable from the main CS curriculum, and is not valid vocational training.
It's almost like they rose from the death of many for-profit technical schools (DeVry, WestWood et al.) and took their ecological niche. But in contrast to bootcamps, these for-profit schools were too big, slow, and ineffective.
a) Whether you were able to get the work done.
b) Whether you were smart enough to do the work well.
It's the combination that's important, not either single one. Employers want people who show up on time, actually care enough to try, and are smart enough to figure it out.
The college degree has a better than zero correlation to that. After you get the job it's mainly about performance and output, once you've been promoted a few times your college degree mostly doesn't matter.
There’s another issue with using educational attainment as a proxy of intelligence: it doesn’t really help you distinguish among candidates who all have a bachelors degree. All you know is that they’re above some threshold, but not how far above. You can use the prestige of the school as a proxy for this, but it all seems like a hell of a lot of hassle when you can figure this all out in a half hour with an IQ test.
I believe the words "bored to tears" summarize the experience quite well.
While the majority of pupils get a standardized curriculum designed to keep their interest at a steady pace, pupils with a high learning capacity get no such thing. If they try to learn faster, it doesn't fit with the governance and management model within which the teacher must operate. Therefore, most teachers are at a loss when faced with these statistical outliers.
Also, other pupils may experience emotions of inadequacy and unfairness when a fellow pupil just blasts through the material in minutes that would take them all week. This may lead to the high capacity pupil being a target of some unfortunate group dynamics.
Since schools have no governance model for this, the high learning capacity pupil's school experience is essentially unmanaged. At a loss, society almost invariably resorts to platitudes like "No need to feel sorry for them, because they are so fortunate to be smart. We must focus on the pupils that struggle."
A few years down the line, the pupil's inner motivation may be completely replaced by depression, self-blaming or worse. Then the platitudes take a turn for the worse with blaming the pupil's willingness to work: "In fact, high IQ can reduce grit, since clever pattern matchers use their cleverness to avoid working hard on the toy problems of childhood."
Fortunately, my own school days are long since gone. Without blaming any person in the system, I can say: "Good riddance!" to this whole pitiful affair of how society treated me as a child.
Instead, I can draw attention to this problem by saying clearly that these are children that never asked for these gifts in the first place. Let's as a society realize that what we are doing to them is absolutely wrong and woefully irresponsible.
Fortunately, at least one western country has political attention on this right now. I will work hard and with "grit" to make sure that my experience and observations can help in creating new policies. In particular I wish to address how to practically leverage the pupils' own drive to learn without incurring social/peer stigma or ridiculous costs.
We are perfectly happy to identify athletic talent and enact systems so that the athletically talented are matched up with peers who share their gifts, but we mostly refuse to do this for intellectual talent.
I was fortunate enough to attend a program that handled this pretty well. Basically it was 120 kids from 11-18 who skipped high school and attended university together instead. It’s called the Early Entrance Program and it’s at Cal State Los Angeles:
The issue I face is that I am raising my children in Portland, Oregon, so this program will not be an option for them. Our personal approach is to home school instead. And I’ll probably investigate starting something like the Early Entrance Program at the state university in Portland here.
On the other hand, I have known a number of people who used their self-identification as a nonconformist as an excuse for why they couldn't cut it someplace, whether it's school or a job or some interpersonal situation. It's a bad excuse because it gives the person an easy out for not reflecting on how they could have handled the situation better, or avoided getting into the situation in the first place. It also pushes all of the blame on other people, which is comforting but very rarely 100% true. We should all be our own worst critics, lest someone else beats us to it.
There's actual employment law that prohibits using IQ tests? I'd never heard that before
The real reason companies don't do it is because Fizz Buzz is a better test than IQ tests.
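For context, Fizz Buzz is the classic screening exercise: print the numbers 1 through 100, substituting "Fizz" for multiples of 3, "Buzz" for multiples of 5, and "FizzBuzz" for multiples of both. A straightforward Ruby version:

```ruby
# Canonical Fizz Buzz: returns the sequence for 1..n as strings.
def fizzbuzz(n)
  (1..n).map do |i|
    if i % 15 == 0 then "FizzBuzz"
    elsif i % 3 == 0 then "Fizz"
    elsif i % 5 == 0 then "Buzz"
    else i.to_s
    end
  end
end

puts fizzbuzz(100)
```

The point isn't that the problem is hard; it's that it directly tests whether a candidate can write a working loop with conditionals at all, which is more job-relevant than an abstract reasoning score.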
The value and definition of "IQ" is widely disputed and generally seen as of limited relevance to the actual work and performance of that work (uncorrelated). Look it up and do some research if you're interested, this is just a quick showing of _why_ tests may be considered bad.
I'm not saying that IQ tests are useful measures of ability to do a job. I just wasn't aware there was actual law or precedent forbidding their use.
> If there is any sort of "test" given as a condition of employment, it must be relevant, standardized, non-biased, reviewed by specialists and approved, etc.
Wouldn't this disqualify a lot of whiteboard interviews too? Most startups don't have a standard question bank reviewed by experts or grade on a standard rubric.
Not to toot my own horn but I got a perfect score on the math part of the SAT with little prep, and I've tutored others on the same and only managed to eke out a little improvement. (In contrast, I've tutored math course work and had more success there.)
The math isn't very advanced, it's just a matter of doing it quickly and accurately.
The truth is more nuanced.
From a research-into-today's-hot-button-issues perspective, they are routinely slammed for being about cramming, but the effect of test preparation on test scores tends to be very close to zero when you look into it.
Whether this is correlated to anything important in real life is an entirely separate question.
I think you're wrong about that.
Another way to approach this is to ask yourself: if not as a proxy for intelligence, then why do some jobs require a college degree?
>I think that supports my point
I didn't say it invalidated your point. I'm trying to draw your attention to a specific area in your statement that is wrong. You can simply say,
>Higher education is used as a proxy for intelligence by companies
That's enough. Nobody will argue with you about that.
You may disagree with some of its conclusions (I personally am skeptical of its conclusions about racial differences in IQ), but to call it pseudo science is, I think, unfair. FWIW, Steven Pinker, who is about as thoughtful and level headed as they come, agrees:
And, anyway, the point I was making had nothing to do with racial IQ differences. It was referring to what the rest of The Bell Curve is about (discussion of racial IQ differences makes up only about 25% of the book): how the US society’s transition away from most people working in agriculture toward most people working at desks has created a class structure stratified by intelligence.
It’s actually a shame that the book discusses racial IQ differences at all, because everyone got hung up on that and totally missed what is, IMO, a much more important point. And I think it explains a lot about current politics in the US.
It's my understanding that the main data presented hasn't been overturned or disproved (in fact, quite the opposite).
At least the graduate can show that he/she can commit to a multi-year long, sometimes boring, project successfully. A high IQ person could just leave your company after 6 months because "they're bored".
Also, agree that educational attainment gives you more information about a candidate, but in terms of assessing intelligence, it is a coarser measure than IQ. I’ve hired a few people in my day, and there is huge divergence in intelligence among college grads. For lots of jobs it is useful to be able to distinguish between not-stupid, bright, and ridiculously smart. Educational attainment doesn’t really help with that, but IQ tests would.
They're also very expensive to attend, and the most immediate value they offer is to get people jobs.
I often hear people talk as if education was something that happened to them, then it finished. (Like learning is something you just do in an institution.) I guess training is more like that, teaching knowledge and skills for some particular task/role/job.
What? I suspect that it's typically, if not exclusively, those in the upper classes that want this. The ugly side of it is that human nature is selfish.
I have known several people who went to medical or law school, and every single one of them went only because it is a prerequisite to becoming a doctor or lawyer. What is your explanation, that people attend these schools to expand their mind, and then entirely coincidentally, manage to end up in the corresponding career as though by accident?
Look, the reason that all the students go to university is to make money. Like 90%+ of undergrads go there to make money, so whether it "technically" should be a monastery of cloistered academics pondering the universe (based on some high minded notion of what learning "ought" to be for) is beside the point. University costs a shit ton of money and people (the vast majority) pay it so they can have a decent life.
And the fact that universities are, on the whole, teaching skills that do not help their students accomplish this means either A) they are actively misleading their students B) university is a class indicator more than a place to learn (I can spend huge amounts of money so I'm probably not poor and black - which sucks for all sorts of reasons) or C) they don't understand that teaching academic CS, as opposed to programming, is not what industry demands (so they suck at what they're selling).
These are the people who, quite literally, are billing themselves as the "smartest guys in the room". If that's the case they can well stop sucking at their bloody job!
First, if you walk out of a four year degree in computer science without a strong foundation in discrete mathematics, algorithms, data structures, and at least one of distributed systems theory, stochastic systems theory, control, statistics/machine learning, AI, data visualization, or design, you've stunted your horizons for growth as an engineer pretty badly. I use three of those every day at work. I wouldn't be able to read the research, do the math, or write the software to implement what I work on without a very strong education in fundamentals. The only CSE course I took in University that I haven't made active use of was an advanced matrix computation course. I mean, if you're just shitting out web services and REST APIs or converting COBOL to Java, fine, but that's something you can train a monkey (and in very short order, a computer) to do.
Secondly, we invest in Universities as a public good. You should come out of a four year program able to write well, critically analyze difficult material (not "Catcher in the Rye"), articulate complex ideas persuasively, and understand the world you live in and how to live in it substantially better than when you went in. You might be surprised to learn that with the benefit of hindsight, many people find the time they spent reading and learning to be incredibly useful later in life, as it provided a foundation for dealing with conflict and complexity. If that's wasted on you, then it's wasted on you, but understand that you have shot yourself in the foot in terms of your ability to mature as a human being. I personally think public University education should be substantially more rigorous and should fail people aggressively, and should also be free. I wish we separated the whole "whoo look at me I'm 18-22 let's party" thing from the thought of University. If that's where you are in life at 18, there's nothing wrong with that... go work as a code monkey or a barista and get it out of your system, and when you're ready to level up as a human being, give University a try. That's more or less what worked for me, anyway.
What a great comment. I think that's also missed by schools where the focus is on memorization and/or trivia. It would be great if they focused on the areas you talked about, not necessarily directly, but at least that is the goal.
As far as "leveling up as a human being", books are free - go sit in a library. Hell, grab a kindle; most of the cool stuff that I learned about the human experience I learned from talking with others, staring at the sky, hiking in mountains, and reading free books that people on the internet recommend. None of this requires taking out the equivalent of a mortgage or listening to someone lecture on it. If you want a lecture, those are free online too!
I just get the sense that university is used as a filter to differentiate yourself from people who are "below you" - not as smart, not as hard working, not as "mature as a human being". But the main barrier to get into a university seems to be connections and money, neither of which I have a huge amount of respect for. Many of the wealthiest and well connected people in the world are huge assholes, and many of the poorest and least connected are incredibly wise and kind (and intelligent and hard working). It feels like a way to separate yourself from the unwashed hoi polloi and put a barrier to entry between yourself and other people who could do your job. In that sense university encourages laziness!
Overall, I just get this sense that university is this bullshit thing we all just agree to participate in because those who go get a monetary benefit, even though everyone knows it's just this bullshit thing. That monetary benefit comes from being a class filter, a networking tool for like minded rich people to find other like minded rich people, and probably a few other class effects I don't really understand. I thought when I was younger that as I got older, maybe I would get more jaded and just accept bullshit things. It would certainly make my life easier to do what other people do and not fight against the tide. But if anything I've become more altruistic and romantic; I want the world to be more honest and people to do the right thing.
I don't know. I've probably benefitted from it even though I wished people judged me based on other qualities that are more important.
Secondly, universities are not meant to prepare you for a better salary or job. They are meant to give you a particular set of skills and knowledge from a field of work. Finishing means that you usually get access to a certain community with the same specialty. This is useful if you want to keep up to date with your field or publish new findings.
Software companies that require computer scientists usually are just full of it and they forget that there are other universities that prepare people for software engineering or even just programming. You could argue that a course or bootcamp is enough. But we already know they ask for too much when you look at the requirements they have. They aim for the best and then take what they are given.
I think universities let people in too easily nowadays. In my father's time, there were around 10 slots at the arts university in my country each year. All of them ended up being praised artists; in fact, it makes national news when one of them passes away.
Now, universities just take everyone, especially if they pay for a slot. This caused a shift in perspective. You are no longer hard-working or lucky to get into a university; you are now expected to get into one to prove you are not a complete moron.
This is the actual problem, and it is reflected in your arguments. I agree, you probably didn't need to go to university. Hell, I know that I didn't need to go through mine. However, we were both expected to, and for no good reason.
Regardless of what any individual may think a university is or isn't meant for, this is how they are used.
As a high school dropout, I was directly told by managers in more than one situation that all I needed was a Bachelor's and I would be eligible for better positions. There was zero discussion of skills, as it was already understood that my skill level in many areas was much higher than that of the degreed people I worked with (who also weren't using their educations for anything other than as a "door pass" to be considered for better jobs).
The system is broken.
Which is it, "all" or "90%?" I went to a private liberal arts college so maybe that automatically puts me and my classmates in the 10% but I did not think the overwhelming majority of us were there to make money. The fact that lots of students still major in subjects that can be very worthwhile and valuable but lack obvious paths to financial success (e.g. most of Humanities and a lot of Social Sciences) indicates "making money" is not why they're there.
You think universities should basically be trade schools. They're not and they shouldn't be. Trade schools should be trade schools but the U.S. has a dearth of them and many that fit the description aren't good enough.
Meanwhile a university not knowing whether it wants to be a trade school or a research school has mixed incentives that result in an inconsistent quality of education. You get professors who don't know how to teach and teachers who don't know how to go in depth. By requiring both, you've made your staffing requirements that much harder to fulfill if you don't want to trade off education quality.
On that note how have bootcamps been doing lately?
"Financial success" doesn't necessarily mean optimizing the total number of dollars in your income; it can also mean being able to do work that you enjoy while having a decent standard of living. If a college degree had no impact on your employment opportunities, how many students would still go?
Trade schools should be trade schools but the U.S. has a dearth of them and many that fit the description aren't good enough.
Which is the point. It's not that universities shouldn't exist, it's that they shouldn't be trying to serve both roles.
Ok come on man. I said 90%+. Don't quibble just to be a prick.
Would you mind reading https://news.ycombinator.com/newsguidelines.html and https://news.ycombinator.com/newswelcome.html and abiding by the desired spirit when posting here?
I also went to a good liberal arts college and had a similar experience. People studying subjects that probably wouldn't lead to financially lucrative careers were in the majority.
First of all, no one is forced to go to university, so if people pay that much money, it's probably because they feel it's worth it, regardless of its supposed "cloistered academicism".
Secondly, maybe the main problem is that something that is needed (or at least very convenient) to have a decent life shouldn't cost a shit ton of money at all. As is the case in countries like Germany, etc. (if your parent post is from one of those, your reply doesn't even make sense).
Finally, I would rather hire a coder with a strong CS background than one from a vocational school. The former may not know the language du jour and the framework fad of the day, but can pick them up easily because they know the fundamentals. The latter will likely adapt much slower.
It doesn't even have to be "real" research: I just expect more than classes which have you regurgitate what you learned to pass. And yet that's all you need to get a CS degree. It's a bit like learning to write by filling in blanks.
Is this different than any other field? I was under the impression that most BS programs do not require novel research. That doesn't come until post graduate courses.
An example might be writing a compiler for a language you design with features not covered in the book. It's probably not publishable, but it is doing CS in a way that just following instructions in a book or from lectures to write a compiler isn't.
Fill in the blank programs ("Here's the skeleton, make it display a list") and 3 projects with < 200 lines that aren't doing something CS specific are not novel, but all too common and wide spread.
Making people do continual simple demos and nonsense programs instead of teaching them actual CS is BS. They can be simple programs, they don't need to build huge things, but what they build should reveal and teach them things beyond how to shift text around the screen.
I think that's what school is, no?
I'm not trying to be glib, I think this is just a reality of how education works in pretty much all subjects (typical US academia, YMMV etc.) Can you name a major that doesn't work this way?
I have a BS in Mathematics from a mediocre school (University of Arizona) and exams were always "prove this fact (which you have never heard of before)", never "what is the proof of XYZ theorem (which we studied)"
"I consider computer science to be the art and science of exploiting automatic digital computers, and of creating the technology necessary to understand their use. It deals with such related problems as the design of better machines using known components, the design and implementation of adequate software systems for communication between man and machine, and the design and analysis of methods of representing information by abstract symbols and of processes for manipulating these symbols. Computer science must also concern itself with such theoretical subjects supporting this technology as information theory, the logic of the finitely constructable, numerical mathematical analysis, and the psychology of problem solving. Naturally, these theoretical subjects are shared by computer science with such disciplines as philosophy, mathematics, and psychology."
I think this was the author's point. You state it is a minimum requirement, implying a CS degree is a more challenging requirement. Yet it's not hard to find people with CS degrees who do not have the minimum requirement.
I don't see where. The implication is that they are assumed to be contrary to reality. It's supposed to be a list of obvious facts.
This reads to me that Patrick wishes that the joke were not true, which is corroborated by other comments of his, such as https://news.ycombinator.com/item?id=7761134#7762383
The fact that you think that the article is in any way objective is why it is so dangerous. Ideology is like water. Patrick's assumptions and biases shape how he presents his "facts" and which facts are left out. It's not objective! What he is really doing is presenting a specific market-based worldview, which has certain insights and major deficiencies.
Consider for example "There is no hidden reserve of smart people". It sounds like an actionable insight, but is it true? Is it even falsifiable? What does it mean? And what actions would it lead you to take?
My answer: it reinforces the perspective that smart people are the only people worth considering, that there are few smart people, and (in context) that these people are mostly "generalists" and entrepreneurs. It values "intelligence", probably measured in terms of short-term ability to make money, and leaves out values and self-reflection. It leaves out the possibility that you can learn from anyone (c.f. the story of the $20 fan). It leaves out the possibility that your hiring process could be discriminatory or unwelcoming. And so on.
This, I argue, is a good demonstration of why programmers need liberal arts; technical decisions are never just technical decisions. We must remember to ask who wins, who loses, is that desirable, is that just?
I still don't see this argument being made in ANYTHING you posted now.
> The fact that you think that the article is in any way objective is why it is so dangerous.
How does this relate? I asked for how you came to a conclusion and have gone off into other topics that you're using to characterize a perspective or ideology. I don't care about what you think or what the author thinks. I care about finding how you jumped to a conclusion based on the article's text.
The handwaving to make your own intellectual assertions (right or wrong) have nothing to do with my issue, in the content of what you originally stated. Good luck with whatever.
You can liken CS to something like Chemistry. Is Chemistry about test tubes and HPLC devices? No. Will you learn how test tubes work when you do a chemistry degree? Probably.
It just happens that the practical skill practiced by CS majors is coding. And coding has this unusually large applicability that wet lab work doesn't. Does that mean CS is about coding? No. But you'll probably learn more about coding doing CS than most other things.
Sure, you learn some mathematical tools that can help you solve some very practical problems, but it stops about there. I was not taught anything close to creative mathematical thinking. I was never taught the fundamentals of things: the axioms and the theorems on which the tools we "learn" are based. I can guarantee you no one in my graduating class could read a proof of even modest complexity, let alone write one.
Granted, I did go to a school with a good reputation, but one of a very practical/applied program. Still, I'm pretty convinced that math and engineering are very, very different fields, even if all engineering programs include a handful of math classes.
What's that about? Does the lecturer just state theorems without proving them, or does it mean that the students aren't required to be able to prove them?
This describes 99% of the math that most Americans (including engineering students) who are not math majors will ever encounter. Proofs may be presented as curiosities, and students may even be quizzed on them, but students would not be expected to come up with proofs they haven't seen before.
And which is why they should be moved out of the College of Engineering and placed into the college that has mathematics. Curiously enough, the only school he mentions where he can reliably know their grads have programming experience actually does place CS under mathematics.
I think being able to program basic tasks should be a requirement for all engineering programs. They all require a course in CS - mostly not because they want ME's to know CS, but because they want them to be able to program basic things.
Yet, many/most engineering grads are really poor at it. I feel that for any given engineering degree program, we should make all seniors who have declared they will graduate that semester take a basic exam on very basic concepts that all grads of that program should know (strong emphasis on the word "basic"). If it's acceptable for them not to know basic material in an introductory class, then remove the class from the curriculum!
Example: I was in a top 5 school helping a grad student in CS do basic probability homework. We were disagreeing on the correct answer, so I suggested he write a simple program to simulate it and compare with his calculated answer.
This is the simplest program possible: A for loop, and a random number generator.
He had no idea how to do it.
I'm not picking on him because he's a CS student. I think it's a valid question for people in all engineering programs that involve introductory programming courses. What's the point of making such a course a mandatory requirement if they can literally do nothing based on that knowledge?
Similarly, an electrical engineer who does not know, say, Kirchhoff's laws should not be allowed to graduate.
Yes, I do think universities are for education, but if they cannot apply what they learned, they are not educated.
As for the CS students, lectures were often on the theory, and homework was usually focused on implementing demonstrations of the theory. A grad student who wasn't a decent programmer was either a cheater in undergrad, or didn't come from our school.
This particular student came from a different engineering background. Yet, he still got admitted into a top 5 CS program.
And whether he did CS or not in his undergrad is beside the point. I think this is a task any engineering grad should be able to do. If not, the university should not be listing programming as a core skill they teach their engineering students.
BTW, he is not unique. I don't think many (perhaps 20-30%) of my fellow grads (non-CS engineers) could code something that simple either when they were graduating.
I hope you helped out as well as you could to bridge the gap, and maybe you were rightfully disappointed if the student hadn't prepared to catch up on undergrad material in his spare time.
First of all, I'm not asking them to know "advanced" random number generation theory. As a kid, using random number generators in BASIC was one of the first things we did when we learned programming - it makes writing programs a lot more fun. Granted, it's not usually covered in introductory programming courses. But all that is needed is one API call. I pretty much told him what he needed to do. I showed him the API he needs to generate random numbers. We talked in big picture terms about the algorithm (set up the experiment, run 10000 times, take the ratio of success over number of runs). We talked about the theory on why such an experiment would answer his question. I helped in every way other than the actual coding. Had he written some code, I would have helped debug (some probability experiments are tricky to code correctly - that is acceptable).
He just couldn't translate the problem to code.
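The program in question really is a handful of lines. Since the actual homework problem isn't stated here, a made-up stand-in (estimating the chance of at least one head in two fair coin flips, true value 3/4) shows the shape of it — a for loop plus a random number generator:

```python
import random

def estimate_probability(trials=10_000):
    """Monte Carlo estimate of P(at least one head in two fair coin
    flips). The question itself is a hypothetical stand-in; the
    structure of the experiment is the point: set it up, run it many
    times, take the ratio of successes over the number of runs."""
    successes = 0
    for _ in range(trials):
        # random.random() < 0.5 models one fair coin flip landing heads
        if random.random() < 0.5 or random.random() < 0.5:
            successes += 1
    return successes / trials

print(estimate_probability())  # typically close to the true value, 0.75
```

Comparing this estimate against the hand-calculated answer is exactly the sanity check being suggested to the student.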
>You are being very critical, a part of university education is tolerance, although some opinions differ on if that relates to knowing literally everything. Trade-offs are a part of engineering because optimal solutions don't always exist.
I definitely am being critical. When you graduate, you get a seal from your institution that is supposed to be a guarantee of something. My question is: What is that something? If you start dealing with math graduates from a university and you find they often cannot do basic algebra, you stop valuing the seal the university provides - which damages all math graduates of the university. They are working very hard to get a certificate that is not valued by others.
Would you trust a physics graduate who cannot do basic calculus? You can argue that he may know all the theories (conservation of energy, electromagnetics, etc), and understands the principles behind calculus (area of curves, rates of change, etc). But if they struggle with basic evaluation of integrals, you probably will wonder about what kinds of courses they taught in his university that allowed him to get a degree without actually solving problems with calculus.
This is the problem the author is talking about. If you consistently interview people with computer science degrees, who cannot do a simple fizzbuzz, you stop valuing the degree - and that is the general landscape in the software world. Those who worked hard to gain a lot of knowledge relevant to both theory and practice are right to be upset that because of lax (or at least misguided) standards from their educational institution, they have to go through more hoops to convince people of their skills.
It goes beyond the simple mantra of "Universities teach computer science, not software." I would question the technical ability of anyone who cannot write such a simple program, with as much guidance as I provided - be they a CS major or a math major. I'm not asking for practical knowledge like whether unit testing is useful, or how to architect large software, or the virtues of OO over FP. Just a basic program. Can you write it?
Certainly there is not universal agreement on the curriculum. I'm saying there should be agreement on some basics. Historically, universities handle this by ensuring some introductory course is in the curriculum. My argument is that in the US the education system is a little too modularized, and there should be more coupling. If programming is seen to be a core skill that engineers should know, make writing code an aspect of every engineering course (it's really easy to construct assignments/projects in pretty much any engineering course that could involve writing code).
Don't let someone get an A in the introductory course in his freshman year, and then allow him to forget pretty much everything he learned from it by the time he graduates. If you do, you wasted the student's time, the instructor's time, and the time of all the interviewers who cannot trust your degree and have to check for themselves if this person knows the basics.
Again, it all boils down to: The university is giving you a degree. Is this degree a guarantee of anything? If so, what? I'm all for education for the sake of gaining knowledge. But how many of us would be OK with gaining the knowledge without any kind of certification from the university? Not many, I suspect. I actually did this personally (dropped out of PhD after many years as I felt I gained the knowledge but did not feel a degree would help me), and can assure you from the conversations I had with others who tried to convince me otherwise that pretty much almost everyone wants that paper to have some value.
I suspect there is just a cultural issue, where the CS faculty doesn't want to do that style of teaching. I had a professor hand out debugging assignments because that was part of the industry feedback they'd gotten, but only the one professor listened to the feedback.
- Graduate of the North Avenue Trade School
(* Most times)
So yes, after that you can't start as a Senior Java Developer but I think you have a very good foundation to quickly pick up other Object Oriented languages.
So you do not have to teach the latest hyped language to give students the ability to program. Instead you can use clean languages to teach CS and let your students exercise them to show you that they understood what you taught them.
You were free to construct a degree out of proofs rather than source code if you wanted, but those of us who desired a more practical education could fill our days with systems programming.
This is really, really important. In my experience, this is not just unknown but actively ignored and disagreed with (either explicitly or effectively) by a large majority of people involved in software businesses. This hamstrings: companies, individuals' growth, and the advancement of the state of the art.
It is important that historic knowledge is respected for the context it provides, while outside knowledge is equally respected for the deeper context and data behind it.
Unfortunately, this change often means generalists lose jobs in favor of specialists as a company grows if there isn't a good place for them, or if they don't rise to management. In fact, even early managers can be displaced by outside managers who come in with more management experience. And this is not inherently a bad thing for the company (while it may be for the employee getting bumped). The people that take a company from point A to point B are rarely the same people that take it from point B to point C.
I think otherwise trying to be a "senior generalist" or "staff generalist" is going to be very hard, but for logical reasons.
You have to know what you're talking about and know when to defer to experts, and being a generalist might prepare you well for that. But once you've reached the threshold on knowledge and critical thinking, the biggest barrier to being a good manager is mastering the mechanics of people wrangling.
Firstly, besides understanding the work, a manager also has to be able to actually manage subordinates. This is orthogonal to technical ability.
Secondly, internal promotion of a generalist carries risk of that generalist being too opinionated based on his own work. They are much more likely to be a conservative force. This might be an unnecessary barrier to good changes.
These days I try to just be a tie-breaker for the team, instead of laying out the plan and looking for "feedback".
> Unfortunately, this change often means generalists lose jobs in favor of specialists as a company grows
> And this is not inherently a bad thing for the company
Yes, it is not _inherently_ a bad thing, but I would say that it is _generally_ a bad thing. Sure, if you are raising a Series C and everything is about optimizing the hell out of your product, there might not be a place for generalists.
However, I would claim that if you throw out generalists, you throw out some of the core/most valuable people who could help the company establish new products and help you grow into a multi-product company (diversifying your portfolio is generally a good thing). If the company is only left with specialists it will slowly transform into another one of those industry giants that struggle with innovation.
I think it is more like an organic transition where newly hired personnel are more likely to be specialists. And that is totally okay, because when you start a company there are about 0% specialists onboard ;-)
Specialists aren't as good as generalists at integrating the work of specialists across domains (almost by definition). A lack of generalists can then lead to a situation where specialists are all making locally good moves, but the overall direction is negative -- something akin to Simpson's paradox (though, in the other direction).
And I believe it's entirely possible to be both; to be a better specialist you need to be a better generalist, and vice versa.
Edit: to be clear, in my experience the specialists that aren't also generalists on some level are the least productive on the team because they have trouble participating outside their knowledge domain. Maybe I have worked at the wrong places though...
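The Simpson's-paradox effect mentioned above is easy to show with concrete numbers. A short Python sketch using the classic kidney-stone treatment data (the numbers come from the well-known Charig et al. study often used to illustrate the paradox, not from anything in this discussion):

```python
# Treatment A beats B within each case group, yet loses in the pooled
# totals, because A was applied mostly to the hard cases.
# Each entry is (successes, attempts).
groups = {
    "small stones": {"A": (81, 87), "B": (234, 270)},
    "large stones": {"A": (192, 263), "B": (55, 80)},
}

totals = {"A": [0, 0], "B": [0, 0]}
for name, arms in groups.items():
    for arm, (succ, att) in arms.items():
        totals[arm][0] += succ
        totals[arm][1] += att
        print(f"{name}, treatment {arm}: {succ}/{att} = {succ / att:.3f}")

for arm, (succ, att) in totals.items():
    print(f"overall, treatment {arm}: {succ}/{att} = {succ / att:.3f}")
```

Within each group A's success rate is higher, but B wins overall — the analogue here being specialists each making locally good moves while the aggregate direction goes the other way.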
That's the messy part of the transition: There's kind of the implication that past decisions weren't correct. It's easy for people to get emotionally involved and take personal offense.
Example: Backup product. Design started in the late 90s. By the late aughts, multiple CPUs were obviously a serious thing. A new guy was brought in and set on the project of re-architecting the core program into a more async data-processing model. 6 or 8 months later, the changes were working for base cases, but the senior devs threw an absolute hissy fit focused on how much of their code was changed. We should've banded together and fixed the edge cases. Instead, we spent years fighting over the structure of the program, and that dev transferred to another group.
Bringing in security-focused employees worked out better. Their changes were more around the edges of the various programs, rather than in their cores. Fewer toes stepped on, fewer disagreements.
Thanks for highlighting something I missed in my post. Some of the time yes, the decisions might just be wrong because generalists may not be experienced enough to know the right way to do things. But quite often, they were in fact the right decision at that time and circumstances change.
So any sort of process and culture that helps take the emotion and risk of personal offense out of those situations is really critical to gaining the benefit of historical context in a positive manner.
However, specialists don't need to be generalists to be productive, as long as they come in with the mindset that they are specialists and therefore may not know everything. And that is especially important, as I highlighted, for respecting the historical context of the company and its operations.
There might be VERY good reasons for not doing something the way the rest of the industry does it (which is what the specialist might be inclined to do). However, a good specialist should also be adept at factoring that into how they approach problems, and properly weigh the new information against their past experience to determine if this is in fact a special snowflake situation, or if their approach is still the right one. Conflict can often arise if they push for the latter, so strong communication skills are key (and that is something that never really changes for any role at any stage of company growth).
"Scale" often seems to equate with SF hotshots from one of the big names that will come in and airdrop a brand new architecture. These things take years and you need those generalists that have the tribal knowledge and the ability to jump in (almost) as deeply as a specialized developer, DBA, Network Admin, etc. I think the SRE movement is good evidence of the type of breadth and depth needed to operate a complicated system at scale.