Things Many People Find Too Obvious to Have Told You Already (threadreaderapp.com)
854 points by JoshTriplett 7 months ago | 487 comments



> Companies find it incredibly hard to reliably staff positions with hard-working generalists who operate autonomously and have high risk tolerances. This is not the modal employee, including at places which are justifiably proud of the skill/diligence/etc of their employees.

This one. Every job I see when I go looking is 40+ hrs a week, likely with plenty of idle time and projects I don't believe in. Part of this is regional... my area is packed full of marketing jobs that I'm actively uninterested in. Anyway, I'm starting to get the hang of selling myself to contract shops, and I think I've found some twinkles of hope.

But today I was told straight up that my resume wasn't conventional enough, that it was too well-rounded. After quite a few messages back and forth, I'm not even sure if they even looked at my portfolio. They told me there was no money for me, because their funders were looking for someone more professional. So I took this as a sign that maybe this wasn't the job for me. No need to go barking up the wrong tree.

In the game dev world, I'd probably be a "technical artist", but even that doesn't quite fit for me. I'm like a "fool of all trades, jack of some, maybe a master of one or two". But it's hard to find a job that motivates me and leaves me with adequate space/time/energy to keep myself sharp and alive.

If anyone knows of funding for a guy that makes creative, reusable web components, let me know... edit: I just saw that vue.js is primarily funded by Patreon. That's awesome!


It would rarely be effective to market yourself as a generalist on paper.

If you have 3 primary skill sets you should have 3 focused resumes.

If you want a position that requires a generalist, the only/best fit would be founder. In almost all other circumstances, companies hire for a problem and not general optimizations. Even if they need a generalist.

I would posit that you have a marketing problem, not a skills-mismatch problem. The noise-to-signal ratio on your resume may be making it seem not to match any position.


Thanks for your insight. It really resonates with me.

I think you're very right about my marketing problem :)

On the surface, my website is kinda out there. I built it from the ground up using Rake and some gems as a static site generator. I think it's very cool, and I don't expect that many people will appreciate it, but I'm hoping that one day someone comes along and says "wow, that's pretty cool, wanna help us with this Ruby thing?"
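(For anyone curious what "Rake as a static site generator" looks like: here's a minimal sketch of the core build step using only stdlib ERB. The names `build_site`, `src_dir`, and `out_dir` are illustrative; the actual site's setup is surely more involved.)

```ruby
require "erb"
require "fileutils"

# Minimal static-site build step: render every .erb template in a
# source directory into plain HTML in an output directory. A real
# Rakefile would wrap this in `file` tasks so only changed pages
# get rebuilt on subsequent runs.
def build_site(src_dir, out_dir, context = {})
  FileUtils.mkdir_p(out_dir)
  Dir.glob(File.join(src_dir, "*.erb")).map do |template|
    html = ERB.new(File.read(template)).result_with_hash(context)
    out  = File.join(out_dir, File.basename(template, ".erb") + ".html")
    File.write(out, html)
    out
  end
end
```

The appeal of Rake here is its Make-style dependency tracking: wrapped in a `file` task, each page only re-renders when its template changes.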

In a deeper way, this whole discussion is very helpful... it's giving me a fresh perspective on how to frame conversations with future potential employers.

And actually the other day I had an epiphany about my marketing. Sure, I chose an odd path, but I've already done so much of the work! I just need to get my random little open source tools in decent, usable shape so that people who'd find them useful might actually get a chance to use them. I need to get my half done work finished and make tutorials about it and get myself out there.

This came after I got a call from a contracting company who told me they were bidding on an interesting job that might be a good fit for me... I was so excited that I stayed up all night and finished like three old random projects at the same time, just so I could put more stuff on my portfolio. It was a wakeup call, affirming that there actually just might be some cool work out there after all... so get your shit together!


This is an interesting idea that I'll have to try.

I have the 3 primary skillset problem you describe and have a decade of "founder" experience. I thought the breadth of experience and technologies would be impressive, but I think it just confuses most employers. It seems counter-intuitive to intentionally not mention your skills on your resume, and admittedly a little ego-deflating... but this actually makes sense.


I just thought of something else: it might be a good idea to go around and make a collection of all the different job titles that broad-knowledge people (e.g. "technical artist", "full stack") are given in different industries, and find the ones that fit you. Then study the roles. That way you'll have a language for looking for jobs and will have a better idea of where you might fit in and how to have conversations about it.


Your assertion about generalist ~= founder is pretty spot on. I've spent decades working lots of different job titles that interest me (or as the needs of the organization dictated). Putting all that down on my resume hasn't helped me much.

I was heading in the direction of very focused resumes. I've also landed a low-pressure founder role, so that's nice.


A team lead position on a team that has autonomy is also a good place to be as a generalist. It's where I see myself providing the most value to my company. I've had some opportunities to move up to Director+ but blegh.


If you've worked in the different specialties at different jobs (rather than interleaving all 3), won't focused resumes have glaring temporal holes in them?


The idea is not to omit experience, but to spend more real estate on the focus topic: writing a longer description for the experience that really matches the job description, and even wording less relevant experience so that it sounds more relevant.


I think you framed your anecdotal evidence as supporting the OP's statement, but also that you misunderstood that statement; you actually present evidence against it. If companies find it hard to reliably staff generalist positions, _and_ you're a generalist as you claim, then you would find it _easy_ to get a job.


I have shipped React and React Native apps. I've built nontrivial open source infrastructure/devops tools that are used at Companies You Have Heard Of. I've done backend stuff for NASDAQ-listed companies.

I'm not a web or mobile developer. I'm not a "devops engineer", and I'm not a "backend engineer." I'm a problem-solver. But to many (most?) companies, this means that I need to be classified into a track. I need to be an X or a Y or a Z. I can't adopt those roles when necessary--I need to be slotted in and just go.

That doesn't really work for me. Which is a major reason why I'm a software consultant now. I'd go do a full-time job that respected my complete disinterest in being pegged to a particular role and was able to give me the latitude to solve cross-cutting problems at scale. But that's a hard role to define, a hard role to hire for, and a hard role to evaluate--which is why they, and other generalist-focused roles, don't often exist.

It's easy to get a job. I can walk into almost anywhere in Boston if they're hiring and get an offer; I interview extremely well. I can't get a job that actually leverages my skills.


Assembly lines beat generalists. As companies grow they can exploit this to gain increasing efficiency with minimal effort. So, either work for small companies, or be the front line manager who picks up whatever slack is missing to keep things on track.


Try consulting or entrepreneurship.

When you sell your time, you need to fit into a role that someone can comprehend and already knows they need.

When you sell your ability to deliver results, nobody asks about your skillset.


You might have noticed that I said that I am a software consultant. And the pipeline is pretty good. But it's a different thing than actually having something consistently interesting to do.

I don't enjoy entrepreneurship.


Same here. I've worked independently for a few years now and it just becomes less interesting, and for interesting problems where you'd learn something new (i.e. where you are not good yet but you know you'd get there quickly) you cannot maintain your rate.

I keep wondering if I should take the rate cut for a year or just study mathematics at the local university (which is free here) ...


Genuinely curious: how would studying maths solve your specialist/generalist dilemma?


Math is multidisciplinary like none other. You could use graph theory, for example, to study relationships in music, sociology, chemical reactions, transportation systems, markets, ecosystems, whatever. And the insight you gain in applying it from one angle will almost certainly provide a fresh light when you pivot back to a different application.

Having a math PhD could also help cut through the bs of people thinking you're not useful because you're a generalist. Suddenly, they'd be asking you to help them with weird little problems. Which is what I'm pretty good at and like to do!


I've thought about going to study math too.


Are clients hiring you as a team of one? If so, it’s easy to see you as replaceable. If you assemble a team then you get to work on more interesting and longer-term projects.

Either you fit yourself into the structure of someone else’s business, or you’re in business. If you’re in business, there’s no lack of diversity in what you spend your time doing. That’s what I mean by entrepreneurship, not “SV startup entrepreneurship”.


You know, that's a good point. When I consult through some of our locals it's very much a team-of-one sort of thing because we're probably too expensive to rope together too many people onto one project. (Not that we're actually that expensive, but false economies around expertise exist throughout tech.)

Something to think about. Thank you.


Sheesh, get over yourself. I'm not trying to be a jerk but this is such an obnoxiously braggy complaint.

Yes, companies have complex problems to solve, that require more than a few days or weeks of work at a time. If you want to just wake up every day and decide that you need to work on a totally new project, you're not going to be very useful.

Lots of people have a wide range of interests, but you pick one of the many areas that seems interesting and that you can contribute to, and you work on that for a while and try to make a difference. "Adopting a role when necessary" is not that different from slotting in to a team, since you can often change teams every few months if you want to, or even work across teams.

What you describe is a hard role to define and hire for, but it's a role that many engineers create for themselves by being both great problem solvers and willing to do what is most valuable for the company. Consider that maybe you haven't gone in with the right mindset to make it happen for yourself.


You've heard of "be a multiplier, not an adder", right? My contention is that many good developers can be multipliers across very large portions of an engineering organization when given the chance. Dysfunction does seem to be the natural state of any business, and engineering groups are no exception; technical and social resources capable of and empowered to tackle cross-cutting concerns are that chance I'm describing and provide self-evident benefits over the long term. You're right, that sometimes it happens informally. I am asserting that institutionalizing it makes teams faster, less risky, and happier.

I also happen to think that I'd be good at it, because it's kind of what I already do, and I wish it was more common because that'd be great. Heaven forfend. I mean, you can try to pull the well-actually-it's-labor-that's-bad thing by turning this into "my mindset is wrong" rather than "let's talk about executive and managerial allocation of resources and whether we do a good job of it across the board", but the thing speaks for itself.

And you're right, that post is braggy. It's also marketing: from this thread I've gotten two emails inquiring as to the state of my pipeline and if I'd have time for a chat. It can be both self-marketing and a reasonable observation of reality. (Email's in my profile.)


I have not heard of "be a multiplier, not an adder". Can you elaborate?


Be someone whose changes make a team perform 2x - 10x - Nx better, not someone who adds a constant contribution to the team.


> If companies find it hard to reliably staff generalist positions, _and_ you're a generalist as you claim, then you would find it _easy_ to get a job.

It's incredibly hard to get a job. But your versatility will ensure job security for as long as you care once you're in the door, thanks to your ability to get into the critical path of different departments' operations.

I'm a competent, high performing generalist who can work autonomously and self-sufficiently with enough confidence in my capabilities to have a high risk tolerance. In my current company I got tossed around from starting under a marketing manager, to getting shifted under the CTO after stepping on IT's toes too often, to getting pushed into a newly formed Operations group, to finally just reporting directly to the owner. While officially I manage a data management and BI team, in practice I'm the guy you escalate random problems and needs to. Creating a BI team was a byproduct of being a generalist and being the first person that needed a consolidated view of everything (and having the ability to execute on that need).

Looking for a new job is a pain. Companies don't explicitly hire for it, politics can come into play if the generalist doesn't know how to dance around the company in a way that doesn't threaten others' fiefdoms, there's no easy way to suss out competence from arrogance during an interview, no manager wants to frontload your salary on their budget when they won't be getting the full benefit of it, and a host of other things.

Generalists are valuable to a company, holistically. They're oil to the machine and fit in between all the specialized cogs to smooth things out and let you run the engine hotter without breaking down. But managers only manage their own cogs, hiring only looks for the cogs they're told to find, and at the end of the day whether there's oil in the machine is Someone Else's Problem. Very few teams exist which have the prerogative and budget to hire for that holistic need, so most companies just go without oil and don't think about it ¯\_(ツ)_/¯.


> "I'm a competent, high performing generalist who can work autonomously and self-sufficiently with enough confidence in my capabilities to have a high risk tolerance. In my current company I got tossed around from starting under a marketing manager, to getting shifted under the CTO after stepping on IT's toes too often, to getting pushed into a newly formed Operations group, to finally just reporting directly to the owner. While officially I manage a data management and BI team, in practice I'm the guy you escalate random problems and needs to. Creating a BI team was a byproduct of being a generalist and being the first person that needed a consolidated view of everything (and having the ability to execute on that need)."

Your job and experience sound very similar to my current role. I don't have an accurate job title anymore, and I very much see myself as the oil that makes the engine of the company run more smoothly. I wear about four or five different hats, all depending on what fires I need to put out. Good to meet someone doing something similar.


I've said this in a sibling thread, but most shops I've seen that don't have This Guy are worse-run, less friendly, and higher business-risk shops. (Which isn't to say that counterexamples don't exist, but I can't think of a good one in my experience. The best companies institutionalize the role.)


Part of the reason my job needs to exist is to address lack of adequate staffing levels and dysfunctional inter-departmental practices. I would say the company I work for is transitioning between one that is haphazardly run to one that is more professional, but that transition doesn't happen overnight, and you need people to pick up the slack during that transition.


Very true. From what I've observed, it's because a firm with This Guy allows that role to absorb all the odds and ends that pop up and need to be handled, while allowing specialists to focus on what they do best, maximizing their work output and general work satisfaction.

A firm without This Guy means that more small problems/needs get ignored, since they don't clearly fall into anyone's scope of work or warrant the cost of the specialist. And if something gets large enough to not ignore, a specialist will consistently have to context-shift outside of their specialty when something tangentially related pops up. This decreases their productivity considerably, decreases their satisfaction as they can't focus on what they truly enjoy doing, and increases business risk as things get overcomplicated or slip through due to a more biased view of the strategy for performing the work (based on whatever the individual's specialty is).


How did you get into a role like that? Sounds like something that would be fun.


It is fun. As for how I got into it, I got hired for a different role (test engineer), proved myself in this role but also took on extra responsibilities outside my remit (I should note that I also had worked in various different technical roles in previous jobs). An opportunity came up to be a founding member of a new Business Analyst team and I took it, but because of the structure of the company I'm in I didn't leave the IT Development team fully, so now I sit somewhere between them. In the space of a week I can be doing data analysis, business analysis, software testing, software development and user training. The hours are long, but I'm seeing improvements in the business culture gradually coming in as a result of this work, which is satisfying.


> most companies just go without oil and don't think about it

Practice-wise, it astounds me how many people/institutions expect different results from everyone else while hewing to the same practices as everyone else. Why do you think you'll be the exception? Why can't you look at the reality of what is working and what isn't?

This applies to so many areas of life, but it is extremely pronounced in business, where everyone cargo cults every little practice of AmaGooFaceSoft and believes that is what made them who they are. It's like there's an extreme deficit of original thought.


A lot of companies have "A" teams they send in when necessary. The US Digital Service was one example of this.


I think the anecdote and OP’s statement support one another - companies have a hard time reliably staffing generalists and one of the reasons for that is that they are bad at identifying them and determining how to use them.

Parent showed them a generalist skill set and they saw a lack of professionalism.


It'd be straightforward to find/get a programming job that'd more than take care of my material needs. There are a buttload of entry/intermediate level jobs out there that'd provide more money than I even know what to do with. If I wanted a job really bad, I could study up for it, put the technologies they're looking for on my resume and apply.

The hard part is giving in to a job. I think my hesitancy in this area straight up reeks to the point H.R. people/recruiters can pretty much smell it. It's just not about the money for me. If I was doing something I enjoyed and had time to work on creative projects (a.k.a. "my life's work"), I'd be fine on ~ $1000/mo (say at ~20-40 hrs/mo). But this presents difficult hiring/management challenges, the main one being the challenge of using my time effectively. When someone is always on call, management doesn't have to worry about this as much.

Honestly, I think companies could get a lot more bang for their buck if they put more money into hiring people to maintain and improve on open source projects that they use. And I know that there are jobs like this out there. I just need to keep looking for one that fits me :)


The idea of failing at "giving in to a job" resonates so much with me. Thanks for taking the time to formulate this thought into words.


You think you're too good for a job, hence, you are unemployed.


I went through my entire 20s thinking this, and I have strong opinions about it so sorry for this long post...

Back when I quit my 9-5 job in 2006 my favorite quote was by the creator of Winamp... "For me, coding is a form of self-expression. The company controls the most effective means of self-expression I have. This is unacceptable to me as an individual, therefore I must leave."

No company deserved me because I was a creative innovator, and even though working retail for $10/hr wasn't using my CS degree, I'd rather do that than make $70k getting someone else rich. I thought I deserved $250k since I was obviously going to get them rich - so taking the $10/hr job doing mindless work was less of a sell-out.

The next 8 years I tried a lot of things, and went $20k in debt. When I reached my 30s, I realized a few things:

1. You can be a mercenary and get the skills you need for your own business while working for someone else. You learn and get faster while doing tasks for your employer, so even though the code you wrote is owned by them... the code you write for your own business will be better and done faster.

2. If you're worried about getting them rich... just do what they say instead of innovating. Most companies don't innovate, they just maintain the status-quo and have you build systems that meet existing needs and save a little bit of money. You aren't making the next Snapchat at a 9-5... and if you are, you should have equity.

3. Freedom is not just about having the time to do things.... I had all the time in the world in my twenties but no money. Being broke basically put me in a prison. I couldn't go on trips with friends, couldn't take girls out, couldn't do what I wanted because I'd have no money for it. Now that I have a full-time job and work on side projects, I have the opposite problem where I have money but no time.

Overall I think having a 9-5 job works better for my lifestyle. Maybe I just needed more discipline and I could have succeeded as an Indie developer making games, marketing software, apps... and whatever other shiny thing caught my eye... but now I do that stuff on the side. I still hope I succeed as an entrepreneur and reach a point where I have complete freedom (time & money)... but now that I'm older I realize that was always a long-shot.


There's nothing wrong with making people rich in exchange for money they give you. That's how jobs work. Making someone else rich is something to be proud of and aspire to. Not as cool as making yourself rich, but still good.


I don't mind if my boss or the CEO of the company gets rich, I think they're good guys. I'm just worried they would sell out instead of keeping the company private.

Take Tumblr for example - they sold to Yahoo. Verizon bought Yahoo. If I was an engineer at Tumblr I'd be pissed that an evil monopoly like Verizon that's fighting against net neutrality is suddenly using my code.

I'm more humble now than I was in my 20s, and I don't think my code is anything special, but I still fear that the private company I work for will sell everything I built to someone I consider evil. If I have no control and no equity I'm going to act like a mercenary... I'll do the job efficiently and according to specifications, but they don't get anything extra. If they have an idea and have me build something that gets them rich that's great - but I'm not going to be like the guy who built GMail on the side at Google... I just do the job they tell me.


You're afraid the code you write for an employer might be used by an evil company. We could apply the same argument to taxes. The taxes you pay may help fund actions that you believe are morally reprehensible. To a large extent, these things are beyond your control, and by overanalyzing you're simply limiting yourself.

So what I am saying is at this point, do not worry about all these things. Keep up the side-projects, and if you do that over the long-term, you may one day find yourself in a situation where you have the freedom to dedicate your efforts to what you find meaningful.


> Take Tumblr for example - they sold to Yahoo. Verizon bought Yahoo. If I was an engineer at Tumblr I'd be pissed that an evil monopoly like Verizon that's fighting against net neutrality is suddenly using my code.

But wouldn't you be happy that all your fellow coworkers also probably got to benefit from the sale? If you or your coworkers got rich enough then you could all just focus on doing things that you think matter or better the world!


I don't think that at all. I actually admire and respect people who are able to buckle down and focus on a full time job. I also acknowledge and respect that this is not a fit for me. There's nothing wrong with me or with people who that works for, and this is my challenge. Please don't interpret my frustration as a condemnation of traditional employees/employers. I'm learning how to navigate.


Where did you get that from? Does this not seem like a really rude thing to say to you? Do you not value the well-being of others?


You think there is stigma in being unemployed.


A salaried job that only gives you 5-10hrs/week is a niche unto itself.


You're absolutely right, and not even salaried jobs, just part-time jobs:

https://tech.mn/jobs/?q=&type=1&role=0&cat=any

vs.

https://tech.mn/jobs/?q=&type=2&role=0&cat=any


Another way to see the statements as non-contradictory is: since companies cannot hire people like this, they structure themselves such that such people are not required. Then, when one does happen along, the company has no place for him.


Or her.


> Parent showed them a generalist skill set and they saw a lack of professionalism.

Yeah, that's part of the pattern. Generalists often do not appear as professional as specialists (which does not mean that everybody who looks unprofessional is a generalist).


Right. And believe me, I'm not the pinnacle of professionalism, but I have core values of respect and a motivation to do work that people find valuable. Last time I checked they are pretty well intact. I believe this person mistook specialism and accolades for professionalism.


I think there's a difference of terms. Wanting to work 20-40 hours a month on stuff you like instead of "giving into a job" sounds pretty far from what most people are going to consider "a professional" (you work on what the job needs at the present, with standard hours, that they pay for).


This explains why everyone in tech fights over space on the latest bandwagon every time no matter how astoundingly mediocre the latest thing is.


It's strange, but I think at least some companies find it hard to staff positions, while also at the same time ignoring/rejecting suitable candidates.

Case in point - I'm a generalist who can very credibly claim very significant expertise at two disconnected areas (compilers and distributed systems), plus some startup/ product management expertise. I saw a job listing here for a startup that needed both of these and claimed they're looking for remote work - so, even though I'm not actively looking for a job, I wrote them; the match between my skills and their needs seemed too good to be true.

They wrote back the standard rejection mail - didn't even try to talk to me. I thought that maybe they found someone else, but no, they still advertise the position. I know it sounds arrogant, but there's no chance in hell they can find many better candidates (I doubt that they can find even one, but well, good luck).

Are their recruiters uniquely ignorant? Unlikely. I think many companies do a lousy effort on recruiting, and then wonder why they can't fill the positions. One of the reasons "internal referrals" work so well is that the candidates are at least seriously considered.


I would bet my lunch that your scenario was a keyword mismatch. Always keep in mind that the "hiring person" at startups is usually the non-technical person. So you have to get past the skills list provided by the CTO/CIO/Lead Dev.

I have seen three approaches to limit this. The first is to stay super field-specific but implementation-general, while not being keyword-void. The second is to go super rabbit-hole on every keyword/tool/language/platform you have ever worked with inside the field.

The third approach is to have a type one resume with a type two cover letter. That approach has usually been the most consistent for callbacks in my circles, and allows for your rabbit hole to be of the same type as the job description.


You might be right (though there may be other explanations, e.g. ageism, or just wanting local employees and writing "remote" for the heck of it, "to get more leads"). But even if you're right, that just reinforces my point: I'm not desperate to get a job; I wrote to have a discussion, because they managed to pique my interest... I didn't put any kind of effort into "marketing" myself, thinking they'd figure things out in the interview(s). They were basically fortunate that their advertising hit the right target. And what did they do with the lead they got? They threw it away through a non-technical filter...


I think that’s an interesting cross section. I’ll talk to you. Email in profile.


These kind of job listings may just be used as a cheap form of advertising to show investors and (potential) stakeholders that the company is growing and - therefore - doing well.


Companies sometimes aren't honest about their job postings asking too much, and instead blame the worker population for not having the skills they need.

The more skills you are trying to get filled for one position, the less likely a candidate will match all of those skills.

Take any job posting with 20-30 lines of skills or requirements and it's really no wonder they can't find someone that meets all or even half of that.


I’ve definitely seen postings around asking for 4+ years experience in React or something equally ridiculous (React has existed for 4 years).

I think that hiring practices haven’t really caught up to pace of reality yet. Nobody does the same thing for more than 5 years anymore, even if you’ve been at a company that long what you’re doing today does not look like what you were doing then. The chance that a new hire already knows all the parts of your stack is basically zero. What you want is someone who soaks up frameworks like a sponge.


Not to mention the companies that specifically write the ads so that they can claim their person doesn't exist and bring in an H1-B holder for half the rate. The fact that these shysters ruined it for the companies that actually need good people at good pay is a major problem.


There is a massive gulf between hiring and work. There's another massive gulf between what a manager asks for and what HR or recruiters ask for.

> "If companies find it hard to reliably staff generalist positions"

The process of hiring for this is hard. Finding people like this is also hard.


There's also a gulf between what companies think they need and what they actually need. Almost everyone hires on technologies and not the abstract skillsets required to learn said technologies.


Any idea how to find a company that hires on that abstract skillset, potentially by throwing someone into cold water to test for it?

~Someone with little resume-driven experience but good grokking skills


Honestly, the way I got hired recently was by just reading job postings and cramming on the frameworks mentioned in them until I was competent enough to pass an interview/coding challenge. I’m a little peeved that it was essentially company training on my dime, and you have to count on companies not refusing to hire unless you have x years, but it did get me hired.


Big companies understand this and that's reflected in their hiring practices. They don't care what skills you have, just that you're smart.

The way they throw you into cold water is using whiteboard interviews. So if you don't like those... sorry.


No, I wish. The closest you'll get is companies that might hire on transferable skillsets (e.g., Ruby <-> Python). The best thing you can do is have a portfolio of work that shows your understanding. It's what we had to do when I was doing graphic design and illustration work.


I got found by a ~10ppl YC company and the deal fell apart over $12K.


exactly! and even if you do find them, convincing them to work for you is also hard... lots of these people don't respond to money in predictable ways.


Companies find it hard to staff generalist positions because they don't know how to interview/hire generalists. If a generalist interviews with a team that has several specialists then one of specialists can easily knock down the hire decision because the candidate isn't _as_good_as_ the specialist.


Something being hard does not imply it's expensive. There are many ways hiring certain kinds of people can be hard even with a massive number of them outside your door looking for a job.


One of Marx's theories was that people are estranged from their labour by making it a purely economic entity. To me that's exactly the reason behind the reactions in this thread (and also why I feel the same).


I'm a fellow generalist. I got lucky that the company I'm with values my work, but I can see something that isn't said out loud about generalists but appears to be true (in my experience at least)... generalists are harder to manage.

If you've got someone who is largely autonomous and can wear multiple hats, steering them in a single direction may be more tricky than a specialist that doesn't seek work outside their remit.


>But today I was told straight up that my resume wasn't conventional enough, that it was too well-rounded. After quite a few messages back and forth, I'm not even sure if they even looked at my portfolio. They told me there was no money for me, because their funders were looking for someone more professional.

Assuming your skillset is good, run, don't walk, away from those idiots.


Some resources you may find of interest:

* "Have Fun at Work" by William L. Livingston -- Livingston provides examples of how organizations are often incapable of hiring the people who might help them most.

* "The Murdering of My Years: Artists and Activists Making Ends Meet" by Mickey Z. -- This book is more in the category of seeing how bad so many creative people have it (until perhaps we get a basic income some day, Star Trek replicators for subsistence, a gift economy, and/or better government planning).

* "Drive: The Surprising Truth About What Motivates Us" by Dan Pink: https://en.wikipedia.org/wiki/Drive:_The_Surprising_Truth_Ab...

* "Punished by Rewards" by Alfie Kohn: http://www.alfiekohn.org/punished-rewards/

* "The Abolition of Work" by Bob Black (which reflects your point in another comment on "giving in to a job"): https://web.archive.org/web/20161031034600/http://whywork.or... "Only a small and diminishing fraction of work serves any useful purpose independent of the defense and reproduction of the work-system and its political and legal appendages. Twenty years ago, Paul and Percival Goodman estimated that just five percent of the work then being done -- presumably the figure, if accurate, is lower now -- would satisfy our minimal needs for food, clothing and shelter. Theirs was only an educated guess but the main point is quite clear: directly or indirectly, most work serves the unproductive purposes of commerce or social control. Right off the bat we can liberate tens of millions of salesmen, soldiers, managers, cops, stockbrokers, clergymen, bankers, lawyers, teachers, landlords, security guards, ad-men and everyone who works for them. There is a snowball effect since every time you idle some bigshot you liberate his flunkies and underlings also. Thus the economy implodes."

Contrast with notions of "full employment". Or, sadly, also most of what Silicon Valley is up to these days...

About a decade ago, I sent that last link to an Apple recruiter who contacted me -- never heard back. :-) Not that I am a "Steve Jobs" or would want to be one, but, as with Livingston's point, would Apple hire a Steve Jobs as an employee? Almost certainly not. See: http://careerfuel.net/2013/06/if-apple-wouldnt-hire-steve-jo...

Though for balance, on the value of working together with others even in difficult circumstances, see "Buddhist Economics" by E.F. Schumacher: http://www.centerforneweconomics.org/buddhist-economics "The Buddhist point of view takes the function of work to be at least threefold: to give man a chance to utilise and develop his faculties; to enable him to overcome his ego-centredness by joining with other people in a common task; and to bring forth the goods and services needed for a becoming existence. Again, the consequences that flow from this view are endless. To organise work in such a manner that it becomes meaningless, boring, stultifying, or nerve-racking for the worker would be little short of criminal; it would indicate a greater concern with goods than with people, an evil lack of compassion and a soul-destroying degree of attachment to the most primitive side of this worldly existence. Equally, to strive for leisure as an alternative to work would be considered a complete misunderstanding of one of the basic truths of human existence, namely that work and leisure are complementary parts of the same living process and cannot be separated without destroying the joy of work and the bliss of leisure."

I've collected some more ideas on making workplaces better here: https://github.com/pdfernhout/High-Performance-Organizations...

From what you describe, if you can't find a direct match to what you would like to do, you may want to consider being frugal and then earning what money you need to live in a low-stress part-time occupation with the rest of your time then available for what you really want to do. Have you ever considered, say, cleaning carpets to make ends meet? https://web.archive.org/web/20030807105050/http://www.unconv... "More than a few people agree the best career would be one which provides challenge, intellectual stimulation, and rewards for quality work. Many however, would be surprised to discover they can have all of those benefits and more in some of the unlikeliest of careers. Case in point: I'm a professional carpet cleaner. Some people think this is a second-rate career. I don't agree with them. Carpet cleaning gives me challenges, intellectual stimulation, and many other rewards. To prove this, permit me to walk you through one of my work days. ..."

If instead you focus on "selling [yourself] to contract shops", you may find this book by Aaron Erickson useful: "The Nomadic Developer: Surviving and Thriving in the World of Technology Consulting".


This is great, thank you! And yes, I'm pretty much doing this now. I live with musicians who hustle very hard. I'm a musician too. I'm familiar with the life. I'm very happy and fulfilled in so many ways, just exhausted.

I think one of my best bets right now is to lean hard into teaching, especially piano teaching. I have one committed student and a few more excited but broke students who take lessons from time to time. It's very rewarding and there is lots of mutual respect.


I'm not sure what he meant by "modal employee". Is it in the statistical sense of occurring most frequently?


Looking at it, I think he means the most common kind of employee, i.e. the statistical mode.


I thought it was a typo for "model employee", but your interpretation makes more sense.


I think that's exactly what he meant.
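For anyone unsure of the distinction being drawn here: the mode is the most frequent value, which is different from the median (middle value) and the mean (average). A quick sketch with made-up numbers, using Python's standard `statistics` module:

```python
import statistics

# hypothetical data, purely for illustration
salaries = [60, 60, 60, 75, 90, 120, 300]

print(statistics.mode(salaries))    # 60   -> the "modal" (most common) value
print(statistics.median(salaries))  # 75   -> the middle value
print(statistics.mean(salaries))    # ~109.3, dragged up by the outlier
```

So the "modal employee" is the most common kind of employee, not the middle or average one.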


Many people have the skills to do x, but if I can't rely on you to do that for me for the next year at high quality, then I don't want to take the risk.

You are not just a good frontend engineer; you are a good frontend engineer competing with other good frontend engineers. You lack professionalism (it sounds like), so you lose against those guys.

Good luck; at some point you either find the place that works for you or you learn to expand your skills. Seems like you need to improve your tolerance for downtime. It could be worse.


I don't lack professionalism because I'm unable/unwilling to provide the same level of support that full-time specialists/shops offer. I'm just not a fit for jobs where that's what's needed. Maybe that makes me less of a specialist, but it doesn't make me inherently unprofessional. And a lot of times that's not what's needed! As an artist, I'll tell you that in art worlds there are plenty of professional jobs where you do something well and walk away... There are others who are better suited to taking what you've built and integrating it over a longer term, particularly people who have a stake in the project.

Have you ever done freelance web stuff for small businesses? There are some people who want you to be their webmaster... You give them a CMS and they want you to go fix their typos. So you say "Okay, sure, but it'll be $50/hr" and then they get frustrated because they don't have someone doing that kind of work on their end. They think it's your job to "make their website", but don't want to take ownership of the thing.

This is an analogy for the kind of thing I'm talking about. I want to help solve problems, and I want to provide value for my employers, but I'm unwilling to mindlessly give up my time to people who don't value it outside of their myopic needs. A lot of times, that's what's expected of "a professional". Well, I don't want that. I'd rather be an amateur who's also a respectful and savvy business partner, who's definitely interested in helping you with your project but also keeps healthy boundaries for myself.

I'm definitely more of "an amateur" (lover of) than a professional, but I don't see why that should make me automatically "unprofessional" in my business relationships. On the contrary, I think it's fully possible to "be professional" without providing unrelenting 40+ hr/week service for a year. If that's not what you're looking for then with all due respect that's fine! But please don't blanket assume I'm "less professional".


> CS programs have, in the main, not decided that the primary path to becoming a programmer should involve doing material actual programming.

Well, that's because CS programs are designed to teach CS, not to be a vocational school for programmers.


I've been spending a lot of time lately wondering if people who treat colleges and universities as though their job is worker training are confused, or if I'm naive in believing colleges and universities exist to expand human knowledge through research and education.

I suspect the answer is that our culture is confused, we want people who are smart and well rounded to also be useful. And the ugly side of it is, we want clear class divisions. One would never refer to law school or medical school as "vocational".


> One would never refer to law school or medical school as "vocational".

Sure you would. They are 100% vocational schools. Their cost, acceptance rates, time commitment, and difficulty level may be higher than other vocational schools, but that doesn't change what they are -- 100% focused on giving you the knowledge and skills to do a specific job.


Law schools (or at least the top 20 school that I went to) are definitely not 100% focused on teaching you the vocation of being an attorney.

Hell, they aren't even focused on teaching you to pass the bar, which is why most people pay for separate cram schools.


Yet, going to law school is a pre-requisite for practicing the vocation of lawyer.

Arguably, "a school one must go to in order to practice a vocation" can be referred to as a "vocational school." I understand that vocational schools are often thought to be purely practical and not theoretical, but I think this definition does a disservice to the reality of entering certain vocations.


No, it is not a pre-requisite. At least in the U.S., several states still allow you to pass the bar and become a licensed attorney without going to law school.


That's the point - they are, but are not referred to as such.


They are in fact called "professional schools" -- this captures what both of you are saying.


They are actually called "professional" schools, but yeah it's a euphemism that means the same thing.


Well then where are all the "vocational" CS programs at universities?


Canada has ‘software engineering’ degrees which are essentially vocational CS degrees.


To add to this, Canadian Universities also have co-op programs.

You don't have to have an engineering degree to get a co-op job. The student gets paid well over minimum wage (and learns on the job) and the employers get financial incentives from the gov't to hire students. Its win-win for both the student and the employer.


Software Engineering at Canadian universities does not contain a dramatically different level of vocational training (with the exception of Waterloo).

Co-op programs are optional, and take place in the summer or in a gap year. The affiliation of companies hosting co-ops is loose, and the schools charge large fees for participating in these programs (with the exception of Waterloo).

Speaking from personal experience, results may vary.


Which Canadian schools have software engineering degrees which are vocational?

Note that my degree is from a well-reputed Canadian school that has a "software engineering specialization", and as far as I can tell (though I did not take it) that specialization is indistinguishable from the main CS curriculum, and is not valid vocational training.


Medical and Law schools are postgraduate.


In some places. In Ireland, law is a normal undergraduate degree with some extra vocational training after you graduate. Medicine is an undergraduate (5 year) programme, during which quite a lot is vocational - students spend multiple months in hospitals during that period, and when they graduate they are still considered as training.


Coding Boot camps


... which are not at universities.


And will also not prepare you for a job in computer science.


They might prepare you for an intro-level job in software development though, if you actually continue learning things and honing your skills after completing them.


For what it's worth, these bootcamps are the current "trade schools" for software jobs where the barrier to entry is lower. They're accelerated programs that use job placement as a strong selling point.

It's almost like they rose from the death of many for-profit technical schools (DeVry, WestWood et al.) and took their ecological niche. But in contrast to bootcamps, these for-profit schools were too big, slow, and ineffective.


in the business college.


Higher education is used as a proxy for intelligence by companies who are hiring because they are not allowed to use IQ tests. So I think it kind of makes sense that our society is confused about the purpose of higher education.


It's not just an IQ test, it's a test of ability, which is different. Your college record shows:

a) whether you were able to get the work done, and b) whether you were smart enough to do the work well.

It's the combination that's important, not either single one. Employers want people who show up on time, actually care enough to try, and are smart enough to figure it out.

The college degree has a better than zero correlation to that. After you get the job it's mainly about performance and output, once you've been promoted a few times your college degree mostly doesn't matter.


I agree with all of that, though spending 4 years to make that determination seems crazy to me. A competent employer can figure out which hires are worthwhile in about a month or so.

There’s another issue with using educational attainment as a proxy of intelligence: it doesn’t really help you distinguish among candidates who all have a bachelors degree. All you know is that they’re above some threshold, but not how far above. You can use the prestige of the school as a proxy for this, but it all seems like a hell of a lot of hassle when you can figure this all out in a half hour with an IQ test.


IQ tests tell you how good someone is at pattern recognition, which is a major factor in the ability to learn. They don't tell you what someone has learned, or whether they have the critically important skill of perseverance/grit. In fact, high IQ can reduce grit, since clever pattern matchers use their cleverness to avoid working hard on the toy problems of childhood.


My personal take on this is that high IQ quickly turns into a liability in childhood. Particularly when entering the school system.

I believe the words "bored to tears" summarize the experience quite well.

While the majority of pupils get a standardized curriculum designed to keep their interest at a steady pace, pupils with a high learning capacity get no such thing. If they try to learn faster, it doesn't fit with the governance and management model within which the teacher must operate. Therefore, most teachers are at a loss when faced with these statistical outliers.

Also, other pupils may experience emotions of inadequacy and unfairness when a fellow pupil just blasts through the material in minutes that would take them all week. This may lead to the high capacity pupil being a target of some unfortunate group dynamics.

Since schools have no governance model for this, the high learning capacity pupil's school experience is essentially unmanaged. At a loss, society almost invariably resorts to platitudes like "No need to feel sorry for them, because they are so fortunate to be smart. We must focus on the pupils that struggle."

A few years down the line, the pupil's inner motivation may be completely replaced by depression, self-blaming or worse. Then the platitudes take a turn for the worse with blaming the pupil's willingness to work: "In fact, high IQ can reduce grit, since clever pattern matchers use their cleverness to avoid working hard on the toy problems of childhood."

Fortunately, my own school days are long since gone. Without blaming any person in the system, I can say: "Good riddance!" to this whole pitiful affair of how society treated me as a child.

Instead, I can draw attention to this problem by saying clearly that these are children that never asked for these gifts in the first place. Let's as a society realize that what we are doing to them is absolutely wrong and woefully irresponsible.

Fortunately, at least one western country has political attention on this right now. I will work hard and with "grit" to make sure that my experience and observations can help in creating new policies. In particular I wish to address how to practically leverage the pupils' own drive to learn without incurring social/peer stigma or ridiculous costs.


This is a huge problem.

We are perfectly happy to identify athletic talent and enact systems so that the athletically talented are matched up with peers who share their gifts, but we mostly refuse to do this for intellectual talent.

I was fortunate enough to attend a program that handled this pretty well. Basically it was 120 kids from 11-18 who skipped high school and attended university together instead. It’s called the Early Entrance Program and it’s at Cal State Los Angeles:

http://www.calstatela.edu/academic/eep

The issue I face is that I am raising my children in Portland, Oregon, so this program will not be an option for them. Our personal approach is to home school instead. And I’ll probably investigate starting something like the Early Entrance Program at the state university in Portland here.


It also shows that you could do that for several years, following the rules set by someone else, and passing.


Nice double meaning with 'passing' also meaning something like: "able to appear to be what those in power want me to be."


I would not interpret it that way at all. Being able to understand requirements and implement stuff based on those requirements, or work as part of a team, or maintain dedication to a task for weeks or months at a time, has nothing to do with teenage notions of the world being populated by conformists vs. nonconformists (which ironically is usually based on very conformist notions of what it means to be a nonconformist).

On the other hand, I have known a number of people who used their self-identification as a nonconformist as an excuse for why they couldn't cut it someplace, whether it's school or a job or some interpersonal situation. It's a bad excuse because it gives the person an easy out for not reflecting on how they could have handled the situation better, or avoided getting into the situation in the first place. It also pushes all of the blame on other people, which is comforting but very rarely 100% true. We should all be our own worst critics, lest someone else beats us to it.


For better or worse, it _is_ a proxy for many things. But the one that I think is most relevant to a typical job is the ability to see something that is sometimes fun, sometimes boring, and sometimes challenging through to the end, typically staying put in one place while doing so. It's not intended to be vocational training, but in a sense it matches very well with what's expected of the typical employee: show up consistently, deal with stress and various people and tasks with an ability to see it all through, without quitting or pissing too many people off. If you can cope well and make it through undergrad, you can likely handle a vast array of jobs.


> because they are not allowed to use IQ tests

There's actual employment law that prohibits using IQ tests? I'd never heard that before


An IQ test can be used during the hiring process but the employer has to be able to defend its use as directly relevant to the work at hand. So most employers aren't willing to spend the legal fees related to defending the use of a direct IQ test... Possibly because the college degree is a good enough litmus test for IQ to satisfy their needs.


If it's difficult to prove that IQ is relevant to the job... Maybe it's not. Also, I doubt that's the reason.


It’s not difficult to demonstrate that IQ is relevant to job performance (it’s the best predictor of performance for most jobs). The problem is to demonstrate that it doesn’t have a discriminatory effect (there are large, persistent differences in IQ between races).


There's already an imbalance when it comes to different genders and races, I don't get what they would lose.

The real reason companies don't do it is because Fizz Buzz is a better test than IQ tests.
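For reference, Fizz Buzz is the classic screening exercise: print the numbers 1 through n, replacing multiples of 3 with "Fizz", multiples of 5 with "Buzz", and multiples of both with "FizzBuzz". A minimal Python version (this particular implementation is mine, not anything from the thread):

```python
def fizzbuzz(n: int) -> str:
    """Return the Fizz Buzz word for n."""
    if n % 15 == 0:      # divisible by both 3 and 5
        return "FizzBuzz"
    if n % 3 == 0:
        return "Fizz"
    if n % 5 == 0:
        return "Buzz"
    return str(n)

print(" ".join(fizzbuzz(i) for i in range(1, 16)))
# -> 1 2 Fizz 4 Buzz Fizz 7 8 Fizz Buzz 11 Fizz 13 14 FizzBuzz
```

The point being made is that even this trivial exercise filters candidates who can't write working code at all, which no IQ score tells you.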


Source on that first claim?



It's related to anti-discrimination / "psychology-test" laws. If there is any sort of "test" given as a condition of employment, it must be relevant, standardized, non-biased, reviewed by specialists and approved, etc.

http://cdn8.openculture.com/wp-content/uploads/2014/07/Test1...

The value and definition of "IQ" is widely disputed and generally seen as of limited relevance to actual work and the performance of that work (uncorrelated). Look it up and do some research if you're interested; this is just a quick illustration of _why_ such tests may be considered bad.


One day I wonder if some lawyer will strike it big by making an argument that looking at universities is discrimination because e.g. family connections and wealth give you a big leg up in getting into certain elite schools...


Yeah, but most universities engage in affirmative action, which theoretically negates this.


The linked image you shared is one I've seen before that was a Louisiana state voter literacy test used for voter suppression. It was disproportionately administered to black voters[1]

I'm not saying that IQ tests are useful measures of ability to do a job. I just wasn't aware there was actual law or precedent forbidding their use.

> If there is any sort of "test" given as a condition of employment, it must be relevant, standardized, non-biased, reviewed by specialists and approved, etc.

Wouldn't this disqualify a lot of whiteboard interviews too? Most startups don't have a standard question bank reviewed by experts or grade on a standard rubric.

1. http://www.slate.com/blogs/the_vault/2013/06/28/voting_right...


Whiteboard tests are directly relevant to the position being hired for and, thus, okay to use.


The classic "how many gas stations are there in the USA?" type of interview question seem exactly like an IQ test and not at all relevant to job requirements.



If IQ tests were a reliable measure of any job capability, surely the brilliant minds at Mensa would already have made a mint by selling recruitment services.


See this comment elsewhere in the thread: https://news.ycombinator.com/item?id=15828513


Well, part of the problem is that mensa is a total crock of shit.


Whatever it is, it's a group filled with people who score very high on IQ tests.


You may not know this but a primary purpose of Mensa is to help "smart" people find other "smart" people to socialize with. In that sense it succeeds (I'm a member).


Define "crock of shit?" It's just an organization with membership requirements, by that logic organizations such as the VFW are a "crock of shit."


For fuck's sake, cut out the middle man and just request SAT / ACT scores.


I'm Canadian, so I have never taken these tests, but I was under the impression that a large component of scoring well on them is cramming?


Not really. I think of the SAT as a great equalizer since a brilliant kid in rural Nebraska will do well and be able to go to an elite college, while a silver spoon heir will spend thousands and maybe improve their score but certainly not max it out.

Not to toot my own horn but I got a perfect score on the math part of the SAT with little prep, and I've tutored others on the same and only managed to eke out a little improvement. (In contrast, I've tutored math course work and had more success there.)

The math isn't very advanced, it's just a matter of doing it quickly and accurately.


> I think of the SAT as a great equalizer

The truth is more nuanced [1].

[1] https://www.washingtonpost.com/news/wonk/wp/2014/03/05/these...


From a psychometric perspective, they are IQ tests, in that they show the same correlation with IQ tests that IQ tests show with each other.

From a research-into-today's-hot-button-issues perspective, they are routinely slammed for being about cramming, but the effect of test preparation on test scores tends to be very close to zero when you look into it.


I did very well on both tests in high school without much cramming (I skimmed one study book from the library), and scored something like 90th percentile when I took the ACT in middle school.

Whether this is correlated to anything important in real life is an entirely separate question.



You're correct.


I did well, but I also took it with a severe migraine. Can I take it again now?


>Higher education is used as a proxy for intelligence by companies who are hiring because they are not allowed to use IQ tests.

I think you're wrong about that,

http://abcnews.go.com/US/court-oks-barring-high-iqs-cops/sto...


Your link is to a story about an employer who uses a test which is a proxy for intelligence (they even give the rough equivalency between the test they use and IQ). I think that supports my point that intelligence is one of the more important things employers are looking for. Most employers don’t have the resources to develop a hiring screening test and then certify that it is not discriminatory, so they use educational attainment instead.

Another way to approach this is to ask yourself: if not as a proxy for intelligence, then why do some jobs require a college degree?


Still wrong.

https://www.hiresuccess.com/blog/is-employment-testing-legal

>I think that supports my point

I didn't say it invalidated your point. I'm trying to draw your attention to a specific area in your statement that is wrong. You can simply say,

>Higher education is used as a proxy for intelligence by companies

That's enough. Nobody will argue with you about that.


IQ tests used for hiring do exist in the US - https://en.wikipedia.org/wiki/Wonderlic_test


Yes, and that exact test was the one that is legally fraught: https://www.wonderlic.com/resources/publications/white-paper...


Higher education seems like more of a "class marker" to me.


Or maybe the causal factor for both class and higher education is intelligence. The Bell Curve makes a compelling case that the transition away from agriculture toward knowledge work over the last 100 years has created a society in which intelligence is funneled into higher education and from there into high paying professions.


The Bell Curve is a famous case of quasi-science. You are putting statistical speculation up against centuries of well-documented explicit discrimination policies.


Have you read The Bell Curve?

You may disagree with some of its conclusions (I personally am skeptical of its conclusions about racial differences in IQ), but to call it pseudo science is, I think, unfair. FWIW, Steven Pinker, who is about as thoughtful and level headed as they come, agrees:

https://mobile.twitter.com/sapinker/status/84691284703672320...

And, anyway, the point I was making had nothing to do with racial IQ differences. It was referring to what the rest of The Bell Curve is about (discussion of racial IQ differences makes up only about 25% of the book): how the US society’s transition away from most people working in agriculture toward most people working at desks has created a class structure stratified by intelligence.

It’s actually a shame that the book discusses racial IQ differences at all, because everyone got hung up on that and totally missed what is, IMO, a much more important point. And I think it explains a lot about current politics in the US.


You seem to be rather confused about the Bell Curve.

It's my understanding that the main data presented hasn't been overturned or disproved (In-fact quite the opposite)


Meh, IQ is potential, not capability, to do a job. I don't find this argument convincing. What about all the literal geniuses who have no interest and never studied CS?

At least the graduate can show that he/she can commit to a multi-year long, sometimes boring, project successfully. A high IQ person could just leave your company after 6 months because "they're bored".


I’m not saying employers would hire solely based on IQ. That would be crazy. It’s just a screening measure. But so is educational attainment. Nobody gets hired because they have a college degree, but plenty of people don’t get hired because they lack a college degree.

Also, agree that educational attainment gives you more information about a candidate, but in terms of assessing intelligence, it is a coarser measure than IQ. I’ve hired a few people in my day, and there is huge divergence in intelligence among college grads. For lots of jobs it is useful to be able to distinguish between not-stupid, bright, and ridiculously smart. Educational attainment doesn’t really help with that, but IQ tests would.


People aren't confused. Colleges do many things.

They're also very expensive to attend, and the most immediate value they offer is to get people jobs


I've heard doctors speak of themselves as having a trade skill... albeit one that pays well.


Fascinating point. Is "training vs education" a useful pair of terms? Universities used to be for education; now it's often assumed they are for training only. (Bachelor of Science in Hotel Management I come across nowadays - in what?!)

I often hear people talk as if education was something that happened to them, then it finished. (Like learning is something you just do in an institution.) I guess training is more like that, teaching knowledge and skills for some particular task/role/job.


How is learning how to run a hotel not an education? Are all the Agriculture and Mechanical universities not education?


Well, that's the point - those things, training to do a particular job, are now thought of as 'education'.


> And the ugly side of it is, we want clear class divisions.

What? I suspect that it's typically, if not exclusively, those in the upper classes that want this. The ugly side of it is human nature is selfish.


Actually, I suspect that it's the middle class (or middle classes for cultures with more than three classes) that are most anxious about being mistaken as lower class. Actual upper class members don't usually care – they're not relying on university attendance to protect their positions.


I definitely would refer to law and medical schools as vocational, much more so than many other educational programs.

I have known several people who went to medical or law school, and every single one of them went only because it is a prerequisite to becoming a doctor or lawyer. What is your explanation - that people attend these schools to expand their minds, and then, entirely coincidentally, manage to end up in the corresponding career as though by accident?

Get real.


Ugh...I hear this sort of thing all the time and I find it sooo obnoxious.

Look, the reason that all the students go to university is to make money. Like 90%+ of undergrads go there to make money, so whether it "technically" should be a monastery of cloistered academics pondering the universe (based on some high minded notion of what learning "ought" to be for) is beside the point. University costs a shit ton of money and people (the vast majority) pay it so they can have a decent life.

And the fact that universities are, on the whole, teaching skills that do not help their students accomplish this means either A) they are actively misleading their students B) university is a class indicator more than a place to learn (I can spend huge amounts of money so I'm probably not poor and black - which sucks for all sorts of reasons) or C) they don't understand that teaching academic CS, as opposed to programming, is not what industry demands (so they suck at what they're selling).

These are the people who, quite literally, are billing themselves as the "smartest guys in the room". If that's the case they can well stop sucking at their bloody job!


There are two things wrong with this perspective.

First, if you walk out of a four year degree in computer science without a strong foundation in discrete mathematics, algorithms, data structures, and at least one of distributed systems theory, stochastic systems theory, control, statistics/machine learning, AI, data visualization, or design, you've stunted your horizons for growth as an engineer pretty badly. I use three of those every day at work. I wouldn't be able to read the research, do the math, or write the software to implement what I work on without a very strong education in fundamentals. The only CSE course I took in University that I haven't made active use of was an advanced matrix computation course. I mean, if you're just shitting out web services and REST APIs or converting COBOL to Java, fine, but that's something you can train a monkey (and in very short order, a computer) to do.

Secondly, we invest in Universities as a public good. You should come out of a four year program able to write well, critically analyze difficult material (not "Catcher in the Rye"), articulate complex ideas persuasively, and understand the world you live in and how to live in it substantially better than when you went in. You might be surprised to learn that with the benefit of hindsight, many people find the time they spent reading and learning to be incredibly useful later in life, as it provided a foundation for dealing with conflict and complexity. If that's wasted on you, then it's wasted on you, but understand that you have shot yourself in the foot in terms of your ability to mature as a human being. I personally think public University education should be substantially more rigorous and should fail people aggressively, and should also be free. I wish we separated the whole "whoo look at me I'm 18-22 let's party" thing from the thought of University. If that's where you are in life at 18, there's nothing wrong with that... go work as a code monkey or a barista and get it out of your system, and when you're ready to level up as a human being, give University a try. That's more or less what worked for me, anyway.


>You should come out of a four year program able to write well, critically analyze difficult material (not "Catcher in the Rye"), articulate complex ideas persuasively, and understand the world you live in and how to live in it substantially better than when you went in.

What a great comment. I think that's also missed by schools where the focus is on memorization and/or trivia. It would be great if they focused on the areas you talked about, not necessarily directly, but at least that is the goal.


Well, not that it matters to my argument, but I did go to university, and did well (in the academic sense - much of my time would probably have been better spent drinking and networking rather than getting good grades). Personally I think it was an incredible waste of time. Even though I worked incredibly hard in many diverse subject areas, most of what I learned was taught at such an abstract level as to be completely unusable in any practical sense. Almost all of what I needed I later learned on the job as needed. Of all the things you've listed, why can't you just look them up on the internet or hire a tutor at worst?

As far as "leveling up as a human being", books are free - go sit in a library. Hell, grab a kindle; most of the cool stuff that I learned about the human experience I learned from talking with others, staring at the sky, hiking in mountains, and reading free books that people on the internet recommend. None of this requires taking out the equivalent of a mortgage or listening to someone lecture on it. If you want a lecture, those are free online too!

I just get the sense that university is used as a filter to differentiate yourself from people who are "below you" - not as smart, not as hard working, not as "mature as a human being". But the main barrier to get into a university seems to be connections and money, neither of which I have a huge amount of respect for. Many of the wealthiest and well connected people in the world are huge assholes, and many of the poorest and least connected are incredibly wise and kind (and intelligent and hard working). It feels like a way to separate yourself from the unwashed hoi polloi and put a barrier to entry between yourself and other people who could do your job. In that sense university encourages laziness!

Overall, I just get this sense that university is this bullshit thing we all just agree to participate in because those who go get a monetary benefit, even though everyone knows it's just this bullshit thing. That monetary benefit comes from being a class filter, a networking tool for like minded rich people to find other like minded rich people, and probably a few other class effects I don't really understand. I thought when I was younger that as I got older maybe I would get more jaded and just accept bullshit things. It would certainly make my life easier to do what other people do and not fight against the tide. But if anything I've become more altruistic and romantic; I want the world to be more honest and people to do the right thing.

I don't know. I've probably benefitted from it even though I wished people judged me based on other qualities that are more important.


First, you should know that the "barrier of connections and money" you talk about does not apply everywhere. Actually, a lot of universities are free so long as you prove you are smart enough through entry exams. So while it still is used as a class filter everywhere, realize that there is equal chance in a lot of countries to get into any university.

Secondly, universities are not meant to prepare you for a better salary or job. They are meant to give you a particular set of skills and knowledge from a field of work. Finishing means that you usually get access to a certain community with the same specialty. This is useful if you want to keep up to date with your field or publish new findings.

Software companies that require computer scientists usually are just full of it and they forget that there are other universities that prepare people for software engineering or even just programming. You could argue that a course or bootcamp is enough. But we already know they ask for too much when you look at the requirements they have. They aim for the best and then take what they are given.

I think universities let people in too easily nowadays. During my father's time, there were around 10 slots for the arts university in my country each year. All of them ended up being praised artists. In fact, it makes national news when one of them passes away.

Now, universities just take everyone, especially if they pay for a slot. This caused a shift in perspective. You are no longer hard-working or lucky to get into a university; you are now expected to get into one to prove you are not a complete moron.

This is the actual problem, and it is reflected in your arguments. I agree, you probably didn't need to go to university. Hell, I know that I didn't need to go through mine. However, we were both expected to, and for no good reason.


> universities are not meant to prepare you for a better salary or job

Regardless of what any individual may think a university is or isn't meant for, this is how they are used.

As a high school dropout, I was directly told by managers in more than one situation that all I needed was a Bachelor's and I would be eligible for better positions. There was zero discussion of skills, as it was already understood that my skill level in many areas was much higher than that of the degreed people I worked with (who also weren't using their educations for anything other than as a "door pass" to be considered for better jobs).

The system is broken.


Where the heck do you work? How long did it take for you to get to that position? What was your role as a junior?


> the reason that all the students go to university is to make money. Like 90%+ of undergrads go there to make money

Which is it, "all" or "90%?" I went to a private liberal arts college so maybe that automatically puts me and my classmates in the 10% but I did not think the overwhelming majority of us were there to make money. The fact that lots of students still major in subjects that can be very worthwhile and valuable but lack obvious paths to financial success (e.g. most of Humanities and a lot of Social Sciences) indicates "making money" is not why they're there.

You think universities should basically be trade schools. They're not and they shouldn't be. Trade schools should be trade schools but the U.S. has a dearth of them and many that fit the description aren't good enough.


I really hope we see greater specialization of mid-line universities that don't know whether they want to be an academic school or a trade school. Programming-as-inquiry is extremely difficult due to the amount of effort and math required to produce novelty, and programming-as-trade is much less difficult in comparison as a lot of the systems you're composing together already have their principles worked out. You can work from the manual in the latter case, while you're defining the manual in the former.

Meanwhile a university not knowing whether it wants to be a trade school or a research school has mixed incentives that result in an inconsistent quality of education. You get professors who don't know how to teach and teachers who don't know how to go in depth. By requiring both, you've made your staffing requirements that much harder to fulfill if you don't want to trade off education quality.

On that note how have bootcamps been doing lately?


> The fact that lots of students still major in subjects that can be very worthwhile and valuable but lack obvious paths to financial success (e.g. most of Humanities and a lot of Social Sciences) indicates "making money" is not why they're there.

"Financial success" doesn't necessarily mean optimizing the total number of dollars in your income; it can also mean being able to do work that you enjoy while having a decent standard of living. If a college degree had no impact on your employment opportunities, how many students would still go?

> Trade schools should be trade schools but the U.S. has a dearth of them and many that fit the description aren't good enough.

Which is the point. It's not that universities shouldn't exist, it's that they shouldn't be trying to serve both roles.


> the reason that all the students go to university is to make money. Like 90%+ of undergrads go there to make money

Which is it, "all" or "90%?"

Ok come on man. I said 90%+. Don't quibble just to be a prick.


If you can't stick to posting civil, substantive comments, we're going to have to ban you. Nitpicking is annoying but it doesn't give license to damage the site in your own right.

Would you mind reading https://news.ycombinator.com/newsguidelines.html and https://news.ycombinator.com/newswelcome.html and abiding by the desired spirit when posting here?


The name-calling seems a little uncalled for.

I also went to a good liberal arts college and had a similar experience. People studying subjects that probably wouldn't lead to financially lucrative careers were in the majority.


> University costs a shit ton of money and people (the vast majority) pay it so they can have a decent life.

First of all, no one is forced to go to university, so if people pay that much money, it's probably because they feel it's worth it, regardless of its supposed "cloistered academicism".

Secondly, maybe the main problem is that something that is needed (or at least very convenient) to have a decent life shouldn't cost a shit ton of money at all. As is the case in countries like Germany, etc. (if your parent post is from one of those, your reply doesn't even make sense).

Finally, I would rather hire a coder with a strong CS background than one from a vocational school. The former may not know the language du jour and the framework fad of the day, but can pick them up easily because they know the fundamentals. The latter will likely adapt much more slowly.


To be fair, they do a pretty bad job teaching CS too. Your average undergrad program is almost entirely learning about CS, not learning CS. A non-trivial number of students get through a CS degree never doing anything novel in CS—a real shame because, unlike math, research-level CS problems are eminently accessible to undergrads.

It doesn't even have to be "real" research: I just expect more than classes which have you regurgitate what you learned to pass. And yet that's all you need to get a CS degree. It's a bit like learning to write by filling in blanks.


> Your average undergrad program... CS degree never doing anything novel in CS

Is this different than any other field? I was under the impression that most BS programs do not require novel research. That doesn't come until post graduate courses.


I don't necessarily mean publishable—that's doable but difficult in CS—but novel in the sense of requiring non-trivial creativity and insight within the field rather than replaying things you were taught directly.

An example might be writing a compiler for a language you design with features not covered in the book. It's probably not publishable, but it is doing CS in a way that just following instructions in a book or from lectures to write a compiler isn't.


So personal projects that extend the material you learn in courses and take 1-2 semesters to complete? I think plenty of schools already do that as part of a BA/BS graduation requirement.


You just must have had a bad CS program, my undergraduate degree involved plenty of those types of projects.


I think it doesn't come until a second degree, or sometimes a graduate degree. Not post graduate.


Something was lost in translation here. Graduate degree and postgraduate degree mean the same thing, at least in the US.


I don't think industry needs every CS graduate to produce something novel. The work of many is just building upon the work of others.


"novel = of value" in this case.

Fill-in-the-blank programs ("Here's the skeleton, make it display a list") and 3 projects with < 200 lines that aren't doing anything CS-specific are not novel, but all too common and widespread.


A lot of things of business value go like this, for example: go build this UI in React, etc. You don't need the ability to contribute anything significant back to academia to do that. You just need good coding mechanics.


Not the point: college is for CS. Not for "building on the work of others." You learn the basics that everyone needs to do the advanced work. If that isn't needed/useful then you need a vocational school instead of a CS program.

Making people do continual simple demos and nonsense programs instead of teaching them actual CS is BS. They can be simple programs, they don't need to build huge things, but what they build should reveal and teach them things beyond how to shift text around the screen.


Are there vocational programs beyond those 20-week'ish coding schools? Because those don't seem to equip people with quite enough skill.


> classes which have you regurgitate what you learned to pass

I think that's what school is, no?

I'm not trying to be glib, I think this is just a reality of how education works in pretty much all subjects (typical US academia, YMMV etc.) Can you name a major that doesn't work this way?


I had to write real proofs (tho I didn't fail completely if it wasn't free of errors) for my U.S. math undergrad degree.


Did those proofs have to be novel, or were the proofs of already known concepts?


I'm sure those proofs at the time were novel to the student but obviously not the professor who assigned the problem. That's how the student learns.


Novel to the student up to a point—math students are taught a variety of proof techniques in each of their subjects, theorems that students are required to prove generally fall to very similar techniques.


They have to be novel in my experience.

I have a BS in Mathematics from a mediocre school (University of Arizona) and exams were always "prove this fact (which you have never heard of before)", never "what is the proof of XYZ theorem (which we studied)"


There is so much room for questions that it's quicker to write the proof than to search for an answer. And a readily found answer would, for undergrad work, less likely be in proof form, or too distinct for specific topics later on. And the instructor might become suspicious and interrogate you, in which case, if the material helped you to understand, that's the point.


Nostalgia: I really love the 60s definition of CS at Stanford, which seems more inclusive of practical aspects than the theoretical bent of today:

"I consider computer science to be the art and science of exploiting automatic digital computers, and of creating the technology necessary to understand their use. It deals with such related problems as the design of better machines using known components, the design and implementation of adequate software systems for communication between man and machine, and the design and analysis of methods of representing information by abstract symbols and of processes for manipulating these symbols. Computer science must also concern itself with such theoretical subjects supporting this technology as information theory, the logic of the finitely constructable, numerical mathematical analysis, and the psychology of problem solving. Naturally, these theoretical subjects are shared by computer science with such disciplines as philosophy, mathematics, and psychology."

http://i.stanford.edu/pub/cstr/reports/cs/tr/65/26/CS-TR-65-...


And yet many job listings for programmers demand you have a CS degree…


My first boss once told me that when he looks at hiring, having a CS degree means that you know how to find solutions to problems. Being able to code can be learned on the job if needed. Most junior jobs in bigcos are usually code monkey type of work anyway.


Because for many employers "ability to program" is a minimum requirement, not the full job description.


For far more employers, "ability to program" is something they have no idea how to filter for in a pile of resumes, but "CS degree" seems like it might be a usable proxy.


>Because for many employers "ability to program" is a minimum requirement, not the full job description

I think this was the author's point. You state it is a minimum requirement, implying a CS degree is a more challenging requirement. Yet it's not hard to find people with CS degrees who do not have the minimum requirement.


That's because CS grads always want to be paired with their own and put heavy work into "planning" how to do things well rather than starting to work on them.


...yet that's not obvious to everyone, hence the inclusion.


Forgot I just had this argument a couple of months ago: https://news.ycombinator.com/item?id=15502022


It's not about that fact. The author argues that programming schools should be vocational schools (while ostensibly presenting a list of objective facts).


The author doesn't seem to be making a normative argument here. Rather he is observing that something many people believe (someone with a CS degree will be a trained programmer) isn't true.


I replied to your sibling comment here: https://news.ycombinator.com/item?id=15827977


> The author argues that programming schools should be vocational schools

I don't see where. The implication is that they are assumed to be, contrary to reality. It's supposed to be a list of obvious facts.


The full tweet, from the link: "CS programs have, in the main, not decided that the primary path to becoming a programmer should involve doing material actual programming. There are some exceptions: Waterloo, for example. This is the point where I joke "That's an exhaustive list" but not sure that a joke."

This reads to me that Patrick wishes that the joke were not true, which is corroborated by other comments of his, such as https://news.ycombinator.com/item?id=7761134#7762383

The fact that you think that the article is in any way objective is why it is so dangerous. Ideology is like water. Patrick's assumptions and biases shape how he presents his "facts" and which facts are left out. It's not objective! What he is really doing is presenting a specific market-based worldview, which has certain insights and major deficiencies.

Consider for example "There is no hidden reserve of smart people". It sounds like an actionable insight, but is it true? Is it even falsifiable? What does it mean? And what actions would it lead you to take?

My answer: it reinforces the perspective that smart people are the only people worth considering, that there are few smart people, and (in context) that these people are mostly "generalists" and entrepreneurs. It values "intelligence", probably measured in terms of short-term ability to make money, and leaves out values and self-reflection. It leaves out the possibility that you can learn from anyone (c.f. the story of the $20 fan[1]). It leaves out the possibility that your hiring process could be discriminatory or unwelcoming. And so on.

This, I argue, is a good demonstration of why programmers need liberal arts; technical decisions are never just technical decisions. We must remember to ask who wins, who loses, is that desirable, is that just?

[1] http://cs.txstate.edu/~br02/cs1428/ShortStoryForEngineers.ht...


> The author argues that programming schools should be vocational schools

I still don't see this argument being made in ANYTHING you posted now.

> The fact that you think that the article is in any way objective is why it is so dangerous.

How does this relate? I asked how you came to that conclusion, and you have gone off into other topics that you're using to characterize a perspective or ideology. I don't care about what you think or what the author thinks. I care about finding how you jumped to a conclusion based on the article's text.

The handwaving to make your own intellectual assertions (right or wrong) has nothing to do with my issue, in the context of what you originally stated. Good luck with whatever.


Yes. I often explained that CS was more like a math degree than an engineering degree.


An engineering degree won't teach you how to do things, either. I have one. Grads aren't able to build bridges, silicon chips, airplanes, or whatever else engineers are supposed to do. (Ok, I built a radio in my first term. But it was a very simple one.) They just have some intellectual tools that are useful precursors to gaining such knowledge.

You can liken CS to something like Chemistry. Is Chemistry about test tubes and HPLC devices? No. Will you learn how test tubes work when you do a chemistry degree? Probably.

It just happens that the practical skill practiced by CS majors is coding. And coding has this unusually large applicability that wet lab work doesn't. Does that mean CS is about coding? No. But you'll probably learn more about coding doing CS than most other things.


Mechanical engineering, fluid mechanics, electronics, all have a common core of mathematical modeling techniques. An engineering degree is fairly flexible, and one can even pivot into CS. But CS has no commonality with engineering, and a CS degree cannot pivot into engineering.


It wasn't that many years ago that most colleges didn't offer a true CS degree, but rather a math degree in CS. When I was applying to colleges in the early 90s I remember it being roughly 50/50.


Ironically, an engineering degree is also like a math degree.


Meh, at least my (mechanical engineering) degree wasn't.

Sure, you learn some mathematical tools that can help you solve some very practical problems, but it stops about there. I was not taught anything close to creative mathematical thinking. I was never taught the fundamentals of things: the axioms and theorems on which the tools we "learn" are based. I can guarantee you no one in my graduating class could read a proof of even modest complexity, let alone write one.

Granted, I did go to a school with a good reputation, but one of a very practical/applied program. Still, I'm pretty convinced that math and engineering are very, very different fields, even if all engineering programs include a handful of math classes.


My engineering classes (Caltech) were nearly all applied math classes.


“Applied math” isn’t very similar to what you learn in a reputable math degree, so the point stands.


No it is not. I studied math and the engineers only went as far as differential equations, basic (non proof based) complex analysis, and so on. There were zero engineers in theoretical classes.


> non proof based

What's that about? Does the lecturer just state theorems without proving them, or does it mean that the students aren't required to be able to prove them?


It means the activity of the class (lectures, homework, exams) is to compute numeric and algebraic solutions according to the methods and procedures you're taught. The outcome of a class is generally that you're able to solve a new kind of equation.

This describes 99% of the math that most Americans (including engineering students) who are not math majors will ever encounter. Proofs may be presented as curiosities, and students may even be quizzed on them, but students would not be expected to come up with proofs they haven't seen before.


Electrical engineering surely is.


An applied math degree


Having attended Waterloo, this isn't very surprising at all. The CS department is part of the Math Faculty.


I would claim there is a significant portion of the population for whom empirical teaching of CS would be far more kind and efficient than going through the Dijkstraist pure-theory route. Even Dijkstra began with writing actual programs until he grokked how to think about it, and then moved to pure theory.


>Well, that's because CS programs are designed to teach CS, not to be a vocational school for programmers.

And which is why they should be moved out of the College of Engineering and placed into the college that has mathematics. Curiously enough, the only school he mentions where he can reliably know their grads have programming experience actually does place CS under mathematics.

I think being able to program basic tasks should be a requirement for all engineering programs. They all require a course in CS - mostly not because they want ME's to know CS, but they want them to be able to program basic things.

Yet, many/most engineering grads are really poor at it. I feel that for any given engineering degree program, we should make all seniors who have declared they will graduate that semester take a basic exam on very basic concepts that all grads of that program should know (strong emphasis on the word "basic"). If it's acceptable for them not to know basic material in an introductory class, then remove the class from the curriculum!

Example: I was at a top 5 school helping a grad student in CS do basic probability homework. We were disagreeing on the correct answer, so I suggested he write a simple program to simulate it and compare with his calculated answer.

This is the simplest program possible: A for loop, and a random number generator.

He had no idea how to do it.
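For reference, the kind of one-off simulation being described really is just a for loop plus a random number generator. The actual homework problem isn't given in the story, so the dice question below is a made-up stand-in:

```python
import random

def simulate(trials=100_000):
    # Hypothetical example problem: estimate P(two fair dice sum to 7)
    # by Monte Carlo - set up the experiment, run it many times, and
    # take the ratio of successes to total runs.
    successes = 0
    for _ in range(trials):
        roll = random.randint(1, 6) + random.randint(1, 6)
        if roll == 7:
            successes += 1
    return successes / trials

print(simulate())  # hovers near the exact answer, 6/36 ≈ 0.1667
```

Comparing the printed estimate against the hand-calculated answer settles the disagreement either way, which is exactly why the suggestion was made.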

I'm not picking on him because he's a CS student. I think it's a valid question for people in all engineering programs that involve introductory programming courses. What's the point of making such a course a mandatory requirement if they can literally do nothing based on that knowledge?

Similarly, an electrical engineer who does not know, say, Kirchhoff's laws should not be allowed to graduate.

Yes, I do think universities are for education, but if they cannot apply what they learned, they are not educated.


I went to a polytechnic university (i.e. one with a "learn by doing" focus). Our engineering students definitely had to have some programming classes, but they were taught separately from the CS ones. Most engineering students considered them an annoying roadblock in the way of getting back to their "real" work, and I (a CS student) spent a fair amount of time tutoring friends. I'll have to ask some of them if they use any form of programming now, in their day jobs. I suspect I'd get a mix of answers.

As for the CS students, lectures were often on the theory, and homework was usually focused on implementing demonstrations of the theory. A grad student who wasn't a decent programmer was either a cheater in undergrad, or didn't come from our school.


>A grad student who wasn't a decent programmer was either a cheater in undergrad, or didn't come from our school.

This particular student came from a different engineering background. Yet, he still got admitted into a top 5 CS program.

And whether he did CS or not in his undergrad is beside the point. I think this is a task any engineering grad should be able to do. If not, the university should not be listing programming as a core skill they teach their engineering students.

BTW, he is not unique. I don't think many (perhaps 20-30%) of my fellow grads (non-CS engineers) could code something that simple either when they were graduating.


You are being very critical; a part of university education is tolerance, although some opinions differ on if that relates to knowing literally everything. Trade-offs are a part of engineering because optimal solutions don't always exist. I'm saying, programming often is just knowing which functions to pick, which is not an inert skill except for reading comprehension, if there is good documentation. There's nothing universal to learn there; the interesting bit is writing those functions. But a pseudo-random number generator is rather advanced stuff.

I hope you helped out as well as you could to bridge the gap, and maybe you were rightfully disappointed if the student hadn't prepared to catch up on undergrad material in his spare time.


>I'm saying, programming often is just knowing which functions to pick, which is not an inert skill except for reading comprehension, if there is a good documentation. There's nothing universal to learn there, the interesting bit is writing those functions. But a pseudo random number generator is rather advanced stuff.

First of all, I'm not asking them to know "advanced" random number generation theory. As a kid, using random number generators in BASIC was one of the first things we did when we learned programming - it makes writing programs a lot more fun. Granted, it's not usually covered in introductory programming courses. But all that is needed is one API call. I pretty much told him what he needed to do. I showed him the API he needs to generate random numbers. We talked in big picture terms about the algorithm (set up the experiment, run 10000 times, take the ratio of success over number of runs). We talked about the theory on why such an experiment would answer his question. I helped in every way other than the actual coding. Had he written some code, I would have helped debug (some probability experiments are tricky to code correctly - that is acceptable).

He just couldn't translate the problem to code.
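For what it's worth, the whole exercise fits in about a dozen lines. Here's a sketch in Python (the thread doesn't say what language the student used), with a made-up experiment, estimating the chance that two fair dice sum to 7, standing in for his actual homework problem:

```python
import random

def estimate_probability(trial, runs=10_000):
    """Monte Carlo estimate: run the experiment `runs` times and
    return the fraction of runs in which it succeeded."""
    successes = sum(1 for _ in range(runs) if trial())
    return successes / runs

def two_dice_sum_to_seven():
    # One run of the (hypothetical) experiment: roll two fair dice.
    return random.randint(1, 6) + random.randint(1, 6) == 7

print(estimate_probability(two_dice_sum_to_seven))  # close to 6/36 ≈ 0.167
```

The structure mirrors exactly what we discussed: set up the experiment, run it 10000 times, take the ratio of successes over runs. The one API call needed is `random.randint`.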

>You are being very critical, a part of university education is tolerance, although some opinions differ on if that relates to knowing literally everything. Trade-offs are a part of engineering because optimal solutions don't always exist.

I definitely am being critical. When you graduate, you get a seal from your institution that is supposed to be a guarantee of something. My question is: What is that something? If you start dealing with math graduates from a university and you find they often cannot do basic algebra, you stop valuing the seal the university provides - which damages all math graduates of the university. They are working very hard to get a certificate that is not valued by others.

Would you trust a physics graduate who cannot do basic calculus? You can argue that they may know all the theories (conservation of energy, electromagnetics, etc.) and understand the principles behind calculus (areas under curves, rates of change, etc.). But if they struggle with basic evaluation of integrals, you will probably wonder what kinds of courses their university taught that allowed them to get a degree without actually solving problems with calculus.

This is the problem the author is talking about. If you consistently interview people with computer science degrees, who cannot do a simple fizzbuzz, you stop valuing the degree - and that is the general landscape in the software world. Those who worked hard to gain a lot of knowledge relevant to both theory and practice are right to be upset that because of lax (or at least misguided) standards from their educational institution, they have to go through more hoops to convince people of their skills.

It goes beyond the simple mantra of "Universities teach computer science, not software." I would question the technical ability of anyone who cannot write such a simple program, with as much guidance as I provided - be they a CS major or a math major. I'm not asking for practical knowledge like whether unit testing is useful, or how to architect large software, or the virtues of OO over FP. Just a basic program. Can you write it?
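For reference, this is the kind of "simple fizzbuzz" interviewers mean: print the numbers 1 to N, substituting "Fizz" for multiples of 3, "Buzz" for multiples of 5, and "FizzBuzz" for both. A sketch in Python:

```python
def fizzbuzz(n):
    # Check 15 first: multiples of both 3 and 5 get the combined word.
    if n % 15 == 0:
        return "FizzBuzz"
    if n % 3 == 0:
        return "Fizz"
    if n % 5 == 0:
        return "Buzz"
    return str(n)

for i in range(1, 16):
    print(fizzbuzz(i))
```

No data structures, no algorithms, just a loop and a couple of conditionals, which is exactly why failing it says so much.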

Certainly there is not universal agreement on the curriculum. I'm saying there should be agreement on some basics. Historically, universities handle this by ensuring some introductory course is in the curriculum. My argument is that in the US the education system is a little too modularized, and there should be more coupling. If programming is seen to be a core skill that engineers should know, make writing code an aspect of every engineering course (it's really easy to construct assignments/projects in pretty much any engineering course that could involve writing code).

Don't let someone get an A in the introductory course in his freshman year, and then allow him to forget pretty much everything he learned from it by the time he graduates. If you do, you wasted the student's time, the instructor's time, and the time of all the interviewers who cannot trust your degree and have to check for themselves if this person knows the basics.

Again, it all boils down to: The university is giving you a degree. Is this degree a guarantee of anything? If so, what? I'm all for education for the sake of gaining knowledge. But how many of us would be OK with gaining the knowledge without any kind of certification from the university? Not many, I suspect. I actually did this personally (dropped out of PhD after many years as I felt I gained the knowledge but did not feel a degree would help me), and can assure you from the conversations I had with others who tried to convince me otherwise that pretty much almost everyone wants that paper to have some value.


There is a well known delineation between applied math and pure math. It seems that such a convention has yet to be established for CS.


Not sure that's an accurate comparison. We make the English majors do a ton of writing, the art majors produce a lot of art, and no one seems to have any issue assigning lists of problems to math majors as homework.

I suspect there is just a cultural issue, where the CS faculty doesn't want to do that style of teaching. I had a professor hand out debugging assignments because that was part of the industry feedback they'd gotten, but only the one professor listened to the feedback.


Sometimes* the best way to learn things is to do them, a lot.

- Graduate of the North Avenue Trade School

(* Most times)


I think there is a thin line here. For example, I was taught to program in Smalltalk. On the one hand, that is not a commonly used programming language; on the other hand, it teaches Object Oriented Programming like no other language.

So yes, after that you can't start as a Senior Java Developer but I think you have a very good foundation to quickly pick up other Object Oriented languages.

So you do not have to teach the latest hyped language to give students the ability to program. Instead you can use clean languages to teach CS and let your students exercise them to show you that they understood what you taught them.
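To illustrate why the fundamentals transfer: the core idea Smalltalk drills — send the same message to different objects and let each respond in its own way — looks essentially the same in any OO language. A minimal sketch in Python (a hypothetical classroom example, not something from the thread):

```python
import math

class Shape:
    def area(self):
        raise NotImplementedError

class Circle(Shape):
    def __init__(self, r):
        self.r = r
    def area(self):
        return math.pi * self.r ** 2

class Square(Shape):
    def __init__(self, s):
        self.s = s
    def area(self):
        return self.s ** 2

# Polymorphic dispatch: the caller sends the same "message" (area)
# to every object; each class decides how to respond.
total = sum(shape.area() for shape in [Circle(1), Square(2)])
```

A student who internalized this in Smalltalk only has to relearn syntax, not the concept, to write the equivalent Java.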


Reminder that patio11 might be saying all these things without trying to imply value. I.e., he's not necessarily saying any of these things are good or bad; the "obvious" part is just that they are true.


My CS education turned out to be “coding in a few fundamental domains” and I was very happy with it. We implemented some fundamental algorithms and components across operating systems, networks, distributed consensus, compilers/interpreters, visualization, sensing and perception, databases, machine learning, etc.

You were free to construct a degree out of proofs rather than source code if you wanted, but those of us who desired a more practical education could fill our days with systems programming.


What's the point of learning CS if you can't write a complete program? What the hell else are you going to do with it? Try to think of yet another way to write a sort algorithm?


Don't worry most CS programs aren't designed to teach CS either. They teach a random assortment of technical material that is neither terribly difficult nor terribly relevant.


Yet not having a CS degree makes getting a programming job 10x more difficult.


> People underestimate how effective a generalist can be at things which are done by specialists. People underestimate how deep specialties can run. These are simultaneously true.

This is really, really important. In my experience, this is not just unknown but actively ignored and disagreed with (either explicitly or effectively) by a large majority of people involved in software businesses. This hamstrings: companies, individuals' growth, and the advancement of the state of the art.


There's also often a messy transition period where a team goes from a bunch of generalists to more specialists. This can be extremely rocky for all involved as often generalists have lots of historic operational knowledge and strong opinions, but the new specialists come with external knowledge of best practices and how to scale.

It is important that historic knowledge is respected for the context it provides, while outside knowledge is equally respected for the deeper context and data behind it.

Unfortunately, this change often means generalists lose jobs in favor of specialists as a company grows if there isn't a good place for them, or if they don't rise to management. In fact, even early managers can be displaced by outside managers who come in with more management experience. And this is not inherently a bad thing for the company (while it may be for the employee getting bumped). The people that take a company from point A to point B are rarely the same people that take it from point B to point C.


Am I crazy to think that generalists are the engineers best suited to making the managerial move? You can keep a broad knowledge base sufficient to be useful in early directional discussions, sufficient to sniff out a lot of (not all) BS artists, and sufficient to identify which specialists to listen to or loop in — all in less time than it would take a specialist juggling things across teams, due to the nature of generalist skill.

I think otherwise trying to be a "senior generalist" or "staff generalist" is going to be very hard, but for logical reasons.


It's certainly a factor, but I'd argue that by a very long margin, people management skills are the best indicator for engineers suited to making the managerial move.

You have to know what you're talking about and know when to defer to experts, and being a generalist might prepare you well for that. But once you've reached the threshold on knowledge and critical thinking, the biggest barrier to being a good manager is mastering the mechanics of people wrangling.


Being a tech generalist is a great background for a manager. But there are some caveats.

Firstly, besides understanding the work, a manager also has to be able to actually manage subordinates. This is orthogonal to technical ability. Secondly, internal promotion of a generalist carries risk of that generalist being too opinionated based on his own work. They are much more likely to be a conservative force. This might be an unnecessary barrier to good changes.


Great comment. I found these problems in myself as I transitioned into management years ago. I was blocking good changes...

These days I try to just be a tie-breaker for the team, instead of laying out the plan and looking for "feedback".


Disclaimer: I would categorize myself as a generalist so I might be biased here

> Unfortunately, this change often means generalists lose jobs in favor of specialists as a company grows

> And this is not inherently a bad thing for the company

Yes, it is not _inherently_ a bad thing, but I would say that it is _generally_ a bad thing. Sure, if you are raising a Series C and everything is about optimizing the hell out of your product, there might not be a place for generalists.

However, I would claim that if you throw out generalists, you throw out some of the core/most valuable people who could help the company establish new products and help you grow into a multi-product company (diversifying your portfolio is generally a good thing). If the company is only left with specialists it will slowly transform into another one of those industry giants that struggle with innovation.


I am not sure it is as bad as described here. Yes, over time the relative share of generalists in a company shrinks. But it does not necessarily mean that generalists lose their jobs just because some specialist knocked at the door.

I think it is more of an organic transition where newly hired personnel are more likely to be specialists. And that is totally okay, because when you start a company there are about 0% specialists onboard ;-)


Late reply, but I think there's a second harm to throwing out generalists in addition to what you outline:

Specialists aren't as good as generalists at integrating the work of specialists across domains (almost by definition). A lack of generalists can then lead to a situation where specialists are all making locally good moves, but the overall direction is negative -- something akin to Simpson's paradox (though, in the other direction).
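The analogy to Simpson's paradox can be made concrete with the classic numeric illustration (the well-known kidney-stone treatment data, used here purely to show the arithmetic, not anything from this thread): an option can win within every subgroup yet lose in aggregate because of how the groups are weighted. Likewise, every specialist's local metric can improve while the global outcome degrades.

```python
# (successes, attempts) per subgroup — the standard kidney-stone example.
A = {"small": (81, 87), "large": (192, 263)}
B = {"small": (234, 270), "large": (55, 80)}

def rate(s, n):
    return s / n

# A beats B within each subgroup...
assert rate(*A["small"]) > rate(*B["small"])
assert rate(*A["large"]) > rate(*B["large"])

# ...yet B beats A overall, because B handled far more easy ("small") cases.
total_A = rate(sum(s for s, _ in A.values()), sum(n for _, n in A.values()))
total_B = rate(sum(s for s, _ in B.values()), sum(n for _, n in B.values()))
assert total_B > total_A
```

Without a generalist watching the aggregate, nobody notices that the weighting across subproblems, not any single subproblem's quality, is what's driving the overall result.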


I think companies need _both_. Specialists can miss the forest for the trees, and at some point you need to go into the last 20 of the 80/20 rule, which requires supplementing or specializing your generalists.

And I believe it's entirely possible to be both; to be a better specialist you need to be a better generalist, and vice versa.


>often generalists have lots of historic operational knowledge and strong opinions

*citation needed

Edit: to be clear, in my experience the specialists that aren't also generalists on some level are the least productive on the team because they have trouble participating outside their knowledge domain. Maybe I have worked at the wrong places though...


Starting dev of some product, you might have a lot of generalists. They make the original architecture and design decisions, do the first implementations of features, etc. It's their baby. Later on, the company identifies laser-focus areas where specialists are appropriate, and you contract or hire some. The original devs (the generalists) will have knowledge about why something was built a certain way, and might react strongly when the specialist advises deep changes to the program to help it scale up.

That's the messy part of the transition: There's kind of the implication that past decisions weren't correct. It's easy for people to get emotionally involved and take personal offense.

Example: Backup product. Design started in the late 90s. By the late aughts, multiple CPUs were obviously a serious thing. A new guy was brought in and set on the project of re-architecting the core program into a more asynchronous data-processing model. 6 or 8 months later, the changes were working for base cases, but the senior devs threw an absolute hissy fit focused on how much of their code was changed. We should've banded together and fixed the edge cases. Instead, we spent years fighting over the structure of the program, and that dev transferred to another group.

Bringing in security-focused employees worked out better. Their changes were more around the edges of the various programs, rather than in their cores. Fewer toes stepped on, fewer disagreements.


>"That's the messy part of the transition: There's kind of the implication that past decisions weren't correct. It's easy for people to get emotionally involved and take personal offense."

Thanks for highlighting something I missed in my post. Some of the time yes, the decisions might just be wrong because generalists may not be experienced enough to know the right way to do things. But quite often, they were in fact the right decision at that time and circumstances change.

So any sort of process and culture that helps take the emotion and risk of personal offense out of those situations is really critical to gaining the benefit of historical context in a positive manner.


To clarify, when a company is young, you have a lot of people wearing a lot of hats. They may in fact be quite good at wearing some of those hats. Likewise, having visibility and exposure to multiple domains and aspects of the same company gives a perspective that is hard to have if you live in a super specialized silo.

However specialists don't need to be generalists to be productive as long as they come in with the mindset that they are specialists, and therefore may not know everything. And that is especially important, as I highlighted, for respecting historical context of the company and its operations.

There might be VERY good reasons for not doing something the way the rest of the industry does it (which is what the specialist might be inclined to do). However, a good specialist should be adept at factoring that into how they approach problems, and properly weigh the new information against their past experience to determine if in fact this is a special snowflake situation, or if their approach is still the right one. Conflict can often arise if they push for the latter, so strong communication skills are key (and that is something that never really changes for any role at any stage of company growth).


I get where you are coming from. For me, as a long-term employee having gone through a few transitions from smallish to biggish, when the storage guy you hired doesn't really know networking basics, or the database administrator has no clue about Linux, etc., it tends to slow everyone down.

"Scale" often seems to equate with SF hotshots from one of the big names that will come in and airdrop a brand new architecture. These things take years and you need those generalists that have the tribal knowledge and the ability to jump in (almost) as deeply as a specialized developer, DBA, Network Admin, etc. I think the SRE movement is good evidence of the type of breadth and depth needed to operate a complicated system at scale.


IMO that's a problem of hiring the wrong specialists, or BS artists in specialist hats (see the article's point about "the tech industry is not serious about hiring" :) ).


In other words, we have to go to war with the army we have, not the army we'd like to have. Sometimes waiting for just the right fit isn't an option, either because of the need to keep adding heads for growth, or because you have situations like a Director (just brought in from BigDealCo to lead the growth) who has a former employee they know is "perfect" for the job. The Director leaves after a year, after installing people who don't have the skills. Catch-22 sometimes.


True. I think it's easier said than done to find the cream of the crop with a limited talent pool for many specialized areas.
