Build stuff (why.degree)
341 points by bluemooner on March 20, 2020 | 187 comments

I was a mentor for a scrappy, underfunded robotics team of about a dozen 9th-11th grade students until the season was cancelled. One of my favourite parts of the experience was pointing out the "theory" behind the "practical". "What you're dealing with is called X, and this is why it's important" was an increasingly common reply I'd give.

One of _many_ examples was when the software lead came to me excitedly and said he got basic autonomy working, but the robot wouldn't drive the same distance every time. We talked about open vs. closed loop systems, their wheel encoders, and PID loops. I'm convinced that when he goes to study any of those things in university, because of this real-world exposure to them, they will be far easier to grok.
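For anyone who hasn't met the idea: the fix for inconsistent drive distances is closing the loop - measure distance with the wheel encoders and correct the motor command from the error, typically with a PID controller. Here's a minimal sketch in Python; the "robot" is a toy simulated plant, and none of this is any team's actual code:

```python
class PID:
    """Minimal PID controller: output = Kp*e + Ki*integral(e) + Kd*de/dt."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


def drive_to(target_ticks, steps=2000, dt=0.02):
    """Closed-loop drive to a target encoder count.

    A real robot would read its wheel encoders and set motor power each
    iteration; here a toy plant (velocity proportional to commanded power)
    stands in for the drivetrain.
    """
    pid = PID(kp=0.8, ki=0.1, kd=0.05, dt=dt)
    position = 0.0  # simulated encoder reading
    for _ in range(steps):
        error = target_ticks - position
        power = pid.update(error)
        position += power * dt  # toy plant: power directly sets velocity
    return position
```

The open-loop version (run the motors for a fixed time) drifts with battery voltage and carpet friction; the closed-loop version converges on the target regardless, which is exactly the behavior the student was missing.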

In case anyone is curious, this program is called FIRST Robotics Competition [0], and it is life-changing for the students who get to participate. (I'm an alum.)

I got my first job through FIRST and was so much better prepared to solve difficult problems through what I learned there.

You probably have a local team in your area. If you're interested, you should mentor! The students' excitement is contagious and it's a fantastic experience.

[0]: https://www.firstinspires.org/robotics/frc

I am also an alum -- I started the FIRST robotics team at my high school back in the day. I was part of a community where being a doctor, a lawyer or an accountant (nothing wrong with that) was the expected norm. Having exposure through FIRST to technology, engineering and the creative innovation process led me to pursue engineering. Very grateful for FIRST and the amazing people that make it happen. Woodie Flowers, one of the icons of the program, and a professor of mine at MIT, recently passed [0].

[0] https://www.firstinspires.org/about/leadership/dr-woodie-flo...

Likewise an alum here. Wound up doing EE + CS in college and gravitating to AI, and now doing a PhD in robotics! (FIRST wasn't the only factor, but probably the earliest thing to set me on that path.)

Thanks for being a mentor! I spent years in embedded systems design and industrial automation, and now that I'm "only" doing software development, I really enjoy mentoring with FIRST. Tell your kids "Hi" from team 4027 (CC4H - 2018 Detroit Champions). We had a really strong machine this year and were sad to see the season cancelled. We'll be sharing some of our engineering notebook and a season retrospective anyway. We're also working virtually with several groups that will make our team stronger next year. The competitions are fun but it's still all about the education and experience building things.

For teams without as much funding, there's also a competition called the FIRST Tech Challenge. It's a smaller robot, which makes it far easier for a team to self-fund and get started.

I started a team with 4 other friends in high school, and we ended up placing 2nd at the World Finals (sadly, we lost in the finals...)

I'm still very proud of this robot we built: https://www.youtube.com/watch?v=aDYqt-jd0cg

As an international competitor who worked with a bunch of friends and made it to the FTC World Championship that year, I can say it was definitely worth it. @chillee, your robot was very impressive!

We did the same thing with a gyro to make the robot drive perfectly straight, and it gave us enough reliability to secure a spot as finalists at regionals.
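The usual version of this trick is a proportional correction on heading: read the gyro each loop iteration and steer against the drift. A sketch in Python, where `get_heading` and `set_motors` are hypothetical stand-ins for whatever the robot's SDK actually provides (and the sign convention is an assumption):

```python
def drive_straight(get_heading, set_motors, base_power=0.5, kp=0.02):
    """One control-loop iteration of gyro-stabilized straight driving.

    get_heading() returns degrees of drift from the initial heading
    (positive = veering right); set_motors(left, right) commands the
    drivetrain. Both are placeholders for real SDK calls.
    """
    error = get_heading()       # degrees off course
    correction = kp * error     # proportional steering term
    # Veering right -> slow the left side, speed up the right to turn back.
    set_motors(base_power - correction, base_power + correction)
    return correction
```

Called repeatedly from the autonomous loop, this cancels the small heading errors that otherwise accumulate into a visibly curved path.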

Hello from a mentor of FRC Team 2560!

Thanks for being a mentor. Building and programming the robot was my first fun experience with coding and making.

I focused on building stuff, and I've never stopped. I've worked weekends and evenings doing that throughout my career. I dropped out of college because I wasn't gaining anything from the degree; it would have just been a credential for me. It made it harder to get that first job, but since then it hasn't held me back. I have gone further in my career than the very best students in my classes who stayed and finished their degree with honors, and in some cases continued on to their masters in computer science.

Now at 35, nobody even asks about the degree anymore. So a degree is not worth that much in this industry compared to practical skills. However, I think many people might struggle to learn on their own and might lack the self-discipline or initiative to guide their own learning - in which case, get that degree! But keep in mind those same skills are required on an ongoing basis to keep up with this fast-moving industry, and you won't get as far without them. I was home-schooled, so for me it's just business as usual - that's actually been the biggest thing I got out of my schooling.

> Now at 35 nobody even asks about the degree anymore. So a degree is not worth that much in this industry compared to practical skills.

At 35, maybe.

Before that, a degree is worth a lot.

It mattered a lot to my first job, after that almost not at all. I didn't get callbacks from some very conservative companies, and that may be the reason - but those weren't great places to work anyway - so I can't say that it hurt me at all. Sample size of 1 mind you.

Very much the same here. First job is harder, but demonstrate ability and no recruiter has ever asked since.

School is essentially trash for demonstrating "can program in a business environment". If you're aiming for a research-heavy position it may help, but that's an extreme minority of jobs. And after a couple years it's useless again anyway - you're either continuing your self-education and improving, or you're stagnant, and your prior education doesn't play all that much of a role.

Same here, I did some contracting to start my career as it was less credential-dependent than FTE. After showing what I could do for a few years, nobody has ever concerned themselves with my lack of degree since. I've literally spent my whole life studying computers (my dad is a geek so I started at 4 years old), so college was meaningless. Most of my classes I either already knew the subject matter beforehand or knew more about the subject matter than the instructor, so I dropped out. Most of the information is available for free online or in a library if you're self-motivated enough to learn.

There are some people who definitely benefit from structured learning environments, so I don't ever try to dissuade people from going to college. Rather, I try to get everyone to just start building stuff and releasing it open source and contributing to open source as soon as they can. If you can learn how to search effectively, read documentation effectively, and program proficiently in at least one language you don't really need college.

I'm 27 and nobody has asked except for FANG companies (and still got offers from them despite being a college dropout).

27. Asked about my degree going for a junior role when I first started, never again since. And I've worked at every place you'd expect it to be required - Big 4, traditional insurance, traditional finance..

May help with the initial job, but to be honest - you don't really want to work for a place that gives a shit about your degree. Just your capabilities or potential.

However not everyone has that luxury, especially outside of tech-hubs. And even more so outside of NA.

My lack of degree was a small barrier to getting some early jobs, part of that was transitioning industries to an industry that can be difficult for junior people to break into (pre-iPhone mobile to games). After my 2nd game industry job at 25 no one cared about my degree.

If your goal is to get a job at FAANG or another big company early in your career you absolutely need a degree. For other jobs it will make things harder but it's not really a requirement. If you still want to work at FAANG 2 or 3 years of industry experience is worth more than a degree.

I'm 23 and have been a web developer for six years. I've never been asked for my degree. I stopped college to go work for a startup after a year of English.

Don't dance around the greater question - was the degree worth a lot or was the knowledge gained from the degree worth a lot?

It is definitely true that you can build a career without a college degree. However, I don't believe this is the main point for college. Personally, I went to college solely based on an interest in learning. I can't imagine learning astrophysics (my degree) to the same level of understanding without the community of peers and mentors colleges provide.

26, but a similar story here. Not having a degree hasn't held me back in job opportunities.

Where it has held me back is getting permanent residency in Canada. I spent the last 2 years working as a backend developer in Canada only to find out I'm not eligible for express entry PR due to lack of education.

Blue Cards in EU don't give a fuck about your degree! However you'll need 3 more years experience in your role, or have your past employers fudge what your role was to get it in EU!

All countries are moving that way in highly specialised or high demand jobs, except medicine. Even the US has dropped that requirement for sponsored jobs in recent years.

I've been told by two separate hiring managers at bigger "non-tech" companies that they need a degree to satisfy some outdated HR department requirement, and to just put something down in education, and they would pass it through without checking. They wanted me, but I wouldn't have made it through HR without a made up university degree.

Depends a lot on the business.

If the company is hiring a junior to build a CRUD app, or even Uber for XXX, a degree wouldn't affect much.

If the company is a business intelligence solution house, a degree may be just an entry ticket.

I have a very similar career story. In fact recently I've come to the realization that I never much cared for software engineering as a career, and would rather just build stuff. I'm prouder of all of my side projects than any work I've done at a job.

It's great that you've managed to reach this conclusion! Yet I assume you still need to earn a living. Have you considered finding a job where you would do something more similar to what you enjoy doing?

Yes, I have been doing things similar to what I enjoy doing for most of my career. I'd be doing more interesting or riskier things if I had financial independence (or investment with favorable terms), but I think that's true of a lot of people.

Yeah that's the dream for me as well. I want the freedom to choose what I build. I don't ever want to stop building. I don't much care for most of what I build to pay the bills.

I agree with all the intangible benefits and personally got very little out of college, which is why I took a job without finishing. That being said, early in your career a Computer Science degree is still very important. With 2 YOE I've been looking for a new job, and 2 of the FAANG + Microsoft company recruiters immediately ended consideration with that as the stated reason why. Just some competing anecdotal evidence :)

We have a nearby state university with a so-so computer science program. 2-3 graduates every year go to FAANG, a handful go to the local fortune 1000 companies, and we (a < 500 employee company) get to pick through the rest.

They all list C, C++, .NET, HTML, CSS, and Java on their resumes but haven't done anything except a simple group project in any of them, half the time not even writing any code for the project. Which means a role in documentation, testing, etc.

They come with a chip on their shoulder, flaunt their degree, and demand a salary close to what they think they should be getting, according to their advisers or statistics or whatever.

To make it worse, they don't have any personal projects to show. I tell every single person that I interview to create something, even if it's a failure. At least you can come back and discuss your experience trying to create something, and you'll learn more about development working on that project than you did the whole semester you learned Java.

It's a failure of the education system. They want to teach the academic side but ignore the skill side. They're taught the language but none of the tooling. When I was in school, which granted was a while ago now, they didn't teach you Subversion, make, how to use an IDE, how to install and use a compiler, linker, etc. I remember having to log on to a VAX machine so we could do some Fortran assignment. It was a cruel joke. There was little to no instruction on how to use the damn thing. You just had to make your way through it. I don't think much has changed. The students who are curious seek out and learn to use IDEs, Git, Maven, npm, Jenkins, whatever, but there are others who do only what's asked of them, and they're not asked to learn these things. As a result they're unable to even start on anything other than the trivial assignments they're given in class.

I think they seem entitled because they did what they were told and possibly received good grades as a result and their attitude may be them saying, "I bought an education. I was graded and found to be exceptional. If I had needed it, it would have been a part of my education". Right or wrong you should feel sorry for these kids. They paid a fortune and got ripped off. They just haven't realized it yet.

But the purpose of schools has always been to "teach the academic side." The whole system until recently has been that schools give you the foundational knowledge, and you learn the skills at work -- in olden days, corporations would actually train you on the job.

But then:

1) Employers decided that it's better to lay off people you don't need anymore, which meant that

2) Employees realized it's more lucrative to jump from employer to employer than maintain loyalty for a company, which meant that

3) Employers realized it's not productive to spend 6 months training a junior who's going to leave in 8.

Trade schools and community colleges will focus more on skills than theory, but there's an image problem among well off children... and the social environment is far different (boring by most 19yo standards).

They've already found a solution for that class image problem -- upcycling "trade school" into "bootcamp."

Now you can live in your tiny home as a minimalist and participate in a bootcamp, instead of living in a trailer park and going to community college.

> It's a failure of the education system. They want to teach the academic side but ignore the skill side.

Common misconception, but no, this is not a failure of the education system. This is exactly what universities are for. The thinking has always been that if you teach the fundamentals, then the practical applications are a foregone conclusion. Someone who is well versed in advanced theories of computer science and electrical engineering (Boolean logic, circuits, abstract computation machines and models of computing, data structures, algorithms, language design) should be able to pick up whatever FooBar FOTM programming language or paradigm or framework comes out.

If you think that the situation today is difficult with all the languages and frameworks we have just know that it was almost the exact same back then as well, except you didn't have the internet, chat rooms, and robust documentation and Q/A sites to help you answer your questions. Imagine the, excuse my frank language here, whining that would be happening right now on this forum and many others if you were just slapped with an IBM manual and told to bootstrap things yourself. I could only imagine!

Trade schools are supposed to be the institutions that teach you practical skills "to get a job". Universities are for expanding your mind and learning. I don't know why we mixed the two and then act surprised when a multi-century institution fails at modern workforce demands.

To anyone reading this: Do NOT go to a university to get a job. If your goal is just to get a job in this field, you can do that in far easier and cheaper ways (start reading some programming docs and get busy building things). You go to a university to learn. The getting-a-job part is a natural consequence of the learning you have done and are now able to do.

I don't see the people in the chemistry department saying, "We don't teach students to stir shit in test tubes, we teach fundamentals!". I get it, they're not there to teach job skills, but it's also not an excuse to completely ignore it. If you want to do gas chromatography you're going to have to learn to use the machine. You don't just say, "Hey, get some trade school person to do it," and it isn't just a "foregone conclusion" from fundamentals.

This is not an apt counter analogy for many reasons, and actually can be made to argue against the point you are trying to make. I'll give a few examples here:

- In a computer science education, you do indeed still use a computer.

- There are still proprietary techniques, substrates, solutions and materials that industry chemists use that you almost certainly don't have access to in your standard university classroom. These would be akin to the variety of frameworks, tools, and libraries that exist in the programming world, many of which are open and documented, many of which are proprietary and closed source. Hopefully, however, your university education has taught you how to learn, so you can pick these things up.

- Chemistry is not about test tubes and beakers. That's just what it looks like today. Likewise, programming by typing characters on a screen is just what it looks like today. To introduce another analogy: just as geometry is no longer about rulers and measuring pyramids, computer science is about how to formalize knowledge, and it just happens to look like semicolons, braces, and 0's and 1's today.

This x1000. Universities are not trade schools but definitely over the last couple decades have been used as such by companies because for certain fields, that's "all there was". Couple that with the "everyone must go to college" mantra of the past decade or so and you get the prevailing thought you have described.

Yeah I see people complain all the time that graduates with a "Computer Science" degree don't know "tools of the trade" like source control, debugging, etc.

Computer Science isn't about learning a trade, you're learning a science.

Yet another argument to better/more trade schools that are multi-year programs (like an Associate's degree) rather than universities.

What I see with CS/Engineering grads is those who are clumsy with practical skills are just as bad with fundamentals.

That's often been my experience as well. Some students choose the major because it is interesting to them and they're smart and motivated, others choose it because they think there's a six figure job on the other side of the door. Smart and motivated people will be good regardless of curriculum content, people doing just enough to get by will be bad regardless of curriculum content.

> if you were just slapped with an IBM manual

To be fair, manuals used to be better.

As a young lad, working at an IT consulting company I once got sent out to "fix the computer" at some cruise line. I went out with no idea what I'd encounter, and found myself faced with an IBM system I'd never even seen before. Fortunately, the service manual was with it and with its aid I was able to effect a circuit level repair and got them back up and running. ::shrugs::

The material you can dig up on the internet is worlds better than poor documentation. But a comparison with good documentation isn't always so clear.

I think it's a fundamental divide between people who see degrees as a way to expand your knowledge and people who see degrees as a fast track to getting a better job. Somewhere in the middle is the ultimate spot for an educational institution to be. For example, MIT has somewhat recently started a class on "hacker tools" (https://missing.csail.mit.edu/) with stuff like shell scripting, Vim, Git, debuggers, and general computer knowledge.

The trouble is that the tooling du jour gets out of date quickly. People who know how to use Eclipse, Java, and Subversion because that's what they used in school might not have the skills to pick up the tooling used in another company (say VS Code, JavaScript, and Git). The foundation of CS knowledge changes much more slowly.

School should encourage the curiosity to learn on one's own and provide resources to help. It should also provide a framework to be taught things that will last beyond the end of the degree. They need to do a better job of instilling curiosity, but they shouldn't go all the way to the other side and become vocational training.

> They're taught the language but none of the tooling

I want to expand on this a bit, as I think what students are being taught is the syntax of languages but none of the "how do you design a project continuously as requirements change?", which is the really difficult part of software engineering. So they might know how to write the code, but they don't know how to come up with what to actually write.

It is a co-failing of education and employers. Employers should be hiring software developers/engineers instead of computer scientists. Educational institutions should have more/stronger software engineering programs, and software developer job training programs.

Instead we get the odd equivalent of companies hiring math and physics majors and being surprised they aren't electrical engineers.

That sucks, but it's equally bad when "seniors" have no idea how stuff works outside of their little IDE bubble.

I don't think it's bad. Companies need engineers with depth experience just as much as engineers with breadth experience.

I've had students come in from other colleges (usually lower-ranked for-profit schools) that were great. They actually built useful projects and wrote non-trivial code.

Where I work we regularly get a bunch of kids from the neighborhood college who have projects but are looking for mentors, or are looking for projects/mentors. It's not at all complicated to set up.

Don't wait around for stuff to happen. Call the local university CS department head and have a chat. It's much more effective than talking to kids or expecting them to do things by themselves.

As a professor at a state university, I try to tell my students all of this.

They’ll get hired at non-tech companies no problem with just a degree. But if they want the high salary tech company positions, they have to spend considerable time practicing their skills outside of class and building a portfolio.

Our curriculum is great for exposing them to a variety of CS topics, but it won’t make them stellar engineers by itself.

There needs to be a "portfolio" movement in software development degree programs. That, and a clearer distinction between a "cs" degree and "software engineering" degree.

In, say, an architecture degree, students and professors understand that you will apply to positions with a portfolio. I think students are able to use work from the degree program in their portfolio. Their professors have portfolios from when they first applied for positions out of school, so they can help or at least provide examples.

Currently, we just have a vague notion that you should have code or contributions on GitHub, and that is for some reason not part of most degrees. Or, too many students get a CS degree and try to use it to apply for software development positions. In that case, yeah, you're going to need to self-study if you want to apply for positions you didn't actually get a degree in.

If you put serious effort into your class projects outside of what is required (https://thume.ca/ray-tracer-site/), you can certainly put that in your portfolio (that project is absolutely crazy). That said, I agree with you that more students should take "software engineering" degrees if they have trouble building things outside of class, plus that universities should make it clearer that a CS degree != a job offer from Google.

It's funny because I have made sure to emphasize my personal projects on GitHub on my resume and in interviews, but I'm pretty sure they've never looked at anything.

When I was interviewing after I graduated college last December, all we ever talked about was my projects. We would skip over school (every other candidate they interviewed had a comp-sci degree, not a differentiator) very quickly.

I even had an interview where one of my interviewers pulled up my GitHub during the interview and asked me questions about design and tooling decisions on certain projects.

From each company where I received an interview, I was told that it was because they were interested by my side projects.

Of course- it may depend on what your side projects are! Mine were of interest for the cloud based roles I was looking for- RESTful APIs, Kubernetes, AWS, Linux SysAdmin stuff, etc.

Then you applied at the wrong companies. For people with a real GitHub repo with a project or contributions, I always skip all programming tests and jump on the real code. It's a real pleasure to have an excited candidate explaining reasons for architecture or choice of tools etc. compared to fizz buzz. I can only encourage everyone to bring a small side project, it will make your interviews easier.

You can say this as much as you want, but it's an industry practice. Almost every company I interviewed with in the Bay Area never looked at any source code for projects I did. (Not even a project that had over 50,000 users - they just installed it and said, "Oh, yeah, that's sweet. Thanks! Now finish writing this sudoku checker and then do a spiral matrix traversal.")

They're using a standardized interview format and they don't care about anything else. I've interviewed with others where I did have projects up, they viewed the source code, and it went over well with them - but that was extremely rare (I can only recall 1 at the moment out of the 100+ companies I've interviewed at).

How big is your sample size? What phase of the interview are you thinking of? Context changes the approach.

If I'm doing phone screens (i.e. top of the funnel and looking at large numbers of people) I glance over GitHub projects and links quickly - about a minute per resume to look over everything (this is why formatting, styling, and ease of reading matter on resumes). If I'm doing in-person interviews (i.e. bottom of the funnel, talking to small numbers of people) I look at almost every personal project, website, and link the candidate puts on their resume with the intent of asking the candidate about them.

I've definitely had people look at my personal projects. As an example, I applied to a company doing computer vision and they sniffed out my CV projects. They definitely came up in the interview.

It's possible that they were missed because so many people list their GitHub where it contains absolutely zero work written by them (or even nothing at all).

Ironically I've found the same thing. I linked to live demos of projects on my personal web server and never found traffic referred from Github.

Depends on the place. When I was hiring at a startup, trawling a provided GitHub account was the second thing I did after scanning over the resume.

As a 3rd year CS student, I see this a lot. I personally love CS and programming is a hobby for me, so naturally I have projects on my GitHub. I always list my three personal projects I'm most proud of on my resume. I only list a language if I have proof of competence on my GitHub or in previous employment.

Some of my peers will list everything they've written a "hello world" in, and from the employers I've spoken to, a good chunk of them read a long list of languages without any proof as overstating competence. Doesn't mean there is no interview, but it hurts the chances of getting one. The interview ends up being the point where they can test how truthful the resume is.

The "simple group projects" we made during my 5-year degree were the result of 3-5 people putting in ~200 hours each (over 4 months). They were of higher quality, better designed, tested and documented than most I have encountered in the industry now 5 years later.

I would not say that I am much better technically now, than I was 5 years ago. Much of what I have learnt in the industry is about coping with working in an office, having pointless meetings, doing design by committee, etc.

To my mind there are orthogonal issues here. The much bigger one, not the lack of personal projects, is the chip on their shoulder. I've observed a certain amount of hubris in the culture of CS departments, at least in the university/college level programs I've been exposed to, that is very off putting. An ideal candidate would approach an area they are unfamiliar with, with curiosity, zeal and humility. If I can find people like that, who cares what they've built or not? I'm confident they will come up to speed and become a top level contributor.

It's likely that people who approach tech with the desirable attitude above will have built something personal they can show. But having built a personal project is neither necessary nor sufficient to demonstrate curiosity and zeal for craft.

EDIT: To be clear, I think people willing to get the degree and do the work should be paid according to market value. You come across a little bitter that the top performers went off chasing jobs at FANGs... maybe if you paid market rate more of them would stick around.

I agree that some or even most important skills in software development are learned through experience. But then the solution is a lower starting salary plus on-the-job training.

When anyone writes "C, C++, .NET, HTML, CSS and Java" on their resume, that obviously has to be considered alongside their experience. If you don't want to hire and train inexperienced programmers, I guess that's fine. But I find this idea that there is a minimum experience (and accompanying salary) threshold for the entire industry that can only be reached through personal projects of a certain size frustrating.

I agree. Honestly this thread makes me super-glad I graduated undergrad when I did: Decades ago, when you could actually get a junior position at a company without having to be Ken Thompson or Linus Torvalds. I graduated from a basic state school CSE program. We didn't have "Github profiles" and there was no expectation of having a portfolio for a junior position. You also weren't expected to crank out JIRA tickets starting day one. You could be hired on aptitude and potential only, and learn what you needed on the job.

The universities and training you get in undergrad is probably miles better today than it was back when I went to school. This does not seem like an education system failure--it seems like inflated expectations from employers for entry-level jobs. If employers are looking for mid-to-senior level employees who can be productive from day one, that's fine but they should probably not rely heavily on university recruiting.

That's something I've benefited from. I taught myself how to program when I was 14-15 by reading books and playing with basic graphics processing stuff to make games. None of my games were ever any good or impressive, but it led to me learning how to do web and network programming for fun, and led to me learning how to build stuff.

This has been really good for me, since I don't have a degree, but I still can typically get through a whiteboard interview in most jobs that I apply for (though occasionally I'll get a problem that trips me up). School is certainly useful, but it should be used in addition to personal projects, not a substitute for them.

I am a developer who started as a trainee/junior with no formal education in the field. Last year I started a degree course alongside my job, thinking I would learn more and become a better developer. This has not been the case. I get a small amount of actual programming done in a variety of languages, but a lot of the assessments are based on writing a report about how or what I developed. I can see how a software engineering degree can be useless compared to actual experience coding.

> we (a < 500 employee company) get to pick through the rest

Why do you try to recruit from a small local pool like this? You can hire from anywhere in the country!

To be fair, I've interviewed a lot of senior (10-15 YoE) candidates at FAANG who list a lot of languages they don't really have experience with.

> They come with a chip on their shoulder, flaunt their degree, and demand a salary close to what they think they should be getting, according to their advisers or statistics or whatever.


The arrogant peasants you're hiring expect to be paid? They're relying on data and expert advice to decide how much money to seek?!?

You would be wise to steer well clear of such fools.

"Please respond to the strongest plausible interpretation of what someone says, not a weaker one that's easier to criticize. Assume good faith."


That applies to you as well. Nothing in the comment came across poorly. Maybe you should assume good faith on his part.

You're right - it applies fully to me, and I'm aware of how hard that rule can be to follow, so correction is welcome.

As I read the GP comment, it accuses the OP not just of a straw man, but implies being an asshole (i.e. thinking that others are peasants who shouldn't expect to be paid). That's clearly not a fair reading of bluedino's post. Each move in such a direction is a step towards degraded discussion; hence the moderation reply.

Obviously these are matters of interpretation, though. People often read the same things differently.

> demand a salary close to what they think they should be getting, according to their advisers or statistics or whatever.

The OP's comment implies that the job applicants are being entitled when they make salary demands based on statistics. There is nothing entitled about asking for market rates.

The comment responding to it points out the problem with this thinking in a lighthearted way. To me it was the perfect response. I understand you may disagree, but it was in no way unreasonable.

My comment on the other hand could have come across better. I implied you were not following the rules you set which I don't think is fair. I may disagree with your decision, but I could have done it in a more respectful way.

I don't believe I was straw-manning the guy (never my intent), and I did very much believe his comment made him come across as - to use your word - an asshole. Rereading the portion I quoted, I find his sentiment to be just completely horrible. Zooming out to the rest of his comments here for context, the whole thing is just a sad reflection of what faces the graduate of a typical computer science program when entering the profession and interviewing with a sadly typical... well... you know.

What entry into the profession should be like for a new CS graduate, and how we fall short compared to the other professions, feels like a topic for a much longer comment. Although perhaps it's a little bit like the book "The No Asshole Rule," where one might read the title and kind of get the point, and just skip it. Or possibly one might read the title and feel vaguely threatened, and just skip it.

OTOH, I really do appreciate you pointing out that I might not have been elevating the discussion with attempted humor, and I appreciate your work in general.

You'd do better to post a comment giving your own perspective rather than flaming someone else for theirs, which you may not have evaluated accurately. (I think you misread the GP, and speaking generally, we're more likely to misread other people on topics we feel strongly about.) Since it sounds like you have experience with the topic, sharing your experience and stating the conclusions it's led you to would be a substantive contribution.

In addition, sharing experience is at lower risk of misunderstanding and less likely to descend into flamewar. There's no contradiction between experiences. A hiring manager in (let's say) one state has their experience, and a new college grad in (let's say) another state has theirs. Both are real because they happened. If we report from that level, we can't contradict each other.

It's when we turn our experience into abstractions (e.g. the abstraction "new CS grads") that trouble arises. Abstraction A appears to contradict abstraction B. Both are lossy conceptualizations of a more complicated reality, and it's the lossiness that makes it seem like we're at odds with each other.

> A hiring manager in (let's say) one state has their experience, and a new college grad in (let's say) another state has theirs.

That's interesting, and a fair point. The differences I'm admittedly fixated on are the differences between software development as a profession versus engineering, medicine, or law, for example - or even the creative arts or science, not just the licensed professions! And I really do think it's a bit much to unpack adequately in the comments here, though I have seen a few good ones that touch on some things I think are important.

> the abstraction "new CS grads"

No abstraction intended, you can read that as "members of the set of students who recently graduated from CS programs in the United States and will now look for jobs or continue further in academia" if you like.

I would encourage them, if any were nutty enough to ask me for advice, to be enthusiastic about their future contribution to what is (well... should be) an amazing field, to be proud of their education, and to negotiate as well as possible with their first employer out of college, since that will help set the trajectory of their future career.

And to avoid people who misunderstand them by saying things like:

> They come with a chip on their shoulder, flaunt their degree, and demand a salary close to what they think they should be getting, according to their advisers or statistics or whatever.

I despised the academic parts of school. The work was hilariously irrelevant, the requirements totally arbitrary, and the expectations set so ridiculously high that even though I've been in the workforce for a few years now, I still have anxieties and insecurities about my engineering ability that trace back to school being set up such that you're expected to fail all the time yet still somehow graduate. Often, it felt like school was actively working against me in my drive to become a better engineer.

I quickly learned to give my engineering side-projects a higher priority than schoolwork. Schoolwork could always expand infinitely to fill as much time as it wanted, and school was a hideous monster that would swallow every ounce of effort I put into it, take it all for granted, and only greedily demand more of me. If I'd played along with that, I would've had to give up the drive to build things that made me want to pursue engineering as a career in the first place.

I had an industry internship every summer while I was in college, and I found an industry job - which paid more than my program's average undergrad starting salary - the afternoon after my last final exam. I say this because I want to emphasize that even by the standards of my school's career department I have """succeeded""" - yet every single one of those positions made absolutely no use of the things I was forced to do or learn to get my degree, and extensive use of the skills I learned by ignoring schoolwork and focusing on personal projects.

I can't say with certainty that school did absolutely nothing for me: my current manager says that even though he didn't look at my academics at all, he likely never would've seen my resume if it didn't have a degree on it. But while I can count on one hand the number of times something at work has been remotely relevant to my schoolwork, the number of panic attacks and depressive spirals caused by anxieties rooted in my experience in the academic environment is beyond counting.

People don't expect aspiring accountants to spend their weekends managing their personal finances or imaginary companies in QuickBooks or (shudder) Oracle. If they've got a personal budget spreadsheet, and have done a few studies in school following tutorials with sample data in a few software packages, but frequently refer to the docs and don't know the keyboard shortcuts, that's great for an accountant. You teach an accountant the tools your company uses to apply the basic principles they learned in school.

They haven't spent a whole semester learning Java; they've spent a semester learning about Java. By writing a group project in each of those tools, they prove not that they are ready to start knocking out your Jira tickets on Day 1 - they're not, as your experience and their resume prove - but that they are capable of working on unfamiliar software projects in a group, which is what they should be doing in your company. Take a student who knows how the fundamental function your software performs can be built, and teach them the unique tools your company uses to do that. It only takes a couple thousand hours for someone who knows how to learn to develop good proficiency in your particular stack, and there are so many stacks out there that you can't expect everyone to have invested that in yours.

>People don't expect aspiring accountants to spend their weekends managing their personal finances or imaginary companies in QuickBooks

I do understand (and sympathize) why many job candidates disagree with the importance of a "personal portfolio" and so we get repeated frustrations of comparing it to accountants, lawyers, pharmacists, etc. E.g. if pharmacists don't dispense drugs on the weekend "for personal fun", why do we "expect" programmers to do the same?

The way to make better sense of it is to not frame it in terms of "expectation" of a job seeker but of "identification and preference" of employers choosing one job seeker over another. Whether the preference itself is flawed is irrelevant. The point is that it's human nature to use a preference as a tie breaker.

Programming, much more so than accounting/law, is an activity that many more people like to do for personal "fun". Because of this, it's inevitable that some employers tend to want to identify them and prioritize them over others.

And hiring preferences are not unique to computer programmers. Here's engineer Ben Krasnow (of popular channel Applied Science) explaining how a "personal portfolio" helps him stand out and get good jobs:

- deep link and watch for ~30 seconds: https://youtu.be/ihbYtxaEDSk?t=248

So, if you're an electrical engineer getting angry at the world because employers shouldn't "expect" you to design circuits for fun on the weekend, don't be surprised if they hire Ben instead of you.

Edit to add another Ben K video covering the same topic but mentioning Valve & GoogleX and choosing between prospective candidates:

- deep link: https://youtu.be/4RuT2TlhbU8?t=112

> Programming, much more so than accounting/law, is an activity that many more people like to do for personal "fun". Because of this, it's inevitable that some employers tend to want to identify them and prioritize them over others.

The fact that I do programming at the weekend for fun on my personal projects does not in the slightest imply that I would have fun with the kind of project I have to do for an employer. So judging by this criterion is a really bad idea for the (potential) employer.

> The fact that I do programming at the weekend for fun on my personal projects does not in the slightest imply that I would have fun with the kind of project I have to do for an employer. So judging by this criterion is a really bad idea for the (potential) employer.

I am not sure that was the argument being made. I think it was more like "If a candidate actively uses and develops their skill set in their own time, then they will be preferred over candidates that don't". Those candidates demonstrate a higher level of mastery of the needed skills by virtue of doing something, _anything_, over other candidates who don't, even if the demonstration is trivial. It can also signal a lower managerial load if the candidate has demonstrated the ability to self-start and self-motivate. That can also mean less ramp-up time to being productive.

> develops their skill set in their own time

I don't see working on personal projects as doing personal development. What I do there is riddled with bugs, has no unit tests or documentation, and is done with no sense of project or time management.

Don't get me wrong: if a personal project to you is production code, well, then to me that is a hobby, or maybe you are doing it to pad your resume, in which case, sure, it does seem like a good signal.

Overall, though, you have to have unproductive fun, and that's what personal projects are for me. This notion that anything tied to code equals resume is really insidious. Like, when do we ever get away from work? Hustle culture seems to have made its way into white-collar jobs. I need to tune out.

I mean, I had a side project I don't maintain anymore. I did learn some things from it, but I don't think it made me much better than what I learn normally on the job and from reading.

I am coding the majority of my working time. A few more hours at home do zero for my learning.

Also, people who put a lot into work during the work day don't have projects outside of it, because they are literally tired from work. If they weren't, they would be working more intensively at their jobs. If one has a slow, boring month, then it is easy to crank out some more. But in a normal week, when you work with full focus, you get tired.

You need to separate two aspects of this argument.

1) Recent grads should not feel entitled to pay, respect, etc. etc. just because they have a degree.

2) Someone does not need to have a wide portfolio of projects to be a good developer, but those will probably make them a better developer.

The best junior developers will be the kinds of people who build constantly, but you don't need to be that to be a good, competent programmer.

The real-life state of things is that most students don't have personal portfolios. They get hired by companies that do not require them, and those companies work with them as juniors.

They are also likely getting a salary in line with what their advisers or the statistics tell them.

Also, the real-life state of things is that for non-juniors, most companies don't actually care about side projects, and the overwhelming majority of developers don't have them. It is a meme on Hacker News that side projects matter, but I think there is some bias here.

I don't know. It shouldn't take very long to spin up a simple personal project. Knowing the exact same tech stack might be hard, but not being able to show a single programming project isn't great.

Also, if I were hiring a beginner accountant I would greatly prefer one who had set up imaginary companies in QuickBooks, and it isn't hard to do so.

Actually building things demonstrates that you might actually build something in my company. It's just competition though. If you have two guys, one who has been in CS education for 5+ years but somehow has no personal projects in that time, versus a guy with no education and a few interesting personal projects, I know who I would pick. I'd also be kind of shocked that in 5 years of learning about a topic you never built anything interesting.

My point is not that accountants (or anyone in any career field) couldn't do that, it's that they don't, aren't expected to, and yet your company probably hires many of them. They learn on the job, and your accounting department continues to function.

It functions despite the fact that your accountants didn't have a portfolio of personal projects when you hired them and don't do accounting for fun in the evening. To my knowledge, it's substantially only programmers who are expected to do so.

I don't think comparing CS with Accounting grads works. Accounting is more like law school, where you're studying for a specific qualification, like the bar exam.

Accounting students are studying for a CPA, a similarly regulated exam that tests your knowledge of the rules and how it can be applied to specific cases.

By passing the exam, you're licensed to practice law or start doing tax/audit/advisory for companies or individuals.

There's no such analogue for CS grads, because CS programs aren't focused on training you for a specific credential.

But you would always take an accountant who had run the books for Dad's taxi cab gig or the neighborhood handyman, over an accountant who had just sat in class for a couple of years.

Plenty of 'creative' jobs often ask to see some work samples or a 'portfolio' - hard to get hired as a wedding photographer if you can't show prospective employers a single photo you've taken.

I don't know if it's just or wise - but it's not unique to programming.

No-one gets into accounting at the age of 10 because they find it fun (as far as I know). But if someone did they would surely be at a major advantage in the job market.

I think it depends on how you view software development as a profession.

If you group it more closely to creative fields like design or writing—both fields where portfolio reviews have been standard hiring practice for decades—then it makes sense to expect to see projects. If you view it closer to a professional services role—accounting, legal, etc—then your comparison makes sense.

I'm not advocating for either view as the "correct" opinion either, I just think the disagreement has less to do with what qualifies a junior and more to do with how we conceptualize development as a profession.

> If you group it more closely to creative fields like design or writing—both fields where portfolio reviews have been standard hiring practice for decades—then it makes sense to expect to see projects. If you view it closer to a professional services role—accounting, legal, etc—then your comparison makes sense.

> I'm not advocating for either view as the "correct" opinion either, I just think the disagreement has less to do with what qualifies a junior and more to do with how we conceptualize development as a profession.

I believe that there exists a good criterion to judge: If the employer considers programmers to be "fungible", programming needs to be viewed as professional services role. On the other hand, if you are considered and catered for like a "creative rockstar" by the employer (because the employer will have a hard time replacing you), it should be considered a creative profession.

Judge for yourself what is closer to the truth. :-)

That’s a poor analogy. Accountancy isn’t creative (in fact, creative accounting is frowned upon). Accountants don’t need to develop a taste for ‘good accounting’.

Coding is writing, and choosing what to write about, and picking the language to use to write about it. It is an unending series of creative choices.

If someone has no exposure to the process of making those choices, they are not ready to code.

By your definition accounting is budgeting, and choosing what to budget about, and picking the tools to budget with. It operates within constraints, as programming does, and those constraints likely vary from company to company (inverted compared to programming, perhaps---it seems like a larger company may have more scenarios to run, a smaller company fewer, and thus there may be more room for the kind of creativity you describe at a larger company). Even within a tool you can make a million stylistic choices on how to present the information around your budgets and payments to enable the rest of your organization to make decisions.

Programming itself is also not limited to developers. That should be self-evident in that spreadsheets are the go-to tool for so many non programmers, and they are at heart accessible programs. In that sense, it is programming that is a creative endeavor, not development or coding.

Last but not least, the most creative (and valuable) part of programming is arguably problem definition and problem solving. I would argue these two things are shared, in one way or another, across most jobs and careers.

I don't necessarily disagree with this - I've often felt like the criteria used to make software development decisions should be more empirical (YAGNI etc, schemas, contracts - not the whole formal specification bucket, but formal-"ish") and constraint based

However, it seems to me - as a layman with zero accounting experience beyond Wikipedia and GnuCash - you could feasibly drop an accountant into any arbitrary business and they would eventually find their way around the cash flows, balances, and - given sufficient information - build up a general profit/loss and general idea of financial health.

It doesn't seem to me, however, that you could drop any software engineer into any arbitrary company and expect the same. You should be able to, but not right now. Some software engineers can do that. Not all of them.

I suppose my point being: a degree from a chartered accountant indicates a core level of competency to me. But a software engineering degree doesn't. This is why we interview them with whiteboards / fizz buzz / how many windows can you wash / what is the ideal shape of a manhole cover.

Because you can get a top class degree in S.E. / C.S. and still not be able to handle a JVM OOM error.
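As a toy illustration of what "handling a JVM OOM error" can mean (hypothetical, not from any real codebase): the textbook cause is an unbounded, never-evicted cache. The sketch below fills the heap on purpose, catches the Error at the allocation site, and drops the leak so the JVM can recover; in a real diagnosis you would instead run with -XX:+HeapDumpOnOutOfMemoryError and inspect the heap dump in a tool such as Eclipse MAT.

```java
import java.util.ArrayList;
import java.util.List;

// LeakDemo is an illustrative name. The unbounded list below plays the
// role of a cache with no eviction policy -- the classic OOM culprit.
public class LeakDemo {
    static int allocateUntilExhausted() {
        List<byte[]> cache = new ArrayList<>();
        int blocks = 0;
        try {
            while (true) {
                cache.add(new byte[8_000_000]); // ~8 MB per entry, never evicted
                blocks++;
            }
        } catch (OutOfMemoryError e) {
            cache.clear(); // drop the references; the heap is usable again
        }
        return blocks;
    }

    public static void main(String[] args) {
        int blocks = allocateUntilExhausted();
        System.out.println("Cached ~" + (blocks * 8) + " MB before the heap ran out");
    }
}
```

Running it with a deliberately small heap (java -Xmx64m LeakDemo) makes the failure quick and cheap to observe.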

It's true that the problem space of “accounting” seems, at least to me (I likewise don't have deep experience), somewhat more constrained; indeed, accounting seems very specialized. Perhaps one difficulty here is that “software engineer” isn't really one job or career---it's dozens of jobs or careers that we're pretending are one and the same. To follow the accounting example, I guarantee people are hiring software engineers to do work that is largely accounting. But they're doing that work in something other than Excel, so it's software engineering, not accounting. But is that actually true? You wrap these software engineers in teams with acceptance criteria, etc, etc, but really you're trying desperately to paper over the fact that your software engineers aren't accountants, and you're basically training accountants by trying to help the software engineers build an accounting system.

In general, I agree with your point. It's mentioned elsewhere that there's a difference between a trade school and a university, and a software engineering degree is a broad university degree. Much like an English degree won't guarantee you're a great author, it seems odd to expect a CS degree to guarantee you know how to handle an OOM error, for example. I think in many cases CS/SE degrees do themselves a disservice by trying to teach everything, instead of allowing specialization that can be clearly understood in the next phase of someone's career.

As a licensed CPA and someone who got a degree in accounting, I don't think this comparison is really valid.

In university, we learned the concepts. We applied the concepts in internships and learned how to do things hands-on. When it comes to applying for jobs, internship experience is considered more important than academic coursework. So this is kind of parallel to how 'real world' experience developed from personal projects is more important than academic training in software.

So in this sense, this kind of reinforces what the original poster said. This kind of hands-on experience is important, and there are different ways of demonstrating it in different fields.

You are right though, no one is expected to do Quickbooks on the weekend. You have to realize that there aren't a ton of professional fields where people do what they do professionally as a hobby.

One thing to consider is that professions such as accounting, law, medicine, and (at least here in Canada) engineering rely quite a lot on degrees and professional certifications for hiring decision. Software does not.

The good side of this is that in order to get hired in tech, you don't need a degree or a P.Eng. The downside of it is that you as an employer have to do the entire vetting yourself, instead of relying on universities and professional engineers associations to do a significant part of the vetting for you.

EPC companies do not give their civil engineering job candidates a civil engineering equivalent of fizzbuzz, because they don't have to. If they are certified, they have demonstrated they possess the required knowledge. I cannot see how fizzbuzz can be eliminated from software without a similar certification process being created in its stead.
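For readers outside software hiring, the "fizzbuzz" mentioned here is a minimal screening exercise, typically some variant of the following (a sketch; the exact prompt varies by interviewer):

```java
// Print 1..100, but "Fizz" for multiples of 3, "Buzz" for multiples of 5,
// and "FizzBuzz" for multiples of both.
public class FizzBuzz {
    static String fizzbuzz(int n) {
        if (n % 15 == 0) return "FizzBuzz";
        if (n % 3 == 0) return "Fizz";
        if (n % 5 == 0) return "Buzz";
        return Integer.toString(n);
    }

    public static void main(String[] args) {
        for (int i = 1; i <= 100; i++) {
            System.out.println(fizzbuzz(i));
        }
    }
}
```

The point isn't the problem's difficulty; it's that, absent certification, employers have no cheaper signal that a candidate can write any working code at all.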

Accountants are required to article for a minimum of two years before achieving their designation. Perhaps we need a similar apprenticeship program for software developers. That would balance expectations on both sides.

To add: Other professions such as lawyer and doctor also mandate similar apprenticeship timeframes. It's time to take a similar approach for software development so that it can be taken seriously as a profession.

> People don't expect aspiring accountants to spend their weekends managing their personal finances or imaginary companies in QuickBooks or (shudder) Oracle.

That's because imaginary companies are too simplistic to learn real lessons about accounting. And it's prohibitively expensive to start a new company just to practice accounting.

In programming, however, it is not prohibitively expensive to bring in even all the HN-hated build and dependency boilerplate of the Javascript ecosystem and publish a project to the world. You understand this, and yet for some reason think the sensible solution is to take the imaginary teaching of university group projects and replace it with a "couple thousand hours" of on-the-job teaching to develop proficiency. What happened to the weekend personal projects? That's a much cheaper and more resilient form of learning: writing actual programs the student actually cares about. It's too bad that accountants can't do the same, but what has that got to do with anything?

Edit: typo

> People don't expect aspiring accountants to spend their weekends managing their personal finances or imaginary companies in QuickBooks

At the same time, the best accountants I know are interested in accounting. But let's say you were hiring a writer -- would you look at one who could only point to class assignments as a portfolio of work?

Those group projects don't really teach good troubleshooting skills. My dad was an EE; he was self-taught, having started as a bench tech and moved up. He told me of countless newly minted engineers who couldn't troubleshoot the circuits and boards they designed. I've seen the same thing in software: people can write code, but are often horrible at diagnosing why something goes wrong. They're also often missing a depth of knowledge in platform behaviors and will reinvent the wheel because they just don't know of a less complex way to do it.

Novel solutions for solved problems are generally non-optimal.

How many civil engineers are building bridges on the side? Why has it become the "norm" that CS students pay tons of money to attend universities that leave them ill-equipped to find a job in the real world?

One of the best parts of college is the ability to learn more about yourself, diversify parts of your life, and experience new things. It's hard to do that when your life is split between school work and personal work.

I worked my life away at side projects and a part-time programming job in college and looking back there's so much I would've done differently. CS programs across the country need a wakeup call.

> Why has it become the "norm" that CS students pay tons of money to attend universities that leave them ill-equipped to find a job in the real world?

The university industry, unions, and academia more broadly set the tone that if you want to do anything "for real" you must go to school. Many companies, parents, and ordinary lay people are convinced.

My (somewhat limited, I am young) experience tells me that competence is a completely inadequate predictor of whether somebody went to school for computing or software, but self-importance is an excellent one.

A good CS education will give you a wider perspective, a stable foundation to build on. Then you need practice, lots of it, to become anything close to competent. But being able to orient yourself and having been exposed to several different kinds of problems and languages, and learned an algorithm or two, definitely helps.

One of my university teachers told us that to become a real programmer, you have to design and build at least one language of your own. I wouldn't choose the same words; but having done several, I sort of agree. You need practical experience before it makes any sense though.

> How many civil engineers are building bridges on the side? Why has it become the "norm" that CS students pay tons of money to attend universities that leave them ill-equipped to find a job in the real world?

The problem is rather the expectation of the employers: No civil engineer is expected to build huge bridges in his free time.

True, and also it's inflated role expectations: I doubt junior candidate civil engineers straight out of college are expected to have multiple finished bridge projects in some kind of portfolio. Entry level jobs are supposed to be entry level: You filter for education fundamentals, hire for aptitude and potential, and then they learn the industry tools and jargon on the job.

Software engineering is not civil engineering. Software is "soft", "plastic", you can build and change anywhere.

If one could build bridges that easily, I am sure people would do it.

Exactly. How many artists draw or paint on the side? How many musicians write music for themselves? How many writers write stories for fun?

Not many.

But lots of architects design things in their spare time.

Is software engineering closer to civil engineering or closer to architecture?

> CS programs across the country need a wakeup call.

I assume you mean the United States. I'd say this is actually across the world.

> If you want to be a software engineer and you do not have any code to present in an interview, you can not call yourself a programmer.

That's just bollocks for a number of reasons. Maybe you're too busy doing productive work to build a portfolio. Likewise, employers are often too busy to look at and evaluate your portfolio. And unless you spend a huge amount of time working on your portfolio, chances are it's going to have nothing relevant to your job.

Perhaps this junior developer fresh out of university needs to show up with real world experience before blogging about it :P

No need for an example as extreme as the NSA.

If you've worked for years developing proprietary code of any sort, you cannot share it. One's personal GitHub portfolio is a completely different beast than one's professional accomplishments.

For a senior position I agree, but this changes if we're talking about a junior role.

> If you want to be a software engineer and you do not have any code to present in an interview, you can not call yourself a programmer.

You can still call yourself a Computer Scientist, but there's a difference between the two.

You can call yourself a computer scientist if you have at least one serious computer science publication to your name.

You seemed to be implying that the bar for calling yourself a computer scientist is lower than the bar for calling yourself a programmer. The opposite is true.

Also, not all computer scientists are in the business of writing code.


People call themselves chemists, geologists, etc. without ever publishing anything.

Ditto computer scientists.

I'm aware that some people want to cheapen the word. 'Scientist' isn't a protected title. Anyone is legally permitted to claim the term. That doesn't mean that everyone with a science-related bachelor's degree is really a scientist.

'Scientist' is meant to mean 'person who does science'. Roughly speaking, that means someone who produces research papers. If someone says "I've been a scientist for ten years", they mean to say they've been doing science for ten years. They do not mean merely that they were awarded a Bachelor of Science degree ten years ago.

This is less of an issue in, say, bridge engineering. The term 'engineer' is (in some jurisdictions) a protected term, reserved for proper chartered engineers.

If you have a better word for people who make a profession of doing science, I'd like to hear it.

Why? 1) It's the long-accepted term - it's what the universities put on your Bachelor of SCIENCE certificate. Part of our rich oral history if you will.

2) Research chemists don't suddenly start using the term 'chemistry scientist'. They are all just chemists. No hang-ups.

> It's the long-accepted term - it's what the universities put on your Bachelor of SCIENCE certificate

We're not talking about the word science, we're talking about the word scientist.

It is indeed a long-accepted term. Scientist means someone who does science. It does not mean someone with a basic grounding in the field, which is what a BSc indicates. With few exceptions, scientists hold PhDs. That is, after all, the point of a PhD.

If you get a BSc in computer science and then make a billion dollars selling fidget-spinners, your Wikipedia article will describe you as an 'entrepreneur'. There's no way it would describe you as a 'scientist'.

> Research chemists don't suddenly start using the term 'chemistry scientist'. They are all just chemists.

Chemists are scientists by definition, and the same thing applies here. Holding a BSc in chemistry does not make you a chemist. Doing chemistry, and presumably publishing papers, makes you a chemist.

As with the distinction between computer science and software engineering, there is a distinction between chemistry and chemical engineering.

> Doing chemistry makes you a chemist.

As does doing computer science. I did computer science last week (analysed the run time complexity of an algorithm).

Hence: computer scientist.
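To make that concrete (a hypothetical sketch, not something from the thread): the kind of back-of-the-envelope run-time analysis being described might look like annotating a binary search, which performs at most about log2(n) + 1 comparisons on a sorted array of n elements, versus the O(n) worst case of a linear scan:

```javascript
// Binary search halves the remaining range on each iteration,
// so it runs in O(log n) time on a sorted array; a linear scan
// would be O(n). `steps` counts the iterations to show this.
function binarySearch(sorted, target) {
  let lo = 0;
  let hi = sorted.length - 1;
  let steps = 0;
  while (lo <= hi) {
    steps++; // each pass discards half of the remaining candidates
    const mid = (lo + hi) >> 1;
    if (sorted[mid] === target) return { index: mid, steps };
    if (sorted[mid] < target) lo = mid + 1;
    else hi = mid - 1;
  }
  return { index: -1, steps }; // at most ~log2(n) + 1 steps either way
}
```

On a 1024-element array that is at most 11 comparisons instead of up to 1024 for a linear scan, which is exactly the sort of observation the commenter means by "doing computer science".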

> Holding a BSc in chemistry does not make you a chemist.

No, it absolutely, completely, totally, without exception does.

Just look at the job advertisements.

Only if you're doing computer science.

What I do is primarily programming.

Not to mention if you have been in, say, the NSA. Then the answer to "do you have any code you can show us?" may be "No, and I can't talk about it."

For people who haven’t gone to a top-tier college/university for computer science—especially self-taught developers—I would say that the corollary advice is also true:

Don’t just build stuff—study stuff!

Is the undergrad CS curriculum at a top-tier university really that different from that at a regular university?

The average skill level of your peers has a severe impact on the curriculum. I went to a second tier state school, and the majority of my classmates could not write code and struggled through basic data structures and algorithms theory.

I ended up learning some data structures and algorithms through ACM and codewars.com, where my peers from better schools learned them in class.

I've had multiple classes where professors said they had to remove programming assignments and projects from the course because it was reducing the pass rate below acceptable levels.

I would say no, but having taken classes at both, _how_ these courses are taught makes a huge difference. A professor's communication abilities, and the level of effort put into making a class interesting and accessible, make a dramatic difference in what students can get out of it.

With the modern net, everyone could benefit from the best professors in their respective fields, at least for lectures. It doesn't provide the possibility of feedback if you want it to scale ad infinitum, but that sounds like it would be worth it.

As stated in the syllabus, no; as seen in the classroom, yes. In general, students at a top-tier university are faster on the pickup, so classes can go deeper into the material. Sure, 10% more in one class isn't a big deal, but repeated across every class, with later classes building on that deeper coverage, it adds up over the course of a degree program.

From talking to my friends it seems like it's not. All of the lecturers just base their teaching on the same material.

Having gone to a decent state university (Rutgers) and talking to some people who went to CMU for undergrad CS, yes.

This has totally been a struggle for me. Self-taught, but I have to make time for myself to learn new concepts.

I graduated with a CSE degree 41 years ago. My tremendous good fortune was obtaining student employment at the University computing center during my second year. While my fellow students queued for keypunch machines in the CS labs, I had my own terminal in the office, documentation for everything, and maybe best of all, was treated as a full colleague by the regular staff. They gave us real work to do. The university's budgets were severely reduced during the mid to late 70's, and the student labs were pathetically underequipped and overcrowded.

All of the students working at the computing center realized how lucky we were, and received multiple good offers despite graduating in the middle of a recession.

I personally struggled a lot in college with the academic stuff. It got so bad that I was kicked out of the engineering department and ended up signing up for a degree in economics. After having gone through 2 years of engineering courses, the econ curriculum seemed like a piece of cake. However, I had a feeling that it would not be enough to land a job post-college (my GPA was also very poor).

Instead, I dove head first into trying to work with other people to build products that people would use. While none of those products ended up going anywhere in terms of financial success, the experience itself really helped me in articulating what sort of value I could provide to a prospective employer. In fact, I would give all credit in getting my first job to the fact that I was able to talk about the challenges I experienced with an on-campus recruiter who signed me up for an interview on the spot.

I really think the learning you gain by doing, building, etc really helps you grok the underlying academic principle. For me, I found that I have to do first and then go back and read up to learn what I've experienced.

I feel like this is good advice but it creates a false dichotomy. It should be "study stuff... by building stuff."

Could not agree more. A couple of years ago I decided to learn Java (I was already aware of the 'basics' in programming from other languages... variables, loops, functions, classes, etc.) so I spent maybe a week learning the basic syntax of Java. Then I jumped right into building a (very simple) 'virtual study partner' system where you would take notes by 'teaching' your virtual study buddy certain facts about what you were studying, who could then in turn quiz you using your own notes. You could also ask your study buddy about things you had already 'taught' it to see how well you remembered them or if you just needed a quick reminder about something. A bit of a silly project, no doubt, but it really helped me learn Java, and was so much fun!

Surely there must be a balance.

That's the balance between "study stuff" and "build stuff".

Building things is a great way to learn. That being said, the way this is presented leaves a bad taste in my mouth.

> If you want to be a software engineer and you do not have any code to present in an interview, you can not call yourself a programmer.

Additionally, from one of their recent tweets:

> I wonder how many people that call themselves programmers know that there is more to coding than just JavaScript

This is unnecessary gatekeeping from anyone, but it's especially unwarranted from a "junior software developer, fresh out of university." I'd encourage the author to be less prescriptive when there's so much gray area and so much variation between experiences.

> I wonder how many people that call themselves programmers know that there is more to coding than just JavaScript

I have a BSc in Computer Science, participated in informatics/algorithm contests/olympiads, and created applications/games in C++, Java, PHP, JavaScript and other languages, yet in the last 5 years I programmed mostly in JavaScript. From my point of view, if you know JavaScript (TypeScript) these days you can implement a solution to almost any real-world problem or application you might have. Yes, there might be better choices in some cases, but for most of the world what a product does is more important than how it's made.

I also prefer to know a programming language and its ecosystem really well instead of having some vague knowledge of 10 different languages. If I implement something I want to implement it well and to know as soon as I write a line of code what it does and all its implications, edge cases and possible performance issues. I am not saying that you shouldn't adapt your toolset based on the project you are working on, but that it's better to have a tool that you can always rely on and whose ins and outs don't surprise you anymore.

I am talking from the point of view of someone who builds stuff, creates products. Just learn a good-enough tool very well and then focus on the product itself instead of focusing on the tools.

Agreed. It's probably a good idea to gain some good depth in one particular language ecosystem, and JavaScript is a widely applicable choice. That said, playing with more "mind expanding" paradigms (logic programming, functional programming, Smalltalk-style OOP) is also good for your skills in any language.

Interestingly, Javascript has mind expanding properties of its own for people who have only bounced between Java/Ruby/Python/etc: it's async-everything, there's sync vs async errors to understand, promises, the event loop.

You don't encounter anything like it in the listed langs unless you somehow work with Netty/Twisted/asyncio/etc.
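For readers coming from those languages, here is a minimal illustrative sketch (hypothetical example, just to show the point being made): synchronous code runs to completion first, promise callbacks (microtasks) drain next, and timer callbacks run after that; a rejected promise is an async error that a plain try/catch around the call site would never see.

```javascript
// Minimal event-loop sketch: sync code first, then microtasks
// (promise callbacks), then macrotasks (timers).
const order = [];

order.push("sync start");

setTimeout(() => order.push("timer"), 0); // macrotask: runs last

Promise.resolve().then(() => order.push("microtask")); // runs before any timer

// A rejected promise is an *async* error: a try/catch wrapped around
// this line would NOT catch it; it must be handled with .catch
// (or with try/catch around an `await`).
Promise.reject(new Error("boom")).catch(e => order.push("caught " + e.message));

order.push("sync end");

// After the event loop drains, `order` is:
// ["sync start", "sync end", "microtask", "caught boom", "timer"]
```

None of this ordering machinery is visible day-to-day in blocking Java/Ruby/Python code, which is why it tends to expand people's mental model.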

Yeah he is insufferable. Who is he to decide what a programmer is or is not?

I think you need to be a little charitable in reading it.

I come from the opposite end, I have no college degree and started programming with writing small BASIC apps on a calculator in middle school.

I used to think I wasn’t a “real” programmer when I was in school because there was so much I didn’t know. But in retrospect 18 year old me was better prepared for an actual programming job than a lot of the new grads I’ve worked with.

It’s not to say there weren’t huge gaps in my knowledge, but the thing is, actually writing code completely transforms you as a developer.

A “programmer” that hasn’t ever worked on a project end to end is completely different than one who’s churned through one day in and day out for even a year. I’ve watched new grads make that transformation and it’s amazing.

School will make you a better professional, and if you have a passion for it, additionally let you get enough experience to hop straight into being a programmer confidently, but not everyone has that passion.


The same way I didn’t have enough passion except to do exactly as much was needed to let me write code in a professional setting, some students only have enough passion to get a degree which will let them be a professional and don’t really care that much about being able to execute right out of the gate.

I don’t think either is less “noble” than the other. I was too lazy to get the fundamentals, and I understand someone being just happy to get a degree, but both paths have very different effects based on the field (I mean, my path would be a dead end in most fields).

I'll give my perspective as well.

I've hated school ever since middle school, but I recognized I wasn't dedicated enough to get a job without a degree. While going through the motions of academics, I noticed most of the classes were, basically, a joke. Not at all relevant to industry. I went to a fairly well known school as well, but didn't feel like any of it was worth it.

However, starting my second year I was working on side projects in full force, and applying for internships. Sure, my GPA dropped a bit, but I got relevant experience. Then a black swan event, through a connection I was offered a full time position at a very large company, solely on my experience from projects and internships, no degree required. It paid well too (or so I thought). I very much considered dropping out, but several people in my life (including my new boss) strongly suggested I get my degree.

I ended up finishing the degree and almost doubling my pay, so I'd say it was very worth it. However, a degree is not necessary (nor sufficient) for employment.

Why is computer science the only career, that I can think of, where I have to spend my free time doing "personal projects" to improve my skills? What if I just want to go to work, knock out tickets, and go home? With the emphasis on spending all our waking hours coding, no wonder we all burn out.

You do not have to! It is a choice. For many of us, side projects are the pursuit of perfection that helps balance the shitty codebases of our day jobs, a chance to create something others will use and be grateful for, a way to build a startup idea that can free us from corporate masters, or just a way to tire our restless minds.

I really have no choice, if everyone else wants to do it, I can't very well avoid it.

So far I have worked with around 80 developers. I could count on one hand the number of people that did any side projects. HN is biased towards people doing side projects.

People in this thread (and the original article) emphasise building stuff to help you get job interviews and do better in your job.

I'd argue that's the wrong way of looking at it: yes, having something to show might be a signal booster on your resume, but it should be a side effect.

Building stuff is only fun if you want to do it, the motivation and passion has to be there, otherwise you'll just be building something for the sake of it. Don't get me wrong, it's definitely useful to gain knowledge from building, even if it's just a toy project, but you'll get so much more out of it if you're building something you're passionate about.

I've always been of the opinion that since the world of (web) programming moves so fast, a degree is largely pointless. As long as you have a natural interest in solving problems, learning new technologies and staying up to date, you should do well. You'll have a natural inclination to create some personal projects just because you can't stop yourself, and you'll find yourself having a couple projects that are worth showing off to potential employers–a portfolio.

Reminded me of Consume less, create more:


A few years back I was chatting with a faculty in my field at a conference. His lab almost invariably produces outstanding graduates who are really apart from the rest of the field. I asked him how did he do that, and his point was when hiring postdocs in his lab he never was really bothered about the publication counts, but he asked only one question, "What did you build in your PhD?" And if a graduate student could not answer that they were not graduating.

I’ve always called this the “Producer Consumer problem”, which I obviously reused.

You can consume a lot of books, TV, movies, games, etc., however, it’s more valuable to produce something.

You need to consume a lot in order to be able to produce something. That's why writers, for example, take their inspiration from other writers.

How about this: it's far easier to critique something than to make something better. I find that this is true with photography: my taste is so much more sophisticated than my abilities, so when I make art I am forced to be humble. Consuming others' photography is still crucial to my development.

You will enjoy Jad Abumrad's "Embrace the Gut Churn" https://youtu.be/8OH9p3hnWCY

That would be true if we were talking about consuming code, not theory.

Actually, it's more valuable to consume something. There are way too many creators chasing too few fan dollars. It's to the point where we have so many creative works that people are dumping them for fractions of pennies via subscription programs and humble bundles.


What separates own-projects from previous job experience is the ownership.

Anyone can be a "valued member of a team", go to meetings, collaborate, open and close lots of Jira tickets, be super helpful, check in code which "should fix the problem", etc.

But it's not the same as if you have your own project, which fails or succeeds based on how much effort and debugging you put in.

I'm having a bit of trouble building stuff to learn in my free time (undergrad w/ cancelled classes due to COVID-19). I find it very difficult to come up with an idea that I'm excited about enough to actually keep me motivated. And I would love to contribute to OSS but the codebases seem so dense and inaccessible, especially to a non-veteran.

Any advice?

Build me a Reddit clone (if you do, please make it scale to 100M users - that will keep you busy), or modify a public source project to be deployable with 1 click.

EDIT: more ideas

Try to build better Firebase.

Try to build an OS that's truly secure, convenient and hackable, that allows me to write integration scripts based on app events and allows me to rewrite any app UI.

Design App UI building toolkit that puts current solutions to shame.

Build MMO game that will make Blizzard sad.

Build simple and configurable personal assistant.

Build a relevant-ads solution.

Design better search engine than Google.

Design a publishing system that will get rid of ads, allow fairly compensating content creators, and make it sustainable.

Design simpler HTML, so anyone can write browser.

Try to build/design a language that is both expressive and minimal, with performance close to C or better, great errors, no quirks and predictable memory use. Write it so it's easy for beginners to contribute ;p

> The group projects organised by the uni have always been a joke, thus it is time to work on a serious project by yourself.

This always bothered me until a teacher told me they are less about the project and more about working together as a team which most of the time is where these projects failed. With this perspective I led my senior design group project to develop a proof of concept IMSI Catcher[1] detector that ran on a Raspberry Pi with a software defined radio in one semester. None of the other students had any prior experience with python, GNU Radio, low-level networking, or software defined radios.

A good team can accomplish so much more than an individual and you can learn a lot more with one too.

[1]: https://en.wikipedia.org/wiki/IMSI-catcher

I created some open source libs that at least some crazy people use. Some mods for games with quite intricate scripting, I must say. Painted some pictures and dabbled a bit with a partially legal version of Logic. I created animations and models with Blender/Max. And after all that I got a job.

So the EE/CS undergrad program I did was heavily-analytical:

- It included the entire advanced level chem, physics and stats series for scientists and engineers.

- It was two elective courses shy of a math degree, including abstract math, concrete math, and tensor math.

- Only a few courses shy of a physics degree as well.

- The entire EE track and entire CS track of course.

- Writing lots of multiplatform, portable code for Linux, HP-UX, Solaris, SGI, and some MINIX 2.

I think it should be a mandatory combined undergrad/master's or undergrad/PhD program, so it has sufficient depth of research achievement and breadth of knowledge to be useful in industry, which cannot be achieved in the absence of an advanced degree. I think most of the undergrad CS knowledge can be acquired by osmosis in industry, or by self-study without a degree.

I agree that the role of the university is to prepare students for industry, and most universities achieve this quite nicely. Of course, as a new software developer you need to learn some things on the job, but that is the case in every other field.

Personal anecdote but doing over a decade of personal projects in game and web development, starting in middle school, did nearly jack shit for me as far as eventually getting a job was concerned. If I were to go back and do it all over again I would definitely spend my time doing literally anything else. I was so convinced that if I just followed my fancy, worked hard and made cool stuff, learning anything and everything I could along the way, then eventually I would find work doing things I liked, formal degree be damned. It did not work out that way, at all.

After years of searching for work in the Redmond area and failing to do so (because, no professional experience, no degree), I randomly happened to find work by randomly logging onto a forum I hadn't visited in years and randomly happened to find a thread where a guy was looking to hire for remote webdev contract work. After that contract was complete, I spent another couple years looking for further work, and had to eventually move back home away from Redmond because the cost of living was too high while I was unemployed. After plural years of searching again, I finally managed to get a second remote webdev job. I don't like modern webdev and would much rather do almost anything else but now I've had two jobs doing it and I feel despair—am I locked into this for the rest of my career? With no degree and only these two jobs' worth of "Professional Work Experience" under my belt, it sure seems that way. Recruiters and hiring departments seem to care about little else. They sure don't give a shit about any personal projects, in my experience. And this is even in spite of the fact that I more or less picked a new language and/or framework for each project, which, in my mind, shows an aptitude for learning new things, being adaptable, and having an above-average "general programming aptitude" than someone who's made fourteen React projects in the past couple of years or whatever.

But apparently companies don't see things this way, or maybe I'm just not good at finding the ones that do, or maybe there's something else to my approach that has been All Wrong. But as my twenties draw to a close and I find myself no closer to finding a career than I was a decade ago, at this point I'm kind of at a loss for what to do about it. And, almost more importantly, I'm extremely burned out on making stuff just for the fun and intrigue of doing so.

The advice given to people who just build stuff would be "don't just build stuff, also study the theory behind". You need the former to become a useful craftsman, but supplemented with the latter you will become a master craftsman. You'll be able to quickly recognize patterns when reasoning about problems. It will allow you to quickly and almost instinctively orient yourself toward optimal solutions.

There's an abundance of this kind of webpage. There are so many of them that it's easy to start doubting: "should I follow page X's recommendations or page Y's?"

The sad thing is that there is pretty much no equivalent page for other industries (law, medicine, computer-unrelated-engineering to name a few).

True, "Go to hackathons, get involved in open source", and build side projects. I just created a free goal board tool for remote teams [1], the best way to keep learning is by building.

[1] https://teamsuccess.io/boards/new

Reminds me of Elon's point: you don't need a high school diploma to work at Tesla. And he has lots of diplomas.

The theory behind this mindset is: you're either born smart, or you're born stupid, and there's really not much you can do (i.e. education) to change that. The people born stupid get dull, repetitive jobs, the people born smart get to work at exciting places like Tesla. This sort of eugenicist thinking is frowned upon in most contexts, but somehow seems to have taken a foothold in, of all places, software development.

"Please respond to the strongest plausible interpretation of what someone says, not a weaker one that's easier to criticize. Assume good faith."


It may not have been the most substantive comment, but you've grossly overinterpreted it in an uncharitable direction.

I don’t disagree with your point on eugenics; however, I don’t believe it is the mindset that backs the practice of hiring people without degrees.

I think it is more a rejection of traditional academic institutions as the arbiter of merit. It is a shift towards a more “pure” meritocracy where all that matters is your ability to deliver. Whether this is good or bad, I am not sure. I think it depends on how society treats people who aren’t able to deliver to the same degree, and currently, in the US at least, those people get thrown to the wayside.

You can make smart people less intelligent.

believing there is variation in human intelligence isn't eugenics lol

That’s not what the parent is saying. Yes, there is likely an innate variance in human intelligence, but we can’t believe this is the sole reason why society considers people to be “intelligent” or “stupid”. Many people with lower IQs and academic performance would perform differently when raised in a different environment. People of lower socioeconomic status often perform worse than wealthier counterparts, and they are often in that situation because they have been systematically oppressed. So saying that people are either born stupid or smart without acknowledging societal and environmental factors is essentially saying that poor people are poor because they are stupid, which is certainly eugenicist.

So you are mad at people misusing the word intelligence?

This is an oversimplification of my argument and a leading question. Do you disagree with my argument? If so, why?

I love this page

create stuff, on a more general level.

This paper was written by another author and includes a quote from Tim Berners-Lee.

The title is slightly misleading.

Edit: Thanks for updating!

But at least the quote is indeed from himself ..
