It's not all that hard to make up a giant list of 8-10 years worth of material, which if anyone took seriously would also lead to a monoculture i.e. we didn't have any time to do anything aside from computer science and this giant-ass list of requirements like "physics up to electromagnetism".
I'd say I was close to hitting, for about the year 1997 or so, these requirements or their equivalents as they existed then, after (a) a pretty decent prep at high school level (Australian high schools allowed a heavy math/science concentration), (b) 4 years of undergrad with - nominally CS as 1 subject in 4 in first year, 1 in 3 in second year, 1 in 2 in third year and full-time in honours (but actually way more focus on CS vs other subjects), and (c) completing the 8 graduate courses required to move on to the rest of the PhD at CMU.
Even then I'd have a long list of gaps in his definition. And that's off 7-8 years of prep, not 4, and an Australian honours CS degree allowed a lot more specialization than a lot of 4-year degrees in the US at the time.
It would be a more interesting approach to try to define a minimal set. Anyone can spam out 8-10 years of study and everyone will agree that, sure, someone who knew all that would be Pretty Good (better have 11 programming languages under your belt, hey). But a far more interesting task is - what do you need?
And, perhaps, in a class of 100 people, why should everyone come out with the same laundry list? Maybe it's good to have 20 really good statisticians there, as well as 20 people who could design the processor they are programming on, etc etc (obviously an overlapping set).
I think that every CS student should read the sections on resumes and communication. They're so powerful because, unlike the rest of the article, they're not strictly prescriptive. They describe the goals rather than the implementation details, and suggest a few ways you might get there.
I fully agree with this point. People here argue that the minimum for a scientist should be cut down to something that fits into a strictly vocational degree that scores them a well-paying job.
Every scientist in any STEM field I know has a "minimum" that is larger than what fits into a master's degree in their field. Someone with a PhD is basically still a scientist in training, a junior. 8-10 years of basics and then very specialized knowledge on top of that sounds about right.
My background is in EE, so I'd add that being an engineer has prestige, but it's not the same as being a scientist. A good EE engineer knows different things than a research scientist in the field. You need to know an enormous amount of theory and math to design modern circuits and radio interfaces, but designing them is engineering.
This post may be influenced a little by the irritation that after seven years of study and two years of work experience (and much time spent on this stuff as a hobby in my free time), I still don't meet nearly 50% of the 'requirements' in this list. But hey, these lists should be taken with a grain of salt. It's pretty fun to make one and reflect on all the things you've learned. My list would have more numerical analysis and fewer languages.
I don't doubt that someone could touch on most of this stuff, especially in a degree that is heavily specialized to turn out someone exactly like the author, or the author's idea of a 'computer science major'. But I'm very sceptical that this enormous list constitutes a reasonable knowledge base for a "computer science major" who is also meant to know some other subjects; we're talking "major", not "quadruple super-specialized CS major". One might even know subjects that don't exist merely as early feed-in service courses for the CS (i.e. the way he discusses physics, maths and statistics).
I certainly can't imagine someone meeting this guy's list and uttering the phrase "I majored in CS and biology" or "I majored in CS and economics" or something like that.
Advice about career paths should always be personalized to the individual. I'm someone who has not had a traditional CS education. I'm totally self-taught and don't have many certs at all; actually, I took a few courses and never sat the exams, and the only reason I took those courses is that my employers paid for them.
I've had over 13 different jobs in the past 20 years, and that's helped me really figure out how the IT world works. I primarily work as a government contractor on the East Coast, and I've rarely had a job I couldn't grow into provided I had basic knowledge of development, customer service, and project management. I am not a genius by any standard, but I have accomplished a lot over my career that has made me worth the money they pay me. People used to tell me not to "hop around" and that it would eventually lead to unemployment, but IT changed the traditional rules of employment. I am lucky to have been where I was in history, because now there is a serious lack of supply of people who do what I do (full stack dev, dev management, cloud architecture, technical project management, application architecture). I have also benefited greatly salary-wise within the past 4 years as I move into more management roles, and the only time I get substantial raises is when I switch jobs, not when I stay with an employer for an annual bonus. Actually, my salary growth year over year has averaged about +$20k for the past 5 years as the East Coast begins to climb toward West Coast standards, so I don't need to move to Cali and deal with earthquakes :P.
I started out learning the Internet and simple HTML in college (around 1995) when the Internet was just going public... That positioning probably helped me a lot. My first CS job out of school was making CD-ROMs based on early JS, HTML, and graphics (pre-CSS) for a software development company that was making a tool which would later be killed off by message board software like phpBB. I worked around real developers and learned their habits, and it made development in C less intimidating to me; back then development seemed like rocket science. The Dot Com boom (even counting its collapse) convinced me that IT was a new industry I could stick with... It was really an amazing time. I interviewed with MTV, Microsoft, The Motley Fool, and turned them all down because they were offering lower salaries than small IT companies nearer to my home.
As sites got more complex I moved through many different companies as a web developer. I embraced Google-fu, which helped me solve some really complex challenges, and learning methodologies like Waterfall, Scrum, RUP, and Agile helped me work my way into management.
I will agree, though, that maintaining personal (portfolio) web sites was what got me in the door ahead of my competition most times when no one knew me. It's shocking how many people, even now when I interview them for jobs, don't have any portfolio links in their resumes, or even a profile on LinkedIn. If you're a developer, being able to explain how you set up your personal portfolio is an easy way to bypass a code test in an interview. I used to back out of interviews that sent me code tests, because the tests were never related to what the actual job was like anyway, and were often geared toward CS degree holders.
There is no substantive personalized advice someone can give you online, in a general format, that will be properly suited to your needs. The advice that works better concerns the process you follow rather than the programming language or specific career decisions you should make. I've seen big companies and ideas grow and fail - IBM, Adobe Flash... That much is always guaranteed.
For people embarking on their career journey, I suggest they take notice of the successful people around them and invite someone they really respect out to lunch, and pay the bill (to compensate them for their time and so that you aren't remembered as a "mental burden"). Talk with that person intently, and tell them your ideas and goals (you should be able to trust them at that level, of course, so choose wisely). Ask for their take and listen intently without shooting down their ideas, advice, or methods, then take all you learn from those conversations, design your own path in your mind, and follow it. If a path doesn't work, be sure to pivot to a new one before you're invested too deep... If a path you're on does work, follow it for as long as it works, but also diversify your efforts; time IS money, so never put all of your eggs in one basket. Eventually you'll succeed if you take note of what works, mirror that in your actions, and learn to negotiate money and your career properly. That's what has worked for me thus far.
~90% of CS undergraduates will end up working as or with engineers, and as of 2019 here are the skills that are indispensable:
1) understand popular web protocols (HTTP, SSH, FTP, etc.)
2) version control (Git). Understand pull-requests and the process of collaboration in a team
3) problem solving - how to break down problems, and how to overcome them when you hit a wall (use known algorithms when possible)
4) design patterns (learn as many as you can)
5) frameworks: MVC, Angular, dependency injection, etc.
6) communication skills
7) how to organize work, and how to break down big tasks into smaller, easier-to-accomplish pieces
8) understand deadlines
9) write tests (unit tests, integration tests, etc.)
10) understand that “good enough” sometimes is all you need (still try to fix it later :) )
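On point 9, a minimal sketch of what "write tests" means in practice (the function and its name are invented for illustration):

```python
def slugify(title: str) -> str:
    """Lowercase a title and join its words with hyphens."""
    return "-".join(title.lower().split())

def test_slugify():
    # A unit test pins down behaviour, including the non-obvious case
    # of extra whitespace, so later refactors can't silently break it.
    assert slugify("Hello World") == "hello-world"
    assert slugify("  extra   spaces  ") == "extra-spaces"

if __name__ == "__main__":
    test_slugify()
    print("all tests passed")
```

In a real project you'd let a test runner like pytest discover `test_*` functions rather than calling them by hand.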
As an engineering manager I find that a consistent difference between good engineers and great engineers is that great engineers can tell me how long something will take even when they haven't done something just like it before. That doesn't mean they can perfectly forecast how the hours will be spent -- no one could do that -- but they know how to figure things out, know how to build in some buffer, and know how to go heads down and crank when absolutely necessary, and as a result they can consistently hit deadlines.
I’m not picking on you. But I guess it highlights how hard this list is.
Estimation would by no means be the distinguishing factor for calling a programmer great, in my book.
You can be a good coder but struggle to deliver because you get distracted or rabbit hole, over or under engineer, etc.
The engineers who have really mastered their craft have mastered not just the coding, the infra, operations, etc. they’ve mastered putting it all together and getting it out to the world.
The ones who have mastered all that can usually give pretty good estimates. The ones who haven’t usually find estimating to be difficult.
So it’s not so much that estimating is the single critical skill, but rather that estimating comes from a synthesis of all the other skills, and as a result it’s a good indicator for how much someone has mastered the craft.
Stuff I've not done before can be a real challenge. I just add a 50% slop factor, but I won't even hazard a guess until I have at least a modicum of understanding of the complexity of the problem, the technologies, and the people involved.
Generally, for stuff I'm familiar with, I take how long I think it's going to take, then double it, and that's what I give to my management. For stuff I'm not familiar with, I guess based on like-for-like work, add 30-50% more time, then double that. For some of our customers, however, working with them incurs a time penalty, so I add another 20-40% on top of their estimates.
The feedback I've been given is that my time scopings are reasonably accurate, however. So while I feel silly turning in an estimate of 24 hours for a task I think ought to take 8, time and time again it takes closer to 18-20 hours rather than the 8 I feel it ought to (usually because of external factors outside my control).
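The rules of thumb above can be sketched as a toy function. The multipliers are just the commenter's stated numbers, not an established estimation method:

```python
def estimate_hours(base_guess: float,
                   familiar: bool = True,
                   unfamiliar_padding: float = 0.4,  # 30-50% for unknown territory
                   customer_padding: float = 0.0     # 20-40% for difficult customers
                   ) -> float:
    """Toy scheduling heuristic following the rules of thumb above."""
    hours = base_guess
    if not familiar:
        hours *= 1 + unfamiliar_padding  # pad the like-for-like guess first
    hours *= 2                           # then double before telling management
    return hours * (1 + customer_padding)

print(estimate_hours(8))                                          # 16.0
print(estimate_hours(8, familiar=False, unfamiliar_padding=0.5))  # 24.0
```

The second call reproduces the "24 hours for a task I think ought to take 8" example: an 8-hour gut guess with 50% unfamiliarity padding, doubled.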
This is something most CS grads don't really grasp immediately, but in the real world, business and time constraints normally trump perfection.
Technical debt is fine as long as it's fully understood and managed properly.
Early in my career, "good enough" meant it worked correctly, with no thought put into how that functionality would age.
I would do things like find a function that did most of what I needed, add a bool parameter to trigger a new mode (usually giving it a default value), add some comments, commit, ship! Ten years later that function has four more mode parameters and it's a mess.
These days I'm better at doing one more pass to break things apart or change the interface to accommodate new functionality. I also try never to have bool parameters that trigger a different mode, and almost never give parameters default values.
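A minimal sketch of that anti-pattern and the extra refactoring pass (function names are invented for illustration):

```python
from typing import Iterable

# The anti-pattern: a function grows a bool "mode" flag for each new
# requirement, and every caller carries the flags forever.
def render(items: Iterable[str], as_html: bool = False) -> str:
    if as_html:
        return "<ul>" + "".join(f"<li>{i}</li>" for i in items) + "</ul>"
    return "\n".join(items)

# One more pass: split the modes apart, so the next format becomes a
# new function with a clear name rather than a fifth boolean parameter.
def render_text(items: Iterable[str]) -> str:
    return "\n".join(items)

def render_html(items: Iterable[str]) -> str:
    return "<ul>" + "".join(f"<li>{i}</li>" for i in items) + "</ul>"
```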
1) both http and ssh
2) source control and team work (and suggests courses built around these)
3) multiple sections on problem solving and algorithms (incl. proof techniques, formal methods, knowledge of data structures and algorithms)
4) nothing in particular
5) nothing in particular
6) covers communication skills
7) emphasizes each student taking a lead role on a team in suggested team-based classes
8) not explicit
9) suggests writing and being graded on tests throughout educational career
10) section on graphics and simulation emphasizes this.
> I learned most of the topics mentioned in this document while in school, and although they are all valid, they are not enough for real world needs.
Emphasis is mine. I do not know how to interpret the emphasized part except as an indication that GP thinks OP skipped the listed topics. I was simply pointing out the overlap between GP's additions and OP's original list.
Yes, the heavy math you took was very helpful (and I personally agree, as someone who did a math-heavy CS degree). However, as stated above, the list should really be a minimum requirement; anything beyond it is extra.
Sure, learn them, but make sure you don't apply them blindly. Let the problem/solution lead you to familiar designs, not the other way around. I have seen some truly hideous code done in the name of cargo-culting "design patterns".
Another bonus is that the user wouldn't be tempted to, for example, use the word "factory" in method/class names, which bugs the hell out of me.
I think I've been asked to implement a singleton in Python in at least three interviews, and when I pointed out that a module is a natural singleton and that you don't really need to "implement" it per se, I got blank stares each time. Sometimes no knowledge is better than learning by the book.
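For reference, the module-as-singleton behaviour is easy to demonstrate: Python caches every imported module in `sys.modules`, so module-level state is shared by all importers.

```python
import sys

# Importing the same module twice yields the same object, because
# Python caches modules in sys.modules on first import. Any state
# stored at module level therefore behaves like a singleton.
import json as first
import json as second

print(first is second)               # True
print(sys.modules["json"] is first)  # True
```

So a `config.py` holding module-level variables already gives you one shared instance per process, with no boilerplate class.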
> 6) communication skills
I would argue these are the only two things in your list that should be included in a Computer Science degree.
There's a limited amount of time to cram everything into a CS degree, and it should focus on the science, not the practice. The science is harder to learn than the practice. All of the practice items are picked up on your first job out of college.
People seem to forget we're all humans.
If you’re a CS major, panicking over this sort of “technical skills aren’t enough, you have to be a persuasive public speaker and effectively do management’s job for them”, people have been saying this since at least I started coding 30 years ago, and I don’t see any evidence that it’s actually true any more than it was back then - focus on technical ability, that’s a hell of a lot harder to come by than “powerpoint skills”.
I'll agree and disagree:
Yes, the target audience here SHOULD focus on technical ability. Not because it's "harder", but because that will be the primary decider in getting hired.
That said, I'd say my communication skills have been a central part of my success. Not "PowerPoint" so much as being able to distill ideas, translate abstractions, and, of late, examine other people's ideas without coming across as if I'm attacking them. My personal impression, and the reaction my managers have given me, is that these skills are both rare and valued in the field. But you do of course need technical ability, or they are worthless.
I definitely think I've achieved more success than my technical ability alone would grant me. It could be Imposter Syndrome talking, but I'm not actually all that great at this. Just good enough, and my communication skills ensure that the "good enough" is applied where it's needed and to the degree it is needed.
Ultimately, coding is communication, and maintainable/extendable/flexible code is a lot of communication to other devs (including your future self). Even outside of the role of managers, and even outside of planning and architecting with your peers, communication is a valuable skill for a programmer to have.
If it's your first job out of college, don't worry too much about the communications skills yet. Your technical abilities will be your differentiator, and your daily job will largely consist of implementing things.
But if you don't pick up communication skills within the first 10-15 years of your career, it will limit you. You'll be stuck implementing things forever, which may be what you want, but it will shut you out of some higher-compensated and higher-impact jobs.
Unless you just want to go the founder route and hand your company off to a professional CEO that the VCs hire once you've proven out product/market fit. If you have both technical and strategic skills you can do that, but the lack of communication skills will still keep you from running your own company.
I found this to be true outside of tech hubs (Silicon Valley, NYC, London, etc.) but in tech hubs mid career developers absolutely do get hired on tech skills.
Even outside of tech hubs, mid career developers might not be hired on their technical skills directly, but they'll definitely get hired for a work history that implies stellar technical skills (e.g. a 2 year stint at Google).
This happens not because tech skills aren't valued outside of hubs, but because there's a dearth of people to actually assess them properly. This, I think, is where the idea that domain knowledge/people skills matter more comes from - from people working in, say, Oklahoma City rather than NYC.
This is an extremely valuable skill to learn, as well as a difficult one. Unless you work in a vacuum, it will help you work better in a team. It is one of the differences between a fantastic manager and a bad one. People will value your opinion, and it will help you learn more (because you will see more code and how others do things - unless you're the best and do everything perfectly). It's also something you have to continually work at and improve. Of the soft skills, this is the one I'd put at the forefront for any engineer.
But it's the impact that matters.
Good resources for understanding how to control the impact of your words include the book Difficult Conversations and others from the Harvard Negotiation Project.
The distinction between this and forms of communication often considered distasteful in technical circles is that you are - in fact - right (technical skill). It's not "sleazy" marketing if it really is "the world's best pizza!" :-)
It's not becoming a salesman. It's avoiding the situation of being so bad at communicating that your technical expertise is ignored. Non-technical people usually _want_ to hear your point on a technical decision. If you can't help them grasp what you're trying to say, they won't, and afterwards you'll wonder why they hired a technical person and then ignored their technical advice. They're not ignoring your advice; in many cases they simply cannot perceive clearly what you are trying to say.
I hope that I have made my point understandable here. But, you know, I may not have. :)
I can't agree. In my experience, 90% of developers have poor communication skills and it hurts them every time they write or speak publicly (or document their code). Not just developers, of course. It's a rare person who knows how to tell a tech tale that isn't a confusion of disconnected factoids, or who seems to care whether they bore the hell out of their audience.
But I care. And everyone I respect cares; they just won't admit that they expect to be bored by techies when they speak or write, so they accept boredom and confusion as the norm. (In large corporations, especially.) But it shouldn't be. As a professional, everyone is obliged to organize their thoughts, be clear, and convey a message. THAT should be the norm. It's an essential part of doing any job well.
Like so many things in life that matter most, teaching good communication skills (and rational thought) is largely absent from college curricula (aside from writing a paper). But its absence doesn't mean that communicating badly won't hurt your career later.
Wasting other people's time is disrespectful and lazy and unprofessional. And it makes you look bad. So blather not.
When I was a kid I'd start every sentence with "actually" and use jargon without considering whether the other person was following what I was saying. I got told to shut the fuck up a lot as a result.
Now I've had at least a half-dozen people earnestly tell me I should be a teacher. When I think back to my childhood, that blows my mind.
The biggest benefits to my life and mental health came from learning:
* How to better squeeze requirements out of non-technical users, and explaining why the edge cases shouldn't be left to the last minute.
* How to talk to the boss(es) about prioritization, hard requirements vs nice-to-haves, and estimated timeframes. Understanding why you're being asked to do a task and varying your approach on that information will instantly improve your relationship with your boss.
* How to respectfully delegate and offer assistance.
* How to productively critique and receive criticism.
And the best thing? I don't feel at all like these came at the expense of technical skills. I need to be in a different mindset to learn each of them, so I structure my efforts around that.
Learning how to communicate to non-programmers improves your technical ability.
In my experience, programmers who can't communicate to non-programmers don't understand that all scenarios are specific instances of increasingly more abstract scenarios. Programmers who don't internalize that build inelegant and fragile systems which work for specific arbitrarily chosen test data and then instantly fail in the real world.
If you can't explain a program further than jargon, then you haven't thought enough about what the program is actually supposed to accomplish.
I don't know, actual presentation skills seem to be rather hard to come by. Just because someone can crank out presentations does not mean they actually have any skill in it.
Droning over PowerPoint != Communication Skills
(most people giving PowerPoint presentations are blissfully unaware of how bad they are at it)
Technical ability is useless if you're not able to convince others why your project is useful or interesting. If you can't distill what you're doing into terms understandable by a non-expert, then you don't understand your own work well enough in the first place.
You will be left behind by coworkers who can communicate, and you will be confused about why, if you can't communicate well. Specifically, if you can't take an idea you have and convince others it's the right idea.
I don't want to drag us down the rabbit hole that is the current state of software education, however.
That said, I agree that it’s not something CS majors should worry about much in school. It’s something you learn as your career progresses.
1. What environment you work in: a large corporation vs small technical startup
2. What your ambitions are:
* Bluntly, if you want to be a very, very good coder who gets left alone, is largely handled/insulated by management, and has client requirements translated to and from them, that is certainly achievable by focusing on strictly technical skills. You can advance and be respected within that particular niche.
* If you want to be a solid team lead or manager who actually solves their team members' organizational problems and abstracts organizational challenges away from them; understands and truly aids their client; and interacts with various parts of their own and the client's organizations to make a larger change or lead a larger project; then a communication skillset is of course crucial.
It is almost tautological that to effectively communicate to various groups (programming peers, other technical personnel, management, business, clients, users) you have to be good at communicating to various groups :-/
As to what's difficult to acquire, again, it can be argued both ways. My wife is a tremendous communicator but technical abilities would be hard for her to acquire. At the same time, quite certainly a large amount of my technical colleagues and co-workers would and do clearly struggle to acquire communication skills.
Or put another way, I'm personally in an environment with technical skillset out the wazoo (which is important and respected and appreciated), but communication skills are rare and prized.
[EDIT] it also seems to me that good presentation and communication skills are how you achieve "you don't apply for jobs, jobs apply for you" status without top 1% tech skills.
I use PPT for low-level conversations. You can double click on graphs to see the numbers.
I'd love to find a tool that composes more attractive presentation media than does powerpoint. But I'm too blinkered by my and everyone else's use of powerpoint to know what that could be.
1. One of the best communicators I've met strongly suggested never presenting from PowerPoint on screen, but instead creating a brochure to print and hand to participants. His claim was less that it impresses and more that it changes the dynamic from "presenter vs audience" to "discussion group / workshop", which tends to be more productive, especially in a sales or persuasive setting.
2. Alternatively and more prosaically, dear gawd, 99% of people using PowerPoint don't know how to use it. They don't know how slide show view interacts with a Webex presentation, so they end up sending a small thumbnail to the projector and presenter view to the remote audience, but hey, things look OK on their screen, so they keep going. PDF would alleviate some of those issues :P
Plus you don’t have to worry about the errant slide transition working its way in, and it tends to load faster.
I don’t, as a rule, include video or other fancy effects in presentations ( they’re still referred to as “slide decks” where I work ), so PDFs are perfect. YMMV
I really dislike speaking to groups of more than 2 or 3, and I want to stay as far away from any management-like duties as I can.
That said, I'm on a smallish dev team in a very large company, and I regularly find myself having to explain technical things about our products and components to people without a current technical background. If I can't clearly communicate, things can go badly to varying degrees.
I recognise that for most people, being at least moderately skilled with people comes as a given. For those of us where this is not the case, it really helps to consciously work at those limitations.
I don’t know why he says learn real analysis and linear algebra to talk to engineers when I doubt 90% of engineers took one analysis course unless they’re Ph.Ds.
> Computer scientists and traditional engineers need to speak the same language--a language rooted in real analysis, linear algebra, probability and physics.
This is saying that the ontological common-ground between Computer Science and Engineering is rooted in, among other things, Real Analysis -- not that either Computer Science or Engineering majors need to take a course called "Real Analysis".
It does recommend specific math classes, but a class on Real Analysis isn't in those recommendations.
I think that the US's major Engineering-accreditation agency, ABET, requires most Engineering disciplines to have both Linear Algebra and Multivariate Calculus (also called "Calculus III") as core courses. Many students opt to take additional math beyond the basic core classes.
Pretty shocking how little math most CS programs require in comparison.
But a course actually bearing the name "Real Analysis" is typically a 300 or 400 level course, rigorously proof-based, and not needed by an undergrad engineer.
His standards are high, but fair.
But they do learn a certain kind of math. Personally I think a strong foundation in statistics is warranted. The calculus goes far beyond what is useful on the job.
What a rude paragraph. I think this viewpoint is very unfortunate. I love functional programming, and I think everyone benefits from learning it, but this garbage attitude is toxic to students. The author cannot imagine a successful computer scientist who doesn't love the same PL paradigms as him. I would prefer to keep them away from my learners.
I don't see that at all. It's not about enjoying it, it's about having the ability to adjust to the Lisp syntax, full stop. Given that Lisp is just about as close to programming directly in an AST as you can get, I'm not sure that claim of his is too far from the truth. Remember, we're talking about computer science, not programmers generally.
I deal daily with technology that I don't enjoy but have to accept temporarily. I wouldn't be fit for a career in computer science if I couldn't adapt to that.
1. Public Speaking. If you can't talk coherently in front of a group of people and sell your ideas, your career will be stunted.
2. Accounting. Accounting is the language of business. If you don't understand double entry bookkeeping, you can't be a manager. You can't run a startup. You can't talk to investors. If you misuse terms like "gross margin" you'll be overlooked as someone not worthy of doing business with. And worst of all, your feathers will get plucked by people who do, and you may never even realize it.
Neither of these are particularly hard to do, but like flossing one's teeth, avoiding them will be costly to your future.
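A toy double-entry ledger makes the core idea of point 2 concrete (the account names and transactions are invented for illustration): every transaction debits one account and credits another by the same amount, so the books always net to zero.

```python
from collections import defaultdict

# A toy double-entry ledger. Debits are positive, credits negative,
# and each posting applies both sides at once.
ledger = defaultdict(int)

def post(debit: str, credit: str, amount: int) -> None:
    ledger[debit] += amount
    ledger[credit] -= amount

post("cash", "equity", 10_000)    # owner invests $10,000
post("inventory", "cash", 4_000)  # buy $4,000 of stock
post("cash", "revenue", 6_500)    # sell for $6,500

# The double-entry invariant: all accounts net to zero.
assert sum(ledger.values()) == 0
print(dict(ledger))  # cash ends at 12_500
```

Real accounting adds account types, periods, and reports on top, but the zero-sum invariant is the part people keep reinventing badly.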
Sadly I'm not that good at point 1 (actually, I'm terrible at it).
It gets better each time you do it. Taking a class in public speaking helps a lot. I've seen engineers stand with their back to the audience and mumble - it's not hard to do better than that.
> And I've spent far too much of my time in my last project explaining double entry bookkeeping to coworkers to little effect. I had to watch as they re-invented the wheel.
Yah, reinventing that is certainly cringe-worthy.
I took a 2 week class in it at the local community college one summer. It's paid off well for me ever since. A very high return on time invested.
The other huge ROI for me was a 2 week summer school course on touch typing I took in 8th grade. Considering how much typing I've done since, that had a helluva huge return.
I sometimes shake my head in astonishment at professional programmers who hunt-and-peck with just their forefingers as they resolutely refuse to learn touch typing. What a waste of time.
It's especially important in the modern world where a large part of your audience will be non-native English speakers.
Any resources you would recommend that are worth looking at to help with this? I think I understand the basics, but particularly for larger companies things get much more complex.
> And worst of all, your feathers will get plucked by people who do, and you may never even realize it.
As someone who has little clue about accounting, what does this even mean? You mean I can be easily tricked by a shady partner?
By the way, you are right: my side projects are stuck at the part where I have to ask for money :)
Yup. And everyone else you engage in financial transactions with, such as getting loans, negotiating a financial settlement, negotiating an employment benefits deal, dealing with your investments, even dealing with your credit card, etc.
I think maybe the top student through my five years at university might have known all of this. I sure as hell did not check all the boxes when I graduated two years ago.
If you're nervous after reading this, just know that the "should" probably means "in an ideal world".
The intersection of that and communication skills is rarer, but it happens often enough, especially for people who pick up a second major in the humanities or did a lot of public speaking prior to/during college. Again, not the average case, but not terribly uncommon.
I think we may differ on what we mean by "should". I don't (really) doubt that it's possible and valuable to learn all of this in university (forgive my use of hyperbole in the original comment). But I don't think it's necessary for most people with a major/master's degree in CS to do so.
The ideal and the necessary should not be confused.
After 1 year of working about 90% of that was already gone forever.
The answer, as trite as it is, is to teach a man to fish. If you want people to learn a wide breadth of topics, instill in them qualities like curiosity, determination, and problem-solving skills. Don't just hand them what you deem to be a wide breadth of topics. I see far too many fellow students who just want a list, a guide such as this to follow, so that they can be the best student and get the best job and live the best life (not that that's a bad goal). But in their attempt to live by a list, they ignore anything and everything else. I find it really sad that when I talk about using OCaml or the fun of playing with Emacs, they shrug and insist that it's not "useful", that it's not on the list.
If you're fresh out of university, GitHub is useful because it demonstrates relevant non-academic code you've written outside of your studies.
After you've got your first job, I don't care about your GitHub profile unless you've got some impressive stuff on there. As an employer, I care about your ability to work on non-trivial problems in a team, and unless you're contributing to OSS (which few CS grads do) you'll never have this on your "portfolio".
In my experience, your GitHub profile is only noteworthy if you have:
* Breadth of work. Loads of projects in a load of languages that demonstrate you're a tinkerer.
* Depth of work. At least one project with 5+ stars that solves a non-trivial problem.
Otherwise, that profile is usually some boilerplate code for a MOOC course or to try a language for the first time, and when hiring that's not an indication of anything.
No one in tech knows how to hire, so it's all network-based. This isn't a good thing, but it's where we are.
I can see the benefit of having a network, and even leveraging it. But it is not everything.
So I've failed by the bar set in your comment, but for some reason I don't feel that way, because there are plenty of jobs out there on job boards. I got 1 job through my network (an ex boss) and it wasn't any worse or better than the others. I've got a really good job now.
I can see where networking shines: you went to a top university, made lots of friends (especially in the same course), kept tabs with them since, and a lot of them have gone to work for Google (etc.) or cool startups, so now you have an easier way in, which is fair enough.
Note: I'm not in the USA
Right now is not a good time to judge your career pathing plan. It’s literally the easiest time to get a job in computing in modern history, which includes the first web boom. It’s like judging your investment strategy right now. That said if your plan is working, great! I wouldn’t spend much time on indeed if it were me but good for you.
Note: all of my jobs ever have come from networks and I went to the wrong state school and have never worked for Google.
Seems like networks had nothing to do with your job to me...
Nope I used more than one job search method.
It was Advertised. They paid for an advert. To be advertised to strangers.
It might as well have been a Facebook ad or a Google search ad, for all that networks had to do with it.
> Your in person network for the final conversion
Yes references. That's a bit different though.
I'd love to see a timeline of how some of these super-productive academics are able to fit so many activities into their lives. I doubt I've done 1/10th of what Matt has done in the same amount of time. (And all this while researching, publicizing, and treating his son's rare genetic disorder, in an entirely separate field from the one he got his PhD in: http://matt.might.net/articles/my-sons-killer/)
In the 100+ interviews I’ve done for engineers, I’ve never cared for their portfolio. It’s hard for me to know if it really is their portfolio or if they actually wrote the code, etc.
I’m going to ask you design and coding questions.
My day job is engineering in systems that kill people or cost billions of dollars if they go wrong.
I'm never going to ask you to code something on the fly other than hello world, a loop to print 0..9 and maybe a recursive function to print 0..9. You can't think deeply enough about a problem in one hour, or one day. I will ask you a lot of questions about what you will do, how you will do it, and what the possible problems and mitigations are.
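The warm-ups described above really are meant to be trivial; the recursive version might look like this:

```python
def print_up_to(n):
    """Recursively print 0..n in ascending order, one number per line."""
    if n < 0:
        return          # base case: nothing to print
    print_up_to(n - 1)  # print the smaller numbers first
    print(n)

print_up_to(9)  # prints 0 through 9
```

The point of such a question isn't the code itself but hearing the candidate reason about the base case and the order of the recursive call relative to the print.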
A portfolio that you've worked on is great because I can find the worst code in it and ask why it's there, what the trade offs of putting it there were and how you would fix it if you had infinite time.
and originally: https://news.ycombinator.com/item?id=2921440
Maybe 30 years ago, but these days electrical engineers know how to code.
In my experience people retain very little of something if they don't keep using it. Many people take things like calculus in highschool or college, and completely forget it within a few years. I often checked with classmates about how much they retained from courses we had just taken the previous semester, and most of the stuff that wasn't directly connected to what they were currently studying seemed to be forgotten (if they ever had a real understanding of it to begin with).
1) You should always be learning.
2) You should mostly do interesting work you love.
3) You should foster good friendships with your highly ranked coworkers; at least having dinner once a month.
4) You should regularly look for and fix your weaknesses.
5) You should balance work and play. Exercise. Travel as much as possible.
I'm saying this as a physicist who has not studied CS formally, but worked with a lot of CS'ists.
I think what really helps is Python has really great documentation. Probably best I've seen in a large community project. Also, the great standard lib (batteries included).
Doesn't hurt either that Python has a great tutorial at the main project site with loads of examples, and it starts slow. Very helpful for non-programmers to get started. Using it to get my 10 year old daughter into programming. I used the same tutorial around the v2.2 days to get accustomed to it. I already had a number of years of programming experience and had the help of knowledgeable coworkers to learn, but it quickly became apparent to me the value of the language.
I actually found out yesterday I'm being partially transferred to my company's quant team to help mentor and develop best practices. I'm really excited about it, given the opportunity to share my knowledge but also absorb knowledge from them; finally learn what all the data I've been pushing around actually means (and maybe dust off my rusty math skills from college that I've barely used since).
The ability to communicate what needs to be done, and how it is to be done, _before_ diving into the work. I work with tons of people who, sure, can get the work done, but I have little to no visibility about what they're going to do and when they think it will be done. I'm never going to be in the code as deep as you -- the purpose of my management is to help make sure the thing you do fits into the rest of the plan and doesn't waste time and resources. What good does it do our team if I don't find out how long it was going to take until you finished?
Inability to do this is generally an indication that they've not done the task before, can't tell you how they're going to solve it before they actually do, or are generally unable to plan their work at a higher level that is useful for a manager. And I don't even mean that the answer has to be a rock-solid time estimate -- if they can raise and communicate the uncertainties, that already is an important piece of information and next level thinking -- what do you know you don't know.
You can tell a more experienced person by whether they can give you plans and estimates for how long and what approach they will take, before they immediately start the work. The work isn't just a coding puzzle to be solved -- they understand it's a project to be managed properly and are working at your level to plan it. Versus you having to painfully extract that info from them.
The ability to self-assess whether their approach is the right/best approach, or what compromises or missing elements their approach adopts. Rarely do I find that people who dive right in because they "know how it needs to be done" are doing it because that is really the right way among all possible solutions. More like, it's the first thing that popped into their mind.
You can tell an experienced person by whether they take some time at the beginning to debate / discuss with you what approach optimizes for what outcome (whether time, resources, maintainability, scalability, data integrity, etc).
The person who does even these 2 things (or similar indicators of self-reflection and considering the problem) is -- almost unfortunately -- the standout these days.
These are not skills which can be taught in a school. They have to be learned through work, through mentorship and through the experience of failures and successes. Internships certainly help, but it takes years.
It's OK if hires fresh out of school aren't able to do these things. These need to be cultivated by you, the manager.
Your portfolio should show products rather than projects. Not everything needs to be complete, but you should be able to show that it does something.
Ideally, your resume is linking straight to a live demo, and that page links back to the repository. Besides the fact that many devs will be too busy/lazy to read your source, everyone will be more impressed and interested in an actual thing they can play with.
If you wrote a library that doesn't do much by itself, at least give it a README and documentation, and publish it to a repository. I'd rather see a link to a package repository than a git repo. (And, of course, fill in the metadata so I can get back to your git repo.)
(And, really, if you set up a lambda in AWS running your library, your page can just have some input fields and a button to call the functions and show what your code does.)
The point of all this is to take the time to put in some extra polish that shows that you not only have ideas, but execute on them.
There's something enlightening and empowering about tangibly having a specific file that you wrote, applying a compiler to it to produce an executable, and then invoking it yourself. It gives a better sense of ownership of the process that people don't always feel when they load a project into an IDE and fill in lines of code until the play button stops telling them what they did is wrong.
A great example of this is CSV. It's a wonderfully simple format... so simple that there is considerable variety in things such as "how do you quote a field," "how do you delimit fields" (considering its name is Comma-Separated Values, you'd think everyone would agree it's a comma), and "what characters are legal within fields," let alone the semantic interpretation questions of "is there a header row", "is this column numeric or textual or something else," etc.
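Python's standard `csv` module makes the dialect problem concrete: the reader has to be told the delimiter and quoting convention up front, or the parse silently goes wrong. A small illustration:

```python
import csv
import io

# The "same" record serialized under two common conventions:
# comma-delimited with doubled-quote escaping, and tab-delimited.
comma_quoted  = 'name,notes\n"Smith, Jane","said ""hi"""\n'
tab_separated = 'name\tnotes\nSmith, Jane\tsaid "hi"\n'

rows_a = list(csv.reader(io.StringIO(comma_quoted)))
rows_b = list(csv.reader(io.StringIO(tab_separated), delimiter="\t"))

# Both recover the same fields only because each reader was told which
# dialect to expect; guess wrong and the data is silently mangled.
assert rows_a[1] == ['Smith, Jane', 'said "hi"']
assert rows_b[1] == ['Smith, Jane', 'said "hi"']
```

Feeding the tab-separated text to a default (comma) reader wouldn't raise an error; it would just split `Smith, Jane` into two fields, which is exactly the kind of quiet corruption the parent comment is pointing at.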
We formulate abstractions that are stored in files. Still, I am not in the least interested in the files (a problem I have with most IDEs) that make up my component, but in packages/namespaces, classes, functions, etc.
Back then, many if not most CS programs were super behind industry, so what got you hired wasn't your coursework in FORTRAN. It was what you did outside of class, in independent projects or (a surprising amount of the time) for money on the side. Many programmers in their late 40s-early 50s don't have ANY degree, because they learned on their own and then got a job early, without completing (or even matriculating to) college.
This left large holes in what they knew, if measured by this list, but generally speaking they were still pretty fantastic developers.
How this applies to someone in school NOW is absolutely unclear. I suspect the answer isn't "not at all," but I also doubt it's been possible to get hired in a professional developer job right out of high school for at least 20 years. It's hard for me to imagine how an 18 year old could know enough; the field is so much bigger now.
Of course all those skills are valuable for anyone, just like learning to play piano is. Doesn't mean you shouldn't hire people because they don't have those skills.
Assuming this stuff is needed in the first place:
Spivak's calculus, while a fantastic read, was "difficult" for motivated honors students at institutions like Harvard. Great exposition, and great exercises, but it's mostly for aspiring math people who want to spend some time on the exercises. For example, within the first couple of chapters, readers will have to do epsilon-delta style inequalities and basically learn the tricks before reading about them in the "limit" chapter. For calculus, any lightweight calculus book would do. For analysis, Abbott's "Understanding Analysis" is quite illuminating, and if exercises are your thing and you want to build up to real analysis from the natural numbers in an accessible way, then Tao's analysis volume 1 would be good.
"All of statistics" is a bad introduction to statistics because it contains too many things (Kullback-leibler?!) early on?! It's a great book for a reference or for a refresher, but not something to learn from. Wackerly is light enough introduction to statistics, and for probability, Bertsekas!
I guess another generation of developers dark-patterning their users into giving up their personal info, and then exploiting them for stock options, a WRX (or Tesla if they've been around for a couple of years) and a Bay Area condo they can barely afford is upon us.
1. Communication skills. This should need no explanation.
2. Required math curricula. You need high school algebra and discrete math. Light introductions to statistics, linear algebra, calculus, and logic are too damn useful to pass up, but a full semester course in these topics is often overkill. Abstract algebra, graph theory, and numerical analysis are more situational in their utility.
3. How to design software. This is more than just programming 101 and software architecture, you also should know how--and when--to use IDEs, version control, refactoring, testing, debugging, code review, how to report bugs, navigation of large codebases you didn't write, etc.
4. What makes fast software fast and slow software slow. Computational theory and data structures and algorithms fall into this bucket. There's a fairly standard set of data structures to learn, but I would caution that you don't ever need to know how to actually write a good hashtable or binary search tree, but you should know how they work and what the performance ramifications are. Basic computer architecture should also be required, and I think there's often too little emphasis on the impact of the memory hierarchy in software. I also think you should understand how a compiler is going to interpret your code, what it will and won't do to make it faster, and what you can and can't do to make compilers work better. (Lexing and parsing theory, which tend to be the bread-and-butter of early compiler courses, need not apply).
5. A deep understanding of different programming languages. First, you need a workhorse language that most software is written in, just to get you employed. In addition, you should cover the four major programming paradigms: imperative, object-oriented, functional, and logic. It is more important to deepen your understanding of one of these paradigms than to pick up another language. Once you understand all of these paradigms, you should be able to quickly pick up whatever language you need to get your job done.
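On point 4: you don't need to implement a hash table yourself to feel its performance ramifications. A quick sketch comparing an O(n) list scan with an O(1) hash lookup for membership testing:

```python
import timeit

n = 100_000
as_list = list(range(n))
as_set = set(as_list)
missing = -1  # worst case for the list: every element is scanned

# Time 100 membership tests against each structure.
list_time = timeit.timeit(lambda: missing in as_list, number=100)
set_time = timeit.timeit(lambda: missing in as_set, number=100)

# The hash-based set wins by orders of magnitude at this size.
assert set_time < list_time
```

Knowing *why* (linear scan vs. hashing, plus cache behavior as the list grows past cache sizes) is the understanding being argued for here, even if you never write a hash table by hand.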
As you can see, I definitely favor depth over breadth. The utility of breadth is mostly in knowing that there are topics out there that other people know much more about than you do, and you should defer to their opinions (and in choosing something to specialize in if you're unsure). But many intro classes will probably leave students with an inflated impression of their competence. This is particularly dangerous in cryptography, which is so easy to screw up that even experts do so on a regular basis; if you're not an expert, you shouldn't even try.
* Writing skills. It is absolutely important that developers be able to communicate needs and requirements with precision and specificity. I always recommend that developers spend more time with their QA staff as the experts next door. Good QA people are excellent, I mean absolutely astounding, communicators with a very deep understanding of the product.
* Structure versus composition. So many developers really want the world to be simple AND easy. There appears to be some false expectation that programming is like putting lego pieces together, and when this isn't the reality they are utterly lost. Instead developers need to have beat into their heads that code is more like a thought structure. Part of this comes from developing reading/writing skills and part of it comes from an appreciation that logic and abstractions aren't static qualities.
* Reading code. I often see junior developers struggling with how to write code. This is the vast majority of programming-related topics I see online. There is a lot of insecurity around this, and many developers first try to solve for the insecurity as opposed to actually writing original code. This means that instead of taking a risk on writing original code, developers will wrap themselves in comfort blankets of tools, frameworks, dependencies, and layers of configurations. These insecurities are better solved by building comfort directly through an appreciation of reading code (RTFC).
* Technology diversity. Most developers I have worked with either get into stuff like robotics, or IoT, or simply remain hopelessly isolated to writing code in their primary domain. Most developers who isolate themselves to the code have no idea what is really happening in technology, whether it's security, networking, hardware, storage, or whatever. The really sad part is that many developers falsely believe themselves to have some expertise or advanced understanding of these non-coding concepts when it's very clear, to people in that line of work, that the developers have almost no practical experience there (spinning their wheels). As an example, I have seen numerous comments here on HN where developers will downvote, into oblivion, networking or security focused comments that defy common understanding of application coding, despite never having touched a switch, router, or any sort of security vulnerability assessment/resolution.
* Product management. More often than not, developers are writing code in support of some software product. I often see many developers in pursuit of qualities completely opposed to the desires of the end users or the business that pays the bills. Sometimes these erroneous intentions are due to the pursuit of easy, sometimes it's due to an imaginary understanding of what people want, and sometimes it's due to a false or incomplete understanding of the technology. No matter the reason, developer-focused intentions are generally bad.
* Data structures. Not having a computer science background myself I have always imagined that data structures are a core component of a good CS education. I am frequently reminded of how wrong this thinking actually is.
I also see developers often spending amazing amounts of energy focused on qualities that aren't helpful to their careers or the products they support. The following is a wishlist of what many developers wished they learned from their education that simply isn't helpful.
* Frameworks. A framework is an application serving the purpose of architecture in a box. Sometimes a framework is abused because it's either completely unnecessary or because a developer is using it to solve all of the world's problems. A framework isn't going to do your job for you or get you promoted. It is just a tool. Sometimes original code really is better.
* Testing. Test automation is important. A good set of tests reduce risks of regression and demonstrates appropriate handling of various features used in unexpected ways. Many developers often test for all the wrong things. Tests are tech debt. Tests are not an excuse to avoid reading code. When in doubt about what to test sit down and have a very precise conversation with your QA people.
* Design patterns. Memorizing patterns is never helpful and is sometimes harmful. Instead solve the problem any way possible, even if inefficiently. Then refactor that solution over time to achieve better performance and superior simplicity. Through that process of continuous improvement you will unexpectedly learn design patterns.