Guide for Technical Development (google.com)
396 points by gits1225 on May 12, 2015 | 176 comments



The message: you don't have to learn much to be a software engineer; just master the discrete fields of object-oriented programming, web development, UX design, artificial intelligence, building compilers, and cryptography.

If you master one of these fields, you'll be doing well. I agree on taking an intro to each of these, but only to figure out which one you want to specialize in later on and because it's nice to know something about other related fields. You won't master all of these in your career, so don't let the list intimidate you.


I read your comment first, then I clicked through to see if they were really suggesting "mastering" all of these fields (which is nigh impossible, as you suggest). They just say to "learn" them, and if by that they mean "take an intro-level college course on the topic", then I think it's pretty reasonable. It's my opinion that people should wait until after undergrad to specialize (in industry or academia).


The author used the phrase "Develop strong understanding in.." in two places, "Algorithms and Data Structures" and "Operating Systems". The other fields are prefaced with "Learn..", which I take to be a weaker recommendation.


I don't think you are disagreeing with the article. I took one of the crypto classes recommended. I did well, learned a lot, and enjoyed myself, but in no way does that mean I've mastered cryptography.

All in all, the recommendations seem reasonably achievable in a few years of conscientious part-time effort.


Anecdata: I've done quite well in my software engineering career without knowing much about compilers or AI (aside from what they generally do), and I wouldn't claim to have "mastered" cryptography either.


What is Google even trying to say with this?

You don't need a CS degree because you can learn everything for free on the internet?


They're trying to increase the pipeline of software developers. One way to solve the problem of not being able to hire the workforce you need is to train more workforce. This guide is a very inexpensive way for them to increase the top end of the funnel at basically zero cost or obligation to themselves. The topics are pretty specific to the types of work you'd do at Google; they're good, although the field of software engineering is much broader than what's listed here.


Given that they linked to online sources for everything, I'd say yes. Some of those Coursera courses in particular are very high quality.

Also worth noting: "an increasing proportion of people hired at Google these days don’t have college degrees."

http://www.forbes.com/sites/erikaandersen/2014/04/07/how-goo...


I don't know if that's what they're trying to say but that's absolutely true, provided you're diligent, curious, and someone who does well with self-directed learning.


They want "people like us" as L & S did CS degrees therefore their recruiting is slanted that way.

The same goes for top-10 law schools and Big Law.


I wonder who at Google wrote this. HTML is listed as a programming language, and JavaScript is written as two words.


I was thinking the same thing. This is sloppy writing. It reads as though it was written by someone unfamiliar with the content, and possibly translated from another language. Some of the grammar is a bit off from what you'd expect from an official release of some kind from Google.

The capitalization of the word "university", and the missing capital H in GitHub. The "Artificial Intelligence" bullet has a formatting error. Missing articles ("work on project").

Not to mention it is vague. The "Learn other programming languages" bullet lists several options, but doesn't say anything about how unrealistic it might be to try and learn all of those languages (it says "and" instead of "or").


These examples on their own discredit the post so much that it's a shame it has "google.com" as a domain of origin.


I don't know; I don't view Google as having the same level of perfection as, say, Apple. I know that when I use Google products, which seem to be perpetually in beta, there are going to be some things lacking and unpolished. Helpful yet unfinished: you take what you get.


Perfection is one thing. But this is a page on their website meant to help them recruit engineers. Shouldn't that be held to a bit of a higher standard?


All of these comments about non trivial matters are bike-shedding. Even this comment; it's only increasing entropy. You're missing the forest for the trees.


@rockdiesel, it's definitely "for" the trees, where "for" means "because of". You can't see the forest (big picture) because you're paying too much attention to the individual trees (details).


If the "reply" link is unavailable you can click on the time to be able to reply directly.


That makes sense. Thank you for your explanation.


Trivial, not non trivial.


Off-topic, but I always thought it was "can't see the forest through the trees" not "forest for the trees".

Meaning someone can't see past what is in front of them (trees/treeline) to see the bigger picture (forest).


If you look at the page now, it's been changed. JavaScript is now one word, university is no longer in caps, the "learn AI" bullet now has been formatted properly, and they've now properly capitalized GitHub. Looks like Google is listening to the suggestions made above (in parent post).


Why is so much importance placed on written language? Why not treat casual writing like casual speaking? I think many of these expectations put a lot of pressure on people who want to communicate in writing but whose written language is not their mother tongue, or who might not even have gone to high school.


The expectations are higher when it comes from a major company like Google, especially when it's speaking from a position of authority on a topic like what to do with your life.


With casual speaking, you place emphasis and pauses, naturally... With writing, nobody can read your body language... hear your pauses... notice your _emphasis_... unless you s.p.e.l.l. it out...


Casual speaking is heard once. Published writing is read many times and is easy to make small edits to, so it is worth investing a bit more in correctness.


> supplement your learnings

> Academic Learnings

Dead give-away.

I loathe this kind of thing. To me it evokes an image of someone who only knows computer science, and is entirely ignorant of other topics (say, history and literature) that a person needs to know to be a successful human being.


> I loathe this kind of thing. To me it evokes an image of someone who only knows computer science, and is entirely ignorant of other topics (say, history and literature) that a person needs to know to be a successful human being.

Actually, what it evokes to me is someone for whom English isn't a first language (at least not a mainstream American dialect of English), but who has domain knowledge, writing a guide in English in the domain in which they have knowledge.

It could certainly use editing, but the awkward word choice you point to doesn't really suggest anything to me about the person's knowledge of history and literature.


In defense of the list, I don't think it was intended as a blueprint for becoming a well-rounded and educated person; it was meant to provide a curriculum for learning the sort of technical skills people at Google would like to see in a candidate.

That said, there's actually a tremendous amount to learn on this list, and there are only so many cells to hold information. While it's always nice to say "and yes, also learn history and literature", truth is, eventually you have to not learn something in order to learn these things.

This may be a case where the extreme rigor of the interview process and tolerance of false negatives may hurt large companies and create an opportunity for small startups and individuals. Here's why: a candidate who is very, very talented at CS but also learned history and literature might come in at 90%ile, but let's say it's nearly impossible to come in higher than that without neglecting those other topics. However, the interviewers are all demanding a 95%ile+ performance. As a result, they pass on the more well-rounded candidate.

It's even more insidious than this: because they select for the 95%ile, and it is very, very rare to be able to achieve this feat without a single-minded focus on CS, the people doing the interviewing will be at best faintly aware, and possibly completely unaware, of what they are missing.

Before getting too irritated, though, remember that any blind spot on the part of a large company is an opportunity for a small, nimble one. This is largely what is going on, I think, when you read those stories about talented programmers getting rejected by Google and Facebook and coming back 5 years later to sell them a company for 100 times what their salary would have been. Of course, it's a much higher-risk path that requires being broke for a while, and may just not be an option for people with other substantial life obligations.

There's this story I really liked about Andre 3000 (sorry, no link, just something I read, may be apocryphal). During an interview, he was asked how he chose the three chords for the song, and his answer was that he was learning guitar and those were the three chords he knew so far. So he took just the slightest bit of knowledge and turned it into something great.

I read this at a time when I went to a music store and saw an incredible guitarist doing such impressive things with his instrument that he drew a small crowd. He wasn't just showing off, he was the local instructor. I believe he was also available for small gigs, weddings, parties, that sort of thing.

A lot of companies would pass on Andre 3000[1], and hire the awesome guitar-playing dude, because Andre 3000 wouldn't be able to play a B-flat scale on demand or explain the circle of fifths or sight-read a medium-complexity piece of music at at least 150 bpm. And I'm not knocking those skills; if you're a musician, by all means, yes, learn those things. But if that's all you focus on, to the exclusion of more creative things (and there's enough complexity that you could easily do so), you will starve other important things.

[1] Prior to the hit. Eventually, accomplishments do speak for themselves.


By

> I loathe this kind of thing

I meant "academic learnings." "Learnings" is not a word in the English language.

When I see writing that is this bad, it evokes, to me, an image of an Indian student who effectively only learned math/computer science/engineering, and does not have a sufficiently broad education to think critically and independently about anything else.


As much as you might loathe it, learnings is now a word (and has been used intermittently for a few centuries [1]), so deal with it.

Additionally, Indian students do the best they can given the limited resources (we have). And even when they do have a broad education, it's not going to be about Mozart and the effect of WWII on Western nations; they will always be out of context when it comes to Western culture. It's the same kind of inability to think (or express yourself without raising eyebrows) about everything else that you might be showing here.

[1] http://en.wiktionary.org/wiki/learnings


Seems more like you are a provincial person who is not aware that English has many dialects.


Why does it have to be an Indian student? Your point would be equally valid without racist generalizations.


edit: To put it more simply, a stereotype is not the same thing as racism.


I happened to learn all of them. It's not necessary, but it's also not unrealistic.


I'm sure there are others who have learned many of these languages as well, but that doesn't mean it's realistic for most people. It probably also depends on how we define "learn".

One of the (many) problems with this page is that it is geared towards computer science students (see header or URL). Listing 10 languages (with an "and", not an "or") may give a student the impression that he/she needs to add these 10 languages to their repertoire in order to be a successful engineer.

I'd be less critical of the page if it listed "or" instead of "and".


I'm a software engineer at Google and I was involved in a university recruiting program last year. The content at the linked page is pretty close to the slide deck that I was given by the recruiters to read to the students at the event. So I imagine that any sloppiness is an artifact of the recruiters moving text around w/o a live SWE to expand on & correct the text as they read it.


I write JavaScript for a living, and I often forget if it's one word or two. It doesn't mean you don't know JS. And yes, we all know HTML isn't a programming language, but it's so essential to releasing software nowadays that it's almost a pointless distinction to separate it. One extra list item to point out that it's a markup language? That groaning sound you hear? It's every intro to programming deck having to laboriously expand by one slide for no good reason.


It's highly likely that this is the result of format transitions (being a copy of a copy), while the links and substance probably haven't changed. The non-technical people who moved this from some internal document to this page just don't focus on the same details you do -- they don't know the difference between markup and programming languages.


Well, consider that the content was probably developed by someone who knows what he/she is talking about, and was later edited, mistakenly, by content editors.

This is a reasonable explanation to me.


The guide to python (and I am currently training up a few folks in python) seems to be a guide for complete retards to learn python. It's woeful.


"retards"? really? that's still a thing that people say?


There are people for whom it hasn't sunk in yet that it's a thing you don't say, and then there are jerks.


Hmmm. I think it has well evolved beyond being a medical term of any kind and is now just an expression for somebody who is being stupid. There is a big difference between somebody who is stupid (something I would never say someone is) and somebody who is being stupid, which most of us are at least once a day.


And w3schools is provided as a reference :|


Google.com loves w3schools. Top search result for every DHTML query.


That's one reason you don't search for "DHTML."


DHTML is a category, not a single search term.


I don't think I've heard someone use the term DHTML since the 90s. I don't expect you will get a lot of resources on it.


Is the content of w3schools actually bad?

Its layout is reasonable, and it seems to have fairly comprehensive coverage of the topic. It's not overwhelming like w3.org. I can find what I want to know with relative ease (unlike on w3.org).

I have seen a couple of articles pointing out that their implementation details are bad, but I don't think I have ever gone on there with the intention of clicking "view source"


w3schools isn't an alternative to w3.org. MDN (developers.mozilla.org) is. And so is webplatform.org.

w3.org is the W3C, the actual organisation that publishes the CSS and XML specs (and the versioned snapshots of the HTML spec).


No, it's not bad. Hating on w3schools is like hating on PHP: a way for a person to socially signal that they are a Real Developer, not some noob.


w3schools is fine for students just starting out. Though I'd direct them to the MSN articles (but I'm guessing Google can't)


You mean MDN?


Sorry yes, Mozilla Developer Network.


CSS is also on the list of programming languages, along with "Shell", which is probably supposed to mean bash.


Bash is not the only shell out there, though. Personally, I'd never tell anybody to write bash: either I want a very portable script that can run basically everywhere (on un*x) without installing anything, in which case I use POSIX sh syntax, or I want a more expressive language, in which case I use a real programming language like Perl or Python or whatever.


If I'm going to be writing something that consists mostly of chaining commands together, I'm going to write a shell script, just because it gets out of my way for that purpose.

At the same time, anything I do at that level is probably going to be the exact opposite of portable: it's going to be something I intend to be specific to the exact system I'm writing for (namely, it's meant to run on a specific version of a specific Linux distribution, and possibly even to specifics of the installation on that machine). While I'm not likely to take advantage of many bash-specific features just because anything I write in shell tends to be so simple as to not need them, I'd also be an idiot to eschew useful features like arrays just because they're bash extensions.


If you don't care about portability and you need "advanced" language features like arrays and other bash-isms, why do you hurt yourself by using such a terrible programming language?

In my experience shell scripts are fine for starting (and restarting) various programs and daemons or as thin wrappers around other apps. Beyond that I don't see the point. There's nothing other languages won't do better.

I've seen many "small bash scripts" grow up to thousands of lines, adding features one by one. And then you have to maintain that mess that catches fire every time an unexpected condition occurs because error handling in shell scripts is a joke.


Because there's no point in writing a Python script if more than half my code will consist of calls to commands.getstatusoutput().
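For what it's worth, here's a minimal, hypothetical sketch of that trade-off in Python (using subprocess.getstatusoutput, the Python 3 counterpart of the commands.getstatusoutput call mentioned above; the commands and paths are made up):

    import subprocess

    def run(cmd):
        # Shell out and fail loudly, roughly what a shell script would
        # get for free from "set -e".
        status, output = subprocess.getstatusoutput(cmd)
        if status != 0:
            raise RuntimeError("%r exited with %d: %s" % (cmd, status, output))
        return output

    # A hypothetical housekeeping job: nearly every line is just a command.
    run("tar czf /tmp/logs.tgz /var/log/myapp")
    run("scp /tmp/logs.tgz backup-host:/srv/backups/")
    print(run("df -h /srv"))

The equivalent shell script is three lines, which is the parent's point; the Python version only starts paying for itself once real logic (parsing, branching, data structures) creeps in.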


It's not the only language, but I think you're splitting hairs on the spirit of "learn at least 1 shell programming language."


In 2015, are there that many systems out there without bash?


Windows apparently has > 90% of the desktop market share. http://www.netmarketshare.com/operating-system-market-share....


In the base install? Last I checked all the BSDs at least.


Have we already forgotten Shellshock? Especially in 2015, it seems like a good idea to avoid bash for anything non-interactive.


I've found that Node.js scripts are easy and fast to write and can be run everywhere easily with npm.


At least until recently, it was pretty difficult to use Node to wire applications together, piping from one to another like you can do in a shell language... The recent cp.execSync addition, along with ShellJS, goes a long way though...

That's not to say I wouldn't use it... it's actually my preference, and I've used npm for stuff that isn't strictly even node, but ymmv.


GitHub also lists "Shell" as a language.


I wonder if we could create a GitHub guide, kinda like the community-written novel that was around a few days ago.


Yeah, also noticed they referenced w3schools as a source of learning. Guess MDN was not allowed by management.


The MIT Introduction to Algorithms link takes me to a page that says: "Your connection is not private. Attackers might be trying to steal your information from ocw.mit.edu (for example, passwords, messages or credit cards). NET::ERR_CERT_COMMON_NAME_INVALID"


Same error with the link to MIT Mathematics for Computer Science.


I've noticed this sort of thing in a lot of Google's recruitment materials. If you ever interview with them you'll likely be sent some emails with a bunch of stuff like this which includes similar typos, minor broken English, and iffy recommendations.


They listed w3schools as a reference? Not sure if I can take this seriously anymore.


I cannot disagree more with this line of thinking and the rest of the commenter's general ultra-pedantic attitude.

Please, stop thinking like an "expert" and put yourself in the mind of an 18-year-old who is unfamiliar with all of this stuff. Yes, the author could be infinitely more pedantic, precise, and helpful for one specific career outcome (being exactly like you). That means precisely jack for a confused 18-year-old looking for guidance on which programming topics to spend their time on while being moderately effective.

Please can the pedantic nonsense and try to focus on the goal.


Seems like they corrected that typo.


I can't help but feel that the "Java Script" error was more of a subtle jab at JS not being a "real" OOP language.


It's more that the non-techie who wrote it fell for Mozilla's decades-old harebrained marketing gimmick of naming JS after Java.


s/Mozilla/Netscape/


Wow I actually thought about that and then my fingers typed Mozilla anyway. Oops.


That's a bit of a "no true Scotsman". #\L in "HTML" stands for Language. HTML is a declarative programming language. Programmers have voted with their keyboards, and it is not too early to predict HTML's landslide victory.

The reason for its success is that declarative languages are great for data compression. All the low-level gore of reaching across the internet from one remote computer to another is reduced to a pair of <a> tags and a little text.


Calling HTML a language is OK. But it's generally agreed that the term "programming language" should refer only to Turing-complete languages.


Agreed by pedants. The whole post about spelling mistakes is pure pedantry. When I notice myself caring about such things, I admonish my superficiality.

I know lots of brilliant people who don't write well. If Google doesn't run posts through 10 levels of checkers, fine.

Excluding non-Turing-complete languages is probably a bad idea technically, as perhaps one wants to promote a toolkit of appropriately powered languages. (I've seen arguments over "HTML isn't a programming language" before, and really don't care about that kind of precision when it's unwarranted in the context. Concepts like mathematical functions weren't all that precise until precision was needed.)


Isn't Coq's language, Gallina, not Turing complete (all functions must terminate)? It's still a programming language.


That's an interesting case. HTML appearing in a list of programming languages makes me feel like Google isn't trying very hard to make a good impression on the students they want to recruit. I would feel less that way about Gallina appearing in such a list.


De facto evidence that HTML is a programming language is right here on the computer screen. Arguments premised on turning oughts and shoulds into ises are collectively considered fallacious. The fallacy goes by a variety of names, including the "is-ought fallacy", the "naturalistic fallacy", and the "definist fallacy".

That Turing completeness is a property of some programming languages does not imply that all programming languages are Turing complete, or that Turing completeness is an essential property of computing languages. The assumption that the extent of Turing-complete languages is identical to the extent of programming languages requires some evidence in support.

Programming languages for describing state machines need not be Turing complete, and their implicit avoidance of the Halting Problem makes them useful in practice.


And the HT stands for HyperText and M for Markup, neither of which is programming. If HTML was a programming language, the web wouldn't need JS to be programmed.


Just the same, the HTML output in most contexts is generated by code that someone did program, or deceptively results in.

In terms of things to know, a programmer should know HTML to some extent in this day and age... Requiring a bullet point for HTML separate from "programming languages" is absurd, considering the knowledge and concepts are indeed tightly related in the greater context.


For anyone reading this: I strongly recommend finding a better resource than w3schools, the one suggested by Google.

The link is even broken, but this is a tiny detail.

W3Schools has a bad reputation [1] and is widely considered poor.

[1] http://www.w3fools.com/


For example: https://developer.mozilla.org/en-US/Learn/HTML

But really, if you want to learn HTML+CSS, you need to spend a bunch of time just making layouts if you want to get a mental model into your head. It is just too finicky to reason about otherwise.


I prefer W3Schools over MDN since W3Schools gets right to the point. MDN would rather have you read three paragraphs on the history of the feature you're trying to use, but I usually don't need a full history on my reference cards.


MDN contains both guides[1] and reference documentation[0] (which I happen to have read some minutes ago). While I have seen the occasional digression on a particular history point that explains some particular design choice, I found references to be often concise, complete, to the point, and correct (even across browsers, providing details if need be). BTW w3schools apparently contains squat about MutationObserver.

[0]: https://developer.mozilla.org/en/docs/Web/API/MutationObserv...

[1]: https://developer.mozilla.org/en-US/docs/Web/Guide


I think you nailed it here. I started learning SQL years ago from W3Schools, and it was great because it got right to the point. I can't imagine trying to get a handle on something completely new by starting with full reference material. It seems that people hate W3Schools for not being something it isn't supposed to be?


That site doesn't say what's wrong with W3Schools, and it does say "when you're ready to level up, move on" which suggests that it's a reasonable resource for beginners to use.


For me at least, it's simply an incomplete list of popular keywords with some explanation. I can never use it as a full reference. There are [1] many [2] posts [3] describing what is bad about it.

1: http://meta.stackoverflow.com/questions/280478/why-not-w3sch...
2: http://meta.stackexchange.com/questions/87678/discouraging-w...
3: http://www.codecademy.com/forum_questions/4fd1d78b7e79680003...


I'm really curious. What is bad with W3Schools? From those 3 links you posted, the only useful criticism says "they used to be notorious for serving outdated, or outright bad information." I don't see any example.


Except even w3fools recognizes that w3schools has improved...


W3F is brought up quite often, and W3S receives quite a lot of hate and criticism (not always constructive).

But I still use W3S when it's the first result in a Google search, and I'm OK with it. It might not be the best learning resource, but it's pretty great if you've forgotten some CSS property or what some JS method is called. So W3S is fairly good as a cheat sheet.


Along with that... going to the https version of w3schools greets you with:

Your connection is not private

Attackers might be trying to steal your information from www.w3schools.com (for example, passwords, messages, or credit cards). NET::ERR_CERT_COMMON_NAME_INVALID

... Scary for someone who's never seen it.

Bad form from Google -- really.


Hmm, I was a little disappointed with the list for a few reasons:

- Mostly just stuff that you learn along the way if you remotely like coding and go to an engineering school

- Doesn't talk about what to NOT spend/waste too much time on in favour of the listed things

- Doesn't list getting familiar with any version control systems. Obtuse as they may be, it's a must-learn skill for a software developer before being able to contribute to production code.


You seem to be contradicting yourself -- "just stuff that you learn along the way" vs "Doesn't list getting familiar with any version control systems". Aren't version control systems "stuff that you learn along the way" in your opinion?

IMO, any resource like the one in the OP is going to get bashed ("not the way I would learn stuff"). Might as well learn something from it.


> Aren't version control systems "stuff that you learn along the way" in your opinion?

I don't think so. The reasons for using version control when programming solo aren't obvious. Especially with the intimidating learning curve something like git has for newcomers. Most CS students I've tutored would either dropbox their code, or store it by emailing it to themselves.

And then there's the difference between using version control just for yourself, or a small group project, vs using version control on a large application, such as OSS with hundreds of contributors. Workflows, best practices, etc.

I had never used any version control until my first internship. I didn't really put any effort into a good git workflow until I was at a company that actually cared about establishing one.


No, you pretty much don't need to bother with version control systems unless you're writing code on a team with others. And even then, students will likely never deal with any of the complexities that arise in the real world like branch merges, continuous integration, etc.

In general, school teaches you the content without the tradecraft. You'll need both to be a "rockstar" developer, and I don't see how you could ever get the tradecraft without working in the industry.


>> No, you pretty much don't need to bother with version control systems unless you're writing code on a team with others.

If you are writing hello-world, maybe you don't need a VCS. Anything bigger than that can benefit from using one.


I have a pretty clear recollection of having heard about VCS in an undergraduate class, but presented in the most boring possible way, so I learned whatever I needed to regurgitate for the exam and moved on.

A few years later, I found myself working in the industry and using a real VCS (Microsoft's; I don't recall the name), and realizing this was the same thing I had been doing lamely for the previous two years with daily zip files and lots of notes in the README.txt for each project.


> No, you pretty much don't need to bother with version control systems unless you're writing code on a team with others.

You may not "need" to, but almost any individual project will benefit from it, also.


Right, but if you've never used one, you don't know what you're missing. The default of coding without one works just fine for most school projects.


Yet students are among those MOST confused by "it worked a minute ago; what did I change that broke it?!" Keeping your code under source control allows answering that question. A useful tool and lesson for the student.


I can't see any decent school not teaching version control. It's immensely valuable even to the solo developer. Hell, my shitty school taught us Microsoft SourceSafe, which was hilarious because two team members couldn't work on the same file simultaneously. You'd get lots of late night phone calls: "Check in the damn file so I can edit it!"


That's a misconception. If you set the VSS "lock on checkout" flag, then yes, it sucks. It's a mistake to set that on a project, yet it's the default (?!)

There are adequate merge tools available in VSS, like any other source control tool. Disabling them is silly.

I consider source control to BEGIN when two people can check out the same file. If you can't do that, then it's really just version control.


I don't think version control is necessarily something you learn along the way if you are just learning on your own. It is a good idea for any nontrivial project to have, but I also don't feel like it's such a necessity that someone who is learning would go out of their way to use it. I'm not even sure everyone would necessarily be exposed to it and know it exists when just learning, unless they decided to contribute to OSS, which seems pretty far down the list.


I would agree with VCS, preferably distributed and centralised.

It does mention GitHub and Kiln (Mercurial)

"Work on a small piece of a large system (codebase), read and understand existing code, track down documentation, and debug things. Notes: Github is a great way to read other people’s code or contribute to a project.

Online Resources: Github, Kiln"


I don't agree at all. First, although it doesn't state it, it feels to me (and was confirmed by a googler in a separate thread) like this is meant for university students, for whom there is no "along the way" -- just academic CS courses.

Second, they did bring up version control: "Work on a small piece of a large system (codebase), read and understand existing code, track down documentation, and debug things. Notes: Github is a great way to read other people’s code or contribute to a project. Online Resources: Github, Kiln"


Sorry for disagreeing a little bit, but being familiar with a tool like a version control system is the least of the problems.

It's important, but not so important that it needs to be an item on this list.


Eh, I agree with most other commenters saying that it's quite disappointing, and I'm fairly confused about why it got 300+ upvotes. Probably the influence of the google.com domain. I even double-checked the URL to make sure it's not some publicly editable page that happened to be under google.com.

If I had to pick one tip for a junior developer, I'd say slow down when it comes to adopting methodologies and techniques, and use your own judgement. What's the benefit of doing it? What are the alternatives?

When you learn about OOP design patterns, it's easy to suddenly see them everywhere. Factory here, singleton there, and with a bit of imagination an observer goes here. Not everything should fit into one pattern or another. One of the benefits of the patterns is that they might make the solution cleaner and easier to understand. If a pattern increases the complexity, don't use it (or at least think harder about it)!
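As a toy illustration of that overhead (hypothetical Python, nothing to do with the article), here's the same trivial behaviour written pattern-heavy and plain:

    # Pattern-heavy: a Singleton factory wrapped around a trivial need.
    class Greeter(object):
        def __init__(self, name):
            self.name = name

        def greet(self):
            return "Hello, %s" % self.name

    class GreeterFactory(object):
        _instance = None

        @classmethod
        def instance(cls):
            if cls._instance is None:
                cls._instance = cls()
            return cls._instance

        def create_greeter(self, name):
            return Greeter(name)

    print(GreeterFactory.instance().create_greeter("world").greet())

    # Plain: the same behaviour, no ceremony.
    def greet(name):
        return "Hello, %s" % name

    print(greet("world"))

Both print the same thing; the first just has more places for bugs to hide.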

When you learn about Agile (although nowadays it's a very ambiguous term and loads of stuff goes under this label), it's tempting to start organizing standup meetings, planning everything in sprints, assigning story points, doing TDD, etc. But think twice before going for it. What do you get back? Maybe the morning standups are not beneficial for the team and just distract everyone? Maybe story points are not worth the overhead of agreeing on and allocating them, and your team can do just fine without them?

If some methodology doesn't really stick with your team, maybe it's an indicator that they are totally fine without it? For example, maybe you haven't had a morning standup for a week and no one has brought it up, and everything's going smoothly. I'm not saying the listed methodologies are wrong; they are not. It's just that sometimes they are not beneficial.

The same goes for a new language, a promising new framework, etc.

Be critical.


Interesting that they don't mention any database-related tech, i.e. set theory, relvars, and PL/SQL et al. (plus the NoSQL query languages).


Databases weren't part of the core CS curriculum at least when I was a student. I still don't understand why since it seems as core to understanding modern software as anything to me.


The university I went to seemed like it was half Java, half .NET, and half databases. Seriously, databases leaked into everything, and they had database classes that were just as extensive and in depth as the other two 'tracks'. We also had an ORM god for a professor.

https://en.wikipedia.org/wiki/Terry_Halpin


No wonder some people have told me that new devs often can't write a simple SELECT statement or use a database from the command line.


It doesn't just stop there. They tend to rewrite a lot of things that you could do in SQL on NoSQL databases (the irony), and the implementations are half-written, buggy, and break on ad hoc inputs.

Even worse, these sorts of things come back and bite people when they design schemas. Since the NoSQL database fashion took off, schema design, performance tuning, et al. are rapidly becoming rare skills to find people with. This results in all kinds of tech debt.

Much could be avoided by having good SQL skills.


> They tend to rewrite a lot of things that you could do in SQL on NoSQL databases (the irony)

You do realize most "NoSQL" databases are "Not only SQL"?


Or do as I do and go on the one-week Oracle PL/SQL course before starting a project.


I second the recommendation of Udacity's Software Testing course and Software Debugging course. Software Testing is taught by the author of csmith, and Software Debugging is taught by the author of delta.

I feel that most CS curriculums could be improved by teaching more of testing and debugging. Do you know why it is not done?


> I feel that most CS curriculums could be improved by teaching more of testing and debugging. Do you know why it is not done?

I always thought this kind of skill was best learned through experience: by writing buggy code first, then fixing the bugs, then realizing that some testing could have helped to find the bug much earlier on...


Because it is a practical skill and hifalutin schools prefer theory and arcana.


Testing seems to be somewhat of a niche field. Manual unit testing and random testing get good results and seem to be the standard, but you certainly could not devote an entire semester course to them.

Beyond that, more advanced program analysis techniques, such as model checking and abstract interpretation, use some tricky math and don't seem to be widely used outside of some niche fields (critical software).

Testing in VLSI design seems much more commonplace (people usually attribute this to the Intel bug). I don't know how much hardware design is done in a typical CS degree.


Software Testing is taught at the college I attend.


Even Google knows that it's called "Javascript", not "Java Script": http://i.imgur.com/kEpux5f.jpg


I cringed when I read "Java Script".


"JavaScript", actually. ;)


Learn UX, AI, cryptography, parallel programming, compilers, maths... lol. Don't let those guys/girls or HR write these guides; they put all the keywords in there.


Interesting: when I use the phrase "learn xyz", I mean "become very proficient at xyz" (maybe a notch below expert).

This post and the comments here seem to indicate that most people take it to mean "get an intro to xyz" (a notch above novice).


UX Design is just one mobile Android course.

https://www.udacity.com/course/ud849

Design has become so important that it would be nice to see a lot more resources. Learning to program is relatively easy these days because of the large number of resources. Design is still a dark art.


This is a guide for technical development. What I think is useful is DevOps knowledge and knowing your way around Unix or even Windows. Spinning up VMs and setting up your dev environment is oftentimes more than just following directions, because there's always some special case you run into that requires some knowledge and experience to solve.


I realize it's not a guide for the exact same thing, but How To Become A Hacker[1] has many things to say about leveling up.

The 'hacker attitude' is especially well summed up, imo.

[1] http://www.catb.org/~esr/faqs/hacker-howto.html


I bet that if this list were published on a domain other than Google's, it wouldn't even be taken seriously.


Does the advice of "just learn to use a few different languages and build a CRUD app that has a cloud hosted database" not hold true anymore?

A really good learning experience is having a crack at making the same app for both Android and iOS (much frustration with Objective-C). For a newbie there's stuff like persistent storage, learning MVC, learning Objective-C, Java, Android and iOS, SQLite, and all the other stuff in between (Xcode is awful).

All of the other CS stuff is valid, although I feel like you probably won't have the necessary buy-in to learn algorithms and data structures well if you don't have a degree / GPA on the line.


Do you still need to buy an Apple computer in order to develop iOS apps? As long as that's true, I don't think "make an iOS app" belongs in any general-purpose advice for aspiring developers.


True, most schools run both, or at least my university did/does. But for the self-taught, go Android: more phones, thus cheaper to get into the game.


Does anybody else notice and question the lack of functional programming on that list?


Big time.

It's especially depressing given their huge focus on "object oriented programming", considering the thousands of travesties birthed in its name on an hourly basis in the BigCos of the world.

1) Even if you don't code FP in your day job, your sanity and code will benefit permanently from a strong grounding in it.

2) Nothing beats getting paid to code FP.


It's Google. They're notorious for using... less-than-advanced languages, to put it nicely (C++, Java, Python, Go). Obviously it works for them. But just imagine if the founders had been more inclined toward a Lisp or an ML.


We'd still be using Yahoo?


Learn other Programming Languages: [...] Lisp and Scheme.

Apart from that, there's probably a reason no top 100 company uses a purely functional language as their foundation.


I compiled a list of online courses on Computer Science and Electrical Engineering the other day. The curriculum is based on my old CS college syllabus. Maybe it can be useful for students. Feel free to add something in the comments.

https://docs.google.com/spreadsheets/d/1oZ6eY7WqVNGgyRSF9F8L...


I don't see much about scientific computing, such as the topics covered in the Numerical Recipes books, many of which are relevant to data scientists.


It's nice to see a curriculum that hand-picks courses from a few different MOOCs and other relevant resources; that has definitely been a weak point of MOOCs, as most of them don't have enough depth/breadth on their own for you to learn all of these things.

I also think this is a great point:

"Work on project outside of the classroom. Notes: Create and maintain a website, build your own server, or build a robot."


It's interesting to compare this list with the topics covered by the Software Engineering Body of Knowledge or the Software Engineering PE exam. Requirements engineering, formal documentation, estimation, project management, and ethics all seem to be overlooked in Google's guide.

I guess this is sufficient to be an engineer at Google, but this wouldn't be close to sufficient in other industries.


That's software engineering 101. Everyone learns that in school. Of course Google still needs all of that, just like everyone else. But you can pick up the knowledge part (that is actually tested in exams) very, very quickly.


No, you can't pick it up "very, very quickly." That is a dismissive and indirect way to assert that engineering is somehow easier or requires less intellectual rigor than "higher" disciplines like CS or math.


I tried to make it very clear that you can pick up the knowledge-based part (the part that you can learn at school or from books, which is also the part that is tested in exams) quickly. In that respect it is indeed easier.

What may not be that easy is the experience to make good decisions. But there is no clear path for that, except being an engineer for a sufficiently long time. This piece was targeted at students and you can't just tell them: "Keep working somewhere else for 10 years".


Right. In school you can learn an ISO version that isn't relevant to the actual projects you take on as a non-government contractor.

And note that those topics are also presented as not requiring formal schooling to obtain.


I think this Udacity course is also very nice for parallel computing: https://www.udacity.com/course/intro-to-parallel-programming...


Hmm. They have something on testing, but nothing on requirements writing or project management.


This is explicitly a guide for university/college students, and maybe that should be reflected in a title edit. Otherwise the article could be read as Google assuming that a university context is the only time and place to learn.


The message: these are the kinds of employees we want to hire.

If you follow these guidelines there is a good chance we would hire you.

Universities: make sure you are teaching students these skills if you want us to hire your students.


Lots of criticism here, but very little of it constructive. Please could those people criticising the list explain exactly what is missing, or what should not be on the list?


The first sentence in the article encapsulates one of the biggest problems in this industry. It may apply to very specific roles within certain companies, or to specific subdisciplines, but in general the statement is little more than an unsupported assertion. It is very popular in a population heavily weighted by CS graduates, but popularity doesn't imply correctness.

As a thesis statement it is wildly out of sync with all but a few of the bullet items, most of which have almost no relation to "CS fundamentals". Working on projects outside of class/work is broadly applicable to any field, as is contributing to large projects as part of a team or group; neither of these requires "CS fundamentals." OO programming and learning specific languages are likewise disconnected. Some of the points aren't bad with respect to being required in order to be a good software engineer, but even those are too scant on detail (even for a bullet list).

Most of the "learn about" points are vague. In particular, as an example, the point about DS/algorithms is awful. What does "learn about" mean? Satellite engineers "learn about" materials, orbital mechanics, radiation and E&M, but they aren't in general expected to know the fundamentals (as a physicist would understand the term) of the theories of particle physics or gravitation for example.

The bullet points read like a survey of random computing related topics. There is no focus or cohesion connecting them to being a good software engineer. It reads like somebody's random meanderings when contemplating something they might find interesting within the field of computing. It's not a guide; it's a disconnected hodgepodge.

Remove everything that isn't related to being a good engineer (that would be almost all the points about sub-discipline-specific items, like machine learning), elaborate on the DS/Algorithms points, and provide something in the guide to actually support the opening statement (good luck).

That specific enough for you?


I generally agree, but I would also argue that they've left off the things that relate to being a good engineer, at least for "engineer" in the traditional sense.


Regardless of what should or should not be in this guide, it's on Google's careers pages and is being served up to students interested in a career at one of the most admired employers in the world. Not only is it in the careers section, it's in a section specifically targeting software engineers, who are considered a key component of Google's success. And the page is incredibly sloppy, both to those in the industry who code and probably to many in the industry who don't (recruiters, non-technical hiring managers, etc.).


Google needs to think about this from a "brand" perspective.

Likewise, some of Google's published APIs are "in need of improvement", which is polite civil-service speak for "you're on a PIP".


This doc is exactly how recruiters and managers write. It is written by recruiters.


I wonder why Go is not in the list of recommended languages


Maybe there's a lack of resources out there, or Google thinks that if you learn a more generic language you will mold more easily.


Maybe they're a large company and Go is a systems language used by a subset of the company. And maybe it's not important to learn for general technical development with no specific problem to solve.


Go is only recently popular, so very few past successful candidates had Go experience, so recruiters wouldn't notice it.


It makes me sad that they have to specify this: "Checking off all items in this guide does not guarantee a job at Google"


This reminds me of my 0 to web developer guide:

www.developingandrails.com/2015/01/crash-course-on-modern-web-development.html


> Add to your repertoire - Java Script, ...

Should I ignore this as a typo? How am I then to rely on this as classified information?


Surprised that computer networks and protocols are not listed. Isn't that knowledge essential?


This seems mostly fine; a robust undergraduate education should cover most of those bases.


Lisp and Scheme? Scheme is a Lisp. I think they wanted to say Common Lisp and Scheme.


Grammar and spelling aside, this is a good reference for those Ask HN "How do I get an internship/first job" posts.


[flagged]


Could you be more constructive, and suggest what is missing or what shouldn't be on the list, please?


HPC, cluster computing, assembler, some exposure to PIC or Atmel, plus CCNA-level knowledge in networking.

Oh, and know how to use a soldering iron :-)


Or getting copy editing done via crowdsourcing.



