For us, when hiring, we explicitly look for candidates without degrees, or with degrees in unrelated fields; they honestly tend to be better hires. They might not know some formal definitions or some algos off the top of their heads, but we've found they're generally more resourceful, better at thinking outside the box, and better at solving complex problems without being handheld. Not to mention that what someone learned in uni a decade ago is probably not applicable anymore whatsoever (in tech). Experience/projects are a way, way better indicator of a potential hire's quality than their formal education.
> Not to mention that what someone learned in uni a decade ago is probably not applicable anymore whatsoever (in tech).
Working all night to meet a deadline is applicable. You learn dedication at university.
University also teaches you to take a risk on something long term. You pay a bunch of money, and work hard without compensation for a number of years, hoping it will pay off somehow. You risk not making the grade and dropping out, losing the investment of time and money.
I remember one assignment which was implementing floating-point addition in 68K assembly code. (Third year course on machine architectures.) Code was executed on an emulator that could provide cycle counts, and there was a cycle count maximum to meet on certain test cases.
Now the permanent lesson here wasn't 68K coding, and that's not what made the experience memorable. Rather, the day before the assignment was due, the prof surprisingly dropped the cycle count requirements to significantly lower values. Everyone had to scramble to squeeze their code down, even those who had not procrastinated at all and had completed their code days in advance. We learned about dealing with a change in external requirements close to the deadline, and still making it.
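For anyone curious what that assignment's core actually involves, here's a minimal toy sketch of it in high-level code (the course used 68K assembly, and this is my own simplification, not the course's spec): decode the IEEE-754 bits, align the smaller operand's mantissa, add, and renormalize. It handles same-sign positive normals only, with no rounding, and ignores zeros, subnormals, and NaN/Inf.

```python
import struct

def toy_fadd(a: float, b: float) -> float:
    """Toy same-sign float adder for IEEE-754 single precision."""
    ua, = struct.unpack("<I", struct.pack("<f", a))
    ub, = struct.unpack("<I", struct.pack("<f", b))
    ea, ma = (ua >> 23) & 0xFF, (ua & 0x7FFFFF) | 0x800000  # restore hidden 1 bit
    eb, mb = (ub >> 23) & 0xFF, (ub & 0x7FFFFF) | 0x800000
    if ea < eb:                      # make `a` the larger-exponent operand
        ea, eb, ma, mb = eb, ea, mb, ma
    mb >>= min(ea - eb, 31)          # align the smaller operand (truncating)
    m, e = ma + mb, ea
    if m & 0x1000000:                # carry out of the hidden bit: renormalize
        m, e = m >> 1, e + 1
    out = (e << 23) | (m & 0x7FFFFF)
    return struct.unpack("<f", struct.pack("<I", out))[0]
```

Squeezing something like this into a cycle budget on a 68K emulator is exactly the kind of exercise where a last-minute requirements change hurts.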
Everyone I know, including myself, who came to software engineering outside of the standard academic route needed to make a huge commitment. Pretty much everyone who goes down this path faces a large cost in terms of the time commitment required to learn a new discipline, the price of training materials (boot camps, online courses, etc), and the opportunity cost of leaving an existing job to take a chance on a new entry level one.
I personally switched to software development from a decent but mind-numbing career. The first couple years were really rough as I lived off of dwindling savings trying to learn and pick up enough entry level work to get by.
Going the computer science degree > intern or entry level position > good engineering job route is both difficult and full of valuable experiences, but it's also the most cookie-cutter approach, with the clearest guardrails.
The notion that only the most dedicated people can make it through the experience of a comp sci college course in their early 20s is hogwash in my opinion. I have a 4 year degree in a different field and that was a MUCH easier path than breaking into the software dev business with a self-taught and boot camp background.
>Working all night to meet a deadline is applicable. You learn dedication at university.
It's not dedication. It's simply a deadline that you must meet or face failure otherwise.
>University also teaches you to take a risk on something long term. You pay a bunch of money, and work hard without compensation for a number of years, hoping it will pay off somehow. You risk not making the grade and dropping out, losing the investment of time and money.
Actually, university is the opposite of taking a risk. It's the safe choice. You get a degree, you get a good job: that's something that's always drilled into everybody.
Taking a risk is doing something that goes against common sense.
Not a given. A kid just has to walk down that hallway where the yearly grad photos hang and take note of how much less populated they are than the one lecture he or she just came out of.
Well, with the dedication you mentioned above, or more realistically "persistence", it's pretty much a given. Unless it's really not your thing and you switch to something different.
But I personally wouldn't know: I didn't graduate from high school and never got into university, yet now I am at the senior director level overseeing architecture at a rather large company.
I guess I made the safe choice back in high school when I decided not to go to my math finals and instead go to my first day of full-time employment with a city-level ISP.
Yep; e.g., out of, say, 350 enrolled in first year, 300 might find it's "not their thing", eventually leaving a graduating class of 50. Or else, in some situations, the university finds that some of the students are "not its thing".
And how many of those 300 "dropouts" will switch to "liberal arts" or something else, just because having a degree is a "proper choice that will help later in life", as opposed to really dropping out?
CS degrees are not for churning out new hires for whatever you do. They're for providing a well-rounded computing education that gives the student foundations to build on. Things like classic algorithms and foundational computer architecture will always help you. Perhaps you're only hiring coders.
I'm not the OP, but I have a non-STEM degree and work at a research firm doing DARPA-funded program analysis and compiler research.
My experience does not match your claim: I've found no relationship between having a CS degree and having a strong grasp on "foundations" (which foundations?). The strongest indicators that I've found for whether someone can do complex, research-level work are:
* Being able (and willing) to try at a problem, even with partial information. Even more critically: being able to subdivide a problem such that they can solve some of it and be forthright about being unable to solve the rest.
* Just knowing a lot of productivity- and thought-enhancing stuff. My best interviewees are active and voracious readers, of both technical and non-technical materials. They almost always end up treating interviews as conversations, ones where both parties end up learning a bit.
It's an interesting take. I wonder which market you are hiring in.
There might be a selection bias at play here. Someone who was able to create their own degree from scratch and learn enough to pass your interview will likely adopt the same mindset when tasked with a new problem. You are selecting for folks who require less hand-holding.
> Not to mention that what someone learned in uni a decade ago is probably not applicable anymore whatsoever (in tech)
Not if they had a good degree. Sure, the tools change, and whatever language/framework too, but the fundamentals don't.
Applying a systematic approach to debugging, algorithms, automata and complexity: these are still very much the same as they were a decade ago.
The bias comes from me. I'm the guy who got a CS degree a decade ago. I can tell you from personal experience, 100% of tech-related course material in uni fell into one of two categories: either "I know this already" or "this will never be applicable in a real-job scenario". And I know I'm just some guy on the internet, but I went to a "good" university, and I'm director-level at a tech company you've heard of. As you say, "applying a systematic approach to debugging, algorithms, automata" is absolutely essential in what we do, but I'm asserting you get much better at all of the above by researching, discovering, and applying it for yourself rather than memorizing textbook definitions and answering multiple choice questions.
If the degree is memorizing textbooks and mostly multiple choice questions then I question the validity of the degree itself.
Going to college forces someone to learn full time, and that's the real value. Sure, you can learn on your own part time, but it'll take way longer than dedicating a few years to the problem. So in that regard, I wouldn't advise a prospective CS student to go do something else and learn software "on the side".
I don't have a college degree at all. I went from a movie theater projectionist making minimum wage (8 years), to tech support (2 years), to customer service/QA at a startup (1 year), to an iOS developer (5 years), to now being a lead iOS developer at an established tech company. I don't particularly want to be an iOS developer, but it's the path that got me to where I am, and it pays very well. Not once during my last interview cycle (10 different companies) did anyone ask about a degree.
I think the grind and knowledge gained across multiple sectors was worth it, but it was a long road. I've been programming on and off my whole life, so I think it really depends on the person, but it is doable without any degree. Would I do it differently looking back? Absolutely, but I doubt I would get a CS degree.
Moral of the story: if you want it badly enough and go out there and make it happen, you can find a place in this industry with a high school education. Just demonstrate knowledge and keep on learning.
I'm just worried about age discrimination. I'm 35. Do I have 15 years to make bank, after which they take me out back and shoot me? How common are developers over 50 at these companies?
One of the reasons we see fewer older developers is that software development wasn't a major career in the '80s or even the early '90s; nothing like it is today in terms of the number of people doing it. Now that they are older, the number of younger people dwarfs them, in part because the number of jobs has gone up.
Older developers are there. I know a number of them.
There are also company differences. Some companies don't like to hire older developers. Some of these companies want people to work around the clock, and older developers tend to have families and hobbies. It's harder to pressure them into working long hours.
Then there are older developers who do not want to change. I know some of these. As technology changes at breakneck speed, they don't want to move on from what they did to something new. This can cause issues and give the age group a bad name.
I find the most successful older developers are those who use their experience to do a good job on what needs to be done while adapting as things change.
My company is LOADED with older devs. Our average dev age is probably close to 50. There is in fact a huge gap: only a handful of us are under 30. At 27 I might still be the youngest developer in my 50-developer department.
We have web, app, and even game dev. Personally I work in the Data Engineering/Machine Learning/Applied Stats space, with some app dev work as well. I'm learning to wear a lot of hats as we segment into small groups, and it effectively feels like a bunch of different startups in a "closed" economy within the company.
Sounds like a fascinating place to work. I feel the general rarity of graybeards in the industry denies me a different perspective and experience that I could learn from.
Is there any difference in productivity compared to a more stereotypical younger workforce?
I'm not good with people, so I doubt I'll ever be in a position to do so, but the ideal team composition to me needs a mixture of naive geniuses and people who've been around the block enough to know when to say KISS.
I would say we're a very stable group. It's rare that anyone puts in more than 45 hours a week, but at the same time stakeholders have extreme confidence that their projects will be done correctly. We're more of a steady churn than a burn-out-in-a-flash shop, which is the feeling I get from younger startup workers.
How does age discrimination manifest? I'm less concerned with being passed over for promotions than just having a job. Is it mainly that older guys are first on the chopping block for layoffs? How much does one have to "watch their back" and perhaps overachieve to stay safe?
I feel like it really depends what you're doing - not necessarily what job title, but philosophically too.
If you can kind of transcend past manager to "architect" then it seems to be much more "cool" to have older staff. If you're doing something highly cyclical (or prone to fads) like designing websites it would have to be an uphill struggle. Experience is hard to measure and value, in some areas more than others.
Obviously the company matters a lot too, e.g. maybe avoid IBM... I suppose you'd have to spot whether the management are likely to believe that youth guarantees innovation (to quote James Bond of all people).
Not all companies have age discrimination, and not every developer job is in SV or even at a “tech company”.
Also, why don’t you move to management? I’m soon to be 35 and I’m taking a director role now, primarily because it’s somewhat the natural progression path.
A lot of the experience you gain isn’t purely tech focused and it’s a shame to let that experience go to waste.
I like coding, I don't like management. I'm not good with people. Other hobbies of mine are cooking, woodworking, rock climbing. I like getting my hands dirty. I value good management and I'm pretty sure I'd be horrible at it, in addition to just not liking it.
I remember interviewing someone around 50 who interviewed well, whom we hired, and who did good work. I think part of why he interviewed well was that he knew we'd ask whiteboard coding questions, so he studied for them.
Some of what you hear about age discrimination is that older candidates are less familiar with CS fundamentals, because you don't use them all that often on the job, whereas fresh-out-of-school candidates still remember them.
This isn't to say the typical software engineering interview is effective or ineffective, just that preparing for what you know you'll be tested on helps a lot.
Yeah, Boeing (can't believe I'm citing them positively) had two tracks for engineers: management and technical fellow, with the latter basically being a researcher. Tech changes much more quickly than aerospace (which hasn't fundamentally changed since the '30s), so I don't know how a "tech researcher" would make sense, unless you're in a rarefied elite like Yann.
I used to hold it as a point of pride being able to make it in this industry without a degree. But increasingly I find it embarrassing, and am worried about the future. Particularly with the shift to remote first becoming commonplace now. What exactly is there to keep salaries from a race to the bottom at this point? With the number of young CS grads exploding, I can see it becoming a hard requirement in the next 10 years. And beyond that, the career trajectory for a mediocre IC with no degree is looking... bleak. Will I really be able to keep this up into my 50s? Probably not. I love software development but it simply doesn't seem sustainable. And the management path seems almost unimaginable for someone without even a high school degree. At the senior level, there's really nowhere to go from here. It seems like you either hit the startup equity lotto, or burn out by 40.
Why do you say that? I mean seriously, you're selling yourself short. Getting a CS degree is nothing more than a starting point. After a few years, it certainly won't help you progress into being a better developer automatically. Being a more senior developer is about more than knowing how to create a linked list or reverse a binary tree, and honestly the majority of places have no significant need for that understanding. It's about understanding tradeoffs between technical decisions, it's about understanding business, it's about understanding when to refactor vs. leave it alone. A CS degree will not teach you that, and yes, some places will be "snooty" and say they won't hire someone without it. Fine, screw them then. I didn't want to work for them anyway.
Salaries won't race to the bottom, because 1. there is still a demand, and 2. someone from a boot camp doesn't have the knowledge and understanding that a more senior person does. Can they learn that? Of course they can, but then they will become more valuable and expect more money. Companies that hire the cheapest person possible almost always end up at the same conclusion. You pay peanuts, you get monkeys.
Story time. At a previous company I worked with a number of smart people. Education-wise, I didn't hold a candle to them. One went to a top engineering school in the U.S., one had a master's degree in CS, and one went to one of the top-rated CS programs in the world. (They are all several years my junior.) All of a sudden our boss calls me in a panic, because some update that was done was failing and causing our clients to not submit data. After asking questions about what was happening, the level of impact, and where it was occurring, I briefly look at the error log and say, "There's the problem, and this was the update that caused it." How did I know? Because I'm smarter? No. I had more experience. I've seen more things. I've solved more problems. Their degrees didn't teach them anything about how to handle that situation. This is why not having a CS degree isn't the death knell of a career, and new people entering the field aren't either.
When I went to school to study computer science in the 90's, it seemed like most of my classmates were similar to myself: grew up tinkering with computers and had "taught themselves" (that is, read books and worked examples without being told to by a college professor) programming pretty effectively before setting foot on a college campus. I have no doubt that if I had never bothered with college at all, I could have done nearly the same work in the same positions that I've held since graduation.
In fact, every once in a while on here you'll see somebody post a sort of ironic list of the things that they don't teach you in a CS curriculum: databases, networks, version control, debugging, command-line terminal usage, web security, usability... in short, all the things that you actually need to know in order to be a proficient software developer.
Still, nearly everything (computer related) that I learned in my CS degree was stuff that I never would have come across on my own, like calculus, linear algebra, statistics, complexity theory, numerical analysis. It sort of "ties together" the practical stuff, but you don't need to know any of it to be effective. I'm glad I did it because it's neat stuff and a different perspective. In a way, the nice thing about having a degree is knowing that you don't know what you might be overlooking in not having one.
That said... why not just go and get one now? You'll feel a little weird being 10 years older than everybody else and it'll look weird at first putting "15 years experience, graduated last year" on your resume, but you're nearly guaranteed a perfect GPA at this point. In a few years, you'll have the "checkmark" you need and you'll know what you were (or weren't) missing.
>That said... why not just go and get one now? You'll feel a little weird being 10 years older than everybody else and it'll look weird at first putting "15 years experience, graduated last year" on your resume, but you're nearly guaranteed a perfect GPA at this point.
I tried. CS classes were a total walk in the park. But I simply could not do the math. I hired tutors, I quit my job and studied full time, I've never tried so hard at anything in my entire life. I bashed my head against a wall for almost 2 years trying to pass calculus, and failed it 3 semesters in a row before giving up. It's why I have the utmost respect for people with engineering degrees now.
> And beyond that, the career trajectory for a mediocre IC with no degree is looking... bleak.
I know a number of software developers who are sustaining. They aren't going up the ranks or down the tubes. They are happily sustaining. They do good work and are content.
We are sold the idea that we always need an upward trajectory... but it doesn't work out that way. As you get older you find there is no more up. The purpose of always going up runs out of steam.
> It seems like you either hit the startup equity lotto, or burn out by 40.
Is this a SV or startup mentality?
Most successful startups have an average founder age at founding of around 45. It's not the young who are more likely to found successful startups.
If you burn out by 40 it may mean you have an unsustainable lifestyle.
I feel the same thing as you, I think. I was never proud of it per se, but proud of how far I've gotten without one (20 years as a dev, 9 years at Google). But always felt a sense of impostor syndrome and now I feel it even more pronounced as I've started to look for work elsewhere and wondering if I'm at a profound disadvantage: middle-aged, spent a bunch of time out of the mainstream industry inside Google using non-"standard" tech stacks, no CS degree, and the industry has shifted to interviewing standards that prioritize new grads ("skill" testing algorithm/data structure questions that are far easier for people with recent CS class exposure, etc.)
And yeah, management track involves a lot more politics and organizational stuff than I really want to take on, at least within Google. I could see going mgmt track at a smaller company, but without recent experience they wouldn't hire me on that path.
I think interviewing at places that do interviews the right way is important. I've interviewed for senior/staff positions before where they immediately start talking about big-O notation and ask me to reverse a string on a whiteboard (seriously). Sorry, but your CRUD line-of-business app with 150 total and ~10 concurrent users can use whatever algorithm the developer wants, and if anyone is reversing a string manually there is a problem. It's not even worth continuing the interview if they're asking a senior candidate junior-level questions.
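For context, the junior-level whiteboard question being mocked here is roughly the following, a sketch of the classic two-pointer version. (In Python the "manual" version is pointless anyway, since `s[::-1]` does the job in one expression, which is kind of the point.)

```python
def reverse_string(s: str) -> str:
    # The "manual" whiteboard answer: swap characters inward from both ends.
    chars = list(s)
    i, j = 0, len(chars) - 1
    while i < j:
        chars[i], chars[j] = chars[j], chars[i]
        i, j = i + 1, j - 1
    return "".join(chars)
```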
Interviews for senior level positions should contain almost no technical questions at all - you can tell pretty quickly if someone knows what they're talking about by digging into their experience with earlier projects. How was a solution architected? What other ways did they consider? Why did they go the route they did? And my favorite, "think about the most complex project that you had a hand in architecting (for architects) or developing (for developers), and teach it to me." You can immediately see where they focus, what things they gloss over, and if you have time and it's an engaging conversation, changing one of the core pieces and asking how they'd change the design can be a lot of fun (for an interview, anyway).
I agree, and this is how dev interviews always were up until, I'd say, about 2010 or so... but if you stroll through past conversations on HN about this topic (interviewing), you will see there are very strongly two camps, with a _lot_ of people who really feel that the kind of interview you are describing lets in too many false positives and leads to deadweight people who "can't code."
I think of programming as a creative process. And like any creative process, set and setting are extremely important.
It's endemic at Google, to the point that it's a cliche.
Thing about having no CS degree is, in some ways I really _am_ an impostor here, in a company where master's degrees and PhDs are commonplace, in a company that was founded by two academics who never worked in the industry, etc. When I first joined I was blown away by how much of Google seemed academia-like... internal conferences with poster boards, peer and code review processes like a doctoral thesis committee, campus cafes, academic publishing all over the place, etc. Hell, there's even the equivalent of residences (GSuites, Google-managed rental properties for traveling Googlers)... It's toned down some as Google has ballooned in size, but still.
>If it makes you feel better, I have a CS bachelor's and I still get imposter syndrome on a fairly regular basis
I hear this a lot. The difference is that I objectively am an impostor. I've worked with plenty of Stanford/Berkeley CS grads and can conclusively say that I am a worse programmer than them and will never be as good. My concern is that eventually the number of those people will be so great that mediocre devs will be pushed out of the market. I'm honestly incapable of passing these Leetcode/Google style algorithm interviews which are skewed towards CS grads, and that's increasingly becoming the standard for every company regardless of size.
If you're like me, you probably are entirely capable of passing those interviews (with practice!) and are probably absolutely fine as a programmer, but are at a disadvantage because it's not the type of stuff you've ever had to do on a regular basis.
CS degrees train people to think that that kind of thing is what programming is ("why else did I pay thousands for this degree?"), and then they go out into the industry and propagate that in interview processes, and believe that failing these tests in an interview is an obvious indicator that someone is "lying" about their programming ability.
The reality is that for many of us programming is a far more experimental and experiential process. I will use all sorts of algorithmic and data structure techniques, but will go read or acquire that knowledge in the process of creating. It's not something I feel comfortable doing on a whiteboard in front of a person or group of people.
> The difference is that I objectively am an impostor.
Impostor how? Did you lie on your resume? To your co-workers? I would assume not. You were judged on (presumably) the same criteria as anyone else, and found to be at an acceptable level. Maybe other people are better than you, and that's fine. But if you weren't "good enough", they probably wouldn't keep paying you, and your manager would probably say something. That's kind of the point of performance reviews.
> I'm honestly incapable of passing these Leetcode/Google style algorithm interviews
Almost everyone is completely incapable of passing a Google interview. That's because they're astronomically hard and they only take the very very best people who are at the top of their fields. I've no hope of passing one either.
But so what?
There's always going to be some place that you aren't going to be able to get a job no matter how good you are. Look somewhere else - there's so many more options.
> That's because they're astronomically hard and they only take the very very best people who are at the top of their fields.
I would disagree with that. It's because coding problems are a very specific skill. It's essentially being good at riddles with extra esoteric knowledge thrown in.
Most Googlers are also not capable of passing them. Because they make them harder every year. I've trained twice to give interviews at Google and I just find it demoralizing to administer questions which I myself would find difficult to solve.
But a correct answer to the question isn't actually necessary. They're looking at your thought processes, and what they reveal about your knowledge and/or techniques, not necessarily correct answers. Some questions don't even have "correct" solutions.
I don't have a CS degree either, and this does worry me. Luckily it kinda seems like the interviews lean heavily towards algorithms, which I think is feasible to study alone (though it might be hard to do while working, admittedly).
Don't be too discouraged; the vast majority of programmers are never going to be from elite institutions.
> What exactly is there to keep salaries from a race to the bottom at this point?
I see so many takes like this, but I'm not yet convinced that this is gonna doom salaries long term. I haven't noticed a difference in the recruiting communication I receive in terms of salary, and my current salary is above average. Companies would be insane to ask current employees to take a pay cut to go remote unless they want people to leave.
As long as companies are focused on hiring the best people they can find, they're going to be motivated to continue supporting market rates.
> What exactly is there to keep salaries from a race to the bottom at this point?
Nothing. Based on numbers I've seen, there are probably a lot of companies out there where I'd take 75% of what their devs are making today if it means I can be permanently remote. Probably 50% if I could move freely internationally. Now multiply that by how many people are eligible once you expand your hiring range from driving distance to national/international.
I'm 38 and have been asking myself this same question. I have about 4 years of experience, and with the scope of knowledge that a dev is expected to know, I feel like I need to go back to school and get a degree in CS or in something else that I can transition into another field.
Software was always an outlier in professional fields, as most industries require some form of degree or certification, so I think it's good to get ahead of the curve.
As a data scientist without a PhD, I have a very similar feeling. You can kind of make it without one for now, but I have no idea how long that's going to last. Hopefully by the time it happens my experience will outweigh the lack of credentials.
I'll just throw out that I think CS degrees are a terrible idea, because the fact of the matter is there are virtually no actual CS jobs out there. I got my degree in Software Engineering instead, and I find myself far more prepared than many CS colleagues. In my degree we had classes on project management, requirements gathering, structuring and managing databases; heck, one class was just about getting your A+ cert so you'd know how to diagnose and repair misbehaving machines and understand things a little more holistically.
If you want to be an aeronautics engineer or a civil engineer, we don't require you to go get a degree in physics. Sure, you'll have a lot of classes that deal with physics, and you're expected to be familiar with the basics, but you aren't going to have physics as your major. So why the heck do we keep pushing people to CS when they're going into Software Engineering?
If you think a CS education is still necessary and valuable in your work as an SWE, please tell me why. If people feel it was useful, I am curious to understand why.
EDIT: I appreciate the answers I've gotten. As clarification: I have noticed several people cite how their degree helped them get a job or move positions, and I agree that it does that; however, I feel a degree in Software Engineering would be just as valid in these situations. My question is more: does a CS degree or an SE degree help make one a better engineer?
I think my degree is fantastically useful, and I'm relatively young.
There is just no way I would have ever learned the things I learned in my CS degree on my own. My degree reshaped my brain. It forced me to be good at math. It forced me to think like a programmer.
I failed my first 3 CS classes, but I kept going and ended up graduating in the top 5% of my CS graduating class. I would never have learned how to program without this degree, I am absolutely convinced. Like, yeah, maybe I could hook some stuff up to an array in Python and struggle through a for loop.
The CS degree gave me confidence, expertise in solving CS-related problems, and, for better or worse, the market expects me to have this degree. It helped me land a job where I already make 50% more than the median household income in my city at only 27.
I would take that student debt again in a heartbeat.
Two of my friends with 4 year college degrees recently quit their jobs and went back to study CS. I told them to avoid bootcamps. Why?
I've been part of the hiring process, and people from bootcamps are usually not very good, though occasionally they are. To me, a bootcamp education doesn't guarantee a CS-related role. You could end up with nothing.
Even my friends who STRUGGLED through a CS degree and are AWFUL programmers were able to leverage their degree into roles like Project Manager or Product Owner which have great outlooks and solid pay.
I've seen a lot of Software Engineering curriculum, and the more serious ones are always CS + Engineering (so semester long projects, projects management) and there was an expectation that the student could build something at the end. Basically, what Stanford/MIT does (and a bunch of others).
On the other hand, I've seen places where the "Software Engineering" is about requirements, methodologies, and tools, with no math and no algorithms. That's akin to an aeronautical engineer without a solid background in classical mechanics.
Not sure why you're getting downvoted; I don't think you're wrong. Most industry work is not "computer science" work, but more software mechanics or engineering work. Having the CS background no doubt helps, but having it exclusively is a disadvantage. It is worth pointing out, though, that there is great diversity in the content of CS programs, and some schools emphasize practical industry aspects more than others, or have aggressive internship programs like U Waterloo does.
Switched to? I guess I really am old. This is how almost all such jobs were labelled until 10 or so years ago. Somehow "Software Engineer" became the standard, probably under the influence of the FAANGs, in the last decade.
It's worth pointing out that here in Google Canada we were recently instructed to start referring to ourselves as Software Developers again, as "Engineer" is at least partially a protected term.
I wish there were a better term. We are not scientists, but not mechanics or engineers either. This isn't a minor question. There is an entirely new set of knowledge workers in the new economy who do not have 19th- or 20th-century equivalents.
"Programmer" and "Programmer analyst" were titles used frequently when I first started getting into the industry. Then "software developer." Honestly, that's mostly what we do... develop and repair software.
> It's worth pointing out that here in Google Canada we were recently instructed to start referring to ourselves as Software Developers again, as "Engineer" is at least partially a protected term.
You make a statement without explanation, and then when people dismiss it you ask for an explanation. It's very odd behavior. At best I would call it an asymmetric way of thinking.
I find there's an implicit assumption that an engineer will possess certain knowledge: things like a solid math background, project planning, economics... Also, that they should be able to apply rigorous solutions to problems and be capable of doing root cause analysis.
It's the same thing in the medical field where everyone had the same core courses in med school before choosing their specialty.
I don't think "front-end engineer" is the correct term for someone with a 6-month bootcamp on their resume, because it breaks all the assumptions I've made above.
I don't disagree. However, I do think the 21st century is partially about the breakdown of certifications and professional organizations, and a shift toward pure merit.
There are people in FAANG who don't even have any post-secondary education making > 500k a year as "Software Engineers". How this works without total chaos is that each company has a rigorous testing procedure for candidates.
This cataclysm has yet to reach the field of medicine or law but it is sorely needed. (I say this with family in both).
I understand the difficulty, and the term "engineer" probably should never have been used as a substitute for "skilled knowledge worker", but I would expect this kind of deracination to continue until the economic calcification is no more.
> There are people in FAANG who don't even have any post-secondary education making > 500k a year as "Software Engineers". How this works without total chaos is that each company has a rigorous testing procedure for candidates.
That's what I expect from a credible credentialing organization: to be able to tell me that person X meets a minimum bar. Right now they get bad press because, especially in the medical field, it's more about gatekeeping and lobbying than actually enforcing standards and protecting the public.
I would argue that surely someone making that type of money at a world class organization like Google would have no problem passing any engineering exam.
How else will you be able to pass algorithmic whiteboard interviews? Everyone knows that acing those is the only way you can possibly be a good engineer.
For those wanting a degree, check out WGU. It's accredited and you can go through it as fast as you're able.
I dropped out of traditional college. In the last class I remember, they were talking about what GET and POST requests were. I had been a professional web developer for years at that point. I felt it was a waste of time.
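(For anyone who never sat through that lecture: the practical distinction is that a GET carries its parameters in the URL, while a POST carries them in the request body. A minimal sketch using only Python's standard library, with a hypothetical example.com URL and no network access needed:)

```python
from urllib.request import Request

# A Request with no body defaults to the GET method:
# parameters ride along in the URL's query string.
get_req = Request("http://example.com/search?q=degrees")
print(get_req.get_method())   # GET
print(get_req.selector)       # /search?q=degrees

# Attaching a payload switches the default method to POST:
# the data travels in the request body instead of the URL.
post_req = Request("http://example.com/login", data=b"user=alice")
print(post_req.get_method())  # POST
print(post_req.data)          # b'user=alice'
```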
In WGU I had a similar easy class, stuff that was "way below me". Not to worry, I passed the class in a day and stopped worrying about wasted time.
I recently did a C++ assignment filled with artificial requirements that made it needlessly difficult, yet I learned some things about pointers and figured out the basics of CMake for the first time. Useful stuff. Now I'm doing a similar assignment in Java. I knew I would have to set aside my distaste for things like Java (gross!), and with that mindset things aren't so bad. I'm actually enjoying JavaFX.
WGU's calculus class was harder than calc 1 at my brick and mortar university, but easier than calc 2. Once again, I knew most of the material and passed after only a few weeks of review.
I do agree that a CS degree doesn't teach as much as it sometimes gets credit for. I believe WGU's CS degree is as good as many brick-and-mortar CS degrees, but with minimal "bullshit", though there is still some.
Somewhat unrelated, but getting a degree (not necessarily in CS) can help with international mobility since some countries make it a hard requirement.
So if you are from a country where developer salaries are low, or that isn't conducive to start a company if that's your thing, then a degree can have a huge impact on your life regardless of what you actually learnt.
I know this is an unpopular opinion in STEM fields but a broad education in lots of different subjects is important to being a citizen in a democratic society. While a degree isn't and shouldn't be necessary for everyone, the goal should be to educate people more, not strive for a society where nobody needs formal education.
To me this is evidence of part of the problem: no one can agree on what higher ed should be, job prep or a liberal education. So we get the worst of both worlds. We force students to do both while taking on massive debt, resulting in a workforce that still has to be trained when it graduates, while much of the liberal education is ignored or useless, because there is an ongoing argument over whether college is supposed to prepare you for a job or make you a more holistic individual.
For the record, I would subscribe to the liberal-education view of college, but I question why that can't be had for free with the resources of the internet at this point.
Note: by liberal education I do not mean politically liberal, but rather a term encompassing a fully rounded individual, akin to a Renaissance man, who is familiar with and can discuss language and mathematics just as well as art history or philosophy.
> For the record, I would subscribe to the liberal-education view of college, but I question why that can't be had for free with the resources of the internet at this point.
How well is remote schooling working during the pandemic? Now imagine that, but with no teachers at all. Teachers are an important part of the education process.
Honestly, I feel that those who are hungry for knowledge and want to become well-rounded through a liberal education are the ones who will go out and gather that knowledge regardless of whether they went to school.
Whereas those who view college as a tool for securing employment complain about the classes and feel they are a waste of time, and as a result don't get much out of it anyway.
I agree teachers are important, but does a teacher have to be someone that sits and lectures in a classroom? If I am interested in OS theory I can join the Linux mailing list, ask questions on Linux development forums, heck if I ask a good question I may even have Linus himself take a minute to respond to my question.
Beyond that, from my understanding, many of the "teachers" at the college level are grad students assigned to teach a class, and professors view teaching as a chore.
Not that I am totally against teachers or traditional education, I just think that one can acquire all of the benefits of a liberal education without ever needing to set foot in a classroom in this day and age.
All it takes is a single 200-level Poli Sci or Econ class to see how about 40% of the comments in those threads on HN are nonsense. I'm sure the same is true for law and other humanities/social sciences as well.
I never felt like dropping out of college was a hindrance to my ability to write software, but lacking a deep and broad education has been an obstacle in the kind of general problem-solving that needs to happen when you're a senior developer or taking on project management or design work. Thankfully a lot of the education I got before I ran out of money was general ed and electives, but it's still a regret. These days with my busy work schedule I don't have a lot of time to fit in that kind of education on my own.
That's a good point. I think it's slightly different for SWEs, though, because most (all?) who don't have a degree must be good at self-education in order to be employable.
This is a common (and, I suppose, understandable) but mistaken HN trope: that developers who went to college learned how to program from their professors, whereas the non-college types are the scrappy, pull-yourself-up-by-your-bootstraps autodidacts. In reality, I've never even heard of anybody with a CS degree (or any other degree) who works as a developer who didn't also learn independently, exactly the same way people without a four-year degree did. The CS curriculum seems to expect this, and instead of focusing on the things you can realistically learn on your own, it focuses on the things you're probably not going to come across on your own.
>instead of focusing on the things that you can realistically learn on your own, focusing on the things that you're probably not going to come across on your own
This is an excellent point. Almost nobody has an idea of the true depth and breadth of most subjects or how to sequence that learning.
But in terms of a degree as a heuristic for employability, a non-degree candidate needs to be much better at self-directed learning to stand out. I.e., there are plenty of CS grads who are terribly unqualified but will get the job because of their degree.