One of _many_ examples was when the software lead came to me excitedly and said he got basic autonomy working, but the robot wouldn't drive the same distance every time. We talked about open vs. closed loop systems, their wheel encoders, and PID loops. I'm convinced that when he goes to study any of those things in university, because of this real-world exposure to them, they will be far easier to grok.
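The closed-loop idea behind that fix can be sketched in a few lines. This is a toy simulation, not FRC library code: the plant model and the gains (kp, ki, kd) are invented for illustration. The controller reads an encoder-style position each tick and adjusts motor power through a PID loop, so the simulated robot converges on the target distance instead of drifting the way a fixed-time, open-loop drive does.

```python
def drive_distance(target, kp=0.8, ki=0.02, kd=0.1, steps=200):
    """Simulate driving `target` units under closed-loop PID control.

    The plant model below (velocity damping of 0.5, power gain of 0.1)
    is made up for illustration -- real robots tune these empirically.
    """
    position = 0.0      # stands in for the wheel-encoder reading
    velocity = 0.0
    integral = 0.0
    prev_error = target
    for _ in range(steps):
        error = target - position       # how far we still have to go
        integral += error               # accumulated error (I term)
        derivative = error - prev_error # rate of change (D term)
        power = kp * error + ki * integral + kd * derivative
        prev_error = error
        # crude plant: motor power accelerates the robot, friction damps it
        velocity = 0.5 * velocity + 0.1 * power
        position += velocity
    return position
```

An open-loop version ("run the motors for 2 seconds") lands somewhere different on every run as battery voltage and friction vary; the closed-loop version homes in on the target because it keeps measuring and correcting.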
I got my first job through FIRST and was so much better prepared to solve difficult problems through what I learned there.
You probably have a local team in your area. If you're interested, you should mentor! The students' excitement is contagious and it's a fantastic experience.
I started a team with 4 other friends in high school, and we ended up placing 2nd at the World Finals (sadly, we lost in the finals...)
I'm still very proud of this robot we built: https://www.youtube.com/watch?v=aDYqt-jd0cg
Hello from a mentor of FRC Team 2560!
Now at 35 nobody even asks about the degree anymore. So a degree is not worth that much in this industry compared to practical skills. However, I think many people might struggle to learn on their own and might lack the self-discipline or initiative to guide their own learning - in which case get that degree! But keep in mind those same skills are required on an ongoing basis to keep up with this fast moving industry, and you won't get as far without them. I was home-schooled so for me it's just business as usual - that's actually been the biggest thing I got out of my schooling.
At 35, maybe.
Before that, a degree is worth a lot.
School is essentially trash for demonstrating "can program in a business environment". If you're aiming for a research-heavy position it may help, but that's an extreme minority of jobs. And after a couple years it's useless again anyway - you're either continuing your self-education and improving, or you're stagnant, and your prior education doesn't play all that much of a role.
There are some people who definitely benefit from structured learning environments, so I don't ever try to dissuade people from going to college. Rather, I try to get everyone to just start building stuff and releasing it open source and contributing to open source as soon as they can. If you can learn how to search effectively, read documentation effectively, and program proficiently in at least one language you don't really need college.
May help with the initial job, but to be honest - you don't really want to work for a place that gives a shit about your degree. Just your capabilities or potential.
However not everyone has that luxury, especially outside of tech-hubs. And even more so outside of NA.
If your goal is to get a job at FAANG or another big company early in your career, you absolutely need a degree. For other jobs it will make things harder, but it's not really a requirement. If you still want to work at FAANG, 2 or 3 years of industry experience is worth more than a degree.
Where it has held me back is getting a permanent residency in Canada. I spent the last 2 years working as a backend developer in Canada just to see I'm not eligible for express entry PR due to lack of education.
All countries are moving that way in highly specialised or high demand jobs, except medicine. Even the US has dropped that requirement for sponsored jobs in recent years.
If the company is hiring a junior to build a CRUD app, or even Uber for XXX, a degree wouldn't affect much.
If the company is a business intelligence solution house, a degree may be just an entry ticket.
They all list C, C++, .NET, HTML, CSS, and Java on their resumes but haven't done anything except a simple group project in any of them - and half the time they didn't even write code for the project, instead taking a role in documentation, testing, etc.
They come with a chip on their shoulder, flaunt their degree, and demand a salary close to what they think they should be getting, according to their advisers or statistics or whatever.
To make it worse, they don't have any personal projects to show. I tell every single person that I interview to create something, even if it's a failure. At least you can come back and discuss your experience trying to create something, and you'll learn more about development working on that project than you did the whole semester you learned Java.
I think they seem entitled because they did what they were told and possibly received good grades as a result and their attitude may be them saying, "I bought an education. I was graded and found to be exceptional. If I had needed it, it would have been a part of my education". Right or wrong you should feel sorry for these kids. They paid a fortune and got ripped off. They just haven't realized it yet.
1) Employers decided that it's better to lay off people you don't need anymore, which meant that
2) Employees realized it's more lucrative to jump from employer to employer than maintain loyalty for a company, which meant that
3) Employers realized it's not productive to spend 6 months training a junior who's going to leave in 8.
Now you can live in your tiny home as a minimalist and participate in a bootcamp, instead of living in a trailer park and going to community college.
Common misconception, but no, this is not a failure of the academic side. This is exactly what universities are for. The thinking always was that if you teach the fundamentals, then the practical applications are a foregone conclusion. Someone who is well versed in the theory of computer science and electrical engineering (boolean logic, circuits, abstract computation machines and models of computing, data structures, algorithms, language design) should be able to pick up whatever FooBar FOTM programming language or paradigm or framework comes out.
If you think that the situation today is difficult with all the languages and frameworks we have, just know that it was almost the exact same back then as well, except you didn't have the internet, chat rooms, and robust documentation and Q/A sites to help you answer your questions. Imagine the, excuse my frank language here, whining that would be happening right now on this forum and many others if you were just slapped with an IBM manual and told to bootstrap things yourself. I can only imagine!
Trade schools are supposed to be the institutions that teach you practical skills "to get a job". Universities are for expanding your mind and learning. I don't know why we mixed the two and then act surprised when a multi-century institution fails at modern workforce demands.
To anyone reading this: Do NOT go to a university to get a job. If your goal is just to get a job in this field, you can do that in far easier and cheaper ways (start reading some programming docs and get busy building things). You go to a university to learn. The getting-a-job part is a natural consequence of the learning you have done and are now able to do.
- In a computer science education, you do indeed still use a computer.
- There are still proprietary techniques, substrates, solutions and materials that industry chemists use that you almost certainly don't have access to in your standard university classroom. These would be akin to the variety of frameworks, tools, and libraries that exist in the programming world, many of which are open and documented, many of which are proprietary and closed sourced. Hopefully, however, your university education has taught you how to learn to learn in order to use these things.
- Chemistry is not about test tubes and beakers; that's just what it looks like today. Likewise, programming by typing characters on a screen is just what it looks like today. To introduce another analogy: just as geometry is no longer about rulers and measuring pyramids, computer science is about how to formalize knowledge, and it just happens to look like semicolons, braces, and 0's and 1's today.
Computer Science isn't about learning a trade, you're learning a science.
Yet another argument for better and more numerous trade schools offering multi-year programs (like an Associate's degree), rather than universities.
To be fair, manuals used to be better.
As a young lad, working at an IT consulting company I once got sent out to "fix the computer" at some cruise line. I went out with no idea what I'd encounter, and found myself faced with an IBM system I'd never even seen before. Fortunately, the service manual was with it and with its aid I was able to effect a circuit level repair and got them back up and running. ::shrugs::
The material you can dig up on the internet is worlds better than poor documentation. But a comparison with good documentation isn't always so clear.
School should encourage the curiosity to learn on one's own and provide resources to help. It should also provide a framework to be taught things that will last beyond the end of the degree. They need to do a better job of instilling curiosity, but they shouldn't go all the way to the other side and become vocational training.
I want to expand on this a bit. I think what students are being taught is the syntax of languages, but none of the "how do you design a project continuously as requirements change?" that is the really difficult part of software engineering. So they might know how to write the code, but they don't know how to come up with what to actually write.
Instead we get the odd equivalent of companies hiring math and physics majors and being surprised they aren't electrical engineers.
Don't wait around for stuff to happen. Call the local university CS department head and have a chat. It's much more effective than talking to kids or expecting them to do things by themselves.
They’ll get hired at non-tech companies no problem with just a degree. But if they want the high salary tech company positions, they have to spend considerable time practicing their skills outside of class and building a portfolio.
Our curriculum is great for exposing them to a variety of CS topics, but it won’t make them stellar engineers by itself.
In, say, an architecture degree, students and professors understand that you will apply to positions with a portfolio. I think students are able to use work from the degree program in their portfolio. Their professors have portfolios from when they first applied for positions out of school, so they can help or at least provide examples.
Currently, we just have a vague notion that you should have code or contributions on GitHub, and that is for some reason not part of most degrees. Or, too many students get a CS degree and try to use it to apply for software development positions. In that case, yeah, you're going to need to self-study if you want to apply for positions you didn't actually get a degree in.
I even had an interview where one of my interviewers pulled up my GitHub during the interview and asked me questions about design and tooling decisions on certain projects.
Each company that granted me an interview told me it was because they were interested in my side projects.
Of course- it may depend on what your side projects are! Mine were of interest for the cloud based roles I was looking for- RESTful APIs, Kubernetes, AWS, Linux SysAdmin stuff, etc.
They're using a standardized interview format and they don't care about anything else. I've interviewed with others where I did have projects up, they viewed the source code, and it went over well with them - but that was extremely rare (I can only recall 1 at the moment out of the 100+ companies I've interviewed at).
If I'm doing phone screens (i.e. top of the funnel and looking at large numbers of people) I glance over github projects and links quickly - about a minute per resume to look over everything (this is why formatting, styling, and ease of reading matter on resumes). If I'm doing in person interviews (i.e. bottom of the funnel, talking to small numbers of people) I look at almost every personal project, website, and link the candidate puts on their resume with the intent of asking the candidate about them.
Some of my peers will list everything they've ever written a "hello world" in, and from the employers I've spoken to, a good chunk read a long list of languages without any proof as overstated competence. That doesn't mean there's no interview, but it hurts the chances of getting one. The interview then becomes the point where they test how truthful the list is.
I would not say that I am much better technically now, than I was 5 years ago. Much of what I have learnt in the industry is about coping with working in an office, having pointless meetings, doing design by committee, etc.
It's likely that people who approach tech with the desirable attitude above will have built something personal they can show. But having built a personal project is neither necessary nor sufficient to demonstrate curiosity and zeal for craft.
EDIT: To be clear, I think people willing to get the degree and do the work should be paid according to market value. You come across a little bitter that the top performers went off chasing jobs at FANGs... maybe if you paid market rate more of them would stick around.
When anyone writes "C, C++, .NET, HTML, CSS and Java" on their resume, that obviously has to be considered alongside their experience. If you don't want to hire and train inexperienced programmers, I guess that's fine. But I find this idea that there is a minimum experience (and accompanying salary) threshold for the entire industry that can only be reached through personal projects of a certain size frustrating.
The universities and training you get in undergrad is probably miles better today than it was back when I went to school. This does not seem like an education system failure--it seems like inflated expectations from employers for entry-level jobs. If employers are looking for mid-to-senior level employees who can be productive from day one, that's fine but they should probably not rely heavily on university recruiting.
This has been really good for me, since I don't have a degree, but I still can typically get through a whiteboard interview in most jobs that I apply for (though occasionally I'll get a problem that trips me up). School is certainly useful, but it should be used in addition to personal projects, not a substitute for them.
Why do you try to recruit from a small local pool like this? You can hire from anywhere in the country!
The arrogant peasants you're hiring expect to be paid? They're relying on data and expert advice to decide how much money to seek?!?
You would be wise to steer well clear of such fools.
As I read the GP comment, it accuses the OP not just of a straw man, but implies being an asshole (i.e. thinking that others are peasants who shouldn't expect to be paid). That's clearly not a fair reading of bluedino's post. Each move in such a direction is a step towards degraded discussion; hence the moderation reply.
Obviously these are matters of interpretation, though. People often read the same things differently.
The OP's comment implies that the job applicants are entitled for making salary demands based on statistics. There is nothing entitled about asking for market rates.
The comment responding to it points out the problem with this thinking in a lighthearted way. To me it was the perfect response. I understand you may disagree, but it was in no way unreasonable.
My comment, on the other hand, could have come across better. I implied you were not following the rules you set, which I don't think is fair. I may disagree with your decision, but I could have said so in a more respectful way.
What entry into the profession should be like for a new CS graduate, and how we fall short compared to the other professions, feels like a topic for a much longer comment. Although perhaps it's a little bit like the book "The No Asshole Rule," where one might read the title and kind of get the point, and just skip it. Or possibly one might read the title and feel vaguely threatened, and just skip it.
OTOH, I really do appreciate you pointing out that I might not have been elevating the discussion with attempted humor, and I appreciate your work in general.
In addition, sharing experience is at lower risk of misunderstanding and less likely to descend into flamewar. There's no contradiction between experiences. A hiring manager in (let's say) one state has their experience, and a new college grad in (let's say) another state has theirs. Both are real because they happened. If we report from that level, we can't contradict each other.
It's when we turn our experience into abstractions (e.g. the abstraction "new CS grads") that trouble arises. Abstraction A appears to contradict abstraction B. Both are lossy conceptualizations of a more complicated reality, and it's the lossiness that makes it seem like we're at odds with each other.
That's interesting, and a fair point. The differences I'm admittedly fixated on are the differences between software development as a profession versus engineering, medicine, or law, for example - or even the creative arts or science, not just the licensed professions! And I really do think it's a bit much to unpack adequately in the comments here, though I have seen a few good ones that touch on some things I think are important.
> the abstraction "new CS grads"
No abstraction intended, you can read that as "members of the set of students who recently graduated from CS programs in the United States and will now look for jobs or continue further in academia" if you like.
I would encourage them, if any were nutty enough to ask me for advice, to be enthusiastic about their future contribution to what is (well... should be) an amazing field, to be proud of their education, and to negotiate as well as possible with their first employer out of college, since that will help set the trajectory of their future career.
And to avoid people who misunderstand them by saying things like:
I quickly learned to give my engineering side-projects a higher priority than schoolwork. Schoolwork could always expand infinitely to fill as much time as it wanted, and school was a hideous monster that would swallow every ounce of effort I put into it, take it for all for granted, and only greedily demand more of me. If I'd played along with that, I would've had to give up the drive to build things that made me want to pursue engineering as a career in the first place.
I had an industry internship every summer while I was in college, and I found an industry job - which paid more than my program's average undergrad starting salary - the afternoon after my last final exam. I say this because I want to emphasize that even by the standards of my school's career department I have """succeeded""" - yet every single one of those positions made absolutely no use of the things I was forced to do or learn to get my degree, and extensive use of the skills I learned by ignoring schoolwork and focusing on personal projects.
I can't say with certainty that school did absolutely nothing for me: my current manager says that even though he didn't look at my academics at all, he likely never would've seen my resume if it didn't have a degree on it. But while I can count on one hand the number of times something at work has been remotely relevant to my schoolwork, the number of panic attacks and depressive spirals caused by anxieties rooted in my experience in the academic environment is beyond countless.
They haven't spent a whole semester learning Java, they've spent a semester learning about Java. By writing a group project in each of those tools, they prove not that they are ready to start knocking out your Jira tickets on Day 1 - they're not, as your experience and their resumes prove - but that they are capable of working on unfamiliar software projects in a group, which is what they should be doing in your company. Take a student who knows how the fundamental function your software performs can be built, and teach them the unique tools your company uses to do that. It only takes a couple thousand hours for someone who knows how to learn to develop good proficiency in your particular stack, and there are so many stacks out there that you can't expect everyone to have invested that in yours.
I do understand (and sympathize) why many job candidates disagree with the importance of a "personal portfolio" and so we get repeated frustrations of comparing it to accountants, lawyers, pharmacists, etc. E.g. if pharmacists don't dispense drugs on the weekend "for personal fun", why do we "expect" programmers to do the same?
The way to make better sense of it is to not frame it in terms of "expectation" of a job seeker but of "identification and preference" of employers choosing one job seeker over another. Whether the preference itself is flawed is irrelevant. The point is that it's human nature to use a preference as a tie breaker.
Programming, much more so than accounting/law, is an activity that many more people like to do for personal "fun". Because of this, it's inevitable that some employers tend to want to identify them and prioritize them over others.
And hiring preferences are not unique to computer programmers. Here's engineer Ben Krasnow (of popular channel Applied Science) explaining how a "personal portfolio" helps him stand out and get good jobs:
- deep link and watch for ~30 seconds:
So, if you're an electrical engineer getting angry at the world because employers shouldn't "expect" you to design circuits for fun on the weekend, don't be surprised if they hire Ben instead of you.
Edit to add another Ben K video covering the same topic but mentioning Valve & GoogleX and choosing between prospective candidates:
- deep link: https://youtu.be/4RuT2TlhbU8?t=112
The fact that I program for fun on my personal projects at the weekend does not in the slightest imply that I will have fun on the kind of project I have to do for an employer. So judging by this criterion is a really bad idea for a (potential) employer.
I am not sure that was the argument being made. I think it was more like "If a candidate actively uses and develops their skill set in their own time, then they will be preferred over candidates that don't". Those candidates demonstrate a higher level of mastery of the skills needed by virtue of doing something, _anything_ over other candidates who don't, even if the demonstration is trivial. It can also signal a lower managerial load if the candidate has demonstrated the skill set to self-start and self-motivate. That can also mean less ramp-up time to being productive.
I don't see working on personal projects as doing personal development. What I do then is riddled with bugs, no unit-tests or documentation, and with no sense of project or time management.
Don't get me wrong if a personal project to you is production code, well then to me that is a hobby or maybe you are doing it to pad your resume, in which case sure it does seem like a good signal.
Overall, though, you have to have unproductive fun, and that's what personal projects are for me. This notion that anything tied to code equals resume material is really insidious. When do we ever get away from work? Hustle culture seems to have made its way into a white-collar job; I need to tune out.
I spend the majority of my working time coding. A few more hours at home does nothing for my learning.
Also, people who put a lot into work during the work day don't have side projects, because they are literally tired from work. If they weren't, they would work more intensively at work. In a slow, boring month it's easy to crank out some more, but in a normal week, when you work with full focus, you get tired.
1) Recent grads should not feel entitled to pay, respect, etc. etc. just because they have a degree.
2) Someone does not need to have a wide portfolio of projects to be a good developer, but those will probably make them a better developer.
The best junior developers will be the kinds of people who build constantly, but you don't need to be that to be a good, competent programmer.
They are also likely asking for a salary according to what their advisers or the statistics tell them.
Also, the real-life state of things is that for non-juniors, most companies don't actually care about side projects, and the overwhelming majority of developers don't have them. It's a meme on Hacker News that they matter, but I think there is some bias at work here.
Also, if I were hiring a beginner accountant I would greatly prefer one who had set up imaginary companies in QuickBooks, and it isn't hard to do so.
Actually building things demonstrates that you might actually build something in my company. It's just competition though. If you have two guys, one who has been in CS education for 5+ years but somehow has no personal projects in that time, versus a guy with no education and a few interesting personal projects, I know who I would pick. I'd also be kind of shocked that in 5 years of learning about a topic you never built anything interesting.
It functions despite the fact that your accountants didn't have a portfolio of personal projects when you hired them and don't do accounting for fun in the evening. To my knowledge, it's substantially only programmers who are expected to do so.
Accounting students are studying for a CPA, a similarly regulated exam that tests your knowledge of the rules and how it can be applied to specific cases.
By passing the exam, you're licensed to practice law or start doing tax/audit/advisory for companies or individuals.
There's no such analogue for CS grads, because CS programs aren't focused on training you for a specific credential.
I don't know if it's just or wise - but it's not unique to programming.
If you group it more closely to creative fields like design or writing—both fields where portfolio reviews have been standard hiring practice for decades—then it makes sense to expect to see projects. If you view it closer to a professional services role—accounting, legal, etc—then your comparison makes sense.
I'm not advocating for either view as the "correct" opinion either, I just think the disagreement has less to do with what qualifies a junior and more to do with how we conceptualize development as a profession.
> I'm not advocating for either view as the "correct" opinion either, I just think the disagreement has less to do with what qualifies a junior and more to do with how we conceptualize development as a profession.
I believe that there exists a good criterion to judge: If the employer considers programmers to be "fungible", programming needs to be viewed as professional services role. On the other hand, if you are considered and catered for like a "creative rockstar" by the employer (because the employer will have a hard time replacing you), it should be considered a creative profession.
Judge for yourself what is closer to the truth. :-)
Coding is writing, and choosing what to write about, and picking language to use to write about it. It is an unending series of creative choices.
If someone has no exposure to the process of making those choices, they are not ready to code.
Programming itself is also not limited to developers. That should be self-evident in that spreadsheets are the go-to tool for so many non-programmers, and they are at heart accessible programming environments. In that sense, it is programming that is a creative endeavor, not development or coding.
Last but not least, the most creative (and valuable) part of programming is arguably problem definition and problem solving. I would argue these two things are shared, in one way or another, across most jobs and careers.
However, it seems to me - as a layman with zero accounting experience beyond Wikipedia and GnuCash - that you could feasibly drop an accountant into any arbitrary business and they would eventually find their way around the cash flows and balances and, given sufficient information, build up a profit/loss statement and a general idea of financial health.
It doesn't seem to me, however, that you could drop any software engineer into any arbitrary company and expect the same. You should be able to, but not right now. Some software engineers can do that. Not all of them.
I suppose my point being: a degree from a chartered accountant indicates a core level of competency to me. But a software engineering degree doesn't. This is why we interview them with whiteboards / fizz buzz / how many windows can you wash / what is the ideal shape of a manhole cover.
Because you can get a top class degree in S.E. / C.S. and still not be able to handle a JVM OOM error.
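For readers who haven't run into it, FizzBuzz (mentioned above as an interview screen) is the canonical "can you write any working loop at all" test - it is deliberately trivial, which is exactly the point being made about how low the bar has to be set. A minimal version:

```python
def fizzbuzz(n):
    """Return the classic FizzBuzz sequence for 1..n as a list of strings."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:            # divisible by both 3 and 5
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out
```

That a screen this simple filters anyone at all is the commenter's point: the degree alone doesn't certify the core competency the way a CPA or P.Eng. does.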
In general, I agree with your point. It's mentioned elsewhere that there's a difference between a trade school and a university, and a software engineering degree is a broad university degree. Much like an English degree won't guarantee you're a great author, it seems odd to expect a CS degree to guarantee you know how to handle an OOM error, for example. I think in many cases CS/SE degrees to themselves a disservice by trying to teach everything, instead of allowing specialization that can be clearly understood in the next phase of someone's career.
In university, we learned the concepts. We applied the concepts in internships and learned how to do things hands-on. When it comes to applying for jobs, internship experience is considered more important than academic coursework. So this is kind of parallel to how 'real world' experience developed from personal projects is more important than academic training in software.
So in this sense, this kind of reinforces what the original poster said. This kind of hands-on experience is important, and there are different ways of demonstrating it in different fields.
You are right though, no one is expected to do Quickbooks on the weekend. You have to realize that there aren't a ton of professional fields where people do what they do professionally as a hobby.
The good side of this is that in order to get hired in tech, you don't need a degree or a P.Eng. The downside of it is that you as an employer have to do the entire vetting yourself, instead of relying on universities and professional engineers associations to do a significant part of the vetting for you.
EPC companies do not give their civil engineering job candidates a civil engineering equivalent of fizzbuzz, because they don't have to. If they are certified, they have demonstrated they possess the required knowledge. I cannot see how fizzbuzz can be eliminated from software without a similar certification process being created in its stead.
To add: Other professions such as lawyer and doctor also mandate similar apprenticeship timeframes. It's time to take a similar approach for software development so that it can be taken seriously as a profession.
That's because imaginary companies are too simplistic to learn real lessons about accounting. And it's prohibitively expensive to start a new company just to practice accounting.
At the same time, the best accountants I know are interested in accounting. But let's say you were hiring a writer -- would you look at one who could only point to class assignments as a portfolio of work?
Novel solutions for solved problems are generally non-optimal.
One of the best parts of college is the ability to learn more about yourself, diversify parts of your life, and experience new things. It's hard to do that when your life is split between school work and personal work.
I worked my life away at side projects and a part-time programming job in college and looking back there's so much I would've done differently. CS programs across the country need a wakeup call.
University industry, unions, and academia more broadly, set the tone that if you want to do anything "for real" you must go to school. Many companies, parents, and ordinary lay people are convinced.
My (somewhat limited, I am young) experience tells me that competence is a completely inadequate predictor of whether somebody went to school for computing or software, but self-importance is an excellent one.
One of my university teachers told us that to become a real programmer, you have to design and build at least one language of your own. I wouldn't choose the same words; but having done several, I sort of agree. You need practical experience before it makes any sense though.
The problem is rather the expectation of the employers: No civil engineer is expected to build huge bridges in his free time.
If building bridges were that easy, I am sure people would do it.
But lots of architects design things in their spare time.
Is software engineering closer to civil engineering, or closer to architecture?
That's just bollocks for a number of reasons. Maybe you're too busy doing productive work to build a portfolio. Likewise, employers are often too busy to look at and evaluate your portfolio. And unless you spend a huge amount of time working on your portfolio, chances are it's going to have nothing relevant to your job.
Perhaps this junior developer fresh out of university needs to show up with real world experience before blogging about it :P
If you've worked for years developing proprietary code of any sort, you cannot share it. One's personal GitHub portfolio is a completely different beast than one's professional accomplishments.
You can still call yourself a Computer Scientist, but there's a difference between the two.
You seemed to be implying that the bar for calling yourself a computer scientist is lower than the bar for calling yourself a programmer. The opposite is true.
Also, not all computer scientists are in the business of writing code.
People call themselves chemists, geologists, etc. without ever publishing anything.
Ditto computer scientists.
'Scientist' is meant to mean 'person who does science'. Roughly speaking, that means someone who produces research papers. If someone says I've been a scientist for ten years, they mean to say they've been doing science for ten years. They do not mean merely that they were awarded a Bachelor of Science degree ten years ago.
This is less of an issue in, say, bridge engineering. The term 'engineer' is (in some jurisdictions) a protected term, reserved for proper chartered engineers.
If you have a better word for people who make a profession of doing science, I'd like to hear it.
2) Research chemists don't suddenly start using the term 'chemistry scientist'. They are all just chemists. No hang-ups.
We're not talking about the word science, we're talking about the word scientist.
It is indeed a long-accepted term. Scientist means someone who does science. It does not mean someone with a basic grounding in the field, which is what a BSc indicates. With few exceptions, scientists hold PhDs. That is, after all, the point of a PhD.
If you get a BSc in computer science and then make a billion dollars selling fidget-spinners, your Wikipedia article will describe you as an 'entrepreneur'. There's no way it would describe you as a 'scientist'.
> Research chemists don't suddenly start using the term 'chemistry scientist'. They are all just chemists.
Chemists are scientists by definition, and the same thing applies here. Holding a BSc in chemistry does not make you a chemist. Doing chemistry, and presumably publishing papers, makes you a chemist.
As with the distinction between computer science and software engineering, there is a distinction between chemistry and chemical engineering.
As does doing computer science. I did computer science last week (analysed the run time complexity of an algorithm).
Hence: computer scientist.
No, it absolutely, completely, totally, without exception does.
Just look at the job advertisements.
What I do is primarily programming.
Don’t just build stuff—study stuff!
I ended up learning some data structures and algorithms through ACM and codewars.com, where my peers from better schools learned them in class.
I've had multiple classes where professors said they had to remove programming assignments and projects from the course because it was reducing the pass rate below acceptable levels.
All of the students working at the computing center realized how lucky we were, and received multiple good offers despite graduating in the middle of a recession.
Instead, I dove head first into trying to work with other people to build products that people would use. While none of those products ended up going anywhere in terms of financial success, the experience itself really helped me in articulating what sort of value I could provide to a prospective employer. In fact, I would give all credit in getting my first job to the fact that I was able to talk about the challenges I experienced with an on-campus recruiter who signed me up for an interview on the spot.
I really think the learning you gain by doing, building, etc really helps you grok the underlying academic principle. For me, I found that I have to do first and then go back and read up to learn what I've experienced.
> If you want to be a software engineer and you do not have any code to present in an interview, you can not call yourself a programmer.
Additionally, from one of their recent tweets:
This is unnecessary gatekeeping from anyone, but it's especially unwarranted from a "junior software developer, fresh out of university." I'd encourage the author to be less prescriptive when there's so much gray area and so much variation between experiences.
I also prefer to know a programming language and its ecosystem really well instead of having some vague knowledge of 10 different languages. If I implement something I want to implement it well and to know as soon as I write a line of code what it does and all its implications, edge cases and possible performance issues. I am not saying that you shouldn't adapt your toolset based on the project you are working on, but that it's better to have a tool that you can always rely on and whose ins and outs don't surprise you anymore.
I am talking from the point of view of someone who builds stuff, creates products. Just learn a good-enough tool very well and then focus on the product itself instead of focusing on the tools.
You don't encounter anything like it in the listed langs unless you somehow work with Netty/Twisted/asyncio/etc.
I come from the opposite end, I have no college degree and started programming with writing small BASIC apps on a calculator in middle school.
I used to think I wasn’t a “real” programmer when I was in school because there was so much I didn’t know. But in retrospect 18 year old me was better prepared for an actual programming job than a lot of the new grads I’ve worked with.
It’s not to say there weren’t huge gaps in my knowledge, but the thing is, actually writing code completely transforms you as a developer.
A “programmer” that hasn’t ever worked on a project end to end is completely different than one who’s churned through one day in and day out for even a year. I’ve watched new grads make that transformation and it’s amazing.
School will make you a better professional, and if you have a passion for it, additionally give you enough experience to hop straight into being a programmer confidently, but not everyone has that passion.
The same way I didn’t have enough passion except to do exactly as much was needed to let me write code in a professional setting, some students only have enough passion to get a degree which will let them be a professional and don’t really care that much about being able to execute right out of the gate.
I don’t think either is less “noble” than the other. I was too lazy to get the fundamentals, and I understand someone being just happy to get a degree, but both paths have very different effects depending on the field (I mean, my path would be a dead end in most fields).
I've hated school ever since middle school, but I recognized I wasn't dedicated enough to get a job without a degree. While going through the motions of academics, I noticed most of the classes were, basically, a joke. Not at all relevant to industry. I went to a fairly well known school as well, but didn't feel like any of it was worth it.
However, starting my second year I was working on side projects in full force and applying for internships. Sure, my GPA dropped a bit, but I got relevant experience. Then came a black swan event: through a connection, I was offered a full-time position at a very large company, solely on the strength of my projects and internships, no degree required. It paid well too (or so I thought). I very much considered dropping out, but several people in my life (including my new boss) strongly suggested I get my degree.
I ended up finishing the degree and almost doubling my pay, so I'd say it was very worth it. However, a degree is not necessary (nor sufficient) for employment.
I'd argue that's the wrong way of looking at it, yes having something to show might be a signal booster on your resume, but it should be a side effect.
Building stuff is only fun if you want to do it, the motivation and passion has to be there, otherwise you'll just be building something for the sake of it. Don't get me wrong, it's definitely useful to gain knowledge from building, even if it's just a toy project, but you'll get so much more out of it if you're building something you're passionate about.
You can consume a lot of books, TV, movies, games, etc., however, it’s more valuable to produce something.
What separates own-projects from previous job experience is the ownership.
Anyone can be a "valued member of a team", go to meetings, collaborate, open and close lots of Jira tickets, be super helpful, check in code which "should fix the problem", etc.
But it's not the same as if you have your own project, which fails or succeeds based on how much effort and debugging you put in.
EDIT: more ideas
Try to build a better Firebase.
Try to build an OS that's truly secure, convenient, and hackable, one that lets me write integration scripts based on app events and rewrite any app's UI.
Design an app UI building toolkit that puts current solutions to shame.
Build an MMO game that will make Blizzard sad.
Build a simple and configurable personal assistant.
Build a relevant-ads solution.
Design a better search engine than Google.
Design a publishing system that gets rid of ads, fairly compensates content creators, and is sustainable.
Design a simpler HTML, so anyone can write a browser.
Try to build/design a language that is both expressive and minimal, with performance close to C or better, great errors, no quirks, and predictable memory use. Write it so it's easy for beginners to contribute ;p
This always bothered me until a teacher told me they are less about the project and more about working together as a team which most of the time is where these projects failed. With this perspective I led my senior design group project to develop a proof of concept IMSI Catcher detector that ran on a Raspberry Pi with a software defined radio in one semester. None of the other students had any prior experience with python, GNU Radio, low-level networking, or software defined radios.
A good team can accomplish so much more than an individual and you can learn a lot more with one too.
- It included the entire advanced level chem, physics and stats series for scientists and engineers.
- It was two elective courses shy of a math degree, including abstract math, concrete math, and tensor math.
- Only a few courses shy of a physics degree as well.
- The entire EE track and entire CS track of course.
- Writing lots of code for multiplatform portable code including Linux, HP-UX, Solaris, SGI, and some MINIX 2.
I think they should make it a mandatory combined undergrad/Master's or undergrad/PhD program, with the depth of research achievement and breadth of knowledge needed to be useful in industry, which can't be achieved without an advanced degree. I think most of the undergrad CS knowledge can be acquired by osmosis in industry or self-study without a degree.
After years of searching for work in the Redmond area and failing to do so (because, no professional experience, no degree), I randomly happened to find work by randomly logging onto a forum I hadn't visited in years and randomly happened to find a thread where a guy was looking to hire for remote webdev contract work. After that contract was complete, I spent another couple years looking for further work, and had to eventually move back home away from Redmond because the cost of living was too high while I was unemployed. After plural years of searching again, I finally managed to get a second remote webdev job. I don't like modern webdev and would much rather do almost anything else but now I've had two jobs doing it and I feel despair—am I locked into this for the rest of my career? With no degree and only these two jobs' worth of "Professional Work Experience" under my belt, it sure seems that way. Recruiters and hiring departments seem to care about little else. They sure don't give a shit about any personal projects, in my experience. And this is even in spite of the fact that I more or less picked a new language and/or framework for each project, which, in my mind, shows an aptitude for learning new things, being adaptable, and having an above-average "general programming aptitude" than someone who's made fourteen React projects in the past couple of years or whatever.
But apparently companies don't see things this way, or maybe I'm just not good at finding the ones that do, or maybe there's something else to my approach that has been All Wrong. But as my twenties draw to a close and I find myself no closer to finding a career than I was a decade ago, at this point I'm kind of at a loss for what to do about it. And, almost more importantly, I'm extremely burned out on making stuff just for the fun and intrigue of doing so.
The sad thing is that there is pretty much no equivalent page for other industries (law, medicine, computer-unrelated-engineering to name a few).
It may not have been the most substantive comment, but you've grossly overinterpreted it in an uncharitable direction.
I think it is more a rejection of traditional academic institutions as the arbiter of merit. It is a shift towards a more “pure” meritocracy where all that matters is your ability to deliver. Whether this is good or bad, I am not sure. I think it depends on how society treats people who aren’t able to deliver to the same degree, and currently, in the US at least, those people fall by the wayside.
The title is slightly misleading.
Edit: Thanks for updating!