This article is a thinly veiled advertisement for bootcamps, spouting this nonsense:
The “learning-by-doing” approach in the Flash workers’ scenario highlights the value of coding schools and boot camps designed to teach students in-demand skills in as little as eight weeks. And whereas traditional educational environments tend to view their curriculum as having an end date (generally aligned with their students’ graduation), Lambda School co-founder Austen Allred envisions students coming back to his coding school every eight years or so to learn new skills.
No. What will give you longevity is a degree in CS. Frameworks and technologies change, but CS concepts stay consistent.
If you have a good grounding in CS, you have longevity. Couple that with a good math background, and then you have even more possibilities.
Bootcamps are rubbish in the long term. They try to lure you in with quick results from little time investment by teaching you how to hack with the hip framework of the week.
I have been there, done that, having gone to a polytechnic school where I learned Flash + web dev. I later made the switch to a university to study CS. One of the best decisions I ever made.
Don't fall for the trap of instant gratification. Learn CS (and math) with a formal education, and you will thank yourself years from now.
There are many reasons why a CS degree would not be a good fit for someone. Some people just don't perform well in an academic environment. Some people simply don't have the intelligence required for that kind of education; not all tech workers are "software engineers" — there are tech workers stretching from PhD level all the way to near minimum wage. Some people might not have 3 or 4 years of time to spend on a CS degree. Some people might not be able to afford a CS degree, depending on where they live.
Basically, I think your post just mentions a lot of the positives of a CS degree (without mentioning any of the negatives), and then somehow uses this to conclude that "bootcamps are rubbish" without actually saying anything about bootcamps themselves, as though there's a binary choice between the two where both options are always equally viable.
The OP seems to be saying bootcamps alone are insufficient and a poor choice for someone who won't be able to get a degree and wants career longevity. I've worked with a lot of people from the technical-skills-but-no-academics portion of the tech field. If they want to work in tech support or IT then it's not such a bad option, but that isn't really what bootcamps are trying to prepare them for.
You can certainly compete with the people without degrees who make as much total compensation as degree holders by going into the most unstable parts of the field. But that competition sounds exhausting, and the effort you put in will exhaust your other options in life, while probably still being less energy than the bootcamp-only graduate has to spend.
I’ve been working in the industry, in Atlanta, for 20+ years. I’ve been on the market 6 times (I stayed at one company almost a decade) and it has never taken me more than three weeks to have multiple offers. I can guarantee you that none of those jobs or interviews cared about my CS degree. The quickest was when my contract was over: I called a recruiter, had an interview four days later, and got an offer the same day. I’m no special snowflake; any halfway-connected developer with a buzzword-compliant resume could tell a similar story.
At my second job out of college, my manager cared a lot more about my then-encyclopedic knowledge of C from spending way too much time on comp.lang.c, my (self-taught) experience with x86 assembly, and my being a big enough geek to talk about the 65C02 assembly language hobby projects I did in middle and high school than about my courses in Pascal, BASIC, FORTRAN, and COBOL, or even my one simple data structures class in college.
Do you think consulting companies are recruiting me now based on my 20+ year old degree, or on my more recent experience as a team lead and AWS architect, and my being able to talk about “The Well-Architected Framework”, Domain-Driven Design, and the Cloud Maturity Model? Yes, these are all buzzwords to a certain extent, but consultants get paid a lot for them.
Whether I had a degree or not, if I couldn’t negotiate a salary commensurate with my experience - I would be doing it wrong.
> But that competition sounds exhausting, and the effort you put in will exhaust your other options in life, while probably still being less energy than the bootcamp-only graduate has to spend.
Considering how useless what I learned in college was compared to what I had taught myself even by the time I graduated, and even more importantly the experience and networking I’ve done in the past twenty years, I really don’t see my career trajectory being that much different.
The only useful thing college did for me was get me my first job, based on an internship the year before. In today’s world, if a boot camp (which didn’t exist back then) could have gotten my foot in the door, it wouldn’t have made any difference.
As far as the pension, it’s nothing special. A pension is just worth the present value of all the payments you hope to receive after you retire. Considering the difference in pay - and the flexibility - in private industry, even in Atlanta you can easily save/invest enough over your career to create a greater annuitized income in retirement.
I delayed my degree while the first dotcom boom was in full swing. I did the only relevant classes before I delayed it, and not having the paper did make a difference. I could have argued my way into the same position I transferred to after getting the degree, but they would have delayed things, delayed raising compensation, maybe given me lesser tasks, or been less convinced of my work in some cases. They would also know that other companies couldn’t be competing for me - again, a loss of compensation.
I would be very surprised if that 1996 degree isn’t on your resume, equally surprised if anyone checked whether the school is accredited, and surprised if your choices weren’t slightly fewer or different, leading to accrued path-dependent losses on average. I.e., a first team lead role isn’t offered, leading to either no path to “team lead” on your resume or a significantly delayed one.
I used university as a bootstrap back then, and then had the company pay for the rest of the degree. This could probably work for bootcamps and a degree today, but the only reason I see for that path over credited time toward the resume item is if you needed to convince yourself of whether you want a tech job.
The dot com boom was fairly isolated, unlike what happened in 2008. In 1999, when I was looking for my second job, enterprises were so desperate for qualified developers that jobs were easy to come by. I negotiated a relatively decent raise (not Silicon Valley type money) after the dot com bust because profitable companies still needed developers.
Even today, most corporate enterprise type developer jobs - where most developers are - wouldn’t know CS from a hole in the wall. They just want people who can turn business requirements into shipping products. The degree has never been what gave me leverage and optionality; keeping my resume in line with what the market wants has been.
When I stayed at one job for 10 years until 2008 and was woefully behind the state of the art, my degree did me little good.
Yes the degree is on my resume, but not the year I graduated. Neither is anything before 2008.
My very delayed path to being a team lead was a function of me taking my eye off the ball for close to a decade and not gaining the skillset to be one. Being a team lead has nothing to do with how well you can do algorithms in most companies; it is usually a combination of your interpersonal skills, your emotional intelligence, and your experience with translating business problems into working systems. In fact, more often than not, it’s knowing what not to build and focusing your team/company on writing software that is within their core competency, and outsourcing the rest or using third-party systems.
That being said, I purposely self-demoted (in title, not pay) from being the dev lead at a mid-size ($1 billion in revenue) non-software company to being a senior engineer at a much smaller software company that needed someone to modernize its software architecture. They were basically using AWS as a glorified, overpriced colo. I discovered as a team lead that the real money locally was in consulting, and I had a few gaps to fill in.
My CS degree from 1996 definitely didn’t prepare me for my roles over the past three or four years dealing with enterprise architecture.
Of course not, and the GP never said that they were, but bootcamps are often marketed as an alternative to a CS degree.
But I disagree with your other statement: "Learn CS (and math) with a formal education".
You can learn CS from the internet.
The first link says you need roughly 2 years to master core CS.
Actually, I have a wild fantasy: I build a startup like Lambda School, but it is not supervised. I give money to motivated students. This money covers their living costs for 2 or 3 years, internet, and a decent laptop. They have to spend 10 hours per day, 6 days per week learning CS. After finishing the study, they repay my money with some interest (capped at a fixed amount), but only if they find a proper job. If they can’t find a proper job within 5 years, the debt is forgiven. But this is just my wild fantasy. :)
Maybe in the US or SV, but in Europe nobody outside research, academia, and archaic engineering companies (where a degree is a form of signaling) cares about your degree.
In tech here, unless you have x years of experience in the latest tech (.NET, TypeScript, Node, etc.), you ain't getting any job regardless of your CS degree.
Edit: not sure why I'm being downvoted. I'm only expressing my experience on the market here in Europe as an engineer with 6 years of C and C++ experience, a BS in CS, and an MS in embedded systems; web shops have declined to interview me, citing that I don't have enough experience with the latest web languages and frameworks. Perhaps people feel that ageism and obsolescence won't happen to them.
Outside a couple of startups with dubious prospects of success, I never saw a CV get through HR for an IT department without a higher education degree, with the exception of technical students.
And in most European countries one doesn't get to sign as Engineer on formal documents (from a legal point of view) without the proper title certification.
I regularly hire people into a commercial software company in Europe, and I don't inherently care about your degree, but I do care about your understanding of a broad range of the fundamentals, and in the absence of a CS degree (which does happen) I have to spend extra time assessing an applicant's understanding of them. I recently hired a fresh grad without a CS degree for a C++ programming position, and his interview was a bit longer than usual while we ran over those basics.
The degree IS a form of signalling; it signals "I probably have a grasp of the basics".
It's easy for senior engineers on the team to address gaps in CS knowledge, but much harder to address a lack of soft skills or writing, or a habit of over-designing with too much abstraction (because you were taught the patterns, but not when to use them). I could go on, but IMO this is why interviews are still so broken, and why the field of software development still faces the same issues it always has. Granted, I'm not building a DB, but compared to maintenance and over-engineering, performance is rarely an issue, and it's easily solved via profiling.
Basically: can you write simple software? Everything you mentioned is also needed, but first and foremost I really need them to be able to write some simple software. If they can't do that, I really can't afford to teach them. A CS degree often makes this check simpler than it is for candidates with other educations, such as the history degree holder we have.
If you don't need that then sure, it's a different hiring game for you.
My view has been that the experience is good but not enough; you must show a willingness to learn other things and work in other languages, and any experience in the languages actually used trumps a lot of other things.
That’s also true in the US outside of the HN/Silicon Valley bubble...
That was exactly the premise of Modern Labor, in Y Combinator last year. It lasted six months, until they learned that teaching yourself CS is much more difficult than it sounds; almost everyone had given up, and then they pivoted.
Modern Labor in April 2019: "We Grow Tech Talent. Modern Labor is a revolutionary platform that grows talent for technical roles at your company." http://web.archive.org/web/20190412130518/https://modernlabo...
Modern Labor in May 2019: "Hire Tech Talent On-Demand. Modern Labor is a highly-selective talent network of US-based tech professionals to help you build tech teams when you need them." http://web.archive.org/web/20190501133947/https://modernlabo...
Modern Labor in August 2019: "Hire technical talent fast. Modern Labor partners with companies to help them find and hire technical talent quickly and easily." http://web.archive.org/web/20190831055322/https://modernlabo...
Fascinating to watch their transition from revolutionary platform to ordinary recruiting agency.
I thought self-learning CS could be a good way to democratize wealth. You only need a laptop (even a Chromebook is fine), an internet connection, self-determination, raw intelligence, and your living costs covered to be able to master CS. But apparently most people are not able to do that. Maybe less than 1% of the population can do this kind of thing. Or maybe not.
I strongly disagree. You need real professors, TAs, project work, deadlines, exams, etc.
The assertion that those things must be taught/learned within the confines of a CS degree is a little silly.
I was very interested in math and theoretical CS. It took the whole batch of students, including the ones in the honors math bachelor's, a good 2 years of daily intense study just to get the basic feeling for math straight (I was part of the IDEA League program in Europe and would consider my CS studies among the best in Europe, especially when I see what students come out of Berkeley and Toronto). Then it took at least another year or two of algorithm studies to properly deal with algorithms and data structures. Also note the exam failure rate of about 90%.
Sure, you can hack some knowledge together in 9 months. But it is biologically impossible for the brain to properly learn these things in 9 months. If you can do it, you are a genius and your talents are completely wasted in this bootcamp. You should instead apply for a PhD in astrophysics at MIT or something and start contributing to mankind...
Also, what is the baseline you're starting at, before you start the timer?
Maybe I'm just arguing about what you call a genius, but I think it's totally biologically possible for many students to learn CS fundamentals beyond the capabilities of the average CS major, in 9 months.
It'd be intensive, but doable.
Even if you could take them all at the same time, maybe 2 or 3% of students would have been capable of completing those courses in that time frame at my University.
You're also cutting out 5 classes because they are called electives. In most programs electives are structured so that you're going to get exposure to certain topics no matter which electives you take, so randomly cutting out 5 classes just because there is some choice doesn't make sense.
The baseline is high-school.
Well "the average CS major" is also a very broad statement :D. I have no good overview of what an average CS major is, to be honest.
I think we are talking about different things. Let's say you learn how to build a hashmap, or how to solve TSP with dynamic programming. Conceptually, this is possible to learn in 9 months if you are a good student (and therefore your talents are already wasted in this bootcamp). But what if I ask you some follow-up questions? Modify the problem. Will you be able to explain how perfect hashing works, or what the runtime of a hashmap is if the hash function is not O(1)? How different hashing procedures lead to different qualities in different implementations of hash maps?
We are not talking about knowing that hashmaps give you O(1) lookup if you are lucky. If that is the skill you want to learn, sure, 9 months will do. But truly understanding what you are talking about and being able to explain, augment, and modify/improve data structures and suit algorithms to your needs... Proving that they still work correctly after your modification. Understanding how that damn distributed consensus algorithm works that seems to have a bug that fucks up your database every now and then.
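To make the hashmap point above concrete, here is a minimal sketch (mine, not from the thread; `BadKey` and `probe_time` are illustrative names) of how a degenerate hash function silently turns a hash map's expected O(1) lookups into linear chain scans, without ever producing a wrong answer:

```python
import time

class BadKey:
    """Key whose hash is constant, so every entry collides into one chain."""
    def __init__(self, value):
        self.value = value
    def __hash__(self):
        return 1  # pathological: all keys land in the same bucket
    def __eq__(self, other):
        return isinstance(other, BadKey) and self.value == other.value

def probe_time(n, key_type):
    """Build a dict with n keys of key_type and time n lookups."""
    table = {key_type(i): i for i in range(n)}
    start = time.perf_counter()
    for i in range(n):
        assert table[key_type(i)] == i  # correctness is unaffected
    return time.perf_counter() - start

# With int keys (a good hash), total probe time grows roughly linearly in n.
# With BadKey, every lookup walks a chain of colliding entries, so the same
# loop degrades toward quadratic total time: right answers, pathological speed.
```

This is exactly the kind of follow-up question the parent describes: knowing "hashmaps are O(1)" is not the same as knowing why, and when, that stops being true.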
Yes agreed. You don't need to know that for 99% of CS jobs. But we are not talking about whether bootcamps can prepare you for work in average code mills, we are talking about whether they can replace CS education.
As a matter of fact, I DO believe that bootcamps serve a critical purpose in filling the vast number of gaps in our IT market. But I am always surprised, again and again, why "the new kid on the block" always needs to attack other completely valid paths (like a CS major). They are completely different things made for different purposes.
I think people need to realize that a CS major is NOT the right choice for most coding jobs. But that does not mean that a bootcamp can replace a CS major; it just means that a bootcamp can be an efficient shortcut to hit the job market running.
We're discussing "what happens to tech workers when their skills become obsolete." If being able to contribute to a scaled system as a great software engineer at Google or Amazon is not a sufficient measure of "Knowing CS well enough," I'm not sure we're talking about the same thing.
Is it enough CS to do fundamental AI research? Eh probably not. Is it enough CS to do pretty much any other job out there? Yes.
Also, it's far easier to grasp the basics of AI from a course than to reconstruct the basic body of knowledge just from reading arXiv. As a matter of practical consideration, I just don't believe anybody can become an expert by reading only the research, without first going through the basic training. It's too damned difficult, too much work (and pointless, too - people did that work for you already and built great courses with the summaries, so why not take advantage of that?)
Impressive. (TSP is NP-hard. Not even NP-complete because you can't verify the solution in polynomial time... or at least, I don't know how to do it and would love to see a solution with dynamic programming)
Edit: it's 3/2 for Christofides' algorithm.
Naive dynamic programming is slow but gives the exact solution. It's no good for more than 24 nodes or so.
Usually it is good enough; other, similar and better approaches try to tighten that bound with more admissible heuristics.
E.g., for metric spaces the bound can be tightened a lot (like shortest travel without weights).
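For what it's worth, the dynamic program alluded to above does exist: the Held-Karp algorithm solves TSP exactly in O(n² · 2ⁿ) time, which is exponential (hence no contradiction with NP-hardness) but far better than brute-force n!, and it is exactly why the practical ceiling sits around 20-24 cities. A minimal sketch, my own rather than anything from the thread:

```python
from itertools import combinations

def held_karp(dist):
    """Exact TSP via Held-Karp DP in O(n^2 * 2^n).
    dist is an n x n distance matrix; returns the length of the
    shortest tour that starts and ends at city 0."""
    n = len(dist)
    # C[(S, j)]: cost of the cheapest path that starts at city 0,
    # visits exactly the cities in bitmask S (cities 1..n-1), and ends at j.
    C = {}
    for j in range(1, n):
        C[(1 << (j - 1), j)] = dist[0][j]
    for size in range(2, n):
        for subset in combinations(range(1, n), size):
            S = 0
            for j in subset:
                S |= 1 << (j - 1)
            for j in subset:
                prev = S & ~(1 << (j - 1))
                C[(S, j)] = min(C[(prev, k)] + dist[k][j]
                                for k in subset if k != j)
    full = (1 << (n - 1)) - 1  # all cities except 0 visited
    return min(C[(full, j)] + dist[j][0] for j in range(1, n))
```

On the classic 4-city example `[[0,10,15,20],[10,0,35,25],[15,35,0,30],[20,25,30,0]]` this returns the optimal tour length 80; the 2ⁿ table is what blows up past ~24 nodes.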
I value CS degrees and value mine, but the reality is that not everyone who writes code needs a CS degree from MIT or Stanford. Bootcamps serve their purpose, and I have worked with many self-taught and bootcamp-grad developers who went on to be exceptional developers. I myself was self-taught before I entered a CS program, and honestly I wish I had gone into applied mathematics. Everything I learned in school I could have self-taught, already having a base in programming. I also never use anything I learned in school at my day job, since I left AI and 3D dev.
More relevant to the story: aging technologies can actually be quite lucrative once you pass the point where no new people are learning them, and all the skilled workers have left for greener pastures and are not looking back. I have taken gigs for IBM UniVerse, VB6, and COBOL/JCL that were quite lucrative, because no one wants to touch them and most of the talent has retired out of the field.
I think that is horrible advice. They will pay you peanuts and abuse you. Instead, go found a startup and make lots of money. Then you can fund interesting research that contributes to humanity, if you want. Or you could get a PhD once you've got your money.
This is all assuming that said person is a genius, of course. Otherwise, disregard this advice.
* Machine learning is nothing but multivariate calculus.
* Analyzing network traffic is all queueing theory which is based on calculus.
* Even the simple data structures proof that no comparison-based sorting algorithm can run faster than n log n requires calculus.
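To spell out where calculus enters that last claim, here is a sketch of the standard counting argument (the well-known textbook proof, not anything specific from this thread): a comparison sort is a binary decision tree that must distinguish all n! input orderings, and the sum-to-integral step is the calculus.

```latex
% h comparisons give a decision tree with at most 2^h leaves,
% which must cover all n! permutations:
2^{h} \ge n! \quad\Longrightarrow\quad h \ge \log_2(n!)

% Calculus enters when bounding the sum by an integral
% (\log_2 is increasing, so the sum dominates the integral):
\log_2(n!) = \sum_{k=1}^{n} \log_2 k
  \ge \int_{1}^{n} \log_2 x \, dx
  = n \log_2 n - \frac{n-1}{\ln 2}
  = \Omega(n \log n)
```

The same estimate is usually packaged as Stirling's approximation, which is itself a calculus result.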
And yet all those older programmers are doing machine learning when they haven't had any CS courses on the topic.
Obviously a degree helps, but it doesn't stop someone suitably motivated and capable.
Of course it is possible, a few people throughout history have done it, but it is rare enough that I would claim that anyone who thinks they did are deluding themselves unless they can come up with further proof.
To quote my prior comment, this basically says it takes more than reading. But reading is the root of it. These days "reading" encompasses everything from books to the internet and even YouTube-style videos - basically, individual learning content.
It’s very possible if you’re motivated and put in the time. College is forced motivation.
Let's take an example from the subject at hand, series in calculus: we want to prove that the sum of a series converges. How do you verify the proof without an instructor? Check that it is the same as the book's? Most likely it won't be the same; proofs can come in many different forms. So either students start discarding their correct solutions thinking they are wrong, or they fail to discard wrong solutions. Either way they fail to fully grasp the material.
You need to be a genius to properly root out all the errors in your head on your own; it doesn't matter how many videos or books or tutorials you go through, they can't evaluate your creative solutions like a real person can. Of course, the need for instructors mostly disappears as you reach mathematical maturity, but getting there without help is extremely hard.
Why? Because learning to take tests and learning are two different things. And nothing is permanently learned. Except maybe how to ride a bike.
I've been doing self-studying for the past few years and there's no way I could possibly truly learn the bulk of CS in only nine months. Every skill needs many hours of practice and the brain needs time to process information. Unless you're part of the 1% of people who are extremely quick learners and have profound memory retention skills, most people need a lot of time to comprehend complex topics. It's not pure coincidence that some of the best performers of a lot of subjects started when they were young: by the time they were in college/adulthood they already had thousands of hours of practice.
Yes. "Biologically impossible" is quite the high standard to meet.
Could you learn the necessary CS fundamentals that are included in a CS degree in nine months?
Nine months of Lambda School full-time is virtually equivalent to the amount of time you'd spend in the CS portion of a four-year CS degree.
Also notable is that the CS block is last, right before interviews. It would have made more sense to put it first and then refer back to that knowledge in the other parts; as it is, it looks like Lambda School just treats CS fundamentals as interview prep instead of as necessary building blocks.
How many jobs require knowledge of “CS fundamentals”? The typical development job is basically the “dark matter developer” writing yet another software-as-a-service CRUD app or a bespoke internal app that will never see the light of day outside the company.
Why would you want to confine yourself to working on CRUD apps instead of working on cool tech?
In a given week, I’m up and down the stack from the web to the infrastructure (AWS). I found all of the things AWS enables that used to take literally months to provision in the old world “cool” for about a year, and then it became just another means to an end.
2000 hours is a LONG time to do something, you will certainly have a grasp of the fundamentals after 2000 hours. Will you be a master? No, but did you stop learning when you got your first engineering job?
Hell, the core credit requirements of most traditional CS degrees can be completed within 9 months if you crunch.
I was thinking about attending a CS undergrad program some time ago, but since I'm employed full-time at a very big ("unicorn") startup - an opportunity too good to be wasted - I opted to take some classes on advanced topics such as discrete optimization and automata theory on Coursera. It has been the best use of my spare time ever.
If you go into your CS degree as a young hacker, adept at *nix with a penchant for assembly language, with enough experience to appreciate the CS concepts, you're going to be disappointed. There's a strong chance that you'll know more than the professors.
Alternatively, if you go in knowing nothing and really work hard to sponge up everything the degree program wants to teach you, you're going to think you've learned something but you won't actually leave with any useful knowledge. You'll have dated and often half-baked understandings about how computer systems should theoretically be designed. You might have an idea how a basic OS kernel works from your OS class, or how to write a sorting algo from your data structures class, or maybe a parser from a language design class. None of that is useful at all, other than maybe to get you interacting with your machine, in the hopes that you'll learn how to use a computer (which is a completely different set of knowledge) by the time you graduate. And if you did ever want to write your own programming language, you can throw out everything you learned in class and start over by Googling it and following the best practices of today, like we do for everything else.
The only point I can see is to be able to say you have a CS degree, and hopefully that's losing value.
Yep, everything newer is automatically better. No need to have any grounding in objective utility as long as you're up-to-date on the latest web framework!
> You might have an idea how a basic OS kernel works from your OS class, or how...or maybe...None of that is useful at all, other than maybe to get you interacting with your machine, in the hopes that you'll learn how to use a computer (which is a completely different set of knowledge)
Any decent computer science program is both theoretical and hands-on. Projects where you get hands-on experience with the concepts you just learned. Not all of us can read Data Structures and Algorithms and implement a search algorithm as an 18 year-old. No, not everyone needs to know this. Yet, some people do. For those people who do need to know, their work wouldn't be possible without it.
> If you go into your CS degree as a young hacker, adept at *nix with a penchant for assembly language, with enough experience to appreciate the CS concepts, you're going to be disappointed. There's a strong chance that you'll know more than the professors.
True, but (especially now) I'd be surprised if this is more than 2-3% of the CS undergrad freshman population. I personally switched from pre-med to CS as a junior in college with only a single QBasic high school class under my belt.
Finally, as others have pointed out (and as much as I hate admitting to this phrase that was so often spouted by my college professors — perhaps I have drunk the Kool-Aid), you're learning how to learn. That is, you're practicing juggling abstract concepts in your head, making connections, and weighing solutions. I found my CS classes (mostly my discrete math/logic classes) were 10x better at making me a general problem solver than my pre-med classes.
I didn't say newer is better; I'm saying that your recollection of how a compiler worked in university will not be of any use if you need to write your own language for production today. I've also never heard anyone say "oh yeah, I did this once in college, here's how you do that".
> For those people who do need to know, their work wouldn't be possible without it.
Who? Seriously, who would ever be employed to author a search algorithm using the knowledge they obtained in a 4-year computer science program? Extremely few people are involved in the work of implementing anything that low-level, and thinking you understand what's going on under the hood is just tricking yourself. For example, your 4-year CS degree holder would surely understand the trade-offs between a linked list and an array, right? Then you find out none of that is true in practice: https://dzone.com/articles/performance-of-array-vs-linked-li...
> I personally switched from pre-med to CS as a junior in college with only a single QBasic high school class under my belt.
I'm not saying only the l33t can do CS; I'm just giving the example that if you actually do know what's going on and you have to take all those classes, it's very obvious that the professors have nothing of value to add that you can't learn faster by yourself online. I grew up cracking Windows software for fun, and I had to watch a professor stumble through the basics of x86 assembly. It was an obvious waste of time for all parties. Educational resources today are vast, and if people want to learn something, they can just go learn it.
Also, have you looked at professor salaries vs engineer salaries? I know there are some people who teach at night or do it for the passion of it, but the reality is that it's not going to attract the most ambitious minds in our society. Meanwhile, so many major pieces of software are available for free online, and you can actually talk to the teams doing the work, and they'll let you contribute and give you feedback... for free!
> Finally, as others have pointed out, (and as much as I hate admitting to this phrase that was so often spouted out by my college professors, perhaps I have drunk the Kool-Aid), you're learning how to learn. That is, you're practicing juggling abstract concepts in your head, making connections and weighing solutions.
This is my primary objection, and perhaps you have drunk the Kool-Aid. If you get a CS degree at any major institution, you are not learning how to learn. You're both preventing learning and picking up bad learning habits. Doing a pre-built lab in a CS course is so much worse than contributing to literally anything on GitHub. Studying how algorithms used to work in the 70s is useless compared to diving into any modern piece of code, benchmarking it, and learning how to optimize.
Look at what is actually in a CS program from a respected school: https://cse.engin.umich.edu/wp-content/uploads/sites/3/2019/...
There's a 4th-year course just called "Algorithms", check it out: http://www.eecs.umich.edu/courses/eecs477/f02/syl.html
They might as well have called it "Inefficient implementations of already solved problems".
I can't imagine why there's a "Databases" class and "Web Databases" class, but I think you see my point. Nobody is implementing a database with anything they learned in uni, and nobody is doing a better job of learning how to use a database in school than they would with experimentation and online explanation.
Maybe the fact that you're graded on learning these things, coupled with the enormous price, drives a student to independently research each of the topics presented throughout the 4 years? Then through research and play they would gain understanding of each topic. But if the school is simply serving as an extremely expensive prompt for self-learning, what value is the school really adding?
> I found my CS classes were 10x better at making me a general problem solver than my pre-med classes (mostly my discrete math/logic classes)
Every time I hear that reasoning, I think it's a rationalization for spending huge amounts of time and money doing something pointless. I'm also fond of "the college experience" as a supposed reason it was all valuable.
Maybe general problem solving is being taught and is valuable, but that's not what these CS degrees are being sold as. To pick on umich again, look at the "Student outcomes": https://cse.engin.umich.edu/academics/undergraduate/computer...
So if you graduate with this program, you'll be able to "Analyze a complex computing problem and to apply principles of computing and other relevant disciplines to identify solutions" and "Apply computer science theory and software development fundamentals to produce computing-based solutions"? Maybe if you study independently while also stressing about passing your exams and doing your homework, then you'll be able to do those things, but it'll happen at a slower rate than if you just started building software and researching as you went. I just looked at a few of the syllabuses for this program as an example, and there's no way it's going to deliver what they're promising.
What planet am I on right now?
So I would say a solid math base is what you get at university, and that's what allows you to work on more challenging/interesting projects.
Unfortunately, most unis except the top ones are a waste of time, at least in my country.
For people who can't afford that, there are usually some forms of getting it for free.
Higher education really shouldn't be a matter of personal finances.
(This was slightly longer ago than I'd care to admit, so maybe things have changed... I would expect it's still something of a case of "YMMV", as we used to say.)
If you spent time looking at how much time students spent (in and out of class) on CS and math topics, it's way less than you'd assume.
It's also known in layman's terms as practice. Nine months of cramming is not good enough. Neither is four years of tests without actually using what you were taught in practice and in laboratory exercises, and preferably homework too.
Source: SuperMemo research, Piotr Woźniak and his citations. No reason why it wouldn't apply to CS or programming.
The amount of useful information essentially requires lifelong, interleaved, focused learning, at least an hour a day. You can also pick some things up on the job, but that has stricter limits.
This isn’t true. People forget the most advanced skills they learned unless they use them regularly but they don’t forget the ones that are necessary for that last skill. If you ever knew calculus you’ll remember algebra after decades without using it. I haven’t spoken German in most of a decade but I can still read it fine. Spaced repetition is the most efficient way to durably learn something but it’s not necessary.
People tend to forget unpracticed skills first, not the oldest ones, and the most complex parts (least compressible, biggest), not necessarily the hardest. And the forgetting is exponential and depends on how well the material is presented (cohesion, specifically).
Exponentials flatten a whole lot.
You do forget a little still over time. Ask a 40 year old who knew algebra and calculus really well and does not use it often if at all. (I do sometimes, so I don't count.)
Stability and accuracy are also separate variables.
Algebra is not one thing, it's thousands of memory chunks. As a probe, think about whether you remember the Fundamental Theorem of Algebra, which is a keystone but slightly tricky. Compare that to whether you can solve second-order ordinary differential equations, and whether you can solve an equation involving logarithms of rational non-negative numbers. (I picked at random, not absolute basics, from high school, 101, and 202. Bonus points if you spot the stinker.)
I've always thought of even the latest versions of SuperMemo as an Antikythera mechanism for learning. It's a great model even if the underlying principle is not yet properly researched.
Harry P. Bahrick and Lynda K. Hall, Journal of Experimental Psychology: General, 120(1), 20, 1991.
An analysis of life span memory identifies those variables that affect losses in recall and recognition of the content of high school algebra and geometry courses. Even in the absence of further rehearsal activities, individuals who take college-level mathematics courses at or above the level of calculus have minimal losses of high school algebra for half a century. Individuals who performed equally well in the high school course but took no college math courses reduce performance to near-chance levels during the same period. In contrast, the best predictors of test performance (e.g., Scholastic Aptitude Test scores, grades) have trivial effects on the rate of performance decline. Pedagogical implications for life span maintenance of knowledge are derived and discussed.
Related knowledge strengthens already-known things, and once critical maximum stability/retrievability is reached (optimally after 7 or so precisely spaced rehearsals), it is pretty much cemented. Without optimal spacing, it probably takes quite a few more repeats. At maximum stability/retrievability the exponential maintains a "flat" form for a very long time. A refresher may be needed to get facile again; otherwise it may take a short while to remember and there may be mistakes. Even a trivial refresher will work, though, and chances to use basic algebra are many.
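To make the two-variable model concrete, here's a toy Python sketch of stability (S) and retrievability (R) with a review scheduled whenever recall probability drops to 90%. The constants (initial stability of 1 day, 2.5x growth per rehearsal) are made up for illustration; they are not SuperMemo's actual parameters.

```python
import math

def retrievability(t_days, stability):
    """Recall probability decays exponentially with time, scaled by stability."""
    return math.exp(-t_days / stability)

def rehearse(stability, growth=2.5):
    """Toy assumption: each well-timed rehearsal multiplies stability."""
    return stability * growth

stability = 1.0  # days
schedule = []
for _ in range(7):  # the "7 or so precisely spaced rehearsals"
    # Solve e^(-t/S) = 0.9 for the next review time t.
    t = -stability * math.log(0.9)
    schedule.append(round(t, 1))
    stability = rehearse(stability)

print(schedule)   # review intervals grow roughly geometrically
print(round(retrievability(365, stability), 2))  # recall a year after the last review
```

The point the numbers make is qualitative: intervals stretch out geometrically, and after a handful of well-spaced rehearsals the curve is flat enough that even a year of neglect leaves substantial recall.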
These are the principles the SuperMemo SM-17 algorithm quantifies.
Here's a link with all the history and references to some other research: https://www.supermemo.com/en/articles/history
Thanks for the paper; it's slightly wishy-washy, but still useful.
So my question is what level of calculus is retained, and if students who take calculus then advanced calculus or introductory analysis or whatever you’d call it retain more.
The introduction of the ECTS was, however, highly controversial, at least in Germany. Before the Bologna Process, German universities had the "Diplom", taking no less than 4.5 years for CS. The introduction of the bachelor's with as little as 180 ECTS, so only 3 years, was seen as severely cheapening the education. Universities generally don't see themselves as preparing you for the job market; instead they provide you with the prerequisites to enable you to contribute novel ideas to your field in the form of a doctoral degree. That almost every CS job requires at least a bachelor's degree is just a side effect from the point of view of the university. An often-mentioned criticism was the government selling out the education system so companies could get new employees more quickly. Quantity over quality. As a result, you still find a high percentage of students automatically adding a master's to get the equivalent of a "real degree", with some universities not changing much in the structure of their Diplom curriculum, awarding you a bachelor's degree after 3 years but expecting you to finish the rest.
There is, however, also the opposite in the form of a (paid, but badly paid) 3-year apprenticeship after school. You can become an "IT specialist" without going to university, and even without the prerequisite school years to start university (normally successfully finishing grade 13 and thus getting your "Abitur"). You can start the training once you have successfully finished your 10th school year and thus gotten your "Realschulabschluss". The 3 years are 50/50 professional school and working in a company. Looking at the curriculum of one of the first schools on Google, they have in total between 880 and 960 school hours. Depending on your focus: 300-400 hours of "information and telecommunication systems", 200-300 hours of application development, 200 hours of econ and business processes and, due to accepting people who finished with 10th grade, 60-100 hours of English lessons. "IT specialist" here means either becoming a sysadmin or a coder.
FWIW, I assume little. I have a degree and know the effort that went into it, and am well versed in how courses are distributed in and out of your major over 4 years of study. I'm not engaging in this discussion to say one is better than the other... I'm saying they are different. Different schools, for different purposes, serving different people.
I think bootcamps actually demand a far higher rate of output and often at a very high level within specializations. The problem is I don't want to hire someone who can crank out bleeding edge framework code 20 hrs/day for 6 months.
University-level work in physics, literature, and chemistry is so different as to have basically no overlap. "University level" is about as meaningful as "high school level" in a world where "Calculus II" tells you the course covered calculus and is otherwise uninformative.
Hopefully changing technology will be learned on the job as needed.
A bootcamp is not a substitute for a CS degree.
I think you've just described the 1% of programmers out there. For the rest, it's bootcamps and stackoverflow. Unless we're still deluded that any software developer is as good as any other just because they happen to have the same title on their contract papers. And yes, I agree with what you're saying.
Being a good person to work with, ability/desire to learn stuff, cool head when things go wrong, and being reliable are far more valuable once you've got that basic CS knowledge.
I know this isn’t popular on HN, and maybe things are very different in places where education isn’t free, but I’m fairly certain we aren’t the only country to do this.
The jobs themselves don’t usually need much CS knowledge, as you point out, but the degree is an entry point because it gives employers a certain form of safety. I know, I know the HN mantra of “weeeell you can get a CS degree and suck”, but the reason practices are like this in my region is because the companies who followed it did better than those who didn’t.
There are MOUNTAINS of research at Google which cannot be shared externally showing that -- despite all of our efforts -- we are hardly better at selecting candidates through interviews than flipping a coin.
There's also a ton of research by Amos Tversky at the Israeli Air Force that suggests similarly. That's where Google got the idea to try to measure this, and how we got our "strange" interview process.
Except that this practice will be dropped in the next few decades in many parts of the world, particularly in the tech industry.
We mostly write CRUD apps to fulfill some sort of business or project function. If we're not doing that, we're trying to make two or twenty different systems work together. If we're not doing that, we're developing CRUD interfaces for our databases with some business or project-related logic that can be executed programmatically.
None of this requires a CS degree. Design and architecture doesn't. Testing doesn't. Implementation doesn't.
I did a 4-year apprenticeship in IT and programming before deciding to go to college. It was all fine; I could do real coding for a good company. I knew the syntax of a few languages, database design, UML, some basics of how web servers work, ... But it didn't feel proper. It was more like knowing a lot of things, but only superficially.
College, on the other hand, felt much more in-depth. I was really given the time to properly understand the things I was doing. Even the "maths" bits helped. E.g. vector maths sometimes changed the way I reason about little unrelated things.
That's such a weak justification in my opinion. Why would you want to work on CRUD apps over more interesting stuff?
The workload increased by 50x, but the system I built only needed a couple engineers to maintain and add features to.
I hate to sound cocky, but I think any engineer without a formal CS background would have attempted to solve each case on an ad hoc basis, either greatly delaying the project at best, or fucking it up irremediably at worst.
I have many examples like that.
Perhaps in all of the examples you cited, you’ve inadvertently cost the company 10 times the time or the money that a more methodical engineer with a theoretical background would have. You don’t know what you don’t know, and that’s the difference between an engineer and a technician (a distinction that the US doesn’t tend to make, but some other countries do).
As for CS fundamentals, the only time I have needed to implement a sort algorithm was at university and in some interviews. Not saying that it's all bad, it is useful to understand the underlying principles but there seems to be a focus on stuff that is becoming less and less relevant.
Sure, but at some point you've probably had to implement some sort of novel algorithm. The reason they make you implement a sort algorithm isn't so that you learn to write sort algorithms, it's so that you learn how to implement an algorithm in general.
Even designing and implementing fairly simple algorithms is easy to screw up if you're not sure how to approach it.
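To make that concrete, binary search is the canonical example of a "simple" algorithm that's easy to screw up: Jon Bentley reportedly found most professional programmers couldn't write a correct one, and an integer-overflow bug sat in Java's `Arrays.binarySearch` for years. A careful Python sketch, with the classic pitfalls called out in comments:

```python
def binary_search(items, target):
    """Return the index of target in a sorted list, or -1 if absent.

    Classic pitfalls: off-by-one in the initial bounds (len(items) vs
    len(items) - 1), a non-terminating loop (lo < hi vs lo <= hi), and,
    in fixed-width-integer languages, overflow when computing (lo + hi) / 2.
    """
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = lo + (hi - lo) // 2  # overflow-safe form of (lo + hi) // 2
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1
```

Python's arbitrary-precision ints make the overflow pitfall moot here, but the bounds and termination pitfalls are language-independent, which is rather the point being made above.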
A CS degree, on the other hand, teaches you a lot more than you really need to know (compiler design, theory of automata, etc.) as a software engineer, and so it's kinda wasteful. Sure, it's interesting and fun to learn as an academic, but it's not actually useful.
The nice thing about bootcamps is they teach you just what you need to learn and nothing more, nothing less.
There are few software engineering jobs that utilize the full scope (or even half) of the CS knowledge you gain from a CS degree.
One can learn CS at their own pace. Being interested and driven to learn is the difference between a bootcamp coder with only a few years of shelf life and a long career. Same goes for that CS degree.
I think bootcamps attempt to fill this, although maybe less so for CS/maths fundamentals. Online course work seems good, but the social aspect of having a physical space to go to, meet and collaborate in is what's missing.
Any ideas for how to solve that?
Try getting a job outside of Silicon Valley by just knowing “CS concepts” when your standard Corp job wants someone who can hit the ground running on their tech stack.
That statement is rubbish. An education in CS will get you certain things, but your career is what you make of it. If you(hypothetical person, not OP) go to a bootcamp expecting to be handed a career on a silver platter, then you're a moron. Yes, there are plenty of those morons. I knew some of them when I went to Dev Bootcamp. And they were the types who didn't take charge of their careers, work on their people skills, go to meetups, interview like mad, etc. They got exactly what they put into it and either floundered or ended up in a different career, and I don't begrudge them for figuring out that the field wasn't for them.
Going to a bootcamp was a turning point in my life, and I'm making six figures working on a high profile app 6 years later as of this month. I know others who went to my bootcamp, didn't have a degree in CS, and are way more successful than I am. But it doesn't even matter if someone goes to bootcamp; they can learn entirely using free resources, and their inner drive is still the most important variable in their future success.
Is having a CS degree better? Maybe it is. But your post is telling people, who may be the perfect type of person to go to a bootcamp, that bootcamps are junk and are short term gratification. That's asinine.
By the way, what makes you think that your knowledge in CS and AI are going to be applicable in another decade? You have a good shot, for sure, but if there is a revolution in AI that changes the field fundamentally, or if(when) computers begin programming themselves, then you are just as screwed as someone whose framework or language of choice has become obsolete and dead.
Tech stack might go out of vogue? Then fucking learn another one. Your language is going the way of COBOL? Then fucking learn another language. The end. You don't need a formal education to do that. The only reason anyone is in a bad situation because their skills became obsolete is because their ability to predict the future sucks and they couldn't or wouldn't learn new things fast enough. Most languages aren't even that different, and there's a TON of replication between frameworks, libraries, compilers, VMs, etc.
I'm not trying to disparage your background in CS. I commend you for it. No matter what, there will always be people who are too inept to adapt. CS will only help those who have a calling for it, but no matter what, if someone doesn't have that fire within them that's going to take them where they want to go, they're an accident waiting to happen.
6 years is not really long enough to measure longevity.
Is having a CS degree better?
That's exactly what I'm saying. Your time and money would be better spent on getting a CS degree if you are concerned about career longevity and opportunities.
By the way, what makes you think that your knowledge in CS and AI are going to be applicable in another decade?
Because general CS knowledge stays consistent.
This is almost like asking whether broad subjects like math, chemistry, physics etc. will be relevant years from now. Yes, they will be. If you are asking this question, then perhaps you should reevaluate what you think CS is.
If AI reaches the point of becoming self aware and writing code, then we're all out of work. I'm not too worried about that happening.
Specific domain knowledge, outstanding interpersonal skills and customer empathy, design/UX/sales etc - these will likely be far more valuable than CS fundamentals if AI starts writing all code.
I am 45 years old and got my degree from a no-name school that taught one simple data structures class; the rest of the classes taught outdated programming languages (except for C). I can honestly say that nothing I learned in school served me better than the time I spent hacking around in assembly in the 80s in middle and high school.
How much more “longevity” should I be on the lookout for, since my degree was useless? I’ve already been doing this professionally for almost 25 years.
When Flash was at its peak, I worked in eLearning writing simulations in Flash, and supporting backends in VB/VB.Net, some C# and ASP/ASP.Net with SQL. From there, much more web/JS, and various other database backends (Cassandra, Mongo, Redis, etc). Currently working with C# and Node.js in web apps. Learning Rust.
In the end, progress doesn't stop and wait for you. I tend to push for things faster than my workplaces want to adopt. In the end, it's a struggle, and it doesn't end. I will continue to do so until I die. I have absolutely no plans to retire.
Formal CS knowledge and education can help. Understanding and learning multiple platforms and languages helps more. If you want to settle in and rest on your laurels, you won't last forever. You're best off understanding various ideas, workflows, and patterns, and how to recognize when one is a better fit. I have my preferences but am under no illusions that things will stay the same.
In the end, you have to commit to spending time each month/year learning and working on new things. It's the only way to keep up or get ahead.
Eh... While that might be a good story to sell to his investors, most anybody worth their salt won't need to go back to a "bootcamp coding school" every 8 years to learn a new language.
Maybe if you finished an undergrad you'd be well equipped to stay on top of an ever-changing tech landscape?
If bootcamps take off it will be because they replace this function of university.
If you know how to program, that skill does not simply "go obsolete".
The sort of "skill" that this article is talking about seems to be on par with like, the time I thought I knew how to write C as a teenager.
The local market going to shit, you developing medical issues, ageism, etc are all far more of a worry than "oops, I've been doing QBASIC for 10 yea.....[NO CARRIER]"
The article was right that ongoing learning is the key. And while formal higher education is not for everyone, it does well at teaching people new techniques for learning and research. It isn't the only path, especially in today's reality. Its current failings are one reason that bootcamps exist...
But the idea that people are going to let their skills stagnate for 8 years, then return to a bootcamp to get the latest tech... Sorry, but that is simply absurd.
It's nice for small scripts that people might use bash for. I do Python most of the time, but Perl's syntactic sugar makes it so much nicer for something like looking up a set of files matching a regex, then moving and renaming them.
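For comparison, here's roughly what that task looks like in Python — a sketch only, with the directory layout and regex made up for illustration: find files whose names match a pattern and move/rename them via a regex substitution.

```python
import os
import re
import shutil

def move_matching(src_dir, dst_dir, pattern, replacement):
    r"""Move files matching `pattern`, renaming via regex substitution.

    Hypothetical example: move_matching("logs", "archive",
    r"^(\d{4})-(\d{2})\.log$", r"\1/\2.log") would file monthly logs
    into per-year subdirectories under "archive".
    """
    rx = re.compile(pattern)
    for name in os.listdir(src_dir):
        if not rx.match(name):
            continue
        new_name = rx.sub(replacement, name)
        dst_path = os.path.join(dst_dir, new_name)
        # Create any subdirectories the new name implies.
        os.makedirs(os.path.dirname(dst_path), exist_ok=True)
        shutil.move(os.path.join(src_dir, name), dst_path)
```

A Perl one-liner does this in a line or two, which is exactly the syntactic-sugar point being made above; the Python version trades brevity for explicitness.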
Looking back, I don't even think anyone could find the seam in my career.
"HiSOFT has been in existence since 1980, founded by David Link and Dave Nutkins.
Originally we created software for the NASCOM 1 kit-built microcomputer but swiftly moved on to the ZX Spectrum, for which we created many esoteric items such as HiSOFT Devpac, HiSOFT C, HiSOFT BASIC, HiSOFT Pascal, UltraKit, Colt and much more.
After great success with the various incarnations of the Spectrum we ported our core titles (Devpac, C++ and Pascal) to many other Z80-based computers; Tatung Einstein, Newbrain, Memotech 512, Amstrad CPC & PCW, Elan Enterprise and more!
'Twas a lot of fun and, undoubtedly, this list will stir as much excitement in some people as David's favourite band since 1971, Genesis, do in him!
After the Z80 processor began to flag (shame!), we moved on to the 68000 which meant moving stuff over to the Atari ST and Commodore Amiga. This, along with many hardware projects (such as Megalosound, Replay 16, Clarity 16, Squirrel SCSI, VideoMaster etc.) kept us going through the 90s until, reluctantly, we were forced to take the PC seriously.
Having forged a close relationship with MAXON Computer in Germany throughout the Amiga and Atari years, it was natural for us to take on the UK mantle for their flagship product, CINEMA 4D, an exciting and now rather important 3D product.
HiSOFT promoted, distributed and sold CINEMA 4D from 1997 until 2001, at which point David Link formed MAXON Computer Ltd and moved all things CINEMA 4D under the MAXON umbrella. David worked at MAXON UK as CEO until resigning for personal reasons in early 2003.
David Link continues to work at HiSOFT, as you will see from this website, while also trying to earn some money running the odd pub, café and seaside bar/restaurant/guest house!"
I just thought it was a good example of how people evolve. It's also interesting that they used to make development tools and now do websites. The rest of the career is an interesting evolution as well.
But essentially all other descriptions of older programmers make that point (and it’s the conventional, however incorrect, wisdom that older programmers stop programming, as in sports). So when you simply describe an older programmer without making any other point explicit, one must assume that’s what you meant to say.
I'm driving a semi truck.
Loadsmart And Starsky Make First Start-to-Finish Autonomous Truck Delivery
UPS Quietly Using Self-Driving Trucks For Months
In the meantime, trucking is hiring.
Not that I would recommend it to most people, it's just what I did when I realized I wouldn't be working in software anymore. I ended up liking it.
Software is a field that works on how other jobs get automated. We'll have work until it's all automated, which is unlikely in our lifetimes.
To prevent this from happening to me, I like to keep an eye on which technologies and programming languages are in highest demand, and which direction they're moving in terms of job demand. So I created this app that measures programming language demand based on job postings and analyzes it on a city-by-city basis, cross-referencing with posted salaries: https://skilldime.com/
A year to learn a different stack and put it into practice to satisfy the prior art? She was no programmer. She was applying rote patterns and had no programming ability.
Stop conflating practice with programming ability.
It's like saying she was a doctor, but it took her a year to learn to use a different stethoscope.
In 2000 during the dot-com boom we learned to talk about IT "tourists", and the tourism did not end with the crash in 2001.
If you direct your career by keeping a close eye on job postings, you are definitely a tourist.
The bootcamp culture and the emphasis on teaching young people to "code" is nothing better than the travel agency encouraging people to spend their vacation dollars on a trip to the Caribbean. Creating more tourists to satisfy the demands of an expanding industry does less than nothing to satisfy the real technology needs of a growing sector.
Nurses are not doctors.
This was the same discussion around 2001. You had people who got into tech for the money and people who just like it. Many people who did it for the money didn't survive past the dot bomb in 2001.
This is more of an observation than a criticism, btw.
You know what they do with engineers when they turn 40? They take them out and shoot them.
The constant churn, while valuing little of what I feel I am actually skilled at: good code design, keeping things as simple as possible but no simpler. Instead, we need experience in the trendy framework of the month, and interviews on algorithms that you will never need in your job. Oh, and please spend the weekend completing our pointless technical test before you get to speak to anyone technical about the job we want to interview you for.
I’m 45 and have had to abandon frameworks, languages, and operating systems for close to 35 years.
We get paid an above average wage and all we have to do is watch a few videos and read a little bit.
I’m not sitting here crying that my knowledge of 65C02 assembly is obsolete or that I had to learn the intricacies of DEC VAX and Stratus VOS over 20 years ago.
I want to solve problems efficiently, not chase fads.
Flash is not the first tech to die and it won't be the last. Hell, I spent almost a decade of my career focused on Flash dev... and even when I was learning Flash and ActionScript, there were 3 or 4 main platforms I used to work with that I had already abandoned. This is a constant.
Also. Even a long time after Flash was already on the way out, the demand for good Flash devs was absurd, made worse because it wasn't a "hot" platform anymore. I remember we had a Flash project that needed to be maintained at the company I was working at and we needed to hire an external developer to do it. A lucky guy with the skills ended up being paid top dollar for a super easy job for years because it seemed no one else could do it. He was smart and kept his skills up to date, so he had no problem getting out of that scene later, but he surely used this "outdated" knowledge to his advantage. This also happens all the time.
Only having knowledge is probably the hardest place to be, and those are the least desired staff ("paper $CERTIFICATION"), but if you have the others then knowledge is (arguably) the easiest thing to add.
If you have programming skills then you can extend from that base into front end, back end, etc. It's going to be harder to jump to sales, because you don't have the skills or the knowledge. Similarly if you've only ever done sales, or accounting, or cooking, or whatever then it'll be harder to develop the skills and knowledge for programming.
It's not a stretch to imagine that having Flash skills is one thing, but building digital experiences in JS, or JS-friendly/inspired syntaxes, likely draws on plenty of transferable skills.
In another way of looking at it, today's JS developers share some kinship with ActionScript developers.
I came into CS expecting a cushy job which was well paid. What I found was a challenging job that placed extraordinary premium on agility and the capacity to build useful products. I love being a "tech worker".
1. Learn new skills
2. Move into management
3. Leave the industry
In my non-Bay Area city, the majority of jobs are Node or RoR. Been struggling to find a job, especially since Java is synonymous with big data, which I have no experience in. No Spark? Sorry. No ML? Next! Ageism in tech is real.
Most Java jobs are not big data. There may be a skew in the openings, as people with java/scala big data skills are hard to get, but there are probably millions of people doing non-big data Java development right now, and it's not going to change.
One possible explanation for what you're seeing (besides not being in the right kind of city) is that maybe offshoring has hit the US enough to affect the Java market? There are tens or hundreds of thousands of Java developers working in Poland doing offshore development for US/global corporations. The whole of Central/Eastern Europe is like that.
Initially mentoring less experienced developers, then into running a small team, scaling up the managerial aspects and learning to let go of the code, hire great people and trust them to deliver over time.
The modern industry moves faster than that, new roles are machine gunned into our inboxes, we are told you can't stand still or you're hurting your own career.
Those aren't mutually exclusive situations, but it's certainly made more difficult by their orthogonality.