Who knows what the future will hold, but in the US, there are more new lawyers graduating each year than there are law jobs. Software has taken over many of the jobs the entry level lawyers used to perform.
Law (in the US) requires an extra, very expensive, 3 year degree, and many law school graduates are never able to find jobs as lawyers.
The market will probably eventually correct itself, but the days of law school as a sure-fire way to a high paying job are probably over.
The "more new lawyers graduating each year than there are law jobs" factoid is both stale and irrelevant. Most of those new lawyers had admissions scores well below the 50th percentile and are graduating from third and fourth tier law schools. Who cares about them? I doubt many would have had bright futures in programming, either.
Relatively modest undergraduate performance can get you into a law school like Northwestern's, where 87% of the class of 2013 "found work and reported a salary", and at least 43% of the class had a starting salary of over $160k, a very respectable ending salary for a programmer, especially outside of SFBA (http://www.lstscorereports.com/schools/northwestern/sals/201...).
Almost 50% of new law school graduates can't find jobs. Of course it's likely that the unemployed 50% is the bottom 50% of applicants.
Look at the distribution of lawyer income. There is a large group bunched near the bottom making 40k-60k a year, and a smaller group at the top making above 160k. You need to do really well to be in the 160k group.
>Relatively modest undergraduate performance can get you into a law school like Northwestern's
The median LSAT score for Northwestern is 168, which is right at the 96th percentile, so only about 4% of people taking the LSAT will score a 168 or higher. Even Northwestern's bottom quartile score is just below the 90th percentile. Their median undergrad GPA is 3.75. How are those numbers relatively modest?
So yes, someone who scored in the 96th percentile on the LSAT and had a 3.75 GPA in undergrad has a decent chance of spending 3 years at a top tier law school where they have a 43% chance of making over $160k a year upon graduating.
Northwestern also costs about $300k to attend.
>87% of the class of 2013 "found work and reported a salary"
I don't think that's really saying all that much. It doesn't say they're working in jobs requiring a law degree. Of course 87% are working at some kind of job--they owe $300k in student loans. Another way of looking at it: 13% of graduates from a top tier law school are unemployed with $300k in debt.
No one is arguing that lawyers from top tier law schools can't make a decent salary, but there are only a few thousand slots open in the top law schools each year. If you're in the top few percent of law school applicants and you think you'd enjoy practicing law, then by all means go to law school.
But looking at the averages, the median salary for a software developer is about $93k, and the median salary for a lawyer is $113k (from the Bureau of Labor Statistics). Total cost for law school is over $150k on average, and the opportunity cost of not working as a software developer for 3 years is much more than that. Add in interest on student loans (and forgone interest on potential savings) and it will take over 2 decades before the average lawyer pulls ahead of the average software developer.
Add to that the fact that software developer jobs are expected to grow at a significantly higher rate than lawyers, and that lawyers constantly place near the bottom on job satisfaction surveys.
By the way, I initially planned to go to law school, but every lawyer I talked to was so discouraging that they eventually talked me out of it. A few of them were very successful family friends, but they absolutely hated their jobs, and they warned me that there are much easier ways of making money.
If you're a programmer who can pass algorithm interviews, you have a great chance of scoring near the 90th percentile the first time you attempt the LSAT, and the 96th percentile with practice. Scoring much below the 80th percentile on the LSAT is very poor. In Canada, few students are admitted to any law school with scores that low.
Say you spend $300k on law school. Over a 30 year career, you only have to make an average of $10k extra per year to break even.
The "Jobs Data" tab offers more details: "79.2% of graduates were known to be employed in long-term, full-time legal jobs", "93% graduates were employed in long-term jobs", etc.
The number of software jobs is expected to grow, but is the growth going to be in jobs you really want, or will they all be for 23-year-old coding bootcamp grads?
Again, who cares about the nationwide averages? The 50th percentile Northwestern law grad makes $160k right out of school, and is on track to make several times that as a law firm partner, or somewhat less as in-house counsel. My impression is that most programmers struggle to hit $160k any time in their careers, at least outside of SFBA.
When someone describes the downsides of their job, I take it with a grain of salt. Often, it's a case of "the grass is always greener". Sometimes, members of high-status professions want to downplay their success. In any case, most of the lawyers I've talked to say they enjoy their work (though they do work much longer and less predictable hours than programmers).
>If you're a programmer who can pass algorithm interviews, you have a great chance of scoring near the 90th percentile the first time you attempt the LSAT, and the 96th percentile with practice.
That's probably true. But again, there are only a few thousand slots available each year at top law schools, so for the vast majority of programmers this can't work. Just a few hundred each year taking your advice would change the equation.
>Say you spend $300k on law school. Over a 30 year career, you only have to make an average of $10k extra per year to break even.
That's true, but the average is more than $300k. The average programmer makes $93k a year; since the law student gives up 3 years of work, that's $279k in lost wages + $150k for law school.
Sure the lawyer will likely eventually pull ahead, but extra money near retirement is worth less than money early on. If the programmer invests the extra money early on, the lawyer may never actually pull ahead.
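The invest-the-difference argument can be sketched with a quick simulation. Every number here is an assumption taken from this thread ($93k vs $113k median salaries, 3 years out of the workforce, $150k tuition plus $279k forgone wages, and an assumed 5% annual return on both savings and debt); vary them to test the claim:

```python
# Rough sketch of the lawyer-vs-programmer break-even argument above.
# All figures are assumptions from this thread, not authoritative data.

def wealth_after(years, salary, start_year=0, initial=0.0, rate=0.05):
    """Accumulated balance if the full salary were banked each year at `rate`.

    `initial` can be negative to model starting out in debt; the debt
    compounds at the same assumed rate.
    """
    w = initial
    for y in range(years):
        w *= 1 + rate            # existing balance (or debt) compounds
        if y >= start_year:
            w += salary          # no salary during the school years
    return w

programmer = wealth_after(30, 93_000)
# Lawyer: 3 years of school, starting $429k behind ($150k tuition + $279k lost wages).
lawyer = wealth_after(30, 113_000, start_year=3, initial=-429_000)

print(f"programmer: ${programmer:,.0f}")
print(f"lawyer:     ${lawyer:,.0f}")
print("lawyer ahead" if lawyer > programmer else "programmer ahead")
```

Under these particular assumptions the compounding head start keeps the programmer ahead at the 30-year mark, though the outcome is sensitive to the assumed salaries and return rate.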
>The "Jobs Data" tab offers more details: "79.2% of graduates were known to be employed in long-term, full-time legal jobs"
"Legal jobs" doesn't mean working as an attorney, or jobs requiring a law degree. It could mean $15 an hour paralegal work, so that statistic isn't useful.
>The number of software jobs is expected to grow, but is the growth going to be in jobs you really want, or will they all be for 23-year-old coding bootcamp grads?
That's possible, but the new jobs for lawyers could be just as bad. From the Bureau of Labor Statistics "Some recent law school graduates who have been unable to find permanent positions are turning to the growing number of temporary staffing firms that place attorneys in short-term jobs."
Software has been eating into jobs that were traditionally done by lawyers, and it will continue to do so.
On top of this, lawyers are limited to practicing in states where they have passed the bar exam, meaning their ability to move to find jobs is much more limited.
>Again, who cares about the nationwide averages? The 50th percentile Northwestern law grad makes $160k right out of school, and is on track to make several times that as a law firm partner, or somewhat less as in-house counsel.
And they admit about 200 new students per year. So yes, if you can get into Northwestern and you like law, then it's a good decision.
>(though they do work much longer and less predictable hours than programmers).
That's a huge caveat. The average programmer could have been the average lawyer instead, working more hours each week at a higher stress job so that he can break even in 20 years, then spending the last 10-20 years of his career making a bit more money.
If you like law and can get into a good school, then practice law. But I hardly think the extra debt, stress, and hours worked make it worth it for purely economic reasons.
>When someone describes the downsides of their job, I take it with a grain of salt. Often, it's a case of "the grass is always greener".
This would be the case for both programmers and lawyers, but job satisfaction surveys show that lawyers consistently rank near the bottom below programmers.
SEEKING WORK - Remote & Atlanta (remote preferred)
I'm a full-stack developer based in Atlanta. I've built a profitable startup, so I know how to solve problems and get things done with a minimum amount of direction.
I can take on projects at any stage--from sketches on the back of a napkin, to 20 year old legacy code. Whether you need someone to build and deploy a complete product from the ground up, or untangle an existing mess, I can handle it.
I'm an excellent communicator, and I will provide clear and concise status reports through every phase of the project. My job is to make sure you never have to worry about how your project is going.
I also have a strong foundation in computer science (B.S. in CS and constantly learning), and experience with many other languages and frameworks. I can handle anything you can throw at me, so don't hesitate to contact me if you don't see your technology stack listed.
Rates $60-90 per hour. Weekly/Monthly discounts available.
Uber isn't really a job open to the poor. First, you need to be able to afford a decent car, which is out of reach for a lot of people. Second, if you're poor and living paycheck to paycheck, you're unlikely to have the cash to cover any major repairs.
Also, after you deduct vehicle depreciation, insurance, gas, and time driving without a fare, Uber doesn't pay all that well per hour.
As far as actual driver statistics go, Uber isn't a good employment alternative for the poor, it's a stopgap measure for the unemployed middle class--more than half of drivers have a college degree and more than half work for less than a year.
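The per-hour claim above can be illustrated with a back-of-the-envelope calculation. Every number below is a hypothetical assumption chosen for illustration, not actual Uber data:

```python
# Back-of-the-envelope net hourly pay for a rideshare driver.
# Every figure here is a hypothetical assumption, not real Uber data.

gross_fares_per_hour = 19.00   # assumed average gross fares per hour worked
miles_per_hour = 20            # assumed miles driven per hour, incl. deadheading
cost_per_mile = 0.35           # assumed gas + depreciation + maintenance per mile
insurance_per_hour = 1.00      # assumed extra rideshare insurance, prorated

vehicle_cost = miles_per_hour * cost_per_mile + insurance_per_hour
net_per_hour = gross_fares_per_hour - vehicle_cost
print(f"net: ${net_per_hour:.2f}/hour before self-employment taxes")
```

With these made-up figures the vehicle eats roughly $8 of every hour's gross, which is the point: the headline hourly rate overstates take-home pay considerably.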
> Uber isn't really a job open to the poor. First, you need to be able to afford a decent car, which is right out for the for a lot of people. Second, if you're poor and living paycheck to paycheck, you're unlikely to have the cash to cover any major repairs.
Which doesn't stop people from buying nice cars. Don't get me wrong, the whole recent model car thing is a barrier, you need credit; but have you seen how easy it is to get a car loan lately? It's like subprime mortgages back in the day.
And uber will even give you one of those 'merchant account advance loans' to buy the car where they take their payment out of your earnings; I imagine the income requirements on those are even lower than on a conventional car loan. (though, I imagine if your credit is terrible, you are screwed either way.)
Also, a 2005 model year car is... not a high bar.
Remember, in most parts of America, you need a reliable car for that job at Burger King, too. And once you factor in maintenance and gas, a 2000 model year really isn't all that much cheaper than a 2005.
>Also, after you deduct vehicle depreciation, insurance, gas, and time driving without a fare, uber doesn't pay all that well per hour.
I'm comparing it to foodservice; and compared to burger king, from what I hear, it does pay pretty well. And the cost of operating the car (that you are gonna need for that burger king job, too) can (well, it's complicated) be paid for out of pre-FICA (well, pre-tax, but FICA is most of the tax that poor people pay) money, which is a nice benefit you don't get from burger king.
But yes, I think we're in agreement that it's not very good compared to a middle-class job.
My point is just that it's pretty nice compared to a lower-class job.
>As far as actual driver statistics go, Uber isn't a good employment alternative for the poor, it's a stopgap measure for the unemployed middle class--more than half of drivers have a college degree and more than half work for less than a year.
40% of working-aged Americans, from what I read, have college degrees; I don't think that's an indicator of "not poor" anymore.
> Don't get me wrong, the whole recent model car thing is a barrier, you need credit; but have you seen how easy it is to get a car loan lately? It's like subprime mortgages back in the day.
You do realize that subprime mortgages turned out terribly for a lot of low-income people, right? Because when the economic circumstances changed, their debt was suddenly unaffordable, ruining credit and ruining lives? And that many of those sub-prime mortgages were basically designed from the start to take money from unsophisticated borrowers, saddling them up with debt they probably couldn't afford.
>Which doesn't stop people from buying nice cars. Don't get me wrong, the whole recent model car thing is a barrier, you need credit; but have you seen how easy it is to get a car loan lately?
I think we have a different definition of poor. I'm not talking about someone who makes $40k a year and lives paycheck to paycheck because they spend above their means. I'm talking about people making $15k a year (or less) at a near-minimum-wage job. The kind of people who work in fast food.
>Also, a 2005 model year car is... not a high bar
For a person working 30 hours a week at a fast food restaurant that's a very high bar. And you're forgetting that Uber inspects the car as well, so the clunky beaters that nearly everyone drove when I worked retail during college aren't going to pass inspection.
>Remember, in most parts of America, you need a reliable car for that job at Burger King, too.
Most people working at Burger King don't have reliable cars. If they live outside of a mass transit area, they rely on friends and family when they have breakdowns. If you've ever managed retail employees (I was a geek squad supervisor in college), you'll know just how frequently workers rely on other people to drop them off, or how often they call out because their car broke down.
> And once you factor in maintenance and gas, a 2000 model year really isn't all that much cheaper than a 2005
That's true. But this is a common problem that poor people face. They don't have the money up front, so they take the cheaper up front option even though it's more expensive over the long term.
>I'm comparing it to foodservice; and compared to burger king, from what I hear, it does pay pretty well.
Definitely, but for the average burger king worker buying a car to work for Uber isn't a realistic goal. They don't have the cash and they don't have the credit to get a loan.
My point is that there's little value in comparing Uber to Burger King, because there is a very small intersection between people who are forced to work at Burger King and people who can afford to work for Uber.
>My point is just that it's pretty nice compared to a lower-class job.
Yes, but it has barriers to entry that keep it from being an option for the working poor, so again the comparison doesn't hold.
>40% of working-aged Americans, from what I read, have college degrees
If we're talking about the same statistic, the 40% number was pretty inflated because it considered 2 year and professional degrees.
Also only 2.1% of people with a bachelor's degree are classified as working poor (under the poverty level and working) while over 21% of people with no college degree are classified as working poor.
Since a college degree decreases your chance of being working poor by 10 times, I'd say it's a good discriminator.
The important part of wpietri's critique is "during the instants we are directly useful".
The CEO doesn't stop being paid when she walks to the bathroom, and a cashier doesn't stop being paid when he's waiting for a customer to show up. wpietri is talking about a shift in the granularity of trading your labor for money, so that you aren't paid for a day or an hour of work, but for the exact number of minutes you performed useful work.
The shift is away from paying an employee for 8 hours of work--knowing that he will only be performing useful work for some fraction of that time--towards paying the same employee only for the time spent being productive.
Since most people aren't able to work productively for an entire 8 hour shift, this change probably means a net decrease in wages.
We've already started to see this with computer scheduling in retail. Assigning split shifts, irregular shifts, and scheduling the bare minimum number of employees required to minimize employee downtime.
And I'd add that for me the biggest issue is the extent to which it treats workers as disposable. In a salaried position, everybody understands and behaves as if people are people. E.g., sick days.
As somebody who was a self-employed consultant for years, I get that being one's own boss has upsides. But I always maintained a large buffer of cash and could price in all the time I spent looking for work. Things like Uber and the new retail scheduling trend don't allow for that, and as far as I'm concerned amount to calculated exploitation of desperate people.
In some sense, that's nothing new in labor history; we've always had exploitation of desperate people. But that's no excuse for increasing it, and in particular I despise Uber's "we're just helping people be independent entrepreneurs" spin to cover up that exploitation, especially when their whole plan is to kick those supposed entrepreneurs out into the snow the moment they have self-driving cars.
Worse still, school un-prepares you for healthy interactions. You grow all kinds of needles, you become wary of people and insecure. It takes literally years of your life to overcome that. Sometimes forever.
What grades are you talking about? There was a very large experiment done over 80 years ago that showed kids who weren't taught arithmetic until 6th grade did just as well as their peers after just a year of instruction. They even performed better than their peers on word problems.
There is no solid evidence that teaching reading early is advantageous, and in fact there is evidence that it can be damaging. No one really knows why exactly, but there are several theories, ranging from its discouraging other types of play and interaction, to its turning kids off reading later on.
>it's "how soon does this kid have the tool to satisfy their natural curiosity about things that require more than someone telling them about it?"
I don't think that's true at all. An average 4-to-7-year-old who can read cannot read at a level where they can learn topics complex enough that they "require more than someone telling them about it."
A kid who learns to read at 7 will catch up to the kid who learned at 4, so that by the time they are ready to teach themselves on their own through reading, there won't be a difference.
The worst class I had in college was Software Engineering. It was the university's attempt to prepare us for the work force, and it was taught by an adjunct who had plenty of industry experience, but it was already 10 years out of date.
Industry processes are mostly fads that change with the wind. CS fundamentals, however, are much more stable. 20 years from now, knowledge of automata, graph theory, and complexity analysis will still have immense value--a scrum master certification won't.
I also had a Software Engineering class (actually two of them) focused on how to build software in real life. This was in '03 and we covered things like waterfall methodology, requirements gathering, functional specs, etc. If taught the exact same way today it would be woefully out of date, but the time we spent on requirements gathering (where the teacher or TA pretended to be a product owner and purposely gave really crappy answers, and we had to extract useful information bit by bit) was one of the best pieces of prep I ever received.
All in all, it was boring and tedious but it certainly wasn't the worst class I ever had in regards to preparing me for a career in technology. I use those lessons to some degree all the time, I rarely directly use all the work I had to do to create my own OS...
> I use those lessons to some degree all the time, I rarely directly use all the work I had to do to create my own OS...
I think you probably would have learned those lessons after a few months on the job. You probably wouldn't have picked up the knowledge to build your own OS on the job, however.
>All in all, it was boring and tedious but it certainly wasn't the worst class I ever had in regards to preparing me for a career in technology.
Did you learn about billing clients, and effectively advertising your services in your CS degree? What about equity versus salary tradeoffs?
At least 50% of working in technology is soft skills, so why doesn't a CS degree spend 50% of its time teaching those? The answer is that a CS degree isn't supposed to be vocational training.
Vocational training, including learning to talk to clients, should be done on the job, during an internship/apprenticeship, or in a specialized vocational training program. By including it in a college degree, employers have successfully pushed employee training costs onto workers and society (as the article argues).
The article isn't complaining about CS majors not getting jobs, it's about soft liberal arts majors not getting jobs.
When companies hired people for 30 year careers, they could afford to invest a tremendous amount in training. When they hire people for 2-3 years, they have less time to amortize the costs. And it's up to the employee to convince the companies that they can learn quickly, and on their own if need be.
Any CS major with decent grades and a positive attitude can learn anything in most any job. (Certainly CS, consulting, finance, marketing, even some kinds of sales) I can't say the same for liberal arts majors. There are great ones out there, but also a lot of folks who goofed off for 4 years and didn't learn anything.
You could take that one step further. CS has, in its modern incarnation, only been around for ~90 years. CS, as it is taught at the undergraduate level, has changed dramatically in recent years, and will continue to into the future.
Math, on the other hand, has been around for thousands of years, is relatively stable, and unlikely to become obsolete in the way a scrum certification, or even a machine learning algorithm, will.
Of course, 'CS fundamentals' usually end up at a very close intersection with math. I'm just suggesting that mathematics has an even deeper level of the 'stability' you referenced.
>Of course, 'CS fundamentals' usually end up at a very close intersection with math. I'm just suggesting that mathematics has an even deeper level of the 'stability' you referenced.
I agree with you completely. The parts of CS that are stable are the parts that are based on rigorous mathematical foundations. I think that teaching things like Object Oriented Programming strays too far from a rigorous foundation--away from math and even engineering into craft (which belongs in vocational training).
When I look back, the classes that I learned the most from were Discrete Math, Automata, Design and Analysis of Algorithms, and Programming Language Concepts (which went into the academic side of programming language research more than what was currently in use in industry).
I mostly agree, but for me, the line gets blurry around the applied areas that have a lot of depth: computer architecture, operating systems, networking protocols, compilers, and databases. In all of those, I learned a lot about theory, practice, and engineering trade-offs, all of which was worthwhile. I didn't study it myself, but I would imagine distributed systems is (or should be) a similarly rich subject. I also learned a ton from studying the history of computing, which I wish would be more of a focus for those entering the industry.
Good point. Many of the areas you mentioned do have a lot of formal underpinnings, and there are large bodies of research to look to for guidance.
Computer architecture is big E Engineering, done by Computer Engineers for example. The networks class I took was also one of the most math heavy, and most of the book was supported by proofs. In addition, we spent the first half of databases working with only relational algebra.
If you look through a textbook on any of the subjects you mentioned, and compare it to say a book on design patterns, the distinction between math/engineering and craft is pretty clear.
>I also learned a ton from studying the history of computing, which I wish would be more of a focus for those entering the industry.
We went over the history in depth in my program--from Turing to Konrad Zuse to Backus. I also found it immensely useful.
My software engineering class spent most of the semester going over design patterns, which are in fact quite useful to learn in school, and then maybe 1/4 of the class going over the various development methodologies. I agree that a class devoted entirely to methodology would be complete overkill. However, I think there is room for some exposure to it in school, ideally before taking higher level classes, where knowledge of existing ways to structure your group work will be beneficial.
The problem is design patterns are subjective, they are craft, not science or engineering.
There hasn't been enough serious research done on "Software Engineering" to call it Engineering with a straight face. You can't point to a whole stack of serious research and say that design pattern A is objectively better in situation X because of Y and Z.
What you can say is that design pattern A is currently in vogue so you should probably use it, while design pattern B has fallen out of favor in industry, so you should avoid it.
That is something that belongs in a vocational training program or an internship/apprenticeship not in a university Computer Science education.
You're going too far when you dismiss design patterns as being merely fashion. Just because something is not objectively proven doesn't make it false. Things in the real world are not binary, where they are either objectively proven (hard sciences) or completely false ("The earth is flat"). In reality, a lot of things are gray. Design patterns fall in that bucket--many of them help, as long as you remember that there are exceptions.
If many people learnt over and over again that global state, for example, leads to more bugs, you'd be wrong to completely dismiss that just because it isn't objectively proven. Because then you'd be arguing that a program that uses only global variables is just as good as one that's properly encapsulated and abstracted. Do you think anyone would take you seriously if you said that?
I don't have a problem with design patterns as a concept. But you need to recognize them for what they are--folk wisdom. Some of it is useful, much of it isn't.
If there is no theory we have to fall back to empirical analysis, and unfortunately our industry hasn't done much of that. The only thing we have to go on is the general "consensus" of the industry, which is cyclical, transient and mostly fashion.
Some of the industry folk wisdom is beneficial and withstands the test of time. Most of us agree that encapsulation is nice. However, we don't agree on what form that encapsulation should take.
OO programmers argue that state should be hidden away inside objects; functional programmers believe that state should be explicit, and that we should strive for pure functions and immutable data when possible. There's very little objective data to support either side (except that functional programming languages tend to have more formal underpinnings). Mostly it falls back to personal preference, which programmer sages you trust, and what the current industry fashion is.
Our industry reinvents the wheel time and again because we are slaves to the cyclical nature of the industry fashion.
Relational Algebra/calculus has been formalized for decades, yet people who don't understand relational theory cried out for something "simpler" and thus NoSQL was born. Fast forward 5 years and you'll find many of the people championing NoSQL had to reinvent most of the tough problems that CS had solved decades ago.
Again, there is nothing inherently wrong with craft and folk wisdom--just like there's nothing wrong with learning salary negotiation, but they don't belong in an academic CS environment. These things are best taught in an internship/apprenticeship after you've learned the underlying theory.
Design patterns are models, something very much within the wheelhouse of engineering. Engineering doesn't always deal with absolute facts as in science. When an underlying system is too complex to fully describe, simplified modeling can be appropriate.
Models are used to get a better understanding of a complex system, or to test a complex system when you can't test the real thing. A design pattern could be used this way, but they aren't. They are used as a design methodology that you are encouraged to follow.
They also aren't based on any formal theory. They are based only on the experience of the people who create them. They are almost the definition of a craft (as opposed to engineering).
In an actual Engineering discipline you would need evidence to prove that your model fits reality as opposed to just trusting the experience of a few guys who wrote a book.
From my experience, design patterns are both taught and used as a starting place for solving a problem which is recognized as similar to a problem previously solved effectively in the manner of the pattern. Rarely will the design pattern fit perfectly as the solution, but by recognizing and using the correct one as a starting point for many architecture-related problems, you can greatly reduce the work involved in creating the solution.
They keep people from re-inventing nearly identical solutions over and over, and allow for a vocabulary which can express rather complex ideas because fellow practitioners of software engineering generally know many of the same design patterns. This saves time and reduces the opportunity for miscommunication when expressing a more complex idea (assuming the person you are talking to doesn't lie and for instance say they are familiar with the facade pattern, when they are not).
To me this is using them to both get and express a better understanding of a complex system. I am also able to understand code for new projects I am to work on much quicker when I understand the underlying design of the elements. When the elements are designed with a structure that resembles things I am familiar with this process becomes fairly easy.
I see them as things like definitions of 'Suspension Bridge', 'Victorian House', 'Tunnel', or 'Dog House'. Sure, we might see a book consisting of very rudimentary definitions of structures like these as absurd for the corresponding type of engineer, but that is because of our familiarity with the form these structures take. That familiarity with the form is the point of design patterns, in my opinion. And I believe their utility is indispensable if we want to continue building more complex structures in code.
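As an illustration of the shared-vocabulary point, here is a minimal sketch of the facade pattern mentioned earlier in the thread. All class and method names are hypothetical, invented for this example:

```python
# Minimal sketch of the facade pattern: one simple interface in front of
# several messier subsystems. All names here are hypothetical.

class InventoryService:
    def reserve(self, item):
        return f"reserved {item}"

class PaymentGateway:
    def charge(self, amount):
        return f"charged ${amount}"

class ShippingDesk:
    def schedule(self, item):
        return f"shipping {item}"

class OrderFacade:
    """The facade: callers say 'place an order' and never touch the subsystems."""
    def __init__(self):
        self._inventory = InventoryService()
        self._payments = PaymentGateway()
        self._shipping = ShippingDesk()

    def place_order(self, item, amount):
        # Coordinate the subsystems behind a single entry point.
        return [
            self._inventory.reserve(item),
            self._payments.charge(amount),
            self._shipping.schedule(item),
        ]

print(OrderFacade().place_order("book", 12))
```

Saying "put a facade in front of those services" communicates the whole shape above in one phrase, which is the vocabulary benefit being argued for.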
I have no problem with design patterns as a concept. And I have no problem with design patterns being taught as the collective folk wisdom of wise sages of industry.
However, design patterns are craft, not engineering, and should be taught as vocational training--not as an academic subject.
Design patterns have no rigorous underpinnings. They have very little academic research to back up their efficacy. The only "proof" we have that the design patterns being taught are beneficial is the word of a few guys who wrote a book and the collective folk wisdom of industry.
There is nothing wrong with this, but it isn't firm foundation for an academic subject.
OK, just cool your jets mister, everyone here is having a pretty civil conversation without you attacking people for 'pushing corrosive ideology'.
And in order to formulate that attack you pieced together quotes from two completely different messages by two different people to make your 'point'.
I said they keep people from re-inventing solutions; it was the comment you actually replied to that claimed they were too complex.
I agree they are patterns that languages do not yet abstract over, but until languages do, I think they are very useful patterns for programmers to recognize: they occur frequently enough that knowing the pattern saves a lot of time and work. Once languages can successfully abstract over those patterns, knowing them will be niche knowledge, assuming humans stop discovering new patterns before languages can abstract them.
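As one illustration of a language absorbing a pattern (my own example, not from the thread): the classic iterator pattern takes a whole hand-written class in the GoF style, but Python generators abstract the same machinery away entirely.

```python
class CountdownIterator:
    """The iterator pattern, hand-rolled GoF-style."""
    def __init__(self, n):
        self.n = n

    def __iter__(self):
        return self

    def __next__(self):
        if self.n <= 0:
            raise StopIteration
        value = self.n
        self.n -= 1
        return value

def countdown(n):
    """The same behavior, with the pattern absorbed by a generator."""
    while n > 0:
        yield n
        n -= 1

# Both produce the same sequence; only one required knowing the pattern.
assert list(CountdownIterator(3)) == list(countdown(3)) == [3, 2, 1]
```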
> OK, just cool your jets mister, everyone here is having a pretty civil conversation without you attacking people for 'pushing corrosive ideology'.
Rhetoric = Ethos + Logos + Pathos
Unless you want to believe I chose my words while extremely angry at you personally, and unless you want everyone to abandon pathos and talk like robots in perfect clauses and rebuttals connected in a directed acyclic graph, I suggest you get used to this very common rhetorical device.
Oh god yes I went through the same pain. Our university decided that every Engineering student had to take two CS courses during the first year. This applied regardless of whether you were studying Civil Engineering, Electrical, Mechanical etc. This was alongside all the other first year courses we were required to take such as Chemistry, Physics, Calculus, Linear Algebra, Statics, Dynamics and all that.
First semester we took Intro to Programming and Algorithms (CS 1100 or something like that). We learnt everything from binary notation and logic gates through to floating point. I went from not knowing any programming language to having a fair idea of how topics like recursion worked; our last project was to write some code to walk a tree using recursion. I also learnt how to use Unix and received my lifelong love of Emacs (which we used in our tutorials) from this course.
Second semester we had to take Software Engineering (CS 1110, I think). It covered the waterfall model, version control, specifications, unit tests, and more esoteric stuff like loop invariants and formal correctness. Our major project was to write an essay about the Ariane 5 rocket explosion. I really enjoyed CS 1100, but CS 1110 effectively succeeded in boring me to tears. It was somehow supposed to give us a taste of real-world software design. All it really did was encourage me and many others not to take CS electives in later years.
My major was Materials Engineering, and I took mostly physics electives in the last year of my degree, which is kind of ironic now because I work pretty heavily with programs: my day job involves writing computer simulations. I learned most of the skills I need on the job (including the C programming language, databases/SQL, etc.). I probably would have benefited from more formal training, but my experience of university CS was so miserable I actively avoided it.
I think it's a fallacy to assume that anything not based in CS fundamentals is a fad or has only short-term value.
In college, I didn't learn version control, continuous integration (continuously submitting your work in small changelists or patches), unit testing, making sure you're building the right product before building it, delivering the simplest possible code and design that meets the requirements, code quality, working in teams, untangling dependencies and making as much progress as possible today without waiting for all your dependencies to be resolved, and so on.
I expect that all these skills will be very much relevant 20 years from now. So, don't confuse long-term value with "grounded in CS fundamentals". Programming isn't a hard science like physics.
>I think it's a fallacy to assume that anything not based in CS fundamentals is a fad or has only short-term value.
I didn't say that. I said most industry processes are fads. I also didn't say that nothing that isn't grounded in CS fundamentals has long term value.
There are plenty of other skills that have long-term value. Office politics, salary negotiation, and self-promotion are much more valuable than knowing how to run a few git commands. But none of those things should be taught in a Computer Science program.
They are fundamentally vocational skills. Just like version control, unit testing, and continuous integration are vocational skills. Sure they're useful but they should be taught in an internship/apprenticeship or on the job.
>In college, I didn't learn version control...
I learned to use Subversion, and other than the fact that they are both version control systems, what I learned didn't really carry over to distributed version control like Git.
In a CS program you should be learning things like how to implement a version control system, not how to use Git. I would have been pissed paying thousands of dollars per semester for a professor to walk me through a Git tutorial.
I don't have a problem if a professor wants you to use github to submit your assignments or something like that. And sure some of the vocational skills you listed are going to be useful for years to come. But these skills should be ancillary. They should be just a happy side effect--like learning teamwork during a group project.
I'm a practitioner, not a theorist or an academic. I couldn't care less whether something is based in CS theory. I care whether something will be useful to me over the course of my career. If it is, I'd like my education to train me for that.
In fact, some of the academic stuff like compilers and automata have been useless in real life. That's a failing of academia from my point of view.
That's perfectly fine. What you're looking for is vocational training, not a liberal college education. Non-professional college programs are explicitly not vocational training; if they were, they wouldn't require spending nearly half your time on general education requirements (assuming we're talking about the US here). I doubt art history, physics, or psychology has been of much direct use to you in your career.
>In fact, some of the academic stuff like compilers and automata have been useless in real life. That's a failing of academia from my point of view.
Finite state machines and pushdown automata are an incredibly common pattern, and I can't see how you can work as a professional software developer without running into that pattern time and again. Have you never used regular expressions?
Automata (usually taught along with theory of computation) teaches you all kinds of useful real world knowledge, like why you can't parse HTML with regular expressions, and why you can't write a program to tell if another program will eventually halt.
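A toy demonstration of that point (my own sketch, not from the thread): a regex can recognize individual tags, because that part is a regular language, but checking that tags nest properly requires memory a finite automaton doesn't have. The simplest fix is a stack, which is exactly what makes it a pushdown automaton.

```python
import re

# A regex can recognize individual tags (a regular language)...
TAG = re.compile(r"</?\w+>")

def balanced(s):
    """...but checking that tags nest properly needs a stack,
    i.e. a pushdown automaton, not a finite automaton."""
    stack = []
    for tag in TAG.findall(s):
        if not tag.startswith("</"):
            stack.append(tag[1:-1])        # opening tag: push its name
        elif not stack or stack.pop() != tag[2:-1]:
            return False                   # closing tag must match the top
    return not stack                       # everything opened must be closed

print(balanced("<a><b></b></a>"))  # True
print(balanced("<a><b></a></b>"))  # False
```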
My idea of education is one that teaches you skills that are broadly used throughout your career. I don't a priori reject things that meet this criteria just because it's not based in theory (because theory is not an end to itself), or by applying arbitrary labels like "vocational" (whatever that means), "liberal" or "professional".
As for art history and psychology, that's a different debate to be had about education — whether these should be part of education and how much time they should take.
As for your question, I've used regexes, but you don't need to understand the details of the regex engine in order to use them. Neither do I, in my day-to-day work, write programs that try to tell if other programs halt.
>but you don't need to understand the details of the regex engine in order to use them.
Yes, at some point you do. Without understanding how regular expressions actually work, you can't know when it is appropriate to use them. Many grammars simply aren't parsable with regular expressions. You can either waste time trying to write an impossible regex (or one that works on your tests but blows up in the wild), or you can study automata theory and understand what actually goes on underneath.
As for the halting problem, I'll leave you with this Stack Overflow explanation of why it is beneficial to understand.
Many problems in CS have already been solved, some are impossible to solve. You can either waste time on trial and error trying to reinvent the wheel or you can study the theoretical underpinnings.
Do you want to spend a week trying to model a problem as a finite state machine, only to discover that a finite state machine isn't powerful enough to solve it?
Do you want to spend a month banging your head against a wall trying to solve a problem that you could have solved in 5 minutes had you realized it was just a well known graph theory problem all along? A problem that was solved decades ago. The only way to know these things is to study the theory behind what you do.
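As a concrete (invented) illustration of that point: "order these tasks so each comes after its prerequisites" is just topological sort, a graph theory problem solved long ago, and in Python it's even in the standard library.

```python
from graphlib import TopologicalSorter  # stdlib since Python 3.9

# Recognizing "order tasks by their prerequisites" as topological sort
# turns a potential month of head-banging into a few lines.
# The task names are invented for illustration.
deps = {
    "deploy": {"test"},
    "test": {"build"},
    "build": {"checkout"},
}
order = list(TopologicalSorter(deps).static_order())
print(order)  # ['checkout', 'build', 'test', 'deploy']
```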
Why do you think civil engineers are required to take physics? The difference between an engineer and an artisan is a rigorous understanding of the formal system underpinning his work. Artisans build through trial and error and experience, and they leave many failed projects in their wake while they gain that experience. Engineers use theory and modeling to limit the number of failed projects, to the net benefit of everyone involved.
I doubt that there are cheaper and better ways to become a practitioner than going through college. In India, where I live, most companies are not interested in you unless you have a degree. The normal path to a career in programming is to get a CS degree, and the normal outcome of a CS degree is a career in tech. So they are much more closely related than you acknowledge, at least in my part of the world (things may be different in yours).
I guess it depends on the purpose of the education. If you're already getting the CS fundamentals, maybe it doesn't hurt to get up to speed on some of the industry fads at the moment, since most of the students could be looking for an industry job in two-three years.
Everything is a tradeoff: if you spend time on industry fads, you're spending less time on CS fundamentals.
Industry fads aren't useful outside of industry, so learning them should be paid for by industry. As the article argues, 20 years ago they were. Now employers are trying to push the cost of vocational training onto workers and the public.
Well, the whole point of the article is that those students looking for industry jobs in two-three years should just be hired and trained in the fads of the moment, instead of wasting higher-ed time teaching trivial and likely irrelevant-by-then knowledge.