Even if you are of the opinion that CS is math, and coding doesn't come into it, you will hit a coding wall early on.
In fact, every exercise in CS has this problem. You add a new thing (e.g., inheritance), and it breaks. But not only that: it might be broken because of a banal little syntax problem.
And that's just what you consider code. If you put in the wrong compiler flags, it breaks. If you can't link because something needed rebuilding, it breaks. Want to avoid it? Learn how make works. Huge number of options, which you won't understand, because you're a novice.
Oh and learn git, too. And Linux. Just so you can hand in the homework.
Compare this to the rest of university. I'll use my own experience.
- Engineering subjects tend to revolve around a small number of vignettes. Here's an aircraft engine in thermo. Draw some boundaries, apply some equations. If you get it wrong, your tutor can still see if you were going the right way. Once you've learned the relevant points, it's not hard doing some rearrangements and plugging in some numbers.
- Economics essays are basically bullet points. Miss one out, you still have an essay. Which you can hand in without knowing git.
My CS program was really easy and I still don't know how people who hadn't put hundreds to thousands of hours into picking up that stuff in the years before college made it through. Many didn't, I guess. With some programming background on top of that it was pretty much a cakewalk for me, but the poor n00bs....
This, I think, is the core issue. Like most of us here, I started playing with computers as a young child, from a Timex Sinclair 1000 as a pre-teen to a Commodore 128 as a young teen, then an Amiga and beyond. It wasn't work, it was fun, a hobby. And more importantly, by the time I hit college, I had literally thousands of hours of practice just being immersed in how computers work.
I started tutoring other students in CS for some beer money, and it was a real shock at how awful otherwise very intelligent people were at what I considered utterly trivial questions. But they weren't trivial. They were only simple if you already had a complex, detailed and well-worn model of how computers work running in your head. Without that, even simple computer tasks may as well be written in cuneiform for all the good it does the genuinely new student.
I'm not even sure it is possible to take a young adult who is truly computer illiterate and have them succeed in a technical major. At least not in a standard 4 years. There is simply too much foundational knowledge you need to have before you can even begin the real work of learning what to do.
I had the advantage of having a department that assumed we were computer illiterate (I went to an HBCU in the late 00s: lots of low-income kids, very few had PCs of their own growing up, let alone college-educated parents), so there were a lot of resources dedicated to tutoring, TA involvement in labs, and building a community around our department. It sort of felt like being in a sports club. Every freshman was assigned an upperclassman mentor, and every mentor I had took their role seriously. If I didn't come around for a week I got a phone call. If I struggled I got extra help, and my relationship with my mentor was something that came up if I started slipping in a class. We also had advisors based on our interests who were constantly throwing REUs, internships, and events in our face.
When I went to grad school at a majority institution it was like night and day (no pun intended). Lower-level students were often left to wither on the vine and were only paid attention to if they were already successful. The TAs weren't actively helping students, only passively so. I left there thinking there were probably a lot of kids who got turned off to the major because they didn't get the help they needed. Poor kids especially had it rough, because they often came from a situation like mine: households with one computer that they were only allowed to use if they were using Word to write a paper or something.
I say all that to say I think you can teach a young adult to succeed in a technical major in 4 years, but it takes a village. If you immerse them in a community built around that technical major, it becomes a lot easier to get them up to speed.
It really bothered me that I wasn't able to successfully tutor more CS students, and many folks that I know would have (eventually) loved it and thrived in the discipline dropped out due to their inability to get over the hump.
It sounds like from your experience direct, constant mentoring (along with a highly motivated student!) does work.
I do remember watching other people trying to figure out how to use a mouse or the most basic stuff. I had already studied algorithms and data structures and had a paying programming job. Most of the CS work seemed trivial, but I did learn some math. The math did get hard, so the degree was definitely not a breeze. Even within CS there were areas that were less trivial, like computational geometry or numerical methods, but I had the advantage that coding was a non-issue, so I could focus on the essence.
Many many many years later, I still often make the mistake of assuming things that look trivial to me are also trivial to others. It's just so hard to tell whether something is universally trivial or just trivial because you've done it so many times.
I wonder if the current generation's "playing" with their phone/computer/tablets gives them an advantage over the previous generation. I do work with quite a few sharp young people but I don't know whether they had programmed before going into CS or not, I guess I should ask...
Contrast this with the 8-bit era: devices made of multiple clunky components large enough to manipulate and simple enough to fully comprehend, shipping with "user manuals" that look like today's programming textbooks or service/maintenance manuals (where those exist today at all), a readily available developer environment giving full access to the hardware, tinkering with hardware and software as a matter of course to get anything interesting done at all... The experience is very different indeed.
When I was a tween I was building my own desktop, installing drivers, troubleshooting driver issues, trawling log files, and even dabbling in a little bit of Python here or there when I had no idea what I was doing.
None of that is even a thing on the black box of a touch-screen device.
It is a shame that this is no longer the default option, but it can be done for people wanting to get an educational experience. I wonder how easy it would be to build a computer with a little kid?
(He also just about jumped for joy when he saw the color-changing LED fan I put on the case.)
Obviously at that age he doesn't quite have the manual dexterity to do the work himself, but I let him touch the RAM as I installed it so he could feel it click into place, and showed him how to close & latch the CPU lever.
He appeared to really enjoy the experience. The next time one of the offices I work for retire some desktops, I may clean some up and partially disassemble them, so he can 'build' one himself. I really want to impress on him the fact that things that look complicated can be understood, if you just break down the task into bite-sized chunks.
It's a great little system. It teaches the commandline and python programming and it lets kids do things like play minecraft then change the code to make it run differently.
What surprises me is the lack of consolidated resources addressing these basic pain points. I don't think it would be hard to come up with a list of examples just by looking at a normal work day and taking a step back; really evaluating where you're relying on basic knowledge to do even the most mundane of tasks. Once you've been doing this for a few years those tasks are just that - mundane - but for beginners it's having difficulty with those basic parts that is a real turn off. "How will I ever be any good at this if I can't even download code off this thing called Github". If someone could point me towards some kind of existing resource covering a bunch of different areas in this way I'd be really interested in having a read.
On a related note, I'm reminded of one of Dan Luu's related essays on debugging, which I recommend reading if you haven't already.
For example, if I wanted to teach a kid to use an Arduino, I'd start them off getting an LED to blink by telling them which wires to plug in where, and what code to type in. If they were good at that, maybe I'd have them make it fade in and out like a macbook light.
That way, when it comes time to learn about voltage and current and P-N junctions and bandgaps and PWM and cross-compiling and memory maps and bootloaders, they'll have real-world examples to relate it to.
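For the curious, that first program is tiny. A minimal sketch of the kind I mean, assuming the LED sits on pin 13 (most boards' built-in LED); the fade version would swap digitalWrite for analogWrite on a PWM-capable pin:

    // Blink: the classic first Arduino program.
    void setup() {
      pinMode(13, OUTPUT);     // pin 13 drives the LED
    }

    void loop() {
      digitalWrite(13, HIGH);  // LED on
      delay(1000);             // wait one second
      digitalWrite(13, LOW);   // LED off
      delay(1000);             // wait again, then loop() repeats
    }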
The difficult part, of course, is that even if I thoroughly test the instructions with a clean VM and hardware, the instructions can still fail - like if they've installed a different version previously and there's some sort of conflict.
*Or at least, interesting by the standards of the course
Perhaps the lesson was "read the error message; if it doesn't help then take that as a call to produce useful errors in your own programs"?
For a job, I had to get Python working on Windows, and it took me a few days to get it installed correctly. Even after years of coding experience.
The issue with Windows is that it is designed to explicitly hide backend functionality, and modular access to and modification of it, from the front-end user, so when you do need to do something of the sort, it's more difficult than on Linux. And even after using Linux operating systems, I find MacBooks difficult to work with, because customizable things are hidden from the average Mac user.
That being said, a Linux operating system is probably daunting for a first-timer.
HOW I LEARNED COMPUTER SCIENCE
I only minored in Computer Science, and like others, did not even know what it was until I went to a tech college for Electrical Engineering. I grew up in the Deep South, low on the socioeconomic scale; no one in my family had ever gone to college, so I had no exposure to computers, or even the idea of computer science, before.
It seemed cool but daunting.
1. I got a part-time job in tech support to get acquainted with the parts of computers and technology that you need to know, so that when you do Computer Science you can focus on Computer Science.
That side job was 15 hours a week over two years, and it helped me learn a lot about troubleshooting, viruses, and computers: I reimaged hundreds of machines, learned how to make my own ISOs, and took computers apart and put them back together.
By comparison, it made my computer science courses seem ideal, and the technical bottlenecks of working in different environments (operating systems and the like) relatively trivial. But yeah, I had to double up on my time commitments and turn my college job into tech work to catch up.
2. Once I learned what a Linux operating system was, I started using only minimal versions of Linux, so I would be forced to learn more about how computers work just to do everyday tasks, layering all my projects and schoolwork on top of it. This helps A LOT.
You want to print something? You need to install or update the drivers. You want functionality you didn't know could ever be missing from a computer? Now you learn how to figure out what to download, what RPMs are, and a lot about the options you have from open-source packages many people have made.
Now you are aware of hundreds of gaps in functionality that no one has filled, so you can try something and have a realistic project to work on.
You have access to other people's code... etc...
All of this from googling about printer drivers and why you got "access denied"...
The learning curve is steep, but it's hard to learn about this world any other way.
WHY I DID NOT MAJOR IN COMPUTER SCIENCE
In regards to Computer Science, there were a few reasons I did not end up majoring in it despite really liking it. Now, 5 years post-graduation, I'm taking Coursera courses from Stanford and Princeton to feed my desire to go back, learn more, and shift to full-time software development.
As a disclaimer, these are all really a result of my mental approach/reaction to a growing desire to learn more Computer Science.
1. I really didn't have the self-esteem to believe I could do it in four years.
I knew I could excel at Electrical Engineering, but Computer Science seemed like a whole new world I could not learn in four years.
2. I went to a pretty good engineering school, and the kids in Computer Science had basically been coding since they were 8, so for me I felt like the competition was high and I could not stand a chance.
3. In retrospect that was not the case, and in fact the people I did know in Computer Science were very welcoming and encouraging to me coming into it, but I really didn't believe I could do it.
I regret that approach now, and I could blame it on all things academia in Computer Science, but it was really my reaction to the sentiment echoed here already: that it takes years to really feel comfortable with computers, programming, and all of it in context, so you aren't constantly wondering what you're not understanding, or spending hours trying to set up varied work environments because you've never used a terminal before.
It is a continual effort and Computer Science, more than any other major or practice, is a new lifestyle and way of thinking, and a lifelong commitment to learning.
For me, I have continued on this path, continued to code for years after college, worked on independent and group projects, and taken on coding work in my jobs in relation to electrical engineering. Only now at 26, having first been exposed to Computer Science at 18, do I feel confident about my ability to learn and do well as a software engineer who is not a code monkey, working on computationally challenging problems where scaling, algorithms, and data structures are a core part of design and implementation to make something successful.
You just have to keep trying, and in general, in regards to computer science or elsewhere, this weeds out a lot of people.
SOME INTERESTING THINGS
As a final note, I would say the most daunting part of Computer Science in college was the perception I had about the experience and practice of Computer Science majors versus mine. I felt really ignorant in comparison and had no realistic way to convince myself I could compete and do well amongst kids with 10+ years experience.
Perhaps the value of this post is to say: I overcame a lot of obstacles, socioeconomic and otherwise, to get into college in the first place.
I made a 4.0 at a private, high-end high school, and had (and have) a good work ethic and intelligence.
I even went to an engineering school and majored in Engineering.
Despite all of these layers of statistical thinning, I still didn't feel I could cut it as a CompSci major.
If it is challenging for someone who is otherwise very smart, excels by all academic metrics, has a good work ethic, is already in a school offering a good Computer Science program, wants to learn Computer Science, takes entry-level Computer Science courses and makes As in them, and still doesn't feel confident about Computer Science, then there is definitely something about Computer Science that sets the barrier to learning high.
It's definitely a DIFFERENT way of thinking, and I think the barrier to that new way of thinking is not a barrier for learning in most other places, save perhaps linguistics, which is also extremely difficult to become proficient in if you did not grow up in a multilingual environment or are not constantly immersed in a multilingual community.
You need to saturate yourself in a new way of thinking. For me it is incredibly rewarding: it translates into every other part of my life and lets me think more objectively, logically, creatively, and efficiently than I ever knew I could. But I think it is that way of thinking, along with the multilayered skills needed to navigate complex computing environments, that is the learning barrier keeping people from Computer Science.
If you don't mind me asking, did you apply this approach to specific programming languages? I'm curious which ones you tried to learn and ended up sticking with, and why.
On the other hand, Java is really nice for algorithm testing, and writing for classwork.
It's better to have actual projects to work on. From there it's easier to center around a framework. I still prefer C/C++, but I am learning to appreciate Java the more I develop GUI applications and test more complex algorithms.
You have to like it to do that, I really enjoy it.
Thanks for clearly articulating this in 6 words: "Stupid Computer Shit we all know". This is something many people aren't willing to acknowledge (aka tribal knowledge gained from lots of trial and error).
It perfectly summarizes what pg tried to communicate in a 2013 interview where he said founders who have been "hacking for the past 10 years" have an enormous competitive advantage over founders who haven't, i.e. female founders, when it comes to starting technology companies.
Unfortunately, the tech press went out of its way to grossly misinterpret what pg was trying to say.
I think a lot of us have forgotten how strange and frightening a blank command line was the first time we encountered it. And how long it took to actually get comfortable with it. Even a windowing system is weird if you're not used to it, and lots of people use those for years and years without ever actually feeling like they're in control of them.
Knowing where the boundaries lie, how to protect oneself from unrecoverable situations and how to get oneself out of bad-but-recoverable ones puts one at ease in front of the keyboard. The absolute n00b (which we all were, once) gets lost navigating directories. The experienced user loses their boot sector, curses, sighs, then fixes the problem. They aren't put off by finding the solution to their problem is to apply a patch to their kernel and compile, and so on. It takes time to reach that kind of ultra-power-user level of competence and confidence, it's hard to teach, but it's enormously useful for letting you put the concerns of basic (ha!) computing in the background and focus on the task at hand (programming, for example).
There were so many idiosyncrasies my coworkers knew that I had to learn (anyone know how to push the F22 key?) The terminal didn't function in any way I was familiar with at all (no ls, cd, mv, nothing). The "filesystem" was basically for interoperability while the actual system ran with a concept of "libraries" instead of folder hierarchy.
I ended up having to take basic lessons on the system.
It really gave me an appreciation for "stupid computer shit we all know". It sucks to be in a position of not knowing that stuff. Suddenly the most basic functions becomes monumental tasks.
BBC Computer 32k
It’s all about what you know...
I figured most other classmates were the same. However, the amount of skill variation I witnessed as a lab assistant, where intro-course students would plead with me to basically complete their assignments after I helped them once, was mind-opening. Horrifically formatted code. Unable to use copy-and-paste keyboard shortcuts--or any keyboard shortcuts, really. Unable to run and debug code. Unable to find where in the file system the compiler or their commands were running. Unable to understand file system permissions, etc.
There was a huge valley between those with no experience that just wanted the CS degree for a well-paying job, and those that already had CS experience and wanted CS for a well-paying job that they would enjoy*
To be honest with you, that teacher sounds like a real tool. All that that can possibly achieve is to breed various subtle forms of animosity, either among classmates or towards engineers or towards the teacher or the course.
I can see this being a joke. Entire subfields of mathematics have been built on the assumption that the Riemann hypothesis is true. He probably assumed that everyone in the audience would know this, which is not unreasonable.
Thanks! Now I feel like an idiot.
But it is not an important skill for persuading people, or for cheering people up.
There was a good class on Coursera a while back called "Startup Engineering" which did cover some of this even though it wasn't the focal point of the class.
Of course, it would be impossible to cover every little detail - but a class dedicated to SCS would still be super useful.
However, my thought when taking it was that if I were a freshman taking this who had minimal experience with computers, I'd have been pretty lost. Not so much because I couldn't have handled individual aspects of the course but there would have just been too much new from the practicalities of a programming environment to a lot of the basic programming concepts to the more theoretical aspects.
I couldn't explain how to make it make sense if my life depended on it. It just happened.
Loops, increments, semicolons, returns, recursion: everything was a deterrent. But this wasn't the real problem. The real problems were the teacher and the rest of the class. Pretty much everyone else was from that city or other cities and had been doing CS from quite early on - many of them had their own computers, to my surprise (it was quite pricey back then) - so they didn't face any challenges, and the teacher expected me to just go at the same speed and catch up somehow. I tried to seek his attention and get some help with the challenges I faced, and he would just say a thing or two and move on, or ask a student to help me, who'd help with "no, you have to put a '==' there, not '='", and I would never know why. In that time of struggle, books confused me more. I was shocked to see, years later, why they didn't use K&R, and instead Robert Lafore and some Indian authors. Why? Also, this was the first time I was introduced to the purely commercial nature of education, where the "connection" in the class was missing right from the start - something I was used to, having been to boarding schools and semi-rural schools.
I failed. Miserably. Again and again. But I still kept the course throughout 11 and 12, though I had a strong urge to drop it in 12 (just before the board exam; the one that matters and that you get a certificate for) and opt for something familiar and straightforward like Economics or Sanskrit, which I was really good at. I don't remember why I kept it, but I guess it was the urge to prove myself, or the sense that it would get me a job or so, as that's what I had known about CS. I can't recall.
Anyway, for some reason my seat during the board exam (during the CS paper; your seats change for all the papers) was diagonally behind my best friend, who knew pretty much everything in the paper, and I ended up scoring 3 marks higher than him out of 100. He's still cross about it.
Well, I never recovered from those initial CS trauma days. I still feel daunted by anything new in CS and get stressed. First I try to avoid it and question the need for it in the first place (like I did when I was asked to move to frontend/Angular from Android), and after a few weeks or months, when I have my hands dirty with it and find it easy, like muscle memory, I look back and wonder why I was worried about it. And it repeats.
tl;dr at least in CS you get immediate negative feedback
It's fun to observe the teacher's reactions sometimes when it is apparent that the student presenting is clearly heading towards a train wreck of some sort due to misunderstanding the question, not being rigorous enough, or just being generally confused about the subject matter. Occasionally I've seen professors at the math dept get this visibly pained look in their face as they watch a CS student abuse their beloved math in particularly horrendous ways.
I don't know how many times I've been trying to fix something with somebody, and an error dialog comes up, and they dismiss it instantly, before whatever content it might hold has a chance to flash across their retinas. Then I have to get them to do whatever it was four or five more times, before they calm the hell down and let me read what the error message is telling them.
Taking Calculus I in college most of the class had taken the course in high school. The most advanced math course in my HS had been Algebra II/advanced geometry. So they were ahead but it wasn't insurmountable, it was still math.
A classroom CS student trying to get an application to work asks a SCSWAK for assistance. One notepad app is suddenly two terminal windows, with EMACKS? (at least it wasn't vim) in one; in the other, lots of lists go by and quick commands are being entered while he asks where the file is saved. "Nevermind, found it." The file contents show up, but no editor was opened. Then there are all kinds of key combinations that make no sense on the EMACKS, just to move around! (add in more to the story)
One is still getting used to the operating system while the other has logged thousands of hours plugging away at the CLI. (Remember MUDs?) How the heck are the next 2.5-4 years going to be if the classroom CS student couldn't follow anything he just saw?
Being around a bunch of SCSWAKs who seem to naturally grasp everything can scare off students. IMHO the instructors for the low-level CS courses should talk about this and get the classroom CS students to realize they can do just fine in the courses, and that SCSWAKs are great resources if you can't find your answer on Google. Also, the best way to actually learn something is to teach it, so there's an opportunity for the SCSWAK group there.
Incidentally, this is also why we see so many "experiments" in production code, which is the downside.
I would say that this basically screams "badly taught programming". A missing semicolon does not take down your work; it just needs to be put back in, and the IDE should help you.
"Even if you are of the opinion that CS is math, and coding doesn't come into it, you will hit a coding wall early on."
I mean it seriously: programming is easier than math. I am saying that as someone who always liked math.
"And that's just what you consider code. If you put in the wrong compiler flags, it breaks. If you can't link because something needed rebuilding, it breaks. Want to avoid it? Learn how make works. Huge number of options, which you won't understand, because you're a novice."
This is not something a novice should be dealing with.
"Oh and learn git, too. And Linux. Just so you can hand in the homework."
As I said, badly taught programming takes in people who know nothing and then proceeds as if they had already learned the stuff in the past.
The point is that the incidental complexity is there, and that it's something that has to be dealt with, to even begin to get to the fundamental part of the problem.
That you have to be using the right IDE (and learn how to use it, and learn to recognize its attempts at helping you with the problem) does not refute the point; it's a perfect illustration of it.
CS isn't harder. CS has high incidental complexity, the crap you have to deal with just to be able to work on the problem within a real world environment.
If you can teach programming to 10-year-olds without it being too frustrating... on the other hand, I suppose you could argue that younger people will have an _easier_ time, for the same reasons (human) language acquisition is easier for children. I dunno.
I'm not sure how it relates, but what I think was _really_ an immense aid to people of any age learning programming in the 80s was that computers and software were _simpler_, meaning that you could very quickly approach the level (at least at first glance) of 'real' software with what today seems like 'toy' software. The text adventure games I started writing as a child weren't actually _fun_ for anyone to play but me, but they _looked_ like a real text adventure game that other people played, they had the same form. Or the ascii-graphic blackjack game. There was gratification with a pretty low bar.
I have no idea how I'd become a programmer today. It's an entirely different path. People ask for me for advice, and in my head, I'm like, well, first be a child in the 80s with an Apple IIe, then start writing for the web when all you need to know is HTML and you write it in Perl cgi-bin... I have no useful advice.
I do have a CS degree, which I got in the 90s.
Like someone mentioned Logo or the old BASICs - if kids start in that environment they pick it up quite easily. They pick up concepts (wtf variables, yes it does not matter how they are named; conditions; for loops; functions) and there is instant gratification and little complexity. Just playing those games with the little robot and arrows that "program" it, or a simple scripting language in some game, already makes much difference.
Then you can move on to a compiler and Java or C, because the basics are out of the way and it is time to deal with what you call incidental complexity. Basically, one concept at a time. Otherwise it is a bit like teaching quadratic equations to people who were never taught normal ones, nor much arithmetic.
At some point between Logo and bachelor's degree, you run face-first into that complexity, which is the "wall" the commenter atop this thread was referring to.
I'd be willing to bet you could create a CS course using nothing but Logo - and it would probably be better for students than what currently exists.
In his book "Mindstorms", Papert discusses the faulty idea that Logo is only for children, and that it isn't a language that can be used to develop complex software.
The fact is, it can. In fact, Logo is pretty close to Lisp in its functionality.
The most complex piece of Logo I've ever seen was a few decades ago in an issue of the Rainbow Magazine (for the TRS-80 Color Computer); it was essentially a game of Monopoly, with full graphics (and probably sound too); I'm not sure if it did any kind of file i/o - but I know Logo supports all of that and more (especially current versions of Logo).
Seriously - if you think Logo is only for kids, you're missing out on a very fascinating language.
Is it? Quote from the Logo Manual, 1974:
All complete LOGO statements are imperatives, so that an operation cannot stand by itself. If you type SUM 17 26, Logo responds with the error message YOU DON'T SAY WHAT TO DO WITH 43. In contrast PRINT SUM 17 26 is a complete statement. The computer will print 43.
I mean, you can teach concepts one at a time and still be demanding and require a certain speed. I am not saying it should be easy. You should, however, separate those concepts from each other and teach them one by one. Otherwise you are just relying on them having "picked it up" randomly, which is not an expectation other majors have (like physics or math or biology).
Eventually some people got me back into Flash development (I had dabbled before and had a couple sort of popular things on Newgrounds, and they were asking me for help). I then made a couple of games that were super popular and realized I probably wasn't in the wrong field after all, and eventually went back to college and finished my degree. It was easier the second time around, and I had much better teachers.
Now I've been in the industry for about 10 years and have worked on all sorts of different types and sizes of programs and learned and used dozens of languages and technologies.
That was back in the 90s when I had that semicolon problem, and I'm pretty sure compilers have improved since then, because I usually see compilers catch missing semi-colons pretty much exactly where the problem happens nowadays. Also there was no Stack Overflow back then, and most forums weren't terribly helpful either, and programming blogs were a lot less common and comprehensive.
I remember specifically that I had difficulty wrapping my head around the concept of linked lists, and there seemed to be nowhere to find that information besides my professor - a real jerk who said, during his office hours, the specific time he's supposed to be available to help students, that "if I didn't understand it during his lecture he couldn't help me" - and two old books in the school library that had a few paragraphs about it each.
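For anyone who hit the same wall: the whole idea fits in a few lines. A minimal C++ sketch (the values are made up purely for illustration) - each node stores a value plus a pointer to the next node, and a null pointer marks the end:

    #include <iostream>

    // One node of a singly linked list.
    struct Node {
        int value;
        Node* next;  // nullptr means "end of list"
    };

    int main() {
        // Build the list 1 -> 2 -> 3 by hand.
        Node third  = {3, nullptr};
        Node second = {2, &third};
        Node first  = {1, &second};

        // Traverse: follow the next pointers until we fall off the end.
        for (Node* n = &first; n != nullptr; n = n->next)
            std::cout << n->value << ' ';
        std::cout << '\n';
    }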
Yeah - that's been replaced by C++ compilers b0rking hard on errors in variable templates and such (and leaving you just as confused as to where the problem really is).
/wish I was making this up...
I learned Java in high school as my first language, and I struggled to get the IDE properly set up - it definitely slowed down my understanding.
We don't expect law students to have been amateur lawyers in high school; it's a bonus if they have been in the debate society or have taken a personal interest in the subject, but it's not necessary. I don't see why CS should be a special case, aside from a misplaced attitude of exclusionary elitism.
If you can afford a video game console, you can afford a raspberry pi (or equivalent.)
> We don't expect law students to have been amateur lawyers in high school
No, but we expect math majors to have done some math before they get to college.
I don't disagree with you that most people can be brought up to speed in less than a year, but it's also not unreasonable to expect first year CS students to have had some programming experience. Computers don't cost thousands of dollars anymore.
If you are a boy, a video game console is something that is often bought for you whether you really want it or not. A Raspberry Pi likely is not, and you don't know it exists. And even if you know it exists, you need a computer to upload code to it, and then you could have just used that computer to code. I know that it is easy to code when you know where to look, but the biggest hurdle is that many kids don't know where to look. Instead, they are told there is something magical and hard about it.
I remember being told that it is hard, or that I can't compete because I am (presumably) just learning while other kids "already know a lot". It was bs, but that is where it is for many kids.
Seriously, I knew straight-A students, more hardworking than me and with good grades in math (meaning no dummies memorizing stuff), who were under the impression that I have some special brain because I can program. All the myths around CS tend to reinforce such nonsense.
If a kid can't type "how do computers work" in google, they probably won't become a programmer. Are we really wringing our hands over that?
> All the myths around CS tend to reinforce such nonsense.
There's another bit of nonsense around CS that a lot of people here believe - that there are all these secretly good programmers out there who just don't know they're programmers because no one ever told them how to start, and that they might not even know what programming is because it's so difficult to get started.
I find it nearly impossible to believe that there are all these smart people out there, surrounded by technology, with hacking and computers in every TV show, with the President telling them to "learn how to code", Bill Gates' name dropped by rappers, Google in everyones' pocket - and no one is typing "how does my smartphone/videogame/internet work" and reading the results?
Like, who is still walking around like, "I got good grades in school and knew people could become doctors, but I didn't know people could make computers go! Why didn't anyone tell me?!"
Are there other problems keeping people from breaking into the industry? Absolutely. Is "not knowing where to look" one of them? There's no way that it can be, not anymore.
The idea that we block off CS as a subject for only those who've taught themselves some programming is absurd and elitist. While we may expect Math students to know Math before Uni, it so happens that 99% of high schools teach the level of Math we expect them to know. In a lot of countries it's compulsory. Same goes with most of the sciences, we expect the level that schools will teach. Unfortunately most schools don't teach programming, and those that do often don't do it well.
If a kid can't type "how the human body works" into Google, they probably won't become a biologist. Somehow, being constantly surrounded by all those bodies and dogs and grass, they did not learn biology. Except that they come to college and do learn. But realistically, "how do computers work" is such a broad question that it does not have to do with anything. It is literally irrelevant to anything practical.
But that is literally the culture I am talking about - the conviction that if you don't already know the stuff and were not interested in the past, you are not talented, full stop, don't even try. Math teachers and chemistry teachers don't assume that - if you are curious now, they are happy with you learning now.
> that there are all these secretly good programmers out there who just don't know they're programmers because no one ever told them how to start,
They are not secretly good programmers. That takes more than just aptitude. But yeah, a student with a good memory and basic math aptitude has the genetics for programming. There is nothing special about us. The hardest part for a beginner is to figure out what makes sense to learn and what does not.
> and that they might not even know what programming is because it's so difficult to get started.
Well, it is difficult when you don't know what to do and get told you probably don't have aptitude the first time you struggle with something. It is easy if you learn from simple concepts up, either because you ran into the right game or because you ran into a good teacher or book.
Exactly like any other learnable skill - math, chemistry, law, physics and so on.
> "with hacking and computers in every TV show"
That has nothing to do with reality. I see sword fights in many movies too.
> "and no one is typing "how does my smartphone/videogame/internet work" and reading the results?"
What does that question have to do with building software? Like, real-world software, with real-world homework that suddenly, out of nowhere, expects you to have skills that were not taught previously.
What does that question have to do with algorithms? Programming languages? Whether the student would be more attracted to the problem-solving side, the building-things side, the theory side, or the software engineering side, your "how does a videogame work" is largely irrelevant. Plus the answer will be a high-level overview of memory/CPU and such.
> Are there other problems keeping people from breaking into the industry? Absolutely. Is "not knowing where to look" one of them? There's no way that it can be, not anymore.
You did not suggest a single practical place for a beginner to learn stuff, only a few very general questions that do not necessarily lead to programming - most of them would lead to electronics at best. Such good resources do exist, but none of your suggestions lead to them.
"Breaking into the industry" is far away from "starting with programming". But then again, that is part of the nonsense around this. I know people who found a job with very little knowledge in a small company just because they looked confident and the hiring manager was inexperienced. A few of them even became good programmers, although they created a huge mess on their way there and had to rely on politics a lot. But somehow, an honest student with good results in pretty much everything else is assumed unable to learn the same, because he is less good at pretending.
You're way beyond the scope of the original discussion.
Someone suggested people would drop out of CS due to the grief factor of - for example - breakage due to missing semicolons.
Another person then suggests that maybe it's a good idea that CS majors have some experience programming before they enter the program. Note: some experience - no one said you need to have built a "real-world" team project before going into a CS program.
Then all hell breaks loose because, apparently, getting online and reading a couple tutorials and screwing around in a browser-based programming sandbox (there are tons), is something only wealthy people with college educated parents can do.
Sorry, but I don't believe that.
Why not offer two tracks? People who have never programmed before have to take a 6 month intro to computers course, that people who know how to program can skip. The people who skip the course can then either use the extra time to take other courses or graduate earlier.
Having the free time, access to a computer to "screw around" on, and knowledge of the existence of browser sandboxes are absolutely tied to class.
Beyond that, I know a lot of poor kids who don't have video game consoles, and there literally isn't $50 in the budget for something that may or may not produce any value (not to mention the monitor, the peripherals, and taking up vast amounts of time on the family computer -- if one exists -- to figure out how to use it).
I have extended family living in rural Utah who don't even have a computer -- they use their phones for the Internet. You can debate back and forth about the cost of a phone vs. the cost of a cheap desktop, but people need phones to function in society, for better or worse. Try figuring out what to do with a bottom-tier smartphone, a limited data plan, and a Raspberry Pi that just arrived in the mail.
I volunteer at the Boston Museum of Science, working in the Tech Studio/engineering department, often showing off the latest "engineering toys" -- littleBits, Raspberry Pi, Scratch on an iPad that controls a Lego robot... Rich kids' parents ask "Oh, where can I get this?" It's not a big deal for them to drop a couple hundred bucks on flavor-of-the-week programming toys. Poor kids' parents are often interested and enjoy playing around with it at the Museum, but never ask about getting one themselves. It doesn't even occur to them.
Beyond THAT, there are a lot of majors that don't cost a lot of money. You can use the argument "X doesn't cost much to learn -- why haven't you done X before on your own time?" to apply to anyone.
There are now computers that cost as little as taking the family out to dinner at McDonalds. Probably a hundred million more people (yes, however, not every single person, ok) have the means to learn programming compared to when I was a kid.
Acknowledge that, because it's amazing.
Right. This type of democratization is EXACTLY what allowed me to climb the social ladder out of the rural midwest into a Top 15 university and then Silicon Valley. As a teenager, all I needed to teach myself coding and advanced math was free time and an Internet connection.
What makes you bring up gender over race or class ? There are more middle-class Asian and White women in the field than lower-income African-American and Hispanic men.
Agreed, and when it comes to CS, programming is the easiest part. People who fail to see this are probably the ones who only know programming and no CS.
Say you apply to study marine biology but have never been to the seaside then sure, you might do fine, but really I'd be wondering why you weren't taking a gap year and doing some self-directed study at a beach (hey, that sounds fun!).
How do you manage expectations for courses/modules/whatever from both sides (teacher:student) without such things?
In theory, can one easily go to a USA uni based on relatively unrelated abilities - let's say history and cooking? - and leave with a degree in Electronic Engineering? If there are no subject-based pre-reqs, why does one need to declare a major, or is that done later?
My UK degree was modular so I could have done something quite similar, but I was assessed for entry based on abilities pertinent to a nominated degree (not the one I ended up with as it happens). Modules still had [relatively loose, unassessed] pre-reqs such as reading particular books or understanding specific concepts.
Somewhat related to your point, letting novice programmers get CS degrees doesn't fix the problem, because there is still a performance gap, which results in a hiring gap. If there is a problem with demographics having unequal experiences, then while colleges can be tweaked to keep the inequality from growing, it is far better to fix it where it comes from.
It's far simpler to say than to do, but our society needs to find out why the gender skew in high school programmers happens, and stop it.
In the UK you start to specialise at 13/14.
And I bet having a GCSE and A-level in Law would help for wannabe lawyers.
Linux wasn't required for entry-level courses. At the time, they were taught using C++ and a Borland IDE on Windows.
Similarly, source control wasn't introduced immediately. I'm pretty sure CS101 assignments were turned in on paper, in addition to email.
And the department offered many options for lab time, office hours, and tutoring. I worked as an undergraduate TA for 2.5 years, which basically meant holding lab hours 3-4 times/week for 2-3 hours at a time, in addition to grading assignments and exams.
I also started a CS program in '95, with practically no previous programming experience. (10 PRINT "HENRIK IS THE BEST", 20 GOTO 10 doesn't count)
One thing our university did, though, was teach the introductory course in Scheme, which evened the playing field immensely, because most of the kids who could program were self-taught in Pascal or C, and were pretty stumped when confronted with a functional programming language.
But my university's approach didn't help the abysmal graduation numbers either; so many classmates dropped out during the first year. A bunch dropped out because the programme was an engineering programme, and they failed the math parts but did well on the CS parts. Most of those switched to other programmes that were more pure CS and were successful.
But there were a lot of students who just lacked that elusive thing that makes a person a good programmer. There have been a lot of studies, and a lot of previous discussions on HN of those studies, but the jury is still out: we have no idea how to effectively screen people for programming ability. The only thing we can do is toss them into an education and see if it sticks. The original article asks:
> Isn’t it reasonable to expect that people with an aptitude for math, physics, and engineering will also have an aptitude for CS?
And the answer is a resounding no.
Funnily enough, this whole thing ties into the problems of recruiting good programmers, another HN staple topic. We can't tell if someone will be a good programmer before an education, and we can't even quickly tell if someone is a good programmer after an education, or even after years of working in the industry! If there was a quick way of identifying good programmers, we wouldn't be in this mess.
Agreed. Apart from understanding logic, complex math is such a small part of development. I would say programming requires the overlap of a knack for math and a pure creative interest, like painting. That's much, much rarer.
During the dot boom, there were plenty of CS grads, but after the dot bomb and the market dried up, people lost interest. Also, the pay and incentives aren't nearly as good as they were during the dot boom.
Add off shoring, constantly pushing down wages, and the fact that most US businesses make programming miserable, that's probably most of the big picture.
You mean a pure creative interest like maths, surely ;0)>
There are solutions to this problem as well.
JMU and GMU (high quality, but not top-tier, state schools) both offer pure CS degrees, but also a spectrum of multidisciplinary programs in "IT". One of my current summer interns is wrapping up a degree in info security, with a heavy dose of programming. The other intern is a straight-up CS major. Both appear to have the programming chops to join the workforce as typical application developers.
Of course, in my experience coming in to a CS degree as a techie didn't actually teach me how to make a living writing code. Most of the useful stuff I had to learn on my own time. The connections I made at school helped, but the value of the actual classes was low.
So perhaps a CS degree isn't intended to teach anything to anyone.
However, in general, the idea that you need to already be a programmer per-college to major in CS is somewhere between wrong-headed and toxic.
Where would you have them start instead?
Bumble through learning to program without the theory behind any of it?
That seems like a cart-before-the-horse scenario.
Alright, so both the hard sciences and CS have less traffic than say business, but then physical science has been growing and CS has not, and so we've solved nothing.
So we have a huge increase in CS majors in just 6 years. What could be the explanation for that? I would say that CS has become both more rewarding and more accessible in the last decade.
More rewarding because with better browsers, and with smartphones, you can simply do more stuff. And those who graduated in 2009 started studying just before iPhone, before Google Chrome, HTML5. Long before Tesla and SpaceX were household names, long before we again believed in AI and robots. The field has simply become more and more interesting.
More accessible because of Codecademy, Udacity, Udemy and the rest of the bunch. 10 years ago, learning to code meant reading books that were already outdated when you bought them and spending days in frustration trying to install whatever language you had decided on. Before you did as much as a hello world. Unless your friends knew how to code or you were enrolled at an educational institution you didn't have access to tutors. Today, you can get great tutorials online, and you can learn JS, python and many other languages without the hassle of installing anything. And once you are up and running, there are a plethora of libraries, modules, frameworks and ways of deploying which mean that you can do in weeks what before could take years.
Interestingly, there is also a giant bubble at 1985-86 which wasn't matched until around 2010.
http://www.computerhistory.org/timeline/1982/ shows a couple of noteworthy events (Tron and the commodore 64).
Perhaps another reason for low numbers is the difficulty of the major; HN has two posts on the front page about it today: https://news.ycombinator.com/item?id=14438601
Perhaps the difficulty, combined with the ability to more easily detect cheating (or the ease of cheating) in the classes, is contributing to slower-than-desired growth. Given that, when plotted out, computer science appears to be growing faster than many others, maybe it isn't that few people major in it, but that the science is new and the barriers to entry are high? (Anecdotally, I know my college had additional barriers to pursuing CS focused on pre-reqs and grades; it's possible a lack of quality CS teachers is causing this, as the field is so new.)
I never really made the connection of "building websites" to actual computer programming, so CS didn't even come up for me.
But now it's 2017, and I went back to school and just graduated with a degree in CS.
If your engineering project doesn't work, there's no pretending that it does. It's the ultimate in fairness :-)
Yeah, that's marketing's job.
I find those strong in programming may be prolific in what they write, but often have to rewrite portions as they encounter problems with their design - a more probe-and-prod sort of approach, where you feel your way through. I often do this. On the other hand, I have known people who take quite a bit more time up front to think about problem areas, and end up programming generally what they thought of in the beginning (with some delays as they stop to reexamine problems as they come up, but generally before too much time has been wasted on an incorrect solution). I strive for this.
Being able to do both at roughly the same time, so you come up with a good model that you quickly get implemented, is what I think makes a good programmer. I think a great programmer is someone that does that but also sees interesting new paths that others often miss, leading to more elegant solutions more often.
Clarify your terms. There is a huge difference between willing to take on liability for your faults versus slinging code.
Eh, it snuck in there. I specifically changed programmer to software engineer in multiple spots to disambiguate it because I was talking about "programming" as an act distinct from applied CS. I missed doing so at the end.
> There is a huge difference between willing to take on liability for your faults versus slinging code.
Based on your assumption of what I meant, it sounds like you have something you wish to express on this point. Perhaps you should do so in more detail?
Imagine if anyone who "knew enough about doctoring" could drop out of undergrad and "become a doctor" without any license or credential.
'Cept physics. Misalign a mirror or shake the table and all your experimental data is bogus, and you don't find that out until you've analyzed your data and realize it's nonsensical.
I suspect other hard sciences (chemistry, biology) have similar issues, but am not familiar enough to judge. I started out as a physics major and switched to CS in my last semester...CS is hard, but not nearly as hard as the hard pure sciences.
An example from math: you make one tiny logic error while composing a proof and miss it until you start rereading, then you might have to start all over again from scratch.
As someone with degrees in non-CS engineering fields, I mostly find the idea that programming on a stable computer has a high frustration/grief factor to be pretty funny.
"If you can't recompile a linux kernel by the time you graduate, you have learned nothing. Oh, yeah, also, we wont teach you that - it's just something you will run into. "
-- Professor Norman S. Matloff
Isn't that what makes CS easier? You get immediate feedback. You know if it will be right or wrong before you even turn it in.
So you have to sit there and make sure it actually works. I'm not a programmer, but I took CS courses. Debugging your homework is fucking awful. I understand it's a skill necessary for programming, but it's awful. Especially when the bug is a mistake that is unrelated to the topic of the homework.
I remember pulling an all nighter trying to figure out why my problem set didn't diff correctly. I must have spent 20 hours on a single bug.
In most other subjects, there are ways to get most of the feedback in one go.
Most student-written test suites I've seen are not nearly this thorough.
Say you could learn to build aircraft engines, understand how the economy works (which has a huge influence on people's lives) or learn how to make a computer print "hello world" on a terminal screen and how to efficiently store data. Most people wouldn't choose the latter. It's actually quite hard to describe what computer science is and why it is a science.
Or the other end of the spectrum, where you are a cube-dweller who shuffles endlessly to Scrum meetings, does email, enters garbage into JIRA, and once in a blue moon gets to write a little code that might possibly be executed as part of some faceless, soulless enterprise monstrosity.
Also, in the end, big companies like Google and Facebook are IT companies, or advertisement companies, and they are after your private data. I guess most people would rather choose a more noble goal in life.
And finally, tools created by big companies, such as Google Docs, perform worse than word processors of the '90s. Yes, they allow collaboration now, but still, they are perceived as slow. In people's minds, IT is not progressing much.
I took ML in person at UIUC 4th year / graduate cross-listed level, and the 3rd year Stanford algos course on Coursera (supposed to be identical to the material in the actual course, yes yes though feel free to laugh b/c it's Coursera). These classes were interesting, but about as challenging as typical second year math/physics classes. They are peanuts compared to senior math/physics classes.
That said I still think CS is more challenging than most majors, it's just that the "few" is much more highlighted because there is very high demand. Certainly way higher demand than math/physics, and more demand than other fields that are still reasonably challenging like engineering, pre-med, etc. It's definitely the field where some reasonable increasing function on both demand and difficulty is maximum.
Now I'm looking at git again in my free time it's a pretty nice tool, and I can't imagine that I loathed it so much when I had to use it for a course.
What we need is people who understand the complexity we constantly throw more shit on top of; it's only going to get worse the more data points and networking we have to deal with. Pretty soon we're going to have a massive network of autonomous vehicles on physical, country-spanning networks of roads. For this we need systems people. Not front-end developers. The sad thing is that most programmers are absolutely useless when shit hits the fan.
You can't expect someone without git experience to fix a broken repository.
Furthermore, I think there are too many tools programmers are expected to know, and git is one of them. Git is a nice tool, but I don't think 'real programmers know git' is true by definition. Above all, git is a TOOL, and one could use SVN or Mercurial to achieve the same in most circumstances (of course sometimes SVN is an inferior choice, because it's not distributed); see the workflow sketch below.
I agree with the rest of the post though. There are reasons to prefer C over other programming languages. However, I think the best approach would be to learn the C style and then several similar languages, to become aware of the differences.
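Since git is being treated as interchangeable with SVN here, a concrete sketch of the same everyday student workflow in both (the file and commit message are made up):

    # git: distributed; commit locally, then publish
    git add homework.c
    git commit -m "implement exercise 3"
    git push origin master

    # svn: centralized; the commit goes straight to the server
    svn add homework.c
    svn commit -m "implement exercise 3"

The distributed/centralized difference only really bites for offline work, cheap branching, and history surgery, which is the "sometimes SVN is an inferior choice" caveat.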
The programming people do most of is incredibly easy. Stop perpetuating the myth that programming is hard. People have changed careers to programming using week-long courses. No, they're not working on OS design, they don't contribute to the Linux kernel, but they work as programmers.
Don't kid yourself, people: most of you work on CRUD shit. For every systems programmer there are a thousand front-end monkeys. Most of you work at such a high level it's basically just Lego, gluing components together, with hundreds of .js files in your project, none of which you wrote. This is what most of you actually do. Complete packages like Rails... a six-week course can change people's lives (not a bad thing).
There is programming that is hard; it's just not the type of programming most people do, and it's becoming more and more rare.
One major difficulty in programming is maintaining consistency and stamping out ambiguity. When you throw together dozens of libraries to create a new one, the problems of inconsistency and ambiguity are as hard to solve as in your own code -- or possibly even harder.
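A toy illustration of that cross-library ambiguity (both functions are hypothetical stand-ins, not real APIs): each library is internally consistent, and the bug can only live in the glue.

    #include <math.h>
    #include <stdio.h>

    /* Stand-in for library A: reports a heading in degrees. */
    static double sensor_heading_degrees(void) { return 90.0; }

    /* Stand-in for library B: expects its argument in radians. */
    static double turn_rate(double heading_radians) {
        return cos(heading_radians);
    }

    int main(void) {
        double deg = sensor_heading_degrees();
        /* The conversion neither library's documentation shouts about: */
        double rad = deg * 3.141592653589793 / 180.0;
        printf("turn rate: %f\n", turn_rate(rad));
        return 0;
    }

Skip the conversion and nothing crashes; the numbers are just quietly wrong, which is harder to debug than any error in code you wrote yourself.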
If you've got an 85 IQ and memory deficits, it's going to be a real struggle to even be productive slapping together static HTML pages.
Only somebody with zero experience with wet lab chemistry or biology could ever write the above with a straight face.
At least you can fix a missing semicolon. If you screw up a long-running organic synthesis, or you kill your bacterial colony around generation 2,000 all you can do is start the whole thing over.
The author's point was that the early unforgiving nature of programming could possibly be a factor in the potential lack of growth in the field.
As for git, that only takes a few days at most to learn to the level you need to use it in school, at least if you're even remotely technically minded.
In undergrad CS courses I learned about, oh, databases, algorithms, data structures, inheritance, and so on. In undergrad math courses I learned vector calculus, abstract algebra, linear algebra, numerical analysis, differential equations, etc. In science courses I learned organic, physical, and inorganic chemistry, some quantum mechanics, and how to use instruments ranging from mass spectrometers and gas chromatographs to infrared, UV-Vis, and flame emission spectrometers and nuclear magnetic resonance spectrometers, all while cramming "free time" during the weeks/weekends with lab time. If anyone came up to me and asked "so, do you think studying computer science is more difficult than vector calculus or quantum mechanics?" I'd laugh in their face.
No, CS is not uniquely difficult among college majors. I'd say there are a few major factors why CS majors haven't grown at the same pace as others. One is that the number of CS majors ballooned in the '90s when the industry started blowing up, and it's been a bit inflated since then. Another is that a CS education is treated as a necessary and sufficient prerequisite for becoming a software developer, and it's really ill-suited for that. Software development is not scientific work; a lot of what they teach in CS programs is not hugely applicable to being a good developer, and a lot of what you need to be a good developer isn't taught in universities. Parallel to that, you can jump into the industry "running" if you just start coding, getting experience, and learning while doing. That is often a superior way to build your skillset, build your network of colleagues, build your resume, and start making money instead of digging yourself into student debt. So a lot of people who enter the industry go that route. That also has its problems, but on balance it's probably the better route for most people.
>In fact, every exercise in CS has this problem. You add a new thing (eg inheritance), and it breaks. But not only that, it might be broken because of a banal little syntax problem.
That can be said for math as well: switch a + for a - at any point and the whole solution crumbles, or use the wrong solution method and you are in for a world of pain.
>Oh and learn git, too. And Linux. Just so you can hand in the homework.
Again, comparing that to math, learn all the different notations used just to understand what the problem asks of you.
Point being that any scientific course requires a degree of self-study to understand what you are doing.
Although I can certainly agree that the way CS is taught now intertwines many basic things that should each have their own course, and the curriculum could do with some reorganization, even though it's hard for universities to keep up with the pace at which most current-day tech advances.
I also had some papers where your grade was based on functionality, with several milestones, so as long as you got it partially right, you got some marks. For instance, in one paper we had to implement RIPv2, but we got partial marks if we only got parts of the spec working.
Then there were papers where you had a final goal, but even if your code didn't even compile, they'd take a look at it and if you were on track, you'd get partial marks. For example, we had a paper where we had to control a helicopter reading data from a rotary encoder and other inputs, and use PID control. We had our PID control backwards so the thing wouldn't even fly, but the code was otherwise correct.
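For readers who haven't met PID control: the whole controller is a few lines, and a single subtraction decides whether it damps the error or amplifies it. A minimal sketch in C (the gains, names, and toy plant model are illustrative, not from the actual assignment):

    #include <stdio.h>

    typedef struct {
        double kp, ki, kd;   /* proportional, integral, derivative gains */
        double integral;     /* error accumulated over time */
        double prev_error;   /* error from the previous update */
    } pid_state;

    /* One control step: returns the actuator command. */
    double pid_update(pid_state *s, double setpoint, double measured, double dt) {
        double error = setpoint - measured;  /* flip this subtraction and the
                                                controller drives the error up
                                                instead of down: the "backwards
                                                PID" failure described above */
        s->integral += error * dt;
        double derivative = (error - s->prev_error) / dt;
        s->prev_error = error;
        return s->kp * error + s->ki * s->integral + s->kd * derivative;
    }

    int main(void) {
        pid_state s = { .kp = 1.0, .ki = 0.1, .kd = 0.05,
                        .integral = 0.0, .prev_error = 0.0 };
        double altitude = 0.0;
        for (int i = 0; i < 5; i++) {
            double cmd = pid_update(&s, 10.0, altitude, 0.1);
            altitude += 0.05 * cmd;   /* toy stand-in for the helicopter */
            printf("step %d: altitude %.3f\n", i, altitude);
        }
        return 0;
    }

Everything else about the code can be correct, yet that one sign keeps the helicopter on the ground, which is exactly why grading the code rather than only the outcome is fair.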
Of course, you probably won't progress very far career-wise (or maybe you'll become a manager...) if you never venture out of the little box. But the amount of effort you'll have to invest to reap a solidly upper-middle class income is pretty minimal.
If you spend 2-3 years in industry and don't get fired, you probably know how to code well enough to satisfy your employer. And at that point you're probably gonna get more (career) value working on soft skills like planning projects, managing expectations, navigating bureaucracies, etc.
At age 14 I used to copy BASIC into my Atari 600XL and QBASIC into my old 5150 and 486 PCs, plus some HTML in the '90s and some Python and Perl here and there.
So I tried to go for CS about ten years out of school and Java got me.
My brain just doesn't seem to like programming.
That is not true. You do not put your "whole tower" back together again when you move the cursor and type the missing ';'.
With programming, you have plenty of tools that detect errors for you. With maths, you will sometimes only see the error after somebody else (like the tutor) points it out for you.
Therefore you can be much more efficient when programming than when doing proofs and other maths stuff.
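A small example of that tooling advantage; the first printf below is deliberately wrong:

    #include <stdio.h>

    int main(void) {
        double pi = 3.14159;
        printf("%d\n", pi);   /* wrong on purpose: %d expects an int.
                                 Compiled with `gcc -Wall`, this line is
                                 flagged at compile time, before the
                                 program ever runs. */
        printf("%f\n", pi);   /* the corrected call */
        return 0;
    }

A comparable slip in a handwritten proof sits silently on the page until a reader happens to notice it.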
Except with economics, as you point out :)
They should know Linux/Unix/GNU before they start on anything CS related. Are you really suggesting that people who don't know anything about computers should be taking CS?
Git would have made debugging the missing semicolon a lot simpler, because it makes the divide-and-conquer approach simpler.
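Concretely, `git bisect` automates that divide-and-conquer: a binary search over history for the commit that introduced the breakage. A sketch (the tag and test script are made-up names):

    git bisect start
    git bisect bad                # the current commit fails
    git bisect good v1.0          # a commit known to have worked
    git bisect run ./check.sh     # hypothetical script: exit 0 = good, nonzero = bad

Because each round halves the search space, roughly a thousand commits take only about ten checkouts to pin down.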
I think his point 1 is underrated. CS degrees are flat because aptitude is flat.
You can compare CS degrees to other degrees over time at nsf.gov.
We have more grads than ever, but they are dumber than ever (we have the data to prove this), getting less difficult degrees.
I have a bad feeling that we are running up against some diminishing returns on education and hiding it with numbers like the total number of grads. The number of grads for difficult degrees and the quality of grads seems to be another story.
> In 1970s 1-in-2 college grads aced Wordsum test. Today 1-in-6 do. Using that as a proxy for IQ of the median college grad, in the 70’s it was ~112, now its ~100.
More stats: https://firstname.lastname@example.org/why-is-computer-science-enr...
In the early 2000s it was widely perceived as the economically smart thing to go into a technical field other than CS, especially petroleum engineering, chemical engineering, and similar areas, or maybe even better, a non-technical field like law. They had higher pay and were seen as more stable employment options. CS employment was seen by many people as unlikely to ever fully recover from the dotcom crash, partly because of mass outsourcing. So unsurprisingly, new enrollment numbers were low, and some departments at smaller schools even closed. Then from around 2005, CS began to be seen as lucrative again, and enrollment has been steadily rising every year since then. Universities have responded likewise by reopening CS departments, hiring more faculty in existing ones, and increasing class sizes (so enrollment growth is actually considerably outpacing faculty growth).
This does not look to me like a market that fails to respond to economic incentives, but exactly like one that does respond. Maybe it ought to respond even faster, although if it did in general, the CS-degree crash of the 2000s might've been even deeper than it was.
No, what you described is called "misleading with data" at best and "lying" at worst. There is a clear dip in CS on that plot that none of the other fields suffered. Moreover, CS's growth is below the rate of increase in college graduates overall.
The post then spends a bunch of time musing on why CS enrollment supposedly doesn't respond to economic incentives, even though responding to economic incentives, i.e. rising and falling together with tech booms and busts, is precisely what the data shows it doing.
Edit: better said here, https://news.ycombinator.com/item?id=14441423
winrar! Came here to say that. ctrl-f says you're the only other person to remember it.
This is why I never believed the "There is no STEM shortage" stories. They were mostly based on the assumption that all STEM grads were qualified to work in STEM. That's not true in any field where you can complete a degree with a C or D average.
I find it unfortunate that some of the larger software companies filter out applicants based on the school they graduated from, but I understand it. The companies have to have some sort of filtering mechanism, because many colleges clearly do not.
There's no one to hold companies accountable for having an unjustifiably high or low filter. And frankly, I don't think anyone outside of CS has gotten those filters right any more than CS has. An incorrect filter turns into "well, there's no one qualified to work for us" or "well, all of these candidates failed to do the job," which leads to incorrect assumptions about the labor pool, which is what's going to bias your hiring decisions.
There are companies out there hiring for A+ students that would be just as well off with a C or D average student because what the company is doing day-to-day is too far from what university was like.
I mean, you've spotted what you claim is a clear inefficiency. Your guys could be taking the jobs that Infosys and gang are taking.
Yeah, I'd need millions to be convinced to do something I absolutely don't want to be doing.
This is true IFF we assume that GPA accurately reflects qualification to work in industry.
I was in the boat of not caring about school (although I did get a B) and it worked out - but anyone following that path should understand it's an uphill climb.
"Salary" is the wrong metric to consider. The "interesting" metric is "salary minus cost of living". And this is stagnating even in NYC and SF, since the costs of living are rising.
Not in terms of tech jobs.
Depends on where you work. If you work at an actual technology company that "gets it", they'll have parallel management and individual contributor tracks, where as an IC you can make as much as a director-level person, or more.
The only point about this test is that it's specific to native speakers of English; but if you were born in North America, then it surely applies to you even if your parents speak another language.
What of high IQ individuals who are born into low- or middle-class families, who didn't have their parents read them 19th century British novels, and therefore don't know what a dowager is?
You don't need your parents to read things for you.
That doesn't match what I see on the ground. I get 10-20 interns a year on my teams from big state universities and liberal arts schools, with CS and STEM degrees.
They tend to be awesome. The best ones outperform most of the consultants you find.
Nothing you say relates to IQ stats about the median grad. If CS enrollment was up, it would push the needle I'm sure. It's not. You may as well be saying, "What do you mean most wine is bad? I drink good wine all the time."
That's their hypothesis. So their hypothesis isn't that CS student IQs are dropping.
Are you also comparing this over a long timeframe? 112 in the 70's -> 100 now means less than 1 point every 3 years. You probably can't notice a difference that small year over year.
What does 2017 have to do with IQ?
Actually I think IQ is more relevant now since we have lots of incompetent people holding degrees.
the "college for everyone" experiment is grinding our society into paste and generating enormous debt burdens for an entire generation. time to end it.
those with aptitude for programming can learn programming without a college degree. let's find out just how flat aptitude really is. all we know now is that aptitude is flat in the institutional environment.
But bio and ag sciences are beating out Psychology, which is a bit of a reversal. Biology isn't easy, especially not any program with a requirement for organic chemistry. Could that be b/c the big midwest and southwest universities (Iowa, Texas A&M, etc) have large bio/ag programs to support that industry in their region?
Also, there are a lot fewer myths about biology around. Nobody assumes you have to be some kind of nerd to learn it, nobody assumes there is a special in-born ability for it totally different from all other kinds of intelligence (as is often implicitly assumed even for things like operating system configuration), and there is much less cultural bullshit about "hackers and their culture" around.
By contrast, even though it is the same subject matter, chemical engineering is (correctly) perceived as being vastly more difficult than chemistry because it requires reasoning about complex system-level behaviors that has no analogue in chemistry. This creates the oft-observed effect that being skilled at chemistry has surprisingly low correlation with being effective at chemical engineering despite being the same domain.
But realistically: a) memorizing that much is hard, and b) my friends who studied chemistry said that it becomes much easier once you understand how it works.
Chemistry and chemical engineering have very little to do with each other, despite the name. Chem eng is focused on scale-up, while (organic) chemistry is focused on novel mechanisms.
Physical Chemistry on the other hand...
Btw, I have done quite well in life, not sure where she ended up; probably chasing some other kind of "grade" these days, lol.
But, unlike CS, there isn't this assumption that you've played around with organic chemistry and/or chemical engineering in your spare time if you want to major in it in college.
In Michigan the universities have a lot more focus on automotive because that's what runs the state. Agriculture runs a huge portion of Iowa, Missouri, Kansas, and other nearby states.