Did I know as a 17-year-old applying to college what the hell the difference was? No. Did I study CS? Yes. Was it the right decision? Probably not (though I did stay in it). And I had about every advantage in the world of being exposed to programming from when I was 5 or 6.
Maybe the craziest part of the US college system, and of technical fields in particular, is that most people don't know what's available and don't know what they're applying for, all while taking on debt somewhere between the cost of a car and the cost of a house.
If someone is not prepared to take an active role in identifying their course of study and desired path of education, I would suggest that they wait to go to college.
High school guidance counselors strongly encourage students to pursue university.
When I was in high school, it didn't seem like there even was a viable option other than Computer Science, which I now know is largely unrelated to most software development. Of course, you could argue I didn't do my due diligence, but it was the guidance counselor's job to guide me, and I trusted them implicitly. In retrospect, that in and of itself taught me a valuable lesson about critical thinking.
Everyone I've talked to has said they haven't used a single bit of calculus in a professional capacity since graduating. I may be succumbing to confirmation bias, but I've never heard anything to the contrary.
That's an incredibly privileged point of view. It is rarely the case that students with the aptitude for college have the support they need from family and from relevant experts at school to engage in the kind of thoughtful deliberation you suggest.
It does take more support from family and counselors, because it's swimming against the current. It's easy to jump right into college when everyone's drilled it into your head that you need to go, regardless of your particular personality/goals/stage in life.
That pressure not to go against the grain should be even more extreme in communities where 99.9% of students attend college.
Regardless, the issue isn't picking your exact path before you go. The point is to be prepared to dedicate at least some effort to the decision, either at the beginning or beforehand.
You really think it's on the schools to tell students what they should learn or what degree they should get? Certainly people have different amounts of support going into it, but we can't just keep pushing the age of personal responsibility further and further out. To absolve underprivileged young adults of these key steps in life is not only harmful to their personal development, but I'd argue personally insulting to their potential as independent adults.
> You really think it's on the schools to tell students what they should learn or what degree they should get?
No, not me. I agree with you that we've pushed the age of adulthood too far out.
But when the mistake you learn from comes with crippling debt, it is on us to guide kids to make low-consequence mistakes before the big ones. That's why I suggest we encourage more real-world exposure: internships, co-op programs, shitty jobs first. We need to foster an environment that allows kids to actually think critically about their future instead of shoving them toward it, and we need to practice the meritocracy we preach by looking past degrees when we hire.
It depends on the school -- some schools have a broader general-education focus, with less major-focused work and enough opportunity to explore before declaring a major.
Others require you to pretty much declare from day one, and it might well be another $50,000 if you get it wrong.
I'm not from the US, and I was lucky because I knew what I was getting into when I started a CS education. However, even in my country many students make this mistake (and not just with CS; it's pretty common for students in other fields to drop out because the program wasn't what they expected).
I'm not truly blaming the student; it's an honest mistake to make. However, I see this same mistake (CS == programming) propagated online by people who should know better. Whenever someone links to "The Perils of Java Schools" by Spolsky in a debate about CS, they are doing exactly this. Java/C#/C++/${LANGUAGE} doesn't matter in this context, because a CS education isn't about learning a particular programming language. And this article, while not written by one of those people who should know better, doesn't help.
> However, I see this same mistake (CS == programming) propagated online by people who should know better.
"CS != Programming" is also a mistake. An over-generalization, at best. A good CS program will involve a lot of programming and be a great foundation for becoming a programmer.
Yes, some degree of generalization is needed to discuss this in a forum thread, but in my opinion one of the two mistakes is far more serious. I believe you understand what I meant.
Indeed, a good CS program is a great foundation for becoming a programmer. But CS is not about programming, at least not in the sense that gets discussed when Spolsky laments the "perils of Java Schools". Also, a CS researcher may not even be particularly interested in actual programming languages, preferring to focus on theoretical matters. Some of them are barely distinguishable from applied mathematicians.
People who want to learn programming may get impatient when they are taught graph theory, lambda calculus, big-O, Turing machines, algebra, calculus, statistics, computability, etc. I'm not saying they necessarily will -- all of this may even surprise them in a good way -- but many times (which is what you often read in this kind of article) they become frustrated: "what is this nonsense? Why don't they teach $LANGUAGE? Everyone is using $LANGUAGE! All these CS types live in their ivory tower and don't even understand what is being used to build $MOBILE_PLATFORM apps! None of this helps me get my startup running!". The conclusion is obvious: academia "doesn't get" the real world.
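To make the contrast concrete, here's a toy sketch in Python (purely illustrative, not from any particular curriculum) of the kind of question those classes actually care about -- not which language you use, but why one version scales and the other doesn't:

    # Two ways to check whether a list contains duplicates.
    def has_duplicates_quadratic(items):
        # Compares every pair: about n*(n-1)/2 comparisons -> O(n^2).
        for i in range(len(items)):
            for j in range(i + 1, len(items)):
                if items[i] == items[j]:
                    return True
        return False

    def has_duplicates_linear(items):
        # A hash set makes membership checks O(1) on average -> O(n) overall.
        seen = set()
        for item in items:
            if item in seen:
                return True
            seen.add(item)
        return False

Both functions answer the same question; the big-O analysis from those "useless" theory classes is what tells you the second one stays fast when the list grows to millions of items.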
In reality, however, it's more of a case of mismatched expectations. CS is -- if you look at it with squinty eyes -- a kind of applied mathematics; one in which you get to play with both idealized and real-world computers. Ok, that's another oversimplification, but it's no coincidence the founding fathers (and mothers :P) of CS were mathematicians, logicians, physicists, and the like. They would scoff at the notion that "you don't need Calculus".
It's perfectly fine to find the research/theoretical aspect of computers uninteresting. It's perfectly fine to want to learn Ruby/C++/whatever and just build your site. But it's not fine to go into CS and then complain that it's too theoretical because they don't just teach you Python (or whatever language is currently used in the real world), or because they force you to sit through algebra. In that case, you simply picked the wrong career path, and it's not academia's fault.
This is exactly how I feel. I spent much of my senior year of high school talking with everyone I knew about what options I had to go into website development (which I already had a job doing). The answer from everyone was basically "Get a CS degree, but don't use it. Otherwise nobody will take you seriously". In the end I dropped out since I realized that spending 3 years doing GE and taking classes on things I already understood was a huge waste, even if the last year would have been very useful.