I know this doesn't refute the point the article is trying to make, but it doesn't seem that unreasonable to me that an underclassman might not get into a highly sought-after elective. A lot of the "fun" electives when I was in school were little passion projects taught by one professor for one or two small sections a semester (or maybe only one semester per year), and while I agree it sucks that not everyone can take the classes, I wouldn't want them to die out because of complaints along the lines of "if you can't accommodate everyone, don't offer the course". Not that the article is saying that should be the case; I just worry that too many complaints about this kind of thing put pressure on the good professors who put enough effort into a class that a bunch of people want to take it.
I wasn't affected by bombing any classes and got direct admit to CS [^1], but there were classes of interest offered only once a year which filled up quickly. I just took them, e.g., a year earlier than expected to make up for that, or disregarded the prereqs and learned just enough on the fly. Another technique that helps is emailing the professor in advance of registration, or after it's full, and letting them know how interested you are. Even better if you can stop by their office in person.
Another trick is to get into your school's Honors program which typically allows early registration vs non-Honors.
There were also 1-2 courses I audited for knowledge instead of officially taking which let me invest in basic learning without adding the pressure of a grade to a heavy course load.
Ideally everyone would have an academic advisor that lets them know this kind of thing in advance.
If I were graduating high school today, I would give serious consideration to Lambda School instead of a CS degree; though I also believe that we haven't seen enough time pass for the long-term implications of a decision like this to play out yet (e.g., future career discrimination based on lack of degree).
[^1]: I feel for people that don't get direct admit into their engineering major. This part feels unfair IMO. I think everyone should be able to start in their desired major by default and be disqualified out, rather than defaulting out and having to qualify in.
Its ranking went up a lot somewhat recently, so I think it's a growing-pains thing. ECE is a bigger major, but we don't have the same problems because the department has been around longer and its growth wasn't so fast.
I always thought about doing this, but since class signups only opened twice a year (once per semester), and that was the same time I needed to actually sign up for the classes, I concluded there would never be a time to debug and refine such a script.
In short, you gave the school a list of the classes you wanted to take. The school loaded everybody's information into the computer and had it make the first set of assignments. Then you got back the schedule and discovered that the computer had put you in classes at 7AM, 11AM, 4PM, and 6PM MWF and only one class on TTh so you had to go in and shuffle your classes around until you got a reasonable schedule. Or you would discover that all of your electives filled up before you had a chance so now you're in a fight to get into the classes.
It sounds all Mad Max, but the schedules were in such flux that it was usually possible to get the classes you wanted. I only remember getting screwed once, on a mandatory class with only two or three timeslots, one of which was highly undesirable (7AM MWF), and the other two never had an opening. Worse, the professor for that class had a heavy accent and tended to drone slowly in a monotone.
Say you're waiting for something on a website to become available. I'd look for some language on the page about availability that's likely to change when it becomes available. I don't know what they're going to change it to, but I know that it'll (probably) change. I set up a script to run every few minutes to check that particular spot on the page. If it changes, I get an email.
Maybe I got an email when it changed from 'unavailable' to 'available soon', but that's okay, that means it'll definitely work when it's available for real.
There's always a chance that it won't work, but if you can reasonably assert what is likely to change, it's very easy to monitor it.
So, I didn't need to debug what happens when the class shows "open" - I just saved the div that said "closed" and sent myself an email/text any time it didn't say exactly that.
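To make that concrete, here's a minimal sketch of that kind of monitor in Python, assuming the requests and beautifulsoup4 libraries; the URL, CSS selector, and mail setup are hypothetical placeholders, not details from the actual script:

    # Watch one known spot on a page and alert on ANY change away from
    # the saved text, rather than guessing what the new text will be.
    # The URL, selector, and addresses below are all hypothetical.
    import smtplib
    import time
    from email.message import EmailMessage

    import requests
    from bs4 import BeautifulSoup

    URL = "https://example.edu/courses/cs101"   # hypothetical
    SELECTOR = "div.availability"               # hypothetical
    KNOWN_TEXT = "closed"                       # the saved "unavailable" text

    def fetch_status():
        page = requests.get(URL, timeout=30)
        page.raise_for_status()
        node = BeautifulSoup(page.text, "html.parser").select_one(SELECTOR)
        return node.get_text(strip=True).lower() if node else ""

    def send_alert(status):
        msg = EmailMessage()
        msg["Subject"] = "Availability changed: now '%s'" % status
        msg["From"] = "me@example.com"          # hypothetical
        msg["To"] = "me@example.com"
        msg.set_content("%s no longer says '%s'." % (URL, KNOWN_TEXT))
        with smtplib.SMTP("localhost") as smtp:  # assumes a local mail relay
            smtp.send_message(msg)

    while True:
        status = fetch_status()
        if status != KNOWN_TEXT:   # any change triggers the alert
            send_alert(status)
            break
        time.sleep(300)            # check every five minutes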
You basically had to just send the correct number of arrow key presses to get the cursor to the correct field, send the digits, and then send the enter key. Parse the data that comes back, and add the routine to cursor over to the "add course" prompt when it says there is an availability. The script was totally gross looking but it worked.
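For flavor, here's roughly what that kind of script looks like re-imagined with Python's pexpect against a hypothetical telnet registration host; the host, prompts, call number, and key sequences are all invented for illustration, not the original system:

    # Drive a hypothetical terminal registration system by sending
    # keystrokes and parsing what comes back. Everything here
    # (host, prompts, call number, commands) is made up.
    import pexpect

    ARROW_DOWN = "\x1b[B"  # ANSI escape for the down-arrow key

    session = pexpect.spawn("telnet registration.example.edu")
    session.expect("Course number:")
    session.send(ARROW_DOWN * 3)        # cursor down to the right field
    session.sendline("34215")           # send the digits, then enter
    session.expect("seats available")
    if b"0 seats" not in session.before:    # parse the reply for an opening
        session.send(ARROW_DOWN)            # cursor over to "add course"
        session.sendline("A")               # hypothetical "add" command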
Back then people would also over-enroll in classes, then drop them after the first few class days if they didn't work with their schedule, etc.
I worked in the CS Dept in college and this is a big one. Everyone in the department is there to support you and is happy to help, and the chair too. It's surprising how few CS students actually take advantage of these resources.
You can also use this to get into grad classes as an undergrad even if the course scheduling system blocks you.
Signing up on a course waitlist also helps give the department more info to be able to potentially move it to a bigger room or add additional sections.
That doesn't sound like fierce competition if they were able to interview 40 candidates... Did they give them all high-paying offers, and they all declined?
It is hard on both sides. The entire process was about 6 months when I went through it, and I had offers expiring before I even finished all the onsites.
Disclaimer: I’m a CS prof at a R1 university in my first year.
Of course, they are compensated horribly, so nobody wants to do that unless they have no other choice, and once you do that, good luck ever getting a tenure-track position.
The whole academic job market is pretty dysfunctional. It's more interesting to look at the flawed ways universities have of measuring research contributions, but sometimes it just comes down to money. Universities are trying to save money by paying as many people as possible peanuts, while paying "stars" hugely.
See "Teaching Professor" on this list: https://www2.eecs.berkeley.edu/Faculty/Lists/list.html?_ga=2...
It would be good if this were more common...
Non-tenure track academic teaching positions that pay "horribly"... I can already imagine the kind of talent they attract.
Honestly, as a former CC student, non-TT professors/lecturers at the junior level were some amazing teachers. While they may only have a master's in their field, most of them are taking these jobs out of enjoyment, as secondary income, or after retirement.
Examples: One psych prof is a clinical child psychologist, teaches PT, highest evals in the dept.
Other psych prof, retired psychologist, head of CHARGE research, drove 45 minutes to teach PT.
Social work prof, 20+ year professional, teaches PT, was a complete savior to her students.
Eng prof, trans, former Navy, journalist, taught PT.
Chem prof, retired, works at a local grocery store stocking fruit PT, lectures PT.
Don't look down on these people. They're smart, they work hard, and they deserve better.
My point is that you can't attract and retain top talent by offering low pay and no job security.
You are confirming said point by showing that any person of talent takes this job only for a short, temporary period while looking for something else.
This article is mainly concerned with the humanities, but it covers well the issues at stake:
Professors are there to teach. You get two years done and most, if not all, of your gen eds.
Really, across most of the department, I was surprised how much of an afterthought teaching seemed to be. They were by no means bad teachers, and I guess when you've been teaching the same class for many years it becomes pretty effortless, but there really is no good incentive for them to sign up and teach a bunch of extra classes when research is what they are interested in and what pays their salary (along with the salaries of all of their lab members).
Not really. It’s a good question to ask but the answer is rather simple: R1 institutions stay afloat because of research grants; they need researchers to secure that grant $$.
This is where liberal arts universities thrive. While their endowments and grants may be lower, their retention and graduation rates may be higher (unsourced opinion). We're living in a time where we need both types of institution, and having that choice is good. While I still don't agree with the rising costs of education, research-driven programs are good for future researchers and STEM, and liberal arts programs are great for the humanities.
The Computer Science Stampede
>> "While the number of undergraduates majoring in computer science at certain American universities more than doubled from 2013 to 2017, the number of Ph.D. candidates — the potential pool of future professors — remained relatively flat."
They seem to ignore the three decades of legacy PhD graduates who gave up looking for jobs in academia because so few spots were opening up. The article makes it seem almost as if the entire legacy supply of graduates just poof, disappears!
>> “I had a faculty member who came in with an offer from a bank, and they were told that, with their expertise, the starting salary would be $1 million to $4 million,” said Greg Morrisett, dean of computing and information science at Cornell University. “There’s no way a university, no matter how well off, could compete with that.”
Nice anecdote, but the number of graduates in that spot is tiny. Rarely do PhDs get that type of salary on Wall St, and often it does not last. No, they can't compete with the bank for this one candidate, but luckily there are a thousand other candidates, and tens of thousands once you consider the pool of grads who have graduated over the last generation.
It is poor reporting to choose the most extreme possible anecdote (one person with a $1m-$4m salary) and then use that as an example of what is going on. The real story is... universities messed up badly in structuring decent pathways for teaching positions and are trying to blame a hot market. There is also the ongoing story of poor measurement schemes (measure how good a teacher is by how good their research is) and the story of uncertainty (let highly accomplished academics rot in limbo-hell for a decade and use a tenure carrot to control them).
BTW, the quote was from Cornell University's dean. I did my CS undergrad at Cornell. 90% of CS lectures were in a large hall where ~60% of the seats were empty, and this was in an up year (1998-2001). There was no shortage of space in the lecture halls. The teaching was mostly done by MS and PhD students, of which there were plenty -- the supply of recitation sections is quite elastic since there is no long-term commitment on either side.
I don't dispute you can make a higher salary than a faculty member, even without working at a FAANG, but having a PhD won't automatically make you eligible to get such a job.
Those jobs, the SWE jobs, there's really no degree that will allow you to skip that gauntlet. But I think there might be a different hiring process for PhDs or faculty leaving academia to work in research labs.
I don't really know, though, just something I heard/read somewhere.
In other words, do “cracking the coding interview”-style questions play as big a role?
EDIT: I figured, why not do a quick web search. I found this link:
Just one POV, but it does confirm that PhD graduates do have to go through the same kind of coding exercises (at Google) as anyone else. It is worth noting that this is the case if you're applying for SWE positions, where a PhD might not really confer that much of an advantage. Again, I'm not sure if this would be the case if you were, say, getting hired as an AI researcher for a lab.
So, this link sheds some light, though I still don't know about research positions specifically.
According to Glassdoor, Facebook research candidates are given the typical interview questions as part of the process: https://www.glassdoor.com/Interview/Facebook-Research-Scient...
Same at Google: https://www.glassdoor.com/Interview/Google-Research-Scientis...
For what it's worth, though I've never particularly sought out a research position in industry, I haven't really come across too many listings, which leads me to believe the positions are few and far between.
Usually (from what I have seen) candidates on the academic job market don't end up in industry (at least not immediately) - they are in the job market because they are passionate about academia.
I taught at http://www.sena.edu.co/ (the biggest training institution in Colombia, run by the government) because they had nobody for an advanced class about databases, and one of the faculty asked me the favor of covering it (I was there as part of a program for startups).
It worked well, and I did it because the time commitment was not bad and the extra money helped. So I was ready to continue teaching, this time about mobile development. I was practically the only one who applied for it.
I was not selected, because I don't have certain diplomas related to how to operate the (very) arcane SENA "education platform" and other side stuff, which would have demanded ONE YEAR of extra courses from me.
In the end, only the people who accept that get the job, but how many of them have actual skills in the field?
Since it's a public school you can find out how much each one makes - https://www.newsobserver.com/news/databases/public-salaries/
Meanwhile, the random dude sitting next to me is making more than every single person that works at Asheville.
$95k at age 32-35 honestly doesn't seem like a lot if your savings at that age doesn't even hit 5 figures.
The best programmers I know are self-taught, because they have a passion for it. I know people working at minimum wage, never enrolled in college, who got decent programming jobs after a year or two learning CS on their own.
These people will always (I hope) form the core of the profession.
Some people can learn a lot about a complicated field on their own, but there are still plenty of people who would benefit from some pedagogical help.
If you think that being a plumber or an electrician or a mechanic never requires creativity and deep knowledge, then I think you are looking down on those professions a bit too much. Much like programming, most of what those fields involve is routine grunt work. But from time to time, the usual ready-made solutions don't work and you'll have some tricky problem to solve.
This is also true for many other skilled trades.
More importantly, we need people in our industry who have interests beyond just tinkering with things all day. We need balanced individuals who have skills beyond just code, who can look at problems from a wide array of viewpoints. If we only hire hackers who have been coding since they were young, we're hiring an incredibly narrow set of people while trying to solve problems for billions of people around the world. That's not efficient! It's good for our ability to solve problems that we expand the tech community to include diverse individuals, including folk who didn't code as kids.
Like seriously, a $200 Chromebook that you earn by working a local minimum-wage job, a household that feeds and clothes you, and your local public library's wifi is most of what you would need. The requirements for college exceed that by a mile.
Now, 14 years later and with tons of industry experience, I realize how trivial the material in my CS bachelor's was, and really the big problem is that most students don't work that hard. Now my work ethic is much, much better, and I've been learning far more advanced material, far faster than I ever did before, because I come home in the evenings and actually study (using online options) and practice.
When I look back at college level CS, it's a joke. Also, the online resources are built to actually be taught well, when I was in school, there was no Khan Academy, Pluralsight, Coursera, Udemy, etc. Most college professors are research oriented at heart and are uniformly terrible teachers. The online moocs and so on are focused on actually making things easy to learn, which gives modern learners a massive advantage.
As someone who dropped out, worked in industry for a few years, and is now back as an undergrad (though in math, not CS), one thing that I feel is neglected in these discussions is the time-sucking effect of homework.
When I was self-teaching as a dev, I could learn something, play around with it until I felt I had a good grasp, and move on. A college course ties a lot of work to each concept. I have solved way too many matrices by hand in the last few weeks, for instance.
There's not even that much CS in a CS degree. Half the classes are gen ed. There's a lot of irrelevant math.
On the other hand, I question the utility of most university CS curricula. There are usually only one or two classes covering the core "algorithms" knowledge that autodidacts might not have, and given how many people squeak through those classes by copying homework, I'm not convinced that anything really sticks.
In fact, I think I could take any random person off the street, and provided they're motivated, teach them algorithms and data structures to BSc level within 3 months. It's not rocket science and any engineer who makes it out to be like "way hard" is just being a bit insecure over their knowledge.
Out of the remaining 40%, the level of overlap and repetition was just absurd. It seemed like each class spent the first half of the semester re-covering things that were covered in prerequisite classes. So maybe 30% of your actual class time is spent on new concepts, skills, etc.
Of those, probably 1/3 involve very basic things, like bubble sorts and the like, which help you build basic problem solving skills and language familiarity, but not much else. Then you get another 1/3 of CS related stuff, like architectures, patterns, SDLC, etc, which is probably the stuff that separates the degree holders from the self-taught in most cases. Then, if you're lucky, you get a couple of classes that are elective and teach something moderately interesting or useful. As I recall, the only classes I had in that category were a class on Computer Graphics (which ended up just being a bit of Java Swing and a bit of math to explain some of it), Web App Engineering (neat class using ASP.NET MVC), and an AI class that really just covered some solvers and search algos, topping out with Genetic Algorithms and Simulated Annealing.
Point being, I could definitely see someone being able to learn all the same CS stuff in a year, if they were self-motivated enough. A few good programming books, a few good CS concepts books, and you'd be covered. Honestly, you could probably cover CS with both more depth and breadth in a year than I got in my degree, if you put a full-time effort into it.
And all that is ignoring the fact that half my class graduated with virtually no ability to actually sit and write code of their own beyond just copy-pasting snippets from online examples until they had something that worked well enough to pass. The only real value of a CS degree, as far as I can tell, is to signal to employers that you can show up and put in at least a minimal effort at something for years without giving up. I guess that makes for a useful enough filter to keep employers using it and keep colleges in business.
Sorry, but what they are teaching themselves is not computer science, it is programming, which is a tiny subset of computer science (or even slightly outside it but intersecting). Getting a well-rounded knowledge of fundamental things, such as data structures, memory management, analysis and design, architectures, how OSs work, cryptography, machine learning, networking, functional programming, etc., takes a degree course and possibly a master's or even a PhD.
I know, I know, CS is not about "coding". But trust me, these people were not making up for it with advanced theoretical work or even like, HCI.
Since everybody and their mother currently needs developers, they’ll still be hired somewhere.
These people generally do not learn computer science on their own; they learn programming instead. This is useful, but it is not computer science, and there is no way that (most, modulo some exceptional edge cases) non-college-graduates will have the same skills or abilities as someone who has studied maths and computer science at university, as well as software engineering. There is some truth to the idea that the core (80%) of the software engineering profession will always be formed of programmers/developers/software engineers who do not need computer science skills and simply need to be able to write code according to a specification, I suppose. But the profession will be advanced by the rest (20%) who have the skills and knowledge of the underlying science...
This doesn't preclude getting CS degrees--that is often a part of that passion as well. But I agree that the best programmers would get there with or without it. And people who start programming in college are extremely unlikely to ever really get very good at it (in my experience).
I went to university and was severely disappointed, and after 3 semesters all the passion got sucked out and I left.
However, I WISH that college/university were way better. Being in a class with others helps a lot, and even though I'm not very social at all, being alongside others with the same interests? That is so rare that it is a prize in itself.
How to make things better, and why I got disappointed, is another matter that has been discussed on HN a lot, so instead I wanna point out that having a good experience in an educational setting (which I did have in a tech course!) is good not only for the people who like to walk the common path, but also for the self-taught!
How lovely it would be if I could get into some very rewarding classes, no matter whether at a university or in a backyard...
I really can't imagine a scenario where getting a formal education would make you a worse programmer. Having a degree does not necessarily mean you are a great programmer, but your self taught friends would still most likely be able to benefit from experienced and knowledgeable professors.
If your starting point is "didn't go to college", then why comment on an article specifically about college students trying to enroll in a class?
Would you suggest that in addition to a full course load + tuition, students should set time aside to learn on their own? When they already go to a school with a highly regarded CS program?
Domain-specialized classes become an increasing proportion of your overall classes after the first or second year (when you choose a major) though never 100%. So while the school can get some idea of demand based on what people put on their application form, typically what you put on that form has little to no impact on whether you are admitted and doesn't actually bind you to a particular program.
I actually think this is a pretty good system, and was glad my own kid chose a US university (which luckily we could afford) rather than the free university education he could have chosen in either of his mother's or my countries. The theory is that you can get a broad foundational education to prepare you for a variety of possible futures, and also that there is more to education than simply work skills. Of course the reality isn't quite as utopian. It also means professions like law and medicine require whole additional degrees.
I kind of like the fact that in the US, doctors usually have some exposure to science outside of memorizing facts for their next test at med school. It makes them more likely to be able to understand studies, for one thing.
I might hazard to guess that if you study philosophy before going to law school (a very common path) you will be a better lawyer, but I can't attest to that with any first-hand knowledge.
Maybe courses in the US are generally longer?
(Although they are not that far from a few other majors, it is an interesting read in any case.)
After I took the LSAT, and scored pretty well (99.6 percentile), including a perfect score on analytical reasoning, I was in the supermarket and saw at the magazine rack, down in the section with the crossword puzzle magazines, a magazine of logic puzzles just like those from the LSAT. Based on the ads in it, the main audience for this was old ladies.
I bought it, and decided to start with a puzzle marked as hard, figuring it would be easy for me--I had just aced these things on the LSAT, after all, a test designed to make distinctions among the brightest students in the country. Obviously, anything old ladies could handle I could handle almost in my sleep.
It completely kicked my ass. So did all the medium puzzles. I think I was able to do a couple easy ones, with a lot of effort.
Lesson: don't underestimate old ladies!
> The heads of Bletchley Park next looked for women who were linguists, mathematicians, and even crossword experts. In 1942 the Daily Telegraph hosted a competition where a cryptic crossword was to be solved within 12 minutes. Winners were approached by the military and some were recruited to work at Bletchley Park, as these individuals were thought to have strong lateral thinking skills, important for codebreaking.
It looks like they've consolidated it somewhat in the intervening years, now it requires only 186 units: https://www.seasoasa.ucla.edu/curric-14-15/21curchem-14.html
That said, they did limit some popular classes to people accepted into those subjects, but CompSci wasn't one, last I checked - though by my fourth year, first-year CompSci had tripled in size from my first year.
E.g. someone with a French baccalauréat or enough British A levels can usually skip the first year or two of university in the US.
There are two main differences.
Firstly, in France, this choice is available to everyone. Every student is able to go to a high school that offers it. It is not some unusual special thing that exists in rich neighborhoods.
Secondly, for people not enrolled in it, there are meaningful vocational options (bac technologique, bac professionnel...), so they are still doing something useful. Contrast this with the US system, where your choice is between semi-serious academic work that kinda sorta approximates a European standard, and non-serious, waste-of-time babysitting.
This is because there's, for the most part, only one type of high school in the US. It's not split into academic vs. vocational. The US is one of the only countries in the world like this. I believe it expresses the idea that everyone has the ability to succeed if they try hard enough, so we shouldn't be sorting people into more and less prestigious tracks. This is a false myth, but it's so deeply implanted in the bedrock of American culture that saying common-sense things like "it's possible to figure out who the good students are well before age 18, and allocate resources appropriately" shocks a lot of people.
No need. Study on your own, then specifically request the AP exam; it's what I did for the CS one, since my high school didn't have a class for it. I ended up being the 3rd or 4th to succeed at my school, out of around 600 graduating students per year for over a decade.
International students are moneymakers (they got £75k off me) so I'm pretty sure they also over-admit.
When I went to university (graduating in 2007, so just before the crash but not really very long ago), everyone was pretty chilled out about their courses. Nobody did internships. As long as people weren't failing, they were relaxed and enjoyed themselves. Getting a 2:2 was a bit of banter rather than a serious threat to your ability to ever get a good job.
Now when I see students they are laser focused on absolutely doing the best they can in order to survive in a much more competitive world. I'd be working harder now as well.
Perhaps these things go in cycles. When I was in university in the late 90s, almost everyone tried to get some sort of internship over the summer, and some were very competitive.
The telecoms were booming back then and I was turned down by one of the big companies (Nortel) before landing a co-op at a smaller network equipment maker.
I was definitely influenced by the on campus CS culture (big "top 5" state school), which encouraged summer internships. However, I could also see how people who entered college after seeing and experiencing the economic pain of 2008 would have a more focused approach to how they spent their time in university.
That's what you put on job application forms. We don't use a GPA.
A 1st is good. A 2:1 is fine. A 2:2 is a problem. A 3rd is a failure.
Some people use cockney rhyming slang - a 2:2 is a 'Desmond' (Desmond Tutu, two-two).
In some cases you can get a double 1st, or triple 1st, but these are very specific to your university and are about what courses you took.
No, a full third of GCSE exams are fails.
I went to Swarthmore (referenced in the article) before CS enrollment surged (www.swarthmore.edu/computer-science/alumni) and there were some people in my CS classes who decided to start on a CS major their junior (year 3 of 4) fall.
Somehow someone from the US has done maybe 1.5 years of that and gotten the same degree?
In addition to the department (Computer Science, say) requirements, there are required classes for the college (Natural Sciences) and the university. A typical bachelors degree would be 120-130 semester hours, divided up into approximately 1/3 departmental, 1/3 college and electives, and 1/3 university and electives.
Someone like me (and I know I was) would go into CS and take 1-2 CS classes per semester the first couple of years, then more after they have requirements out of the way and have the prerequisites for higher classes.
Someone who didn't choose a major initially would take all the general requirements and electives until they did decide on a major.
A semester hour is one hour of class time per week for a 14-15 week semester; most classes were three semester hours, so a 120-hour degree works out to roughly 40 classes.
(This doesn't count requirements outside the CS department, like math, or other graduation requirements like "some total number of classes, at least three humanities classes, at least three social science classes" etc.)
Most non-US universities don't work like this. You get a programme that you follow, sometimes with alternatives that you can pick from. You don't have a target number of classes or credits or hours to complete. Everyone takes the same classes (except for the alternatives) at the same time in one big cohort all the way through the degree. There's not much flexibility for a minimum or maximum - there's just the one set path.
It's more useful to refer to the CS curriculum standards across ABET accredited schools.
That said, we don't have anything like a comprehensive undergrad final exam for CS schools that would validate this.
I took CS classes during all 4 years, but wasn't considered a CS major until the end of my 2nd year.
The other notable difference is an undergraduate degree in the US is typically a 4 year program. Many in the UK are 3 year programs.
This is largely based on the fact that admittance is decided mainly on what people do in the 5th year of high school rather than on 6th-year A-levels.
Where I did my CS degree, any student in the major was always going to be able to get their core courses on a proper schedule that allowed for on-time graduation. Sometimes that meant the college would add entire extra courses if there were more people needing something than expected.
Did I get every elective (not all electives were taught every semester, some were only taught once every year or two, there was a large rotating selection - You had to take a few from different areas) that I was interested in? No, but I got most of them.
However, if you were a non-major looking to take some CS classes (beyond an intro one specifically intended for non-majors), you would have a tough time getting in. You were only allowed to register for those after all the students majoring in CS registered.
On day 1 of term 2 there were 40 and we fit comfortably.
US colleges' approach to admissions is more like selling aeroplane tickets without asking people where they want to go, then being surprised when they all get to the airport and try to cram onto the one plane going to the same place.
In other places you buy a ticket for a destination. Yeah some flights might be a bit overcrowded in some cases, but it's not as bad as not knowing how many people want to go to each destination at all.
Universities regularly "admit" more students than they can actually allow to "enroll". If a university can physically support a freshman class of, say, 1000 students, they will "admit" a number higher than that, say 1300, knowing that only a percentage of the students they admit will actually end up enrolling (because, very simply, some of them choose to enroll at another university that they were also accepted into).
Here at the university I work at they refer to this "admittance/enrollment" ratio as "yield" or "yield rate". The yield is tracked from year to year and the number of acceptance letters sent out is based on this historical data.
Sometimes, though, the University is surprised (like this past year here) and gets a much higher yield rate and ends up with more students than dorm rooms...
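The arithmetic is simple enough to sketch in Python (the numbers are made up to match the hypothetical example above):

    # Back-of-the-envelope: acceptance letters follow from capacity
    # and historical yield. These numbers are hypothetical.
    capacity = 1000          # freshman seats the school can support
    historical_yield = 0.77  # fraction of admits who enrolled in past years

    admits = round(capacity / historical_yield)
    print(admits)            # ~1300 acceptance letters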
I think it's a value judgement. Is a lowered price of the majority of instances of a thing (flying or taking a class, in these examples) a worthwhile trade for the reward being a probability rather than a guarantee? Is the severity of the minority case (getting bumped) worth the trade? What actual probabilities for happy vs. sad path are acceptable to you?
Those are subjective questions with no inherently correct answers.
Did you go to Edinburgh? They have a US system unlike the rest of the UK.
No I didn't go to Edinburgh.
The school doesn't care if students drop out, because it already got their $$. The school doesn't care if students who can't hack CS switch majors, because then they take more money-making 100-level classes in their new major.
You can't just rock up at a university and say you'd like to study X without any previous education - this is what the Raspberry Pi was made for originally.
You might not be able to get into a top-flight Oxbridge university, but show a history of interesting projects, and maybe attend hackathons, competitions, etc., and many universities will fight to have you.
Worked for me.
I mean they will want good grades in maths (and decision maths covers some parts of the CS curriculum) but little else, perhaps Physics - but other than that just good grades.
The thing is that the Computing A Level wasn't widely available and the quality differed between qualifications and exam boards as well - whereas Maths is pretty consistent and is run at every school.
I was fortunate enough to be able to study Further Maths as well, which honestly I would consider a better preparation.
I liked coding for fun but really had almost no idea what CS was when I applied. So maybe the US system would have been better for me in that I could have dipped into a bunch of 101 courses and changed my major.
In the UK system, most people don't go to other subjects' lectures and if you don't like what you're doing, you have to apply from scratch. There's usually no way to change course at the same university or even get any kind of help or advice about finding an alternative elsewhere.
One of these years I'll work out what I should have studied and do that instead... actually, I probably won't, now it costs £30k+. I'm glad there are MOOCs so I can satisfy my urge to sign up for random things, watch one week's worth of lessons, and then never go back again, all for free.
I chose Physics though in the end - I think the subject that really gets let down is Engineering, I had no idea what that meant.
The UK (and the rest of the world) really needs much better access to education - there is no reason it should cost 30k when huge parts can be delivered MOOC-style and the remaining exams and labs done in person.
Even the Open University costs a fortune these days! I hope Corbyn's National Education Service idea might fix it - if he gets elected and Brexit doesn't wreck everything...
Back in the day I tried to switch from an HNC-stream mech eng course to a Maths, Stats and Computing one, but it was 99% pure maths (no CS maths) and the computing element was nugatory and ancient.
I graduated from a relatively good school with a degree in Computer Science, yet wasn't admitted into the major until 2 weeks before graduation. I spent my entire college career in Computer Engineering while taking Computer Science courses and (thankfully) was able to complete the curriculum without actually being a part of it. I lucked out as at my school Computer Science isn't locked down like many of the other engineering majors, so anyone can take a CS course as long as they meet the pre-reqs. It would be an understatement to say the experience was harrowing.
My school (UVA, in the late 90s) had already converted Computer Science to a limited-enrollment major to get around this problem. Declared CS majors had preference for CS courses. It wasn't quite so bad that others were completely locked out, but if you weren't declared, you had to be quick to register and might not always get into fun electives (the core courses were usually larger and easier to find a seat in).
Since that time, they've actually added a second CS major. They now offer the original BS in CS through the engineering school. And a BA in CS through the college of arts and science. The two primary differences being a foreign language requirement for the BA students, and a heavier emphasis on math and general engineering in the BS program.
Then again, despite being a top 5 program, it only received accreditation while I was attending.
The number of MIT/Stanford/Harvard kids running around the valley should disprove that point. As someone who attended a Goldman/McKinsey farm school, many of the prestige chasers who would in another era have gone to a bulge bracket are going to work for a FAANG or a unicorn instead because they do see tech as an appropriate substitute for a more traditional high prestige career.
I've heard that "actual" engineers, like people who design plants, vehicles, bridges, buildings, etc. are considered higher status in Europe than in the US. Here I think probably the top 20-30% of software developers are better regarded (and make way more money) than most of these engineers, unless these engineers are on their way to upper management or determining investments. I also think whether you're one of those people -- your social class, basically -- is pretty much determined by the day you turn 18, even in the US.
Let's take this person, John Doe, and imagine he gets accepted to a computer science program at a well regarded university. The majority of people in his class will fail/drop out of the program before graduation, regardless of background. How likely is John Doe in particular to make it through the program? I don't know the exact number, but it is going to be extremely low. What these 'barriers' do is not only save people from wasting their time, but also make room for people who stand a better chance of making it through the program.
And furthermore, rejection is hardly some death sentence. If somebody genuinely wants into the program, they could take independent classes at a community college, take remedial non-major classes and demonstrate excellence, or pursue any of a wide array of other options that could then be segued into acceptance the next year/semester. And this doesn't even necessarily have to slow them down. There are so many core non-major classes required that you can get those out of the way and ultimately end up graduating in about the same time as if you were accepted to begin with. Imagine getting your calculus, linear algebra, physics/chemistry (pick one), etc. all done in your freshman year. It would've actually been AWESOME to have your schedule packed with nothing but CS classes and maybe a few softball liberal arts requirements in your later years.
Really? That sounds like a huge failure of the system, especially since they've gone into debt for that.
The universities that maintain their standards for graduation will typically have droves of students dropping out, because there are droves of students who do not put effort into learning the material.
They don't all drop out of university - just the program. They usually change majors. I assume most still get a degree. People who can't handle CS usually go for BIS or something IT related.
For the lower level core classes required of all CS majors, space is basically guaranteed for CS majors. If there's not enough seats, capacity will get increased to accommodate the students who need to take the class.
In the upper level classes, capacity will generally be increased if needed. Sometimes, there's a handful of seats (1-10) open in the less popular class at the beginning of the semester.
During registration (which occurs two-thirds of the way into the prior semester), only CS majors are allowed to sign up for CS classes. All other students (including CS minors) are required to submit a request which will be decided on a space-available basis the week before classes start. Those requests are only granted for students who have taken <=5 CS-major classes. There's a very specific sequence that CS majors usually take the core classes in. The semester a CS major would take a core class if they're on schedule is called a "peak semester". Non-majors are completely barred from taking those core classes during their peak semester.
There's no other subject that's as well covered on the internet; nothing in an undergraduate course isn't public knowledge. There's loads of video lectures these days.
And the coursework is incredibly scalable too, in addition to being similar to what people actually use in industry. My brother did CS at an Ivy, and they were just pushing repos to a server, and the UT software would check you'd done it properly.
Couldn't you just let everyone enroll, then make sure they do their practicals?
A bachelor's degree once meant more than passing a sufficient number of tests. Now that everyone treats it like a ticket to employment, it barely even means that.
The whole thing became quite apparent when a former roommate of mine studying mechanical engineering asked his father to help him study for a first-year course, and they figured out that his father had used the exact same literature in his college days as my roommate did now. The content hasn't changed, yet we keep throwing time at manually repeating it, or even go as far as reinventing it, with everyone writing their own script on the topic.
Not as nicely made video lectures. I doubt I can find video classes of some of the upper electives I took in engineering. Sure - textbooks have always been available. But they are dense, and rarely does a class cover the whole book. It helps to have a professor point out "You should know this" and "Don't bother with this - it's just academic detail."
Anecdotally, when I was in university around 2012, I noticed the same people who would have gone to medical or law school abruptly changing their paths to CS. When I started hiring people, I noticed the same trend in resumes.
In all honesty, I much prefer them to the "I want to code videogames" crowd that made up a big chunk of my undergrad class.
Of course, that lack of forethought isn't exclusive to them. You see the same behavior in other markets. Today I see that someone has developed X and is making money, but the market isn't captured (lots of growth potential). So I decide to make X (sound decision on what I know). However, 98 others also make the same decision at about the same time. By the time we all start shipping X we've flooded the market and now it's not profitable (getting too small a slice of the whole, or competition drives prices too low). I'm trying to recall the precise term in systems theory, "bounded knowledge" maybe?
EDIT: "Bounded rationality" was what I was trying to recall. It's useful in a number of fields, I came across the term (proper, the concept wasn't new to me) in studying systems dynamics.
The best way is to choose to do it because it actually interests you, not just because you want the money. You can get a job with the degree despite the glut of graduates. Just accept that you may not always be getting the $150k/year starting salaries you sometimes hear about.
But if CS is only of passing interest and you have other majors you're considering:
I wish more people would minor in CS, rather than major in it (NB: Not all schools have a good CS minor program even if they have a good CS major program). I have a lot of engineer friends (from my professional career or my time in school) who have little programming skill, but find themselves increasingly needing to program. Even the first 5 CS courses in most programs (versus the first 1-2, at best) would make them 10-100x more effective in their jobs.
A former colleague (EE) wrote test analysis software (we were both doing verification and validation work) that took minutes to analyze the data. My slightly improved version resulted in getting execution time down to seconds in the worst case (500MB file being processed). The improvement didn't need more understanding of programming than a 2nd year CS major would have, but he didn't have it because he'd taken literally one programming course (and never programmed again, except for small matlab things). I've seen similar things from aerospace and other engineers. Their code is usually correct, but often inefficient. Or they lack an understanding of the underlying memory model of a language like C and try to do impossible things (that, again, a 2nd year or so CS major ought to know).
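The fix in cases like that is often nothing deeper than replacing a per-record re-scan of the whole dataset with a single streaming pass. A hypothetical Python illustration of that pattern (not the actual analysis code, which I don't have):

    # Hypothetical illustration of the kind of fix described above.
    from collections import defaultdict

    def slow_max_per_channel(records):
        # O(n^2)-ish: re-scans every record once per channel
        channels = {c for c, _ in records}
        return {ch: max(v for c, v in records if c == ch) for ch in channels}

    def fast_max_per_channel(records):
        # O(n): one streaming pass with constant-time dict updates
        best = defaultdict(lambda: float("-inf"))
        for channel, value in records:
            if value > best[channel]:
                best[channel] = value
        return dict(best)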
So: Stay the course with CS if it is of real interest to you. Otherwise, look to double major or minor and use your CS skills to stay ahead of others in your discipline.
I almost wonder if it would make more sense to major in CS and minor in something else, which would hopefully make it easier to specialize in a particular field rather than competing against a lot of generalists. I'm not entirely sure what else I'd want to do though, which goes back to the lack-of-interest problem that you mentioned. It's definitely something to think about!
It will also reduce the average skill and competency, dragging the expectations and capabilities of the profession down.
(Of course, one should also note that most people studying Stats, Applied Math, and CogSci would be studying not for CS but for other applied fields (physics, bio, etc.). These majors offer huge flexibility for their applied cluster.)
I hope that's not TC!
Sure it overlaps, but it's a kind of marmite-like work: you either love it or hate it.
You also have a fairly harsh selection process, at least at the top end companies. Not everyone is going to pass the whiteboarding quiz, and although you can somewhat study it, it's very different to most other fields where it doesn't feel quite as harsh.
It'll also drag the quality of the major way down. I TA'd a few classes at my university, and the quality of people is not that high already.
That is not true at all. Maybe if you are in the upper echelons at the Stanfords, MITs, and CMUs of the world doing advanced research in AI/ML/Robotics/Security, but for the most part CS grads are going to school to get a job -- I was one of them.
As I was graduating, my program not only had 10-13 intro courses but had 4x as many electives as it had when I started.
I've also heard of the idea of charging a departmental fee to become a CS major, similar to the fees certain engineering departments have. The idea here is that this fee would go toward the department but give the student access to things like hacker spaces and other resources that everyone else in the school would presumably be excluded from. A lot of things are being tried to address this issue.