
Why I am Not a Professor, or The Decline and Fall of the British University - jackfoxy
http://www.lambdassociates.org/blog/decline.htm
======
UK-Al05
I find threads like these funny: one group complains about not enough software
engineering, such as not knowing design patterns or the details of a Java
implementation, while a different group complains about the lack of 'real' CS,
which is mostly maths.

Two opposing views, yet each side thinks the argument is in its favour.

There is also a massive backlash against universities in the UK at the moment.
I am seeing many 'apprentice' software developers and freelance developers
skipping university. They tell me they know more than the average university
graduate. So I ask them about complexity? Nope. Predicate logic? Nope. These
are things taught in even the lowest polytechnic school, and they don't know
them. They don't know the bounds of their own ignorance, because they think CS
is just knowing a few programming languages and whipping up a few apps and
websites. Do they know more languages than the average graduate? Maybe. Can
they do software engineering? The easy business stuff. Do they know CS? No. I
think they have never been exposed to CS and don't know how deep it goes.

I've seen the accredited software developer apprenticeship curriculum and it's
mostly java and design patterns.

I think the focus on apprenticeships at the moment is creating a generation of
software developers who don't know what NP-Complete means.

Computer science is not whipping up applications with fancy design patterns;
that's why you're better than them. Give them a few years and their CS
background + software engineering experience will begin to shine.

~~~
demian
I attribute that confusion to the fact that the term "Computer Science" is
used as a facade for both Computing Science (a branch of mathematics), and the
skills necessary to create and deploy software systems (which most of the time
doesn't even require "engineering" math).

It's like grouping "Industrial Design" and "Theoretical Physics" under the
term "Physics", if "Industrial Design" also included "Product Management".

~~~
forensic
There's also the fact that zero science actually happens in a Computer Science
department. But since our society worships science, everything has to be
science or it's no good! Even stuff that isn't science!

~~~
tansey
That is absolutely wrong. I am a PhD student in Computer Science, and I
strongly consider what I do more science than engineering. Machine learning is
all about making a hypothesis, implementing it, testing it, and observing the
results. There is pressure toward only publishing positive results (no one
wants to learn about an algorithm that doesn't work), but it's still very much
a science.

The same kind of experiments happen in lots of other CS sub-fields: high
performance computing, security, etc.

~~~
forensic
It's a science in the same way that math is a science. A "formal science"
seems appropriate. My school considered it a natural science, which is just
stupid.

Note that most computer "science" classes are stuff like learning java,
learning UML, learning software engineering. This stuff is barely engineering,
let alone science. It has more in common with a fine art.

Algos and data structures etc are arguably a form of math, depending on the
class most likely applied math.

It's not a natural science, and what you're doing is similar to the way
artists will use, say, a picture of a DNA molecule to create a cool painting.
If they use scientific principles in their art, does that make it a science?

No - what you are describing is a "craft" - a very informal kind of
engineering. Architecture is not a science either.

------
jpdoctor
> _An easier way is to water down the educational system to a lower standard
> and then peg the university income to the number of students accepted while
> reducing the funding per head. In that way universities are given the happy
> choice of losing money and enforcing redundancies or watering down their
> requirements._

This has also become true at many US universities.

You can't water down the requirements and maintain placement stats at the same
level. Many companies will simply pull the plug on recruiting and hiring once
they have a bad experience with lame recent grads.

The really sad thing: the universities take the student's money, and then
leave the kid unemployed at the other end. All that student debt is not
dischargeable in bankruptcy.

~~~
UK-Al05
I don't know how elite US universities were. In the UK very, very few people
went, and it pretty much guaranteed a middle or upper class life. One of the
reasons we boosted universities in the UK was that the USA was sending a lot
of people to university. However, it ended the guarantee of a middle or upper
class life.

They were the education path of the elite, but we let a few working class
smart people in via the grammar schools (elite state schools) to keep the
majority happy. Grammar schools selected at 11. The other way to get in was
private schools, if you had the money. My ex-school is now an ex-grammar
school because selection was banned for state schools. The people who did not
pass selection went on to vocational training at 11.

Note - Working class in the UK is roughly the same as lower middle class US.

The university system in the UK is still one of the best; we're a small
country and have many universities in the top 100. However, it's gone down.

~~~
wisty
In the old days, you could study at a library, go to private tutorials, then
take a bar exam or accounting exam to become a lawyer or accountant or doctor.
I doubt even the public service required a degree, as long as you could pass
their entrance exam. There were other requirements for the professions, (such
as an apprenticeship), but a degree wasn't always needed. (note, I'm not
certain of the details, that's just what some of my relatives have said about
how people used to cope without degrees).

The more people get degrees, the more things they are needed for. The lower
class were mostly freed by economic growth. In 1960, the UK GDP/capita was
about $3 a day, which is about the point where people stop worrying about what
they are going to eat, and worry about health, education, their career, and
where they are going to eat. If they needed to study at privatized training
centres, then pay for a professional board to assess them, they'd have done
that too, but the government responded faster than the professional societies.

~~~
wyclif
You're right, and it was the same in the other medieval institution he doesn't
talk about-- the Church. Unlike today, you didn't need a Master of Divinity
(M.Div) to be offered a job as a pastor or priest. You presented yourself--
and were usually sent-- to a bishop (or consistory or presbytery) as one
prepared, usually with Latin and Greek, to "read for Holy Orders." Often, the
student was taken into the home of the bishop (!) and ate all meals with his
family while being tutored in Greek, Hebrew, biblical exegesis, and Church
history. Before the advent of the theological seminary in the early half of
the 19th century in America, all ministers were educated this way. It worked
very very well up until recently.

I mention this because it parallels the University exactly. They changed the
system, and educationally we are all the worse for it because the most
important thing now is the credential rather than knowledge. To get the
credential you have to go into extreme debt if you happen to be poor. If you
don't believe that there's been inflation and the system has gone soft, just
take a look at a McGuffey's Sixth Reader. Shakespeare and Dryden at that
stage! Something very important has been lost, but most Anglo-Americans don't
realise it.

------
edtechdev
See also this recently published book for more on the problem of publishing in
academic journals that nobody reads: Planned Obsolescence: Publishing,
Technology, and the Future of the Academy

A free version is online:
[http://mediacommons.futureofthebook.org/mcpress/plannedobsol...](http://mediacommons.futureofthebook.org/mcpress/plannedobsolescence/)

Here are some quotes from an interview w/the author from
[http://www.insidehighered.com/news/2011/09/30/planned_obsole...](http://www.insidehighered.com/news/2011/09/30/planned_obsolescence_by_kathleen_fitzpatrick_proposes_alternatives_to_outmoded_academic_journals)

"Here are two ideas Fitzpatrick proposes to kill for good: Peer review is
necessary to maintaining the credibility of scholarly research and the
institutions that support it; and publishing activity in peer-reviewed
journals is the best gauge of a junior professor’s contribution to knowledge
in her field."

“Little in graduate school or on the tenure track inculcates helpfulness,” she
writes, “and in fact much militates against it.”

"But to the extent that individual academics continue in their lust for “power
and prestige” by vying for exclusive spots in elite journals, they should not
be surprised to find themselves as irrelevant and moribund — indeed, zombie-
like — as print monographs have already become, warns Fitzpatrick."

“If we enjoy the privileges that obtain from upholding a closed system of
discourse sufficiently that we’re unwilling to subject it to critical
scrutiny, we may also need to accept the fact that the mainstream of public
intellectual life will continue, in the main, to ignore our work,” she says.
“Public funds will, in that case, be put to uses that seem more immediately
pressing than our support.”

------
frou_dh
This aligns with my experience as a British student who received a first class
degree from a fairly well-respected program.

The reality was that the teaching was uniformly mediocre, I remained pretty
clueless about the subject material, and I produced so-so quality work. I
should have worked harder for my own curiosity, but there was a complete lack
of external motivation because the academic standards were so low.

To paraphrase someone's quote: I wouldn't recommend a club that had me as a
member.

~~~
rbreve
Woody Allen

~~~
morganpyne
I thought it was Groucho Marx? <https://en.wikiquote.org/wiki/Groucho_Marx>

Although apparently the sentiment even predates his usage of it by about 60
years.

------
sbarlster
I have experienced two degrees in two different universities in the UK. First
was a standard undergraduate course finishing in 1992, a joint BSc in
Economics and CS. Course content was fine (complexity, algorithms, compilers,
database design etc).

However, the student's attitude was poor. As an undergrad straight from school
I did not self-learn and did not study - generally not motivated. I left in a
recession, and with poor CS skills I could not find work, so I delivered
pizzas, worked the call centres, manned beach car park huts - all fun, however.

The second degree was a one-year MSc in Software Engineering - this time
focusing on the softer skills around methodologies, design, large-systems
modularity, etc.

This time the student was motivated. I read, studied, wrote C++ and Java
programs, delivered tasks on time and used my previous economics knowledge to
build a dissertation on neural networks analysis of wholesale electricity
prices. I left and went straight into a C/C++ job helping to build a mobile
phone network planning system.

On both occasions the tools, environment and time were available. Just the
student attitude differed. I saw what I wanted and went for it - just not the
first time.

------
cdcarter
Not just in Britain. The grade "scaling" or inflation is rampant in the US,
but not nearly as bad as the insane focus on research. The number one job of a
professor should be to teach the students of the college.

~~~
rubidium
Research is teaching; teaching the graduate students how to do research.

I agree with your sentiment that college profs should care about teaching, but
to be honest too much is asked of them. They're expected to be excellent
researchers, inexhaustible grant writers, engaging teachers, inspiring
mentors, and part-time entrepreneurs. Those who can achieve 4 of the 5 are
still impressive, and more often than not something's got to give.

~~~
Retric
In the US, 'research' generates money for a university. You apply for outside
grants and then fund your students, who in turn pay the university the vast
majority of that money. Depending on the grant, you may even pay for lab space
and equipment as well as a large chunk of a professor's salary.

------
chalst
"I've never worked out whether I was, in American terms, an assistant
professor or an associate professor."

Generally speaking, British lecturers are equivalent to associate professors,
and assistant professors are untenured.

Most post-docs in proper academic departments are assistant professors, but
this is not a hard and fast rule; post-docs in tenure-track programs nearly
always are assistant professors.

------
john_flintstone
I graduated from one of the top 5 British universities in the mid 90s with a
2:1 degree in History, and I have to say that it was one of the easiest things
I ever did. I had class and lecture time of 4-6 hours a week, and spent
another 8 hours reading and writing essays. The essay requirement, which
contributed almost half of the degree grade, was only about twelve 3,000-word
essays a year. I had so much free time that I turned my part-time job into a
full-time job, working 35 hours a week.

Of course, I should have worked harder, and I would have learned more if I
had, but I was 19 and 20 years old and it was just so easy to get a good
degree without working at all. I know I was far from alone in this.

Not every degree course is the same, and no doubt others may have worked long
hours to earn theirs, but my own experience has left me with no respect for UK
degrees, to the extent that when I read CVs from candidates I consider a
third-level degree irrelevant.

~~~
waqf
If it was so easy, why didn't you get a First? Sounds like you chose low
standards for yourself.

~~~
john_flintstone
I addressed that above:

"Of course, I should have worked harder, and I would have learned more if I
had..."

My point (which I thought I'd made clearly) is that it was far, far too easy
to do well, with little effort.

~~~
waqf
My point is that by your own admission, a 2:1 requires little effort and
therefore is not to be considered "doing well".

------
k-mcgrady
Two and a half years ago I made the decision not to go to University and so
far I think it was one of the best decisions I've made. I have learnt a lot
more through experience (I'm a freelance iOS developer). I have also learnt
about life quicker. After living with some of my friends who went to
University I quickly realised how easy they have it. A few hours of classes a
week (depending on the course) and very little studying outside of class. They
also have everything paid for them through student loans and grants. On the
other hand, I had to work hard and ensure my business succeeded or I wouldn't
make rent.

University in the UK has been something that is just 'what people do'. Most
people coming out of A-levels wouldn't even consider not going to University,
especially because High Schools push it so hard (as it makes them look good).
There needs to be more education in High Schools about the option of not
attending University.

~~~
UK-Al05
Learning how to program for iOS is such a small part of software engineering,
and probably does not even constitute computer science. Simply put, you don't
know what you don't know.

~~~
k-mcgrady
I don't mean that by learning to develop for iOS I have learnt what I could
have learnt studying at University. I think that everything I have learnt in
the last few years which I wouldn't have had I attended University has been,
and will be, vastly more important and useful.

I could be wrong, there are still many years ahead of me, but I think what I
have gained by not attending University is much more than what I would have
gained by attending it.

~~~
xenophanes
Don't mind the mean guy or the down votes.

If you want to learn fancy stuff, you can read books on your own time -- and
watch video lectures from universities and other sources -- and learn it
faster, more efficiently and more pleasantly than you would have by attending
school.

He's right that there are various important things you don't know right now.
But:

1) you might never need to know them, it really depends on your career
trajectory, life priorities, etc...

2) if you had gone to school, there would still be plenty of gaps in your
knowledge anyway

3) you can address gaps in your knowledge whenever you want without going back
to school. Non-school learning is possible and effective.

4) Learning advanced CS topics -- and really understanding what they are for
and how to use them and other useful stuff -- is a lot easier with some
experience as a programmer like you're gaining.

~~~
UK-Al05
I didn't mean to be cruel.

Self-doubt is most likely going to be the driving force for him to get through
the fancy stuff on his own. However, I want it to be known that learning to
program != CS.

Being confident in your own knowledge can often stop you from learning more. I
often doubt my own knowledge; I often feel worried when I meet someone else,
because they might expose me for the fraud I am. This keeps up the knowledge
hunger.

I was a young hackery type when I was a kid (I wasn't particularly academic
either), trying various programming languages (C, Scheme, Haskell), building
games (even 3D), exploiting software with buffer overflows, maybe some
malicious hacking (I was curious) and generally exploring computing. This made
me overconfident in my ability. Then the academic community completely showed
me up; they showed me how little I knew in terms of theoretical CS. Destroyed
my confidence. They don't even respect the skills I had; they're not academic
skills. This made me doubt myself, and catch up on the academic side of CS.
This taught me the lesson of being overconfident. It actually changed my
attitude to approaching other computer guys too, from "I'm the best" to "this
guy might know more than me".

This made me impulse-buy copies of Don Knuth's books =P. My maths skills also
received a serious boost when I realised those were needed too.

------
lhnz
This was my experience doing Computer Science at a British University, too.

Most of the modules which I took were so watered down that they were
absolutely useless to me. I knew this and I was pretty depressed at the time.
I'm not very good at doing mindless work: some of the lectures I just stopped
going to and other times I completely ignored the vacuous assignments I was
being asked to do.

Looking back I wish I had dropped out and gone straight into a job with the
programming skills I was teaching myself. (But I guess if I had done this I
might not have learnt about fundamental CS concepts.)

I actually love learning but like to do it in my own way, on my own accord.
I'm thinking of taking those online Stanford classes which are starting soon
-- I guess the only thing that is missing from these is human conversation. I
wonder if one day people might informally meet for coffee to discuss the
online courses they're taking together. ;)

------
kmfrk
If you want a longer, elaborate article, NY Books has a great article on the
current problems plaguing higher education:
[http://www.nybooks.com/articles/archives/2011/nov/24/our-
uni...](http://www.nybooks.com/articles/archives/2011/nov/24/our-universities-
why-are-they-failing/?pagination=false).

------
remyroy
This gives me a better insight as to why my CS degree is a piece of shit and
why I learnt not much more than I already knew or use on a regular basis.

I'm Canadian not British, but I do relate to everything that was said in this
article. I completed my degree in 2004.

Great read.

~~~
InclinedPlane
Most colleges do a poor job of providing CS education. Worse yet, CS is a poor
substitute for the education needed to do software engineering.

~~~
tsotha
We're finding candidates with a math, physics, or engineering background are
better prepared to write software than CS graduates. I don't really understand
why that should be the case.

~~~
arethuza
When I did a CS degree in the 80s we did roughly the same amount of maths as
engineering courses and all of the more difficult CS classes (generally the
more mathematical/formal ones) were mandatory - there was a relatively small
amount of choice and certainly no way to graduate without being a fairly
decent developer _and_ quite happy with formal abstractions.

Unfortunately, as the article describes, many CS courses have become "customer
focused" so are now, as far as I can see, attempting to become vocational
training courses, which universities are generally pretty awful at. When I
finished my undergraduate course (with a 1st) the only thing I felt qualified
to do was go into postgraduate research - which is pretty much what the course
was oriented towards, although this was only apparent in retrospect.

"Real" CS is irrelevant to 98% of development jobs. In my opinion anyone
believing that a CS degree will train them to be a good developer is going to
get a nasty shock.

Having said that, some of the very best developers I have worked with did have
the combination of awesome raw talent _and_ CS degrees (often PhDs). Of
course, I've also worked with some equally good developers (in their own way)
who didn't have a degree.

------
CJefferson
This is, in my opinion, entirely the government's fault.

The government decides how much to fund universities based on publication
quality, which they rate based on the journals the papers are accepted into.

There is almost no benefit to teaching students better, and there are huge
advantages to passing students who would otherwise fail. This is because
universities have a strict limit on the number of students they can accept,
and these are not replaced if students fail their first year.

So, to maximise income universities have to keep hold of students, while
getting as many papers as possible into high quality journals.

------
rluhar
I graduated in Computer Science in 2004 from a very well respected British
University. I graduated with a first. There is usually a very clearly
signposted path to getting a "good degree" without necessarily having to know
all that much core computer science.

My course was a four year course. The total weighting for each year was
10-20-30-40 (years 1-4). The first two years had non-optional, core CS modules
(algorithms, logic, discrete maths, etc.) and the final two years had a lot of
electives. If you could muddle through the first two years, you could take a
series of electives in the final two years (foreign languages, accounting and
finance, etc.) that were arguably much easier.

I got mediocre grades in the first two years, but good grades in the final two
years, resulting in a first class degree overall.

I regret my choices, but as a lazy undergrad I took the path of least
resistance to achieve my target (a first class degree). I was not the only one
who did this. The problem is that people like me made the university look
good, so I think they made it very easy to game the system. The only things I
worked really hard on were the programming assignments and projects. The exams
were easy to pass since they had a very clearly laid out pattern, and
questions tended to be repeated year after year. If you could solve the exam
questions from the 3-4 years before your final exam, you would probably ace it.

------
lambda
You know, I'm getting pretty sick of all of these "doom and gloom" stories
about the modern higher education system.

Yes, the modern higher education system is not ideal. But what, in life,
really is? That's not to say that we shouldn't pursue a better system, but we
shouldn't give up on a system just because it's not ideal.

And with all of these doom and gloom stories, I have yet to see anyone offer
an alternative. Yeah, there's a lot of paperwork involved in being a
professor, and you get evaluated on criteria that don't quite line up with the
ideal for being a great professor. But what would be better? How can we create
a better system? And if there's such an obvious better answer, why doesn't
someone do it?

If there is some obviously better system, I'd love to see it. If such a thing
exists, it should be quite competitive with the current higher education
system. No one wants to hire incompetent new graduates. No one wants to be
one. So we should see something better, something that indicates there is some
better way of doing this.

Instead, we see a steady stream of technological progress. I can do things
that no one could do before, like carrying a device around that allows me to
pinpoint my precise location, stream maps down to me, find me directions to
wherever I want to go, read those directions aloud in a synthesized voice, all
for the price of 2% of median yearly income (including hardware, software,
and the service). And that, of course, is not to mention all of the other
things that are available to me.

Now, maybe I'm living in a bubble, built by people who got a proper education
before all of this grade inflation and other nonsense. But really, this
article is complaining about the last 20 years. A large portion of the people
who are doing work in technical fields finished school within the last 20
years. And yet, we're still seeing significant progress; we are still living
in a world that is tumbling into the future at a high rate.

So I want to know two things. For one, why are we still progressing so
quickly, despite these apparent problems? How are we managing to innovate, if
our educational and academic foundation is so unsound?

And for another, what is the solution? What do you propose we do better? If
it's so much obviously better, why don't we do it? Or why doesn't someone,
somewhere do it, and show significantly better results?

I think one reason is that when you are doing work in the top few percent of
human ability, you look around and realize how ordinary it is. Even at the
top, everyone has their flaws. No one is perfect. Systems designed to prevent
people from cheating also prevent some people from doing amazing work. But
overall, it isn't a few geniuses at the level of "Mozart" that we need; it's a
lot of people doing work at a high level, but not what some might consider
"genius." If you are immersed in it, it seems somewhat boring, but when it all
adds up, it winds up opening new possibilities that were simply not available
10, 20, or 30 years ago.

~~~
HilbertSpace
> "But what would be better? How can we create a better system? And if there's
> such an obvious better answer, why doesn't someone do it?"

Here's a simple answer: The better system exists, but people overlook it.

As a student, first get a Bachelor's degree.

Second, pick a field and be sure you have learned it well, to at least the
Bachelor's level. Do this learning independently if necessary. A Bachelor's
degree is supposed to teach you how to do at least this much learning
independently.

Third, from that learning about the field you selected, learn some more, to
'the next level'. Likely do this independently.

Fourth, show up at any one of the better research universities and take the
Ph.D. qualifying exams based on what you learned.

Fifth, stick around that university and attend some seminars and courses that
are introductions to research given by experts in research. Here your work is
largely independent.

Sixth, pick a research problem and get some good results, independently or
nearly so. If there is any doubt about the significance of your research, then
publish it.

Seventh, submit your research as your Ph.D. dissertation.

Congratulations: You are now out of school; you went all the way; you are
educated. Done.

~~~
lambda
Hmm. I would love to follow this course of action. Can you point to evidence
that schools will allow you to take their qualifying exams and let you get a
PhD based on your independent research without being formally enrolled?

Also, as far as I can tell, most of the article is complaining about the
bachelor's level, which you assume as a given. For those who haven't done
that, do you have an alternative for that level?

~~~
impendia
As a math professor: why would you not formally enroll?

We have formal admissions procedures and the like, but this is not to screen
out by arbitrary criteria. Grad applications are screened by math professors,
not some stuffed suits somewhere who can't do trigonometry, and if you are
well prepared for grad study then it will show and you will be admitted. And,
typically, funded with a stipend (usually there are 10-20 hours of teaching a
week you have to do, depending on institution).

I think most professors would be happy to let you sit in on an advanced class
without enrolling. But doing a whole Ph.D. that way? _Perhaps_ it is possible,
but I can't imagine why anyone would, and I don't know of anyone that has.

~~~
HilbertSpace
The question was, could a student without a Bachelor's just show up at a grad
school, offering to take the Ph.D. qualifying exams and believing that they
are well prepared, AND, without a Bachelor's degree, be permitted to do so? In
particular, to take the qualifying exams, would they have to be 'enrolled' and
would that be possible without a Bachelor's degree? And, although not said,
maybe the student needs financial aid and hopes to get it based on their good
qualifying exam performance.

So, they are ready to take the exams. But they have no Bachelor's, are not
enrolled, without a Bachelor's would likely not be accepted or enrolled, and
need financial aid. So, can they take the quals? If so, then how? That was the
question. I suggested maybe an Associate's degree, some really good GRE scores,
and offering to take the quals BEFORE applying for admission.

------
shanemhansen
I experienced how poorly CS classes prepared students for jobs in software
engineering (I do realize they're not the same thing, but that's obviously the
main degree we look for). I interviewed people who have a master's degree with
an emphasis in Java, yet they were unaware of even the simplest details about
how the JVM works (implementation details of the String class, JIT
compilation).

I felt bad for this person; I wonder if it's too late for them to get a refund
on that degree, because it sure as heck didn't increase their earning
potential.
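For concreteness, one of those "simplest details" is the JVM's string pool: compile-time string constants are interned and shared, while `new String(...)` always allocates a fresh object. A minimal sketch (an illustrative example, not the actual interview question):

```java
// InternDemo.java - JVM string-pool behaviour per the Java Language Specification.
public class InternDemo {
    public static void main(String[] args) {
        String a = "hello";
        String b = "hello";             // same compile-time constant: shares the pooled instance
        String c = new String("hello"); // explicit construction: a fresh object on the heap

        System.out.println(a == b);          // true  - both refer to the interned literal
        System.out.println(a == c);          // false - different objects, so == fails
        System.out.println(a.equals(c));     // true  - identical character content
        System.out.println(a == c.intern()); // true  - intern() returns the pooled instance
    }
}
```

A candidate who has only ever compared strings with `.equals()` by rote, without knowing why `==` sometimes appears to work, is exactly the gap being described.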

~~~
ugyuyguy
I too have a degree in CS, which required me to 'play' with Java. Most
students on the course did not grasp the basic concepts of Strings and ints,
let alone JIT. Most left with the ability to "avoid programming and the CLI at
all costs" and are still working in jobs they could have done without a
degree, paying £16-21K with zero career progression and/or training.

------
ig1
Dupe: <http://news.ycombinator.com/item?id=737616>

------
RyanMcGreal
> As the Chinese say, I have lived in interesting times.

Nitpick: this is not a Chinese saying.

[http://en.wikipedia.org/wiki/May_you_live_in_interesting_tim...](http://en.wikipedia.org/wiki/May_you_live_in_interesting_times)

------
noderivative
Something struck a chord with me when, in an online course about Machine
Learning, a voice said "you don't need to know what a derivative is ..."
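The remark jars because the derivative is not an optional extra: even the most basic optimiser in machine learning, gradient descent, evaluates one at every step. A minimal sketch (a made-up example, not from the course in question), minimising f(x) = (x - 3)^2 using its derivative f'(x) = 2(x - 3):

```java
// GradientDescent.java - the derivative drives every update step.
public class GradientDescent {
    public static void main(String[] args) {
        double x = 0.0;              // starting guess
        double learningRate = 0.1;
        for (int i = 0; i < 100; i++) {
            double gradient = 2 * (x - 3); // f'(x): the derivative "you don't need to know"
            x -= learningRate * gradient;  // step downhill, proportional to the slope
        }
        System.out.println(x);       // converges towards the minimum at x = 3
    }
}
```

You can run the loop mechanically without the calculus, but without knowing what the gradient *is* you cannot reason about learning rates, divergence, or why the updates point where they do.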

------
lgeek
Final year CS student in a British University here.

I can certainly see where the author's coming from, although I don't find the
situation this dire. I'm not British so I don't know that much about how
Universities worked and were perceived in society in the past and I guess I
might have a slightly different mindset. Anyway, let me explain myself.

We have a modular course structure at my school. Yes, you can pick courses
varying from "Developing web applications with Java" to hard CS stuff like
compiler design and advanced algorithms. As far as I can tell, no course has
been dropped because it was too hard - because students like challenges and
take them. The same goes for final-year projects: I've seen a student say
that she won't be doing any programming for her project (yeah, WTF), but I
also know of more serious engineering projects (like a guy refreshing the
electronics and, more importantly, the software of a popular home-built 3D
printer - and using the new capabilities to do some stuff I'd really like to
have on my own 3D printer), and more experimental projects like mining
Twitter for medical drug information (perceived effectiveness, side effects,
usage patterns, etc.).

What I'm trying to say is that there might be some easy paths you could take,
but the students who always pick the easiest option are usually the ones who
end up failing or dropping out. Sure, some of them graduate, and I have mixed
feelings about having the same degree as some of my fellow students. The
author says that 'By pre-1990 standards about 20% of the students should have
been failed.' Well, in my school about 20% of the students are failed - each
year.

Another topic is grade scaling. Yes, most lab and coursework grades are
scaled in the first year, and some in the second year. Exams are never
scaled! But here's the thing: scaling is always down. It can be argued that
labs are too easy if you need to scale the grades down, and it is frustrating
to do a perfect job and end up with a seventy-something percent mark. But
grades are never scaled up to 'turn a fail into a II'.

Finally, some people argue that a formal CS education is useless and out of
touch with reality. I definitely don't agree. Knowing algorithms and data
structures can give you an edge even in simple programs; knowing that certain
research areas and approaches even exist helps you avoid a lot of easy
mistakes; labs help you design better and faster because you develop your own
process and get to know common pitfalls; reports and presentations train you
to communicate better, using the proper domain language; and having a clear
image of how computers work from the ground up is great when you're
debugging.
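
The data-structures edge shows up even in throwaway code. A small sketch,
assuming nothing beyond the JDK: the same contains() call costs O(n) on an
ArrayList but expected O(1) on a HashSet, which is the kind of thing a
formal course makes second nature:

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class MembershipCheck {
    public static void main(String[] args) {
        List<Integer> list = new ArrayList<>();
        Set<Integer> set = new HashSet<>();
        for (int i = 0; i < 100_000; i++) {
            list.add(i);
            set.add(i);
        }

        // ArrayList.contains() scans linearly: O(n) per lookup.
        long t0 = System.nanoTime();
        boolean inList = list.contains(99_999);
        long listNs = System.nanoTime() - t0;

        // HashSet.contains() hashes straight to a bucket: O(1) expected.
        long t1 = System.nanoTime();
        boolean inSet = set.contains(99_999);
        long setNs = System.nanoTime() - t1;

        System.out.println(inList + " " + inSet);
        System.out.println("list: " + listNs + " ns, set: " + setNs + " ns");
    }
}
```

Wrap either lookup in a loop over thousands of queries and the asymptotic
difference stops being academic very quickly.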

TL;DR: Formal CS education still useful, just more chances to shoot yourself
in the foot. (Yes, I consider getting a first without learning too much to be
shooting yourself in the foot.) Study the fundamentals and the hard stuff and
you should be better off than a self-taught person.

~~~
arethuza
It's more than 20 years since I graduated with a CS degree and I think my
advice these days to anyone considering the subject would be to avoid CS
_unless_ you want to go on and do research level work in academia or industry.
Of course, that's what I wanted to do before going to University and I did end
up working as a researcher in academia for six years before co-founding a
startup.

People keep thinking that CS degrees are vocational training programmes for
developers - they didn't use to be, and if that's what they have turned into,
then it's no wonder they are doing a terrible job.

~~~
wantrepreneur
Then what do you suggest wantrepreneurs should study, if anything? I think
the question of whether CS school is useless has been debated far too much. I
do see value in developing the way of thinking and in surrounding yourself
with top people, provided you do manage to get into a top college. And this
whole debate about the very high prices of the colleges is ridiculous,
because even for a person coming from a really poor family like me, there are
enough possibilities to get free or very cheap education at most of the
world's top 20 CS schools, if they believe you're a worthy enough candidate.

~~~
arethuza
Personally, if you can get into one of the very top universities with a track
record of being the places successful startups come from, then I'd
_strongly_ recommend it.

~~~
wantrepreneur
Would you mind sharing which those are in Europe? Because I've read quite a
lot of founder stories, and the only college I'd put on that list is
Stanford. UIUC also seems to have a lot of tech entrepreneur and VC alumni.

~~~
arethuza
I'm not sure if there are any in Europe! [NB I'm in the UK]

Note that I'd love to be corrected on this point - I graduated in '88 and co-
founded a startup after working in academia for 6 years and the help we got
from the University we worked for was laughable (they asked if we wanted to
lease a building!). However, we did meet our first angel investor through the
university - he had done the same CS degree as me about 20 years earlier.

------
iRobot
The conclusions are unfortunately correct: it is the student who is being
made to pay for this policy. Bad course modules and bad degrees devalue the
whole system and do NOT help those who scrape through but are then unable to
carry the required level of skill through to the workplace.

As an employer who has interviewed and employed countless computer-degree
candidates, I can say this has simply devalued the word 'degree' to the point
where it may no longer get you even to the interview ahead of, say, a
non-degree candidate with a tangible track record of real project work or
experience behind them.

This is not just a British phenomenon.

~~~
UK-Al05
What are you looking for in a candidate? Are you looking for computer science?
Or Software Engineering?

I hear of interviewers expecting CS graduates to know the MVP pattern or
similar things, and being shocked when they hear they don't. That's software
engineering, not computer science.

~~~
frou_dh
If a candidate has applied for such a position then I don't think it's
unreasonable to expect that they've tainted their pristine porcelain skin with
-gasp- a few run-of-the-mill software projects.

------
ugyuyguy
The SHOCKING truth about a poorly organised educational system run by a
hugely unsuccessful government. Living standards in the UK are good due
partly to the NHS and private-sector jobs, but the corrupt government
attitude of take, take, take yet give back little is turning this country
into a 'toilet'.

