
Why I am Not a Professor, or The Decline and Fall of the British University - edw519
http://www.lambdassociates.org/blog/decline.htm?
======
tybris
> By pre-1990 standards about 20% of the students should have been failed.

Interesting, I find that number quite low.

The failure rates for most CS courses at my BSc/MSc university (VU Amsterdam)
usually hovered around 50% in the post-00s era, especially for the hard-line
courses such as computer networks, finite fields and data structures. We were
allowed an unlimited number of retakes (six months apart), but still over a
third of the students would drop out before completing their BSc.

When I started my PhD at a British university I was obliged to do some MSc
courses. Basically every course started with: "No one will fail this course."
At first that seemed like a comfortable idea, since I didn't want to be
distracted from my research. Soon I learned that this actually means the
course is going to be boring as hell.

I don't believe the Dutch system works better than the British one. We've had
our share of choice fanatics. And despite Andy Tanenbaum raking in millions,
my current department is a whole lot better funded than the VU.

I think it has more to do with attitude than politics. Tanenbaum basically
created the department with his bare hands, writing Minix and his well-known
books in the process. He's no longer actively involved in teaching, but he
created a culture of outstanding education. There's still an enormous amount
of attention on making courses more interesting, challenging and up-to-date.
It's easy to get lazy and lower your standards, but not impossible to say no
and become a better university.

~~~
BrentRitterbeck
This is the result of the university becoming more of a business than anything
else. If you start flunking people out, you lose revenue.

~~~
I_got_fifty
It might be the socialist in me, but schools should always be public. No
business should be involved in educating the next generation.

I think of universities more like an R&D department for the general public
(which is also why I think all software developed in universities should be
Free).

~~~
miked
Based on what data?

Almost every country in the world is more socialist than the (former) U.S.
None of those countries' university systems come close to that of the U.S. And
what are the best universities in the U.S.? MIT, CalTech, Harvard, Stanford,
Yale, Princeton, Dartmouth, etc. (private) vs. UC Berkeley and UCLA (public).
The latter two are soon to feel the financial collapse of California.

~~~
jacquesm
That may have something to do with the immense defense budgets the US has; a
portion of that money goes towards research in universities.

------
imack
To be fair to the time: in 1999 the dot-com bubble was in full swing, and a
lot of students were going into CS for the wrong reasons, so there could have
been more at play than government objectives. I suspect there were a lot of CS
students who did not enjoy programming in such programs, thus accounting for
seniors who couldn't do projects. I know CS and programming are not the same,
but I liken it to studying Shakespeare and not enjoying reading or writing.

Our school (UWaterloo) noticed a huge drop in CS enrollment soon after that,
between 2002 and 2004 (roughly). At which point business programs saw a spike
in enrollment. I wonder whether, had this professor stuck around, he would
have noticed an improvement once the stories of 24-year-old billionaires were
removed from the headlines.

~~~
Locke1689
I am an American, so forgive me, but I was under the impression that UWaterloo
was regarded as one of the best, if not the best, CS schools in Canada. I
would argue that many of the more elite universities have managed to beat back
this trend for the most part, as I do not believe that Stanford or MIT are
graduating substandard computer science students.

~~~
0wned
I agree with this statement. Other schools (perhaps not as highly regarded as
the ones you mentioned) such as Texas A&M do hardcore, low-level C++ courses,
compiler stuff, etc. for both CS and CE students as core requirements. They
lightened the requirements several years back to compete with other schools,
but changed back to them when they realized how soft their students had
become. Of course, Bjarne Stroustrup is the chair of the CoE CS program at
TAMU... so that may have a lot to do with it.

------
blasdel
Just yesterday he announced that he's quitting his Lisp dialect, computing in
general, and Britain:
[http://groups.google.com/group/Qilang/browse_thread/thread/5...](http://groups.google.com/group/Qilang/browse_thread/thread/592773c562017d87)

------
ivanyv
That was very interesting.

I live in what you'd call a 3rd-world country (Mexico), and this stuff makes me
think about why empires fall and others rise...

The rising stars get to watch how an empire falls ;-)

One of my dreams is to open a school that's much better than most of the
schools in Mexico, using the good and bad examples of schools in countries
like the US and the UK. My 2 cents to leave to the world.

~~~
jacquesm
Your dreams are definitely worth pursuing!

~~~
dangoldin
And probably a lot easier in the modern day as more and more people get access
to the web. I'm very interested in web learning and am trying to put something
together for it too.

~~~
jacquesm
I've had a long conversation with someone who is very much into this; there
are some challenges, but it definitely is possible.

Already there are tons of lectures online in flash video format. It's amazing
what you can learn from them, but to actually make lesson material and tests
work in an online environment is far from simple (cheating, for instance!).

~~~
dangoldin
I imagined something along an entirely different approach. Putting videos
online is just doing it the same way over a new medium. I am thinking of a way
to take advantage of what the web has to offer.

We'll see how it goes.

~~~
jacquesm
Yes, I got that, sorry for not being clearer. The videos are a simple starting
point, there is no interaction and so on, it's just like watching TV (only
worse quality...).

The subject of the discussion was basically a complete virtual education from
grade-school level all the way up to university level: interactive lectures,
web-based self-study units, the works.

We realized that this is a very complex undertaking and that it would require
fairly massive funds to be executed properly as well as an enormous amount of
expertise.

~~~
harkain
It might be worth looking at <http://www.khanacademy.org/> for an effort in
web learning. At this point it's mostly short YouTube videos (quite a few of
them), each concentrating on one concept, spanning math, physics, economics
and some others. You can also create an account and get a personal "map" to
see how you are progressing (where you start from basic concepts and build on
them).
The guy in charge wants to have videos about every possible subject and have
one such knowledge map for them. Eventually, I think he also wants to have
online tutors helping individual students. I realize my
explanation probably doesn't do it any justice so I instead refer you to
<http://www.youtube.com/user/khanacademy> (the video in there) which should
give a better overview of it all. There is also a longer video somewhere that
has even more details.

I just thought I would share, maybe someone here will like it or find it
useful.

------
0wned
We can't generalize. Not all students are like the ones described in this
article. I work with CS and CE graduate students 40 hours a week. I help them
conduct research. The best of these students have an eagerness and desire to
learn. They work on projects at home in their spare time, they learn new
languages (just for the sake of learning them), etc. They _want_ to learn C
and C++. They love it.

Since I work directly with these kids and have seen dozens come and go, it's
easy for me to quickly pick out the good ones. They are also the most fun to
work with, as they teach me as much as, or more than, I teach them.

I see the average/bad ones too. The ones who do just barely enough to get by.
The ones who "code by Google". They are far more common than the good ones.
Their work speaks for itself. Poorly written, plagiarized and seldom delivered
on time.

Hopefully employers have methods in place to weed out the bad and average
kids. And incentives to hire and, _more importantly_, keep the best ones.

Just my 2 cents.

------
javanix
And here we have the reason why just about every single CS department in the
US teaches Java instead of C or Scheme.

~~~
scott_s
I think comments like these are borderline trolling in a community like HN.

The languages used in various parts of a CS curriculum are one component to
the whole. Holding up that component as the reason that the sky is falling is,
at best, disingenuous. I can construct terrible CS curriculums that start with
C, and terrific ones that start with Java.

If those who construct the curriculum want the beginning courses to focus on
algorithmic thinking, I think it's fair to use a language that abstracts away
much of the physical machine. The abstractions can be peeled away in later
courses.

If, instead, they want the beginning courses to focus on the realities and
difficulties of dealing with computer systems, it makes sense to start with
something like C. They can then introduce the abstractions that let people
manage those difficulties in later courses.

I think both approaches are valid, as long as a student gets a view of the
important points of the field. I can even see arguments why one approach might
be better than the other. But claiming that one approach represents the
failure of our CS academic system is zealotry.

~~~
philwelch
"If those who construct the curriculum want the beginning courses to focus on
algorithmic thinking, I think it's fair to use a language that abstracts away
much of the physical machine. The abstractions can be peeled away in later
courses."

I don't know that Java does this all that much better than C. The problem with
C for a beginning student isn't so much that you have to manage memory
manually--it generally takes a few weeks to even get to malloc() in a C-based
introductory course--but that C gets in your way with explicit typing,
#includes, etc. Java does away with some of that but introduces its own OO
scaffolding to get in your way too. Instead of having to write main()s and
#include's, the Java student has to enclose their functions in a class and so
forth. Let's compare Hello World in C, Java, and C#.

C:

    
    
      #include <stdio.h>
    
      int main(void)
      {
        printf("Hello, World!\n");
        return 0;
      }
    

Java:

    
    
      class HelloWorldApp
      {
        public static void main(String[] args)
        {
          System.out.println("Hello World!"); // Display the string
        }
      }
    

C#:

    
    
      class Hello 
      {
         static void Main() 
         {
            System.Console.WriteLine("Hello World!");
         }
      }
    

The Java and C# examples are even more cluttered than the C example when it
comes to superfluous tokens: each has a class declaration, an unnecessarily
elaborate method signature, and a print command with about three levels of
object drill-down in it. When you get to the simple procedural programs that
a beginning student will write, this mysterious crud remains unresolved for
longer. It's not enough to explain typing as you would in C or Pascal, but you
have to talk about object-oriented programming before you get into problems
complex enough to justify that level of abstraction.

If you _really_ want an abstract language to enforce algorithmic thinking,
pick one that doesn't have all that extra mental burden when you first
approach it.

Perl 5.8

    
    
      print "Hello World!\n"
    

Perl 5.10

    
    
      say "Hello World!"
    

Python

    
    
      print "Hello World!"
    

Ruby

    
    
      print "Hello World!"
    

The cool thing is that these languages still _have_ subroutines and classes
and so forth, but they don't _force_ you to declare a class, declare a
subroutine, and call an object method just to code "hello world".

Java has advantages over C. These advantages don't include "letting beginning
programmers focus on algorithmic thinking by using high level abstractions".
Java's higher level than C in that it protects you from naked pointers and
lets you do OOP, but that's not the type of high-level abstraction that helps
a beginning programmer, especially not when it comes at the cost of forcing
them to put everything in classes and methods.

If those who construct the curriculum want the beginning courses to focus on
algorithmic thinking, I think it's fair to use a language that abstracts away
as much as possible. We have no shortage of good interpreted languages to
accomplish this.

~~~
Goladus
Actually I'd argue that with C, you have to start managing memory manually
before you even get to malloc. The abstraction advantage of Java over C (not
that I think Java is necessarily a better intro language) is that you can
generally explain the syntax in abstract concepts and then use it like you'd
expect. With C, you're far more likely to encounter scenarios that don't fit a
simple model of understanding.

For example, unless you're concerned with specific performance issues, you're
not likely to care how a string is implemented in Java. It's difficult to use
strings in C without understanding memory. Without knowing when strings are
mutable and when they aren't, what null-terminated means, how "%s" works, and
so on, you will quickly run into unexpected behavior and will likely just
trial-and-error your way to something that seems to work. Once you understand
that C is a syntax for allocating and manipulating memory, it tends to make a
lot more sense.

~~~
Retric
Yea, the advantage of Java over C is the second program.

for (int x = 1; x <= 100; x++){ System.out.println("Hello World! " + x); }

vs

for (int x = 1; x <= 100; x++){ printf("Hello World! %d! \n", x);

PS: How do you include readable code snippets on HN?

~~~
Goladus

        Prefix four spaces to a line to format code.  
    

Also, you're missing a right curly brace in the second example.

------
bkovitz
Certainly, there's loads of stupid "research" in academia—cranking out papers
that interest no one, in order to justify tenure and more grants.

However, "90% of everything is crap". The relevant question, before you throw
out the entire institution, is: Are there real opportunities to do good work?

Good work takes two main forms in academia: research and teaching. Is good
research work being done? Is good teaching being done?

~~~
jerf
"The relevant question, before you throw out the entire institution, is: Are
there real opportunities to do good work?"

I disagree. The relevant question is, "What is the best possible way to do
good work, including practical concerns taking into account the behavior of
real people and not just theoretical people, and is that what we have? If not,
how can we get there?"

The way you ask the question is basically the same thing as the sunk cost
fallacy. Yes, we've got a lot of investment in the current system, but if it
isn't working, it's time to change it to something that will. Whatever that
may be. And, again, the question is whether it is working with real people,
not hypothetical people who are custom-designed to work with the system you
want to be ideal.

~~~
nostrademons
The best is the enemy of the good. I'd shorten the question to "Can we do
better?"

~~~
jongraehl
I'd rather start with pondering "what's the best way (that I can
imagine)" and then move to "what's the surest route to improvement".

I don't think it's usually the case that someone who asks "what's the best
...?" is really _only_ interested in things that are absolutely the best.

------
fauigerzigerk
He does have a point, but something is missing in most such lamentations about
watered-down academic standards: in the past, education was not meant to be
vocational training, and therefore the economic case for broadening access
just wasn't there.

Knowing classic literature was not (functionally) why upper class kids of past
times went on to earn much more than their lower class peers. It was just part
of the symbolic glue that allowed members of that class to recognise each
other.

Throughout history, access to good education was by birth, not by talent.

Most analyses featuring Mozart are flawed. It's probably because nobody is
quite sure whether the purpose of education systems is to find those rare
individuals with innate genius, or rather to teach a large number of
reasonably intelligent people something so they are more capable of solving
particular problems as a result. I don't think there is one approach that is
suitable for doing both.

Mozart is often introduced into the debate by those demanding more stringent
standards of admission. However, if it's about finding the Mozarts of the
world, radical and indiscriminate broadening of access must be the top
priority.

It's the nature of innate genius that it occurs just as readily in illiterate
African street kids as in the offspring of English professors. Taking on a
large number of totally unprepared kids and exposing them to interesting stuff
is probably the most beneficial approach to finding the Mozarts. Demanding
good preparation and high entry standards is a social filter, not a talent
filter.

So I think universities must commit themselves to actually teaching students
something instead of whining about low entry standards. I fully understand
that the author of the article does more than that, and I agree with his other
points.

But let's face it, universities are there to bring as many people as possible
to a high level of knowledge and skill. The goals are largely economic ones
and that should not be lamented. They are no longer the kind of upper class
culture club they once were, and they are not primarily an endeavour to find
the Mozarts and Einsteins.

~~~
Goladus
There are lots of problems with the Mozart analogy. For one, Mozart was
trained by his father. His education in music began at a very young age (which
is common among many if not most of the highly successful classical concert
musicians in the world).

For another, despite the fact that Mozart's music was creative (I've loved
every Mozart piece I've ever heard performed well, and most of those that
haven't been), from a superficial perspective many of his pieces are also
remarkably similar. If you aren't ready to appreciate subtle differences you
aren't ready
to appreciate Mozart for many of the reasons his music remained popular for
200 years after his death.

Which brings me to the next point: he treats Mozart's music as existing in a
sort of historical vacuum, and doesn't bother to compare Mozart's work to the
work of his contemporaries [1]. Most of his
contemporaries have been forgotten by the mainstream, for various reasons, but
while they were alive they still produced a lot of music.

Finally, while there is creativity in science and one can take a scientific
approach to creating (or "discovering") music, it's difficult to argue that
scientific rigor is not more important in a paper about Algorithms for Mesh
Analysis than in a Piano Concerto in E-flat Major.

[1] Wikipedia lists 64 composers in Mozart's era:
[http://en.wikipedia.org/wiki/List_of_Classical_era_composers...](http://en.wikipedia.org/wiki/List_of_Classical_era_composers#Late_Classical_era_composers_.28born_1750-1770.29)

~~~
Goladus
It's probably worth noting that the point of the article doesn't rely much on
the Mozart analogy.

------
sasamat
I can't comment on the US experience, but as a student in the UK in the 80s
and Canada in the 90s I concur with Mark Tarver.

This is not just a CS issue, it's universal and the real damage is being done
in the second tier establishments where grade inflation and dumbed-down
courses are endemic. The tier one establishments (Oxbridge, Ivy League etc) can
still use their own screening methodologies for aptitude and smarts to
minimize the problem. As a result---and to continue Tarver's analogy of the
Cultural Revolution---the 'party' is looking after its own whilst the rest
goes to rot.

~~~
cabalamat
> This is not just a CS issue, it's universal and the real damage is being
> done in the second tier establishments where grade inflation and dumbed-down
> courses are endemic.

It's worse than that, because the dumbing down doesn't just affect
universities. Secondary school science education has also been badly hit. The
links that follow contain actual questions from GCSE science exams (note for
non-British people: a GCSE is an exam typically taken by 16 year olds).

For example, in a biology exam, a question asked whether you see with your
eye, ears, nose or mouth -- [http://cabalamat.wordpress.com/2009/03/30/do-you-see-with-yo...](http://cabalamat.wordpress.com/2009/03/30/do-you-see-with-you-eyes-ears-nose-or-mouth/)

Here's a physics exam that's not quite as absurd -- [http://cabalamat.wordpress.com/2007/08/31/gcses-are-dumbed-d...](http://cabalamat.wordpress.com/2007/08/31/gcses-are-dumbed-down-and-getting-worse/)

~~~
IsaacL
"For example, in a biology exam, a question asked whether you see with your
eye, ears, nose or mouth -- "

No it doesn't. The question is "which organ contains light receptors?", which,
although simple, does require some knowledge of scientific jargon. And this
is on the foundation paper, for which the maximum mark is a C - IIRC, the
intermediate and higher papers don't include questions this easy.

This type of question is not for A-Level candidates, it's for distinguishing
between the weaker students. Of course it's worthwhile keeping track of what
kind of questions appear in exams as a benchmark of educational standards, but
only looking at the worst examples you can find does not give a clear
snapshot.

~~~
cabalamat
> although simple, does requires some knowledge of scientific jargon

Knowledge of jargon isn't the same as knowledge of science.

The problem is not so much that the questions are too easy, it is that _they
are not science questions_, because they don't test knowledge of scientific
concepts.

It would be easy to ask questions that are science questions but that are also
easy questions (for less able or younger examinees).

For example, in physics one could ask: A man went to the top of a tall
building and threw a glass from it. The glass landed on concrete 30 m below.
What happened to the glass when it landed? He then dropped a rubber ball; what
happened to the ball when it landed?

Or a simple biology question: A woman wanted to breed striped cats. She had a
male cat and a female cat. Both cats were coloured black all over. She painted
white stripes on both cats, then got them to have sex. Is this likely to
produce striped kittens?

These are very easy questions, yet to answer them one needs to understand
important concepts about science. Teaching science is about teaching concepts,
not about rote learning of definitions. Unfortunately the bumbling
incompetents who're in charge of education don't seem to understand that.

------
DrJokepu
I had very similar experiences as a Computer Science undergraduate not many
years ago at a prestigious British university. The expectations were so low
that eventually I stopped attending lectures and got a full-time programming
job and I still kept getting quite good grades. The problem is so widespread
that it's not just about undergraduates; as an undergraduate I have met
several PhD students who struggled with even basic programming or mathematical
concepts.

Still, in a class of 300 people, there are about 10 very talented and bright
students who would rather be (and are not afraid of) writing Prolog
interpreters and doing similarly interesting and cool stuff. There should be a
way for these students to learn advanced concepts while being mentored by
professors.

------
springcoil
Yeah, I worry about the fact that this discussion became an emotional one
about programming languages. There is a lot of evidence that students learn
Java because of the job market. I worked in a school as a Physics teaching
assistant for a year (while at grad school), and the huge problem in Ireland,
the UK and the USA is that people assume that students know best about choice
etc. The fact is students are rubbish at telling what they'll enjoy, and most
will shy away from the harder sciences. But the responsibility is to keep up
the standards of academic rigor. Some wise young Cambridge scholar said to me
recently, 'When you audit something you change the standards and the aims of
it. League tables made schools focus on results, modularization makes students
focus on results, the job market makes people look for certain skills' -- and
that is true. My most useful course in Philosophy at university was a course I
despised at the time. Perhaps we need to just let professors set the courses,
or we end up with the scenario that scares me the most: Physics graduates who
don't understand what a partial derivative is. A CS graduate who doesn't know
what the lambda calculus is, is perhaps the same?

------
jlees
A really interesting read and very much echoes some of the reasons I chose not
to pursue a career in academia. When you realise you're at a fork in the road,
and down one route lies being a supervisor's paper-churning-out slave,
researching something that doesn't even interest you so the department can get
more funding...grnnnh.

Edit: To clarify, this was _my_ problem, as a PhD student with a PhD
supervisor (not sure what the US term is; adviser?) who was forcing me to do
the wrong thing all in the name of academia.

~~~
brent
I wouldn't say that is an entirely accurate characterization. I don't know
anyone in my department doing research that doesn't interest them. Students
generally love the research they are doing and are probably a bit more
autonomous than your comment seems to imply. I've taken both directions
(industry then academia) and in industry I was a code-churning-out slave
working on something that barely interested me. Now I work on my own on a
topic that really interests me!

~~~
amichail
Wouldn't you say that pursuing a startup is better than both industry and
academia?

~~~
tome
Not if you enjoy really theoretical work which is far away from the money.

~~~
amichail
True, but the number of such people is very small.

------
b-man
I think most people are missing the point here. What he is saying is that CS
(and the university model itself) has stopped being a correct vehicle for
preparing students for a technical society.

I happen to agree with the guy. The model is broken, and we should move on to
something much more libertarian, where individuals are in control of their own
learning.

~~~
bkovitz
As a completely self-taught software engineer, I'm inclined to agree. Except
for one big doubt: A large percentage of people do not excel on their own.
They rely on social standards to specify what counts as "good enough" and to
provide them a roadmap through the learning process.

Of course, that's what vocational schools are for, not universities.
Universities got confused when they started pretending to offer job training.

------
IsaacL
Should be "The Decline and Fall of _a_ British University".

The only thing this article tells us is the quality of the students that study
at Leeds. It's a bit of a stretch to extrapolate from the experience of one
department at one university to claim that the same is happening everywhere.

~~~
coolestuk
His experience at Leeds is not uncommon.

I originally graduated from one of the better known universities in 1983, and
returned to post-graduate study and teaching in 1989. I experienced an even
worse scenario then he - teaching as a part-time temporary lecturer at several
of the 3rd tier universities (polytechnics).

Even when students turned in work that was totally incoherent (unfinished
sentences and half the size of the minimum word count), I wasn't allowed to
fail them. This was on modular degree courses, where most (if not all) of the
assessment was of such submitted work.

For years I witnessed half the students turn up to class with no
recriminations against the non-attendees, and with only half of the actual
attendees having read the 1 or 2 articles which were required reading for that
class (photocopies of which they had been given previously, so they didn't
have to find the original journals themselves).

The overall level of education and comprehension was appalling. I realised
that pursuing such a career was a path to frustration.

In 1997 I decided I couldn't be part of this sham any more. I have friends who
had gone through the same experiences as I, and they too decided to quit and
find new careers. The story of one of them was even written up in a national
newspaper ten years ago.

I have other friends who teach only at post-graduate level. They are shocked
by the lack of basic maths, basic grammar and basic essay-writing skills of
the graduates they teach. And in their institutions they too provide the
students with all the reading matter, so that these post-graduate students
don't have to find their way round a library or find out anything for
themselves.

Degrees from UK universities are now meaningless, and it is only a matter of
time before the wider world finds out. What will probably happen is that
employers and/or students will realise that in the majority of work-related
degrees the degree itself doesn't indicate anything in terms of skills or
competencies.

Several of my nieces and nephews have gone to university. Whilst I went there
to study, their principal purpose is to party.

My partner is Thai, and I know several of his family and friends who graduated
from Thai universities. Some have completed post-graduate qualifications. Yet
they would never read a book that was more complex than Harry Potter (even in
Thai). Nor would they ever go to a museum or art gallery.

Degrees have just become something that most people do either as a way to move
out of home safely and/or a stepping stone to a job. For most graduates there
is no sense of education being important in itself.

~~~
IsaacL
Thanks for the reply.

Just looked up Leeds on Wikipedia - I'd assumed it was a fairly mid-range
institution (and so I thought there'd be other institutions the same or worse,
but many which were better). Just found out it's in the Russell Group. Christ.
I (partially) retract my earlier response.

I study CS (just finished first year) in the UK at an institution that usually
places in the top 10... now you mention it, a lot of students do match those
described by you and others. However, it's easy to ignore these people, since
they never go to lectures (I have friends on other courses who say they attend
on average one lecture a term...). It's trite, but with university, I feel you
get out what you put in.

The opportunity to mix with the people who are bright and motivated (and they
do exist) is great, as is having a library full of free programming books. For
every student who only goes to party, there's one in the CS building late at
night writing a Scheme parser in Haskell. (OK, the ratio's probably more like
3:1, but you get my drift).

On the other hand, I feel like I've learned about as much from pursuing
independent projects as I have from my course, though the resources on
campus make such projects easier.

BTW, a large part of me is thinking about dropping out of university at some
stage and starting a startup - read too much pg - so I've been thinking about
the value of university lately.

CS is a bit of an odd subject in this regard - with a lot of subjects*, I'd
guess a majority of students don't expect their degree to be useful in their
future careers (how many history grads become history teachers?), so it's
understandable that many would just see their degree as a career ticket. I do
actually hope to learn useful stuff on my course, but I'm not sure if I'd
learn more by dropping out and starting up.

*Exceptions I can think of - Law, Medicine, and possibly foreign languages, Engineering, and Economics. Also more 'vocational' subjects like Nursing.

------
alanthonyc
_" the counterfeit academic Mozarts are common and a contributory cause to
global warming and deforestation."_

A gem of a quote, and a worthwhile read.

------
teeja
The gentleman is right on; in the US too it's far from a new story. The
decline started in secondary and then spread up through higher education.

But does the fault really start with the students, as he claims, or is that
blaming the victim? Certainly no one wants to work as hard as people used to
... that's a fact. The old culture (50 to 100 years ago) also supported a
_much lower_ population, and there were _many more_ resources - many trees to
chop, dams to build, necessities to invent ... all that's now paved over.

Many more people wanted to attend college to get 'the good jobs'. Only, there
was only so much demand at the high end. You wind up with an over-educated
population, deep in debt, with Great Expectations that just won't be
happening. Those who can't be anaesthetized with TV or video games or web-
surfing inevitably turn their frustrated analytical skills upon the powers-
that-be.

So, well, here we are. Clearly colleges are in for a deep re-organization. We
need to recreate education - but to what end? What future will we re-tool to
create? And how will we provide opportunities for all that 'computing power'
waiting to be harnessed!

------
edw519
_Graduating computer-illiterate students who had to do a project in computer
science was more of a headache._

For whom, you or those of us who have to work with them?

~~~
sketerpot
Everyone. Although, really, this goes beyond being just a _large headache_ for
everybody involved; it's more of a _startling clusterfuck_. I mean, honestly,
how does a system get to be so broken that they graduate CS students who can't
program a computer? Or simply stop offering core classes like compilers?

