

Ask HN: Our college is teaching us outdated technology - ponyous

One of my colleagues just sent me this quote saying "I think this is how our college works":
"At the end of your four years of torture in university, there is the tendency to think that 'I've learned all there is to learn'… only to come out into the real world and realize, 'I've been learning outdated technology!'"

We are basically learning stuff like XML, XSD, XSLT, JSP... I'm not saying any of this is useless, but I think priorities should be somewhat different, like learning JSON instead of XML, PHP instead of XML+XSLT...

I wonder what the correct way is to tell professors that we are learning something that won't get us a job as "easily" as PHP/JavaScript would. Are there any other proposals that would improve our college program which I, as a student, can propose?
======
smoyer
You're going to have to live with this - the point of college is to teach you
how to learn.

Longer version:

For the last two years I've worked for a major university - one of the top
fifty. Prior to working for the university, I spent thirty years in industry.
During my time in the commercial market, I thought of universities as ivory
towers where everything is the latest and greatest ... because reading research
papers only shows you the latest and greatest.

When I got here, I was shocked to find that many of our systems were 15, 20 or
even 25 years behind those we were using in industry. Not that they weren't
stable, but since our primary job is to "educate the youngin's", working
systems were valued and there was no competitive advantage to replacing them.

My son is a super-senior in college, and I've noticed that the information
he's learning is (on average) five to ten years old. And this actually makes
sense. Research is bleeding edge, but much of it isn't commercially valuable.
Since undergrad degrees are supposed to produce graduates who can obtain jobs
in industry, industry generally has to adopt a technology before it's likely
to be used in education.

Once industry adopts a technology, a professor (or department) has to
recognize that adoption, make sure it's not simply a technology fad, verify
that it aligns with the theoretical approaches that are valid and develop a
curriculum.

Your university is actually providing you a valuable education by focusing on
the technologies that are valued in industry, albeit a little behind the
curve. Why? Because JSP is still used by a huge number of corporations and as
a mark-up representation it's a pretty good example of many comparable
technologies. Furthermore, JSF has been strengthened in the latest JavaEE
releases and many large corporations have adopted these changes.

What if you don't want to work for a big stodgy corporation? You probably
didn't need to go to a big stodgy corporation, but I question whether you're
really following the technology trends very well. "PHP instead of XML+XSLT"
sounds like a horrible way to do ETL, even for a start-up. Why wouldn't you
pick NodeJS in that instance?

There are thousands of technology stacks and it's not practical for your
university to teach all of them to you. How invested are you in learning the
ones you're curious about outside of your classwork? You do realize you're
responsible for your education and career, right? If you don't learn new
technologies on your own in the corporate world, you'll be quickly outdated.

So finally, my suggestion: Learn one of the technologies you're interested in,
then go have a discussion with the professor about how that technology
compares with what he's teaching you. I'm hoping he'll be intellectually
curious and engage deeply in a conversation like this - recognizing that he
can also learn something in the process. If not, your professor has indeed
failed.

~~~
ponyous
Thanks for your reply, it makes sense.

This fellow student also said something along these lines: "Next year I'm
dropping out, I can learn 10 times more on my own than here...". What do you
think of this? Is this a good decision?

A few of you mentioned "the point of college is to teach you how to learn",
which seems interesting, but it doesn't seem useful for me anymore since I
already know a lot of stuff/tech. Should I also drop out? What is your
suggestion for a student who is among the best in college? (I already had a
few job offers from start-ups in my country, I have good references, I'm
researching on my own, I won a few computer science competitions...) Currently
I'm working for a startup and completing my first year of college, but replies
here got me thinking - I might drop out next year.

I wrote PHP just as an example. I actually had Node written there, but then I
changed my mind and wrote PHP instead of Node because it's more popular. You
can replace PHP with anything more popular/new/trendy/useful there.

~~~
meric
I had been programming before I got into university, and for me, there was a
lot of stuff I wouldn't have gone on to learn on my own if I hadn't heard
about it in university. Going to university exposed me to new ideas I could
further explore deeply in my own time. It also helped formalise some of my
knowledge, for example the language theory behind regular expressions, as
well as what OOP means exactly. Having more ways to describe concepts in my
head, I think, has helped me compose bigger ideas.

Today I'm working as a web developer using Django in Python. I'd imagine if it
weren't for having watched my professor demonstrate a CGI web server written
in C, I wouldn't have gone down rabbit holes in PHP, Rails, App Engine, and
finally Django in my own time, and I suppose rather than working as a web
developer (and doing "real engineering" using version control, unit testing,
integration testing, etc... all of which I first heard about in university and
then further explored in my own time) I'd be slaving away in a company which
works without version control and wouldn't know any better. And yes, don't
laugh, I have indeed worked in a company part time without version control,
before I learnt about it in university; when I did learn about it I introduced
it to the company, and then _quit_. Let me tell you they were very grateful I
told them about it. Nowadays they no longer merge code by hand.

> "Next year I'm dropping out, I can learn 10 times more on my own than here..."

I have found that to be generally true, i.e. I learn 10 times faster, on my
own time, than attending lectures and doing tutorials and doing assignments,
all of which were dumbed down for the average student, but in spite of that,
university has still been worth it for me.

~~~
ponyous
This seems to be a good perspective on college, but I still don't think
college has more upsides than downsides for me or anyone else who follows
sources where the latest tech is exposed (HN, reddit, ...). All the things you
have mentioned (version control, unit/integration testing, ...) I have already
heard of or already know - I'm not saying there isn't something else, but I
think I will get to know more and more things as I learn at work, by
developing side-projects, or just by reading about them somewhere.

Damn, I'm desperate - wasting my time with folks who are totally uninterested
in learning something on their own, learning something that probably won't
help me in the future...

------
danpalmer
I was taught about XML, XSD and XSLT. When I got to industry, this made me
wonder why we stuck with unstable JSON APIs and hand-coded validation, instead
of using schemas for validation.
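
The idea can be sketched without a full XSD toolchain. Below is a minimal, hand-rolled Python check that plays the role a schema plays at an API boundary; the field names and rules are invented purely for illustration:

```python
# A toy "schema": each field name maps to the type it must have.
# This is the same gatekeeping idea XSD gives you for XML, applied
# to a JSON-style payload.
SCHEMA = {"id": int, "email": str}

def validate(payload, schema=SCHEMA):
    """Return a list of violations; an empty list means the payload is valid."""
    errors = []
    for field, expected in schema.items():
        if field not in payload:
            errors.append(f"missing field: {field}")
        elif not isinstance(payload[field], expected):
            errors.append(f"{field}: expected {expected.__name__}")
    return errors

validate({"id": "42"})
# ['id: expected int', 'missing field: email']
```

A real schema validator does this declaratively and far more thoroughly (cardinality, patterns, nesting), but even a sketch like this catches malformed data at the boundary instead of letting it propagate.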

I was taught about SQL. When I got into industry and started using MongoDB, I
wondered why data would go missing, and then realised it's because
transactions are often a very good thing to have.

I learned some Modula-2, in 2011. Not because we were writing Modula-2, but
because we were writing a compiler for it.

I've found quite a few times that while the technology we learn is old, there
are many beneficial lessons to take away from it. Not just general theories
and processes, not just learning how to learn, but there are actually some
seriously good ideas in software development that new technologies haven't
got, and often we in the startup/SV scene either haven't learnt them, or don't
think they are important because they're 'old'.

I have several times found engineers who proudly 'didn't go to university', or
taught themselves, to be hugely behind in terms of knowledge, and they just
end up re-inventing things which have been around for decades, usually in
Javascript. Re-invention isn't always a bad thing - it often injects new
ideas, and the new versions can be really great to use, unlike some older
technologies - but I think a significant number of people who think colleges
are teaching outdated technology probably need to go to college.

Disclaimer though, this is based on my experience at one university, in the
UK. I understand it might be different in the US and at other universities. I
just find it frustrating when I know that my degree has taught me a huge
amount of very useful stuff that I know some people in industry are lacking.

As far as getting a job goes, if you actually look at what most places want,
it's Java, XML, C#, etc. Maybe not in the startup world, but the enterprise
market is considerably larger, and ultimately where many people end up
working.

~~~
draker
>to be hugely behind in terms of knowledge, and they just end up re-inventing
things which have been around for decades, usually in Javascript.

I have taken a few university courses, but have primarily taught myself and
would greatly appreciate if you could expand on this.

~~~
mickeyp
A good University will teach you things you would almost never come into
contact with on your own.

I will give you some random examples from my own degree. Some concepts are
quite practical and others are theoretical, but all of the examples below
taught me -- as someone who had started programming when he was 12, back in
the DOS era -- something new.

\- Data structures. I had played around with them myself before I went to
university, but being formally introduced to them was new: lists (queues,
linked lists, stacks); trees (binary trees, B-trees, heaps, etc.); the
performance characteristics of each; and how you would apply them. In my
university this was first taught with mathematical notation, followed by
practical implementations in Pascal.

\- Learning Prolog in my first year. An entirely declarative language that
uses backtracking to 'solve' relations based on rules and facts. I loved this
class. Prolog's not terribly practical but it taught me a lot about abstract
data types and 'recursion'. The idea that a list is defined as a head (the
first item in the list) and a tail (the rest of the list) which can be
iterated over recursively was a major eye-opener to me, an until-then
hobbyist coder.
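
That head/tail decomposition carries over to almost any language. A tiny Python sketch of the same idea (where Prolog would pattern-match on [Head|Tail]):

```python
def total(items):
    # A list is either empty (the base case)...
    if not items:
        return 0
    # ...or a head plus a tail: Prolog's [Head|Tail] in Python clothing.
    head, *tail = items
    return head + total(tail)  # recurse on the rest of the list

total([1, 2, 3, 4])  # 10
```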

\- Haskell. A stark contrast to Prolog and Pascal, I learnt the power of
composability and the ability to capture complex ideas with very powerful
constructs and a very strong type system. The language is lazy so you can
operate on infinite 'streams' of items which is another interesting way of
thinking about the theory of computation.
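
Python generators give a rough feel for that laziness, though in Haskell it is pervasive rather than opt-in. A sketch:

```python
from itertools import islice

def naturals():
    # An "infinite list": values are produced only when demanded.
    n = 0
    while True:
        yield n
        n += 1

# A lazy pipeline over an infinite stream; nothing runs until we ask.
evens = (n for n in naturals() if n % 2 == 0)
list(islice(evens, 5))  # [0, 2, 4, 6, 8]
```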

\- We were taught how to specify a PDP-11 (RAM banks, CPU, I/O) using an
academic language that uses "rewriting logic". It was a very theoretical yet
practical way of specifying a computer using only mathematical principles. The
idea that the number 0 and the successor function are all you need to define
the natural numbers - and, combined with recursion, basic arithmetic - was
again the sort of thing you never ever do outside university.

\- Compilers. I wrote compilers using lex and yacc. I have used that theory --
and the theory of regular languages -- to great effect since then to formally
construct grammars and parsers for tasks I needed later on in life. And I
could do it effectively and efficiently without reinventing the wheel or
coding myself into a corner -- which is easy to do with compilers.
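
For a flavour of the lex half of that: each token class is a regular language, so a whole tokenizer falls out of one regular expression. A toy sketch in Python (not what the commenter actually wrote):

```python
import re

# Each named group is one token class - a regular language, exactly the
# kind of specification lex compiles into a scanner.
TOKENS = re.compile(
    r"(?P<NUM>\d+)|(?P<ID>[A-Za-z_]\w*)|(?P<OP>[-+*/=()])|(?P<WS>\s+)"
)

def tokenize(src):
    out = []
    for match in TOKENS.finditer(src):
        if match.lastgroup != "WS":  # discard whitespace tokens
            out.append((match.lastgroup, match.group()))
    return out

tokenize("x = 2 + 40")
# [('ID', 'x'), ('OP', '='), ('NUM', '2'), ('OP', '+'), ('NUM', '40')]
```

A parser built on top of such a token stream is where the grammar theory comes in; the lexing step alone already shows why knowing regular languages pays off.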

\- Computer graphics. Basic convolution filters; image processing and the
general theory of 3d graphics.

\- Graph theory. When I first took it, it seemed utterly "useless" but it has
since then been one of the most useful things I know. I grasped the concept of
Git right away thanks to that knowledge.

\- Operating systems. Networking. Basic security. Complexity theory (the study
of algorithm complexity) -- and the list goes on.

I remember maybe 30% of what I learnt but I can quickly jog my memory or pick
up where I left off. I consider my degree invaluable in an academic sense...

... but it did nothing to make me a better _developer_. That, unfortunately,
is something you have to work hard at. I got lucky: I had already been
programming for 10 years by the time I graduated.

------
Codhisattva
First of all, college isn't vocational training. So get over the whole "get us
a job easily" thing.

Your employment is dependent on the effort you make towards your chosen
career.

The true purpose of college is to teach you how to learn, to expose you to
many new ideas, and to give you the opportunity to focus your life on thinking
with the least amount of distraction.

If the curriculum is different than what you desire, then write your own
curriculum, enroll in independent study classes and discuss with your advisor
how you can achieve your goals.

If you approach it from your personal perspective you'll have much more
success than if you approach it in the way you state "tell professors..." or
"improve our college program". Those are bureaucratic battles and who wants to
learn about the gnarly dark underbelly of academia? You can go that route,
but to win you'll end up spending your nights and days formulating
presentations to administrative decision makers. YAWN. Institutional change is
hard in all the wrong ways and dull in every way. And you know what the most
likely outcome will be? A dean will agree with you and promise to look into
adding a JS class in the 2015-16 academic year. You'll feel victorious but the
dean will forget about it in 10 minutes.

By taking personal responsibility for your own curriculum you can achieve
something greater than a college degree. You will learn and exhibit two of the
most important attributes in the software industry: initiative and independent
self education. If you master those skills you're well on your way to a great
career in engineering.

------
adrianhoward
There's also the fact that older doesn't necessarily mean outdated or unused.
There are still _many_ jobs out there that require XML, XSD, XSLT, JSP, etc.

Organisations have software that's been around for many years, in many
instances decades. Curiously they don't rewrite everything just because COBOL
or Java or C++ or whatever has become unfashionable among some portions of the
dev world ;-)

~~~
ponyous
As I wrote in original post: "I'm not saying anything of this is useless but I
think priorities should be somewhat different"

~~~
adrianhoward
Why?

Genuine question ;-)

What utility value will you get out of PHP rather than Java? Why should
priorities favour PHP over Java, or C, or Lisp, or whatever?

I still have this vain hope that folk will get back to teaching programming -
rather than specific programming languages. During my degree back in 1988-91
we built non-trivial programs in all of: Pop-11 (yes - nobody has heard of
this ;-), Prolog, Lisp, ML, Modula-2, C, plus some stack-based assembler whose
name escapes me at the moment. Not to mention trivial play with Smalltalk,
Occam, shell scripting and probably others that I've forgotten. And this
wasn't even a straight CS degree!

Sometime between now and then the universities started chucking out people who
were just taught Java, or Python, or some other single language.

Sigh.

I seem to have ranted a little off-topic... I'll shush now ;-)

------
roberte3
I'd worry more about the general topics that you're learning, rather than the
technology that you're using to learn them.

Are you in a computer science class or a bogus "how to use technology X"
class?

Are you learning discrete math, Big O, how to write a compiler, assembly
language (it doesn't really matter what platform)?

If you're learning CS, then you will have the tools you need to tackle any job
using whatever tools you have at hand.

The "huge" differences between JSON and XML don't really matter if you can
write a parser...

------
0verc00ked
I sympathize with how you feel, and I think some of these responses miss the
point and are kind of inappropriately harsh for someone asking a perfectly
legitimate question. I know plenty of CS students/grads feel the same way. I
certainly do.

I definitely agree that college is _partly_ about exposure and teaching you
how to learn, but it's ridiculous to say that it's not their responsibility to
teach you technologies that you'll be using professionally. Yes you should
take it upon yourself to learn what you want or need to know. But with the
cost of tuition, they should definitely be doing more than they are.

I know at my college (graduated in 2011), there seemed to be a
disproportionate offering of courses that one would use if they wanted to
become a video-game developer or an enterprise software engineer. They taught
us very little about how to actually program for the web.

Getting back to your question... here's a couple of things to keep in mind:

\- It's not the same everywhere. There's a range of curriculums depending on
what school you're at (and even within the same school). Some specialize in
different things. Some have different philosophies. And some are better than
others.

\- The truth is, there are so many languages, libraries, frameworks, and
technologies - and they're expanding in all directions faster than anyone can
keep up. The only thing you can do is try to pick the ones that matter to you
(based on what your aspirations are) and specialize. Anyway, given all this -
imagine how hard it is for the universities to keep up themselves while
designing a curriculum that can fit all of their students.

As far as what you can do - I'm not sure there's a whole lot you can do within
the confines of your university, unless you were super adamant to the point of
organizing events/rallies or pestering the hell out of your professors and
school until they make some changes. And even then, who knows if it would
work; in all the time you'd spend, you could probably have taught yourself a
few of the things you're pushing to have them teach you.

Here's a couple other solutions:

\- Find a professor/TA you like who knows some things you want to learn, and
try to get them to tutor/mentor you. (In my experience, this one is hard
because everyone's busy)

\- Transfer schools. But do your research first to make sure you don't wind up
in the same situation.

\- Adapt by teaching yourself the things you want on the side. Tackle small
projects, each with one or two new things you want to learn. (this is what the
other responses were advocating, and I think you need to get in the habit of
doing this regardless)

Hope this was somewhat helpful.

------
eshvk
> We are basically learning stuff like XML

I find this hilarious, because before I can go further into hacking on fancy
cutting-edge deep learning models, I have to process years and years of XML
data from an external party.

As someone else said, the only purpose of University is to teach how to learn.
Everything else is up to you. Again, there are places that teach you specific
skills that are job-relevant now (Codecademy/bootcamps). You could very well
do them and get a high paying job too.

However.

Figure out what you want to eventually be in life. Do you want to be a highly
paid janitor who can be easily replaced? Or do you want to be the guy who can
see through all the bullshit frameworks and design solutions that are robust?
The guy who remains relevant at 40.

------
Aldo_MX
If you feel uneasy with college, and you already have the skills you need,
skip a quarter, semester or whichever time measure your college uses.

Your college should have a head professor; talk with him, explain your
concerns, and scrutinize the curriculum with him. He should be able to give
you more insight into where you might not be as good as you believe, and you
will get valuable hints about where to start learning on your own.

Take the time you skipped to experience what it is to be a self-educated
person: try to research on your own, read books, attend online courses, build
projects, and do everything else you find comfortable to learn at your own
pace.

If you are completely sure that you don't need college anymore, and you feel
comfortable being a self-educated person, drop out.

EDIT: In the country where I live (Mexico), there is an institution named
CENEVAL[1], where you can graduate based on knowledge acquired through job
experience and self-education. Maybe there is a similar institution in the
country where you live.

[1] [http://www.ceneval.edu.mx/ceneval-
web/content.do?page=1927](http://www.ceneval.edu.mx/ceneval-
web/content.do?page=1927)

------
27182818284
The stuff you're learning isn't outdated. It just might be more career-focused
rather than front-page-hacker-news-or-reddit-worthy.

The example I like to bring up over and over again on HN is Union Pacific,
because it is a great example. You will probably never see an article on the
top of HN about railroad .NET or Java code, but it helps drive the US economy.
(and they pay well. I have younger friends out of college that made $55K
immediately and more experienced devs making $100K, which is insanely great
compared to California given the cost of living differences. You're talking
about owning the equivalent of a million-dollar CA home at 25 with cash to
spare.)

------
shrikrishna
tl;dr Use your college life to build cool stuff, just because you can. You
won't have that liberty once you come out.

> I might drop out next year.

My guess is that you are serious about this. As a student myself, having
thought along similar lines over three or four years of my own college, my
opinion is: finish what you started. It is absolutely true that what you
learn in college is outdated. But what you get in college, and don't get once
you come out, is the sandboxed environment. In college, you are free to
explore at will and do whatever you want (even find co-founders, if you are
entrepreneurial); there is no pressure. The pressure palpably increases once
you come out. You can do that (do cool stuff for fun) even after college, but
it's infinitely harder.

> since I already know a lot of stuff/tech

You might be the best in your college, but never let it get to your head
(I'm not saying it has).

PS: It became a long reply in the end. My apologies.

~~~
ponyous
I'm lucky in that, in my country, I can repeat one year of college (free of
charge), so I will probably drop out (fail one year intentionally) next year
and experiment for a year - hopefully I will be as successful as I plan.

"even find co-founders, if you are entrepreneurial" \- I've been trying this
the whole time, not just in college but with everyone with IT interests.
People in my country are just not that ambitious - they just want a safe job
and an average salary. This one student I've been talking to seems to be
someone I should hang out with.

~~~
notduncansmith
If you don't mind me asking, what country are you in?

~~~
ponyous
Slovenia (Southern part of Central Europe)

------
sergiotapia
Everybody has gone through this - it's a staple of college. You're there to
learn the basics and to learn how to learn.

If you aren't working on your thing and researching your own technology stack
during college you're doing it wrong. Of course you're going to come out of
there knowing nothing but old tech.

------
sn
Personally, I would be way more interested in someone who came to me and said
"Here is a class project originally written in X that I rewrote in Y because
your job description included Y and not X" than someone who had originally
learned Y in school.

------
exabrial
The most valuable lesson you will learn in college is "how to learn." Put the
attitude away and start learning.

------
Codhisattva
Also, fix your damn title. "Our college is learning us outdated technology".
Really?

~~~
smoyer
I almost commented on the title, but I was guessing the poster might not be a
native English speaker. Since you've brought it up, one correct way of
phrasing the title would be "Our college is teaching us outdated technology".

~~~
Codhisattva
I refrained from commenting on it for as long as I could. I weighed the non-
native-speaker possibility too, but finally old George Bush quotes pushed me
over the edge.

;)

~~~
smoyer
George Bush was the decider!

