

How does one learn about the latest advances in Computer Science (not fads) to apply to and improve one's work?  - juwo

Note: core stuff like algorithms, new patterns, etc., _not_ fads like RoR,
Ajax...

Stuff that usually sits in abstruse papers, out of mind for the average
developer. I'd like it accessible to the average developer, in a more easily
digestible form (studying research papers is not practical for everyone).

When I picked up the Cormen book recently, I saw lots of new stuff I didn't
learn in college.
======
nostrademons
I'm going to say something heretical here: I don't think entrepreneurs should
base their companies on the latest advances in computer science. The problem
is that many of them are really "bleeding edge", and even the researchers
themselves don't know all their implications. _Nobody_ has a clue where they
might lead, or whether anyone will ever find them useful.

Instead, you should look for the stuff that came out of academia 20 years ago
but was rejected as unfeasible, useless, or just plain idiotic. Then keep an
eye on economic trends that change the assumptions that made those discoveries
useless. If you keep in mind a large enough set of rejected technologies and a
large enough set of economic changes, eventually you'll find a match between
them.

Some examples:

The architecture, performance, and programming techniques for early
microcomputers mimicked 1950s mainframe technology. Many of the features of PC
OSes were considered incredibly backwards at the time - no multitasking,
segmented memory, assembly language coding. Yet this primitive hardware cost
perhaps 1/10,000th of a 1950s mainframe and fit on a desk. This opened up a
whole new market, one that was willing to put up with buggy software, single-
tasking, and limited functionality.

Java consists mostly of poor implementations of ideas from the 1960s and
1970s. Garbage collection was invented in 1960; object orientation in 1968;
monitors in 1974; virtual machines in the early 1970s. Yet Java targeted the
PC, Internet, and embedded device market that had previously been limping
along with C/C++ and assembly. To them, these innovations were new, and
performance of devices was just barely improving to the point where they were
becoming feasible.

Hypertext was invented in 1960; actually, you could argue that Vannevar Bush
came up with the concept in 1945. But there was no easy physical way to put
together large amounts of information, so Ted Nelson's Xanadu project went
nowhere. Fast forward to 1991: the Internet had linked together most research
institutions, and PCs were becoming powerful enough to support graphical
browsing. When Tim Berners-Lee put the WWW up, there was a ready
infrastructure just waiting to be expanded. And the rest is history.

PC video has been around since the early 1990s: I remember recording onto a
Mac Centris 660AV in 1993. Flash has been around almost as long, as has the
Internet. Previous attempts to combine them failed miserably. Yet YouTube
succeeded in 2005, because a bunch of factors in the environment had changed.
People had become comfortable sharing things online, and many now had
broadband access. Cell-phone video made it easy to record without expensive
equipment. And the rise of MySpace and blogs made it easy for people to share
videos they'd created with their friends.

~~~
brlewis
Was there image support in HTML in 1991? I thought NCSA Mosaic introduced the
IMG tag later, apart from any standards process. I think the only thing
graphical about browsing in 1991 was that the NeXT browser could render
headers, bold, and italic.

~~~
nostrademons
No, there was no image support in the original HTML. TBL wanted image tags
that opened the image as a new document; Marc Andreessen and Mosaic pushed it
through as an inline element.

The reason that GUIs helped so much is that you could _click_ on a hyperlink
and it'd take you directly to the page. There are often lots of hyperlinks on
a page: it's very cumbersome to navigate between them with the keyboard.
(Trust me, I spent a year using Lynx over a 2400 baud modem before we got a
real Internet connection. ;-)) And any interface like that locks out the vast
majority of potential users, who don't want to remember that 'g' lets you go
to a URL, that the arrow keys select links, or that you hit space (or was it
enter?) to follow one.

~~~
brlewis
Except Gopher and TechInfo already had graphical interfaces in 1991, on more
platforms than the WWW, where you could click to get the information you
wanted. I think it was the IMG tag that initiated a lot of the excitement
about TBL's project. You could already browse words and images separately
through existing information systems. It was putting them together that gave
the WWW lots of momentum.

------
jsjenkins168
If you are interested in learning core technologies in an area that will be
valuable in terms of future technology trends, I would advise studying
networking.

Having strong networking knowledge is very useful and can be applied to cool
growth areas such as the Internet and mobile devices. I took a graduate-level
course in networking as an undergrad, and it was one of the best decisions
I've made.

If you want to teach yourself, I recommend the following books:

Computer Networks by Andrew Tanenbaum - This book is from 2002, but is still
considered the Bible of Networking. It covers all of the advanced topics and
is great as a reference. You can pick it up used for cheap.

TCP/IP Sockets in ____ by Michael J Donahoo - This is a series of books that
covers network programming in different languages. There are C/C++, Java, and
C# versions that I know of, but there might be more by now. These books are
concise and to the point, and teach you everything you need to know to write
advanced network programs.
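
To give a flavor of what they cover, here's a minimal TCP client sketch in
Java. The host name and port are placeholders (port 7 is the classic echo
service), so point it at a server you actually run:

    import java.io.*;
    import java.net.Socket;

    public class EchoClient {
        public static void main(String[] args) throws IOException {
            // Connect to a hypothetical echo server; substitute your own host/port.
            Socket sock = new Socket("example.com", 7);
            BufferedWriter out = new BufferedWriter(
                    new OutputStreamWriter(sock.getOutputStream()));
            BufferedReader in = new BufferedReader(
                    new InputStreamReader(sock.getInputStream()));
            out.write("hello\n");               // send one line
            out.flush();
            System.out.println(in.readLine());  // print whatever comes back
            sock.close();
        }
    }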

------
Mouse2k
I'm currently a CS double-major at an American university. We learned
algorithms from Sedgewick's books (available in C and Java editions) and
patterns from the Gang of Four book. I find that I rarely use the algorithms
for web programming, but do use some patterns (MVC, Observer).
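
For example, a bare-bones Observer looks something like this (a sketch in
Java; the class names are illustrative, not from any particular framework):

    import java.util.ArrayList;
    import java.util.List;

    interface Observer {
        void update(String event);
    }

    class Subject {
        private final List<Observer> observers = new ArrayList<Observer>();

        void attach(Observer o) { observers.add(o); }

        void notifyObservers(String event) {
            for (Observer o : observers) o.update(event);  // push to everyone
        }
    }

    public class Demo {
        public static void main(String[] args) {
            Subject subject = new Subject();
            subject.attach(new Observer() {      // register an anonymous observer
                public void update(String event) {
                    System.out.println("got: " + event);
                }
            });
            subject.notifyObservers("state changed");  // prints "got: state changed"
        }
    }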

As for the latest advances, there are many languages on the cutting edge,
such as Haskell, OCaml, and Erlang, that are worth studying just to expand
your horizons. These languages ARE many of the latest advances in CS. I'm
currently diving into Lisp, and have found that some concepts are timeless
(code as data, macros).

------
pg
Go to CS talks at nearby universities. Most have talks that are open to the
public. These usually include "job talks" in which people applying for
teaching jobs present their recent work.

~~~
amichail
Also see:

<http://video.google.com/videosearch?q=google+techtalks>

~~~
fireandfury
Yeah, the tech talks are awesome. The Computer History Museum has some great
lectures among them. One that I particularly like is titled "Great Principles
of Computing"; it's a nice summary of the past few decades of computing
principles. Here's the link:
<http://video.google.ca/videoplay?docid=5494452304620274339>

------
bayareaguy
This can be a full time job in itself, but here are some quickies I like:

If you want to be practical, Hack The Planet is worth a glance every now and
then: <http://wmf.editthispage.com>

Lambda the Ultimate is good too, especially if you think languages are where
the real CS action is: <http://lambda-the-ultimate.org/>

------
maxklein
There are no significant advances in algorithms worth looking at, in my
opinion. Current hardware only supports brute-forcing things: optimizing
cycles or throwing more cycles at a problem. You can do that by improving
hardware, and that improvement will come in any case.

Real advances only come when new _ways_ of doing things appear. And for that,
just reading tech news is enough. For example, the tabletop PC.

Really, the only significant areas where algorithms can still make a
difference are video manipulation / object recognition, audio manipulation,
and artificial intelligence.

But you'll find that in those areas, advances are usually very complex and
difficult to monetize.

It's much more effective to just look at advances in hardware and figure out
what you can do at the software level to take advantage of them.

But even better, look at the internet, and watch as data opens up. Use that
data to create new things.

------
herdrick
Go to an academic conference. I went to this one:
<http://www.icfpconference.org/> last year and it was well worth it.

Hunt around for a great class at the local CS department and either figure out
a way to enroll or just 'drop in' on the class. I did this too and loved it.

------
felipe
IEEE Magazines:

<http://www.ieee.org/web/publications/journmag/index.html>

"Software" and "Computer" are more in-depth and oriented towards latest
advancements and practices. "Internet Computing" and "IT Professional" are
more practical.

------
brlewis
Crash a social event at a school with a strong CS department and talk about
your work. I bet people will be more than happy to give you pointers. The
academic world is overflowing with ideas that ought to be more widely used but
aren't. Academic people like it when you take their ideas and run with them.

------
far33d
Computer Science, as a field, is very mature and, as a result, very
specialized in its research (though I disagree with the idea that core
algorithms research is more relevant to the average developer than what you
call fads).

The best thing to do is to first get more specific: what kind of algorithms?
Database implementations? Virtualization techniques? Filesystem optimization?
Graphics hardware? Programming languages?

Then, find the appropriate journals. Get an ACM membership so you can search
the digital library and get full-text access.

~~~
jamesbritt
> Computer Science, as a field is very mature,

CS is less than 100 years old. How do you see it as "mature"?

~~~
far33d
Mature in that there are many sub-disciplines and an extremely large academic
community that studies and researches very specific aspects.

------
amichail
For routine programming, the GoF Design Patterns book will be more helpful to
you than algorithms books.

But if you really want to learn more about algorithms, check out this book:

[http://www.amazon.com/Algorithm-Design-Jon-Kleinberg/dp/0321...](http://www.amazon.com/Algorithm-Design-Jon-Kleinberg/dp/0321295358)

As for programming languages, the Java + Eclipse combination is excellent.

~~~
amichail
Also, check out this refactoring book:

[http://www.amazon.com/Refactoring-Improving-Design-Existing-...](http://www.amazon.com/Refactoring-Improving-Design-Existing-Code/dp/0201485672)

~~~
juwo
Thanks. I notice, though, that it may be a bit dated (1999).

------
juwo
Clarification: I was talking roughly about techniques or improvements to
apply to our software design and code, not about adopting new technology as a
business strategy.

------
mhidalgo
ocw.mit.edu is where MIT keeps a lot of the material for its CS classes, and
Berkeley's webcasts have video of their classes for each semester.

------
donna
How can you tell the difference between a fad and an advancement?

~~~
brent
Fads usually come with a fair bit of unsubstantiated hype (like RoR and
Ajax), whereas algorithms (and such) are usually not hyped in the media at
all (sometimes due to copyrights on their publications [journals], and
sometimes due to academic obscurity, abstraction, or even complexity).

~~~
donna
That's an interesting take. I'd argue that Ajax for example _is_ an
advancement, regardless of the hype -- because it increases usability of many
real-world web applications, today.

On the other hand, most algorithms are not widely applicable, so might only be
counted as an advancement when they're used 30 years from now in a single
specialized case.

-Matt (of donna & Matt)

~~~
brent
I'd count them as an advancement as soon as they are added to the body of
knowledge. XMLHttpRequest has been around for nearly a decade; the
"advancement" was there a decade ago. It took almost 10 years for the fad to
build up. If someone wants to know the fads (AJAX), just read prog.reddit.com
and you're all set. If you want to know REAL CS advancements, you'll have to
dig a bit deeper into the literature.

~~~
paul
Silicon has been around for billions of years. Forming it into microprocessors
is just a fad.

~~~
juwo
If microprocessors lasted only 5-6 years before they were replaced by a newer
type of technology - then, yes, they would be just a fad. :)

~~~
paul
So mosaic, for example, was just a fad?

~~~
juwo
The _concept_ was not a fad. The _concept_ was an advance that has lived on
in other browsers.

With imagination perhaps, one could even apply the paradigm of a browser to
mechanical and biological domains ("non-computing" in a software sense).

concepts vs. tools for strategic business and market advantage

~~~
juwo
IMHO:

1) interpreting tags, and

2) sending the tagged content at a site to an enquiring browser, so that in a
virtual sense, one is visiting that location.

