

Ask HN: What do you think is the next technology worth mastering? - Murkin

I have friends who got very deep into iPhone dev a few years back. Now they
are considered masters of their craft and have managed to build a good name
and connections for themselves (and monetize on it).

Now that I have lots of spare time, I have decided to delve into something
(very) new, in the hope that two/three years down the road I, too, can
position myself the same way in a new field.

So what do you think is most worth learning?
======
adriand
This is an interesting question and one I've been pondering lately as well,
not from an individual perspective, but from a management perspective for a
small team of developers at a web app/design shop.

I obviously have to be more conservative than you because I have to consider
short-term profit as well as risk. So in order to get responses that are along
the lines of your question, I'd like to ask HN readers: what do you think is
worth concentrating on in 2010 that would lead to good short- and long-term
results for an agency specializing in website and web application development,
often for relatively small projects?

To answer your question, I think you need to do some thinking about what it is
that you want to do. You have to start with something that interests you. Your
friends chose something in the realm of mobile development: does that appeal
to you? Perhaps they also focused on games: does game programming appeal to
you? Someone below suggested "data mining" as a field to focus on. I can say
for myself that although I am somewhat interested in this, I am not anywhere
close to as interested as I'd need to be to devote several years to it.

It also has to be something with a good chance of paying off, if you are
interested in the money side of it. Your friends chose something that was
backed by a huge corporation with a proven track record of creating successful
devices, so although they ran some risk - the iPhone might not have become the
massive hit that it did - they mitigated that risk by choosing something with
very good chances. However, any choice that relies on predicting what will be
successful in the technological realm in two to three years is bound to be
risky, especially if it is "very new" (and thus unproven).

Here's a shot at it: focus on mobile web application development utilizing
HTML5 features. Google believes "the web has won", and I agree. If you get
really good at building browser-based applications for mobile phones, I think
you'll do well.

~~~
Murkin
Excellent answer, thank you.

This is one of the main fields I have been looking into. The idea of separate
iPhone/Android/Blackberry/etc. apps sounds absurd to me given the
Internet-Everywhere movement.

Uniform web-hosted apps are surely coming soon.

~~~
aaronblohowiak
Pro-tip: jQuery is light and fast enough for mobile and provides clean
abstractions.

------
ratsbane
+1 for Data mining, concurrency, FP.

Another one: HTML 5 and related.

This probably won't make as much of a splash as AJAX did a few years ago, but
some of the new things you will be able to do in a web browser present an
opportunity to improve on older web apps: e.g. the geolocation and
accelerometer APIs, web sockets, multi-file uploads, and drag-and-drop.

Biggest problem with HTML 5: it's going to be years before complete market
penetration - but FF, Chrome, and Safari all have much tighter upgrade cycles
than IE. For intranet stuff or anywhere you have a reasonably captive user
base, why not offer better features to your users if they upgrade? Side
benefit: it's much more fun to be coding for the future than for the past.

~~~
nzmsv
AJAX originated as an obscure hack (XMLHttpRequest) put into IE by a Microsoft
employee. The only reason it made such a splash later is that other people
tried it, found they could use it to do cool things, and then spread the word.

HTML5 could make just as big a splash. Someone making local storage work
seamlessly in a large webapp would be noteworthy, and it would take away the
biggest problem with Chrome OS: it can't work offline.

~~~
aaronblohowiak
s/someone/google

------
gtani
Concurrency models: Erlang, Scala, Clojure, Haskell.

Parallel execution, map/reduce, Hadoop, NoSQL datastores.
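
For instance, the map/reduce idea in miniature - a rough Python sketch of the
classic word count, no Hadoop required (the documents are made up):

    from collections import Counter
    from functools import reduce

    docs = ["the cat sat", "the dog sat"]

    # map: each document -> partial word counts (these run independently,
    # so they could be farmed out to separate workers)
    mapped = [Counter(doc.split()) for doc in docs]

    # reduce: merge the partial counts into a global tally
    total = reduce(lambda a, b: a + b, mapped, Counter())
    print(total)  # Counter({'the': 2, 'sat': 2, 'cat': 1, 'dog': 1})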

~~~
maxklein
This will be a niche area for a few very good programmers. A general-purpose
developer should not risk this. If it does not pan out, then you've wasted a
lot of time.

~~~
keefe
it's not going to pan out because we're going to roll back to single cores?

~~~
aaronblohowiak
Frankly, the _vast_ majority of applications don't have these classes of
concurrency, capacity and computational concerns and can be adequately scaled
in the traditional fashion.

~~~
keefe
Exactly what type of application are you referring to? Most server software
does a lot of work in parallel. Many computationally complex algorithms are
parallelizable to one degree or another. Video games usually involve a lot of
parallelizable number crunching, analytics very often run quite slowly, UI
code requires at least a couple of threads of execution, and on and on goes
the list. It's time to expand the idea of "traditional" to include more than
sequential programs.

~~~
aaronblohowiak
I presume that most applications that most people use most of the time do not
need to scale. Most websites don't need to scale, most business apps don't
need to scale, and most userland applications don't need to scale. Single-core
sequential programs do just fine on comparatively resource constrained
hardware for "everyday" applications. with the exception of maintaining a
separate thread for UI, the examples you gave are edge cases.

I am into game dev, music dsp and highly concurrent web application
development... but I recognize that these are all fringe activities.

~~~
keefe
Parallelization is not only about scalability, but also performance. If a task
takes 20 minutes and can be parallelized 4x, it now takes 5 minutes; only a
certain portion of each program is inherently sequential. We're at 2-4 cores
now, and soon it will be 8-16. I'm not really in a position to comment on
average-joe computing... but my social circle grinds PCs to a halt regularly.
If I had 8 cores and all of my apps were set up to exploit them, I'd be a
very, very happy man, and I don't think I'm in a minority. There's analytics
on large databases - every enterprise does this. There's gaming - imho not a
fringe activity at all. Exactly what are these day-to-day applications? CRUD
web apps with no analytics? Word processors, which already parallelize
spellchecking and the like? Firefox? Photo editing? I honestly can't think of
a program worth writing that couldn't stand to use multiple threads of
execution - even gedit polls files for modification.
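
Here's a rough sketch of that arithmetic (Amdahl's law), factoring in the
inherently sequential portion; the 90%-parallelizable figure is made up:

    def amdahl_speedup(parallel_fraction, cores):
        # upper bound on speedup when only part of the work parallelizes
        return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

    # a 20-minute job that is 90% parallelizable:
    for cores in (2, 4, 8, 16):
        minutes = 20 / amdahl_speedup(0.9, cores)
        print(cores, "cores ->", round(minutes, 1), "minutes")

Even with 16 cores the sequential 10% keeps that job above 3 minutes - the
speedup never exceeds 10x no matter how many cores you add.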

~~~
aaronblohowiak
I agree that increased parallelization of common computing tasks could
increase performance, but I believe that performance is already acceptable (to
consumers and management) for most applications.

Programming games is fringe programming. I believe that most PCs are used for:
CRUD web apps, VB/C# apps, productivity apps, document editors, ecommerce
clients, collaboration, and communication clients.

Sure, faster programs would be great! I want it as much as the next person.
Sadly, I don't see it being the Next Big Thing. I hope I'm wrong! I do think
that the present high-performance community is going to get nicer tooling for
parallel programming, and some of that may find its way into everyday consumer
apps, but it will come in forms like Grand Central Dispatch and other library
adjuncts to existing platforms - not a technological leap, but a step in the
right direction.

Polling files for modification is an example of not taking advantage of
platform capability, where existing performance is "good enough." Most
platforms now have filesystem hooks for modification, and if you want to
improve the speed of gedit you can register an event handler.
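
For instance, a rough sketch of event-driven file watching in Python, assuming
the third-party watchdog library (which wraps the platform hooks - inotify,
FSEvents, and so on):

    import time
    from watchdog.observers import Observer
    from watchdog.events import FileSystemEventHandler

    class PrintChanges(FileSystemEventHandler):
        def on_modified(self, event):
            # invoked by the OS notification; no polling loop required
            print("modified:", event.src_path)

    observer = Observer()
    observer.schedule(PrintChanges(), path=".", recursive=False)
    observer.start()
    try:
        time.sleep(60)  # the handler fires in a background thread
    finally:
        observer.stop()
        observer.join()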

------
spudlyo
Configuration management a la Chef and Puppet. The old way of sysadmins
hand-maintaining configuration files on individual UNIX machines simply won't
cut it in the era of easily provisioned, ephemeral cloud resources.

~~~
aaronblohowiak
This is already the reality on the ground, not two years out.

------
simonw
HTML 5. In two or three years time I'm willing to bet a large portion (if not
a majority) of desktop applications on all platforms will be written using
HTML 5 technologies - in particular offline storage and web workers. If you're
an expert with CSS, JavaScript and the various HTML 5 APIs you'll be in a very
good position.

~~~
aaronblohowiak
The barrier to learning HTML5 is too low for the entrenched HTML/CSS/JS
talent, and the migration path too clear, for being an "HTML5" expert to be
notable enough to provide exclusive niche marketability.

------
yummyfajitas
AMQP/RabbitMQ. I explained in more detail why in this post:

<http://news.ycombinator.com/item?id=1006208>
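
For a taste of the model, here's a minimal producer/consumer sketch in Python,
assuming the third-party pika client and a RabbitMQ broker on localhost (the
queue name is made up):

    import pika

    conn = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
    channel = conn.channel()
    channel.queue_declare(queue="tasks")

    # producer: push a message onto the queue
    channel.basic_publish(exchange="", routing_key="tasks", body="hello")

    # consumer: register a callback and block on deliveries
    def handle(ch, method, properties, body):
        print("got:", body)
        ch.basic_ack(delivery_tag=method.delivery_tag)

    channel.basic_consume(queue="tasks", on_message_callback=handle)
    channel.start_consuming()

The broker sits between producers and consumers, so either side can go down or
scale out without the other noticing.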

------
coderdude
A couple of things off the top of my head:

- Web apps geared towards the emerging thin-OS netbook market.

- I think Android apps will kick off in a much bigger way this year, with more people getting their hands on phones supporting the OS. It's still early enough to get into, IMO.

- On this page Veera suggested the Semantic Web, and while I have big hopes for the Semantic Web (I should, I blog[ged] about it), it's just not going to mature into what I think you're looking for in the amount of time you're looking at.

- Tichy suggested data mining, which I'm getting very into as of late, so maybe I'm not in the best position to give an impartial opinion, but I think that will be taking off more. :)

Edit: fixed linebreaks

~~~
raju
So what would you recommend be the first step in learning about Data mining?
Any resources/books you can suggest?

[I figured since you just got into it ... ]

~~~
nerme
Learn some linear algebra. Literally every how'd-they-do-that technique, from
OCR to PageRank, is based on putting data into a matrix and extracting some
eigenvectors.

I took LA in college, but my professor was nowhere near as good, or as
on-topic for computer algorithms, as this guy:
[http://ocw.mit.edu/OcwWeb/Mathematics/18-06Spring-2005/Cours...](http://ocw.mit.edu/OcwWeb/Mathematics/18-06Spring-2005/CourseHome/index.htm)

Oh, and check out R. There are a lot of free papers and example code showing,
say, how principal component analysis on a dataset of migratory geese can be
used to explain such-and-such, along with generating some pretty pictures.

It's a really nice bit of OSS!
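
To make the matrix-and-eigenvectors point concrete, here's a minimal PCA
sketch in Python with numpy (R works just as well); the dataset is made up:

    import numpy as np

    # toy dataset: 100 observations of 3 features, two of them correlated
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 3))
    X[:, 2] = X[:, 0] + 0.1 * X[:, 2]       # feature 2 tracks feature 0

    Xc = X - X.mean(axis=0)                 # center each feature
    cov = np.cov(Xc, rowvar=False)          # 3x3 covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigh suits symmetric matrices

    order = np.argsort(eigvals)[::-1]       # axes by descending variance
    components = eigvecs[:, order]          # the principal axes
    projected = Xc @ components[:, :2]      # keep the top 2 components
    print(eigvals[order])                   # variance captured by each axis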

------
Tichy
Data Mining

~~~
sga
Any thoughts on the best resources for learning data mining? I.e., can anyone
suggest must-read sites, blogs, or textbooks on the subject? Thanks.

~~~
hack_edu
Ben Fry's dissertation 'Computational Information Design'
[<http://benfry.com/phd/>] is a great start. He breaks down many of the skills
needed to succeed at information design and processing.

He's recently released _Mastering Data_ with O'Reilly, which is essentially an
expanded second edition of his dissertation.

~~~
evgen
I think you mean _Visualizing Data_ and not Mastering Data.

------
skip
Coding for GPGPU or stream processors

~~~
andrewcooke
I'm doing this now, and it's a heap of fun (if you like fiddly coding
challenges :o). But it's hard to see how this or the next generation will be
the next big thing. It's still necessary to have a "suitable" problem, and the
next generation (Nvidia's Fermi), while improving things enormously, is still
very much _not_ a general-purpose chip (I was just looking, and the cache for
a multiprocessor is going to be 64kB - compare that to the 6MB on my Core 2
Quad...).

~~~
ramchip
The cache may be small, but (unless you're talking about something different)
graphics memory is fast and GPU clock speeds are moderate, so cache isn't as
critical on a GPU as on a CPU.

~~~
andrewcooke
The trouble is that you have hundreds more processors. So even if the memory
is twice as fast, and each processor twice as slow, memory access is still the
dominating factor in efficiency.

You can work round that by being very careful, arranging things so that
processors access memory in sequence, which lets reads be coalesced (you're
streaming data from contiguous addresses, avoiding the "seek time" of random
access). But that only works if all the processors are focussed on the same
job.

Now you can say I'm just describing the standard problems with GPUs, and I'd
agree, but my point is that even in Fermi (which is a huge step forward in
many ways) these will still dominate. And it's hard to see how most software
fits into such an approach. Hence my warning that they are not becoming
general-purpose.

------
timwiseman
The new technology that interests you.

Seriously, several of the technologies emerging now are likely to be big (or
at least to grow substantially) over the next few years. But if you focus on
the one you find most interesting, it will help keep you focused and
motivated, which can be an enormous help. It will also help you enjoy what you
are doing, which simply makes the process more pleasant.

------
maxklein
I think that the other commenters here are on to something with data mining,
but they are not seeing it right. What is really needed is not so much data
mining as data abstraction. That is, there is a LOT of different data out
there, and people have very different needs for it. We cannot know all the
possible use cases for this data. It has to be abstracted and simplified so
that data-mining apps become trivially easy for business types to write. That
is a good area to be in.

If you know enough about dealing with data to be one of the ones building
layers on top of it, there is a lot of opportunity there.

Five years is a reasonable estimate for this; before then may be a bit tough.

Another important thing to learn is location-based services. The problem is
that the applications have not all been invented yet, and it's not clear when
they will really trickle down to the ex-VB6 guys.

~~~
aaronblohowiak
The term "Data Abstraction" already has a different meaning. The
"simplification" of data implies an understanding of the data, and that
modification instills biases in the representation that may limit the
fruitfulness of the data itself.

Letting "business types" write data mining apps is already on the market and
has been for a while. Look into "Business Intelligence" applications. The
verdict is that a) making it easy to do BI actually makes it very complex and
b) making analysis easy does not automatically expose or explain what analysis
is salient.

------
mcantelon
Hardware hacking and DIY manufacturing/fabrication. Mainstream manufacturing
has been offshored, but there are a lot of people working to build a DIY
ecosystem that will eventually allow people to turn around low volume, niche
products faster than offshore manufacturers.

------
nir
IMHO it's more about concepts than technologies - e.g., MVC-based web
frameworks were a good concept to pick up a few years ago, whether you chose
Rails or Django or Zend Framework, and OOP was a good idea to pick up
somewhere in the 90s, whether you ended up writing games in C++ or banking
apps in Java.

I think one of the next concepts to take off is asynchronous (or event-based)
programming. It's been around for a while, but only recently (with growing
interest in concurrency) has it started becoming mainstream for non-UI,
dynamic-language apps. Good intro here:
<http://simonwillison.net/2009/Nov/23/node/>
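
To show the style in one screen - a rough sketch in Python using the standard
selectors module rather than Node - here's a single-threaded echo server where
one event loop dispatches callbacks as sockets become ready:

    import selectors
    import socket

    sel = selectors.DefaultSelector()

    def accept(server):
        conn, _ = server.accept()
        conn.setblocking(False)
        sel.register(conn, selectors.EVENT_READ, echo)

    def echo(conn):
        data = conn.recv(1024)
        if data:
            conn.sendall(data)          # echo back; never blocks for long
        else:
            sel.unregister(conn)        # peer closed the connection
            conn.close()

    server = socket.socket()
    server.bind(("localhost", 8000))
    server.listen()
    server.setblocking(False)
    sel.register(server, selectors.EVENT_READ, accept)

    while True:                         # the event loop: one thread, many sockets
        for key, _ in sel.select():
            key.data(key.fileobj)       # dispatch the registered callback

No thread per connection: the loop simply runs whichever callback is ready
next, which is exactly the model Node builds on.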

------
Murkin
I am surprised no one has raised smart devices - I mean the hundreds of
planned smart, wireless, SoC devices that are prophesied to become deeply
ingrained in our lives in the coming years.

ZigBee/WBAN/RFID and the technology around them.

This field has myriad required applications:

- Firmware upgrades
- Control & monitoring
- Intercommunication tools
- Security

and more and more and more.

------
mark_l_watson
I just blogged my predictions for the hot tech of 2010: wireless, analytics,
modeling, data/text mining, micro-business development with small, very
focused, charge-for-use web apps, Linked Data, etc.

The thing is, choose something that is fascinating to you; otherwise you
probably don't have as much chance of success. I would suggest going with what
most interests you.

------
JangoSteve
How about the brain?

How the Brain Encodes Memories at a Cellular Level
[http://www.sciencedaily.com/releases/2009/12/091223125125.ht...](http://www.sciencedaily.com/releases/2009/12/091223125125.htm)

If you've ever read Mindkiller or Time Pressure by Spider Robinson, then you
know where I'm going with this ;-)

------
staunch
How about iPhone development? Being early in a gold rush has its advantages,
but there are also advantages to coming in later.

Or Flash? I know it's not loved by very many, but expert Flash developers can
demand a very good salary. They're also very well positioned to develop cool
stuff on new web platforms.

~~~
iron_ball
And, speaking as an ActionScript hacker, the language is very good to work
with. It takes all the best aspects of JavaScript, gives them some (largely
optional) Java-like safety/structure features, and puts them in a much saner
API environment.

~~~
aaronblohowiak
The language is pretty cool; too bad the VM and compiler are terrible.

~~~
blasdel
Ha, I think the language is pretty mediocre (making JS superficially
enterprisey is not a good idea), but the VM itself is pretty decent,
especially on Windows.

The real problem with the Flash implementation is in the native runtime, not
the VM. It just struck me that Flash is a ripe candidate for a Smalltalk-style
turtles-all-the-way-down runtime implementation!

------
spiralhead
Scala! Specifically, using actors for concurrency problems has been a bit of a
revelation for me - and the functional parts, if you're not already familiar
with FP.

And for practical reasons: Scala is one of the only academic-ish languages you
can actually use in the real world.
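
For the flavor of the actor model - sketched here in Python rather than Scala,
to keep this thread's examples in one language - an actor is just a mailbox
plus a single consuming thread, so its state needs no locks:

    import queue
    import threading

    class Actor:
        """Mailbox plus one consuming thread; state is never shared."""
        def __init__(self):
            self._mailbox = queue.Queue()
            threading.Thread(target=self._loop, daemon=True).start()

        def send(self, msg):
            self._mailbox.put(msg)      # asynchronous message send

        def _loop(self):
            while True:
                self.receive(self._mailbox.get())

    class Tally(Actor):
        def __init__(self):
            super().__init__()
            self.count = 0              # touched only by the actor's thread

        def receive(self, msg):
            self.count += msg

    t = Tally()
    for _ in range(10):
        t.send(1)                       # safe to call from any thread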

------
sriramk
Data. R/Hadoop/Hive/Pig/Dryad, SQL Server streaming, Excel PowerPivot and the
works.

------
HeyLaughingBoy
Sales.

------
tyohn
Mobile devices - not just the iPhone. Think: "What would software/hardware be
like if you could carry your desktop machine everywhere - inside your
pocket?"

~~~
maxklein
Mobile devices are a good and safe bet, particularly cross-platform stuff.
Fragmentation is not going to occur; everyone knows what is at stake, and
nobody wants it to happen.

------
antirez
a) Every kind of concurrency stuff. CUDA, threading, Go/Erlang/... light
threads, and so forth.

b) Alternative databases.

c) Real-world scalability.

------
aaronblohowiak
OMAP3 and OMAP4

------
Veera
Semantic web

~~~
maxklein
2002 called, they want your comment back. No, seriously: the semantic web has
been talked about for a long time. People don't understand it, don't care, and
don't see how it will make them money.

~~~
nzmsv
Google is starting to look at microformat data. All that needs to happen for
more semantic tech to take off is for the search engines to pay attention.

As soon as a technology translates into an edge on the web, it'll get
implemented. This fixes the monetary motivation problem, and forces people to
solve the other two :)

------
zen53
Statistics/data mining/web analytics. Maybe iTablet development.

