Robots are still rising. Drones too. Voice recognition for everyone with Siri and the equivalent on Android. Natural language queries in Wolfram Alpha.
Cheap, reliable 3D motion detection via Kinect.
3D printing. Tissue printing and a $99 genetic scan (https://www.23andme.com/).
Self-driving cars.
Raspberry Pi boards.
iPads (I had to check that one: the first iPad launched only in April 2010!) and competing Android and Windows tablets.
* Or these lists:
* Things that were new and not so well known in 2007, but are big now:
The rise of online education via Khan Academy, Coursera, and Udacity.
Workable electric cars.
However the io9 link's contents are the f-ing future and they didn't even scratch the surface. Our understanding of biology is getting better and better, automation is finally starting to get wide acceptance both in industry and academia, and there is so much work being done by universities today that was pure science fiction only a decade ago.
The future is sneaking up on us quietly, and once it's here, there's no going back :)
For example, a friend of mine recently argued that there haven't been many inventions in the last quarter of the 20th century and beyond (compared to the first three quarters of the 1900s). It took me some time to explain that having the internet, self-driving cars and computerized prosthetics is fucking amazing. The Louis CK clip ("Everything is amazing and nobody's happy") helped too :)
ADD: found the term! Rosy retrospection!
Things which are "the f-ing future" aren't on the mass market yet. When they finally are, you could say with some justification that "this is just an incremental improvement on the real breakthrough of 2012" ;)
It is apparent to me that if you had the shekels to throw in a certain direction, 3D printing would be one of them. It seems to be hitting that inflection point.
Self-directed tuition is the other: Khan Academy, Coursera et al., cheap scalable global education. Total game changer. When has this happened before? Never.
I think the year that a new paradigm-shifting technology is first conceived, then developed, then implemented at any scale, all within a single year, will be a time to celebrate the end of humanity and the peak of the singularity!... (if it ever happens)
Maybe TechCrunch should step away from the desk/SF and put more reporters on the ground in other cities, instead of relying on so much inbound pitching? Technology, loosely speaking, is becoming an everyday aspect of many businesses, so maybe TC needs to reconsider its editorial position, too.
There are tons of other companies in cities around the US (and the world) less interested in getting caught up in the SF noise. Spending PR budget to target TC just for some ego exposure amongst a select group of peers, at the cost of a less targeted audience, isn't a wise decision for most.
Austin has quite a few smaller but self-sustaining tech companies doing pretty interesting things on a regular basis: $AWAY, $BV, $SLAB, Chaotic Moon, Mass Relevance, and numerous other fresh startups like Outbox and DailyDot.com. I'm sure the same is true for other cities.
Don't I have the USA Today and Mashable for this? Did AOL make this decision?
Saving the USPS. http://www.nashuatelegraph.com/business/988285-464/austin-st...
Thanks for finally waking up Michael.
The companies that are doing actually interesting things are using technology to make non-tech stuff easier/better/faster/brighter: Uber, AirBnB, Zipcar, that female-led startup connecting farmers with buyers, etc.
Technology for technology's sake is done and boring and short-sighted. We've got news aggregators that aggregate other news aggregators. Mobile ad platforms that resell ads from other mobile ad platforms. Photo-sharing apps whose entire purpose is to create filters for other photo-sharing sites. And it's all ad-supported.
Arrington's right: it's all just the same thing over and over, and it's boring. And I never thought I'd agree with Arrington about anything.
So enjoy getting bored, because after the singularity you will only be perplexed and petrified. (Unless you are upgraded or modified.)
> I love how Arrington, Thiel, Graham, and others spent years talking about how you don't need any college, promoted quick-buck social media startups with no plan, and kids fresh out of high school, and now they are sad there is no flying cars. [...]
And Mr. Arrington is bored? Maybe he's just not paying attention.
Every decade has had a game changer who came out of nowhere. Who is going to claim the 2010s?
Personally I think we're on the edge of a revolution, with the rapid shrinking (and increase in power) of computers, and Google boldly pushing into some pretty deep AI.
Intelligent personal assistants. Self driving cars. Wearable augmented reality.
There's going to be a lot of exciting stuff to write about in 2013.
Don't want to sound too cynical here, but give me a break. The health care industry has persistently resisted technology for two decades, and any radical revolution in medicine will face so many public hurdles beyond what has already been dealt with that I don't see anything changing for Average Joes for decades, just because of inertia, the powers that be, and special interests.
I mean I still can't get medical care at all without cleaning out my bank account because I'm uninsured. It is personal bias, but I'd rather see that fixed first. $500 - $1000 a month in healthcare is obscene.
There's no exit strategy, though, so VC aren't interested.
I think hardware makers are doing a great job of pushing the envelope, but software not so much.
The question is, what's next for software? At the time Twitter came out, it was brand new...amazing...innovative. I can't recall feeling the same about anything since.
My own personal belief is that embedded software is going to be the next thing, but the problem there is the barrier to entry is a bit higher, both in terms of knowledge base and financial cost. That said, tech like Bluetooth LE is making it easier than ever to try.
I've started hearing that embedded software control via cheap Android tablets ($50-$100)/hardware interfaces is beginning to take off and reducing the barrier to entry. Pretty much, you can use a $75 Android tablet to control an embedded device and drive the user interface. iPads are, frankly, a bit too expensive for most hardware applications.
If I had the energy, I would champion an open-source project for niche hobbyists who want to program a particular type of hardware (assuming their hardware has some kind of relatively open, documented serial/network interface - something like ZigBee but less complicated, maybe) but aren't necessarily programmers. Build the project such that the UI and hardware-control layer run on Android, and can be abstracted out from the actual hardware later. The UI is not going to be snazzy-looking (sliding menus) but reliable-looking - like an industrial control, maybe a bit better.
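To make the "abstracted hardware-control layer" idea concrete, here is a minimal sketch of what such a project's wire protocol might look like. Everything here is a made-up assumption for illustration - the `set`/`get` verbs, the NMEA-style `$payload*checksum` framing, and the function names are all hypothetical, not any existing standard - but it shows the kind of small, documented, hardware-agnostic layer a tablet UI could sit on top of:

```python
# Hypothetical wire protocol for a hobbyist serial-controlled device.
# Frames look like:  $set,relay1,1*XX\n  where XX is a two-hex-digit
# XOR checksum of the payload (an NMEA-style convention, assumed here).

def checksum(payload: str) -> str:
    """XOR of all payload bytes, rendered as two uppercase hex digits."""
    c = 0
    for b in payload.encode("ascii"):
        c ^= b
    return f"{c:02X}"

def encode(verb: str, key: str, value: str) -> bytes:
    """Build one frame, e.g. encode('set', 'relay1', '1')."""
    payload = f"{verb},{key},{value}"
    return f"${payload}*{checksum(payload)}\n".encode("ascii")

def decode(frame: bytes) -> tuple:
    """Parse a frame back into (verb, key, value); raise on corruption."""
    text = frame.decode("ascii").strip()
    if not text.startswith("$") or "*" not in text:
        raise ValueError("malformed frame")
    payload, _, ck = text[1:].partition("*")
    if checksum(payload) != ck:
        raise ValueError("bad checksum")
    verb, key, value = payload.split(",", 2)
    return (verb, key, value)
```

The point of keeping this layer as pure functions is exactly the abstraction argued for above: the Android UI only ever calls `encode`/`decode`, while the actual transport (Bluetooth LE, USB serial, a socket) is swapped underneath without touching it.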
In the end, it seemed the brighter students destined for real programming were relieved to get into a text editor. The "success stories" focused on people who could not otherwise program, but the cost was imposing too much abstraction on the brighter users.
So I think it's a hard problem, and thus a very worthy one to pursue.
Celebrating funding like winning the Startup Lottery is only going to feed the monster of hype vs real innovation.
Is it just me?
Now, this may seem like a very, very minor thing (and it's not unique to Turkey). On its own it says nothing, but the longer I've been away from the US, the more I've noticed just these sorts of little things. It's the sum total of these "little things" that indicates to me that the US has, on a very fundamental level, stopped innovating.
Oh, sure, the US will continue to produce new things. It'll probably even produce one or two big new things in the next couple years, but it's not the big things that drive the innovative spirit. Gather enough smart people in one place, give them enough time and enough money, and you're pretty much guaranteed to get a major innovation (or, at least a driverless car).
You'll also waste an enormous amount of resources. A society that is innovative at its core has only to foster that innovation, in any small way, to get a far greater return on investment. In short: America has a tremendous head start, but America has become complacent. The "next big thing" just might come from somewhere where the people are still hungry.
Geographically-targeted versions of items are custom tailored for desires of local markets. Even if we assumed none of the TP brands in Turkey are multinational corporations, do you really think that the carrying strap idea wouldn't have been nicked by companies that do sell to other markets?
Living in NYC and buying a big pack of toilet paper is an outlier as far as the US is concerned. Anywhere else in the States you would put the pack in your shopping cart and roll it directly to your car door.
For all we know, those straps could have been designed in the States, and the bean-counters decided adding the straps to the case manufacturing process is only cost-justified in plants serving an average population density of X and above.
I only bring this up because I have, in the past, been working in the States and built better versions of products for sale exclusively in Europe.
Again, it's not that any one particular example stands out. For almost every one of these examples I've noticed, I could come up with a way to explain away how it isn't a difference in innovation. I, too, have thought about how shopping-carts-to-cars does not necessitate the straps. I have thought that the parking lot lights might not have a significant or measurable enough ROI to justify their installation to US mall owners... but isn't that the point?
It's not that any of these things are needed or demanded or even entirely justifiable from a purely economics-driven viewpoint. But since when has an economics-driven viewpoint led to great innovation? Isn't a prerequisite for innovation the willingness to look beyond simple, straightforward economic arguments? To be creative for creativity's sake?
It's in this way that I mean the US has stopped innovating.
...but, again, it's not any one particular "invention" but rather the whole attitude that's different. The US has gotten complacent and seems to be spending more effort on justifying why it's the greatest than on actually, you know, being the greatest.
But, I understand your point!
As anyone here can attest, innovation is hard -- really hard. I'm a nobody and I see this firsthand in my daily life. I've been working on a side project for years that I hope will change the world, and I planned on building a proof-of-concept prototype last weekend. I expected to be done with the prototype in 2 days. It took over a week. A quarter of the way through construction, the code for the prototype just kept getting more and more complicated and kludgy, even though I thought I had designed it well enough to be straightforward. Although I eventually had to redesign the whole thing, the final design ended up being much better and more reusable than the original POC I had planned. I was fortunate enough to be in a position where I could spend the time to rethink the design. There were plenty of stories in 2012 about products that shipped half-baked because they were released too early.
I suspect we will see many interesting things in 2013, much of which has been incubated and polished during the time that Michael thought nothing was happening.
Looking back, I was more surprised by the rise of Google (and Android/Linux) in a world dominated by Microsoft. But at 28, I'm probably just a little older than you.
I'd also argue that the reason the last milestone was in 2007 isn't the iPhone, but that 2007 was just before the global financial superbubble popped (foreshadowed by the bursting of the Nasdaq bubble in '02). IMO, that was the earthquake which stymied true attempts at innovation. Many people stopped trying to innovate and started worrying about finding a safe place for their retirement fund (still are), or worse, finding a job.
Shoot me an email if you feel like sharing your POC project with a fellow dreamer.
Microsoft dominated the mobile phone space? Ubuntu is doing OK, but I don't think it is taking the productivity market by storm. I am more impressed that Apple kept its propaganda campaign for OSX going so strong that it became its own self-perpetuating must-have device trend, for no other reason than shininess and street cred for 99% of buyers. I'm 21. When I was still in middle school, RIM was in its prime.
> IMO, that was the earthquake which stymied true attempts innovation.
Tech investment has barely slowed down since then. A lot of us are still out of work because society hasn't adapted to the continuing trend of productivity being high enough that not everyone needs to work 40 hours a week to prosper, but the tech sector is still there. The real reason for the innovation slowdown is extreme patent trolling and abuse in the US. The IP laws here, especially software patents, are ruining potentially revolutionary tech through systemic abuse.
Thiel said that most of our vertical progress comes from Silicon Valley, which I agree with. But I think SV also has a lot of horizontal progress, which is what Arrington is complaining about. Everyone and their mother is working on a social app, and every software company is going apeshit about mobile. Granted, tech news coverage is probably skewed in favor of horizontal progress, since vertical ideas by definition are so brand new and "out there" that most people would dismiss them as crazy. Still though, it would be great if someone came up with the next technology as innovative and groundbreaking as the iPhone or, to a slightly lesser extent, the iPad.
You'll never be bored again.
Be advised, however: even if you manage to identify one such sharp pain point, and even if you have a solution, only rarely will you successfully repackage it and market it to the masses. But to put it in perspective, if you're working towards an affordable solution that 99% of the world is compelled by extreme pain to use, you might very well be working on a billion-dollar business.
The question is, when are more startups going to realize that the biggest pain in the room is as clear as day? It's right in front of every single one of us. It's only a matter of time before people in massive numbers realize that 24/7 surveillance of all telecommunications isn't fucking cool. People are already flocking to VPNs to get around downloading restrictions, so what happens when people realize they need a VPN to send a private email? Here the truth is contagious, and government elements can only repeat lies successfully for so long. The problem of the surveillance state is the very definition of sharp pain:
Compared to the markets for cosmetic surgery, real estate, knee pain, back pain, ANY market you can think of - the prospect of having all of your telecommunications stored indefinitely if not monitored in real time by regimes teetering on the edge is an order of magnitude more concerning. When you consider that a 24/7 surveillance state by definition constitutes neverending pain, it's really no contest.
Unwanted, unconstitutional surveillance measures are creating a searing hot, absolutely intolerable pain for the entire human population. This is a pain that demands a solution, ASAP.
As a startup and as a developer, there are very few pursuits more worthy of your time than furthering human rights and averting absolute tyranny.
Curiously, thanks in large part to Bitcoin, the startup community now has the power to fund itself anonymously and innovate solutions that actually matter, without outside interference.
The clock is ticking.
But I never hear about what the harm is except for the exceptional cases. Anecdotally, none of my acquaintances has had a single adverse life event that can be attributed to these trends.
On the other hand, adverse events related to money/jobs, health issues, issues with intimacy and social isolation, being too busy: these are daily occurrences.
I'm bringing this up here just because your post reminded me I wanted to run this question by the HN community.
The funny thing is that techcrunch is one of the most guilty sites when it comes to writing about these things.
We didn't get flying cars at $250,000/car. We got 140 characters for free because that's what people wanted. We didn't get hoverboards, not for lack of technology, but it just turned out that we wanted to search the world's knowledge with Google.
We're frogs in boiling water, not knowing just how innovative we're becoming as a people. And sci-fi writers are just bad (or as bad as typical entrepreneurs) at guessing what people want.
Predictions are uttered by prophets (free of charge), by clairvoyants (who usually charge a fee, and are therefore more honored in their day than prophets), and by futurologists (salaried). Prediction is the business of prophets, clairvoyants, and futurologists. It is not the business of novelists. A novelist's business is lying.
Seems like a case of self-imposed blindness. Narrow your field of vision until you don't see anything except startups oriented around mobile, iOS, and social media, and then complain that you're not seeing anything revolutionary. Tablets are arguably the biggest change in personal computing since the internet reached consumers. To complain that things are boring is bizarre.
"Whenever you find yourself on the side of the majority, it is time to pause and reflect." - Mark Twain
So what's next? Disregard the mean and look for the outliers on both sides. Those are the true survivors, the extremophiles. When the mean shifts, they will have already adapted to the new environment, have succeeded, failed, persevered and flourished. So go there, fight and thrive because the future favors its creators, the extremophiles.
Easy to point the finger outside, hard to point it inside.
Overall it sounds amazing, yet for me the thought of no longer needing to buy objects or clothes is a bit disconcerting. Our economies are struggling already. As tech continues to evolve alongside population increases, I worry about how the majority will sustain themselves.
1. You would start buying designs instead of products. You can still go shopping for them, just in a meta-level kind of way.
2. You are able to change designs, get designs changed by others (at a shop, maybe), share designs with others (a social network of things?), communicate about these designs, and so on. Stuff can be personalized on a micro and macro scale, whereas now it can only be personalized on a macro scale by combining existing items. With 3D printing you can personalize the items themselves, too.
3. There will be a growing market for hand made stuff, albeit more exclusive. I think the first couple of decades the distinction between printed clothing and clothing made from fabrics will be clear. That means that traditional clothing will keep its value while printed clothing will be valued less.
If anything, shopping will become a more immersive activity, I think. It'll change from a materialistic activity to a more service-oriented activity.
Pg, time to tweak the algorithms again.
Build something yourself.
The people who are not bored.
If you're running a toothpaste factory, you don't need vision. You just need competent execution. You need people to show up and follow the plan without complaint. This kind of work can actually be managed. If you can cut costs without compromising on quality, you do it. It's not about vision. That's already solved. Vision was relevant long ago when someone figured out how to make toothpaste. Your job is just to keep abreast of competitors and seek rent.
VC-istan is a postmodern startup factory. It's technically not a company, but as a tight social network, it functions as one. VCs, rather than properly competing, talk to each other and agree on who they like and who they don't. VC-istan's "serial entrepreneurs" are just glorified PMs whose egos make them unemployable. The real bosses are the VCs. They want quick exits, as they should, because that's how the incentive systems that govern them are set up. The "tech press" are a sort of HR organization. Entrepreneurs are PMs, often mediocre ones, and engineers are chumps paid in lottery tickets. This is just a big company that has managed to dress itself up as a thousand small ones that happen to be all controlled by the same people.
Now you have an ecosystem of commodity entrepreneurs hiring commodity engineers to implement commodity ideas. Ok, nothing to see here. Mobiles skwrking, mobiles chirping, take the money and run, take the money and run, take the money...
Is it any wonder that this isn't producing innovation? It shouldn't be. Yet VC-istan is doing a lot better than most large companies do. It seems inevitable in large organizations (including economies) that the resources gravitate toward players who don't have much in the way of vision. Innovation is the exception. It adds variance. Given that we're social animals who judge one another based on reliability (low variance/minimal performance) rather than capability (expected return) it is socially dangerous.
How many people have the talent and the resources? I'm sitting on +4 sigma talent but have no money. The people with boatloads of money seem (with a few exceptions) to be lacking in vision (which means I can't tell if they have talent, but I have doubts). I can't say that I blame them. Why take risks if you have no need to do so?
What we have now is a generation that's used to technical progress and wants to take part. We have people who have been working their asses off since age 4, are now in their 20s and 30s, and want to take part in technical innovation. Most of them can't, because there's so little of it actually going on, and because most work activity is bullshit oriented toward keeping one warlord boss's status high at the expense of another's, rather than being invested in true progress. That's depressing. It creates a malaise. A deep sense of ennui. Yawn, another fart app. We now have an unprecedented number of ridiculously talented, over-educated people saying, "Dude, where's my machine learning job?"
I think that 2013 will see the beginning of a Flight to Substance, and if I'm right, that will put that talent to better use than fart apps and toilet check-in services. I don't know how it will play out. I have no clue who will fund it. One sign of this is the increasing clamor for Valve-style open allocation. By the mid-2010s, you won't be considered a real tech company if you're running closed allocation (take heed, Google). If I'm right, that will help. That will help our generation work its way toward excellence. At least, some of us will go in that direction. Others will go off into the weeds of fart apps. May the market reward both crowds justly.
Here's how we fix tech.
* Open allocation. As long as the work is relevant to the company's needs, let people work on whatever they want. This enables native growth of technical talent. You don't have to poach qualified people with ridiculous signing bonuses. They quickly find a project that fits their skills and interests, and they actually improve while they work for you. Imagine that.
* Stop fetishizing either extreme of company size. Not all large companies are bad, not all startups are good. Nor vice versa. If your 50-person startup is running closed allocation with typical HR policies, then it's just a big company that failed to get big and it should be considered a massive joke. I've heard of people getting turned down for transfers in 20-person companies because of "headcount" limitations. If you want to work at a startup, then drop that shit and work for a real fucking startup.
* Demand work on hard problems. Don't build someone's fart app for 5% equity. If you're in the press, don't reward stupidity either. Instead of cheering on idiots who get acquired for outrageous sums, ridicule them.
Intellectually, I know there's a huge difference between a programmer who has 99th percentile programming skills and one who has 90th percentile programming skills. Being a non-programmer, I don't have the tools to easily tell the difference.
This means my base instincts will tell me to work with the person who is a good programmer and speaks confidently and clearly, vs. the exceptional programmer who's shy and mumbles. That's because as far as my brain can tell, they both have the same degree of programming ability.
I work very hard to solve this bias. I make sure to focus on people's strengths and ignore weaknesses that aren't obvious dealbreakers. I try to maintain wide social circles and ask my most successful friends what standards I can use to judge skill in different areas. I pay a lot of attention to how other people in their industry judge them, vs. people in general.
What I've found is that there are two kinds of people who are really diamonds in the rough. One is people who are generally smart and have shown the ability to excel in a lot of fields, but haven't had a lot of experience in the field you're talking about. Think about a recent graduate who has an exceptional understanding of History and Psychology and is an excellent tennis player. This kind of person will usually learn very quickly and excel at jobs that demand a high ratio of problem-solving ability to knowledge.
Another kind of diamond-in-the-rough is someone who is admired by people in their field but somewhat disparaged by others. A good example would be a Salesman who always exceeds his quota but struggles with basic math. People who find math easy will typically think of this fellow as stupid, even if he's a genius at what he does.
Pay attention to people like these. It's all too easy to make a mistake like "If someone can't do math / can't speak coherently / failed high school / doesn't understand technology / uses Internet Explorer, then they must be stupid." In my experience this is almost never true: having a weakness doesn't mean that someone doesn't have a certain strength. Focus on the strengths and you'll end up working with (and hanging around) much more interesting, much more diverse people.
Is this a major business problem for you? Are you trying to solve it with actual money and process, or just winging it really hard every time? There are known good programmers (USA Computing Olympiad), and some of them will be articulate; you could find someone who's a good fit and hire them as a consultant to help tell the difference... though that's just off the top of my head.
Every time I hear somebody describe a "big problem" they've been "trying very hard to solve" I wonder whether they've focused on it enough to (a) step back and think about possible tools to make the job easier (b) resort to professional specialization (c) make it the job responsibility of a particular person or (d) spend actual money.
So, maybe this will come as a shock, but "computing olympiad" success doesn't correlate with success as a professional programmer. There's so much more than raw intellectual horsepower to being a good team engineer that it can't be captured with any one test.
Which is to say, the parent is right. Identifying good programmers is a hard problem. Harder than identifying people who do well at coding contests.
IOI and similar programming competitions do not select for good programmers. They select for people who can solve small algorithmic problems efficiently in a short time period with throwaway code. There is some overlap, but also a lot of perverse incentives. In professional programming, the tortoise usually beats the hare.
(I have competed at a national level and won a national college-level programming competition myself, and don't really take it all that seriously.)
I'm not just trying to find talented programmers. I'm trying to address the more general problem of figuring out whether somebody is good at X. You're right that the easiest way to judge on a case-by-case basis is to ask known experts. But since this is/was a cognitive bias of mine, I'm trying to overcome it myself.
I have spent a lot of time, done plenty of reading, and spent my own money on this problem. Like you said, I have actually paid experts to give me advice on how to identify talented people in their field. More frequently, though, I'll network and ask in an informal setting. And as a result, I have managed to correct several severe biases I had, such as thinking that anyone who couldn't do basic math must not be very bright.
One thing that's relatively easy (i.e., it only took me a few months of effort) is learning how to identify generally smart people who learn fast. (I would actually divide this into learning social systems quickly vs. learning technical systems quickly.) If two people are entering a field or starting a job at the same time, I can generally predict which one of them will learn faster after a 30-minute conversation. The biggest surprise was that very smart people can be catastrophically bad at things "normal people" would consider basic, like eating spaghetti or not offending their interviewer (or even realizing they'd offended someone). Before, I tended to assume that fast learners would be good at most things and exceptional at a few things.
A more challenging problem: say I meet a lawyer, or doctor, or anyone with a lot of experience in an area I don't know much about. How can I tell if they're genuinely good at what they do? So far, general intelligence + percentile-based accomplishments ("I boosted revenues by X which is better than 95% of marketers...") + asking them to walk through a real life situation has been my best predictor. I'll validate this by asking known experts afterwards or looking up the person's career track.
Even so, it's not a particularly good predictor, and asking a real expert works much better. But by looking at actual data, even if it's relatively weak and anecdotal, my ability to accurately judge competence has become much stronger over the last few years. I think this is something that most businesses could benefit from, but the more common response seems to be throwing up their hands and complaining about the lack of qualified people.
Edit: when I said "how we judge talent is actually one of the biggest problems", I should have specified "we as a society." It clearly came across as "we as a company", sorry about that.
People want a programmer who can only juggle 3 items, but does it so well they are juggling chainsaws. That's easier to identify - it's their production (portfolio = chainsaws) and the quality of their production (referrals = not missing any arms). If you can verify that people coded what they claim they coded, it should be fairly easy to tell who the good programmers are.
If you're having trouble telling who the good ones are, you haven't found any yet.
To extend the analogy, there are plenty of jugglers who are capable of juggling 3 chainsaws, but find that they can only get paid to juggle 2 plastic balls over and over again. Variation in their craft is punished, not rewarded. If you only look at whether they've juggled chainsaws before, you'll miss out on a lot of great talent.
I've seen many, many people who are much better than their portfolio or past accomplishments indicated, especially people in their 20s who've just happened to work at crappy companies and end up on bad projects.
Someone who's never juggled before may not be capable of juggling three chainsaws at first, but you can look at their athletic record in other areas, examine their flexibility, coordination, and willingness to practice, and use these to predict whether that person will be able to learn how to juggle chainsaws. How do you figure out which of these areas matters? By asking existing expert jugglers.
This hits too close to home. I am trying to figure out what to do about it.
In my case it came down to the realization that the standards people use to judge your skills are very arbitrary. Once I went out and got a few certificates I started getting job offers. Nobody cared about the fact that I was smart enough to learn SAS in two weeks or make my previous employers large sums of money, they only cared about the fact that I had the certificate. C'est la vie.
In every industry there's some sort of bar past which people start treating you like a real person. Getting past this bar often has little to do with merit: think having a degree from a "prestigious" college or having exceedingly specific previous experience. Try to figure out what that bar is in the field you want to work in and how you can show that you've cleared it. Programming is a diverse field, but a few bars I've seen people use (not saying I agree with these) are "has experience in Ruby/Python/Functional Programming/my favorite language", "has contributed significantly to open source", "recommended by someone I know who is competent", and so on.
The last one is IMO the most valuable bar to clear. The more you can get out and know people, and show them that you're competent, the easier it is to get better jobs and better projects in the long run.
Hired to be a programmer. Showed up and they decided I was going to be a business analyst. I've got significant experience in .Net, Scala and NLP but I can't even get companies to reply to my applications. I despise my job but I can't quit yet and I can't find another job. I'm working on side projects to build up my portfolio, but right now I just really hate waking up and going to work each day.
My advice: keep learning cool stuff, go to networking events (e.g. meetups, especially those related to NLP, machine learning, and Scala) and get some job leads there. Cold-emailing your resume rarely gets you anything, especially when you're in a career sand trap. But you have a skillset that, if you're strong in what you've cited, is desirable.
You will eventually get a feel for how much investment you need to put into your day job to keep it, and you can use the remainder of your time (which may be 20-35 hours of your work week) to keep current with the skills you want. If you're writing code on company time, be careful and make sure not to use it for any closed-source purpose, because you don't want your employer asserting ownership.
You can get out of the sand trap but you'll have to break the rules to do it. Stealing an education from a boss feels dirty when you're young and naive, but it's a necessary survival skill and, in a world where bait-and-switch hiring is common as dirt, not at all unethical.
I do extremely little for my job, so little I honestly started to wonder if I was missing something. Then I slowly realized my coworkers are morons. A few weeks ago, I was asked to create a directory structure for a bunch of incoming data files, a few hundred directories all told. Another person was tasked to do the same on a different server.
I spent about ten minutes on it because I wrote a shell script. He did it by hand and spent all day. Which, of course, is what I told my boss it took me as well. I just happened to use the rest of my time to read Akka in Action.
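For that kind of task, a few lines of shell really do replace a day of clicking. A minimal sketch of the approach (the base path and the year/month layout here are made-up examples, not the actual structure from the story):

```shell
#!/bin/sh
# Build a dated directory tree for incoming data files in one pass.
# "incoming_data" and the year/month scheme are hypothetical examples.
base="incoming_data"
for year in 2011 2012; do
  for month in 01 02 03 04 05 06 07 08 09 10 11 12; do
    mkdir -p "$base/$year/$month"
  done
done
```

`mkdir -p` creates intermediate directories as needed and doesn't complain if they already exist, so the script is safe to re-run.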
I can't promise equity in a company, because this idea was looking more towards being a lifestyle business. Revenue/profits of operation would thus be the thing at stake.
But I agree with your basic point. If you can get good at spotting talent that has "flaws" that would disqualify them from the more typical elitist hiring mills, then you have a huge advantage.
What it comes down to is that engineers are better suited to evaluate engineers. Obviously, you've got to have final say as leader of the company, but the engineers are your best asset here. This is why good VCs always have a luminary in their back pocket to send out to evaluate tech before an investment. Even technically competent VCs would rather send an experienced database developer to look over a new DB company than rely on their now 15-year-old experience.
Depends what you're building. If you need cutting-edge programming expertise or knowledge of computing's deep magic, then yeah.
If you're building a CRUD database or a local-mobile-social app, you don't need the expertise that much.
This I disagree with. Causes of success and failure are hard to tease out, but people know when software sucks. Also, if people can't use it, then all the cleverness in the world (in optimizations, for example, or in feature set) doesn't matter. It still sucks.
But the broad middle is a muddle. People with talent that has been squandered by circumstance, people without talent who have succeeded due to accident and luck, etc.
That's very true.
This is part of why I think Valve's self-organizing open allocation is superior. Management rarely knows who the best programmers are. The group of programmers, if they're good, usually can figure out an appropriate leader on a per-project basis. This doesn't generate the permanent, entitled leadership that management wants to see, but it gets the job done.
You also need to create room to fail. If the software project can't be saved, let it die so people can allocate their talents and energy to something that has better odds. Give the architects respect for trying and ask them to write up the challenges they encountered, so as to keep the knowledge in house. If you fire people for failing projects, then you lose that knowledge and will probably fail in the same way again.
You talk about software that sucks. However, one of the quirks of software is that sometimes even when it sucks it can succeed, and even when it doesn't suck it can fail. For example, technologically Google Wave didn't suck, but in terms of actually providing useful features for people that justified its use, it didn't have a leg to stand on. Then look at WordPress, which started out sucking but, because it was open and had developed a strong community around it, ended up getting better and better until it was finally sort of decent. Or look at PHP. As a language it definitely sucked at the start, and there's a strong argument to be made that it still sucks. But it is perhaps the most popular language for web development in history.
Most software is even more difficult to judge as a success or failure because, even though a software project might not be a success immediately, it could be a success down the road. Another case in point would be the Mozilla Project. At the outset it sucked, but eventually it became pretty awesome. How much of the awesome of today's Firefox is rooted in the code from the early days of Mozilla, and how much is due to subsequent dev work? How do you tell the difference between software that sucks because it is rotten through to the core and software that sucks because it has a layer of crap on top of awesome internals?
And then how do you track everyone's contribution to software? Sure you can keep track of commits, but that doesn't track inspiration and ideas. Sometimes the fundamental design or mechanism for a given piece of software will mostly be due to a different dev. than the one who implemented it in code, and often there is no paper trail whatsoever that that's the case.
Ultimately there's no objective way to measure either talent or success except in the extremes. Some people's subjective estimates can still be reasonably accurate though, but usually it takes a talented and experienced dev. to be able to judge another dev.
So, if people have an unfailing sense of good and bad software, which they might not, it would be eventually consistent at best.
You're also right that individual talent is, for the most part, impossible to measure.
Valve's ideology is "We hired you, we trust you." That doesn't mean that they give new hires all the keys, but the going assumption is that anyone who gets in is a competent adult who doesn't need to be restrained with typical, military-style subordination. If you're going to hire someone, then trust that person with his or her own time. If you can't, then don't hire.
Most companies grow too fast and end up hiring before they trust. This causes a loss of focus, because they need to generate busywork projects for the new hires, but it also creates a dynamic where there are Real Fooblers (for a company named Foobar) and everyone else, and the company has no problem generating a bunch of shit work for the "everyone else" category so the Real Fooblers can work on the fun stuff.
Talents of leadership and architecture can be assessed later on, but everyone worth hiring should start out with the basic right to direct her career and, when the time comes, prove herself.
When I say that it's not legitimated, I'm not saying that it's illegitimate philosophically (although I think it is) but that such managers do a poor job of convincing other people that their power is legitimate.
In the military, most people buy in to the rank system. A major component of the abuse inflicted in basic training is to tear someone's ego down and build the person back up again as someone who can take orders. The result is that a lot of them come out of the process genuinely believing that the commanding officer's power is legitimate. In most companies, nothing happens to convince the grunts that the managerial power is legitimate, since most managers are puppet leaders rather than the leaders that the group would pick.
Don't they convince their shareholders? These companies keep making it big even with tremendous overhead of useless management and they seem to do it by pitching investors and locking down markets.
Then you shouldn't. Period.
EDIT: It is unfortunate that these types of people (quick learners, geniuses at a subject) are usually not considered for engineering jobs, because the interviewers don't know how to interview. They ask stupid technical questions and want code on a whiteboard instead of having a technical conversation.
Why not do calculus with an abacus?
Not to mention the problem of people who make hiring decisions having no idea of what the problem they need to solve is and hire unqualified people under false pretenses. The old bait and switch.
I am planning on putting up a 'Proposition HN' in a week, with this exact thing in mind. I want to pay talented HNers to work on their dream/vision/side-project in exchange for participation/equity.
Watch this space.
trumanshow.. I just created a throw-away gmail account:
drop a line there and I'll email you a link to the HN post once it's fleshed out and posted.
+4 sigma for certain things, not overall. It's not even possible to measure "+4 sigma overall intelligence" and if it were, I doubt I'd be the one who has it. But there are certain subdisciplines where I'm in that range.
I say "+4 sigma" because for the vast majority of companies I've observed, I could run them better, and they seem to at least think they have 2-3 sigma talent.
If we accept that the people who actually are in charge are +2 to +3 sigma minds, then I'm easily +5 based on the difference between me and them. If we assume they're idiots, then it'd be generous to give me +3. The reality is probably between the two.
~ = snark, allow me one snark ! :)
This would have been great when I had time off to work on my compiler back in the summer. Oh well, the algorithm still had/has bugs in it anyway.
I'm actually reasonably happy with where I am, because I have a career strategy that I think will work. Startups aren't the only path and, if you have no connections and are going to be "just a programmer", they're not always even a good path.
Most successful entrepreneurs started in finance because, for better or worse, that's a way to build credibility. Remember that VCs are also financiers and will be biased in favor of that experience, even if it's not relevant to what most startups need.
Attempting to see if I can create that niche (data science) in a large company, first for myself. If so, then things will go really well. If not, then I'll probably be getting back into "regular tech" mid-2013.
The resulting problem is that most of these companies do not really care about pushing technology forward or contributing relevance to the technological sphere - just figuring out the right arrangement of features to make things not completely fall apart. Just as an anecdote: during my career I've worked with 4 PhDs (Stanford, MIT x2, Brown)... in each of those cases they were essentially very educated software implementers - management would dole out tasks to them and me that were just typical... we need an ad unit here, an app for this, an analytics system for this, a landing page for this. I don't criticize the PhDs I worked with; I'm sure they were trying to pay their bills like everyone else. But it was disheartening that our current system/environment had allowed such great minds to be puppets of management who were the least qualified in the room to be working with technology.
Notable quote that I agree with somewhat.
If we start with a Beta(2, 2) prior on the percentage of CEOs who are insufferable, egotistical fucks, we get a Beta(4, 2) posterior. So the 95% credible interval on the percentage who are insufferable, egotistical fucks is [28.4%, 94.7%].
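Those endpoints can be checked numerically: for Beta(4, 2) the CDF has the closed form F(x) = 5x^4 - 4x^5, so a simple bisection recovers the 2.5% and 97.5% quantiles without any stats library (function names here are just illustrative):

```python
# Verify the 95% interval for a Beta(4, 2) posterior by inverting its CDF.
# For Beta(4, 2), the CDF is F(x) = 5x^4 - 4x^5 on [0, 1].

def beta42_cdf(x):
    return 5 * x**4 - 4 * x**5

def beta42_quantile(p):
    # Bisection: find x in [0, 1] with F(x) = p. F is monotone here.
    lo, hi = 0.0, 1.0
    for _ in range(60):
        mid = (lo + hi) / 2
        if beta42_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

lower = beta42_quantile(0.025)
upper = beta42_quantile(0.975)
print(round(lower, 3), round(upper, 3))  # roughly 0.284 and 0.947
```

The same numbers come out of `scipy.stats.beta(4, 2).ppf([0.025, 0.975])` if you have SciPy handy.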
We started a foundation that builds web and mobile services, on open source software, which are used to fix poverty. This area was essentially not considered a market when we started, so the threshold to get started and recognised for doing something useful was relatively low. As a non-profit foundation we could also take investments from those that wouldn't invest in your ordinary tech startup.
We aren't doing this work to get rich, but there are a lot of other benefits. We have very patient investors and we have a lot of freedom in implementing our vision as we see fit. It is also pretty amazing to be working on software and services which actually make a real difference for those that have it the worst.
One of our core services, a mobile phone field survey application, has been used recently to do baseline surveys of all public water points (wells etc.) in both Sierra Leone and Liberia, providing data that just didn't exist before. 30 people go out on motorbikes for 3 months to collect data and come back with tens of thousands of surveys. This data is then used to drive national policy on rural infrastructure improvement. This stuff makes a real difference. And we have agreements in the works which will make this type of data collection possible all across sub-Saharan Africa, with our tools used by governments, NGOs and multilateral organisations such as UNICEF.
It feels like working on stuff that matters. It is both technically interesting and there is a lot of work to be done. I am sure that in education, healthcare, government, public infrastructure and other areas there is a lot you could think of that could be improved, and for which you could get unconventional funding.
Basically the point of the article is: let developers choose their projects instead of forcing projects on them. They will choose better, and the projects will be much more likely to succeed.
One thing the OP surely got is the title of worst article of 2012. The year is almost over, and he set the bar so high that it's now almost impossible to take the title from him.
Now back to the point. Innovation is not some new shiny thing that no one has seen before. Instead, real innovation is something you work on until it is the definition of perfection according to someone's vision. Thus iteration is the mother of innovation.