I’m Bored. What’s Next? (techcrunch.com)
151 points by nhangen 1784 days ago | 136 comments



* What's here that wasn't here in 2007?

Robots are still rising. Drones too. Voice recognition for everyone with Siri and the equivalent on Android. Natural language queries in Wolfram Alpha.

Cheap, reliable 3D motion detection via Kinect.

3D printing. Tissue printing, and a $99 genetic scan: https://www.23andme.com/

Cheap SSDs.

Self driving cars.

Raspberry Pi boards.

iPads (I had to check that one: the first iPad launched as recently as April 2010!) and competing Android and Windows tablets.

GitHub.

* Or these lists:

http://www.buzzfeed.com/donnad/27-science-fictions-that-beca...

http://io9.com/5971328/the-most-futuristic-predictions-that-...

* Things that were new and not so well known in 2007, but are big now:

The rise of online education via Khan Academy, Coursera, and Udacity.

Workable electric cars.

Twitter.

Arduino boards.

Git.


The biggest innovation, to my mind, is the concept behind Kickstarter. It is revolutionizing marketing, fundraising, and the product development process.


I think that everything you directly mentioned is an incremental improvement over the last decades (electric cars, SSDs, 3D printing, tissue printing, DNA/RNA microarrays, 3D motion detection, Raspberry Pi and Arduino [which are more mass-production and extra-firmware plays than anything], etc.).

However the io9 link's contents are the f-ing future and they didn't even scratch the surface. Our understanding of biology is getting better and better, automation is finally starting to get wide acceptance both in industry and academia, and there is so much work being done by universities today that was pure science fiction only a decade ago.

The future is sneaking up on us quietly, and once it's here, there's no going back :)


I think that understating the achievements of the current era is some sort of cognitive bias. Quite probably it even has a fancy name that I don't know.

For example, a friend of mine recently argued that there haven't been many inventions in the last quarter of the 20th century and beyond (compared to the first three quarters of the 1900s). It took me some time to explain that having the internet, self-driving cars, and computerized prosthetics is fucking amazing. The Louis CK clip ("Everything is amazing and nobody's happy" [1]) also helped :)

[1] https://www.youtube.com/watch?v=mRnzZZw84v0

ADD: found the term! Rosy retrospection!

http://en.wikipedia.org/wiki/Rosy_retrospection


The future doesn't happen overnight; everything is an incremental improvement ;) SSDs are designed as a drop-in replacement for spinning hard disks, i.e., they work in systems designed for spinning disks. Building systems adapted to how SSDs actually work takes time.

Things that are "the f-ing future" aren't on the mass market yet. When they finally are, you could say with some justification that "this is just an incremental improvement on the real breakthrough of 2012" ;)


Out of all those things you listed, self-driving cars and maybe - just maybe - 3D printing are the only truly exciting things, at least in terms of their potential to be disruptive.


I think consumer robotics is a massive market waiting for a breakout. I can easily see people throwing away life savings, a double mortgage, and their cars to get a personal butler droid that cooked, cleaned, did dishes, mowed lawns, etc. Maybe even more importantly, the wealthy elite would throw millions away to replace servant crews. It doesn't need to be one machine, it doesn't need to look human, and you could probably control it with your phone rather than finicky voice recognition. I think the technology is already there; nobody has taken the initiative yet.


Sorry for the me-too comment.

It is apparent to me that if you had the shekels to throw in a certain direction, 3D printing would be one. It seems to be hitting its inflection point.

Self-directed tuition is the other: Khan Academy, Coursera, et al., cheap, scalable, global education. A total game changer. When has this happened before? Never.


Geesh, only two major paradigm-shifts emerging in one year. How tedious!


Neither of those are from 2012.


I think the argument is that they've been getting into the mainstream this year...

I think the year that a new paradigm-shifting technology is first conceived, then developed, then implemented at any scale, all within a single year, will be a time to celebrate the end of humanity and the peak of the Singularity!... (if it ever happens)


I like what has happened with internet radio. But the opposite of boring is exciting, and I still can't get myself excited about the present.


One thing that's fun to do when the present loses appeal is to envision the future. Write out in a journal page what a day in your envisioned future is like.



What a meta post. Arrington complaining about a problem his site perpetuated by popularizing consumer tech. [1]

Maybe TechCrunch should step away from the desk/SF and put more reporters on the ground in other cities, instead of relying on so much inbound pitching? Technology, loosely defined, is becoming an everyday aspect of many businesses, so maybe TC needs to reconsider its editorial position, too. [1]

There are tons of other companies in cities around the US (and world) less interested in getting caught up in the SF noise. Spending PR budget to target TC just for some ego exposure among a select group of peers, at the cost of a less targeted audience, isn't a wise decision for most.

Austin has quite a few smaller but sustaining tech companies doing pretty interesting things on a regular basis: $AWAY, $BV, $SLAB, Chaotic Moon, Mass Relevance, and numerous other fresh startups like Outbox [2] and DailyDot.com. I'm sure the same is true for other cities.

[1] http://techcrunch.com/2012/12/31/seven-apps-that-will-keep-y...

Don't I have USA Today and Mashable for this? Did AOL make this decision?

[2] Saving the USPS. http://www.nashuatelegraph.com/business/988285-464/austin-st...


I suppose you're right. It's a self-fulfilling prophecy. For years, they wouldn't fund or write about anything that wasn't a me-too product, and now he complains that there's nothing but me-too products.

Thanks for finally waking up, Michael.


Everyone in the tech world is going bananas over the same stuff. Technology helping technology: things that make Facebook easier/harder to use, things that make software development easier/harder to do, mobile this-that-and-advertising.

BOOOOOR-IIIINNNGGG.

The companies that are doing actually interesting things are using technology to make non-tech stuff easier/better/faster/brighter: Uber, AirBnB, Zipcar, that female-led startup connecting farmers with buyers, etc.

Technology for technology's sake is done and boring and short-sighted. We've got news aggregators that aggregate other news aggregators. Mobile ad platforms that resell ads from other mobile ad platforms. Photo-sharing apps whose entire purpose is to create filters for other photo-sharing sites. And it's all ad-supported.

Arrington's right: it's all just the same thing over and over, and it's boring. And I never thought I'd agree with Arrington about anything.


That startup is Vegetality. And why would it matter if it was started by a woman in this context? (I can't wait until ~50% of startups are by women and it's not considered exceptional.)


Simple context would suggest that he was offering any details he remembered of the startup in the hope that someone would recognize it. Let's not overreact.


Well, it would be considered exceptional, since women rarely lead in the number of startups in any field.


When the Singularity suddenly comes, all articles will be obsolete before they are written, as soon as Arrington starts to type the first letter. So there won't be any time to be bored.

Enjoy getting bored, because after the Singularity you will only be perplexed and petrified (unless you are upgraded or modified).


Nice reply by Ray Cromwell:

> I love how Arrington, Thiel, Graham, and others spent years talking about how you don't need any college, promoted quick-buck social media startups with no plan, and kids fresh out of high school, and now they are sad there is no flying cars. [...]

https://plus.google.com/110412141990454266397/posts/YLXqGgpc...


Humans have been talking about flying cars since the '60s. If we were going to have them by now, their creators would have been educated well before Thiel, Graham, and Arrington.


Sure, we only got self-driving cars, almost flawless speech recognition, natural-language queries to match, "Minority Report"-like UIs, Google Glass, space flight at 1/10th of the previous cost, 1Gb fiber to the home, dramatically cheaper solar power, garage gene tech, 22nm chips, a spectacular landing on Mars, working exoskeletons, the technology to find planets only slightly bigger than Earth...

And Mr. Arrington is bored? Maybe he's just not paying attention.


Most of what you mentioned came from big companies with a billion dollars or more. Mike is an investor. When he says that innovation is not happening, it's probably because he is not seeing good ones starting up, i.e., a newcomer with an amazing idea. We'll see the effect a few years from now, when most great products come from Google/Amazon/Microsoft/Apple. It is probably that way already.

Every decade has had a game changer who came in from nowhere. Who is going to take the 2010s?


Who was the game changer of the 2000s?


Google and Apple were probably the two biggest game changers of the 2000s. Apple's being a resurgence and Google being a company founded at the very end of the 90s.


Besides speech recognition and natural language queries, all the tech you cited is not yet distributed. And as someone wiser said "the future is already here, it's just not evenly distributed". So I'd discount the impact of tech demos like Google Glass, at least in the short term. Long term OTOH, I'd like to be as optimistic as possible.


Mike Arrington is bored with blogging. Does that mean he'll stop?

Personally I think we're on the edge of a revolution, with the rapid shrinking (and increase in power) of computers, and Google boldly pushing into some pretty deep AI.

Intelligent personal assistants. Self driving cars. Wearable augmented reality.

There's going to be a lot of exciting stuff to write about in 2013.


The Spirit autonomous tractor[1] becomes available to farmers this spring. I guess driverless vehicles are not exactly radical by now, but the fact that you can actually put your money down on one today is still pretty amazing to my eye. Though I guess it might not be all that interesting through the SV-centric lens.

[1] http://www.autonomoustractor.com/


2013 is also going to be a big year for healthcare IT: the HHS is moving to open up clinical patient data. I doubt Google will venture into the healthcare space again, given the recent failure of Google Health, but it's going to be a turning point. They should have waited until the end of 2013 to shut it down!


> the HHS is moving to open up clinical patient data

Don't want to sound too cynical here, but give me a break. The healthcare industry has persistently resisted technology for two decades, and any radical revolution in medicine will face so many public hurdles beyond what has already been dealt with that I don't see anything changing for average Joes for decades, just because of inertia, the powers that be, and special interests.

I mean I still can't get medical care at all without cleaning out my bank account because I'm uninsured. It is personal bias, but I'd rather see that fixed first. $500 - $1000 a month in healthcare is obscene.


Kenyans now have $50 smart phones that exceed all but the latest phones in capability. We have $50 android tablets. We have reached a sort of technological plateau where stuff we consider awesome advanced technology is more available than clean drinking water. The great stagnation cometh, the center holds, things stick together more strongly than ever.


All highly commendable projects.

There's no exit strategy, though, so VCs aren't interested.


I seem to recall reading a few weeks ago that VC money was starting to move toward B2B companies. If that's true, you're not going to see a lot of exciting, tech-blog-worthy startups. You will see a lot of progress in software, and real people will be receiving real value. In fact, I view that as a really positive thing. We don't need another social-network-enabled live-drawing text pad that opens a door. However, my fiancée definitely needs some software to reduce the redundant, unsexy work she has no time to do.


Try creating instead of commenting on what others create. It's more fun.


Michael Arrington did try and fail to create once.

http://en.wikipedia.org/wiki/JooJoo


I had the impression that he did not really do creative work on that project, but high level management. I may be wrong.


As a software dev currently thinking through my 'next thing,' I find this opinion refreshing. There's a lot of redundant and me-too software and it's far too easy to fall into the trap of building another mobile/social/geo/photo app.

I think hardware makers are doing a great job of pushing the envelope, but software not so much.

The question is, what's next for software? At the time Twitter came out, it was brand new...amazing...innovative. I can't recall feeling the same about anything since.

My own personal belief is that embedded software is going to be the next thing, but the problem there is the barrier to entry is a bit higher, both in terms of knowledge base and financial cost. That said, tech like Bluetooth LE is making it easier than ever to try.


> My own personal belief is that embedded software is going to be the next thing

I've started hearing that embedded control via cheap Android tablets ($50-$100) and hardware interfaces is beginning to take off, reducing the barrier to entry. Pretty much, you can use a $75 Android tablet to control an embedded device and drive the user interface. iPads are, frankly, a bit too expensive for most hardware applications.

If I had the energy, I would champion an open-source project for niche hobbyists who want to program a particular type of hardware but aren't necessarily programmers (assuming their hardware has some kind of relatively open, documented serial/network interface: something like ZigBee, but less complicated, maybe). Build the project such that the UI and hardware-control layer run on Android, and can be abstracted out from the actual hardware later. The UI wouldn't be snazzy-looking (sliding menus) but reliable-looking, like an industrial control, maybe a bit better.
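A minimal sketch of what that thin control layer might look like, assuming a hypothetical line-based text protocol over a socket (the `SET` command, the `OK` reply, and the loopback "device" below are invented for illustration; a real device would define its own protocol):

```python
import socket

def send_command(sock, name, *args):
    """Encode a command as one newline-terminated ASCII line, e.g. 'SET fan_speed 3'."""
    line = " ".join([name, *map(str, args)]) + "\n"
    sock.sendall(line.encode("ascii"))

def read_reply(sock):
    """Read one newline-terminated reply line from the device."""
    buf = b""
    while not buf.endswith(b"\n"):
        chunk = sock.recv(64)
        if not chunk:  # connection closed early
            break
        buf += chunk
    return buf.decode("ascii").strip()

# Demo against a loopback "device" that just acknowledges the command.
ui_side, device_side = socket.socketpair()
send_command(ui_side, "SET", "fan_speed", 3)
request = device_side.recv(64).decode("ascii").strip()  # device sees "SET fan_speed 3"
device_side.sendall(b"OK\n")
reply = read_reply(ui_side)
print(request, "->", reply)  # SET fan_speed 3 -> OK
```

The point of keeping the wire format this dumb is exactly the abstraction the comment asks for: the Android UI layer only ever speaks lines of text, so the transport underneath (serial-over-USB, Bluetooth, a TCP socket) can be swapped out later without touching the interface code.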


Interesting. In the local hackerspace, for several years, I watched many students struggle with a (very fine) graphical drag-n-drop programming tool known as StarLogo TNG[0] (and Logo variants).

In the end, it seemed the brighter students destined for real programming were relieved to get into a text editor. The "success stories" focused on people who could not otherwise program, but the cost was imposing too much abstraction on the brighter users.

So I think it's a hard problem, and thus a very worthy one to pursue.

[0] http://education.mit.edu/projects/starlogo-tng


Most of these cheap tablets have unlocked bootloaders, or ones that are easily broken into. There's no reason to keep all the unnecessary Android clutter on top of the hardware if you are trying to use it as an embedded microcontroller. Maybe keep SurfaceFlinger as the display server, or maybe just run X or Wayland.


This is weird. TechCrunch, in my eyes, contributed to the very startup hype machine that it's now complaining about.

Celebrating funding like winning the Startup Lottery is only going to feed the monster of hype vs real innovation.

Is it just me?


If you allow me a small anecdote... I used to live in Manhattan. Occasionally, I would go to the store for paper towels or toilet paper and buy the bulk size. Invariably, the package would be too big to fit in the small plastic bags that were available, and carrying it along with everything else was rather awkward. Last year I moved to Turkey, and one of the first things I noticed is that all of the bulk-size packages of paper towels and toilet paper have built-in carrying straps.

Now, this may seem like a very, very minor thing (and it's not unique to Turkey). On its own it says nothing, but the longer I've been away from the US, the more I've noticed just these sorts of little things. It's the sum total of these "little things" that indicates to me that the US has, on a very fundamental level, stopped innovating.

Oh, sure, the US will continue to produce new things. It'll probably even produce one or two big new things in the next couple years, but it's not the big things that drive the innovative spirit. Gather enough smart people in one place, give them enough time and enough money, and you're pretty much guaranteed to get a major innovation (or, at least a driverless car).

You'll also waste an enormous amount of resources. A society that is innovative at its core has only to foster that innovation, in any small way, to get a far greater return on investment. In short: America has a tremendous head start, but America has become complacent. The "next big thing" just might come from somewhere where the people are still hungry.


I bought bulk-size toilet paper at a Target in St. Louis and the cashier taped a plastic handle to it as she rang it up. Fear not - the US is still advancing the state of the art in toilet paper delivery.


I think you're drawing too much from that observation. It says more about the market in Turkey than about the innovation in Turkey.

Geographically-targeted versions of items are custom tailored for desires of local markets. Even if we assumed none of the TP brands in Turkey are multinational corporations, do you really think that the carrying strap idea wouldn't have been nicked by companies that do sell to other markets?

Living in NYC and buying a big pack of toilet paper is an outlier as far as the US is concerned. Anywhere else in the States, you would put the pack in your shopping cart and roll it right to your car door.

For all we know, those straps could have been designed in the States, and the bean-counters decided adding the straps to the case manufacturing process is only cost-justified in plants serving an average population density of X and above.

I only bring this up because I have, in the past, been working in the States and built better versions of products for sale exclusively in Europe.


Perhaps the straps were a bad example. Another example (that I already noticed back in 2004) is that the mall parking lots all have lights hooked up to sensors so that you can see when a spot is available without having to drive up-and-down every aisle in the lot.

Again, it's not that any one particular example stands out. For almost every one of these examples, I could come up with a way to explain away how it isn't a difference in innovation. I, too, have thought about how shopping-carts-to-cars does not necessitate the straps. I have thought that the parking-lot lights might not have a significant or measurable enough ROI to justify their installation to US mall owners... but isn't that the point?

It's not that any of these things are needed or demanded or even entirely justifiable from a purely economics-driven viewpoint. But since when has an economics-driven viewpoint led to great innovation? Isn't a prerequisite for innovation the willingness to look beyond simple, straightforward economic arguments? To be creative for creativity's sake?

It's in this way that I mean the US has stopped innovating.


We have the parking light indicator thing here too. San Jose, Santana Row...


Heh...oddly enough, I've been to Santana Row. It's the only mall in the entire US that I've been to that has anything even close, and while Santana Row does tell you what levels have free spots, the last time I was there they still didn't have indicator lights above each individual spot.

...but, again, it's not any one particular "invention" but rather the whole attitude that's different. The US has gotten complacent and seems to be spending more effort on justifying why it's the greatest than on actually, you know, being the greatest.


They have the indicator lights now also ;-)

But, I understand your point!


Spoken like a brat spoiled by an embarrassment of riches. Even if one grants that not much innovation has happened since the iPhone was released in 2007 (and other comments have already falsified this claim), 2007 really isn't that long ago. We're talking about an event that completely changed the way people interact with mobile devices and consume content. A seismic shift that resulted in the rise (Apple) and fall (Nokia/RIM/Sony) of empires. How often do we expect that to happen? When did it last happen before 2007?

As anyone here can attest, innovation is hard, really hard. I'm a nobody and I see this firsthand in my daily life. I've been working on a side project for years that I hope will change the world, and I planned to build a proof-of-concept prototype last weekend. I expected to be done in two days. It took over a week. A quarter of the way through construction, the code just kept getting more complicated and kludgy, even though I thought I had designed it well enough to be straightforward. I eventually had to redesign the whole thing, but the final design ended up being much better and more reusable than the original POC I had planned. I was fortunate to be in a position where I could spend the time to rethink the design. There were plenty of stories in 2012 about products released half-baked because they were released too early.

I suspect we will see many interesting things in 2013, much of which has been incubated and polished during the time that Michael thought nothing was happening.


> A seismic shift that resulted in the rise (Apple) and fall (Nokia/RIM/Sony) of empires. How often do we expect that to happen? When did it last happen before 2007?

Looking back, I was more surprised by the rise of Google (and Android/Linux) in a world dominated by Microsoft. But at 28, I'm probably just a little older than you.

I'd also argue that the reason the last milestone was in 2007 isn't the iPhone, but that it's just before the global financial superbubble popped (foreshadowed by the bursting of the Nasdaq bubble in '02). IMO, that was the earthquake that stymied true attempts at innovation. Many people stopped trying to innovate and started worrying about finding a safe place for their retirement fund (they still are), or worse, finding a job.

Shoot me an email if you feel like sharing your POC project with a fellow dreamer.


> I was more surprised by the rise of Google (and Android/Linux) in a world dominated by Microsoft.

Microsoft dominated the mobile phone space? Ubuntu is doing OK, but I don't think it is taking the productivity market by storm. I am more impressed that Apple kept its propaganda campaign for OS X going so strong that it became its own self-perpetuating must-have-device trend, for no other reason than shininess and street cred for 99% of buyers. I'm 21. When I was still in middle school, RIM was in its prime.

> IMO, that was the earthquake that stymied true attempts at innovation.

Tech investment has barely slowed down since then. A lot of us are still out of work because society hasn't adapted to the continuing trend of "productivity is high enough that not everyone needs to work 40 hours a week to prosper," but the tech sector is still there. The real reason for the innovation slowdown is extreme patent trolling and abuse in the US. The IP laws here, especially software patents, are ruining potentially revolutionary tech through systemic abuse.


Peter Thiel actually talked about this in one of his CS 183 lectures at Stanford. He categorized progress as vertical vs. horizontal. Vertical progress can be defined as a problem of going from 0 to 1: you manage to do something totally new. Whereas horizontal progress is a problem of going from 1 to n, as in replicating an idea that has already been proven to be successful.

Thiel said that most of our vertical progress comes from Silicon Valley, which I agree with. But I think SV also has a lot of horizontal progress, which is what Arrington is complaining about. Everyone and their mother is working on a social app, and every software company is going apeshit about mobile. Granted, tech news coverage is probably skewed in favor of horizontal progress, since vertical ideas by definition are so brand new and "out there" that most people would dismiss them as crazy. Still though, it would be great if someone came up with the next technology as innovative and groundbreaking as the iPhone or, to a slightly lesser extent, the iPad.


Find some customers whose hair is on fire.

You'll never be bored again.


There is so much amazing creation happening right now. Unfortunately for TechCrunch, it doesn't have, nor aspire to, $5 million seed funding, so TC won't hear about it.


Oh, tech bloggers. Never happy. I think we should all try spending more time away from our devices. There is so much more to life than those glowing rectangles.


As a founder, it's often better to start where your own pain points are, because sometimes you'll stumble upon an incredibly massive pain felt by the entire world. Now you're ready to do business.

Be advised, however: even if you manage to identify one such sharp pain point, and even if you have a solution, only rarely will you successfully be able to repackage it and market it to the masses. But to put it in perspective: if you're working toward an affordable solution that 99% of the world is compelled by extreme pain to use, you might very well be working on a billion-dollar business.

The question is, when are more startups going to realize that the biggest pain in the room is as clear as day? It's right in front of every single one of us. It's only a matter of time before people in massive numbers realize that 24/7 surveillance of all telecommunications isn't fucking cool. People are already flocking to VPNs to get around downloading restrictions, so what happens when people realize they need a VPN to send a private email? Here the truth is contagious, and government elements can only repeat lies successfully for so long. The problem of the surveillance state is the very definition of sharp pain:

Compared to the markets for cosmetic surgery, real estate, knee pain, back pain, ANY market you can think of - the prospect of having all of your telecommunications stored indefinitely if not monitored in real time by regimes teetering on the edge is an order of magnitude more concerning. When you consider that a 24/7 surveillance state by definition constitutes neverending pain, it's really no contest.

Unwanted, unconstitutional surveillance measures are creating a searing-hot, absolutely intolerable pain for the entire human population. This is a pain that demands a solution, ASAP.

As a startup and as a developer, there are very few pursuits more worthy of your time than furthering human rights and averting absolute tyranny.

Curiously, thanks in large part to Bitcoin, the startup community now has the power to fund itself anonymously and innovate solutions that actually matter, without outside interference.

The clock is ticking.


What are these pain points related to the rise in surveillance that you mention? This seems to be an article of faith on HN: that the lack of privacy protections, the increasing amounts of personal data being aggregated, and intensifying surveillance are creating a perfect storm of bad consequences for civilians at large.

But I never hear about what the harm is except in the exceptional cases. Anecdotally, none of my acquaintances has had a single adverse life event that can be attributed to these trends.

On the other hand, adverse events related to money/jobs, health issues, issues with intimacy and social isolation, being too busy: these are daily occurrences.

I'm bringing this up here just because your post reminded me I wanted to run this question by the HN community.


> I don’t want to read any more stories about how Facebook cloned something they couldn’t buy. Or that Twitter banned something that they tried to buy but Facebook got there first. Or the press regurgitating how Google+ is somehow not flailing. Or about the number of Android v. iPhone devices. Or Samsung’s patent mishaps. Or how Yahoo is winding down things in Asia.

The funny thing is that TechCrunch is one of the sites most guilty of writing about these very things.


What gets made is what people want.

We didn't get flying cars at $250,000/car. We got 140 characters for free because that's what people wanted. We didn't get hoverboards, not for lack of technology, but it just turned out that we wanted to search the world's knowledge with Google.

We're frogs in boiling water, not knowing just how innovative we're becoming as a people. And sci-fi writers are just bad (or as bad as typical entrepreneurs) at guessing what people want.


Well, good sci-fi writers don't try. As Ursula Le Guin puts it,

  Predictions are uttered by prophets (free of charge), by clairvoyants (who usually
  charge a fee, and are therefore more honored in their day than prophets), and by
  futurologists (salaried). Prediction is the business of prophets, clairvoyants, and
  futurologists. It is not the business of novelists. A novelist’s business is lying.
https://raw.github.com/gist/4424894


Overlooks tablets, Siri, Google Now, Google Glass...

Seems like a case of self-imposed blindness. Narrow your field of vision until you don't see anything except startups oriented around mobile, iOS, and social media, and then complain that you're not seeing anything revolutionary. Tablets are arguably the biggest change in personal computing since the internet reached consumers. To complain that things are boring is bizarre.


I don't think most people realize how big Google Glass is going to be, but that's in 2014. I predict it will be a bigger leap than the iPhone in 2007.


The bursting of a bubble is the BEST time for a blogger and entrepreneur. The red ocean calms, old truisms die, resources reform and a new paradigm is born. We've all perfected execution, optimized our UX, achieved our viral coefficient, developed on every platform, increased DAU and sold to Facebook. 100 Million users is the new 10 Million but in the next shift does that even matter? To be on this cusp is to define it as much as Microsoft/Netscape did in the 90s, Google/Apple did in the early 00s, Facebook/Twitter now.

"Whenever you find yourself on the side of the majority, it is time to pause and reflect." - Mark Twain

So what's next? Disregard the mean and look for the outliers on both sides. Those are the true survivors, the extremophiles. When the mean shifts, they will have already adapted to the new environment, have succeeded, failed, persevered and flourished. So go there, fight and thrive because the future favors its creators, the extremophiles.


You aren't bored, you're boring.


This article is 100% trolling. Don't feed.


How about the author be grateful for advances and if still bored, be proactive and join the rest of us, staying up late nights trying to add to the knowledge base of our species.

Easy to point the finger outside, hard to point it inside.


Why do people undervalue the value brought by these big companies? It would be nice to have flying cars that most of the population could not afford, or more trips to space that the wealthy could enjoy habitually, but the reality is that these tech companies have reached their audiences globally. Call me crazy, but I think that in itself is amazing. We can always do better, though.


3D Printing, which I'm guessing will lead to printing up your own clothes.

Overall sounds amazing, yet for me the thought of no longer needing to buy objects or clothes is a bit disconcerting. Our economies are struggling already. As tech continues to evolve alongside population increases, I have a minor worry about how the majority will sustain themselves.


> Overall sounds amazing, yet for me the thought of no longer needing to buy objects or clothes is a bit disconcerting.

Some observations:

1. You would start buying designs instead of products. You can still go shopping for them, just at a meta level.

2. You are able to change designs, get designs changed by others (you can get at a shop, maybe), you can share designs with others (social network of things?), you can communicate about these designs, and so on. Stuff can be personalized on a micro and macro scale, whereas now it can only be personalized on a macro scale through combining existing items. With 3D-printing you can personalize the items, too.

3. There will be a growing market for hand made stuff, albeit more exclusive. I think the first couple of decades the distinction between printed clothing and clothing made from fabrics will be clear. That means that traditional clothing will keep its value while printed clothing will be valued less.

If anything, shopping will become a more immersive activity, I think. It'll change from a materialistic activity to a more service-oriented activity.


The industrial revolution came and went just fine. I think we masses will be okay this time around too.


Just a year ago, TechCrunch was also being short-sightedly wrong about established major companies being "boring": http://www.peopleprocesstech.com/why-techcrunch-is-boring-sa...


It seems he is about to realize that things move in waves. Currently people are learning how to take advantage of all this new mobile technology and once they have done that we will get our new wave of awesome shit.


Look around you


Ok, I smell spam upvoting. There is no way 50 people thought this navel-gazing article deserved an upvote. He doesn't even say anything.

Pg, time to tweak the algorithms again.


Of course you'll get bored if your primary source of tech news delivers you only pinterests and instagrams.


Actually, the last great human invention will probably be self-reflective Artificial Intelligence.


Ctrl+F Tesla, SpaceX, zero results. Really? I'm excited about these companies.


Ha! Easier said than done! Of course, everybody wants something new.


Mike,

Build something yourself.

Sincerely,

The people who are not bored.


TC;DR.


The people in charge have no vision. Everyone makes this complaint, because it's almost always true. The difference is that it's starting to matter. You can't "do technology" without vision. Not anymore.

If you're running a toothpaste factory, you don't need vision. You just need competent execution. You need people to show up and follow the plan without complaint. This kind of work can actually be managed. If you can cut costs without compromising on quality, you do it. It's not about vision. That's already solved. Vision was relevant long ago when someone figured out how to make toothpaste. Your job is just to keep abreast of competitors and seek rent.

VC-istan is a postmodern startup factory. It's technically not a company, but as a tight social network, it functions as one. VCs, rather than properly competing, talk to each other and agree on who they like and who they don't. VC-istan's "serial entrepreneurs" are just glorified PMs whose egos make them unemployable. The real bosses are the VCs. They want quick exits, as they should, because that's how the incentive systems that govern them are set up. The "tech press" are a sort of HR organization. Entrepreneurs are PMs, often mediocre ones, and engineers are chumps paid in lottery tickets. This is just a big company that has managed to dress itself up as a thousand small ones that happen to be all controlled by the same people.

Now you have an ecosystem of commodity entrepreneurs hiring commodity engineers to implement commodity ideas. Ok, nothing to see here. Mobiles skwrking, mobiles chirping, take the money and run, take the money and run, take the money...

Is it any wonder that this isn't producing innovation? It shouldn't be. Yet VC-istan is doing a lot better than most large companies do. It seems inevitable in large organizations (including economies) that the resources gravitate toward players who don't have much in the way of vision. Innovation is the exception. It adds variance. Given that we're social animals who judge one another based on reliability (low variance/minimal performance) rather than capability (expected return) it is socially dangerous.

How many people have the talent and the resources? I'm sitting on +4 sigma talent but have no money. The people with boatloads of money seem (with a few exceptions) to be lacking in vision (which means I can't tell if they have talent, but I have doubts). I can't say that I blame them. Why take risks if you have no need to do so?

What we have now is a generation that's used to technical progress and wants to take part. We have people who have been working their asses off since age 4, are now in their 20s and 30s, and want to take part in technical innovation. Most of them can't, because there's so little of it actually going on, and because most work activity is bullshit oriented toward keeping one warlord boss's status high at the expense of another's, rather than being invested in true progress. That's depressing. It creates a malaise. A deep sense of ennui. Yawn, another fart app. We now have an unprecedented number of ridiculously talented, over-educated people saying, "Dude, where's my machine learning job?"

I think that 2013 will see the beginning of a Flight to Substance, and if I'm right, that will put that talent to better use than fart apps and toilet check-in services. I don't know how it will play out. I have no clue who will fund it. One sign of this is the increasing clamor for Valve-style open allocation. By the mid-2010s, you won't be considered a real tech company if you're running closed allocation (take heed, Google). If I'm right, that will help. That will help our generation work its way toward excellence. At least, some of us will go in that direction. Others will go off into the weeds of fart apps. May the market reward both crowds justly.

Here's how we fix tech.

* Open allocation. As long as the work is relevant to the company's needs, let people work on whatever they want. This enables native growth of technical talent. You don't have to poach qualified people with ridiculous signing bonuses. They quickly find a project that fits their skills and interests, and they actually improve while they work for you. Imagine that.

* Stop fetishizing either extreme of company size. Not all large companies are bad, not all startups are good. Nor vice versa. If your 50-person startup is running closed allocation with typical HR policies, then it's just a big company that failed to get big and it should be considered a massive joke. I've heard of people getting turned down for transfers in 20-person companies because of "headcount" limitations. If you want to work at a startup, then drop that shit and work for a real fucking startup.

* Demand work on hard problems. Don't build someone's fart app for 5% equity. If you're in the press, don't reward stupidity either. Instead of cheering on idiots who get acquired for outrageous sums, ridicule them.


How we judge talent is actually one of the biggest problems, and a bias that I've been trying very hard to solve.

Intellectually, I know there's a huge difference between a programmer who has 99th percentile programming skills and one who has 90th percentile programming skills. Being a non-programmer, I don't have the tools to easily tell the difference.

This means my base instincts will tell me to work with the person who is a good programmer and speaks confidently and clearly, vs. the exceptional programmer who's shy and mumbles. That's because as far as my brain can tell, they both have the same degree of programming ability.

I work very hard to solve this bias. I make sure to focus on people's strengths and ignore weaknesses that aren't obvious dealbreakers. I try to maintain wide social circles and ask my most successful friends what standards I can use to judge skill in different areas. I pay a lot of attention to how other people in their industry judge them, vs. people in general.

What I've found is that there are two kinds of people who are really diamonds in the rough. One is people who are generally smart and have shown the ability to excel in a lot of fields, but haven't had a lot of experience in the field you're talking about. Think about a recent graduate who has an exceptional understanding of History and Psychology and is an excellent tennis player. This kind of person will usually learn very quickly and excel at jobs that demand a high ratio of problem-solving ability to knowledge.

Another kind of diamond-in-the-rough is someone who is admired by people in their field but somewhat disparaged by others. A good example would be a Salesman who always exceeds his quota but struggles with basic math. People who find math easy will typically think of this fellow as stupid, even if he's a genius at what he does.

Pay attention to people like these. It's all too easy to make a mistake like "If someone can't do math/can't speak coherently/failed high school/doesn't understand technology/uses Internet Explorer, then they must be stupid." In my experience this is almost never true - having a weakness doesn't mean that someone doesn't have a certain strength. Focus on the strengths and you'll end up working with (and hanging around) much more interesting, much more diversified people.


> How we judge talent is actually one of the biggest problems, and a bias that I've been trying very hard to solve.

Is this a major business problem for you? Are you trying to solve it using actual money and process or just winging it really hard every time? There are known programmers who are really good (USA Computing Olympiad) and some of them will be articulate and you could find someone who's a good fit and hire them as a consultant to help tell the difference... though that's only off the top of my head.

Every time I hear somebody describe a "big problem" they've been "trying very hard to solve" I wonder whether they've focused on it enough to (a) step back and think about possible tools to make the job easier (b) resort to professional specialization (c) make it the job responsibility of a particular person or (d) spend actual money.


"There are known programmers who are really good (USA Computing Olympiad)"

So, maybe this will come as a shock, but "computing olympiad" success doesn't correlate with success as a professional programmer. There's so much more than raw intellectual horsepower to being a good team engineer that it can't be captured with any one test.

Which is to say, the parent is right. Identifying good programmers is a hard problem. Harder than identifying people who do well at coding contests.


Yes, but I'd expect them to at least be able to tell good programmers from bad programmers better than someone who couldn't code.


There are known programmers who are really good (USA Computing Olympiad)

IOI and similar programming competitions do not select for good programmers. They select for people who can solve small algorithmic problems efficiently in a short time period with throwaway code. There is some overlap, but also a lot of perverse incentives. In professional programming, the tortoise usually beats the hare.

(I have competed at a national level and won a national college-level programming competition myself, and don't really take it all that seriously.)


This is not a business problem for me, it's a personal interest. This problem interested me because I noticed that there was very little correlation between how smart my friends were and the types of offers they received. And more broadly, I noticed that most interviewers will tend to over-weight negatives. Then I realized that I did the same thing, which got me thinking.

I'm not just trying to find talented programmers. I'm trying to address the more general problem of figuring out whether somebody is good at X. You're right that the easiest way to judge on a case-by-case basis is to ask known experts. But since this is/was a cognitive bias of mine, I'm trying to overcome it myself.

I have spent a lot of time, done plenty of reading, and spent my own money on this problem. Like you said, I actually have paid experts to give me advice on how to identify talented people in their field. More frequently, though, I'll network and ask in an informal setting. And as a result, I have managed to severely correct several biases I had, such as thinking that anyone who couldn't do basic math must not be very bright.

One thing that's relatively easy (i.e., only took me a few months of effort) is learning how to identify generally smart people who learn fast. (I would actually divide this into learning social systems quickly vs. learning technical systems quickly.) If two people are entering a field or starting a job at the same time, I can generally predict which one of them will learn faster after a 30-minute conversation. The biggest surprise was that very smart people can be catastrophically bad at things "normal people" would consider basic - like eating spaghetti or not offending their interviewer (or even realizing they'd offended someone). Before, I tended to assume that fast learners would be good at most things and exceptional at a few things.

A more challenging problem: say I meet a lawyer, or doctor, or anyone with a lot of experience in an area I don't know much about. How can I tell if they're genuinely good at what they do? So far, general intelligence + percentile-based accomplishments ("I boosted revenues by X which is better than 95% of marketers...") + asking them to walk through a real life situation has been my best predictor. I'll validate this by asking known experts afterwards or looking up the person's career track.

Even so, it's not a particularly good predictor, and asking a real expert works much better. But by looking at actual data, even if it's relatively weak and anecdotal, my ability to accurately judge competence has become much stronger over the last few years. I think this is something that most businesses could benefit from, but the more common response seems to be throwing up their hands and complaining about the lack of qualified people.

Edit: when I said "how we judge talent is actually one of the biggest problems", I should have specified "we as a society." It clearly came across as "we as a company", sorry about that.


Think of programming like juggling. Some genius programmers can juggle 6 balls at once, which is a rare and impressive feat and has a few purposes. This is the Linus-type programmer who can punch out an OS in a week. That's probably not the programmer most companies need.

People want a programmer who can only juggle 3 items, but does it so well they are juggling chainsaws. That's easier to identify - it's their production (portfolio = chainsaws) and the quality of their production (referrals = not missing any arms). If you can verify that people coded what they claim they coded, it should be fairly easy to tell who the good programmers are.

If you're having trouble telling who the good ones are, you haven't found any yet.


If someone has a good portfolio and good referrals, you can easily tell if they're competent. But there are plenty of talented people who don't.

To extend the analogy, there are plenty of jugglers who are capable of juggling 3 chainsaws, but find that they can only get paid to juggle 2 plastic balls over and over again. Variation in their craft is punished, not rewarded. If you only look at whether they've juggled chainsaws before, you'll miss out on a lot of great talent.

I've seen many, many people who are much better than their portfolio or past accomplishments indicated - especially people in their 20s who've just happened to work at crappy companies and end up on bad projects.

Someone who's never juggled before may not be capable of juggling three chainsaws at first, but you can look at their athletic record in other areas, examine their flexibility, coordination, and willingness to practice, and use these to predict whether that person will be able to learn how to juggle chainsaws. How do you figure out which of these areas matters? By asking existing expert jugglers.


> I've seen many, many people who are much better than their portfolio or past accomplishments indicated - especially people in their 20s who've just happened to work at crappy companies and end up on bad projects.

This hits too close to home. I am trying to figure out what to do about it.


This happened to me. I had a streak of bad luck that left me with an unimpressive resume and made it incredibly difficult for me to get hired, even though objectively I was/am very good at making companies money.

In my case it came down to the realization that the standards people use to judge your skills are very arbitrary. Once I went out and got a few certificates I started getting job offers. Nobody cared about the fact that I was smart enough to learn SAS in two weeks or make my previous employers large sums of money, they only cared about the fact that I had the certificate. C'est la vie.

In every industry there's some sort of bar past which people start treating you like a real person. Getting past this bar often has little to do with merit - think having a degree from a "prestigious" college or having exceedingly specific previous experience. Try to figure out what that bar is in the field you want to work in and how you can show that you've cleared it. Programming is a diverse field, but a few bars I've seen people use (not saying I agree with these) are "has experience in Ruby/Python/Functional Programming/my favorite language", "has contributed significantly to open source", "recommended by someone I know who is competent", and so on.

The last one is IMO the most valuable bar to clear. The more you can get out and know people, and show them that you're competent, the easier it is to get better jobs and better projects in the long run.


I'm in the same boat. Just out of college and I wound up in a position that's not even close to what was advertised.

Hired to be a programmer. Showed up and they decided I was going to be a business analyst. I've got significant experience in .Net, Scala and NLP but I can't even get companies to reply to my applications. I despise my job but I can't quit yet and I can't find another job. I'm working on side projects to build up my portfolio, but right now I just really hate waking up and going to work each day.


That sucks. What part of the country do you live in?

My advice: keep learning cool stuff, go to networking events (e.g. meetups, especially related to NLP, machine learning, and Scala) get some job leads there. Cold-emailing your resume rarely gets you anything, especially when you're in a career sand trap. But you have a skillset that, if you're strong in what you've cited, is desirable.

You will eventually get a feel for how much investment you need to put into your day job to keep it, and you can use the remainder of your time (which may be 20-35 hours of your work week) to keep current with the skills you want. If you're writing code on company time, be careful and make sure not to use it for any closed-source purpose, because you don't want your employer asserting ownership.

You can get out of the sandtrap but you'll have to break the rules to do it. Stealing an education from a boss feels dirty when you're young and naive, but it's a necessary survival skill and, in a world where bait-and-switch hiring is common as dirt, not at all unethical.


I'm in Houston which is not a great city for that skillset.

I do extremely little for my job, so little I honestly started to wonder if I was missing something. Then I slowly realized my coworkers are morons. A few weeks ago, I was asked to create a directory structure for a bunch of incoming data files, a few hundred directories all told. Another person was tasked to do the same on a different server.

I spent about ten minutes on it because I wrote a shell script. He did it by hand and spent all day. Which, of course, is what I told my boss it took me as well. I just happened to use the rest of my time to read Akka in Action.
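(For anyone curious how little work that ten minutes actually is: a sketch of that kind of script, here in Python with hypothetical source and month names, since the actual layout isn't described.)

```python
import os

# Hypothetical layout: one directory per data source, with a
# subdirectory per month of incoming files.
sources = ["sales", "inventory", "returns"]
months = ["2013-%02d" % m for m in range(1, 13)]

for source in sources:
    for month in months:
        # exist_ok avoids errors if the script is re-run
        os.makedirs(os.path.join("incoming", source, month), exist_ok=True)
```

Thirty-six directories in well under a second; doing the same by hand on another server is exactly the kind of task that eats a day.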


I've got something simmering for solving this problem. Can you do web design sort of stuff? I've never been so good with the HTML bits but need a nice front-end. Shoot me an email via my profile here!

I can't promise equity in a company, because this idea was looking more towards being a lifestyle business. Revenue/profits of operation would thus be the thing at stake.


I'd argue that for the vast, vast majority of programming tasks that we encounter, the 90th percentile programmer who can communicate clearly actually IS the better choice. There are tasks for which the 99th percentile mumbler is indispensable, and if you have a task like that, then you'd better cultivate the ability to find those people (or hire someone who already has it...). But most programming involves understanding human problems, or at least working effectively together on a team.

But I agree with your basic point. If you can get good at spotting talent that has "flaws" that would disqualify them from the more typical elitist hiring mills, then you have a huge advantage.


I definitely agree that good communication skills are very important. That's why I used shy vs. confident in my example. The ability to articulate ideas in a clear way is important, the ability to make eye contact and have a strong handshake less so (in programming.) These skills make someone appear more confident and essentially trick most people's brains into thinking that person is more capable.


I think hiring programmers should really involve the developers they'll be working with. No trick questions or silly problems. Just have them sit with people from the team and talk about the stuff they'll be working on. If they get fired up and start spouting solutions to the types of problems offered, you very quickly see what kind of developer they are. If they're suggesting something you've already tried, that's a good sign. If they already know the pitfalls you've encountered before you mention them, that's also a good sign.

What it comes down to is that engineers are better suited to evaluate engineers. Obviously, you've got to have final say as leader of the company, but the engineers are your best asset here. This is why good VCs always have a luminary in their back pocket to send out to evaluate tech before an investment. Even technically competent VCs would rather send an experienced database developer to look over a new DB company than rely on their now 15-year-old experience.


Intellectually, I know there's a huge difference between a programmer who has 99th percentile programming skills and one who has 90th percentile programming skills.

Depends what you're building. If you need cutting-edge programming expertise or knowledge of computing's deep magic, then yeah.

If you're building a CRUD database or a local-mobile-social app, you don't need the expertise that much.


Programmer productivity is, I think, an irreducible paradox of software engineering. It's difficult bordering on impossible to judge even the success or failure of a given software project. It's also difficult to judge the importance of different contributions to a high level of granularity. There are, of course, exceptions and outliers. Someone who obviously succeeds with a 1-man project, for example. But the broad middle is a muddle. People with talent that has been squandered by circumstance, people without talent who have succeeded due to accident and luck, etc.


It's difficult bordering on impossible to judge even the success or failure of a given software project.

This I disagree with. Causes of success and failure are hard to tease out, but people know when software sucks. Also, if people can't use it, then all the cleverness in the world (in optimizations, for example, or in feature set) doesn't matter. It still sucks.

But the broad middle is a muddle. People with talent that has been squandered by circumstance, people without talent who have succeeded due to accident and luck, etc.

That's very true.

This is part of why I think Valve's self-organizing open allocation is superior. Management rarely knows who the best programmers are. The group of programmers, if they're good, usually can figure out an appropriate leader on a per-project basis. This doesn't generate the permanent, entitled leadership that management wants to see, but it gets the job done.

You also need to create room to fail. If the software project can't be saved, let it die so people can allocate their talents and energy to something that has better odds. Give the architects respect for trying and ask them to write up the challenges they encountered, so as to keep the knowledge in house. If you fire people for failing projects, then you lose that knowledge and will probably fail in the same way again.


This deserves a longer response but a short one will have to do for now.

You talk about software that sucks. However, one of the quirks of software is that sometimes even when it sucks it can succeed, and even when it doesn't suck it can fail. For example, technologically Google Wave didn't suck, but in terms of actually providing useful features for people that justified its use, it didn't have a leg to stand on. Then look at WordPress, which started out sucking but, because it was open and because it had developed a strong community around it, ended up getting better and better to where it was finally sort of decent. Or look at PHP. As a language it definitely sucked at the start, and there's a strong argument to be made that it still sucks. But it is perhaps the most popular language for web development in history.

Most software is even more difficult to determine success or failure with, because even though a software project might not be a success immediately, it could be a success down the road. Another case in point would be the Mozilla project. At the outset it sucked, but eventually it became pretty awesome. How much of the awesome of today's Firefox is rooted in the code from the early days of Mozilla, and how much is due to subsequent dev work? How do you tell the difference between software that sucks because it is rotten through to the core and software that sucks because it has a layer of crap on top of awesome internals?

And then how do you track everyone's contribution to software? Sure you can keep track of commits, but that doesn't track inspiration and ideas. Sometimes the fundamental design or mechanism for a given piece of software will mostly be due to a different dev. than the one who implemented it in code, and often there is no paper trail whatsoever that that's the case.

Ultimately there's no objective way to measure either talent or success except in the extremes. Some people's subjective estimates can still be reasonably accurate though, but usually it takes a talented and experienced dev. to be able to judge another dev.


This is a good point. There's a time behavior to software that is hard for most people, and especially non-engineers, to understand. Software that's superficially weak or seems to provide functionality no one wants might be a hidden gem.

So, if people have an unfailing sense of good and bad software, which they might not, it would be eventually consistent at best.

You're also right that individual talent is, for the most part, impossible to measure.


A related and more pernicious issue, in my view, is that companies have a tendency to hire before they trust.

Valve's ideology is "We hired you, we trust you." That doesn't mean that they give new hires all the keys, but the going assumption is that anyone who gets in is a competent adult who doesn't need to be restrained with typical, military-style subordination. If you're going to hire someone, then trust that person with his or her own time. If you can't, then don't hire.

Most companies grow too fast and end up hiring before they trust. This causes a loss of focus, because they need to generate busywork projects for the new hires, but it also creates a dynamic where there are Real Fooblers (for a company named Foobar) and everyone else, and the company has no problem generating a bunch of shit work for the "everyone else" category so the Real Fooblers can work on the fun stuff.

Talents of leadership and architecture can be assessed later on, but everyone worth hiring should start out with the basic right to direct her career and, when the time comes, prove herself.


Oh, if only corporate subordination were military style. A fresh Marine recruit knows the names of the five links in the chain of command between him and the President; meanwhile, Peter Gibbons has to answer to eight bosses, some of whom he hadn't heard of until they drop by to ask if he got the TPS report memo.


Good point. The corporate hierarchy is inspired by the military structure, but incompetently implemented, badly defined, and never legitimated.


I think it was legitimized by having it happen in pretty much every big company ever. It's the "I made this company (or weaseled my way onto the board / highest layer of management), I'm the boss, I don't want to hear from anyone else how I do things because it's mine, I have the power, obey me" attitude. It comes off as very three-year-old immaturity.


I call that TWiGs management. Toddlers WIth GunS.

When I say that it's not legitimated, I'm not saying that it's illegitimate philosophically (although I think it is) but that such managers do a poor job of convincing other people that their power is legitimate.

In the military, most people buy in to the rank system. A major component of the abuse inflicted in basic training is to tear someone's ego down and build the person back up again as someone who can take orders. The result is that a lot of them come out of the process genuinely believing that the commanding officer's power is legitimate. In most companies, nothing happens to convince the grunts that the managerial power is legitimate, since most managers are puppet leaders rather than the leaders that the group would pick.


> but that such managers do a poor job of convincing other people that their power is legitimate.

Don't they convince their shareholders? These companies keep making it big even with tremendous overhead of useless management and they seem to do it by pitching investors and locking down markets.


"Being a non-programmer, I don't have the tools to easily tell the difference."

Then you shouldn't. Period.

EDIT: It is unfortunate that these types of people (quick learners, geniuses at a subject) are usually not considered for engineering jobs, because the interviewers don't know how to interview. They ask stupid technical questions instead of having a technical conversation, and want code on a whiteboard.

Why not do calculus with an abacus?

Not to mention the problem of people who make hiring decisions having no idea of what the problem they need to solve is and hire unqualified people under false pretenses. The old bait and switch.


If you're really sitting on +4 sigma talent and are lacking the money to move your plans forward, drop an email address here so I can contact you.

I am planning on putting up a 'Proposition HN' in a week, with this exact thing in mind. I want to pay talented HNers to work on their dream/vision/side-project in exchange for participation/equity.

Watch this space.


Put up a landing page now so you can collect emails from interested folks.


Fair enough, Michael. It sounded like you had some plans you were unable to work on, based on that one sentence in your post.

trumanshow.. I just created a throw-away gmail account:

hnproject2013@gmail.com

drop a line there and I'll email a link to the HN post once it's fleshed out and posted.


What is "sigma talent"?


Sigma is usually used to represent the standard deviation. +4 sigma would mean four standard deviations from the mean. So in this case he is saying his talent is higher than 99.9966% of people.

http://en.wikipedia.org/wiki/Standard_deviation
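For anyone who wants to check those percentages themselves: the percentile behind a given sigma falls straight out of the standard normal CDF, which needs nothing beyond Python's math module. A minimal sketch (the function name is mine):

```python
from math import erf, sqrt

def normal_cdf(x):
    # Standard normal CDF in terms of the error function:
    # Phi(x) = (1 + erf(x / sqrt(2))) / 2
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

for sigma in (2, 3, 4):
    p = normal_cdf(sigma)
    print(f"+{sigma} sigma: above {p:.4%} of people (about 1 in {1/(1-p):,.0f})")
```

At +4 sigma this gives roughly 1 person in 32,000.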


Hmm... a brag like that is pretty difficult to measure.


So in this case he is saying his talent is higher than 99.9966% of people.

+4 sigma for certain things, not overall. It's not even possible to measure "+4 sigma overall intelligence" and if it were, I doubt I'd be the one who has it. But there are certain subdisciplines where I'm in that range.

I say "+4 sigma" because for the vast majority of companies I've observed, I could run them better, and they seem to at least think they have 2-3 sigma talent.

If we accept that the people who actually are in charge are +2 to +3 sigma minds, then I'm easily +5 based on the difference between me and them. If we assume they're idiots, then it'd be generous to give me +3. The reality is probably between the two.


Assuming it is IQ, for example, I would guess that +4 sigma would mean 4*15 = 60 above the mean, which if you take the mean as 100 gives 160. I would welcome a better clarification of this.
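That arithmetic checks out under the common Wechsler convention (mean 100, SD 15); a trivial sketch, with sigma_to_iq as a made-up helper name:

```python
def sigma_to_iq(sigma, mean=100, sd=15):
    # Map a z-score to an IQ score on a mean-100, SD-15 scale.
    return mean + sigma * sd

print(sigma_to_iq(4))  # 100 + 4*15 = 160
```

Note that some older scales used SD 16, which would put +4 sigma at 164 instead.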


It means he knows better than you even when he doesn't. ~

~ = snark, allow me one snark! :)


What do you mean, participation or equity?

This would have been great when I had time off to work on my compiler back in the summer. Oh well, the algorithm still had/has bugs in it anyway.


How would that work?

I'm actually reasonably happy with where I am, because I have a career strategy that I think will work. Startups aren't the only path and, if you have no connections and are going to be "just a programmer", they're not always even a good path.

Most successful entrepreneurs started in finance because, for better or worse, that's a way to build credibility. Remember that VCs are also financiers and will be biased in favor of that experience, even if it's not relevant to what most startups need.


What is that strategy, by the way, if you can share?


At a finance company with no defined "data science" role.

Attempting to see if I can create that niche (data science) in a large company, first for myself. If so, then things will go really well. If not, then I'll probably be getting back into "regular tech" mid-2013.


Ah, that sounds like a good project, thanks for sharing.


I overwhelmingly agree, and I think an additional problem is that a large portion of "technology companies" are not really technology companies. Most are actually consuming and applying technology - not developing new technology in any major sense. For example, is your run-of-the-mill VC-backed company churning out an app with a sprinkle of social integration and location awareness a technology company? I would argue that calling what they do "technology development" is quite generous.

The resulting problem is that most of these companies do not really care about pushing technology forward or contributing relevance to the technological sphere - just figuring out the right arrangement of features to make the thing not completely fall apart. Just as an anecdote: during my career I've worked with 4 PhDs (Stanford, MIT x2, Brown), and in each case they were essentially very educated software implementers. Management would dole out tasks to them and me that were just typical: we need an ad unit here, an app for this, an analytics system for this, a landing page for this. I don't criticize the PhDs I worked with - I'm sure they were trying to pay their bills like everyone else - but it was disheartening that our current system/environment had allowed such great minds to become puppets of management who were the least qualified people in the room to be working with technology.


"VC-istan's "serial entrepreneurs" are just glorified PMs whose egos make them unemployable".

Notable quote that I agree with somewhat.


Small sample size, but of the 2 startup CEOs I know well, it's 2 for 2. There are others who seem fine, but I don't know them well enough to judge their true character.

If we start with a Beta(2, 2) prior on the percentage of CEOs who are insufferable, egotistical fucks, we get a Beta(4, 2) posterior. So the 95% credible interval on the percentage who are insufferable, egotistical fucks is [28.4%, 94.7%].
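Those numbers can be reproduced without a stats library, since the Beta(4, 2) posterior has a simple closed-form CDF; a minimal sketch using bisection (helper names are mine):

```python
def beta_4_2_cdf(x):
    # CDF of Beta(4, 2): the pdf is 20 * t^3 * (1 - t), and integrating
    # gives F(x) = 5x^4 - 4x^5 on [0, 1].
    return 5 * x**4 - 4 * x**5

def beta_4_2_quantile(p):
    # Invert the CDF by bisection; 60 halvings is plenty of precision.
    lo, hi = 0.0, 1.0
    for _ in range(60):
        mid = (lo + hi) / 2
        if beta_4_2_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

lower, upper = beta_4_2_quantile(0.025), beta_4_2_quantile(0.975)
print(f"95% interval: [{lower:.1%}, {upper:.1%}]")  # matches [28.4%, 94.7%]
```

With scipy available, `scipy.stats.beta.interval(0.95, 4, 2)` gives the same endpoints.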


"Flight to Substance" is something I think a lot of us are interested in, but it isn't always obvious how to do it. My take on it is that you need to find niches which are underserved and where you can do something really meaningful. I think there are quite a lot of these.

We started a foundation that builds web and mobile services, on open source software, which are used to fix poverty. This area was essentially not considered a market when we started, so the threshold to get started and recognised for doing something useful was relatively low. As a non-profit foundation we could also take investments from those that wouldn't invest in your ordinary tech startup.

We aren't doing this work to get rich, but there are a lot of other benefits. We have very patient investors and we have a lot of freedom in implementing our vision as we see fit. It is also pretty amazing to be working on software and services which actually make a real difference for those that have it the worst.

One of our core services, a mobile phone field survey application, has been used recently to do baseline surveys of all public water points (wells, etc.) in both Sierra Leone and Liberia, providing data that just didn't exist before. 30 people go out on motorbikes for 3 months to collect data and come back with tens of thousands of surveys. This data is then used to drive national policy on rural infrastructure improvement. This stuff makes a real difference. And we have agreements in the works which will make this type of data collection possible all across sub-Saharan Africa, with our tools used by governments, NGOs and multilateral organisations such as UNICEF.

It feels like working on stuff that matters. It is both technically interesting and there is a lot of work to be done. I am sure that in education, healthcare, government, public infrastructure and other areas you could think of a lot that could be improved, and that you could get unconventional funding to do it.


The relevant article about open allocation, if anyone wondered: http://michaelochurch.wordpress.com/2012/09/03/tech-companie...

Basically the point of the article is: let developers choose their own projects rather than forcing projects on them; they will choose better, and the projects will be much more likely to succeed.


The first half of your post is one of the most insightful observations I have come across in a long time.

Thanks


Hacker News post of the year.


well said


This guy is just making noise, asking us to let go of anything cool we've been working on for years so he can improve his 2013 headlines. Don't fall for it.

One thing the OP surely got is the title of worst article of 2012. The year is almost over, and he has set the bar so high that it is now almost impossible to take it from him.

Now back to the point. Innovation is not some new shiny thing that no one has seen before. Instead, real innovation is something you work on until it is itself the definition of perfection according to someone's vision. Thus iteration is the mother of innovation.


Lots of people are bored. This is the kind of garbage that should stay off HN. A contentless POST. I believe that's a HEAD.



