While listening to the talk, I took notes. (Not exact notes; I probably reworded several things, hopefully without twisting the meaning too much.) Here they are, if you don't have time to watch the talk.
Google, "where I work right now", is doing great work to try to change the world. At least more than other companies are.
I work on compilers. I like working with big data and learning about data mining, because even compiler work requires large-scale data analysis, and I face tremendous scaling problems.
Hollywood blockbusters, summer 2011: why is this slide here? These summer movies are all crap, because the studios are greedy. They are incremental, not shooting for real quality or real game changers. They chase money.
Except for "auteurs": people who make money while keeping their principles. E.g., Pixar. They show their passion, make everyone else look bad, and make money as well. Apple is also a great example.
Social networks: this is what I work on at Google :-( (lolcatz pictures on the slide). Is this principled? It is fun, and it makes money. But is it principled? Is there anyone in this crowd not working on social networks? This is hype. Why is everyone working on this? This is money chasing.
You are interested in social networks now, but when you are 60 you will be interested in your health. By then it will be too late. At 60 you will wish we had solved these fundamental health problems. They are hard problems that require math, statistics, and big data.
Human Genome Project: this will be an inflection point in human history. It is also a data-mining project: reverse-engineering the source code (the genome) with respect to how treatments work and how effective they are. The people who could solve it, the data-mining people, are working on crap problems, lolcatz social networks :-(
Let's affect a culture change.
short-term: infrastructure and scaling
medium-term: math, data mining, bioinformatics
long-term: important problems
I had a midlife crisis immediately after rehearsing this speech once. I am not following my own advice. I have started working on math every evening. And I am officially quitting that social network job at Google. (Is he also quitting Google?)
This way I will be ready when we are in a position to face those important problems in five years.
The problem with calling out specific companies for quality is that things can change. E.g., Pixar, who would clearly have belonged on his list a couple of years ago; but post-Disney, post-Cars 2, post-Newt being cancelled in favor of Monsters Inc 2, post the Canadian branch being formed to shit out TV specials... well... things can change (including for the worse) quickly.
This is a very important lesson, and one that I would argue books such as "Built to Last" completely failed to foresee: many of the companies they thought were made of gold turned out to be duds merely a couple of years later, when the late-90s stock market bubble burst.
It feels hard to get into bioinformatics without a PhD. I would love to tackle problems in genetics, medicine, and so on, but I'm not going to commit 7 years of my life to breaking into a highly regulated sector before I even get to "Hello World!".
My father is in the hospital right now and I feel the same frustration. A family friend who is a professor of radiology mentioned that a major driver of research in medical imaging comes from video game graphics work. He even submitted a paper to SIGGRAPH on interpolating 3D space from 2D CT scans. There are direct and indirect applications from the techie scene; it just doesn't get the kind of press consumer tech does, and the hacker scene here tends to be at universities and teaching hospitals.
It is hard to get hired there. That creates a relatively closed system where the software is in a state that can benefit significantly from even a short involvement by a generic professional programmer; at least that was the case on the few occasions I or my friends happened to touch it.
With respect to open source work, the field is wide open. For example, since "data mining/big data" was mentioned: an interesting recent development, synthesizing the general availability of cheap server farms (hardware- and software-wise, in particular cloud/Hadoop) with a new approach, "meta/shotgun sequencing", enabled by those advances, can be found here: http://bowtie-bio.sourceforge.net/crossbow/index.shtml
When it started, I thought Yegge sounded nervous and jittery and seemed a little intense, like he had a chip on his shoulder, and I thought, "Oh boy, I hope he doesn't have a meltdown."
Then as the talk progressed, I realized he was just excited/nervous and that's how he talks.
He gradually hit his stride and the message that his talk was meant to convey slowly started to take shape for me... and it's a hell of a positive message.
It is a call to arms to give a damn and use our powers for the advancement of everyone. To stop spending our free time working on icanhascheezeburger SMS alert apps, pick up a book on mathematics, bioinformatics, data mining, or other hard topics, and start learning.
It is a call to arms to send yourself back to school (in a sense): don't be afraid to start learning about other topics that have always seemed interesting to you but that you figured were outside your area of effect, e.g. "I'm a server guy, I'll never do anything interesting in 3D visualization!"
It is also a call to arms to make money and effect change with principle; like a Google or an Amazon.
You don't need to scrape every last piece of skin off your customers' hides to post big quarterly profits and be successful. You can develop positive relationships with your customers, employees, and the world around you and STILL make the money necessary to continue growing and innovating.
The "quitting" part of the talk is unimportant; it was just his way of illuminating his point. The value is in his message.
So I'm not really an ask me anything type guy, but I work on a bioinformatics team at an academic cancer research center. We have genomic sequencers and a software stack running 24/7/365 plus your normal collection of IT and small dev projects here.
If you have questions about what it's like to be a hacker in this type of environment, post them here and I'll share what my experience is like.
BTW, I completely wish I knew more stats and bioinformatics, so I probably should purchase the Yegge book collection myself . . .
I guess my biggest question is: in what way can programmers outside that specific discipline pitch in and help out? Are there relevant open source projects that could benefit from someone with a lot of programming experience and a slightly-better-than-layman science background?
Bioinformatician here: First of all, I think it has to be clear that a lone bioinformatician, or even a group of them, isn't going to change the world. Essentially, when you sign up for this, you're still a cog in a machine, albeit a slightly more altruistic machine.
Here's my pet peeve in bioinformatics: if there's one thing that's poorly suited to science, it's the building of computational infrastructure. We're talking basic stuff like databases, tools, etc. Sure, anyone can knock out a bit of code for a basic database, but the big problem is that there's no incentive to write decent code, or to maintain it so that it lasts any longer than the person is in the lab or has funding. So what would be great is if existing resources were cleaned up: data normalised and pulled out so that it is actually accessible for doing some kind of analysis on it.
If you want to do bigger work, something actually novel or of real biological relevance, there's no getting around collecting your own data (e.g., sequencing the crap out of a bunch of things). I'm in the process of trying to get funding for a project of mine to make that very leap.
I'm sure someone working on next gen sequencing (the new hotness) can pipe up with the big problems to be solved there.
I know a professor who has a big grant to sequence a whole bunch of animals, and who will also do a high-res CT scan of each specimen. He plans to make a comprehensive site where scientists and school children can access the data they are interested in and learn more.
He even has money for a dev position for 4 years. I'm just worried that he'll get someone who slaps together a proprietary and incompatible site, when this would be a perfect chance to experiment with implementing some standard data access APIs.
I've heard bioinformatics people complain about the lack of standards and fragmented nature that comes from various small groups of scientists doing it on their own.
If anyone is interested, PM me and I'll put you in touch. He is in South Carolina, so he can't offer the salary and other perks of the Bay Area, but it's a real chance to put good development energy into science.
This is a problem that is pretty endemic to academia in general. The revolving door of most academic labs means a lot of knowledge and tools are lost or not maintained. It is frustrating to see this happen in every single lab, but short of fixing the labor-mill mentality of academia, it will never go away.
I fit the mold you are talking about. I started working for a neuroimaging lab about a year and a half ago. Basically I'm not a scientist at all; I did math as an undergrad, but almost no stats or any of that type of voodoo stuff (topology, abstract algebra, etc., real math stuff is what I did ;p).
So basically I had no clue what neuroimaging meant or did, other than that people get shoved in a scanner, huge magnets turn, and they see inside you ;p
But what I found is that there are plenty of computer science problems in the field of neuroimaging (and neuroscience) that a programmer can help with (processing, image analysis, data mining, storage, visualization, whatever). Most labs don't really have people whose primary job is programming. Thus there are lots of tools that are just long-forgotten hack jobs that no one is maintaining but everyone depends on. Those can be improved by good programming practice with real programmers behind them. What sucks is getting funding for these people, but open source can help here by pooling people from many labs into common projects.
Also, if you do work with scientists, most of them will talk to you for hours about the science of what they do. You can usually ask the stupidest question and they will be happy to answer it. I've found that most people I work with are open, even more so if the work you do helps them achieve their scientific goals. So in the end, if you need to learn some science, they will usually be helpful.
(BTW my project is in my profile, will be open source soon, waiting for some political approval process).
My gut feeling is that there is a large need for good query/visualization tools for the datasets the sequencers produce. At least, I think if researchers could "play" interactively with data they would be pretty excited. But I am most definitely a non-expert in this area, so take that with a grain of salt. I sometimes wonder whether tools built on column stores (e.g., the programming language J, or something like KDB+) would actually be good for data exploration.
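To make the "play interactively" idea concrete, here is a tiny sketch in Python under an invented toy schema (the sample/chrom/pos/quality columns are my assumption, not a real sequencer format): load a handful of variant calls into an in-memory SQLite table and run the kind of ad-hoc aggregate query a real column store would make fast at billions of rows.

```python
# Toy sketch: hypothetical variant calls loaded into an in-memory table,
# then explored with ad-hoc aggregate queries. Schema and values invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE variants (sample TEXT, chrom TEXT, pos INTEGER, quality REAL)")
rows = [
    ("s1", "chr1", 12345, 48.2),
    ("s1", "chr2", 67890, 12.7),
    ("s2", "chr1", 12345, 55.0),
    ("s2", "chr2", 67891, 9.3),
]
conn.executemany("INSERT INTO variants VALUES (?, ?, ?, ?)", rows)

# Ad-hoc exploration: mean call quality per chromosome.
for chrom, avg_q in conn.execute(
        "SELECT chrom, AVG(quality) FROM variants GROUP BY chrom ORDER BY chrom"):
    print(chrom, round(avg_q, 2))
```

The interesting part isn't the SQL; it's that a researcher could iterate on questions like this in seconds instead of filing a request with a programmer.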
You can probably find research groups via TCGA that might appreciate some one-off development or support, but it might not be exciting from a tech viewpoint.
There is a ton of EMR (electronic medical record) data out there in free text. If you have skills or interest in things like Lucene/Solr, I would bet that almost any research hospital would appreciate your time and skills. And, if you talk to the right group, they might want to hire you . . .
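For flavor, the core trick behind Lucene/Solr, an inverted index, fits in a few lines of Python. This is only a sketch over made-up EMR-style notes (the note text and the AND-only query semantics are my simplifications); real deployments add tokenization, stemming, ranking, and PHI controls.

```python
# Minimal inverted index over toy clinical notes: term -> note ids.
from collections import defaultdict

notes = {
    1: "patient reports chest pain and shortness of breath",
    2: "follow-up after chest x-ray, no acute findings",
    3: "diabetes management, metformin dose unchanged",
}

# Build the index once; lookups are then independent of corpus size.
index = defaultdict(set)
for note_id, text in notes.items():
    for term in text.lower().split():
        index[term].add(note_id)

def search(*terms):
    """Return note ids containing every query term (AND semantics)."""
    results = [index.get(t.lower(), set()) for t in terms]
    return sorted(set.intersection(*results)) if results else []

print(search("chest"))          # → [1, 2]
print(search("chest", "pain"))  # → [1]
```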
edit: also think about contributing to the CATMAID project, which is the software serving the browsable data set above, and perhaps will someday enable crowd-sourced markup: http://fly.mpi-cbg.de/~saalfeld/catmaid/
How would you describe the attention to craftsmanship of software in scientific computing? The most illuminating thing about Climategate to me was the state of the software: is this kind of code widespread? Is there interest in improvement? If I went to work for a research team and suggested pairing, code reviews, version control, continuous integration, or other accepted good practices from my experience in software development how would that be received?
Also, the dilemma in my mind is whether I can stand going back to grad school in my late 20s for a career I don't really know much about. I'm not sure if you can speak to that experience, but would you say it's been rewarding?
Software Craftsmanship is often disastrous in Computer Science departments, never mind other fields.
I remember getting a C program from one of the biggest names in both undergrad CS education and machine learning and finding it wouldn't run on 32 bit machines because it had a static array that consumed 4 GB.
Even in computer science, the product is papers, not working software, and the situation is worse in other fields.
As someone with an academic background, I think going from "pictures of cats" to "math and science" is like going from the frying pan into the fire. Entry-level positions in the math and science juggernaut pay 2-5x less than what a junior or senior person in the social media juggernaut gets. You can sink anywhere from 5 to 10 years into getting a PhD, and then you'll find that there are just enough new jobs for the children of yesterday's professors who weren't totally destroyed by their upbringing, and that they've got an insurmountable advantage in the game of musical chairs.
Science and Math is a system that uses up young people, especially men, the same way that the racing industry uses thoroughbred horses. There's no realistic career path for 95% of the people who get involved... other than working on "pictures of cats" or whatever it is that pays.
I haven't had much luck getting people on my team interested in code reviews. The day I started we put a ton of stuff in version control though.
In general, you should interview a team when you're trying to get a job and trust your gut on how much those practices you mention will be accepted. It's a little harder here, because for years the model was assign a dev to a project and that dev will own the entire project from start to finish. We still don't do a great job working as a team.
Craftsmanship can take a back seat to schedule. Lots of operational stuff (for grants and administrative purposes) gets pushed off to the last minute. And for research, the focus here is on the results much more than the process. The researchers don't care if you do it in a bash shell script or a Clojure jewel as long as it's done, kind of like a startup. So if you can do it fast in a maintainable and cool way, all the better.
We are suffering the effects right now of some bad QC code. A not-insignificant amount of data had to be re-analyzed because of some code bugs. I would say interest in improvement is currently much higher. :-)
I did grad school (MS in CS), went to industry, and then came back. Returning was mainly lucky timing and personal network effects. But I was in my late 20s when I did grad school, so we're not too different. It was completely worth it for me.
Some PIs are like typical academics: they have an MD or PhD (usually both) and can be somewhat dictatorial. But some are great and want your thoughts and expertise.
You could consider simply getting a job doing typical IT work in an organization like this. You're probably right that eventually you'll want some grad school (we are walking around in credential heaven here), but you could always try it out first.
This is my first bioinformatics job and my first return to an academic environment since finishing a MS in CS and doing the standard industry thing. This group is mostly a core services group rather than a pure research support group.
I came in to do traditional data warehouse work: loading data from separate databases into one spot to make reports across those different datasets easier. That's most of my days now, simple data integration and reporting. Oddly enough, we are an Oracle shop because our university has a site license. But I do a little Ruby/Rails/Sinatra for a project that is reporting plus some app-like enhancements. Tons of SQL for manipulating data, but it is usually simple SQL, especially compared to the stuff I wrote previously at a financial corp.
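A toy version of that integration work, sketched in Python with SQLite standing in for the separate source systems (the clinical/samples tables and their columns are invented for illustration):

```python
# Two tables standing in for two source databases, joined into one report.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE clinical (patient_id INTEGER, diagnosis TEXT);
    CREATE TABLE samples  (patient_id INTEGER, sample_count INTEGER);
    INSERT INTO clinical VALUES (1, 'melanoma'), (2, 'lymphoma');
    INSERT INTO samples  VALUES (1, 4), (2, 7), (3, 2);
""")

# The kind of "simple SQL" report described above: diagnoses alongside
# sample counts, one row per patient present in both sources.
report = db.execute("""
    SELECT c.patient_id, c.diagnosis, s.sample_count
    FROM clinical c JOIN samples s ON c.patient_id = s.patient_id
    ORDER BY c.patient_id
""").fetchall()
for row in report:
    print(row)
```

Most of the real work is deciding how the sources line up (shared keys, units, vocabularies), not the join itself.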
A significant amount of my day is ferreting requirements out of users. I wish I were better at this.
There is a much larger amount of grunt work happening in the data management space than I expected. We deal with a huge amount of protected health information (PHI), which can be frustrating since you're usually working around data policies that require somewhat careful interpretation. Obviously, the genomic datasets are huge, so dealing with storage and clusters for data processing is often a discussion point around here.
It is an academic/research place, so it is a little more relaxed than industry. Expectations (for software) are a little low - we're talking about users who have, for the most part, kept data in spreadsheets or little Access style databases for years. And they often haven't had the budget to hire dedicated software guys, so sometimes tools are suffering from bit-rot.
Good hackers seem to be appreciated. It is below your standard industry salary, but not terribly so. I try to avoid being sucked into low-value work since "the reward for work well-done is more work."
Just like grad school, if you want to work on a specific topic or research area, make sure you get a job in a group doing that work. Once you're established, it seems easy to get some space to do some exploratory hacking.
This sounds quite accurate. Just a minor thing to add: said DB vendor is heavy into politics, and taking off the "Obstacle" goggles is part of the cure.
And the environment will try to shoehorn "bioinformatics/medical informatics/etc." into "mostly a core services group" of technicians, like cleaning or kitchen staff. Though in discussions it will be admitted that informatics/math is one of the most promising keys to advancing medical research.
J. Quackenbush called these groups of people intellectual peasants. Unless these groups are accepted as peers, I can hardly see any true systems approach succeeding.
Let's say I want to get into this field. I have a BBA in Econ, but no upper-level math. Would I be better off getting a BA in Math and then going off to grad school (in CS or Bio, or both)? Or would it be a better use of time to take certain math courses and then just head into a master's program?
My co-workers have very diverse backgrounds - PhDs in bioinformatics all the way to folks with a BA in history.
They all seem to be hackers. So I would say, if you're already a hacker, you can probably find a way in somewhere given enough time regardless of your current degree level.
If you know you are going to head off to grad school, don't do another undergrad degree first. Many CS programs take on students who need a little remediation in math or CS background. I would guess that would be much more difficult for an MS program in biology.
You might even find that an MS in bioinformatics is the appropriate choice. There seem to be several of these programs floating around; they are multi-disciplinary to begin with, and if you can crank out a good GRE score, you can probably get into one without too much effort.
You will probably find it easier to break into the academic research world with at least an MS. But I hear lots of buzz about commercial companies wanting to get into the sequencing business. Maybe you could just give yourself a crash course in Python or Ruby and a little biostatistics and sneak into one of those? Especially if you can find someone in your network already working in the field.
I recruited a friend to work with me who had a BS in history and an MBA (I teased him). But he was already working in the software field, and I knew he would pick up the simple tech we were using to get work done.
Stevey gives up being part of the chase for the superfluous (money), and calls on us to do something about the necessary.
No one needs a million dollars. No one. Why are we chasing it and dying of heart disease, heart disease we could cure if we started chasing that instead?
Our priorities are absolutely messed up and it's time we start realigning them. This isn't a speech; this isn't a funny resignation. This is a clarion call to join in. We can do so much better. We can achieve something valuable, if we start to realize where true value lies.
Food, a roof (with an internet connection reaching under it), freedom. In the Bay Area, if you have a mortgage, it is half a mil bare minimum, and even a million may not take you far enough. There have been experiments to build societies with a decreased degree of connection between food, roof, freedom, and money; somehow it always went like this one: http://www.globalsecurity.org/military/world/dprk/dprk-dark....
And you can't build SpaceX without a bucket of millions. (Copenhagen Suborbitals are beyond wonderful, yet they are in the Virgin Galactic league at best, even in their furthest plans. They show the possibilities of the future, and at the same time how far off that future still is.)
I think it should be noted that Steve is probably fairly wealthy by most any standard from GOOG & AMZN stock.
And with that wealth comes more freedom...to work on 'big' projects.
I think it's a bit disingenuous to ask people to work on non-lolcat projects when you are wealthy and set for the rest of your life. It's admirable to devote your life to working on big ideas, but I would argue most people who are doing so (even those mentioned in his talk - Gates, O'Reilly, etc) aren't really worried about money.
Most people are simply worrying about how to pay next month's rent or handle their family expenses. Sorry, but these people are not 'in' when it comes to working on big ideas.
In my case, if it came down to working for Facebook and making $10+ Million on the IPO or working to cure cancer - Facebook wins. Maybe after that I'll work to cure cancer.
Your argument is specious to the point of being crazy:
- Millionaires can work on important things. (Yes they can, but they usually just end up working on their tan.)
- Everyone else is just making ends meet. (Huh? Everyone?)
- If you have the chance, join FB early so you can be a millionaire and work on important things.
Look at the startup Color. It has brilliant people, and they are all working on ways to share pictures with people you don't know.
We're saying, "Stop It!" Maybe you'll get a hit and get rich but that's a stupid goal.
No one needs better ways to share pictures of your cats. There are important things to do instead, and our brightest minds, instead of engaging in ways to move humanity forward, are writing PHP to move themselves forward.
Let's stop focusing on the fact that he quit. And focus on WHY he quit. I'm sure we all can relate with what he's saying at some level. He's talking about tackling hard problems and not just low hanging fruit that might (emphasis on might) make a buck. And most of all he's leading by example. I haven't seen this much bravado from anyone in our industry in a long time.
downvote freely (but then don't laugh when watching Office Space).
If you have enough life experience, or read enough of Matt Groening, worst case watch enough Office movies/series, then you could suspect that bosses can be manipulative sociopaths, who do not deserve any professional courtesy.
BTW: most bosses don't fire face-to-face (Office Space), but through an indirection, Human Resources (or hired scumbags, like in Up in the Air) [and then good luck with your file].
I guess Steve had a reason to do this, and I respect him for having the guts to stand up in such a public way against dirty careerist office background politics, management decision support and calendaring theory.
Maybe this case doesn't fit, I have no clue (in my experience, it can be the "right thing to do" in large organisations detached from true ethics). But if Steve felt it this way, then it was this way (see Schopenhauer's most influential work, The World as Will and Representation)
I think the fact that he did it at a conference made it personal. See the other post that is a peer of yours on this thread.
Basically, Steve Yegge said that his boss doesn't deserve the respect of knowing this ahead of time. In fact, it's quite possible that 1000 people will know this before his boss does, including his boss's boss.
In many regards the only person this was disrespectful to was his boss -- and that makes it personal when you single out someone in this manner.
Am I the only one who fails to be inspired by this? I think I'm not inspired because it's something I've already thought about. We've always had the option to go make money, or to go try to fix the world... and whatever that brings us.
I am very clear on my chances of making big changes to the world: Almost nil. Instead, I decided long ago that I'd do my best to make money and improve my own life.
I'm not saying I don't do little things to help the environment, but there's no chance that I'm going to be on the team that cures cancer. There are too many people out there that are both smarter than me and more learned in the topics needed. The best I could do would be to get in their way.
This is cynical past the point of healthy. There are plenty of sectors that would be revolutionized if smart people focused their attention on them.
You don't have to cure cancer; that's absurdly binary and based on fame. You can do less sensational things like invent a device that improves patients' lives using modern robotics and sensors. You can design a system for Alzheimer's patients that incorporates your knowledge of big data sets. I can't dream of all the things to improve, but your domain knowledge is probably tremendously useful in all kinds of fields.
In addition, even "cat picture project" technology can sometimes be used to implement "world changing" technology. I think "do what you love doing and always challenge yourself" still generally applies here.
Making the world better is a complex problem which depends on a hell of a lot of systems.
Having worked in academia, you rapidly appreciate that the people who keep the lights on - administrators, lab techs, librarians - are about the most important people there.
And that system? Depends on taxpayer funding. If you're doing things which are ethical, and you're doing them well, then you're helping cure cancer and all the rest, even if only indirectly. Maybe you could do something more direct; that's your call to make. But don't minimize the impact of doing good work.
I don't think he is trying to minimize the work of others. Steve is saying if you see a meaningful problem that you know how to attack, it's up to you to make it a priority rather than putting it off because you can't be bothered.
He's talking about working toward your potential and possibly making short-term sacrifices in exchange for the greater good.
Because if you have a unique perspective on a problem, you may be the only one in the world right now with the vision to solve it. Don't waste your time working on mundane shit when you know you could be doing something more.
I just recently decided to not pursue a PhD in scientific computing, so I'll be finishing my Master's and working at a startup starting in September.
One of the biggest reasons I decided not to do further research is that I have noticed an unfathomable number of people hobbling along on ancient and primitive tools, when there are existing implementations and papers available that would make their lives orders of magnitude more efficient. So I've made a personal choice not to pursue the latest, fastest, most cutting-edge algorithms, and rather to bring my knowledge, experience, and problem-solving abilities to people who are trying to solve real problems right now.
My point is that even though you think you can't compete with these guys who read, study, and do 'real' science, they could use a lot of help from the likes of you. Sure, learning some linear algebra and Bayesian statistics can help in directly implementing the algorithms, but usually the biggest problems I've seen are a complete lack of software engineering and hard-coding specific to certain data sets.
I think the scientists can gain a lot from the software engineering field, especially open source practices. They will be resistant to it, as others have pointed out the incentives don't always line up, on the other hand there is a lot of low-hanging fruit in terms of improvements a decent programmer can pluck.
"My point is that even though you think you can't compete with these guys who read, study, and do 'real' science, they could use a lot of help from the likes of you. Sure, learning some linear algebra and Bayesian statistics can help in directly implementing the algorithms, but usually the biggest problems I've seen are a complete lack of software engineering and hard-coding specific to certain data sets."
Yup, you said it exactly right. More programmers to assist the scientists. The scientists know the science part, but they need help with the informatics part (lots of it). Otherwise they do it themselves, and you end up with software that becomes critical but that no one is able to maintain.
Plus, working with scientists means you have access to experts in their fields, and they usually like to talk about their work, so over time you'll learn the science and the whys behind the stuff you work on.
No, you're not the only one. He's a bit naive if he thinks he's going to cure cancer by quitting his job and reading some undergraduate math textbooks. However misguided his plan may be, it's heartwarming that his wife agreed to be his study buddy, instead of divorcing him.
I'll watch this tonight, from the summaries I generally agree with him. I feel the same way about brilliant math and physics people doing HFT for Goldman Sachs. But I'd also point out that from a comparative advantage perspective, it may be optimal to earn lots of money in a "useless" area and then donate where it makes a difference. If you can earn $500k/year at a hedge fund and give half of that to SENS, that's probably better than quitting your job to learn molecular biology from scratch.
Maybe, maybe not. It's a real challenge to hire the good software engineers, but a good engineer can be extremely productive (the old 10x efficiency thing). So, if you are one of the good ones, you can probably do a lot more good with your skills than with money.
Put it another way, anybody can throw money at the problem (and a lot of people do). But not many can write algorithms that efficiently work on terabyte data sets.
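As a minimal illustration of that point (my own sketch, not from the thread): the essential skill for terabyte-scale data is streaming, touching each record once in constant memory instead of loading everything at once.

```python
# One-pass, O(1)-memory running mean; `values` can be any iterable, e.g.
# a generator over lines of a file far too large to fit in memory.
def streaming_mean(values):
    """Fold each value into a running (count, mean) pair."""
    count, mean = 0, 0.0
    for x in values:
        count += 1
        mean += (x - mean) / count  # incremental mean update
    return count, mean

# In real use this would be a generator over file records; here, toy data.
measurements = (float(x) for x in [3, 5, 7, 9])
print(streaming_mean(measurements))  # → (4, 6.0)
```

The same fold-one-record-at-a-time shape generalizes to variance, histograms, and the map side of a MapReduce job.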
What we all can do is donate our CPU cycles to Folding@home or WCG projects. How many of you have a PS3 sitting idle? Did you know that Folding@home is built into your PS3? You just need to turn it on :)
I hate to say this, but protein folding projects aren't good for any practical purpose (or show me a breakthrough). They are good for PR (public participation, awareness of the topic), but like SETI (FFTs on what is essentially CBR noise from space) they are quite futile: while in theory it could work, it is really not a reasonable way to use scarce resources.
It is literally burning electricity (oil, coal, Fukushima) for no real reason (does MS want to patent computational heating?).
We have no clue how proteins work and interact (this is why epigenetics is hyped these days). We have some "educated guesses", but it is mostly harvesting data from public databases and then doing some simple Bayes or correlation analysis, without much scrutiny of the harvested data itself (i.e., was it made up using multiple imputation?).
18 months ago I didn't think they had anything similar to self-driving cars, so it wouldn't surprise me if they did. The human genome project is pretty Google-y, with massive amounts of data and clever algorithms.
They had something -- I think it was a protein-database search project in Google Labs. It wasn't very popular, so I guess it didn't stick. BTW, 23andMe is close enough, and is also misleading enough (in scientific value)...
How is 23andMe misleading in scientific value? They just published "Web-based genome-wide association study identifies two novel loci and a substantial genetic component for Parkinson’s disease" in PLOS Genetics:
My robotics training hardly qualifies me to assess the output of their academic research. However, I applaud their commitment to publishing results in top-tier academic journals.
Off-handed comments (in a thread about Google) calling 23andMe's scientific research "misleading" seem a bit snarky. Educated discussion has a place -- e.g., in HN threads related to the company, or (especially) in the peer-review process.
I took a look at Hacker News' ranking algorithm a month or two ago, and it is way more complicated than just points and time. YouTube videos probably don't rank that high compared to other websites. People could be colluding, but I think it's more likely there is more to the ranking than we can infer from what we see.
[Edit: This was in response to someone asking why it wasn't on the front page.]
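The commonly cited approximation of HN's ranking formula (reverse-engineered by the community, not official -- and, as noted above, the live site layers non-public penalties on top of it) looks like this:

```python
def hn_score(points, age_hours, gravity=1.8):
    """Commonly cited approximation of the HN front-page score.

    The real ranking reportedly adds penalties (flags, domain,
    flamewar detection) that are not public, so this is only a
    rough sketch of the points-and-time component.
    """
    return (points - 1) / (age_hours + 2) ** gravity

# Time decay dominates: a newer story with far fewer points can
# outrank an older, higher-scoring one.
new_story = hn_score(points=10, age_hours=1)
old_story = hn_score(points=100, age_hours=12)
```

Even this simplified version shows why a briefly hot story can fall off the front page fast once the gravity term kicks in.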
I never heard him say "bio is the only domain that is world-changing". But unlocking the genome would have unequivocal implications for quality of life and longevity for all of humanity. What is more important than improving someone's life?
People thought we were "unlocking the genome" ten years ago with the Human Genome Project. If that taught us anything, it is that biology is a lot more complicated than simply parsing data. Biology is messy, complicated and breaks every single one of its own rules. Repeatedly.
I'm not saying bioinformatics is unimportant or unnecessary, because it truly is important. I'm simply tired of people (particularly famous people who are grandstanding) boasting that XYZ will "cure cancer".
As someone who was originally a science major, then switched to comp sci, I always get a romantic tingle when I hear about bioinformatics. Most of the science I've learned is gone by now, but I am sure it's still there somewhere.
What are the good starting points for learning about bioinformatics?
I'm not sure I agree with his premise that, in order to change the world you must work on a problem with "big data." I can't see how the energy crisis is fundamentally data-driven, for example, and it's hard to say that you couldn't change the world by working on that.
Regardless, best of luck to him wherever he winds up (within Google or elsewhere)! And I hope he keeps writing!
I think he's implying that it requires the same basic computer science hacking that is required for all of the "cat pictures" projects he points out. The point isn't that we should choose between data-driven and not data-driven; it's that we should choose between cat pictures and the human genome.
This past week, I decided to start working on a project to build a new type of speech synthesizer. Did I know anything about acoustics or linguistics? No, but I've been reading what I can find about them since then.
I used to have a philosophy of intentionally choosing easy, "overlooked" problems. I figured I wasn't that smart, so I should just stick to the simple stuff. The software I built was good and useful, but a lot of it is already becoming obsolete. I want to make software that will last 50 years, not just 5.
This talk came at a great time for me, and it's strengthened my resolve. I'm going to keep learning about speech synthesis and acoustics (which means a lot of math and physics that I slept through in school), and hopefully I can push the field forward a little bit.
I'm in my 40s and still learning plenty of math. I think it's more a function of focused practice than age. I've also seen several profs doing good work well into their 80s, so I don't know what your profs are talking about!
I think a lot of "it's much harder to learn X when you're older" (languages for example) is more about not caring as much or not having as much time than it is about having less ability to learn given those things. At least I hope so, since I'm rapidly approaching 30 and there are vast and deep realms of knowledge I'd like to explore but am just getting started now. =)
Everyone should work on the problems that are close to their heart, that they have a passion for.
He says something similar in the beginning, but then he confesses he's gonna do exactly the opposite.
This is major bullshit, and I would not take this guy's advice even if he paid me a million. Because he himself is not following it.
And even if he did, it is bullshit.
You should work on what YOU find important. Not on what someone else decided somewhere else.
YOU have intuitive intelligence that knows what's important and what you and ONLY YOU have a UNIQUE talent for (because you are unique).
I normally don't like video links, but this one is worth watching.
He starts by talking about how he joined Google because he believed they really wanted to change the world. They are basically the only ones fighting for Net Neutrality, etc. He said Amazon had a similar culture when he was there. He goes on to talk about scaling and how it might be the biggest problem for a lot of companies. Basically, everyone is working on some kind of scaling problem, and it's usually for a stupid "cat picture project" (social networking). Later in life, you realize there are more important things (specifically, health related in the video), but it's too late, because these tasks require math, stats, and domain-specific knowledge. We are mostly lacking the domain-specific knowledge. He talks about how you may wish you could go back in time to tell your younger self to do something more meaningful. He challenges everyone to learn something new and make a difference. With about a minute to go, he puts his money where his mouth is and quits his job.
Over the past 20 years, we've gone from writing software that runs on a single desktop with a very limited set of data to systems like Amazon and Google that accumulate large sets of data and needed to solve scalability problems. Now that we have all this data available, and the scalability problems are FAR more in hand than they were previously, the question is: what do we do with this data?
There are lots of ways we can use big data to solve real-world problems, but doing so requires a degree of fluency in the language of the problem you're trying to solve. Most data-mining knowledge is being used to sell ads or to make it easier to share and find pictures of cats on social networking sites, when with a little bit of domain knowledge you could literally change the world by solving a big, data-driven problem. Go do that. Learn on your own time and do something worth doing instead of finding new ways to earn a buck by sharing pictures of cats.
I would think that people flagging the story are weighing it down. If the talk is good, why would it be flagged? Maybe because the title of the submission is kind of sensationalistic relative to the talk. The talk is about working more on solving important problems and less on making cat-picture sharing easy and fun; it's not so much about him leaving his job, though it does illustrate his intentions.
Clearly, said employees need a video of Steve Yegge saying this in 2011 to realize they're working on meaningless stuff. xkcd #137 five years ago came close to marking the point, but it took Steve on Youtube to really drive it home.
While we debate the merits of Mr. Yegge's principled talk at OSCON Data 2011 and the potential ramifications that a sudden shift in priorities would cause in the developer universe, #imisswhen is trending from my local cat-picture dissemination outfit.
Personally, I believe we've done enough for humanity already.