IBM 1959 Job Post (twimg.com)
169 points by wave on Apr 28, 2012 | 106 comments



A few observations:

  - Lots of references to military applications
  - Nuclear reactors and shielding
  - FORTRAN!
An interesting note about the applications on the nuclear side: I worked in a semi-high security clearance environment on certain nuclear operations. One item we dealt with in particular was a very old algorithm implemented in FORTRAN. We were attempting to scale the system involved, and the implementation of the algorithm was a major bottleneck.

The algorithm was phenomenally complex (it's nuclear science, after all). And we had a documentation problem that was impossible to clear up with the original implementors: most of the team had passed away more than a decade earlier.

It was one of the neatest programming challenges I've ever encountered. Those old-school engineers were cool, and I wish our industry could keep more of those people around to pass along what they learned and teach the generations coming up. The technologies may change, but logic never goes out of date.
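For anyone who hits something similar: here's a minimal sketch of how one might wrap such legacy FORTRAN today so it can at least be timed and driven from a scripting language. numpy's f2py compiles old fixed-form sources directly; "shield.f" and its SOLVE subroutine below are hypothetical stand-ins, not the actual code.

  # Minimal sketch (hypothetical names): wrap a legacy fixed-form
  # FORTRAN routine with numpy's f2py so it can be timed from Python.
  # Build the extension module first:
  #   python -m numpy.f2py -c shield.f -m shield
  import time
  import numpy as np
  import shield  # the module f2py just built

  flux = np.zeros(1000, dtype=np.float64)  # example input array;
                                           # the real signature depends on the routine
  t0 = time.perf_counter()
  shield.solve(flux)  # f2py exposes FORTRAN names in lowercase
  print("SOLVE took %.3fs" % (time.perf_counter() - t0))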


What I noticed is how many references there were to wanting mathematicians to learn to code. It's weird because it's a good idea and I don't see that in job ads any more.


Yes you do. The branching out just happens earlier now, at university. The coding mathematicians are known as computer scientists.


Anecdotally, most computer scientists I know (excluding undergrads... I'm talking about PhDs, graduate and professional academic researchers etc.) don't write code and do a fairly poor job when they try.


Sure, but certainly they had the same problems in 1959. My point was that CS is a branch of Mathematics.


You might be surprised at how many of us didn't go into CS proper, but can still code. My degree was CMS and I still have to explain what it consists of all the time.


Given how I'm in the field, I've met several, so, no, I'm not surprised. Just because mathematics then, and CS today, is the obvious path into the field, that doesn't make it the only one.


I didn't really mean you specifically, but a lot of the people I've met lately seem to think that way.


Perhaps computer science wasn't widespread as a formal area of study in universities at the time, so people were looking for the next best candidate: someone good at physics and math?


Lol "the next best candidate". C'mon.

[edit to explain that gut reaction] Physics and math practitioners would be far better candidates for the domains described in this ad than a general computer scientist, even today.


The first comp sci degree in the US wasn't offered until 1962 (though the Cambridge Computer Lab had offered one since 1953).


The saddest thing for me is the contrast with what an institution IBM has been for America over the years. I went to UC Santa Cruz, where the Computer Engineering department was started by IBM alums (my advisor was Glen Langdon, of Langdon and Rissanen / arithmetic coding fame). Other friends/faculty spent time working at IBM Almaden Research, where several advances in storage technologies were produced.

These days, IBM is in decline. Friends/tech blogs/even sites like Cringely -- http://www.cringely.com/ -- note how IBM is quickly trying to shed any resemblance to its old self in the name of meeting investor expectations.

IBM isn't the only legacy company in this situation; we have seen the same happen with HP and AT&T over the years. Microsoft Research, Google to a point, and Xerox PARC (after a period of decline) are stepping up for some long-term/basic research. But, I wonder, will we ever see the heyday of IBM Research, PARC, Bell Labs, etc. again?


What about Watson?

IBM is still one of only very few companies doing truly interesting research.


Among the other achievements of IBM I would like to mention the 4 Nobel Prizes (3 in Physics):

http://www.research.ibm.com/resources/awards.shtml


> IBM is still one of only very few companies doing truly interesting research.

Can you qualify that? Almost every big technology company and thousands of small ones have teams of smart people conducting interesting research across many fields. Take a look at any issue of Wired or Technology Review (or the top links to HN on any given day) to see some examples.


Outside of Google, Palantir, and some of the other machine learning or niche consulting companies, I see little to no research on problems as hard and potentially influential as those that IBM or HP work on. The big reason you don't see what they do is that they don't work on consumer-facing products very often.

The companies whose work I see at CS conferences and in patent searches for core algorithmic techniques are mostly IBM, HP, Microsoft, and Google. The majority of companies that are in the top links on HN on a given day do not do interesting research of the type IBM is doing.

Further, there is a lot of reinventing of the wheel going on in the startup scene these days. Just look at "new" things like in-memory databases, column stores, event-driven frameworks (Node.js), and others. All of these things are many decades old and do not count as new research even though they are newly written implementations. They may be more approachable to the common developer than the decades-old implementations, and that has value, but this is not "research".


I probably should have said "important" instead of "interesting".

How many of those companies are doing research that can be compared to what came out of Bell Labs from the 40's through the 70's: the transistor, the laser, ...?

IBM's work on advanced semiconductors and Watson certainly qualify. The latter will probably be seen as one of the most important developments in the history of humanity, and it wouldn't have been possible without the former.


It's pretty hard to know what is important in the moment.


Not really. Lots of people knew that the laser was important, that the Xerox PARC stuff was important, that C/Unix was important, etc., even at the time they were introduced.


In the case of the laser, there may have been some sense that it was important, but at the time there were no immediate applications and it was called "a solution in search of a problem."


Wikipedia, though, says that "Gould's notes included possible applications for a laser, such as spectrometry, interferometry, radar, and nuclear fusion. He continued developing the idea, and filed a patent application in April 1959."


Don't forget Microsoft is also pushing the envelope. Their showcase says it all ... https://www.microsoft.com/en-us/showcase/channelDetails.aspx...


Check your link? That URL returns a server error.


/drum-drum


I hope Watson survives, but based on what I've heard out of Almaden over the years, I'm skeptical.


Survives? From what I hear, it's already being applied to medicine.


And banking. And insurance. The list will likely continue to grow.


Don't worry jmspring, Belphegor[1] may be slumbering, but he is alive and well.

1. https://en.wikipedia.org/wiki/Belphegor


It appears that your view of IBM, and of other tech companies in general, is skewed to some degree:

> "These days, IBM is in decline. Friends/tech blogs/even sites like Cringely -- http://www.cringely.com/ -- note how IBM is quickly trying to shed any semblance to it's old self in names of meeting investor expectations."

I believe that is referred to as resource allocation/shifting. All corporations do this, not just IBM. It's not necessarily a bad thing or a good thing; it just is.

> "But, I wonder, will we every see the hey day of IBM Research, PARC, Bell Labs, etc. ever again?"

Simply browsing the technology/science sections on Google News will show what research companies like IBM, Google, etc. are involved in. Sure, Watson is the first thing people think of when they hear of IBM, but it's not their only endeavor. Just off the top of my head, last I read IBM was conducting research in quantum computing and optimized processor fabrication techniques.


I'm not saying they aren't still doing research, but the level of research today is nowhere near what it was during their heyday. Furthermore, IBM itself is continuing to move away from the IBM of old and toward being just a global services firm.

Given its size, any change at IBM will not be immediate, but it has been declining more than rising.


IBM does quite a bit of research, they just don't blog or tweet about it all the time. Example of just one of their labs, right in the Silicon Valley foothills: http://www.almaden.ibm.com/almaden/welcome.html


Notice the lack of reference to rockstars, ninjas, or brogramming.


And also notice that, with full transparency, your work will ultimately contribute to strategies and innovations in the field of power acquisition and, by proxy, literally to the death of other humans.

Also worth noting, minus the overt militaristic references, is how similar these job descriptions are to a modern quantitative finance position.


The military is a tool. Like any tool, it can be used for bad things, and it can be used for good things. Without having a military, we would be up shit's creek.

I think you should be more thankful of and respectful towards the people "in the trenches," and also the technologists, who make it possible for you to live in a safe place and spend your time according to your desires.


Consider for a moment who that military decided needed killing en masse during the decade following the topic of discussion. Were all those communal farmers really threatening our way of life? When this army got its ass kicked and left with its tail between its legs a few years later, leaving those farmers to live their own lives as they chose to, did our society come crashing down?

Why didn't it?

There are no Mongol hordes waiting to invade; suggestions to the contrary come from those who profit from the fiction and those who buy into it.


> that military decided needed killing

No, politicians decided that. That's an important distinction. If we don't like what the military is doing, we need to place blame where it belongs: on the politicians.

> There are no Mongol hordes waiting to invade

The military is still highly important. Think about the role it played in the Revolutionary War, Civil War, and WWII, to name a few wars that most consider to have been justified for the US to enter. More recently, the Gulf War where Hussein invaded Kuwait. I personally believe that in the future, Iran and/or North Korea will acquire a nuclear weapon that threatens US citizens. Somali pirates. Who knows what else will come up. We still need a military.


So if the military does something and it is bad, then it isn't their fault because politicians made them do it. But if the military does it and it is good, then they deserve our praise and thanks....

Do you recognize the problem here?

Regardless, this whole thought that we should be "thankful" to the American military is absurd. They have billions of dollars to spend every year, why do they also require thanks? Do tanks run on thanks?

No. Thanks is needed because without the blinding effect of popular societal support, their actions do not stand for themselves. The military requires praise to be shovelled onto it like coal into a furnace 24/7 so that its hired guns can continue to lie to themselves and sleep at night. Listen, if you think they should have thanks, then do it yourself; however it is not your place to scold others who want nothing to do with it.


(1) The military deserves praise or blame regarding how well they do, militarily speaking, and the politicians praise or blame regarding political decisions. You're conflating the two.

(2) And, yes, you should be thankful that someone is willing to defend your freedoms.

(3) I don't scold people who don't thank the military; I scold people who denigrate the military. Actually, it's not scolding, it's showing a more rational way to think about it.

(4) I've enjoyed this, but I'm done here.


[1] I only recognize that a distinction could exist when we restrict the discussion to conscripted armies.

[2] I assert they do nothing of the sort.

[3] Call it what you like, but telling people that they should be more respectful of the military when they express concern about being, by proxy, responsible for a loss of life, is... I don't even know how to describe it. Let's say "insensitive of the humanity of others".

[4] Cheerio


Perhaps the most unforgivable thing you've done here is forced me to weigh in on the side of javert. >:(

If you cannot separate political policies from military ones, you cannot pretend to argue usefully about this.

"I only recognize that a distinction could exist when we restrict the discussion to conscripted armies."

You need to further elaborate on this, because one easy interpretation is that you have no idea how to reason about volunteer armies. I'm not going to even go into more esoteric arguments about how "volunteer" anything actually is--just consider that the military does not (except in some bizarro world some people seem to want to live in) spontaneously go to war. The politicians guide policies, the politicians set agendas, and declare wars and deploy troops (for our sake here, I consider the President in the politician camp).

(I'll also argue that things like what the CIA/Homeland Sec. do that require drones and such are wrong, so save it.)

It is wrong to inflict injury on another human being (we could argue this, but let's not). That said, surgeons cut to good effect, police detain suicide attempts, and bouncers eject unruly patrons. We can argue that harm is being done in all those cases, but to good effect.

More to the point, though, javert is saying that the military deserves praise in terms of how effective/professional they are, and you did not disagree beyond saying that the divide between politics and force was beyond you.

You cannot with a straight face tell me that you refuse to agree that the (American) military is due praise for their ability due to their given directions--while at the same time participating in this community here on HN.

Tell me, what is the cost of pushing consumerism and advertising on people? What is the cost of buying the latest and greatest smartphone? What is the cost of the shinier, faster computer? What is the cost of developing games and amusements to distract and destroy manhours of productivity?

What is the cost of datamining to circumvent privacy and better target ads? Of streamlining sharing of information about friends who wouldn't do so themselves?

So, please, by all means, criticize those dumb sociopaths in the military--but hold yourself to the same goddamn standard when talking about the majority of your fellow hackers!


[deleted]


The ethical ramifications of work that is military in nature that an engineer must consider are close enough to on topic for my taste. These are concerns I know are considered not only by myself but by several of my peers working on the east coast, where jobs with military contractors are a looming presence in the minds of any STEM major looking for work.


"When this army got it's ass kicked and left with it's tail between its legs a few years later, leaving those farmers to live their own lives as they chose to, did our society come crashing down?"

You are seriously undereducated on this. Were you born this stupid, or did you put effort into it?

At the very least, go read about how the war was actually executed, and to what degree politicians influenced military decisions (where to bomb, etc.). Read about the push to let natives do more of the fighting, and read about how shitty the South Vietnamese government was. There were a lot of factors involved, and few of them were failures of the military.


Do you naturally miss the point, or do you have to put effort into it?


Please restate your point with less inflammatory language and more facts.


No, I like it as it is. Why would I neuter my language for your sake?


Not everyone has the same militaristic view of the world, nor the view that we should impose our views on others by telling them what they should be thankful for.


You're playing fast and loose with language, and that's a big no-no for thinking rationally.

> militaristic view of the world

That I think the military is a tool we need, does not make my worldview "militaristic." (Truly, I think the military is needed to ensure peace, but that's beside the point.)

should impose our views

I'm not imposing my views, I'm sharing my views. It's rational discussion. If you act as if sharing views equals imposing them, then you're giving a free bone to the dogs that want to destroy free speech (and believe me, they're more common than people think).


I was referring specifically to the "you should" language.


Fair enough, but notice that I'm backing that up with reasoned arguments. It's not intended as a commandment.


> The military is a tool. Like any tool, it can be used for bad things, and it can be used for good things.

Not all tools are neutral. Tools also have intended uses and inherent affordances towards certain uses.

Yes, you can use a knife to kill and also to cut a piece of a cake. An RPG? Much less neutral.

> Without having a military, we would be up shit's creek.

Yes, but if you did not have a military, lots of countries around the world would also be better off. You know, from certain imperialist, self-serving, resource grubbing, our-idea-of-society-spreading kind of actions...


Believe it or not, there are also plenty of "Defense" jobs that contribute to safety and the preservation of human lives.


Can you name a specific example?


Yes, easily. One responsibility of the Canadian Air Force is search and rescue on land and at sea.


Yeah, right. Couldn't that be implemented by some specialty rescue team?

Looks more like something that was assigned to the Air Force mostly for public relations ("look, we also save lives").


You mean like this specialty rescue team?: http://en.wikipedia.org/wiki/United_States_Air_Force_Parares...


Yes, except not having ANYTHING to do with the military.


Easily: thanks to the development and maintenance of the US nuclear forces, the US will never enter into a direct military conflict with any other first-world country with nuclear weapons of its own. Thanks to the doctrine of mutually-assured destruction, what could have easily been a massively destructive total war between two superpowers was restricted to a half-century of posturing and proxy warfare, all of which killed at least an order of magnitude fewer people than would have died in a direct, total war.

No, it's not warm and fuzzy to think about things in this way, but we're not children and we don't have the luxury of idealistic naivety.


This was 1959: the Cold War in full swing, WW2 and Korea a recent memory for many.


And today is the greatest time to be a programmer in the history of the profession. If ninjas and rockstars and brogrammers are part of it, it's certainly better than building war simulations and calculating civilian casualty rates in the case of mutually assured destruction in order to advance your profession and provide for your loved ones.

Those were not halcyon days for mathematicians, physicists and engineers. I'll take the juvenilization of the profession instead.


I agree with your sentiment, but it's plain ignorance to believe that those "war simulations" or any other software developed by the military is always intended to raise the kill count and not prevent casualties. To make my point, I would bet the invention of C made the military more effective in some fashion or other, but that's only a consequence of solving the more basic problems C was intended to address. I think it's hard to argue that the problems IBM was solving, or those that the military solves, are really any different in that respect. They solve one problem and then the solution can and will be adapted for other purposes, and it's often hard to judge whether that's a net good or not.


The mathematicians, physicists, and engineers of that era helped to preserve the survival of our civilization so you could have the luxury of being an idealistic pacifist.


Some of them worked for Nazi Germany, the Stalinist regime, or horrific Japanese medical "science"/torture, so not all of them were working for freedom. Scientists sometimes say that science is neutral and that it's up to society to use the results wisely.

I appreciate that scientists don't always have freedom of choice under a brutal regime.


I was talking about American scientists ("our civilization"), though the other western allies would qualify as well. Sorry about the confusion.


EDIT: My reply sounds bickering. Sorry, it isn't meant to!

Stalin was a western ally. Stalin was, without doubt, evil.

German scientists worked for the US; they were recruited through Operation Paperclip.

(http://en.wikipedia.org/wiki/Operation_Paperclip)

There's interesting stuff about President Truman's anti-Nazi directive (tl;dr: Don't use scientists who were (or were supportive of) Nazis) and the way that was ignored.


> Stalin was a western ally.

No he wasn't; he was just an ally. The term "western allies" excludes the Soviet Union, that's why I intentionally chose it: http://en.wikipedia.org/wiki/Western_Allies

German scientists who helped the US, to whatever extent they helped, did help to preserve our civilization, even if they tried to destroy it during the war. But let's ignore them; I'm talking about the thousands of American scientists and engineers who worked on defense applications in the 20th century, and to some extent about those of Britain, Canada, France, and so forth.


And so go the populists and idealists, the union makers and Jeffersonian/latent New Dealers. Do not think for a second that the intellectuals we hold in high esteem made the world better unilaterally. They were but cogs in a greater machine, and their opportunity to "preserve the survival of our civilization" came at the behest of greater men who navigated the realities of politics.


You assume that idealist pacifists could only exist in the current civilization and not any other.


It's rather hard to be an idealistic pacifist if you're within living memory of WWII, though some people surely accomplished it. It's especially easy in this day and age to mentally evade the notion that force is sometimes not only justified, but necessary and morally obligatory. It would not have been so easy to do so during the era that countries like Germany, Russia, and Japan were invading neutral countries and mass-executing their citizens.


"We will bury you" - Soviet dictator Kruschev, addressing a gathering of Western ambassadors in 1956, 3 years before this ad.

When a man with his finger on the button of half the world's nukes says things like that, you damn well better put your best minds to work figuring out how not to let it happen.


The nuclear balance of power in '56 (and indeed well into the 60s) was very one sided - this, after all, was the time of the fictitious "bomber gap":

http://en.wikipedia.org/wiki/Bomber_gap

The Soviets were, with some justification, terrified of a US first strike at the time.


You are still implicitly going with the idea that someone has to defend the current civilisation on behalf of inferior pacifists who won't.

What if there's a peaceful answer which sacrifices way of life instead of human life? Would supporting it be "luxury"?


Setting aside the fact that almost nobody would have agreed to it, surrendering and living under communism would have sacrificed human life as well, just as it did for every other communist country.


The original quote: "Like it or not, but the history is working on us. We will attend your funerals".


I think the early-to-mid 90's, just before the Internet explosion, were a much more interesting time to be a programmer than now. Back then, a bigger percentage of the focus was on using computers to do something, to solve a problem directly (as opposed to implicitly), and computers were actually getting powerful enough to do things cheaply.

People's cat pictures were a vanishingly small portion of the landscape.

As an example, cheap computing power is one of the reasons that, while at the beginning of the 90's nuclear power was considered a money-losing proposition by electric utilities and everyone was trying to get out from under their plants, by the end of the decade all the utilities were hanging on to their nukes for dear life and trying to figure out how to extend their licenses/service life.

Now, a much bigger percentage of the focus is simply getting information out of one spot, transporting it to another, and dolling it up with some marketing glitz.

That's not to say that the cool stuff isn't still going on, but that it is a smaller percentage and the profession has been dumbed down significantly (hence brogrammers and all).


Things that didn't exist in the mid 90's: Stack Overflow, Google, Gmail, Google Docs, OS X, GitHub, git, torrents, and basically every tool I use to make things with my computer (besides bash).

No I don't want to go back.


Yeah, back then you actually had to read the documentation (yes, it existed) and more or less know what you were doing. Copy/paste coding and questions from colleagues like "How to connect to datbase, pls help urgently!" were unheard of.


You are certainly welcome to your opinion, but it's unfortunate that you seem so heavily defined by your tools (especially when they aren't even ones you've written).


Where do you think most of the research funding in AI and ML came from?

At my first job we did some work in expert systems and AI (applying them to engineering problems), and one guy moved off those projects because of his concerns about the source of the funding.


Valid points, but isn't it awful also to have a company that basically wastes people's time and keeps them from doing anything but consuming?

I think computers should be employed to do useful things like 1. help us colonize space, 2. stop world hunger, 3. stop poverty, 4. stop war, 5. help people to love each other, 6. help us understand and love God/a higher being (assuming you believe in God/a higher being).

If your work is for some SaaS app that doesn't work toward those things, it is contributing to the death of humans just like a military application, only in a different way.


Notice that some things haven't changed at all: "We're solving tough problems!"


Which serves as exhibit A in the evidence against time travel being invented within a decade.


On the other hand, note that even back then a "senior" programmer is spec'd as having "2 years" of experience.


"Qualified candidates will have at least 5 years of experience with using Flowmatic in an enterprise setting."


Or "developer". You can probably plot the demise of programming as a respected profession as their titles changes from programmers to "software developers" or "architects", maybe using google books n-gram.


The funny thing about the problems stated at the top is I could imagine seeing a job posting today (maybe even from IBM) with the same problems and thinking, "Wow, I have no idea how I'd go about solving those problems."

It makes me wonder what they were really doing back then vs what a job in those fields would look like today.

Related, I find it somewhat annoying when people abstract the job to such a degree that you can't see the tangible things you'd be working on in that field. I'm all for a "change the world" vision -- I really am, not just qualifying -- but sometimes I'd like to hear up front how they plan to solve that problem.


Compare this to a more recent job posting, which more frequently reads like "Looking for a code ninja to make our photo-sharing app beautiful. You need to be awesome and make shit happen." Funny and sad at the same time.


I'm noticing that a lot of the expectations/requirements are much lower. Most are "up to 2 years" experience.

Last time I was job hunting, everything was asking for 5+ years experience in one software stack and multiple frameworks. Sometimes 5+ years in multiple fields. What changed?


Well, for one thing, the total number of digital computers in the whole world increased by a factor of about a hundred million.


What changed over the last 50+ years in computing that might raise expectations for programming experience? The personal computer "revolution" of the 80s and 90s would like to have a friendly word with you. :)


Computing as a whole has changed. We are comparing a time (then) when physicists and mathematicians were the ones laying the groundwork for what we have today to a time when people start programming to make video games and cool websites (today).

Also, getting experience back then was fairly difficult. Just getting your hands on a computer to get the experience would have been a challenge. Compare that to today where there are kids, literally kids, programming at home right now. For example, I started programming when I was 17.


Probably the availability of people with 5+ years of programming experience. It was 1959! :)


I don't know. At essentially every place I've contacted, I've asked about that clause. Not one actually expected to find it, but they all listed it.

5+ years seems to me to be a strange expectation. It would mean you've either had multiple failed jobs and might be an undesirable, or you've been somewhere for 5+ years. If you've been there for 5+ years, what's motivating you to leave? Where do people expect to find these vast pools of highly skilled jobless people who have experience with <software stack X> in <field which employs a couple thousand people nationally> within <narrow time window>? That they never find any seems to underscore how irrational the 'requirement' is in the first place, but I see it everywhere.


I remember reading a job ad back in 2003 that wanted someone with 5+ years of C# and .NET experience. I assume the position was filled before 2006. (It's possible but unlikely that they were solely looking to poach Microsoft employees.)


This happens a lot. HR ends up being in charge and comes up with requirements that don't make much of any sense.


I think it breaks down like this: no experience = SDE 1

2 years = SDE 2

5+ = SDE 3, regardless of language.


I am not sure if you are serious or not, but the post is from 1959. Experience in what they wanted was probably a lot more rare back then.


I think it's similar to the over-education problem. Oh, everyone has a bachelor's (2 years' experience)? Now we need everyone to have a master's (5-8 years' experience).

I'm actually curious whether one day there will be a reversal of the trend... I would think it would come at the point where supply no longer meets demand. But really, who knows.


Programmers need far less education now than they did then, on the other hand.


I love the fact that the problem domains remain so static. Well, that's not strictly true: we have a lot more problem domains with which to contend now, but even our tremendously improved knowledge in operations research, military science, and meteorology haven't led us to consider these "solved problems".

Though I personally wish we focused much more on (1) and (3) than (2) as a society.


The first thing I noticed was that none of the positions asked for degrees in Computer Science. All math and physics. Before its time, I guess.


Wikipedia says the first CS degree in the US was at Purdue in 1962. (And the first internationally was at Cambridge in 1953.)

http://en.wikipedia.org/wiki/Computer_science


It's amazing how so many of those projects remain largely unsolved today.


It was a time when people thought Artificial Intelligence was either on the way shortly (those who actually used computers) or already here with those Atomic Brains we have now (everyone else).

I just watched the Svengoolie episode of "This Island Earth" and, early on, one of the characters mentioned how that era was called the "pushbutton age". Well, we live in more of a "pushbutton age" now, but familiarity breeds contempt; conversely, unfamiliarity breeds a kind of awe, and unreasonable expectations that can leave a bad taste in people's mouths.

I personally remember going through it when the Internet was first beginning to trickle down to the masses, pre-Bubble, and I remember thinking that some of those ideas then were patently idiotic. But which!

And in the 1920s, radio went through the same thing, if not bigger. Radio!


Most of these problems, and jobs (with appropriate changes in the technologies applied), still exist today, and in greater numbers than in 1959. It's just that other parts of the industry have grown at a much more rapid pace. This is good to bear in mind when you hear people treat "technology" as synonymous with "web site programming".


Love the pencil. Nice period touch.


I noticed none of the software jobs were in India.



