When I first entered the working world as a programmer and administrator of an "academic computing center" in the early 70s, you met men like Ray rather frequently - ex-military, GI-Bill educated, having learned computers from the electricity on up in mid-career - either as customer engineers for one of the big mainframe manufacturers (there were 7 or 8, depending on when and how you counted), or from the minicomputer upstarts who were then assaulting the mainframe world of computing with their smaller, cheaper, 12- and 16-bit newcomers. Sometimes you'd get the privilege of a lunch or dinner with one sent out "from the lab" who was actually designing and building the machines you were working on.
It's hard to explain just how new it all felt, then. But in 1973, even though we were sitting on the cusp of the single chip microprocessor and personal computer revolution, the commercial computer was less than 20 years old, and college recruiting materials might well brag that at their institution, there were not one, but two computers on campus. I remember the day the total RAM at our institution passed the megabyte mark - closer to the end, than the beginning, of the 1970s. The ability to "program" was a rare skill - even the people who taught it were still just learning it.
My father was a physicist. He learned to program in FORTRAN at university in the 70's.
Decades later, when I was still a teenager, I asked him something like this: "Dad, you were a FORTRAN programmer and physicist in the 70's, you could be a very well paid developer anywhere in the developed world... why didn't you?" He answered: "I didn't think this thing about computers would go very far."
We probably are close to the same age. My dad was an engineer who also learned to program FORTRAN in the 70's.
When I asked him a similar question his reply was (quotes are paraphrased): "It was way too tedious to do. You'd spend hours getting the cards just right. We used to put them in a shoebox and mark them with a pen in case we dropped them on the way to the lab. Then you'd wait until the next day to get your results. If you had a mistake you'd repeat the whole process".
Basically, in his opinion it was considered tedious grunt work (at the time... he has of course since come to understand its importance).
> "I didn't thought this thing about computers would go too far."
I almost didn't major in Computer Science because in the late 90s, there were so many negative articles in the New York Times, vis-a-vis software. People don't remember it now, but the media and the culture were utterly hostile towards us, and loved to say our jobs were going to India, that everything there was to know about Computer Science could be studied in railyard switching, in existing abstract math textbooks, etc.
By a combination of luck, and my dad's insistence, I ended up at Carnegie Mellon, and while I was there, I saw what folks at Google were doing, and I thought to myself, no, this stuff is hard, and this is just going to be the beginning.
> "It was way too tedious to do. You'd spend hours getting the cards just right. We used to put them in a shoebox and mark them with a pen in case we dropped them on the way to the lab. Then you'd wait until the next day to get your results. If you had a mistake you'd repeat the whole process"
Even what came after that, e.g. in C / C++ was considerably tedious compared to what we do today. Folks sometimes had to do objdumps of compiled binaries to debug what was going on. We had to get coredumps, load them up, and try to determine what memory error had caused things to crash (this is an entire class of problems that doesn't exist today). You used to legit need that CS degree in order to code in your day-to-day because you had to understand the function stack, the network stack, basic syscalls like wait and poll, etc.
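For anyone who never lived through that workflow, here is a minimal, hypothetical sketch of the kind of memory error and coredump session being described - the program, names, and shell commands are illustrative only (assuming a typical Linux box with gdb), not anything from the original anecdote:

    /* A minimal sketch of the kind of bug that sent you to the coredump.
     * Hypothetical example; typical Linux workflow, details vary by system:
     *   $ ulimit -c unlimited      # allow a core file to be written
     *   $ cc -g -o crash crash.c   # -g keeps symbols for the debugger
     *   $ ./crash
     *   Segmentation fault (core dumped)
     *   $ gdb ./crash core         # load the dump...
     *   (gdb) bt                   # ...and walk the function stack to the bad line
     */
    #include <stdio.h>
    #include <string.h>

    struct record { char name[16]; };

    /* Returns NULL when nothing matches -- the caller below forgets to check. */
    static struct record *find_record(struct record *table, int n, const char *name) {
        for (int i = 0; i < n; i++) {
            if (strcmp(table[i].name, name) == 0)
                return &table[i];
        }
        return NULL;
    }

    int main(void) {
        struct record table[2] = { { "alice" }, { "bob" } };

        /* The lookup misses, NULL comes back, and the write below goes through
         * a null pointer. The backtrace recovered from the core file points here. */
        struct record *r = find_record(table, 2, "carol");
        strcpy(r->name, "carroll");
        printf("renamed: %s\n", r->name);
        return 0;
    }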
It was a lot of work for relatively little product, and I think part of the reason software is paid more today is 1. faster processing speeds, 2. better tooling, automation, and higher-level programming languages – all of which were enabled in part by cheaper / faster CPUs (e.g. people don't have to care about how slow Python is – you can optimize it after you find product-market fit) – and 3. a better understanding of how software should be developed, at all levels of management.
> I almost didn't major in Computer Science because in the late 90s, there were so many negative articles in the New York Times, vis-a-vis software. People don't remember it now, but the media and the culture were utterly hostile towards us, and loved to say our jobs were going to India, that everything there was to know about Computer Science could be studied in railyard switching, in existing abstract math textbooks, etc
I'm glad I'm not the only one who remembers this - whenever I try to explain it to someone they look at me like I'm crazy. In the late 90s and even early 2000s the common wisdom among guidance counselors and even local recruiters was that programming and software design were a dead end in the U.S. I remember one article literally said "the bud is off the blossom". I wound up majoring in electrical engineering instead of computer science as a result.
It all worked out in the end, but not following my instincts at the time is one of my few regrets.
It was hard to figure out at the turn of the century when the career fair was literally cut in half after the dot com bust. Although websites had been around for years, web apps were still pretty clunky and it felt like the world of internet-based possibilities still had a long way to go. I decided to try doing application development for pay because it seemed interesting and I figured I could easily switch to something else down the road. Plenty of relatives and acquaintances did inform me that my job was going to be outsourced abroad, though. :) And things looked dire again with the financial crisis but I was shocked that a few years after that, I discovered when recruiting at my alma mater that CS had become the most popular major whereas it was one of the smallest ones when I was studying it! So, lots of predicting that turned out differently...
Yeah, that's why I don't take re-kindling of the "it'll get offshored any day now" panic post-Covid that seriously. Time zones haven't gone away. The communication-based hard parts of software development haven't gone away. The way that delivering what someone asks for usually leads to them asking for more things, not fewer, hasn't gone away.
Yes, this is one reason I am personally really sensitive when various people say how privileged I was to get into computers and that we somehow got all this encouragement unlike young women, etc.
In the 80s we were mocked and called nerds for being interested in computers, and both before and after the dot-com era people thought this was a dead-end career.
Yes. Even as the internet started to become a thing in 1994-1995 when I was in middle school, I'd reckon less than half of my class had a computer at home - and fewer still of them would ever want to mention it.
OT, but when I search for "the bud is off the blossom" the only references I get from google are 2 links to hacker news comments... There's 0 in bing for that phrase. Never heard it before ever.
In the early days computer programming was considered a clerical job one learned in trade schools. I think people looked down on it partly because many of the early programmers were female, beneath the dignity of a male profession.
It took my alma mater MIT until 2018 to recognize software as worthy of a department in itself (after a huge financial donation). Before then it was a stepchild of Electrical Engineering. This is kind of ironic because most of my classmates and I ended up writing software for money, though almost none of us majored in that field.
> In the early days computer programming was considered a clerical job one learned in trade schools.
That's because in those days, the term "programming" didn't mean "software development", it referred to data entry. It actually was clerical work, comparable to typing a dictation on a typewriter. Only later, when user interface devices (keyboards, displays) considerably improved and it became more efficient to unify those tasks in one person, did "programming" and "software development" start to become synonymous.
It has nothing to do with "dignity of a male profession", or oppression of women, just a misunderstanding of a shift in the meaning of words.
> In the late 90s and even early 2000s the common wisdom with guidance counselors and even local recruiters was that programming and software design were dead end
The career advice I got as a teenager was that there wasn't any point doing software, as Microsoft had already made it all with Microsoft Office.
My mother talked me out of going to school for programming, and a decade after I graduated high school that’s what I ended up doing anyway, realizing it was going to lead to better prospects.
Universities are always several years behind the curve. At college in the 90s they were still teaching token ring networking despite Ethernet already being commonplace. The same college told me that programmers didn’t design any of the code they wrote; they only transcribed code from flow charts.
Just yesterday I was talking to a grad about DevOps. He said the field sounded boring from what he was taught at uni. Then when we discussed it more it turned out his “DevOps” course was actually just teaching them how to be a scrum master and didn’t include a single thing about automation, infrastructure as code, etc.
I also remember just how garbage general publications were with regards to IT. And to be fair they still are now. But there was always a wealth of better information in specialist publications as well as online (particularly by the late 90s).
That may well be true of some universities today. In 1970, they were pretty much the only place you could get hands-on experience with a computer unless you somehow slid into a programming job in the financial industry, or one of the few other areas that actually used them. And they were not behind the curve on the technology, although they tended to have lower-end hardware than industry, because any compute was very expensive. The invoice on a 64K-byte HP3000 in 1972, which on a good day could support half a dozen users actually doing any work, was over $100K. Memory upgrades to 128K ran you about $1/byte installed - maybe $8/byte in today's money. It was a big deal to be allowed hands-on use of them.
I was talking about 90s to modern era. Not just modern era.
And having computers doesn’t mean any of the lecturers understand the modern (for that era) trends in computing. More often than not, it’s computer clubs rather than course material that hold the really interesting content.
I don’t doubt there will be exceptions to this rule. But for most people I’ve spoken to or read interviews from, this seems to have been the trend.
It definitely is true of local universities. I've met people from the local university who have a master's in machine learning, yet have never heard of Docker.
This is a good thing. Opportunity costs are incredibly important with university educations because students have a limited time to learn.
Why spend the time futzing with a tool like docker? It's not foundational to machine learning, so learning that tool takes away from time that could be spent learning something more relevant. And the student may or may not use it when they get a job.
"Getting shit to work" is more foundational to machine learning than you would think, and containers helps a lot with that. If you want to train models on someone else's machine - and you probably will, for anything big - you need to know a little about how that sort of thing is done today.
And if you want to try two different deep learning frameworks, dependent on different versions of cuda, and want them to not break each other, God help you if you try that without containers.
It's not that they don't have a "course in docker". I understand that. It's that they haven't even heard of it, so they don't even know where to start to look for solutions to problems like that. I have been through that pain myself.
Containers are just one of the many easy things that make your job so much easier, which I've learned the painful way over 20 years as a developer in (mostly) non-elite companies, where no one else knew them either - they hadn't been taught at the local universities, because no one there knew them either.
It's highly dependent on school. The Ivies, including "public Ivies" will teach you proper comp sci. A lot of other big schools will do you well also. When it comes to smaller regional universities or junior colleges and community colleges, then it's hit or miss. Your intro CS course may be great if you manage to get an instructor who knows it well themselves and wants their students to know it, or you may get someone who teaches students how to do Microsoft Office without a shred of programming.
I went to RIT in the early 2000s. I remember the CS and CE departments were quite good (although the prevalent Sun workstations were already getting outdated). Somehow I ended up taking 1 elective from the "Management Information Systems" department and the instructor kept mixing up search engines and web browsers. I think I dropped the class shortly thereafter.
I was having to deal with token ring in '96-'97, and have not touched it since. Seems like it went away quite quickly. Cue up someone replying that they're still maintaining a token ring system in 2022... :)
I had to deal with token ring way up until 2001 when even the most die hard nuts had to admit that you could buy a dozen ethernet cards for the cost of a single TR. IIRC the TR people tried to convince us that ATM was the future.
Not quite 2022, but yeah, I was maintaining a token ring based network for some subway at my last gig in 2019. As far as I know, no work is done on it now, but the subway cars using the system are scheduled to run for at least another decade, so another bugfix release of the networking firmware is not entirely out of the question.
Hah, not quite nowadays but I, too, was dealing with one from around '97-2000'ish. What a pain in the ass. That was just one network in the building, I also had to deal with 10base-t, which was also a nightmare. shudder
I remember taking a graduate level networking course at NYU in the early 1990s. The instructor was an IBM consultant. We studied token ring, FDDI, SNA, HDLC/SDLC and several other commercial products.
One evening, I raised my hand and asked when we were going to study TCP/IP.
He simply quipped, "TCP/IP is not a real networking protocol."
So I wouldn't say that universities are always behind the curve :)
In 2015 or 2016 I was taking the computer architectures class at my local university… the processor they based the whole course upon was the Motorola 68000.
As far as introductory courses go, the older/simpler the processor, the better it is for everyone. My class groused at being taught "old tech" because we were taught the 68k, but very few of us had done any assembly before; I think most of the class would have failed if we had started off on amd64.
And why wouldn't they base it on that CPU? If you're trying to learn the basics of shipbuilding, you don't start by going on a deep dive into the construction of an aircraft carrier.
It's a simple chip, with a simple instruction set, that can actually be taught to you in the time allotted over a three-credit class.
The bit on "DevOps" is pretty egregious. There's two key things at stake here.
1. "DevOps" is an absolutely critical part of automation. It's the reason why we can start tech companies with such small engineering staff compared to 20 years ago. It's as important as all the high-level languages we use. This stuff is the logistics of how software gets deployed. It's the same in business as it is in war. Coding chops is like tactical strategy, and being able to ambush a tank column. It matters, and you won't have an engineering org without it, but the whole chain of how stuff gets deployed and iterated is what keeps the ammo flowing and the fuel pumping.
2. Universities want to teach stuff that'll still be relevant in 50 years. Given their proclivities, that means stuff like algorithms.
On one hand, I think that universities and academics can be somewhat forgiven for their ignorance on this matter. In fact I think we ourselves don't know what's going to be needed in our field in ten, twenty, thirty years. If the folks in industry didn't predict infrastructure-as-code 20 years ago, then the universities couldn't have taught it.
But what I know now is that:
- After all these years, no one is getting rid of shell scripting.
- Old school (i.e. 2nd generation) config management still has its place in many companies. Ansible is great for provisioning an AMI, if you need one, but if you need static infrastructure, puppet and chef are actually better because they track state, which allows you to better manage config drift.
- k8s may be hot and all, but a lot of the underlying "ops" stuff still translates. You average resource usage over pods instead of hosts, for example.
- Put together, there is an "instinct" for ops that is not unlike the "instinct" people learn for math, algorithms, and code. They are completely separate and an engineering org needs both. I think that universities don't "get" ops because computer science is more like math, whereas ops is more like history.
- On one hand, being stuck in an older ops paradigm is pretty awful – if you missed the transition to infrastructure-as-code, then it may be really, really hard to get out of that rut. But the field itself can be pretty bad with being stuck – it took us forever to give up our own datacenter racks.
- But otherwise, the old knowledge about old tools didn't necessarily just go away, in fact it's oftentimes still quite relevant. Linux internals (e.g. iptables) are still useful.
- When I was at CMU, a lot of folks learned some of that ops instinct in the dorm room, and in the computer clusters. But the universities pretty much made it optional. Looking back, I think this was a mistake. Ops is pretty much entirely transmitted through osmosis, whereas we at least try to teach people to code in official uni classes.
> ...and loved to say our jobs were going to India.
They weren't wrong, though; they just omitted delimiting that assertion.
Back in those dark ages, mainframe jobs were still considered by career "experts" the "adult in the room" jobs of programming. It is hard to convey to people who never studied that era or grew up in that era just how much microprocessor-based computers were considered "not real computing" in vast swathes of the industry. The proprietary Unixes thrived under that lay perception, as a "serious business" microprocessor-based computers market segment.
And the mainframe jobs did, by and large, up and wholesale decamp to India from large chunks of the mainframe account base. Those career experts were right in a way.
Just not quite the way they thought. The scope they thought in was too absolute, because they lacked the technical (and business, and financial...) perspective and context to understand why the same wouldn't happen to quite the same extent in sectors outside mainframes, or to foresee the explosion of wheel re-invention of many mainframe tech stacks that would drive the industry forward even to this day and beyond, along with the rapid recombination of new ideas.
I was using objdump and coredumps to debug a kernel crash just last week. Not tedious at all. More like working a difficult puzzle. And very rewarding if you figure it out and fix the crash.
objdump and coredumps today are way less tedious than getting a compiler error the next day (if not a few days out!).
At least with punched cards, if you kept them sorted (line numbers in front, à la BASIC, really helped with that) you could easily edit in place - just replace the one card that was incorrect, because each card = one line.
TECO (which begat EMACS) started out because paper tape, the preferred storage on DEC machines, was harder to edit in place than card stacks. Instead of retyping the whole program you'd summarise your changes (which you dutifully copied onto fanfold greenbar printout - or suffered) into a few complex commands, then used the resulting 4 tapes (TECO load tape, TECO commands tape, incorrect program, fresh unpunched tape) to get a corrected one.
For maximum efficiency, the OS/360 team had to work 24 hours a day - the programmers would write their changes on first shift, then teams had to prepare the cards and submit them for compilation, the night shift reprinted modified documentation, and when you arrived at work you'd have fresh documentation and the results of your compile (unless you had the luck to work on-line that day, with more immediate feedback).
You say it like negative articles about Comp Sci/Applied Programming/really any Tech Co from the NYT are a thing of the past. It's with a sense of irony that articles denouncing Tech are easy, routine clickbait for them now.
> I almost didn't major in Computer Science because in the late 90s
You missed, by a few years at least, the opportunity to study and earn a degree that is no longer available from CMU, the B.S. in Cognitive Linguistics. I got an early acceptance from CMU in late 1988, my first choice of education because I wanted that degree in particular, but I could not afford CMU tuition let alone housing, and I was ineligible for financial aid. I studied CS at Virginia Tech at about a tenth the cost and never regretted it. Though I never met him, Allen Briggs[1] was an underclassman there while I was an upperclassman. He ported NetBSD to 68k Macs while still an undergraduate at Virginia Tech, which always impressed me. A/UX licenses were not cheap, and MacBSD was free.
The backdoor into CMU back then was and maybe still is Pitt. Pitt students had the privilege of signing up for any CMU course and it just meant a slightly longer walk to class.
The Cognitive Linguistics degree at CMU in the 90's was an interdisciplinary combination of cognitive science, neurology, computer science and linguistics, and disappeared when the faculty member that created and sponsored it passed away in the mid-1990s. While Pitt is a quality university, I don't think they offer degrees from CMU. Pitt was on my radar and one of the few places I was accepted to, but out of state tuition at the time iirc was $7K/semester, more reasonable than CMU's ~$12K/semester, but I had moved to Virginia the year before, and Virginia Tech's in-state tuition was about $2K/semester with housing (though I was required to purchase $4500 worth of Mac and A/UX license). Today, Virginia Tech's instate tuition is as much as Pitt's out of state tuition was then, which is now about the same as CMU's private tuition was in 1989, and CMU's annual private tuition today costs a little more than an Audi Q5 Prestige.
Perhaps I wasn’t clear. CMU allowed Pitt students to register for CMU classes. Your degree would say Pitt on it, but you would have attended the exact same classes as the CMU students. As in sat in the same classroom with the same professors at the same time, doing the same assignments and taking the same tests.
Thank you, that is what I understood you to mean, but if Pitt doesn't offer the same degree, how could one graduate accumulating credits for a degree that doesn't exist? While many universities allow non-students to audit courses, and one could take every required course of a subject this way, without actually being awarded the degree one cannot claim the degree. Also, as I explained, I was out of state, making Pitt tuition expensive. CMU does grant its FT employees and their children free tuition after a token number of years of employment, but, of course, even a qualified and experienced HS graduate without an undergraduate degree would not make it past HR for an interview. And unfortunately, the Cognitive Linguistics degree only existed for a very short window, about 5-6 years. Personally, my only option to attend CMU was to take on about $90K worth of college loans, or conversely, $60K worth of loans to attend Pitt, but I would have sooner accepted the appointment offered me to Annapolis, an even more selective university than CMU that costs nothing but a commitment of 5 extra years of military service. What I did instead was study CS at Virginia Tech and graduate only $10K in debt, which was not difficult to get out from under. And though I did not study any linguistics there, I did exhaust my curiosity about cognitive science and neurology via an elective in philosophy of mind. CS was itself 60 credits of CS and Math with a built-in Math minor, and was tricky enough to complete without idk how many other credits in proper neurology and linguistics that I missed out on at CMU or Pitt - though fascinating, each is a considerably complicated subject in its own right.
Very interesting. I am from that era, teaching myself to program starting in 1983 (which I thought was quite possibly too late to catch the microcomputer gold rush ;). I was self-taught and learned from popular computer magazines and well-written, carefully selected books. But now that you mention it I remember looking at course catalogs from good schools and being shocked at how retrograde it all was. Those guys at the universities totally did not get microcomputers for years after they should have.
I majored in CS in the late 90s and this wasn’t my experience at all. The Netscape IPO happened in 1995, followed by 5 years of the dot.com gold rush. Computers flew off the shelves, and everyone wanted to get online.
The dotcom crash happened later, in 2001, but if we are talking about the late 90s, then I’d say it was a period of huge energy in the CS field; tech companies were hiring as fast as they could and jobs were plentiful all around.
We're probably about the same age. I decided against comp sci at the turn of the century because of exactly what was being said. The dotcom bust just happened and if the media was to be believed programmers were taking jobs flipping burgers and there were enough programmers without jobs to cover the world's programming needs for the next 50 years.
I wound up going to school for economics and then later found my way into the IT world by circumstance.
When I started uni in 2004 it was still like this. I kind of was ashamed at parties to tell what I study, not to come off as too nerdy. I did a double major, so business was hipper. Just imagine! The status of developers changed so much in two decades. Nowadays people are impressed. And even in my career I see the difference. Not so many VPNs anymore, the move to the cloud made everything much easier.
> Even what came after that, e.g. in C / C++ was considerably tedious compared to what we do today. Folks sometimes had to do objdumps of compiled binaries to debug what was going on.
They used to do objdumps. They still do, but they used to too.
> You used to legit need that CS degree in order to code in your day-to-day
And when people today look back with disdain at ugly VB applications and wonder what simpleton, non-programmer, drag-and-dropper built this piece of excrement (that has somehow been running for 17 years without an update and the replacement project that we hired those consultants for ended up 3x over costs and nobody uses it) as opposed to a Real Software Program, there's the reason.
I was in high school in the early 00s and heard the exact same thing, and that was a major reason why I chose not to major in CS! (The other is that my HS programming curriculum and teacher were inadequate, but at the time I was convinced that I just wasn't wired for programming.) In the end I took the long way around and ended up in the field as a self-taught programmer.
There was also a bad programmer job market crash in the 80s that changed the market a lot by the 90s. In fact, this was about when the gender ratio became very skewed (men and women dropped out of programming at equal rates, but the recovery was lopsided).
Our computer science department chair (Ed Lazowska) at the time brought this up as a reason to be wary about department expansion in the mid 90s.
You’ll see a huge drop off in computer science graduates after a local peak in 1985 (they wouldn’t get back to that level until 2000, note this article also quotes data from Ed Lazowska).
I majored in Computer Science in the late 90s and honestly don't remember any of what you're saying regarding negative/hostile media.
To me it felt like a golden age. The .com bust hadn't happened yet. If you could turn a computer on there were jobs everywhere. The world was starting to get online. Linux was really gaining traction and Slashdot was all time.
This is something my father told me too. He said he spent some time writing the code on paper, thinking a lot about it; then when he was somewhat sure about what he had written it was time to punch the cards. He used to leave the batch on Friday and went back on Monday to ask the "computer operator" about results and sometimes the result was "syntax error on line 1."
I enjoyed programming in the 90s and early 2000s but I feel it’s turning again into tedious grunt work with scrum, agile, yaml configuration files and needlessly complex systems.
You should seriously look in to changing companies.
I'm not trying to denigrate you in any way, I myself switched from working at $BIG_BANK to a more lithe type of company and 90% of that bullshit went away.
Agile + Scrum stuff are minimal and now consume ~4.5% of my week instead of ~12.5%, I'm not spending half my "dev" time babysitting and maintaining giant applications no one really understands in full, and instead work on a bunch of little serverless applications, maybe half of which I do actually understand and can explain end to end.
This is one industry where reinventing the wheel is quite the norm. It's good for all the developers - it keeps them working. Older devs can work on legacy systems, and newer devs (or devs picking up new skills) can recreate systems with the new tools and languages.
The current implementation of Agile in most cases is pretty much the opposite of agile as described in the agile manifesto.
In my team we have reduced the process to having a simple backlog which we work through. But I have seen other teams where you spend enormous amounts of time on planning but it’s frowned upon if you think any further than the next sprint. Just check off tasks without any thoughts about long term architecture or strategy. Basically just a sweatshop with replaceable “resources” (the company doesn’t hire “people” anymore but “resources”)
There is no current implementation of Agile though, Agile was an umbrella term for a variety of different practices. I don't think you can blame it if it's been poorly implemented.
Granted, that is most of what I see: poorly implemented agile everywhere, usually half embracing SCRUM. I think that has more to do with the typical command and control nature of upper management though.
I learned Fortran 4 in high school in 1967-1968. That’s how good the NYC exam schools were — Stuyvesant in this case. We had our own 1130. This came in handy in college, I did the programming in a physics group, immediately. But it seemed too tedious to do as a career. I still feel that way.
My dad is an accountant who took some punch card FORTRAN programming classes in the early 70's as well. After 3 semesters he told his professor he wouldn't be returning to the computing department - his professor was shocked, for he was a star pupil! - for much the same reason. He and my mother still tell stories of Saturdays and Sundays spent organizing his punch cards and applying patches (literal in those days!) in the campus computer labs so he could have a more rapid debug cycle than was available during the week.
I agree with your father that batch processing (cards) is a drag.
I was in school and working in the computer lab when we switched from batch processing (cards) to a time shared system with terminal labs. I was part of the team that wired up the campus and connected the campus to the ARPANET (precursor to the internet).
As we rolled out the terminal labs, each CS class was either assigned to batch processing or time sharing. Since I was part of the lab, I could schedule my classes to be time share only. I was only stuck with 4 classes not in the time share lab. 2 were batch processing. For some reason my LISP and AI classes used a teletype interface. It was not as bad as the cards, but still weird. Some classes allowed me to use my personal computer (TRS80) and work at home.
At the time my uncle was a programmer. He said that there was no future in CS and I should switch majors. I could already see the wave coming and ignored the advice.
This is exactly how I first learned to program. Waiting a whole day to find out you had a bug was just way too frustrating for me so I completely wrote it off, as much as I enjoyed writing code. Once the first PCs came on the scene, though, everything changed and I was all over it. Still am.
I remember going off to college in 1986, and thinking I might major in computer science, and my dad told me, "anyone can learn to program computers, you might be better off with EE." To be fair, at the time, anyone could learn to program computers (and it's probably still true) -- my dad was doing it, and his major was chemistry -- and really, anyone that's really good at programming computers is necessarily self-taught to a great extent. You just don't become a great programmer by dint of tutoring. Anyway, I stayed in EE for a year, then switched my major to computer science with a minor in EE, and no regrets.
I find it astonishing that only a couple years later the basic Unix development environment (ttys and full-screen terminals instead of cards, cc, sh, make, ...) came into existence, and has basically prevailed.
In one of my early scientist-programmer jobs I was assigned an assistant to keypunch, submit jobs and pick up printouts. The other scientists thought I was odd for wanting to do all this myself. I had much more productivity than them.
What is sometimes forgotten is that there was (for men) a severe prejudice against working with a keyboard. The image pre-1985 or so was that keyboards were almost exclusively associated with typing pools. Those typing pools were, as far as I know, 100% female.
To be honest, this prejudice still exists. I heard a C-suite exec mocking "those guys with the ticky-tacky machines".
My uncle worked for Folgers coffee in the 60s as a general office clerk. They gave everyone some kind of test designed to see if you had aptitude for programming. He scored high, so they asked him if he wanted to learn COBOL, and his career was born.
"Dad, you were a FORTRAN programmer and physicist in the 70's, you could be a very well paid developer anywhere in the developed world... why didn't you?"..."
I'd suggest there is likely another reason, and if your father didn't actively think about it he probably understood it subliminally. Back then, programming was part of mathematics at many universities and, like it or not, everyone doing science and engineering had to study the subject—and for many universities that meant Fortran. Fortran was an essential part of the background culture: if one was doing mathematics or any of the physical sciences, Fortran was just there—thus, one didn't see it as special or exceptional.
I had no option but to study it but I didn't see that as an imposition—diehards like me were regularly chucked out of the punch card room by the university security guards last thing at night when the joint closed.
Moreover, at my university the Fortran lecturer also wrote the Fortran textbooks (well, the ones we used at least), so there was no leniency or excuse: Introduction to FORTRAN IV programming using the WATFOR compiler - 1968 & 1971 - and Basic FORTRAN IV programming (version IBM 360), 1969—by John M Blatt: https://en.wikipedia.org/wiki/John_M._Blatt.
As I found out later there were better textbooks on the subject, and Blatt was a didactic, forceful character without much charisma, so his lectures were somewhat painful. However, comic relief was not that infrequent. We had Fortran lectures in a large hall which had an upper circle like a picture theater; a certain fraternity would frequent the circle and aim paper airplanes at him when he was facing the blackboard, much to his chagrin. No, I wasn't one of the guilty, but I fully enjoyed the spectacle.
My grandpa told me the same thing; he didn't think computers were gonna last.
Not sure of exact time, but I think it was late-60s-to-early-70s; Gramps was a business professor at a major university in the Midwest. A team of math nerds started working on this new thing called a "computer". He said it was almost the size of a basketball gymnasium, it took weeks to get the thing set up, and was always breaking. In the end, it could only do simple calculations. He said the university viewed the project as pretty much a failure. So, Gramps, the futurist, told all his business students not to get involved with computers, there was no future in them. Good job, Gramps.
He later told me his big hope was that none of his students listened to a word he said. LOL
As recently as the early 90's, making a lifetime career out of software development was considered impossible. When I was starting out in the late 80's all the developers were taking classes or had a side business, with the goal getting out of "programming" before it was too late. Even those who wanted to stay in the industry took every opportunity to talk directly with clients so they could get into sales or marketing.
One has to understand the social factors as well. It was a women's job in the beginning, and the stigma of being seen as doing a female "computer's" work was most probably what made it unattractive and tedious to many men at the time. Many highly placed engineering bosses in programming were women for a long time, because those were the people who had the experience. See Margaret Hamilton of Apollo, and also some of the pioneers in cellphone "software".
My father, who was a research physicist from the 70s on, lamented his colleagues who got too distracted with programming their computers! For me it was great - I got to grow up with computers and electronics and a bunch of adults who loved these things!
It was very niche. My dad (also an early FORTRAN programmer) graduated in the very first undergrad CS class at UCLA, around '69 or '70. I think very few universities had a CS course at that time.
What is interesting is that the IITs in India (the first 5 at least) were setup a decade prior (late 50s), and some had very heavy support from American and European universities while setting up. So much so that IIT Kanpur actually had a CS department that started in 1963!
But looking back on it from my current perspective, I would say this Mel guy was not a genius, but one of the worst programmers you could probably hire:
He wrote unmaintainable and even unchangeable "write-once" code that was so complex that nobody else could handle it either. He refused to do what he was paid for and just walked away as he lost interest.
One dude of this kind on your engineering team and your company is in real deep trouble…
It's a given that you will need to throw away everything they did and start from scratch should any changes be necessary later on. However, there's one fundamental constant in software engineering: your software is going to need to change over time! No matter what somebody told you upfront. So in case you've got software built by some "Mel" you're completely screwed at that point, especially as changes to SW are usually needed the most at some critical period in time for your company.
He can be a genius to be admired while also being one of the worst programmers you could hire, at the same time. Someone to appreciate, but not to emulate. A highly optimized human being, optimized for the "wrong" thing. More in the realm of art than anything else.
Nah, those were different times, when bits and bytes mattered. Everything was written in assembly / machine code. Mel's tricks were just how things were done back then. There was no repo; code didn't need to be maintained or added onto. The lifecycle of software was much, much shorter.
> Nah, those were different times when bits and bytes mattered.
Obviously not. We're talking about mundane business software.
Also the "optimizing compiler" that couldn't reach such levels of "perfection" wouldn't be a thing if this would really matter.
> Mel's tricks were just how things were done back then.
Obviously not. Otherwise there wouldn't be any point in this story.
It points out, with a lot of emphasis, how exceptional Mel's code was!
> There was no repo, code didn't need to be maintained or added onto.
VCS dates back quite some time…
Also, maintaining code was of course not any less important for a company then than it is today, simply because companies back then also relied on their software to operate.
> The lifecycle of software was much much shorter.
No, of course not, as nobody would throw away some very expensive asset for no reason.
If anything, lifecycles of software were much longer than today (when you can deploy changes every few minutes if you please). Stuff written in the 70's is still running on some mainframes today!
As changing software was much more dangerous, with a much higher risk of breakage, fewer experts around, and everything much more difficult in general, it was more usual to try not to touch an already running system. (Maybe you even heard a quite similar proverb coined back then ;-)).
But "not touching" it does not work, as there is only one truly constant thing: Change.
>IBM's OS/360 IEBUPDTE software update tool dates back to 1962, arguably a precursor to version control system tools. A full system designed for source code control was started in 1972, Source Code Control System for the same system (OS/360).
The events of the story predate the precursors of VCSs by two years, and the earliest true VCS by a decade.
My guess is that the earliest code versioning systems were completely manual: "Duplicate your tape, mark it 1.2, store it in a drawer". And these manual processes were brought to and duplicated on computers when code began to be stored on the computers themselves, instead of on cards and paper tape.
Certainly if you were a business, you had an extreme business interest in keeping your "known good" stack of cards in a place, and every revision in code required a new stack of cards.
"Hey, the machine just ate 10 cards from the payroll software, can we get duplicates made?"
"No, those were the originals, guess we're SOL no one gets paid" never happened.
Most likely "Ok, version 1.34 of the payroll software that was updated last week? Cards 1032 to 1042? Duplicate cards will be up to you within the hour"
or "We have to revert to the old payroll processing software, can you create a new fresh copy of 1.33, 1.34 has some bugs and we need to get tonights run in?"
I can't find any definitive info on when this computer actually got manufactured ("announced in 1960" doesn't strictly mean the same thing). But this was around the time Mel was first met by the author.
The story likely takes place some time thereafter.
I'd guess a significant amount of time, because it takes time even for a genius to become familiar enough with a machine to do all the kinds of trickery described in the story.
I think it may make sense to assume that even some years passed between when the author first met Mel and Mel's departure from said company.
So I wouldn't even be so far off with the VCS statement—which actually doesn't claim any relation between the usage of VCS and the story. I only said that "VCS dates back quite some time", which is obviously true. ;-)
But, all this actually doesn't matter.
The more important statement was the following, which is a direct reply to "code didn't need to be maintained" - a claim that is, in my opinion, just not true.
I did not say VCS was used back then for that purpose.
I guess they preferred more of a solid hard copy. :-)
nah, when you're constrained enough, you rarely to never sacrifice anything in the name of future changes. You figure out what needs to be done, then you write a program that does it. If it needs to change, you write a new program. Part of why that's not as bad as it sounds is exactly because of those constraints: you're not dealing with megabytes of source code.
There are lots of problems that are specific and simple enough to solve that it's easier to write a C program from scratch than it is to find, install, and then learn how to do it with some existing package... The same concept goes for programs. At a certain scale, it's not worth the extra infrastructure/overhead/rigidity/complexity that it takes to write software that's optimized for change.
That said, today, in 2022, it's more or less the opposite, codebases are huge enough that most of software "engineering" is about plumbing together existing libraries, and at that scale, it's an entirely different thing.
No, this doesn't make any sense even given the historic context.
We're not talking about embedded software with special constraints here!
This story is about mundane enterprise software.
Nothing in the story justified this insane level of over-engineering and premature optimization.
Just using the "optimizing compiler" was deemed "good enough" for all other needs of the company, likely…
Also nobody asked for that over-"optimized" throw-it-away-and-start-over-if-you-need-to-amend-anything-crap.
I have still this warmth nostalgia feeling when looking at this story, but when thinking about it with quite some experience in real world software engineering I'm very sure that this kind of programmer would be one of the worst hires you could probably run into.
Finding any valid excuses for "write-only" code is hard, very hard. This was also true back in the days this story plays.
Sorry for destroying your nostalgia feeling, but please try to look at it from a professional perspective.
The software running on a specific ROM might not be able to change (assuming we're talking for devices using mask ROM and not EEPROM) but it doesn't mean the code itself is supposed to be disposable. Different devices or revisions of the same device can benefit from code changes. Even on the same device unit, they might want to replace the ROM chip to include some fix if it's important enough and makes financial sense. The software itself transcends the constraints of any particular delivery medium.
Ha, perhaps. I work in game dev, and previously had a stint in integrated display controllers for feature phones. Anecdotally, most of my code is effectively thrown away once shipped.
I was a physics major until I stumbled across the jargon file online. It was an, "aha, my people!" moment. It was already showing its age then—nearly 20 years ago!—but sucked me into CS where I was much happier.
This comes up on here every few months, and I can't help but read it every time. We had a few Mels in the earlier days of my employer's history and I can't help but be a little bit in awe of the stories I've heard from and about them.
A few years ago in school I had to read a paper that was written by a guy who happens to also be a part of my local electronics hobby group. I mentioned this to a friend and he noted that, unlike a lot of fields, Computer Science is still young enough that many of the pioneers are still around.
I worked at Microsoft in the late 90s and methodically went around to all of them, from the creator of MS-DOS to the creator of Turbo Pascal/C#/Typescript, and asked them all the questions that I couldn't find in the computer history books.
I didn’t take notes or anything and many of my questions were very specific. For example, how did Turbo Pascal do error handling (easily, because it was a handwritten recursive descent compiler), or did Tim Paterson, who sold DOS to Bill Gates for $50,000, ever regret it (no).
My grandfather was like this. Full stint in the Marines and then worked on computers. I remember him telling the story of how exciting it was (and what a big deal it was) when one of their systems got upgraded to 4k of RAM.
> college recruiting materials might well brag that at their institution, there were not one, but two computers on campus.
Our community college highlighted their VAX minicomputer by having a special window that showed all the flashing LEDs to passers-by. But when PCs became the "in thing", they felt embarrassed and covered the window with PC posters. Poor VAX, lots of memories together. It was an early lesson in IT = star-today-washup-tomorrow.
We had one of the first class of HP3000 minicomputers, which was both highly advanced, with its stack architecture and variable-length memory segmentation, and also very disappointing. But on the flashing lights front, the first design class did not disappoint (see the console in the upper right quadrant - lots of LEDs, which were new and only red in those days, and paddle switches): http://www.hpmuseum.net/images/3000_2615A_1973-25.jpg
Computers based on integrated circuits were more recent. But the foundations were much older.
For example, take the punchcard. Punchcards as a way to work with automatic computing devices go back to Hollerith machines and the 1890 census. That was how IBM got started. The phrase "Super Computing machine" dates back to 1931, and referred to a tabulating machine built for Columbia University. Raytheon was producing and selling analog computers starting in the late 1920s. Much of the calculations for the Manhattan Project were done by machines - Feynman talks about this in Surely You're Joking, Mr. Feynman!
And to give a sense of how much history there is, one of my favorite essays is the 1945 essay As We May Think, which you can find at https://www.theatlantic.com/magazine/archive/1945/07/as-we-m.... It provided the inspiration for both hypertext and the science citation index. The recombining of those ideas in the PageRank patent was the foundation of Google. But how could someone in 1945 understand computing that well? It is simple! Its author was the man who designed those computers Raytheon sold in the 1920s, and among other things was in charge of the development of mechanical computers for the Manhattan Project. (OK, he did a lot more than that...)
I doubt it. If by computer you want to mean any machine capable of mechanically doing some kind of calculation, then of course there are examples going back hundreds of years, or even millennia - the Antikythera device was without doubt a mechanical, astronomical computer, for example. But I wrote "the first commercial computer," and the first stored-program, Turing-complete computing machine that you could actually write a purchase order and be invoiced for, the UNIVAC I, was only produced in 1951. So, I cheated a bit with "less than 20 years old in 1973." It was 22 years old. The point being, in terms of my post, that when I started, computers, and programmers, were still very unusual in the employment landscape, and for people like Ray, of the original post, the concept of being a programmer appeared AFTER their formal education in electronics was finished.
There is a distinction between computer and stored program computers, and you're right that the first commercially available stored program computer appeared in 1951. (Though, interestingly, there was a 1936 patent application on the idea in Germany. And a barely functioning one was actually built in 1941.)
But Turing complete computers predated that. In fact Turing's own design for a Turing machine was NOT a stored program computer. And you're right that the modern idea of programming postdated stored program computers.
But computers are older. As you pointed out, arguably thousands of years older.
However I maintain that automatic tabulating machines were on the path to modern computers. Early accounting applications were based on them, as were key parts of the technology used. Like punch cards.
I agree that "automatic tabulating machines were on the path to modern computers" were on the path to modern computers. But that does not in any way negate that there was an inflection point in the 1950s and leading into the 1960s that represents the birth of what we mean by the word computer (in English). The machines that I learned programming on in the 1960s were qualitatively different from those things "on the path" to them, but are recognizably the same species of machine as what we work on today - which is to say, they were Von Neumann architecture, Turing complete, off-the-shelf, generally available computers for which one wrote code - not calculators, tabulating machines, or hardware reconfigurable logic engines, and not laboratory experiments aiming toward that. The only thing missing from them that is inherent in our contemporary approach to information is the notion of ubiquitous networking of devices to create a broad and broadly accessible information and calculation landscape. That was still a laboratory project in 1970.
Does that matter? Depends on your point of view. What I meant by "how new it all seemed" then tells the tale here. Bush could maybe see the future in 1945 because of his experience and vision. But by 1970 or thereabouts, almost anyone who touched it could see it, and millions were increasingly able to touch it, precisely because it was commercially produced and generally available. Still not ubiquitous, as they are today, computers were nevertheless showing up in every corner of life.
So it's not that I think computers, or information processing as a concept, or even as an occasional reality, were new around 1970. But computers as a phenomenon that would define the way we interact with each other and the world, and computer programming as a skill that anyone could acquire, really were.
Also many of those guys were EEs who had no degrees. They always seemed cheerful and happy with their jobs. It was one of the things that inspired me to teach myself programming.
If the minimalist resume is appealing, can we also bring back walk-on hiring?
In warehouse and construction work, if someone shows up at 7:30 AM on a Monday morning, odds are quite good that the foreman will have something for them to do. Maybe not that day, but maybe tomorrow, or maybe someone on the list above them won't show up that week and they'll get called. I made rent doing that in my early 20s and they even let me leave early sometimes to work on my internet business because I didn't have a family to support and maybe someone else had a bill they needed to pay and wanted my shift.
Why again does a boutique startup need to interview 500 overqualified people? Hire someone right away and let them quit if they want to and hire someone else. It's just business for crying out loud.
> Hire someone right away and let them quit if they want to and hire someone else. It's just business for crying out loud.
This is missing the point of interviewing. The goal isn't to find any warm body to fill the chair, the goal is to find someone qualified to do the work who also has a history of doing good work at previous employers. You also don't have unlimited headcount and hiring budget, so it's worth making the investment to find the top 10% of your applicants rather than picking first-come first-serve.
One of the things you don't realize about the hiring market until you've been reviewing resumes for a while is that problem employees are over-represented in the candidate pool. The most qualified employees spend the least time job searching because they're given offers right away. The most problematic and underqualified employees are frequently searching for jobs after being fired or let go. If you sample applicants at random, they're far more likely to be in the underqualified and/or problematic group than in the great employee group, statistically.
The other thing that isn't obvious is just how damaging a single bad hire can be to a team. Hire someone who clashes with their peers and fails to deliver any good work and you'll find yourself losing the good team members very shortly. Nobody likes working with painful coworkers.
That said: The analogy of a "walk-on" job isn't dead in tech. If you pick a company you want to work for, find someone on LinkedIn, and send them your resume with a short pitch about why you want to work there, there's a good chance they'll at least strongly consider your resume. Nobody is guaranteed a job this way, but it's one route to getting your foot in the door even when you don't see the exact job posting you want on the website.
I actually very much agree with this comment. I was a manager for 25 years. A bad "team fit" was not good.
I never had a technical failure in my hires, but I did have a couple of "bad cultural fits." These usually weren't toxic people, but people that couldn't handle the responsibilities and pressures (we were a small, high-functioning team, and everyone's visibility was fairly high).
But this:
> who also has a history of doing good work at previous employers.
makes me wonder how LeetCode tests can tell you that, as they seem to be the single most important component of all software engineering hires, these days.
In my experience, they just drive out the qualified people that can see projects through, and leave you with ... the ones that are really well-practiced in short, academic exercises.
I'll second that. I never hired anyone who couldn't do the work. The only times things went badly were times when the person basically didn't want to do the work, due to some personal hangup. No amount of interviewing is going to weed out that guy who can code perfectly fine but deep down is yearning to be a psychologist instead. Like any other job that pays bills, you are vulnerable to paying his bills until he finds what he really wants.
> In my experience, they just drive out the qualified people that can see projects through, and leave you with ... the ones that are really well-practiced in short, academic exercises.
Yeah, it beats me how anyone thinks LC is useful, other than for weeding out the most unqualified people, like people who genuinely have never coded. I suppose what it really does is find you people who are willing to put in the time to study all the hundreds of questions.
As opposed to putting in the time to learn how to write and ship software.
I won't study LC, because I'm waaaaaayyyy too busy, learning Swift, UIKit, AppKit, WatchKit, SwiftUI, DocC, MapKit, SiriKit, device SDKs, networking, USB, etc.
I literally work every single day (like seven days a week), and learn something new every single day, yet I am barely keeping up. I would be nuts to sacrifice any of this time, studying schoolboy questions that have little to no relevance in the software that I write.
These technologies result in actual applications that you can sell and market.
But overall I agree with you. While I don't have the results to back it up, I do believe that ultimately Leetcode isn't that useful outside of recruiting people straight out of college. Otherwise, I think there are more domain-specific methods to assess a candidate's fitness for a role.
Also, giving my unsolicited opinion, I do agree with one of the Reddit posters saying that you are currently on your way to burnout. While I can sympathize with you being super interested in learning programming all the time (because it is very interesting!), be sure not to ignore your other needs, and also to take a mandatory break at least one day a week. Speaking from experience, if you don't do this you will burn out and you will have to pick up the pieces.
Well, I have been doing this for over 30 years, and going at my current pace for probably ten years (I noticed the comment about me being an enthusiastic youngster. That was cute).
But I also take breaks (and naps) whenever I want. No one wants to hire old folks, so I don’t work for anyone. I do this, because I want to.
What’s that saying? “It’s not work, if you love what you do?” For me, coding (designing solutions, in particular) is relaxing. I have some fairly serious family and extracurricular obligations that bring their own stressors. Coding is how I get away.
I’ll tell you when I was actually in danger of burning out; it was when I was a manager (which I did for 25 years). I spent a hell of a lot of time, sitting on my ass. I coded on the side, so I wouldn’t burn out. This is like Disneyland, compared to that horrible grind.
I share this position fully, but we need to consider the benefit of a standardized test despite its flaws. Other paths to success need to exist (e.g., university admissions take people on other merits besides SAT and similar) but it can be more difficult to assess the validity of any claims regarding level of experience without a well-known certificate.
When a job doesn't focus on what LC asserts, and I think we'd agree about how huge that segment is, then naturally it should be a small or non-existent consideration.
I have no problem with licensing, depending on the job.
Some stuff should definitely have a steady hand on the tiller, other jobs, not so much.
But the thought of licensing software engineers is daunting. The industry is so incredibly varied.
For example, almost none of the programming I do, involves higher math. I've pretty much forgotten all my calculus, while other jobs are almost nothing but math.
I could be writing crucial, lifesaving device control stuff, and the math person could be working on a game physics engine.
I am in awe of game programmers, but they also work on "nice to have" stuff. I know someone that writes software for medical devices. He is not a "math person," but he's also very dependable, and can be relied upon to Get The Job Done. He has many years of working on things like USB drivers, firmware, Bluetooth, and networking layers.
So, if the “licensing test” insisted on a good command of advanced math (because “everyone should know it,” the same argument given for LC), neither he nor I would make it, so the company would be deprived of some very good, dependable, disciplined, and talented engineers, fully capable of writing highly effective asynchronous device control code, and would, instead, prefer an inexperienced math programmer who would try to rewrite the project in Haskell.
Amen to that. We hired a guy that just completely polluted the office morale. We were small and wanted to do the "right thing" so we tried to figure out how to make it work for way too long. Finally ended the relationship and the whole place felt better. He wasn't a bad guy, just didn't fit somehow. He was really big on "lanes," didn't handle code review well, and on and on... Important lesson learned. It would have been better for him and us if we had ended it much sooner.
The GP was using the proxy of "shows up at 7:30 AM, ready to work" as a signal for motivation and, to a lesser extent, competence. I'm not a morning person, but I'd prefer this to LeetCode hazing.
As Seymour Cray said, "The trouble with programmers is that you can never tell what a programmer is doing until it's too late."
It can be months (at a high salary) before you really know whether a hire is likely to work out. I think it makes sense to invest more effort in screening applicants in this case.
I've hired hundreds of developers over three decades, and this is completely wrong:
> It can be months (at a high salary) before you really know whether a hire is likely to work out.
It's only that way if you make it take that long. You should know whether you have a good programmer 2-3 weeks after the hire. Here are a couple of things that make it hard to make great hires:
* Making it difficult to learn and understand your system.
* Having slow and expensive employee onboarding. I've seen companies spend $3-4K (not including the actual laptop) just getting a laptop to a new employee after IT gets done with it. If it's super-expensive to make a hire, the incentive will be to keep people that aren't getting the job done.
* Not looking at work output for extended periods. In short, give new people tickets that can be done in a few days at most, so you are able to look at work output in six days instead of measuring at six months.
> I think it makes sense to invest more effort in screening applicants in this case
There's only so much you can really screen before the error in your hiring process exceeds 50%. Every step you add to a screening process has an error rate, and some are very subjective and error-prone. The more screening you do, the slower you go, and honestly, the worse the candidates you have to pick from. Why? Because a good programmer will be on the job market for 1-14 days (I'm not saying you are bad if it takes you longer to get hired; it's just what we're seeing in our recruiting software right now).
I have a human conversation, make sure the expectations are clear and ask them if they think they can do it. Then I look at some of their work to verify and that's literally all.
No whiteboard, no takehome, no brainteaser, nothing like that.
Go google "how gates hires" or "how jobs hired" ... it's more or less the same. None of this I watch you implement a sliding window in a shared coding environment bullshit.
Let me put it this way. Say your candidates were all award winning scholars with phds and prestigious organizations to their name, then how would you go about it?
With respect - but also, you'd still check to make sure it's the right fit, obviously.
Now here's the crazy go-nuts bananas idea - treat everyone with that level of respect. Totally wacky, I know. But hear me out - you can apparently build better teams with trust allocated to trustworthy people as your building block. That starts the day you interview and extends forever, well beyond your time working together.
Because it's self-evident that designing a quick ramp-up process and a modular system with good docs makes this a lot easier. It's 2022; you should be able to review check-ins on GitLab the first week with a couple of basic tickets.
If folks are too green for that, then they can be put through an internship first. If it's an obscure language, have them do check-ins on a tutorial.
I've seen many variants of the recruiting process from the cute product feature disguised as a take-home to 6 stage interviews with two engineering(!!) interviewers per round that cost the company a few thousand per (un)successful candidate in man-hours.
Which is hilarious in an industry that is pretty binary ("you can build it") || ("you can't build it"). Doubly so when the majority of dev jobs are in web which is easily explored in the candidate's language of choice with basic CRUD / RESTful concepts.
I know a company that hired a contractor, who (probably) sat on his ass for months, then went AWOL with nothing delivered, and said company had to start over from scratch. Probably a little too much trust there.
This used to happen in the 80s. I went to interview at a startup company in Mountain View in 1987. There was some chit-chat, then the interviewer asked me to wire-wrap a circuit (a diagram was provided), power it on, and connect it up to the logic analyzer - he went away for about half an hour while I did that. He came back and complimented me on my neatness. Then he took me across the room to talk to the VP of engineering who, after a few minutes of chit-chat, asked me when I would like to start. Those were the days.
Pretty much my interview process. I ask them to program a contains(string, substring), then come back 30 minutes later to find code scattered all over the place, sometimes with “// 54 upvotes http://stackoverflow.com/…”, code not compiling, and I’m still wondering whether I should accept them.
I wonder what’s so hard about my interview. Five years ago even interns could do it; one of them could even tell the difference between UTF-8 and UTF-16.
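(As an aside, here is a minimal sketch of that kind of exercise in Python - a naive scan, with the signature assumed from the description above; a real candidate would of course write it in whatever language the team uses:)

    def contains(string: str, substring: str) -> bool:
        """Return True if substring occurs anywhere in string (naive scan)."""
        if substring == "":
            return True  # the empty string is trivially contained
        n, m = len(string), len(substring)
        # slide a window of length m over string; the "+ 1" is the classic off-by-one spot
        for i in range(n - m + 1):
            if string[i:i + m] == substring:
                return True
        return False

The point of an exercise like this is usually less whether the result is correct than whether the loop is structured intelligibly and the off-by-one at the window boundary is handled.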
Maybe the problem is that nobody needs to solve that problem in their jobs anymore? For the last few months I've had the joy and privilege to really get to know the TCP and TLS stack intimately, and find myself looking for the patterns that are going to be most useful for handling data bit by bit. But prior to that, I really needed to care much more about the semantics (and the engineering culture around them) and the large scale structure of my code. I might get more hung-up/distracted by `contains(string, substring)` vs `string.contains(substring)` than what the actual operations to achieve it might be. And also, "surely this problem is already solved, optimally" aligns with one of the 3 great programmer virtues: A Great Programmer is Lazy. 30 minutes is unfortunately exactly the wrong amount of time if somebody has fallen into this trap.
Anyway, I guess it depends what compute layer you're interviewing for, but it doesn't necessarily sound like your interview process is broken, exactly. Like, it could be testing for the right thing, but in the absence of that thing, maybe you just need to find another thing that is substitutable.
Our real algo: we save a bunch of objects, but some of them already exist in the DB, so you need to intersect what’s in the DB with what’s in memory before saving, except you can never hold all of the DB at once.
It should be our real-life test, but it’s too long. It’s our most complicated algo, and honestly it’s very simple in the end. But given all the variables scattered around in a string.contains() (I don’t even look whether the result is correct, I look whether it’s structured for intelligibility and how they debug the off-by-1 errors), I can’t suppose a more complex algo will be done cleanly.
Maybe I’m not giving them their chance - it might have taken time for me to output clean algorithms, too.
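(For illustration only, a rough Python sketch of that chunked intersect-then-save pattern; fetch_existing_ids, update_rows, and insert_rows are hypothetical stand-ins for whatever the real storage layer provides:)

    def chunked_upsert(objects, fetch_existing_ids, update_rows, insert_rows, chunk_size=500):
        """Save objects without ever holding the whole DB in memory.

        For each chunk, ask the DB which of these ids already exist, then
        split the chunk into updates (already in the DB) and inserts (new).
        """
        for start in range(0, len(objects), chunk_size):
            chunk = objects[start:start + chunk_size]
            ids = [obj["id"] for obj in chunk]
            existing = set(fetch_existing_ids(ids))  # one bounded DB round-trip per chunk
            updates = [obj for obj in chunk if obj["id"] in existing]
            inserts = [obj for obj in chunk if obj["id"] not in existing]
            if updates:
                update_rows(updates)
            if inserts:
                insert_rows(inserts)

Many databases can also do this in a single statement (an upsert / ON CONFLICT clause), which is presumably what the reply below is getting at.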
`upsert`, not sure how having someone implement `contains` is going to help solve your IRL problem optimally, but I guess the interview process is more about testing cognitive strength vs. practical experience.
The joke is that the would-be worker assumes the forklift question relates to the previous question, and thus that the teapot is so large that it requires a forklift.
This makes sense for highly productive labor: a foreman hiring people just for showing up generally got a positive expected return on it. In fact, this is still how a lot of (sometimes illegal) day labor works: a truck comes by, picks up people ready for work, the work gets done, and everyone makes money.
> Why again does boutique startup need to interview 500 overqualified people?
Because these startups lose money as a matter of principle. The people working there aren't actually performing productive labor. All of that hiring is about creating a large illusion in the market place.
Most of my labor has gone to waste. More projects than not never ultimately shipped, but even the most valuable projects I did, still made money for companies that ultimately lose more money than they take in. Many of my best projects are for SaaS companies that don't exist any more.
The guy picking up a bunch of people in the back of his truck is about to go build something real and is going to get paid in cash, and the more people he can get in the back of the truck the more jobs he can get done in that day, which means more cash for everyone (and if you're on the paying end, it means that project you wanted done is done faster).
> It's just business for crying out loud.
I don't think this has been true in tech for over a decade. I had a COO once excitedly proclaim that if the company made more money than it cost to run, we would have unlimited runway. The COO seriously thought he had stumbled upon some brilliant realization about a company making more than it costs to run.
The guy with the truck knows far more about business than most execs at tech companies today.
> Most of my labor has gone to waste. More projects than not never ultimately shipped
Same here and I've been in the biz for ~35 years. An architect can drive around a city and point to buildings he designed. It's a bit disillusioning to think that the vast majority of the work I've done has just sort of disappeared because either a startup didn't make it or got swallowed up into a larger organization that had other plans.
> The guy with the truck knows far more about business than most execs at tech companies today.
Yep. The guy with the truck can't lose much money for very long. A lot of tech execs have gone years without needing to worry about that because there was so much easy money around.
> An architect can drive around a city and point to buildings he designed
I was under the impression that architects also did a lot of spec work, or designs for RFPs that don't ever get built. Or maybe only get built as a model.
I'm not disagreeing with your premise -- there is a lot of programming work that is hidden, lost, or wasted. However, it's not a trait that's exclusively a programming thing.
> An architect can drive around a city and point to buildings he designed.
I don't think that the situation is really that much different for building architects.
In cities that are experiencing rapid densification, it's not unusual to see numerous buildings from the 1950s, if not much later, being demolished to make way for newer and larger structures.
Even when structures aren't totally demolished, it's not unusual for them to be so extensively modified that the original building is virtually unrecognizable, or even completely obscured by the work of other architects.
It's also quite common for building projects to be canceled before construction starts, but after designs have been prepared, and other architectural work performed.
Many projects that do eventually get built often go through numerous revisions, with the final product being almost nothing like the earlier designs.
We had a fellow just out of college walk in off the street, maybe 2015 or 2016. "I heard you guys do Clojure programming here, is that right?" We said yes. He said I'd like to interview. We interviewed him that week and hired him.
We were a small-ish startup and he had done his homework, showed interest, and could write code to our standards. He stayed for a year or two and then moved on.
This is a terrible take that wouldn't make it past Econ 101. Yes, in theory a completely free labor market is cool. In practice, centuries of labor exploitation and the history of workers’ rights show that this will quickly devolve in the employer’s favor.
I guess as long as abending an interview is still legit, it's as viable to take walk-ins as to take electronic applications. Just be sure to act on any GTFO needs before the start date. The interview process is longer to accommodate those legal protections, but those don't kick in until hiring (or at least not at the first conversations), right?
Long story short: I think we can lengthen the interview period to accommodate those legal time sinks without killing off the (now a bit of a novelty) meatspace first impression. Unless you're talking about blind selection, where the candidate's likeness (name, etc.) is redacted for DEI. In that case, I can see how witnessing the candidate's skin, gender, etc. could be problematic. TFA even explicitly included the types of things we try to leave unspoken these days: excellent health, 5'4"?!
The main point is how directly it contrasts with other sectors, which will interview for weeks and months before any resolution at all, then require negotiating an offer, just to get to a two-week notice at minimum, and then require another two weeks to a month to get paid, with deposits taking several more business days (up to 5 actual days) to become available.
Paid instantly, when you realize you might need it, versus paid four months from now, hoping you planned and forecast correctly.
I wouldn't call entertainment work (even the craziest corners of that world) nearly as off-brand as suggesting that women dominate it. I have no idea how far from 49/49 the makeup might be, but I do know that I'm a guy who's done plenty of entertainment industry gigs (music production, fwiw) where the pay is quite reliably received before going home, and that's the important distinction here.
I was referring to the women-dominated crazy corners. I don't have lived experience with the male equivalent job pursuits to say, so I didn't speak about them.
> In warehouse and construction work, if someone shows up at 7:30 AM on a Monday morning, odds are quite good that the foreman will have something for them to do. Maybe not that day, but maybe tomorrow, or maybe someone on the list above them won't show up that week and they'll get called.
That's hard to do when there are so many strings attached to employing someone. It's a double-edged sword which makes the decision to hire someone a lot bigger. "Yeah, I have stuff I need help with right now" is not enough.
"Workers' rights" makes hiring and particularly firing very expensive. The same thing with "tenants' rights." I would rent to anyone with bad credit and a sob story if I could simply evict them within days for non-payment. But because of these rights now everybody has to have an 800 credit score and six figure W2.
While I’m not doubting you would personally do as you say, I would point out that, as someone living somewhere with quite lax tenants’ rights and a nationally recognised housing emergency (Australia), landlords not renting to people because tenants’ rights are too onerous doesn’t seem to have a particularly large impact on housing availability.
Whereas some tenants’ rights seem to have a pretty large positive impact - i.e. all the positive effects of secure housing.
> Hire someone right away and let them quit if they want to and hire someone else. It's just business for crying out loud.
You're not hiring construction workers, you're hiring architects. It often takes more than 3 months to get used to a new codebase and understand how and why things are done the way they are.
And sometimes you are hiring construction workers, and sometimes trade specialists. I've hired contractors to address specific tech debt, to accelerate QA on a project, or to implement CRUD-type stuff for things with well-known approaches that just take time.
Well, then maybe it's important to note that we are talking about two different industry niches, and practices from one of them are not a good match for the other. And maybe we should even come up with some kind of names for these kinds of "tech" to avoid mixing one up with the other.
We do have a variety of words that have certain connotations -- there are even essays / blog posts about the supposed hierarchy (or soup) of script kiddies, hackers, coders, programmers, developers, architects, etc. etc. etc. but the analogies to their metaphorical ancestors (very often from real construction) are neither perfect nor stable. A housing developer can more easily search for laborers than a software hiring manager can search for a code monkey who doesn't try to steer the ship!
The closest equivalent to that today, in most white-collar roles at least, is contracting. Usually not a physical walk-on hire anymore, but it is often quite a quick process (sometimes as little as a few hours), from "I need a worker", to "I can do it", to "get cracking sonny". And it's then, days / weeks / months (usually not years) later, a similarly quick process, from "we don't need you anymore" / "I'm outta here", to "don't let the door hit you".
But the contractor usually needs to be "employed by" some agency company, that guarantees his/her suitability, and that handles the (not insignificant) paperwork / legalities, and that takes a handsome cut.
We automate tons of software. That's what compilers and interpreters are. And now we're even entering the era of plausibly-deniably-stealing-other-people's-code-from-Github-as-a-service.
I worked at a place a long time ago upgrading legacy software from the 80s. I helped them update their build process from file shares, copy/pasting and manual building to Git, Jenkins and automated builds/tests/packaging. I know it's not automated code, but it certainly saved us a lot of time with about 50 assemblies. I'm sure some companies were ahead of the curve when it comes to stuff like that, but this was a really small company. Modern CI/CD software is great and saved us lots of time.
Same. It's fair to say our tooling today is far superior, but "fully automated" implies virtually zero input from humans beyond the initial configuration.
Being a bunch of engineers on HN, I get it, but that’s needlessly pedantic. My first real job was a systems and tools programmer working on scaled database systems.
Literally every aspect of that job is fully automated.
Where I work today 20+ years later, we have 1/3 of the people doing like 100x the amount of work by several measures. Little teams of developers can just crank out work.
You call it pedantry, I call it "field of work". I've managed to avoid working on apps built around databases for my entire career, instead focusing initially on operating systems, device drivers and OS-level tools; then later on software for pro-audio/music creation workflows. Literally no aspect of this work is "fully automated".
Yes, the productivity of modern day "stored-in-the-DB/presented-in-the-browser" project teams can be astounding. But that's only one type of software.
Then why isn't construction automated? I think if your company's processes are so specialized and nuanced, you should really take a step back and ask whether they should be.
It's a convenient myth to think the banks caused the crisis. Money printing and the lowering of credit rates by government caused this. Free money? Sure, why not. Please watch the Princes of the Yen documentary, where they talk about the "credit window guidance" conducted by the Japanese national bank for decades.
I don't deny fraud at the credit agencies. However, somebody was buying the banks' CDOs. Who, ancient aliens? No, they weren't called subprime because the credit agencies cheated on their ratings; they were called that because they were inherently riskier than a normal mortgage (the so-called NINJA loans: no income, no job). Those were the people buying this shit with the all-time-low-rate loans. Human greed, huh? That's less convenient than blaming the banks for all the evil in this world.
I eventually gave up on Word templates and now keep my resume in LaTeX. Neat and organized - not unlike this example only with more detail and nicer fonts :)
Occasionally a recruiter will ask/demand I give it to them in MS Word - I've learned it's always a bad idea to give a recruiter a resume in an easily editable format.
My experience has been that great workplaces sometimes have bad luck with recruiters, so you’ll end up judging a book by its cover. But one only has time to make so many applications, and you gotta filter them somehow, so why not by recruiter skill?
I see obviously not-the-original resumes from basically every recruiter. If you tell a recruiter "we need someone with senior FOOlang experience", the next day there will be six resumes, one of which has an organic "brought in FOOlang to orchestrate object frackers, reducing development time" and five of which have "N years FOOLAND" inserted into a Skills section in a different font.
This is interesting...what do you suspect is behind this? Do recruiters just tell candidates "Hey, make sure to put FOOlang on your resume before you apply to this job."
No, I'm strongly implying that many recruiters will just change resumes with minimal regard for the truth. Note the difference between FOOlang and FOOLAND, among other things...
Woah, that is way more insidious than I was expecting. I get what you mean now, and that just seems really stupid on the recruiter's part. Doesn't it come out during the interview process if there's BS on the resume?
But I'm guessing that's your point, right? Because the hiring manager should notice, and the interview process should screen for it, so this must be a symptom of much larger scale dysfunction in the tech recruiting/hiring space.
> Doesn't it come out during the interview process if there's BS on the resume?
Oh yes. As the candidate this is also great when the interviewer says something like “It says here you’ve worked with x” and you go “I’m fairly certain that that wasn’t on there when I submitted the resume (to the recruiter), let me see that” and it turns out like the OP said, extra skill added in a different font.
Yep, I've been surprised by the contents of my resume at an interview set up by a recruiter once.
"It says on your resume you have extensive experience in X."
"I do not."
They also have a thing for stripping your name and contact details out and pasting their ugly letterhead over the top. Which I suppose they could still do with a PDF if they have Acrobat Pro.
> They also have a thing for stripping your name and contact details out
This makes some amount of sense because they want to avoid the company bypassing the recruiter and their commission. I was once hired like this (although I didn't know it until much later, when the owner told me). I think it's a realistic and reasonable fear.
I have a "redacted" version of my CV for this purpose which removes the personal information, but I can't recall ever actually using it, since I haven't really used recruiters for a decade.
Resume manipulation is actually more common than you think if you go through a 3rd party recruiter, the ones that cold call you for jobs.
They usually strip the content from it and drop it into a container resume that has their details so that they get credit from the hire I assume. lol. That being said, whatever the poster was doing to stop the resume from being edited is moot. They will just copy the content from the PDF and paste it into a new one.
I just answered my own question, since I had the resume in MS Word.
I was hoping I'd have to supply a password to edit it, which would be a somewhat reasonable level of security. But no; you just click "edit anyway." Duh.
Unfortunately, being able to read the document is already sufficient access to make your own copy and alter it however you please.
The best you can do is a digital signature[1], which proves that the document has not been tampered with. Of course, it's up to the recipient to ask for a signed document and to actually check if the signature is still present and accurate upon receipt. Otherwise, it's very trivial for a third-party to remove the signature and add their own edits.
In some PDF readers there is an option to "respect limitations"… by disabling it you can print even when printing is disallowed, and so on. I guess it's the same with Word documents.
In my experience looking at résumés (mostly for entry-level tech roles), fancy formatting is 100% an attempt to cover for a lack of skill. I haven’t seen one that had good content. Intuitively, this makes sense to me, and seems to fit with the old adage, “if you can’t dazzle them with brilliance, baffle them with bullshit.” Imo, your experience should speak for itself.
When it does, you have trouble fitting it all on 1-2 pages. You quickly go to text only, and organize things accordingly. When you lack experience, you try to make things look like an infographic so as to fill space with trivial information.
Oh, for sure, I read all of them top to bottom when I’m part of the hiring team. I just haven’t actually seen one that was fancy that had good content. The layout doesn’t matter to me, as long as it’s consistently formatted.
I think certain schools or career programs must tell their students to do resumes that way (with gauges and stuff) because I have encountered big clusters of them, even if on average they're rare
My experience from around 2017: today's employers just don't like good simple resumes for some reason. They expect you to be a graphic designer to even look at it. I had a clearly formatted, simple resume and had a 0% success rate (besides one offer for 30k, which was an insult) after hundreds of applications coming out of college, to places far and wide or local, until a recruiter for a consultancy company cold-contacted me. People online lambasted my resume for not being fancy enough. I still think it's an example of a perfect resume, personally. Doesn't matter anymore, thankfully, now that I have a great job I want to stay in.
Certainly far above average chops for an entry level applicant so I don't know what every employer's problem was. Much harder world for junior developers than anyone would make you think if even I had that much trouble. I was legitimately scared I had no future outside of fast food for a while there.
There are many problems with your resume, and lack of fancy design is not one of them.
- Half of your CV is empty space
- Dates are formatted really badly, no one would be able to get a good grasp on your project and work timeline quickly
- You have a game project that spans several years, and you sum it up into one sentence. Why are you doing that? It's one of your main selling points, and you don't expand enough on it.
- The prioritization is not sound. I'm reading about your proficiency with vim (pretty much irrelevant) before I even know your work history.
- You mention Linux administration. That's pretty broad. You should specify more. Did you deal with network config? systemd? FUSE?
Overall your resume looks lackluster, unprofessional and bare-minimum. It's been more than 7 years now, and you still see nothing wrong with it? Sorry if that sounds harsh, but I'm doing you a favor by giving you a reality check.
If I got this resume from a junior I would hire them assuming the interview went decently.
Half is empty - just graduated, can't expect that much stuff to put there.
Maybe I should have expanded more on the game but I was getting the impression nobody cared that much about personal projects.
The very first two things you do with a resume are match up the list of technical skills with the list of tech in your position requirements, and check the education requirement. That's why those two go at the top.
I'd dealt with a lot of config issues running Debian on various hardware in high school, as well as installing it, dealing with apt, understanding cron - a lot of basic things. I don't think systemd was much of a thing yet.
I still fundamentally disagree that the resume is bad. Again, I'd hire this person, and I'm saying that as someone who's done interviews and hired people now.
> The very first two things you do with a resume are match up the list of technical skills with the list of tech in your position requirements, and check the education requirement. That's why those two go at the top.
That is what you do. The very first thing I check is where they worked before and how they describe what they did there.
In the case of juniors, that’s replaced with looking at basically anything they did and how they describe it.
By the time the CV gets to my desk the keyword matching is already done.
While the resume isn’t necessarily bad, it screams to me that the candidate either didn’t bother to look at how a CV is normally structured/laid out before handing theirs in, or didn’t care.
It’s not necessarily bad, but when looking through a bunch of them you don’t really want to adjust your mental parsing model for every resume, so barring anything else about it that stands out, you just mentally dismiss it.
To be fair, this is something I’ve seen more from people just leaving school.
I'm not sure I'd even interview this person; I'd at least have to find reasons to. It definitely doesn't give me anything to ask about. Quotes like "Save a lot of time by automating an email process" are so vague and ambiguous that the bullet point may as well be omitted.
Education on the top is a waste of most anybody's time, let it be on the top only if you have no work experience (and if that's the case, yikes).
I don't care about language/skillset checklists, because that's not what I hire for. Especially with no indication of relative strength or accomplishments with them - if I need a specific skill I'll happily scroll to see if it's there or not, but I want to see an actual story about this person's career and interests, not a set of "fill in the blank" stuff that every graduate has:
* woo high school and a college degree great, that's not something we're going to talk about
* version control, editors, etc are table stakes
Why is all of the experience effectively unscannable (by eye)? Prose-first, with no easy way to separate by date, by project, or by position held?
A resume doesn't have to be a thesis in print design, but I think you'd do well to consider the feedback you're receiving in this thread and that you alluded to receiving before - this resume isn't advertising anything to me and it's work for me to construct a coherent story about why this candidate is a strong fit for hiring.
> It definitely doesn't give me anything to ask about. Quotes like "Save a lot of time by automating an email process" are so vague
... Are you kidding with these two sentences side by side? You just answered your own question: ask about the email process! And how is that outline of what I did even that vague? And everyone says to list accomplishments that benefitted the company instead of responsibilities, which this is a perfect example of.
I'll hopefully never have to write a resume again but if I have to I'll just pay an expert. The way I instinctively read and understand a document by actually reading it and caring about the content first is just too mismatched from the average neurotypical's way of looking at the world for me to empathize.
Don't beg the question in your resume. I don't want coy points to ask about, I want details so I can make sure that I can actually have an engaging conversation about something, especially so that I can figure out how it might relate to work I'm doing here. It's astonishing to me that you can't grasp that a resume is a form of communication, and that therefore you might want to, I dunno, communicate in it.
> and everyone says to list accomplishments that benefitted the company instead of responsibilities, which this is a perfect example of.
It's not perfect, it's nearly worthless. A perfect example would have at the very least some hint of the business value and the nature of the problem at hand.
> The way I instinctively read and understand a document by actually reading it and caring about the content first is just too mismatched from the average neurotypical's way of looking at the world for me to empathize.
This is pretty patronizing. I'm actually suggesting that the content of the resume you provided is itself lacking. The issue is not that "oh no the neurotypicals don't want to read my resume", the issue is that this resume's content has room to improve, but you think that can't possibly be the case. Yeah, you're lacking empathy (or at the very least pretty defensive), but it's because we WANT content, not because we're looking for surface-level stuff. In fact most of the feedback has actually been that this resume is TOO surface-level (list of skills? don't care, vague bullet points? don't care), and you're defending the superficiality of content, which is incoherent alongside the claim that people don't care about the substance.
"Saved over 30 hours of manual work, where the worker had to find a PDF, Excel sheet row contents, and information from one other data source, combine them into a formatted email with this information following certain conditions, and look up the right email to send it to. This was done with Python and the gmail api, <whatever the pdf library was, I forget>, and manual string parsing, building hashmaps of the information and sending the emails out once every 10 seconds to honor the rate limiting. Parsing failures or any other exception was logged for manual review."
All that extra detail seems trivial to me and covered well enough by the gist provided. Is this version really that much better for these details? Is it even enough detail for you yet? It feels like too much detail to me.
It was something about tax ID's and temporary merchant licenses for the state of Maryland.
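(Purely to illustrate the shape of the job being described - not the original code - a rough Python sketch; the row-handling helpers are hypothetical, and the actual sending is stubbed out where the original used the Gmail API and a PDF library:)

    import logging
    import time

    logging.basicConfig(filename="send_failures.log", level=logging.INFO)

    def run_batch(rows, find_pdf_text, lookup_extra, lookup_recipient, render_email, send_email):
        """For each spreadsheet row: gather the related documents, build the
        formatted email, and send it, waiting 10 seconds between sends to
        honor rate limiting. Failures are logged for manual review."""
        for row in rows:
            try:
                pdf_text = find_pdf_text(row)       # hypothetical: locate and parse the PDF
                extra = lookup_extra(row)           # hypothetical: the third data source
                recipient = lookup_recipient(row)   # hypothetical: look up the right address
                body = render_email(row, pdf_text, extra)
                send_email(recipient, body)         # hypothetical: originally the Gmail API
            except Exception:
                logging.exception("Failed to process row %r", row)
            time.sleep(10)  # one send every 10 seconds, as described above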
I think you're just being intellectually dishonest if you can't imagine some compromise between what you started with and what you have here, given the feedback you've received.
That's the actual definition of bad faith discussion. True to form, then :)
Edit: An actual dramatic improvement is trivial to pull from your bait example, by the way:
"Automated a manual process for collating tax ID and temporary merchant licenses for the state of Maryland, honoring rate limiting and supporting auditing of failures. Used Python and Gmail API".
Obviously there is some precision loss in my writing because I didn't work on the project and because your description of "something about tax ID's" was ambiguous to begin with.
Could even add "saving 30 hours a week/day/whatever".
The advice I give for resumes is that they fill two roles.
The first is like what you say - make sure you check the right boxes. Often the first person who sees your resume is non-technical, so make it easy for them to progress you on to the next step.
The second purpose of the resume is far more important: get people interested in YOU.
Almost everyone has done interesting things, but many people are bad at communicating those things. Often people are bad at even identifying the interesting things!
For each skill/technology you want to highlight think of a situation where you solved a problem using that skill. You're aiming for a problem your interviewer will find interesting, and will want to ask you questions about. Very briefly highlight the problem and that you fixed it with SKILL, and you're done. The more specific you can be the more interesting you will appear.
Two or three of those, your employment/education history, and contact details, and you're done.
I think your resume does do this, but it should be both more prominent and more intentional.
If I was helping someone today and they showed me this resume, I'd recommend re-ordering the sections and putting a brief intro at the top serving the same purpose as a cover letter. I'd also make sure they create tailored versions for each job they apply for.
Ah, I had the same experience as you when I graduated
People will offer all sorts of wacky feedback about resumes when prompted, but the real issue is that recruiters just don't look at or care about resumes that much. For my last few jobs I just sent an unformatted text file as my resume, one that would make dist1ll's eyes bleed. The most recent recruiter was actually angry at me for how "unprofessional" he thought my resume was, but the decision was out of his hands.
You have to network your way into a job. Put out feelers to friends, family, friends' families, professors, professors' friends, etc. A referral will jet you past the recruiter's filters and get you a real shot at a job.
That's a good point, but I think people have a lot more connections than they realize
Chances are good you're acquainted with 50+ people, and if each of them is acquainted with 20 unique people, that's 1000 people you can contact through a few weeks of hanging out and chatting
Professors often know relevant industry people in town, too
Worst case, you can hop on here or blind and ask for referrals. Some random dude cold-mailed me before and asked for a referral to my company. I chatted with him a little to make sure he wasn't completely insane, then passed his resume on to a relevant hiring manager. I forgot all about it until I noticed that a $3000 referral bonus showed up in my paycheck a few months later
You don't need it to be fancy, you just need to be one step above the raw Word appearance you have there. It's the equivalent of having the same pair of pants but it fits you. But as you've said it's moot now that you have a job.
The OP resume looks like a raw typewriter document to me. I don't understand what value it brings to go beyond - it's nothing but a tool to convey your skills and experience.
Dumb things such as having a tiny bit of color or a slightly less basic format can make or break you when the recruiter/hirer is sifting through the pile. Besides, it takes all of ten minutes to do this and be done with it.
It's like fashion. Arbitrary but just having the basics understood goes such a long way professionally and socially compared to the effort expended
Maybe this is a good use case for AI. Write your resume then let AI rewrite it to be aesthetically pleasing to resume readers.
People like my interior decorating, I can appreciate other things that are aesthetically pleasing, and I've made my share of actual art. I just don't think I have it in me to understand whatever in the world is going through someone's mind when they're displeased with this resume. It's a missing faculty like blindness or tone deafness. When I get a resume or an email I just read it, I don't sit there and hem and haw over how many pixels the bullet point is indented or whatever it is.
This is why resumes should be abolished entirely and replaced by a standardized database you put your experience and skillset into. Anything an employer wants to know has to go in a standardized field. No discrimination can possibly occur based on your ability to format a piece of paper according to invisible, unpredictable metrics that you might have no faculty for and have no bearing on your ability to do the job.
I think it’s telling that you have received very valid and constructive criticism on this resume from multiple people, yet you still can’t see why this is a bad resume. You know from data and observation that in the real world this is in fact a bad resume, because it doesn’t get you interviews and offers - which is the point of a resume.
It's not that they are displeased with the Word doc or examining every detail. Just that aesthetically pleasing things attract our good graces on a subconscious level, and CVs are in competition with other CVs. Really we are overcomplicating this. The OP could have fixed the CV in the time it took us to discuss it.
I'd say too often what happens is that engineer minded types rebel against unwritten norms like these and derive a sense of moral superiority from being immune to them, when ironically from an engineering POV it would make more sense to just do these little things, iron the shirts, use a template, and call it a day.
It's not 1980 any longer, and even the resume from that era has a nice table layout for ease of reference. Yours, on the other hand, looks like a first draft with arguably some good stuff in it.
I don't care for a fancy resume, but it should at least look like you spent more than 15 minutes on it, and with the reader in mind.
Also: I kind of like the concise wording when the writer doesn't feel they need to adhere to STAR. "Duties Included" is the only meat I want to see when I read a resume. I don't care what the specific challenge you faced was, or if your hard work transforming protobufs from one format to another resulted in "23% year over year revenue growth and 3 industry awards" for a product that uses your protobufs 5 layers up the API stack. Yet resumes are all moving over to including this cruft. I admit I use STAR on my own resume because every resume coach insists it's the only way to get noticed. It's just yuck.
I have an MBA, which I don't really use as a working software engineer, but nowhere else will you learn to take a clear, concise, one page paper (i.e., this is what we did and this is the result) and expand it to 10 pages of BS-laden corporate-speak nonsense.
MBA is still the junior leagues. To learn to take one line and turn it into a whole page you have to go to law school.
And even then, I would daresay that those are applied and therefore lesser. If you want to be able to expand one word into one entire paper, you need to go deep into academia.
My guess is if you sum all of the money saved/revenue gained listed in each MBA CV you'd end up with a number bigger than the GDP of the history of the world.
I don't see many 5-star skill ratings, dual colored backgrounds (?), or unreadable fonts. Where do you see those? I would turn people away from that format if they asked my advice.
My resume, and the resumes I've seen aren't too far away from this format. More bullet points and a bit more detail than this, I guess. But otherwise pretty similar
I see them a lot for interns and new grads. I think there's a bunch of templates that have these 'features' and when people are first starting out they don't know better. No interviewer I've ever met thought 5-star ratings were a good idea.
I graduated last year, and one of our final classes required a resume be submitted using the professor's format, which was colorful, used differing fonts, and included "confidence percentages". I wouldn't dare use it in the real world, but I'm wondering how many of those new-grad resumes are similar.
And new grads are probably often looking for something - anything - to stand out if they haven’t done any projects that really stand out and have a middling GPA from a middling school.
Yep 100%, and I did the same to be honest - I have a memory of painstakingly deciding which languages or technologies I was "experienced" in and which I was merely "intermediate" in, without realising it was wasted effort :) I think people tend to be quite forgiving of graduates or those new to the industry, it's hard to know what's expected of you.
As someone who has been through several government vocational programs in Canada, I will say that when your Case Manager or Instructor says to write your resumes and letters a certain way, you do it.
My girlfriend is a biologist with the National Park Service, and all of their resumes are expected to be three to five pages long. It hurt my soul when she told me that.
Government resumes are different as they rely on documented experience as a substitute for a civil service exam. They want completeness more like a dossier than a marketing document. Your catalog of skills and experience is critical, as "X years of Y" rules the day.
It's actually easier - you just tag on whatever you do every once in a while. "Normal" resumes are like ads for you, and the positive/negative usefulness of your resume is more about your ability to produce compelling bullshit for an audience, miss the mark, or land in the middle of the bell curve.
Something I was surprised by was the amount of white space and the straight-to-the-point job descriptions. Whenever I send my CV to a recruiter I'm always asked to add a little more about some language or skill.
Over the years my CV has become super dense with text, not because I have more experience to list but because I've been told repeatedly to list all the languages I've used and details of the projects I've worked on.
I think that's the correct method. Being a software engineer at Google could mean anything, so job titles alone don't communicate enough by today's standards. In 1980, this guy's job titles alone told you enough to know whether you wanted to meet him for your job opening. There's probably a volume aspect too. The hiring manager in 1980 wasn't getting 1,000 resumes that looked like this, where project information would help differentiate.
This resume has better UX than 80% of the resumes I've seen - which is particularly frustrating especially if it's a FE resume. These days I'm happy if the candidate can correctly spell the technology they claim to have been using the past N years. Most are full of bullet points with every other word bolded. If the bulleted skills list is greater than 5 lines chances are it has the same skill duplicated multiple times. I'm not sure if the candidate is fully to blame as I'm sure the recruiting companies manipulate the resume format. But ultimately the candidate is responsible for the quality of the resume.
I need React devs but I’ve dropped the ball - I just hire Java people, and they’re better at front-end. Usually more savvy about libraries, they at least ask who wrote the lib and what license it uses.
I’m sure thousands of competent front-endists exist, but they’re drowned out by people from bootcamps who can’t even spell “bartender” properly (and I have a bartender who’s the best of the class in my team, so again, nothing is set in stone, but I was wasting my time with front-endists).
The replies here are hilarious: from people saying how much they loathe it to people saying they wish more resumes were like this. It just drives home how much of a crap shoot applying is...
I'm a big fan of these. I maintain both a simple text resume, and a 'fancy' one in a pretty latex template. I've gotten jobs from both, and have even (successfully) submitted both in separate parts of the same job application. Sometimes people want to see the 5 star skill ratings, though I'll never understand why, but having it as an option has unfortunately helped.
With the amount of people who can put "duties involved: computer programmer" in their CV today, you're asking to receive dozens of near-identical applications. Hope you enjoy interviewing every single candidate because you can't evaluate ahead of time whether their programming experience involved, say, pottering around with VBA or implementing their own compiler.
1980 was not the dark ages. Photocopiers, though not personal ones, were widely available and used. But yes, it was probably typed on a typewriter initially. (There was typesetting, but you probably wouldn’t have used it for a resume.)
I struggle with getting a CV I actually feel comfortable with, so I just saved this resume as a reference to use as soon as I get around to reworking mine. It is straightforward, not overburdened with details, and covers the essentials. Great stuff!
You can use something similar, even today. My CV isn't *much* different than the one in the link and it seems that it does stand out as I've been complimented about it, twice.
I do too, but I see way too many people "over-thinking" their resumes, especially in the presentation layout: if the raw data is clearly organized, the layout should be minimal, not the other way around
I got made redundant in 2008. I signed up for unemployment benefit (UK), and part of the requirements was that I had to attend a one-day CV-writing workshop, in the basement of a particularly dingy pub.
The guy leading it was spectacularly useless from the get-go, training us in how to use Word in the most wonderfully terrible way. One particular nugget I remember him coming out with was:
'bold is particularly good for standing out, in fact I would even go as far as putting the entire document in bold'
I asked him about all caps, but stopped short of asking if I should sprinkle it with glitter, for fear I would 'fail' his assessment of me and he would cut my benefits.
My mother was a systems analyst in the 1960s. She got a contract as part of a team installing a brand-new IBM System/360 at the University of Ibadan in Nigeria; her job was to write administration software.
She told me it was a top-of-the-line model with 64K of RAM. At some point, it had a malfunction and they had to replace one of its memory "boards," which were lattices with ferromagnetic cores suspended on filaments. She brought the defective board home for me to play with.
Although I went on to write software on punch cards, and built a PC in the 1980s, I think the moment that I held core memory in my hands was the closest I've really gotten to "the metal" in my life.
> ...I think the moment that I held core memory in my hands was the closest I've really gotten to "the metal" in my life
Why is that? You don't build your own computers anymore? Or write a small just-for-fun application on an ESP32 that you slap onto a board you made yourself? That's the fun: having a toy helicopter flying on your own board + software.
No, even when I put together a PC, I was putting chips and boards together.
But as a boy holding that memory board, I was looking at the cores that held individual bits of memory. Instead of connecting things with ribbon cables or whatever, I was looking at the tiny wires that carried individual bits of information.
The interesting bit about this post is that with that resume, you can still feed a family in 2022 (okay, you won't need any assembler, and one from the set { Fortran, COBOL } will do).
I wonder if Python and JavaScript will get you that far 50 years from now?
One day, long after I'm gone, people will finally accept that Python and JavaScript are no longer young languages.
JavaScript is 26 years old, Python is 31. They both continue to grow in importance year-on-year, JavaScript because there is nothing on the horizon which will plausibly replace it, and Python because a large number of industries and programmers genuinely love it.
I think there's a nontrivial chance they'll both still be languages of primary importance in 50 years, but I'd bet my bottom dollar that they'll at least remain as relics yet needing support the way Fortran and COBOL exist today.
> people will finally accept that Python and JavaScript are no longer young languages
> JavaScript is 26 years old, Python is 31
I can't speak for Python, but JavaScript has changed¹ massively in recent years, more so (I expect) than Fortran or COBOL ever did in their active history. It could be argued that what we have now is a younger language with the same name.
> but I'd bet my bottom dollar that they'll at least remain as relics yet needing support
This I definitely agree with, though I suspect less so than Fortran/COBOL/similar. It is much cheaper to rebuild these days, and so many other things change around your projects², and there are more forces pushing for change such as a legion of external security concerns. That will add up to there being far fewer projects³ left to be maintained that haven't been redone in something new, because they fall into the comfy gap between the cushions of “it still works, don't touch it” and “it is far more hassle to replace than to live with as-is”.
----
[1] the core language is still the same, but there is so much wrapped around it from the last decade or so that I suspect someone who learned it fresh recently would struggle initially on EcmaScript 3 or before/equivalent.
[2] where a Fortran/COBOL project might live for all its decades on the same hardware using the same library versions.
[3] not absolutely fewer, of course, but relative to the number of people capable of working on them – much of the price commanded by legacy COBOL work is due to very few having trained on the language in decades, and many of those that did earlier being fully not-coming-back-for-any-price retired or no longer capable at all (infirm or entirely off this mortal coil), so those remaining in appropriate health and available are in demand despite a relatively small number of live projects.
Fortran77 vs Fortran90 were fairly different languages that required a substantial revision to the numerical methods assignments that I had in the early 90s as the department shifted from one to the other.
> There are now two forms of the source code. The old source code form, which is based on the punched card, and now called fixed form and the new free form.
> ...
> A completely new capability of Fortran 90 is recursion. Note that it requires that you assign a new property RESULT to the output variable in the function declaration. This output variable is required inside the function as the "old" function name in order to store the value of the function. At the actual call of the function, both externally and internally, you use the outer or "old" function name. The user can therefore ignore the output variable.
> but Javascript has changed¹ massively in recent years
Does anyone have any good resources for learning modern JavaScript? Not any of the weekly JS frameworks, but the updated language, capabilities, and patterns.
I have found https://javascript.info/ to be a good resource for both learning and reference around modern JS. I visit it instead of MDN with regularity for practical examples of JS features.
The grammar can be a bit spotty in places - but it is open source and has gotten a lot better.
I can recommend Gary Bernhardt's execute program[0]. One of the courses offered is "Modern Javascript" which goes through additions in ES5 and ES2020. There are also multiple courses on typescript. It does cost some money, but there are occasionally special offers.
Yes... Fortran at least has changed a lot since inception. There have been Fortran 90, 95, 2003, 2008 & 2018 standards since, to keep up with the various industry fads of the time (You want OO Fortran? Sure thing.). You can get a good overview of Fortran features from inception through the 2008 standard in the paper "The Seven Ages of Fortran" by Michael Metcalf or on the Fortran wiki (https://fortranwiki.org/fortran/show/Standards).
Does a lot of that extra pool of features get used in production (relative to more "legacy" code), the way JS projects are regularly re-engineered, or is the Fortran user base more conservative? I might expect the latter, but this is just a gut guess.
I'm not doing much work with Fortran-using communities these days, so this is an opinion only, and probably out of date in various ways.
Yes, developers are using the new features. Most code I touched was regularly using up to 2003 features, with later stuff dependent on other factors (solid compiler support across different compilers being a big one). However, most Fortran programs are going to be fleshed out with components taken from the vast collections of well debugged and tested libraries available, many of which are still F77, and probably will stay that way. Fortran is more 'conservative' in the sense that there's not much compulsion to rewrite working code in the name of 'refactoring' or whatever. Adoption of new features is more 'I'm going to use a select set of things that will make my life appreciably easier' rather than 'everyone on board with all the latest shiny things'.
> there is nothing on the horizon which will plausibly replace it
I'm not going to be making any bets - but the one project that has a real possibility is WASM. A mature, polyglot ecosystem on top of WASM runtimes with web APIs seems like it could displace JS in the browser as #1.
This is not likely to change anytime soon (if ever), as nobody is working on this, and there is even quite strong opposition to getting features in that are fundamentally needed to run anything other than the very few languages that already compile to WASM. ("Nobody" is interested in invalidating their investment in JS ;-)).
Also, WASM is actually slow, or, better said, "it does not deliver its full potential".
It will need advanced JIT compilers to keep up with the other two major VM languages. But in this regard WASM is around 20 years of constant development and improvement behind.
My strongest hopes in this regard currently rest with Microsoft (even though I don't trust this company at all!), who are indeed interested in running their CLR stuff in a WASM VM, and could probably deliver on the needed features. But then, if you run a CLR VM (or a JVM) on top of a WASM VM, you know, you're just building the next Matryoshka… There are no real benefits to that besides "look mom, it runs in the browser".
Probably not. Unless you're rendering to another target besides the DOM (ie canvas) I doubt you see JS displacement as #1 in the browser. JS is not the performance bottleneck, the DOM itself is. And in the meantime, you've got 25 years of example code, component libraries, talent development, dev productivity tooling, browser integration, etc built up around it.
And unlike other operating systems, the browser does not give you any kind of standard library of reasonably good components to build on. So the sheer size and volume of components and the ecosystem built up around npm well be an uphill battle for any WASM target language to compete with.
>Unless you're rendering to another target besides the DOM (ie canvas) I doubt you see JS displacement as #1 in the browser.
If we're talking on the level of 20, 30, 50 years, we may in fact be able to move away from a DOM-based web. And WASM is simply a binary spec, so it can adjust to whatever comes over the horizon. We've had similarly sized giants rise and fall in that span.
Yes, Perl certainly took an odd turn on their 'next gen version of the language' journey, but I'm willing to bet there will be a Perl community running 5.247.2 or some such decades from now, alongside sh, awk & sed.
That's one of the things I pray every day will go away. (Even though I don't believe in any gods, and have been a Linux-only user for the last 20 years.)
The Unix shell language is one of the most horrific legacy technologies still around. I really wish it would die soon™ and finally get replaced by something sane!
Why did Python win the war with Ruby? Was it purely the math community deciding this is where we throw our weight and left Ruby the runt of the litter?
The libraries. Ruby has Rails. Python has... everything else (plus Django, so it also kinda has "a Rails"). You'll likely be using something less well-maintained and shakier if you use Ruby outside of Rails stuff, than if you'd picked Python. Python's basically the modern Perl.
Why that all happened, IDK.
I write that as someone with a soft spot for non-Rails Ruby (after much consideration and repeated encounters, I kinda hate Rails). But it's rarely the best choice, unfortunately.
I'd reckon the parent's suspicion about the scientific community is correct in that it was a large influence. When ML and deep learning blew up, the academic Python community was in a great position -- you had numpy and scipy early on (both optionally backed by BLAS and LAPACK, btw), then scikit-learn for ML, matplotlib for plotting results, OpenCV ports, etc. As for why Python was adopted so early by the scientific community, I'm not sure. Maybe because it was a scripting language that was also very friendly for hooking into C and Fortran?
I genuinely love Python. Not in a shallow feature-to-feature way. But deeply that it has enabled a career and provides a livelihood to me and my family. It puts bread on the table. It taught me how to program and it taught me the power of computers.
Life changing tool. No other tool in my house comes close to what computers + Python have done in my life.
Oh, I like it too. It's got problems like most languages that see any actual use, but it's totally OK, even good. I didn't intend my post as a put-down of Python, so if it came off that way—whoops, not what I was going for.
It's funny, you don't hear much about the Python/Ruby war anymore. Python was more of a general purpose language and had decent web frameworks (Django and Flask primarily). Ruby's main claim to fame was, and still is, Rails. Rails has lost a bit of steam over the years, partly due to node.js and the microservice revolution, so to speak. If anything, Sinatra is a better fit for microservices and yes, sure microservices aren't a perfect fit for all use cases, but they do exist now and are reasonably popular compared to when Rails first came out.
Additionally, Python made significant inroads as a teaching/academic language and a scientific/math/ML language.
Way back in 2004, I had been using C/C++, Java and Perl and was ready for something new and useful. I'd heard about Ruby and Python at that point and tried both. Ruby felt too much like Perl for my tastes (no surprise, it's kind of like OO Perl) and while I didn't love the significant whitespace in Python, it just looked cleaner and simpler to me.
I have been using Python off and on ever since. I have worked with Ruby a bit as well. What's funny is that they are fairly similar and I've long argued that the two language communities would be better and stronger if they "joined forces".
But of course people have strong opinions about programming languages. Myself personally, I like Python a lot more than Ruby, but I've been using Go for a few years now and it's my current language of choice.
True, but Python became more popular as a general purpose language. For example, Python started shipping in most Linux distributions sometime in the late 2000s; Ruby did not.
I didn't mean to imply that Ruby isn't or can't be a general purpose language.
It was already in wide use for scientific computing by 2000, due to the comparative ease of writing interfaces to C code. The main idea was to use Python as a glue language to "steer" high-performance computing.
The Python/C API was easy to learn and use, Python's reference counts worked well for C-based objects, and it was easier to build non-trivial data structures than Perl or Tcl, which were its two main competitors at the time.
(Tcl extensions required manual garbage cleanup, I remember Perl's extension API as being rather complex, and I had to read the Advanced Perl manual to understand something as simple as having a list of dictionaries.)
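To make the "glue" idea concrete, here's a minimal modern sketch of Python steering a compiled C library. It uses the stdlib ctypes module rather than the hand-written C extension modules of that era (the division of labor is the same: Python orchestrates, compiled code does the work), and the library-name lookup is platform-dependent:

  import ctypes
  import ctypes.util

  # Locate the system C math library; the name differs per platform
  # ("m" resolves to libm on Linux/macOS, the C runtime on Windows).
  name = ctypes.util.find_library("m") or ctypes.util.find_library("msvcrt")
  libm = ctypes.CDLL(name)

  # Declare the C signature so ctypes converts arguments and results correctly.
  libm.cos.argtypes = [ctypes.c_double]
  libm.cos.restype = ctypes.c_double

  print(libm.cos(0.0))  # 1.0, computed in C but driven from Python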
> Python works around it by having so many libraries built in C or C++.
Which works quite fine, until it doesn't.
By then, the needed rewrite in some language that delivers decent performance and safety all over the place in one package will be very expensive.
I'm not saying that you should avoid Python (and its native-code kludge) altogether, but when using it, just pray that you never reach the point mentioned above. It's a dead end and will likely require an almost full rewrite of a grown, business-critical (and already heavily optimized) application.
Prototyping in Python, then rewriting the performance critical parts in a speedier more difficult language is one of the most efficient paths a project could take.
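As a rough sketch of that path (not anyone's specific project - simulate/run are made-up names for the example), the usual first step is to profile the Python prototype so you only rewrite the part that is actually hot:

  import cProfile
  import pstats

  def simulate(n):
      # Stand-in for the numeric inner loop that would be the rewrite candidate.
      total = 0.0
      for i in range(n):
          total += (i % 7) * 0.5
      return total

  def run():
      return sum(simulate(100_000) for _ in range(50))

  # Profile the prototype; the hottest entries are the parts worth porting to
  # C/C++/Fortran (or handing off to NumPy) once the design has settled.
  cProfile.run("run()", "profile.out")
  pstats.Stats("profile.out").sort_stats("cumulative").print_stats(5)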
I knew Python decently well before I ever played with Ruby.
Ruby to me feels like a very ugly version of Python. It's like Python and Perl had a baby, and I have very strong negative opinions of Perl's syntax. It baffles me how a language that people jokingly refer to as a "write-only" language ever gained any sort of ground.
I also think it is easier to use, period. I've used Ruby professionally since the Rails 1 days, and still program in it most days. A couple of years ago, while working at an AI company, I helped out on an ML project due to a time crunch, and I needed to use Python to contribute. I wasn't asked to do anything ML-specific, but rather help by building out infrastructure and data processing pipelines, i.e. the stuff that used the ML models.
I'd never used Python before but within a couple of hours I was writing code and in less than a week I'd engineered a pretty slick, very robust pipeline. I was quite honestly fairly astonished at how quickly I became productive in the language.
I could be wrong about this (my experience with Python started and stopped in that one week) but the impression I got was that Python is smaller, more constrained (i.e. fewer ways to do the same thing), and syntactically less complex.
Python is easier to use if you come from almost any background, programming or not. I believe this is primarily b/c there isn't a lot of "special syntax" in Python, it's all very explicit and common. The same is not true with Ruby.
Could you point out specific parts of python that are easier for someone with C/C++ background as opposed to Ruby? I remember starting with Ruby (after rudimentary CS50-level C), and finding it quite reasonable and logical, and nicer than python. I still think it's nicer than python, although I've long since stopped using it.
I believe the issue isn't so much "vanilla python" vs "vanilla ruby" for a developer coming from a C background, but rather that Ruby's programming style leads to a significant amount of metaprogramming which (aside from being a bit of a challenge to get one's head around) leads to various shops and frameworks building their own DSLs for writing Ruby.
Open classes give me the security heebie jeebies.
irb(main):001:0> "foo".bar
(irb):1:in `<main>': undefined method `bar' for "foo":String (NoMethodError)
from /usr/local/lib/ruby/gems/3.1.0/gems/irb-1.4.1/exe/irb:11:in `<top (required)>'
from /usr/local/bin/irb:25:in `load'
from /usr/local/bin/irb:25:in `<main>'
irb(main):002:1* class String
irb(main):003:2* def bar
irb(main):004:2* "foobar!"
irb(main):005:1* end
irb(main):006:0> end
=> :bar
irb(main):007:0> "foo".bar
=> "foobar!"
irb(main):008:0>
On one hand, that's really neat. On the other hand, the ability to add or modify a method in a system class is not something that I'd want near production code. I'm sure that other orgs have sufficient checks and style guide to prevent something from creeping in... but that sort of flexibility in the language is something that I'd prefer to stay away from if I want to be able to reason about ruby code.
See also Ruby Conf 2011 Keeping Ruby Reasonable by Joshua Ballanco https://youtu.be/vbX5BVCKiNs which gets into first class environments and closures.
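For contrast, a quick sketch of how CPython treats the same attempt: built-in types can't be reopened the way Ruby's can, so the equivalent patch has to go through a subclass (or stay on your own classes):

  # In CPython, built-in types cannot be reopened the way Ruby classes can;
  # attempting the equivalent patch raises TypeError at runtime.
  try:
      str.bar = lambda self: "foobar!"
  except TypeError as exc:
      print("refused:", exc)

  # Monkey-patching is still possible on your own classes, so the technique
  # exists in Python too -- just not on the built-ins.
  class Greeting(str):
      pass

  Greeting.bar = lambda self: self + "bar!"
  print(Greeting("foo").bar())  # "foobar!"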
I wrote code professionally for 40 years before retiring. Toward the end I was also doing a lot of non-code crap but I always resisted the push toward any kind of management. I am quite happy at how my career turned out.
Funny enough, I did exactly that. During covid, I bought the dip and moved to a place that is 100' away from my dads house. I get to see him every day now. I feel very lucky.
Java probably will for application development. Python should too, with all that ML code; I don't see anyone in a hurry to move it to Julia. C++ also, if you're in the embedded space. I know Rust is coming, but I don't think C++ is going anywhere for a while.
Probably not. Lots of "critical" software is re-written frequently. I met the head of IT at Target during a presentation he gave on how they switched from PHP to NodeJS (~2012). It took a year to migrate the entire ERP solution to NodeJS (plus frontend).
Just like that, PHP was gone.
If the entire ERP can be re-written that quickly, then if a better language comes along, it will displace the Node infra.
SQL is pretty darn perfect for its purpose. As long as databases exist, SQL will exist. It is also super easy and intuitive to learn - I taught myself and it has been my bread and butter for the ~17 years since.
SQL will always exist as a lingua franca for data because its declarative nature lets anybody immediately start using a new database/data tool just knowing basic SQL. At the same time, it's far from perfect for all use cases - it can be much easier to write certain queries imperatively than declaratively.
I can envision a future in which databases are primarily used imperatively while still supporting SQL for those who want to use it.
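As a small sketch of that trade-off (in-memory SQLite with a made-up orders table), here's the same aggregation expressed declaratively in SQL and imperatively over fetched rows:

  import sqlite3

  conn = sqlite3.connect(":memory:")
  conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
  conn.executemany("INSERT INTO orders VALUES (?, ?)",
                   [("alice", 10.0), ("bob", 5.0), ("alice", 7.5)])

  # Declarative: describe the result you want and let the engine plan it.
  totals_sql = conn.execute(
      "SELECT customer, SUM(amount) FROM orders GROUP BY customer").fetchall()

  # Imperative: pull raw rows and spell out the steps yourself -- sometimes
  # clearer for gnarly, procedural transformations.
  totals_loop = {}
  for customer, amount in conn.execute("SELECT customer, amount FROM orders"):
      totals_loop[customer] = totals_loop.get(customer, 0.0) + amount

  print(totals_sql, totals_loop)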
> I wonder if Python and JavaScript will get you that far 50 years from now?
AngularJS by itself powers a surprisingly large number of Enterprise applications. So even assuming the unlikely scenario that those languages are dead, and the only useful work is for dinosaurian companies that were too slow to switch, the answer would still be yes. :)
I must be looking in the wrong places, but I don't have the feeling that Fortran on my Resume has really helped me a ton. Fortran is a fine language and of course "real programmers can write a Fortran program in any language," but it is hard to compete with the breadth of C++.
Python has been in use since before I graduated college. I supported a code base written in Python even at Microsoft back then (they acquired a company that wrote their product in Python, and then ported it over to C++/COM). We all even had one of the O'Reilly books, for something like Python 1.4.
And I think one person from the team went on to write/support/somehow be involved in Subversion SCM (which was heavily written in Python).
....so Python has been around, used in product, by large companies, for a long time. I don't see it going anywhere in the next 20 years.
That job market seems like so long ago. See job ad, type out a letter with your CV, wait for them to write back, somehow organize a time to meet, more rounds, and so forth. At least it must have been hard to spam out CVs.
The guy seems pretty hardcore from my perspective. University training for all those techs. Of course hardware back then cost a lot of money so you wanted people who knew what they were doing.
What did people do for interviews back then? Reverse-a-linked-list? That would have been a relatively recent publication in 1980. Kadane's algo didn't arrive until 1984 IIRC. Was K&R published yet?
My copy of K&R belonged to my dad and was from the early 80s!
My job interviews in the 80s and 90s (college summer jobs, or the one time a company tried to get me to leave college for a job) had no whiteboard-coding-style technical skills components, aside from demos of software I'd written and general discussions of implementation details.
One job I interviewed for was at Davinci Email. This was probably 1990-ish? They made a LAN email product that ran on top of Novell NetWare. There were a couple of hours of general interviews, including a lunch on-site. The last interview was with someone very technical, who had printed out a few pages of listings of the obfuscated C contest. He asked me to go through them and tell him what each program would output. I did not get the job.
Got a job working on an old Ada system during the Great Recession (I was unemployed and desperate enough to take any job). Mom was kind enough to give me all of her old Ada books - apparently her employer (defense contractor) had sent her to training when the language was first introduced.
> The last interview was with someone very technical, who had printed out a few pages of listings of the obfuscated C contest.
That's cold. The worst I've had was otherwise normal code with a few deliberate bugs introduced: "Tell me what's wrong with this code."
> The last interview was with someone very technical, who had printed out a few pages of listings of the obfuscated C contest.
I had something similar happen in 92. The interview consisted of leaving me alone looking at one screen of Apple BASIC that came from their actual production code. The screen was nearly full of characters. I was supposed to tell them what that bit of code did.
I spent about 5 minutes looking at it, and though I think I could give the correct answer, I also realized I didn't want to work with this code base. I then thanked the interviewer and left.
The headhunter that sent me on this interview was quite irate that I had "embarrassed" them. C'est la vie.
> The last interview was with someone very technical, who had printed out a few pages of listings of the obfuscated C contest. He asked me to go through them and tell him what each program would output. I did not get the job.
It's nice to know that stupid interview questions are not a modern innovation.
I am 100% fine with CS trivia interviews. I am fascinated by CS and can talk your ear off about it.
What I am not fine with is that you're judged entirely on that. My biggest complaint about this industry is not the CS trivia, it's that my entire job history is irrelevant. I have a decade in this industry and a staff title and I am still treated like a junior developer with no experience when I am interviewed. It's degrading and insulting. I can understand rigor in an interview at our average salary but the market is still firmly controlled by corporations despite what the media says about job prospects. Given that there are approximately 10-20 jobs per engineer in the industry right now, if we really cared, all we would have to do is just collectively say "no".
I really struggle with this, because on one hand I don't want to be the arrogant special snowflake kind of person, but on the other hand I also have a 15 year job history and 100k lines of code on GitHub, including some fairly widely used stuff. If you want to establish basic competency it's not hard.
So basically my solution is to just ghost people when they ignore the subtle "maybe look at my GitHub that you asked for to establish basic competency?" and start asking for coding tests because I neither want to do the test nor come off as a twat, and this seems like the "least bad" option. The truth of the matter is I have the time and can do it, I just don't feel like doing it; nothing more.
And I also consider it as a bit of an indication whether I want to work for them in the first place. "Rules must be followed, at all times" with zero flexibility or common sense is not really something I deal well with.
I've brought that culture back in our company. Hasn't failed us yet. Turns out for a CRUD web app you really don't need top hackerrank skills. In my humble opinion, people who excel in algorithmic code interviews want to overcomplicate everything and get burned out super fast with 'real world' tasks.
I'm deeply skeptical of claims that you can't suss out "fakers" like this. For one thing, people who were that good at faking could be making a lot more money leveraging that skill directly rather than trying to sneak into mid-paying software jobs.
I think a far more likely explanation is that lots of interviewers are very bad at interviewing, and that interview anxiety, especially given the kind of shit that gets thrown at you in programming interviews, is a lot worse and more widespread than one usually supposes. Result: interviewers are convinced they're constantly catching "frauds" that they couldn't have caught otherwise, but they're frequently wrong about both those things—that the person was a "fraud"; that the interviewer couldn't have caught actual "frauds" with an ordinary interview.
You are right, it happened once, but that's what probation periods are for, in my opinion. Also I'd add we don't hire a lot, so this approach probably doesn't work for places which are hiring a lot of people regularly.
Yep, I blame Microsoft. Now these sorts of interviews are done even by companies writing pedestrian web apps that don't require hard-core CS knowledge. Yet they test every applicant on it anyway.
If you have enough applicants, why not filter out so you have the best of them? (well, maybe not quite the best, there is some value in having someone less likely to get bored and move on PDQ).
One reason, besides the obvious lack of respect, is that the more you test for things you don't need, the higher the odds that you select somebody with false positive results on the things you need.
I've never seen any evidence these interviews accomplish finding the best or even a competent candidate.
In my opinion the best interview process involves simply looking at the work history and having a conversation about it. If it sounds pretty good, you go with your gut and hire. A bunch of different people paid this person a lot of money for 5, 10, 20 years and you really think there's a chance they were all fooled? The conversation and your gut figure that part out with a decent success rate.
Yep. Can they talk at length and in reasonable detail about previous projects they've worked on? You can usually figure out if they were actively engaged and involved in the work vs ...just kinda there.
Also, do they try to BS you when you ask them something they don't know...
The "traditional" coding interview is really only appropriate for a college student/new grad with no work history to lean on. Even then, there's probably a better way...
The last "tech" interview I had was in 2007, with my current employer. It was not an algorithmic interview, but rather a deep dive into how much I knew about SQL (which was a critical part of my position at that time), and a bit of general web knowledge. I definitely hit a point where I said "well, I don't know," but managed to get the job anyway.
I had two previous "tech" interviews prior to that. The first was for a Perl shop. All Perl-specific questions, and the interviewer even gave me a copy of the Camel Book to thumb through if necessary. The second was for an MS-based web shop. They sat me down at a computer and told me to write a relatively simple C#-based CRUD app. I was allowed to Google whatever I needed. The Perl interview was fairly challenging (I knew parts of the language, but was not an expert); the C# one not so much, but I'm sure it weeded out a lot of applicants.
I've also had three other jobs where there was not a "tech" interview at all, mostly just chatting about projects and whatnot.
They were definitely influential, but it's way more complicated and probably has as much to do with The Guerilla Guide to Interviewing, which was Microsoft-based.
My first interview for my first job in 1981 consisted of the manager asking me general programming questions ending in a description of an errant program which I had to explain how I would figure out what was wrong, and what it was likely to be. No whiteboards, no coding, nothing. That was the only interview, and I was hired on the spot, despite having 0 work programming experience and 0 college education in programming (was chemistry major, programmed for fun on an Apple ][). Highly unlikely to happen so easy today. I retired recently after nearly 40 years as a working programmer.
> What did people do for interviews back then? Reverse-a-linked-list?
The last time I was hired for a purely software-related position was 1993. They were looking for an IC (they didn't call it that, then) who would quickly transition to tech lead / system architect. The technical part of the interview involved role-playing myself presenting an initial proposal to a potential customer who had provided a brief written requirement. One of the interviewers role played the customer. IIRC the interview schedule was something like 30 mins to read the requirement and think, 15 mins to ask initial questions, an hour (perhaps longer?) to develop the proposal, an hour (?) to present it and face customer questions.
In essence I was being asked to spot ambiguities in the requirement and develop an initial estimate. I passed. I would have failed had I applied for the same job straight out of uni.
He is coming from an aerospace/military background so I assume he is providing this information in case the job requires working in confined spaces and/or lifting weights or other physically strenuous exercise.
I've been reviewing a lot of resumes lately that call out the candidate's exact date of birth and marital status. Many even call out their parents' occupations. I've seen more than one that says "Mother's Occupation: Homemaker".
Although I now realize this is a cultural difference issue, it caught me off guard at first.
Which country/region are these resumes from? I would be very surprised to see something like that on a U.S. or UK resume/CV. I know that it’s quite common for photos and personal details to be on CVs in parts of continental Europe though.
I didn't even think about this at first, but—perhaps the parents' occupation is there because of the caste system? I heard that it's still around not just in India, but also in diasporas like the Silicon Valley.
Could be. I work with a lot of Indians, both here and abroad. I won't/can't profess to be an expert on them or their culture. But I can say that I've learned a lot over the years. I can only speculate (but won't do so here) on the reason for giving parents' occupations. From what I understand, the caste system is not what it once was and is beginning to disappear into the past, but its legacy can still be felt in some profound ways, both in India and here in the U.S.
For example https://german.dartmouth.edu/opportunities/working-germany/w... mentions family info, which seems bizarre to British and US people, as it's not something a candidate can control, so it seems unfair to discriminate based on this. I know German culture is different of course.
While that site is basically correct, "not that long ago" should be interpreted more like "within the post-war era", not "a few years ago".
Photo yes, Familienstand yes still today for conservative companies, parents occupations not since probably the 80s in most places that would consider foreigners at all.
(It is, as you say, all fairly obvious bullshit designed to make sure the right social class gets preferential treatment...)
The provided link was fun to read (as a German). Personal highlights:
> German employers simply don't know what to make of an Art History major who wants to take a temporary job in an accounting firm before going on to medical school.
I'm still laughing.
I guess actually nobody knows what to make of an Art History major in the first place. That's one of the typical things one would study if your only plan in life is to become "a wife" (OK, today maybe also "a husband"), or when you have absolutely no clue what you want to do and need additional time to orient yourself.
Also, nobody would hire an Art History major to do an accounting job. Never ever!
That's just ridiculous. You need professional training in accounting if you want to do accounting.
And going to medical school after getting an Art History major? The idea alone is even more ridiculous than the idea that you could do an accounting job with an Art History major… It takes almost ten years to become a fully qualified doctor. Also, getting into some of these universities requires standing in line for quite some time, and having absolute top grades from school. The people who consider studying Art History aren't the ones who would have any realistic chance of ever attending (a German) medical university.
So that sentence above is, by itself, actually a kind of joke. But that's not the only funny thing in there.
> They may neither know what the Ivy League is nor know which university is more prestigious than another.
> In Germany, where you went to school is largely irrelevant.
Yep. And that's a big advantage!
Maybe not from the perspective of some Dartmouth scholars, but most people on this planet agree that the anglo-saxon system of higher education is just complete madness.
The whole Bologna Process BS (which is modeled on the anglo-saxon madness) significantly decreased the quality of German higher education, and at the same time almost invalidated the achievement of possessing a university diploma. Now everybody can get some "Art History Bachelor" degree, or some crap like that…
I strongly hope that we'll stop that madness at some point before our education finally hits the lows of the anglo-saxon equivalent!
There was a time when a German "Dipl.-Ing." or "Dr." title had some meaning. What you get nowadays with most "master" students are people who would miserably fail the "Vordiplom"… Also, "everybody" and his dog has a "bachelor degree", which makes it actually useless (and turned university into just "regular school").
Amusingly enough, my German-as-a-foreign-language teacher had a degree in Art History and made great use of it. Of course, the relevant part for her resume was that she had an Art History degree from a German university, conducted in German, as a US native. It demonstrated a much higher degree of language proficiency than the average foreign-language teacher at a high school in the US and gave her classes a unique twist.
The US (although not the UK) college system values taking multiple paths early on, especially for MDs and JDs, so an Art History major isn't completely absurd. At the university I went to, pre-med was a list of classes, but you couldn't select it as a major. Most students would major in something related, like biology, to maximize the overlap in classes, but a Classics major (with a heavy focus on learning Greek and Latin to help with medical terms) was considered a rare but very viable option.
That said, I think the greatest strength of the German education system is its trade schools. The US trade school system is much more ad hoc. Most jobs/problems don't need the heavy theory of a graduate degree, and honestly I think both the US and Germany could use fewer PhDs and more people with practical skills.
This makes no sense. If anything, "anglosaxon" countries are much less obsessed with prestigious schools than places like, say, France. So to portray it as a uniquely anglosaxon trait doesn't make sense.
Also, german higher education is meh at best. Even beyond rankings, german universities are usually well in the middle of the pack at best, in almost every quantifiable metric. Though putting the blame on the anglos for that is... very typically german I guess.
> Though putting the blame on the anglos for that is... very typically german I guess.
I'm not putting blame on anybody. (I wouldn't be here, and wouldn't even have learned the language, if I didn't enjoy being with the "anglo people" as such ;-)).
I said that the standards were undoubtedly much higher before the "Bologna Process", which adapted the German system in most parts to the anglo-saxon model, for a net negative, imho.
> I strongly hope that we'll stop that madness at some point before our education finally hits the lows of the anglo-saxon equivalent!
Don't British and US universities significantly outperform German universities according to most rankings? I think there's just one German institution in the QS top 50 and it's... 50th.
Discrimination based on relevant skills surely - not irrelevant things like family situation? In other countries things like that are actually illegal.
I went to a military museum a few days ago and was surprised how incredibly small some of the cockpits are. I'm 6'3" and wouldn't fit in most of the fighter jets and other vehicles. Not that I wouldn't be comfortable with my legs pushed up against something - my shoulders are literally too wide to get in, and my thighs/ass are too wide to even try sitting there. Might be a problem for aerospace tech.
And don't get me started on spaceships/capsules - I don't have any particular fear of confined spaces, but this was a little too much (too little?).
Your comments made me wonder if there were height restrictions; I found this verbiage on the US Air Force site:
"For pilot and aircrew positions, height specifications vary by aircraft and most applicants can successfully pursue a career in aviation with the U.S. Air Force. Applicants who are significantly taller or shorter than average may require special screening to ensure they can safely perform operational duties. Applicants of all heights are encouraged to apply."
If you think much about the different platforms, it makes perfect sense that there are specific and varied requirements. Presumably they're pretty flexible about who flies a C-5, considering it's big enough to carry Chinooks or M-1 tanks [0]. OTOH, ejecting out of a fighter jet probably doesn't go very well if your knees are smashed up against the dashboard.
Sitting height is just as important for safe ejections as leg length. My dad was 5'10" but with a tall sitting height and he was just barely under the safety line for a seat in an S-3.
Having short limbs and a long body can indicate the presence of a medical condition known as "Hypochondroplasia".
> Hypochondroplasia is a genetic disorder characterized by small stature and disproportionately short arms, legs, hands, and feet (short-limbed dwarfism). Short stature often is not recognized until early to mid childhood or, in some cases, as late as adulthood.
When I was an undergrad I thought about seeing what it would take to be an astronaut. Turns out that the largest spacesuit they made back then was 6', so even if I had Buzz Aldrin's CV I wouldn't have been able to go.
Former USAF pilot candidate: there are also physiological reasons, specific to high-G maneuvering in fighter jets, that taller people are disqualified for. Shorter people have fewer challenges with G-LOC, or loss of consciousness.
The New Mexico Museum of Space History has a Mercury capsule you can sit in. Even with most of the equipment removed (just the control panel and a bench) it's incredibly claustrophobic. To be strapped into that thing with full equipment, in a flight suit, on your back over tons of explosives, and launched into space for a day or even just a few minutes... I can't imagine.
1. This is an "old" resume. I'm not as old as OP's dad but I definitely recall putting my gender, height, probably weight, definitely "health: excellent", etc. on my resumes in the 1980s. Different times. Times of discrimination? Probably. :-/
2. As others point out, this gentleman was in aerospace and even though he wasn't a pilot, here's a fun fact for the morning ... I watched a documentary on the early years of fighter pilot selection and grooming. Russia apparently recognized and exploited the value of a short heart-brain distance in its pilots. If I recall correctly, pilots with shorter distances between heart and brain can pull higher G's (or maybe negative G's) before browning or blacking out in certain maneuvers. It makes sense when you think about it. So if you're an air force looking for every edge you can, you might select for this trait. Shorter men (specifically, shorter sitting height: reasonably correlated with heart-brain distance). Also some women, I would expect. Anyways, this is probably not why OP's dad shared his height, but sharing a possible TIL as it was for me... :-)
My first real job was 1986: Having a "stable" personal life was something employers would "need" to know. And god help you if there was a gap on your resume; it was advertising your deviancy.
Such attitudes all changed drastically just around that time, not in small part because the demand for EE and CS people started outstripping the supply.
From my memory of Engineering School in the “colonies” in 1990, CS were very geek, EE were normal geek, mechanical engineers were normal engineers, and civil engineers were social deviants (to use stereotypes). There was a certain geek pride abnormality that was almost expected of the better EE engineers . . . or perhaps I thought that because I identified as a geek? I don’t recall noticing any strong demand for EE engineers. I think I noticed the uptick in demand for computer skills in the mid-90s, although I admit I surely was unaware of the world around me.
This is pure speculation, but I would guess it was simply expected information at the time. In some countries, it is still expected or at least common to see information such as: a portrait ; date and place of birth ; marital situation and number of children ; and so on.
What!? Some countries force you to state your marital and familial situations!? That’s insane. It’s illegal for companies in the US to ask about that stuff.
That said, some other people pointed out he had a military/aeronautical background, so height and health might come into play for certain jobs. That makes sense to me. You probably can’t work in the cockpit of a plane if you’re 7 feet tall.
Friend worked for the military overseas. Locals would send in resumes and include details such as their social caste and were horrified (or delighted, depending) when that information was redacted before being sent to hiring folk.
On the upside, it apparently became an upward mobility avenue for "low caste" folk who would otherwise not be considered for valuable positions.
The downside is clear if you're upper-caste, but my point was to reinforce the (challenged) statement that over-sharing in other cultures is both commonplace and intentional.
Going back a little farther it was not uncommon to list your religion and what church you were a member of. Going back a little farther than that, if you were a candidate for an executive position at a company, they might interview your spouse as well.
AFAIK the line of thinking was: religious people were more honest, the happily married more stable (i.e., your relationship wasn't a source of stress/instability), and being a parent made you seem more responsible.
Nowadays cynicism has inverted those thoughts:
Religious -- hypocrites. Married -- soon will be divorced, may get pregnant. Parent -- how irresponsible for the environment.
I worked at a startup, in India's Silicon Valley, before the word startup ever existed. 1 PC, with 640KB of RAM and a 20MB hard disk; the owner took a loan to buy that thing. We were 4 programmers sharing that 1 PC. We each had 2-hour timeslots to code in C or whatever other language. Away from the keyboard, I'd write code on paper, so that I could make the most of my slot. Sometimes I would have to print my code, and my boss (the owner) would go over it with a red-ink pen. I was an apprentice. The boss (owner) would warn me: "beware of the day computers will be writing programs, it will soon be here (and maybe put me out of a job)".
Bangalore was where it was all happening then (as far as India was concerned). It was a heady place and time. As a rookie customer support engineer I also got to see Data General's computers at various customer installations, spooled tape, and drum hard-drives, etc. This was early 90s in India. India's IT Export industry was in its infancy.
My stepfather Marty Einhorn was part of a developers' association in San Diego back in the '70s/'80s. I don't remember the name of it, but I used to go occasionally as a kid. I met Richard Siegel there several times -- seemed like a nice guy.
I dearly miss my adoptive father, and his close friend, who introduced me to computers in the early 1980s, in a dreary backwater town of a backwater Middle-Eastern country, starting with a Sinclair ZX-81, soon followed by a ZX Spectrum :)
We were always behind the rest of the world in everything; to get new software or books and magazines we had to wait for someone to make a 10-12 hour trip to the nearest major city, which happened once every couple months, so we had to prepare wishlists in advance ^^
Those 80s computer mags were the best part of my childhood: Your Sinclair and especially ZZAP! because those were all I had access to when someone else was using the TV, or the computer, or we were just waiting for the electricity to come back on (something that part of the world still struggles with).
I graduated to a Commodore 64 and fantasized about getting a Commodore Amiga, but by the time we could afford a new computer, the world had moved on, and I got my first "IBM PC" in 1993: a 286 with a 40 MB (megabyte) HD :)
My dad's friend, my uncle, was pretty much a genius who had taught himself electronics, repaired his own TVs etc and even built his own audio equipment and other simple devices for his friends. He tried to teach me programming in BASIC but none of it stuck with me (I try to make up for that by learning Z80 and 6502 coding these days)
The guy died relatively young, and his genius was never recognized outside our small town, but the children he influenced and instilled a love of technology in, always remember him and owe their skills to him.
I'm at the age where the college I went to required us to take JCL, but it was already on its way out. I also took IBM 360 assembly language, which was a WAY MORE high-level language than I was expecting it to be. Before then, my impressions of assembly instruction sets came from the 6809E and 6502. In comparison, IBM 360 was a dream. But I never worked with it. It was just a class.
The other thing that was interesting is that unlike the rest of our assignments which we could do in the lab, this one we had to send the code to a computer in a different city, which ran the job, and came back with results 4 hours later. You had only 4 runs to get your code to work.
The most interesting part of this story is that starting the very next semester after mine, the same IBM 360 assembly language class used an IBM 360 emulator that ran on IBM PCs (this was at the time of real-mode 640K DOS). So if I had just waited a semester, I could have done my assignments using an emulator on the PC.
> Who else can say they used a computer with two ROUND screens?
Hmm, that takes me back, heheh. I remember using some spanking new equipment in the late 60’s with round screens. I was just 8 or 9 then so my recall is unsure but think it was Digital. I do recall learning how to read and enter memory locations with toggles so I could cheat at Lunar Lander.
The PDP-1 could be fitted with one (I believe you can play with it at the Computer Museum in Mountain View). The CDC-6xxx series had a magnificent console with two round screens (and vector graphics, although they weren't direct view storage tubes like the supercool Tektronix were). There were two such consoles at the Living Computer Museum in Seattle.
And, BTW, it's not every day that I meet a fellow Elder Thing. I am myself from 1968, so I didn't get near an interesting computer until much later.
I keep looking at small round displays and thinking about building something with them. If we could get our hands on a legally available disk image for, say, DtCyber, it'd be feasible to build a cyberdeck-style laptop with 2 round OLEDs (or, more practically, a cheaper rectangular one with two circular cutouts).
The interview that resulted in my first real job as a programmer was in 1976. In that interview I was asked quite detailed questions about writing an interrupt handler and the rest of a device driver. Despite having a new MS in Computer Science, it was a stressful interview. However, I did get hired and actually did do a project that involved writing the drivers for a new piece of hardware.
> "Experienced software architect with a proven track record of delivering high quality and performant blah blah blah"
What's wrong with that? Each part of the phrase sounds like something an experienced developer should strive for, and is objectively testable (e.g. delivered projects in a work setting or not, well-written code or not, the software runs quickly or not, etc.).
Junior developers in particular may lack the track record part when starting out, so it's a good indicator that a person is applying for more senior positions.
The problem is it's already given by the extensive job history. Typing that is just a dance we have to do to get past automated filters that look for keywords. By my estimation we are now in the "black hat SEO" phase of resume design. Soon, not even that will work.
Back in OP's Dad's day actual humans who actually cared looked at every resume and more often than not treated the interviewee like a human. For us, we just get fed into a machine and if we make it out of it maybe a human will glance at it.
I started doing my MSc in large part to get past automated filters in the aftermath of the dot-com bust. The irony is that I could have left it at that: Just adding "Studying towards an MSc in ..." got me a marked uptick in responses, and recruiters asking about it.
This may not be an accurate worldview of mine, but I've actually completely given up trying to apply to jobs that have application portals/likely keyword filters (though I may be willing to if I search for public sector work in the future).
I try to find work through past coworkers, and often by reaching out directly to the hiring manager if I think I could have skills that they are looking for. A friend of mine who took the standard volume approach got over a hundred rejections, often with radio silence, before receiving one offer. The human approach is nice because it bypasses the filters, and you're far more likely to at least get responses along the way during the job search.
Well, I don't think there's necessarily anything "wrong" with it (It's actually from my own resume), I just cringe when I read it. It reminds me that I operate within a world obsessed with jargon and eye-roll inducing business speak. It all feels so unnatural to me.
When I see things like that on a resume I instinctively smell bullshit. Right or wrong, that's my reaction. "Proven?" Show me the proof. "High-quality and performant?" I'd better not be able to quickly and easily find end users complaining about your company's software.
There's so much suspicion in this industry. Is it so in other industries? We see a 20 year work history, and we assume you must be lying so we LeetCode you in front of a couple recent college grads. And now we're going to go after end user complaints as well?
I guess it's good I work on the back end; I can always blame poor user experience on the front end and "UX" people.
I'm suspicious of people that need to dress up their 20 years of experience with business speak, yes. It is the business speak specifically that makes me suspicious.
Everyone does this because everyone thinks they need to. So you're suspicious of everyone, but the only thing you can legitimately suspect is that they're the kind of person that does what needs to be done.
You said it yourself: "Each part of the phrase sounds like something an experienced developer should strive for". Nobody will ever write the opposite. So it signals nothing.
But as I also wrote, at least one occurrence of this in a resume is still a good signal that the candidate understands what a hiring manager is looking for, and also signals that the candidate isn't totally junior.
If a candidate is totally junior and still writes that they are an "experienced software architect with a proven track record of delivering high quality and performant..." but fails to back it up, they end up being judged next to people who actually do have this experience.
In that case, it could be better to emphasize the "willing to learn quickly"/"excellent team member"/soft skills aspect to set expectations right, and/or develop better technical skills so a candidate can actually claim that. So, it's only really beneficial to write if you actually do have experience, and thus could be a worthwhile signal to include at least once in the resume to show you're at that level.
My generous interpretation is that dad needs full-time care that the family can't provide in-house. For example, alzheimer's and dementia patients often have very particular needs.
runvnc is my github. Yeah, maybe I didn't word that the best. The short version is that after my mother passed away, my sister and I were there full time for several months, but at some point we couldn't handle it anymore. His memory was almost completely gone, bodily functions often seemed to be like torture to him, but the big issue was that he started yelling every time we tried to move him. The hospital said it was apparently a type of vertebral compression fracture or something.
Dementia and/or serious motor issues will mess that up every time, unless you've got a close family member who's willing and able to be a full-time caregiver for years on end.
But sure, some people avoid that through some combo of luck, genes, and clean living. Or just die before it becomes an issue, I guess.
Yeah, my grandma has dementia, can't walk without assistance, and basically just shits herself all the time. She lives with my mom, but she doesn't have the energy or mental fortitude to be the full-time caretaker she needs. She wants to put her in a caretaking facility, but just doesn't have the energy to research options, not to mention the money. She wants to call my uncle and ask him to pay for it (Grandma and the uncle are on my dad's side, but my dad passed 3 years ago), but has basically been avoiding making that call.
> Or just die before it becomes an issue, I guess.
If I ever get to the point where I'm no longer living, but merely surviving the way my grandma is, I'd sign a DNR and make it my solution.
I'm not sure it is a matter of health. A relative of mine in her upper 90s recently passed away at home -- fortunately a number of relatives were well off enough to move nearby and help out, hire a nurse to check in occasionally, etc., etc.
If anything I suspect dying at your home would be easier if you weren't very healthy.
My grandfather died in his home after a short but severe illness with round-the-clock care from the family. It was relatively easy, as far as these sort of things are ever "easy".
My grandmother died a few years before that, after spending a year in a nursing home with a less severe illness. It just wasn't feasible to keep her at home: she couldn't really walk any more, and half the family would have had to pause their lives. I don't really know her opinion on that, as we never really had that kind of heart-to-heart relationship, but I would certainly much rather be "put in a home" than be such a burden. My grandfather was very happy that his end was a swift one, so the family wouldn't have to go through a long drawn-out ordeal again.
My mom used to joke that in our family, we don’t end up in homes because we all die of cancer before retirement.
Well, joke’s on her, she’s still alive at 70 and I survived my first bout with cancer at 36 already so she was wrong on two counts. Still didn’t put any elderly family in any home though…
People only get "put in a home" when they require around the clock skilled medical care. Visiting nurses and the like exist for people who need ongoing medical care but not as often. It's usually not anyone's first choice.
Yeah, genuinely curious for how many people in their 90s that are “put in a home”, does that change end up being the right move for them. Kind of wish the question was studied so people could make more informed decisions.
Whoever can type a resume of that length on a typewriter with no errors, I would instantly hire. Such a level of attention to detail is extremely rare these days.
Back in the 80s I would type up a resume and make corrections with correction paper (big upgrade from liquid paper!). Then I would go to a print shop and have them make copies of it but on nice paper. The copies would not show any signs of the corrections. The same thing could have been done with the cover letter in this case, since only the date and addressee at the top would need to be changed for each company.
Sadly, communication and writing skills take a backseat to algorithm interviews. I understand the company benefits massively from hyper-focusing on hard skills, but quality of life suffers for your average IC mired in the daily failures of miscommunication.
In 1985 I learned and shipped products using IBM Assembler, Cobol, JCL, TSO, and some others not listed.
It was a bit of an unusual mainframe software spinoff company where I did my 1st year and later co-op work placements. My next co-op placement was more conventional embedded C.
Eh, I was pumping out PHP+MySQL and a bit of Python until the early 2010s, when a) I went to work on a larger site where disparate specialized tech was used to optimize every part of it, and b) the hipster explosion of devops happened.
Listing gender, height, and health on the resume (!). I can't imagine getting a resume with that info these days.
Listing corporate training under education. Again, wouldn’t expect to see that today. Not sure if that is because no one does employee training anymore, OR, if it’s just expected and understood that you’ll learn new stuff constantly as a programmer these days.
Perhaps not in the U.S. due to labor laws and the EEOC, but in some countries you must also attach a head shot. Not only that, but HR can casually drop by your house unannounced to inspect your living conditions and make a note of anything "unusual". I know it sounds straight out of Severance, but that's how things would be stateside if unions and others hadn't drawn the line somewhere.
Headshot yes, blood type typically not.
They do require you to reveal a bit of other information, including age, number of dependents, marital status, expected length of commute, etc.
The funniest aspect is that a lot of employers expect applicants to handwrite their resumes, and some actually go as far as rejecting non-handwritten resumes.
That's cool. I may have worked in former IBM buildings in Endicott that he might have worked in. I worked there a few summers in the early/mid 2000's when it was Endicott Interconnect Technologies. I loved exploring those old buildings: lots of tunnels, abandoned sections, old equipment. I wish I had taken photos.
I was born and raised in Endicott. My only cool visit was going into the quiet room in the IBM Glendale facility. It was covered with that angled studio foam. It was disconcertingly quiet. That sense of "oh this is a big room - I can tell by the echo" starts reporting strange readings.
I really need to post something like this about my mom. She had a BS in math and was sent to learn how to program computers in the late 50s in her first job. I have a photo of her in a skirt moving jumpers around on a room sized computer. She programmed in Fortran for most of her career. She retired in the early 90s and passed away soon after. If anything, I followed in her footsteps. I still remember playing colossal cave adventure on the minicomputer (Harris?) in her office in the late 70s.
Looks like he also got quite a bit of training at what I assume is UCSD Extension program. I received a certificate in C programming from there in the early 90’s. I wonder if they still offer similar programs today.
How amazing it would have been to work for Convair or General Dynamics during that time!
The absolute heyday of Convair! The B-58 Hustler would have been introduced around the same time your father worked there! One of my favorites!
I wonder how many return calls you'd get if you used that resume format today!!
I hope you (or whoever owns that repo/resume) gets the chance to talk to him about working there and what it was like to watch computers shrink in size while getting more powerful! Thanks for sharing!
1. As much proprietary stuff as we still have to deal with, we've really come a long way.
2. The approach our parents took of working at one company for many years, or a whole career, and retiring with a pension really disappeared quickly.
Regarding 2: I'd stay at a company for many years, no problem, if that meant I could get a livable wage, start a family, buy a large enough house, save money AND save for a pension… all on a single income.
The reality, though, is that nowadays if you want to reach a salary level where you can start thinking about any of those things, you have to do quite a bit of job hopping.
When I graduated college with a EE degree in 1977, my resume was damned sparse on experience, mainly because I didn't have any. All I could throw out were the EE and engineering classes I'd taken in school. Somehow, I got hired by a company in S.V.. By the time I retired after 43 years, I had so much experience, it wouldn't fit on 3 pages. Fortunately, I didn't need a resume any longer.
During the early 2000's there was an IT slump in CA due to the dot-com crash, so I spent a lot of time sending out resumes. Eventually I created a script to quickly customize them for a given job ad.
It had topic meta-tags, and each topic had four levels: high, medium, low, skip. The level would control the placement and amount of detail given to each topic. Medium would default to "low" if there were no medium-level content/detail. Thus, I didn't have to always type 3 variations per section+topic. There were other switches I won't go into.
I found that highlighting applicable domain experience helped a lot: "billing", "budgeting", etc.
I still did some hand customization, but the script allowed me to send out roughly 1,000 resumes and/or CV's all over the nation without getting carpal tunnel. (I preferred to stay in CA, but the market was really dry at the time. Plus, location mattered less if it were only a contract.)
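For the curious, a tagging scheme like the one described above can be sketched in a few lines of Python. The topic names, level keywords, and fallback rule here are my own guesses for illustration, not the actual script:

    # Hypothetical sketch (not the original script): each topic has canned text
    # at up to three detail levels, and a per-job "profile" picks a level, or
    # "skip", for every topic. Medium falls back to low when no medium-level
    # text exists, matching the rule described above.
    TOPICS = {
        "billing":   {"high": "Led redesign of the billing subsystem ...",
                      "low":  "Billing system maintenance."},
        "budgeting": {"high": "Built budgeting and forecasting tools ...",
                      "medium": "Budget tooling and reports.",
                      "low":  "Budget reporting work."},
        "embedded":  {"high": "Shipped embedded C firmware ...",
                      "low":  "Some embedded work."},
    }

    LEVEL_RANK = {"high": 0, "medium": 1, "low": 2}  # controls placement: high first

    def render(profile):
        """profile maps topic -> 'high' | 'medium' | 'low' | 'skip'."""
        picked = []
        for topic, level in profile.items():
            if level == "skip":
                continue
            texts = TOPICS[topic]
            text = texts.get(level) or texts["low"]  # medium defaults to low
            picked.append((LEVEL_RANK[level], topic, text))
        picked.sort()  # higher-priority topics float to the top
        return "\n\n".join(topic.upper() + "\n" + text for _, topic, text in picked)

    # One small profile per job ad, e.g. a billing-heavy contract:
    print(render({"billing": "high", "budgeting": "medium", "embedded": "skip"}))

With one of those little profile dicts per job ad, churning out hundreds of lightly tailored copies is just a loop over a list of postings.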
When feeling trapped, a programmer always "writes a script".
It's sad how a fraudster can abuse knowing a person's full legal name, date of birth, past home address, and phone number, all of which was on this resume posted to the internet and should probably be masked out.
It struck me as comforting to actually be able to post it all online without a major worry. I imagine a person of that age, living with close supervision from their family, can get fully disconnected from this paranoia and just enjoy actual life, sans administration and connectivity.
My dad was an electrical engineer in his home country. When he immigrated to the US in the Seventies, he had to basically start over because his certifications weren't recognized. He took it as an opportunity to try and change careers, and he looked at everything from locksmithing to programming.
Several years ago, long after he passed, I found the C book he used. "Hey, I know C!" I thought. It was a weird feeling to have independently ended up in the field he strived to be in.
I love the resume itself: the non-fussy font, wording, and layout are refreshingly straightforward. It makes me want to take your dad to lunch and hear all his stories. My grandfather piloted a B-29 in 1945, and his brother was an early computer scientist; maybe their paths overlapped. Warm regards to your dad!
My father started at Univac just after the Korean War. He never did a lot of coding, instead he was a "systems man". He went from the room size mainframe systems with huge dedicated tape and disk machines to time sharing to buying me an Apple II around 1980.
Unfortunately, systems and procedures is no longer sexy and is mostly an afterthought today. But back in the heyday of the 1960s/70s, that was where the money was.
I nearly switched from EE to CS, but decided against it after my first Fortran class. I had the semi-mythical shoebox of punchcards too - walking to the other end of campus was an exercise in fear.
Good to see your dad's resume. Good recollections of the programming languages COBOL and FORTRAN. Hope your dad enjoyed and continues to enjoy whatever he does.
I was born in 1981 and learnt programming in college in 1999-2000. I learnt COBOL and FORTRAN too. To me, at this moment, all programming languages are almost the same. I am doing Go now, and will pick up Rust by the end of this year.
We have to solve the problems; they keep changing.
Perhaps the initials of the person who typed it. In most business contexts at the time, something like "ABC/jed" at the end meant a secretary/assistant with initials "JED" typed something for a manager/employee with initials "ABC."
Ditto! We had big dumb noisy electromechanical typewriters in my Typing 9 class, and then upgraded to some kind of semi-smart typewriter for Typing 10 that had a small buffer so that you could type+correct one line but it wouldn't print until you pressed the return/enter key. They only taught us a couple weeks of word processing on computers. Microsoft Works, if I recall correctly.
> Interesting that this is like a cover letter and resume in one.
Pre-email, that was pretty much how it went. There was no point in overcomplicating things. People had to type these cover letters and resumes up, over and over. Or they would pay for a service which used one of those “word processor” gizmos. Fortunately for me, in ’80 I had a Selectric-like printer and a CRT, and the very next year I got an IBM PC and could ditch the mainframe. As a result I became très marketable and enjoyed substantial career enhancement.