I don’t hear much about it these days, but circa 2004 when I was going into computer science I had multiple teachers and a guidance counselor warn me that all the jobs would be going to India and tell me I should pursue something else. This seemed to be very much the prevailing wisdom of the time. I pursued it anyway as it was my passion.
I was one of five students in a program that had been a huge multi-campus program only a couple of years earlier. We were the last cohort before the college discontinued the program entirely, and I was the only one to graduate. What I found, however, for maybe five years after graduation was an insanely high demand for developers.
There was genuinely a generation that was so strongly discouraged from becoming developers that there were very few. Seems to me like the folklorists have largely missed this.
My father was very strongly against my decision to get into web development (2012) and he echoed this sentiment heavily. I distinctly remember him yelling at me as a teenager: "You'll be living under a bridge in 5 years if you do this! India has this market!" He eventually kicked me out for not listening, forcing me to borrow money from a friend to pay tuition in my final semester.
Needless to say he was completely wrong. I was out-earning him within 2 years of entering the market and I have probably the greatest job security in my immediate friendship circle.
Despite my efforts, he refuses to talk to me to this day. The man does not like being wrong nor disobeyed. A shame really.
If there’s anything I’ve learned, it is that you should always make your own mistakes. Your parents' mistakes don't even make sense anymore 30-40 years down the line.
My father pushed me into EE. I burned 5 years in industry being paid crap and then discovered it was actually a fairly low-paying profession in the UK. Ended up supporting a bunch of Sun machines for other EEs, which was more fun. So I learned Perl one weekend, bailed out, and lied my way into a sysadmin job for 2x the money. 20-odd years down the line, I'm glad I made that choice.
When offering PhD studentship positions, I have had several cases of people from developing countries who wanted to come work in my group, but didn't, and the reason (from their own accounts) was that their parents wouldn't let them go get their PhD anywhere not in the US.
They ended up at groups in the US with much worse scientific output than mine (it's true that well-known US universities wipe the floor with mine on every metric, but that's not true when you go down to the level of individual groups, labs, or fields of research), working unpaid while I was offering a decent salary.
They were bright people so I think and hope they'll have fruitful careers, but their parents (with their best intentions, I'm sure) surely put a roadblock in front of them due to the prejudice that there are no good opportunities outside the US.
This. I also had an M.Sc. student work at my company, a brilliant guy from China; he deserved the best Ph.D., which in my opinion means working with a professor who has made his mark, has a secure position (so he is no longer in his publish-or-perish period), enjoys what he does, and cares about his Ph.D. students' development. His parents didn't agree; they wanted him to go to the most prestigious lab he could get into. Probably to be worked like a measurement slave for 4 years, plus the inevitable extension.
I told him to beware, talk to other students and try to gauge the atmosphere in the lab. If you're smart and motivated you'll make it. But will you have fun and be happy? What do you want out of life?
I have a friend who made "Associate Professor" at an impressive age. He works 80-hour weeks and travels 50% of his time. It can be fun and it is an adventure, but is it what you want? This life is pretty incompatible with having a family, for example.
Beware of the fact that your parents may define success very differently from you.
Why is EE so bad? On the surface it looks like a degree with a strong grounding in mathematics, exposure to programming, the discipline of engineering, and a diverse range of applications across industry. It kind of seems like the ideal degree if you want to hire someone. I'm always baffled when I hear stories like this.
I have some hypotheses. Note my degree is in physics, perhaps a similar situation though not as widespread. There are a number of things happening.
Most engineers in all disciplines lose their math ability after graduating. The workplace itself allows this to happen: They get so busy with regular design work that they forget their math and theory. A lot of the analysis work is handled by their CAD tools. The work that does need math or deep domain knowledge is handled by one or two experts within the department.
There are some practical limits to the size and complexity of hardware, that limit the amount of hardware work. An electronic board might be designed and tested once, and then a million copies made. The software for supporting that board is maintained constantly. This is partly due to a conscious choice to move functionality from hardware to software. When hardware is obsolete, it's abandoned. When software is obsolete, it's augmented with new software on top of the old software.
There's a strong message from above that software is more important than hardware. Sparkly software is what management sees when they are shown the product. The people who find that they can program well enough to do it for money have moved into software development.
It's harder for an individual hardware person to capitalize on their own innovation, because they need the infrastructure to test and manufacture new hardware. So we can only move at the pace of the businesses that employ us.
Programming can inflate its own demand through technical debt, and can organize itself to a level just short of full blown collective bargaining.
Note that I'm not talking about pure software businesses, but those businesses don't need hardware engineers at all. ;-)
Though I don't agree with the negative take on software, like "sparkly software for management", "inflating its own demand through technical debt", or "organizing itself for collective bargaining".
Sparkly software usually only gets you as far as it is useful; less sparkly software sells worse, but you still need it on the hardware to get the job done. Hardware alone is not enough.
Inflate demand -- well, people just suck at organizing big projects; there is no need to artificially inflate demand, it just happens as the business needs more and more features.
Software developers are bad at organizing and bargaining because they all think they are better than everyone else, and other people's code always sucks :)
My hypothesis:
Hardware has obvious physical limitations: even if you build millions of boards, it takes storage space, you need copper and aluminum, and you cannot shrink transistors forever. You can only sell as many phones as there are buyers.
Software, on the other hand, is now limited mostly by the number of developers in the world. There is an infinite number of programs you can run on a finite amount of hardware. There is an effectively infinite amount of software to be built, let alone maintained; that is why software developer salaries are going through the roof.
While I can sell a phone only once, I can now build a SaaS solution that brings cash flow and monthly payments, so even an individual can capitalize on their own innovation. A basically infinite revenue stream from the SaaS model is just very attractive to any businessman.
Brilliant comment.
It applies to most engineering, or scientific fields. Develop, discover, once. Maintain or improve forever.
Explains the massive difference in demand for truly imaginative innovative thinkers, and the maintainers.
Come to think of it, other fields too.
Well, for me the degree was of mediocre usefulness. The job had me sitting in a windowless office on a factory floor, fully air-gapped, on my own seven hours a day, mostly doing test automation and designing test fixtures. That was really the entry position for us back then. I had to wear full anti-static gear head to toe, which was hot and uncomfortable and smelled weird. This was broken up twice a day by some shitty machine coffee and sitting in the canteen, staring out the two small windows at the people outside at the company over the road, all smoking. All while being paid just about enough to eat for the month and keep up with my games habit. There were also nine layers of bureaucratic nightmare everywhere, and as the company was large and everyone lived in the same area, everyone knew or was related to everyone else, so it was politics galore.
A lot of negative responses here, but my own experience, having graduated with an EE degree from an average university (in the UK) 3 years ago, has been pretty good so far. I was able to get 3 internships at 3 different semiconductor companies throughout the course of my degree and got a job at another semiconductor company immediately after graduating, doing digital chip design, so I don't think the job market, at least in my niche, can be quite that bad. The pay is pretty good, well above average for a recent graduate in the low-cost-of-living city I'm based in, and a bit higher than my software engineering friends in the same city. Still considering maybe doing a masters in computer science to expand my career options though.
An EE degree IS awesome. It gives a generalizable mathematical foundation which applies in just about any other field of work. It gives a huge competitive edge. For example:
* Circuits are physical implementations of differential equations, and EE gives a unique way to intuitively think about dynamics, which applies to finance, epidemiology, and just a really diverse range of domains.
* With a rigorous EE background, you can rapidly pick up most domains of engineering (think Elon Musk), since you've got all the mathematical foundations. The reverse isn't true. The way math is taught in EE is broader than e.g. mechanical, civil, or other engineering disciplines, where it tends to be more domain-specific. EE gives you a lot of depth in probability, error analysis, signal processing, controls, calculus, linear algebra, etc. I think the only things missing are statistics and discrete math, and I picked those up elsewhere.
High-performance board-level EE is insanely fun. Incredibly creative. You get to build stuff, design stuff, do math, and it's just a huge diversity of intellectually-fulfilling stuff.
IC design is a bit less fun, due to the many-month turn cycles (develop, wait months and spend hundreds of thousands of dollars, then test/debug), but not bad.
However, the EE industry sucks:
- Pay is not bad, but much worse than other jobs you can get with an EE degree.
- Work culture has all the worst excesses of the eighties -- think of Office Space style cubicle farms, dress codes, conservative management, ISO processes, and paperwork.
- Yet it's somehow adopted some of the worst excesses of the 2010s; it no longer feels like work is a family or a community
- And it's hard to get into. There are virtually no jobs for junior-level EEs (which isn't just BSes -- in the Great Recession, I knew bright newly-minted Stanford/MIT/Caltech/etc. Ph.Ds who couldn't find jobs).
- Even at the senior-level, there's a bit too much of a boom-and-bust cycle, without the booms ever getting that boomy, but the busts being pretty busty.
I spent maybe five years doing EE work after my EE degree, and I think that was enough. I've been out for a long time now. I still do EE as a hobby, and I enjoy it, but the industry culture isn't one I remember with fondness.
I suspect a lot of this stuff will continue to disappear from the US into Asia; that transition is rapidly in progress. US firms maintain specialized knowledge in some areas (e.g. particular types of MEMS devices), but there are plenty of places we've fallen behind. I don't see us on the path to regain leadership. I think some of this is cyclical. Declining industries don't make for good employers, and poor employers don't make for growth industries.
EE is one of those "engineering is really applied physics" disciplines. There's a slant towards standardised EE-specific solutions for PDEs, but it's still much more abstract and mathematical than any field of CS, apart from maybe cryptography and data science.
But career-wise, it's a mediocre choice in most Western countries. (Possible exception is Germany, where engineers have a similar status to doctors and lawyers.)
Most people have no clue what EE even is, or just how much math and engineering goes into building everyday devices and services.
(A friend of the family said "Great! You'll be able to get a job repairing TVs!" when I got my course offer.)
> There's a slant towards standardised EE-specific solutions for PDEs, but it's still much more abstract and mathematical than any field of CS, apart from maybe cryptography and data science.
Here's the thing, though: PDEs are NP-hard. There isn't a generalizable way to model dynamics. On the other hand, dynamics come up everywhere:
- How is the pandemic going to evolve?
- How will incentive structures skew cultures?
- How do I build a suspension for my car?
- How does heat leak from my house?
- How does my understanding evolve with learning?
... and so on.
What EE does -- and I think uniquely -- is give you intuitive, graphical tools to think about differential equations: Laplace, Bode, Nyquist, root locus, and so on.
It also gives a lot of applied experience in using those, including in contexts with nonlinearities. An op amp will clip on both sides, which you model as a linear differential equation (which is easy enough to reason about) plus a memoryless, time-invariant nonlinearity. You squint. You kind of ask yourself how it would work if it /were/ linear and the nonlinearity just cut gain. And at some point, after doing it enough, you have intuition for what it will do.
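Something like this toy sketch is roughly what I mean (my own illustration, not from any real design; the gain, cutoff, and rail values are all made up):

    # A linear, easy-to-reason-about part (a one-pole low-pass "op amp stage")
    # followed by a memoryless nonlinearity (clipping at the supply rails).
    import numpy as np

    fs = 100_000                                  # sample rate, Hz (assumed)
    t = np.arange(0, 0.01, 1 / fs)
    x = 2.0 * np.sin(2 * np.pi * 1_000 * t)       # 1 kHz input, 2 V amplitude

    gain, fc = 3.0, 5_000                         # DC gain and cutoff (made up)
    alpha = 1 / (1 + fs / (2 * np.pi * fc))       # one-pole RC update coefficient
    y_lin = np.zeros_like(x)
    for i in range(1, len(x)):
        y_lin[i] = y_lin[i - 1] + alpha * (gain * x[i] - y_lin[i - 1])

    y = np.clip(y_lin, -5.0, 5.0)                 # rails at +/-5 V clip both sides

    # The "pretend it's linear, the nonlinearity just cuts gain" intuition:
    print(np.max(np.abs(y_lin)) / np.max(np.abs(x)))  # ~3, the linear gain
    print(np.max(np.abs(y)) / np.max(np.abs(x)))      # ~2.5, effective gain once clipped

You squint at the clipped version, treat it as the same linear system with a reduced effective gain, and that approximation is usually good enough to reason with.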
With the EE-specific stuff, I can intuitively reason about these things and think them through to a design.
EE is all about modeling -- building simpler equations which approximate more complex ones in ways which give intuition -- so this is also usually correct or almost correct. Indeed, if you go onto grad level courses in control theory, you'll see formalizations of this intuition, where for example, a time-variant system or a nonlinear system is modeled as a linear time-invariant system, together with a bounded error.
A lot of the mathy stuff -- which I've learned a fair bit of as well -- is in the abstract more general, but in practice gives much less intuition.
My experience with the real world is that there are rarely actual differential equations handed to me. I kinda get that we've set up some pricing structure, or some incentive design, or whatnot, but I can't model it formally. I know which way things push, and whether those integrate or not. I can draw a block diagram and reason about how it will behave, in a way the math side doesn't let me do.
>Germany, where engineers have a similar status to doctors and lawyers
Errr, no they don't. In terms of pay and status Doctors and Lawyers trump Engineers every day of the week in Germany, the only exception being the engineers with PhDs who are tech leads in some well known research institute or big-brand company like Audi or Porsche.
I think you over promote the utility of the formal methods taught in the degree.
Not once did I use Laplace in EE. It was all cheat sheets, or applying data sheets and doing some ad hoc calculations in Excel, or taking a wild guess and iterating.
After twenty years of doing IT related stuff I’ve forgotten how to even differentiate stuff.
1. I did learn them well, and so I did use Laplace quite often in EE.
2. I jumped careers not into IT, but into tracks which leveraged both programming and a mathematical skill set.
I'll mention: I'd be bored out of my wits doing just IT. The intersection of IT and math includes computer graphics, visualizations, robotics, machine learning, fintech, image processing, and a ton of other stuff I find much more fun and fulfilling.
That's not an implicit commentary on your path, by the way, just an explanation of mine. We all have different goals, desires, values, constraints, etc.
Yeah, I have an EE degree but have only worked in software, and I have to say you can bring any kind of mathematics to the job if you have the imagination to find the places where it is an advantage.
I sought areas where I could learn more math and use it to stand out from the crowd, albeit with mixed results, because if your manager can't read your analysis paper he may not be impressed either -- and sometimes the reverse.
If you know a little bit of math, there's no benefit.
If you know enough math to jump to e.g. medical imaging, robotics, controls, simulators, image processing, ML, or similar, there's a ton of benefit.
An EE degree ought to give enough background to get there, although it may involve a year or two of study in a particular domain, and a side project to prove you have the skills.
I think some of it must be 'sticky' culture. I'm another former EE who left the field for software development with much better pay, so the inter-discipline competition does exist in some form. What I saw of EE jobs didn't have the dress codes and comfortable offices.
Same answer as any underpaid field: an oversupply of labor in that field vs demand.
The absolute worst fields are the "sexy" or trendy ones. Unless you are strongly driven to enter a field for non-monetary reasons, always look into employment opportunities.
"The ideal degree if you want to hire someone" is different from "the ideal degree if you want a high salary".
To get a high salary, you need to be in a good negotiating position, such that if company X won't hire you for US$150,000(/year), then company Y will hire you for US$145,000 (a strong BATNA). There are three possible reasons company Y might not hire you for US$145,000:
1. They are badly managed and making irrational decisions.
2. It would be unprofitable for them to hire someone like you for US$145,000; in an engineering position, this is because their revenues minus cost of sales would go up by less than US$145,000, risk-adjusted.
3. They can hire someone else like you for a lower cost, such as US$130,000.
Item #1 can mostly be discounted as a difference across fields; there are badly managed companies in every field, but generally they aren't the ones who hire a lot of people, and they aren't the ones providing your BATNA. However, in the case of programming, company Y might be you and your college roommate setting up Pinboard or Tarsnap, so there is perhaps a relevant difference here.
A thing about items #2 and #3 is that "someone like you" means "like you" from the company's perspective before they hire you. The fact that you can solve hard leetcode puzzles during the interview in ten minutes figures into this, because that's something they can observe before they hire you, unless they go through a recruiter, in which case it doesn't. If you can do a board layout with 166 MHz DDR and it will work right on the first spin with no signal integrity issues, that doesn't figure into "like you", because that takes at least a week, so you can't do it as part of the interviewing process.
The bigger difference, though, about item #2, is that the returns to NRE work in either EE or programming depend on volume. If your EE innovations give them a working board that costs $3.80 to produce instead of the other guy's $4.30, then if they're producing 100 units, you've produced $50 of value for the company. But if they're producing 100,000 units, that same amount of work on your part has produced $50,000. And similarly if the product brings a $0.50 higher price instead of having a $0.50 lower cost. And similarly if we're talking about lowering the cost per user of operating a server farm, or increasing the ad spend per user.
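To make the scaling explicit (a trivial sketch using only the made-up per-unit numbers from this paragraph):

    # The value of the same one-off NRE work scales linearly with unit volume.
    def value_of_design_work(per_unit_saving, volume):
        # per_unit_saving: cost reduction (or price premium) per unit, in dollars
        return per_unit_saving * volume

    saving = 4.30 - 3.80                                       # $0.50 cheaper per board
    print(f"${value_of_design_work(saving, 100):,.0f}")        # $50 at 100 units
    print(f"${value_of_design_work(saving, 100_000):,.0f}")    # $50,000 at 100,000 units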
So why is that a difference if they're both NRE work? Because producing 100,000 units of an electronic device requires a huge amount of up-front capital investment. You can't produce 10 units one day, then 15 units the next day, then 25 units the day after that. So if Company X is investing $3 million in your project and Company Y is investing $0.3 million, the Company X devices are likely going to get produced in ten times higher volume, so every design decision you make is worth ten times as much money.
Now, of course if you're working on Google's backend systems that serve, ultimately, 5 billion people, an 0.1% improvement produces more value than an 0.1% improvement in the systems of a company with only 1 million clients. So FAANG can afford to pay hackers more than the average health plan, ISP, or venture-funded satellite imagery startup. But, even in those cases, the capital investment needed to get a lot of value out of a programmer's work is fairly small, maybe US$10k to US$100k, rather than the US$1M or more that is common for EE projects. This puts programmers in a much better negotiating position.
So, though the two fields have similar diversity of applications, type of work, and intellectual difficulty, electrical and electronic engineers are reduced to begging for scraps from rich employers and then have to sit in bunny suits on factory floors with shitty coffee, while programmers get free massages, incentive stock options, and private offices. Except in countries where companies hire programmers through recruiters.
(Why do they do that? I think it's mostly an industrywide case of #1: companies in England and Australia use recruiters because everybody else uses recruiters, and so the ambitious programmers leave the country, reducing the large advantages obtainable by black-sheep companies who hired directly and could thus hire only competent programmers, to whom they could profitably offer twice as much money. But maybe there are legal reasons or something.)
Same for physics. I earn over 3X what I could make as a top-tier researcher paying myself out of grant money and have had a handsome seven-figure IPO (close to 8 figures, if the market price holds). It's quite shameful, as physics research is almost uniformly more important and relevant to society. Now that I'm set for life, financially, I very well might return to physics.
In America, we value frivolous crap over serious scientific and artistic endeavors, mainly because the masses are terribly dumb. Idiocracy was quite close to the truth, as good satire must be.
I had almost exactly the same career arc. Perl became Python, Sun became Linux and I eventually moved beyond sysadmin work into automation, then API development and now I lead software development teams. Are you still doing sysadmin work?
> I burned 5 years in industry being paid crap and then discovered it was actually a fairly low paying profession in the UK
I was absolutely dumbfounded when I looked up engineering salaries in the UK. I mean, even in the US, I don't consider most engineering fields particularly high paying, but the UK is something else.
In hindsight this may seem wrong but I don’t completely blame him.
The offshoring of IT was so novel, unlike, say, manufactured goods. Governments can't stop it, although I think they should in some form, if you think it's important for your country.
I'm really sorry for the relationship you have with your father.
I've had similar experiences, but not as drastic as yours -- my father didn't allow me to use my bike for YEARS as a punishment for smashing my bike as a 10 year old (faulty chain, impulsive child).
I guess this I would call the real toxic masculinity.
Unfortunately there's nothing much you can do, except build your life around it, keep your boundaries firm and perhaps psychotherapy.
That's a harsh punishment. What kind of effect did it have on you? I remember I biked to school at that age (~10-15 min). If I damaged one of my tools, such as my bike or my shoes, it'd look damaged, and I disliked that. But I had to cope with that. That was a punishment by itself.
I wasn't able to get a job as a system administrator (even though I believe I'd have liked it) because I can't program. I tried various times, and the reason I always quit is that I get frustrated and get migraine symptoms. I tried C, Lua, Python (multiple times), and all I ended up being able to do is some Unix kung-fu and shell scripting, which seemed to be not good enough for system administration. It kind of feels like my failure in life, to be frank. Though I'm happy I ended up with something else, IT-security related.
> I guess this I would call the real toxic masculinity.
HN is the last place I would have expected to see these words.
Which is a slightly interesting take on a parent -- I presume they housed you, fed you, got you educated and relatively safe... I can't possibly imagine you are writing this from a rehab facility or jail. And yet, for denying you a bicycle, that man is deemed "toxic"? And what does a parent taking away a privilege have to do with masculinity? Which one of your human rights was infringed upon, exactly?
>> my father didn't allow me to use my bike for YEARS as a punishment for smashing my bike as a 10 year old
Seriously, being so strict as to punish a 10-year-old kid for years is not a sign of a toxic personality? Parents are supposed to build their kids' character, not stomp on it to make them 'tough'.
Toxic masculinity is not being a manly man. It's being insecure and lashing out at anyone who disagrees with you or is not 'with you' -- any father who thinks their kid misbehaving is somehow a personal attack.
The last bit fits the bill the most in his case -- he took it as a huge personal attack that I smashed the bike he bought, and the tough discipline he was raised in is why I labeled it as toxic masculinity.
Why do you associate this with masculinity though? Yes, your father is man but I'm not sure if his masculinity has anything to do with it?
I recognize the character traits. I know someone raised without a father who also shows them. Perhaps in his case a result of having a tough time financially their entire youth? This put a huge value on stuff (because almost everything is irreplaceable, literally, and there was hardly any time to fix things), leaving little room for error and no reasonable room for breaking stuff, as children do?
As I am getting older I really realize that I do things like my parents did them. It is really difficult to just decide to do it differently because of the emotions involved. It is difficult to be rational the whole time, the monkey brain sometimes leads. What is important is to be able to admit your mistakes and being able to apologize. For all my flaws I hope to show my kids at least that there is no shame in this.
I'm sometimes harsh on the children only to later apologize and tell them I actually do believe they should be able to make mistakes and that whatever they broke can be fixed again, and if not, no lives were lost (ideally).
I wonder why I get so down-voted, perhaps my non-native speaker brain has a wrong understanding of the term "masculinity"? In this context I take it to mean that being unreasonably angry at kids that wreck stuff is a typical thing for men, and that women don't suffer from this? Is this wrong?
Denying a child his bike for YEARS -- and I mean years gathering dust in the shed -- is toxic by my definition. It's obviously not the single thing my father did that I would label an example of toxic masculinity; if it were, I'd just call it "odd". My dad also had enough good qualities that I don't label him as abusive. In my mind this is an example of over-the-top discipline plus the other cultural stuff that I would call toxic masculinity.
I am certainly grateful to my father for all the material support he provided, and he did provide a lot and still insists he does, even though I constantly remind him I'm an adult and quite well off myself.
You don't need to be a drunkard to drink alcohol, and you don't need to be permanently toxic to exhibit toxic masculinity. I dislike the term (because people immediately get defensive), but the concept is definitely useful and describes a real phenomenon.
It's not about masculinity being inherently toxic, it's about toxic expectations men are expected to fulfill by the society.
And you don't have to have your human rights infringed, for example something as simple as saying "you are useless" is toxic, yet it doesn't infringe on any rights. Not everything that's legal is good.
I have a lot of respect for your perseverance; it was the right thing to do. Make your own mistakes -- making someone else's is just so... painful. And it makes you feel weak.
Similarly someone once advised me not to do a Masters in Molecular Biology because the market was screaming for Bachelors (back then in the Netherlands a Bachelor meant working in a production lab or something). I didn't listen and this also turned around in about 3-4 years and I love how my career went (and is going).
I had a similar (but much less intense) situation with my Father in law by the way. He has some experience in "home improvement". I like to read and watch YouTube before starting any activity. So we had a big discussion/argument on how we would re-do our ceilings. In the end I said: I prefer to do it my way and mess it up than to do it yours and regret that. He agreed, we had a fun time working together after that. He learned a thing or two and now he consults me on many things he does around the house.
"I honestly don't know which way is right. In case something _does_ go wrong, though, I'd rather have only myself to hold it against because I made the wrong decision than also hold it against you and let that make its way into our relationship."
I think this just made its way into my "life rules for dealing with family and close friends."
It's up there with "give money freely, but never lend it," and, "don't hire a friend to perform work for you on a deadline."
It's for sure something I try to keep in mind raising my children now. It's a form of admitting that you never have 100% of the facts + knowing that making mistakes is very good for learning. You can definitely learn things from how even the youngest of kids solve problems.
This of course does not mean you should let them die screaming "yolo!", there is a balance to be found, but having this mantra keeps the balance a bit more on the "open mind" side of things.
Kudos for sticking to it, I'm not sure I could have done that in your place! I'm always horrified at stories of parents who act this way. "Do it because I say so, I'm infallible as far as you are concerned, if you disobey me I'll hurt you."
I was not aware of how much damage he did until my mid 20's and it led to about 2 years in therapy to work through it all. This story is one of countless stories where he has acted out of line. A benefit of my career is that I had enough capital to pay out of pocket for a very effective therapist.
I ended up gaining a far better understanding of who he is as a person, and perhaps even why he is the way he is. This has allowed me to come to terms with his actions enough to forgive and forget (for the most part) and move forward with my life. I still call him once a year on Father's Day, an attempt to extend the proverbial olive branch, but he never picks up.
All in all my life goes on and I am happy and healthy. I am extremely grateful for the community I live in, the fantastic friends I have surrounded myself with and (most importantly) my incredible wife who helped me through a lot of the pain over the years. I wouldn't be where I am today were it not for her.
I'm not trying to put everyone on the same level as your situation, but many (most, I might guess?) have had parental situations that are less than good. Sometimes for a short period, sometimes for the entire childhood.
My childhood wasn't bad (as in horrific) but there were enough ... episodes that stuck with me for a long time, and sharing with friends (both at the time and afterwards) raised a lot of eyebrows (mostly saying "that's really not normal/acceptable").
"This has allowed me to come to terms with his actions enough to forgive and forget (for the most part) and move forward with my life."
I'm glad you got there. I've come to a similar place (without a lot of therapy, but over a longer period of time). My parents did the best they could, and on looking back, they were in a situation they didn't want to be in (kids at a young age, money problems, etc). With enough time and distance, and knowing them as an adult, they're generally OK people. They did the best they could, they just didn't know very much at the time, and ... they got a lot of bad advice (imo) during a pre-internet time when it wasn't fast/easy to get a lot of information. I'm still rather fortunate, in that they're both overall good people (just... were not prepared for parenthood) - many folks are bad at parenting and ... are just overall not great people either. I have a sibling that is still struggling to come to terms with some of this, and is still searching for 'answers' to things that I don't really think have 'answers'.
And yes, a good spouse can really help balance you out, give you some grounding and perspective. You can get some of that through therapy as well, I'm sure, but having a spouse with you is a different sort of grounding and perspective.
"If you pursue this career, you'll live under a bridge in 5 years, so I'm kicking you out and giving you no money, so you'll live under a bridge right now."
edit: kudos to you for sticking to it really! You should be proud of yourself!
I know you made the point in jest but you’re right. To be adaptable you need to embrace change and opinion is the hardest to change. I’ve seen people die holding an opinion which has been thoroughly disproven. Embrace change!
To be fair, 2002-2003 would have been a pretty rough time to graduate as a developer. The industry hadn’t yet recovered from the whole dot com thing and there were still a million “converted business major” developers on the market who hadn’t yet got the hint that they should go to law school.
The advice you got might not have applied to you on the way in. But it certainly applied to all the students those professors had just seen graduate. It probably took them a few years to notice the tide coming back in.
Incidentally, the jobs had all been “going to India” for a long time by then. I’ve personally heard about them going there for at least 30 years now and I don’t doubt they will continue to do so in the future. At this point I’m not overly convinced they will all make it.
> The industry hadn’t yet recovered from the whole dot com thing
You are completely right about that, and it slipped my mind. Either way, it was kind of nice how easy it was to get a job back then. Getting my foot in the door was a little difficult actually, but once I had something on my résumé I got an interview for basically every job I applied to.
My résumé is much more impressive these days, and it may be in part ageism or higher salary expectations, but it's much rarer to hear back from a job application.
It's ageism. Sorry for being so blunt, but I know too many people who were just uninteresting to human resources after hitting 45-50. I know more than a few homeless people in my life, and over half were once programmers. It makes me sick. I do know this: if you are in tech, save up that money.
(I have never understood this industry's way of hiring. Long drawn-out interviews, and candidates who are just average getting jobs.
Why not hire a guy for two weeks? See how they do. And of course, be upfront. Tell them this is probationary. You don't send a guy to relocate, just to be let go in a few weeks.
There are some really great older workers that some company will scoop up when hiring is upgraded to this decade. Sorry about the sentence fragments. Too lazy to fix.)
As someone who's been in charge of recruitment, I've made an effort to hire people who are less likely to get a chance at other places (minorities, older candidates).
One thing that stood out to me though is that a lot of the older candidates I interviewed that had been struggling had deep knowledge of a technology that's no longer used and had had very little interest during their career in learning new things outside their job. So once the thing they were expert in was no longer fashionable, they had a hard time catching back up.
I know it's unfashionable to say this, and I know that a lot of people have full-time jobs and don't want side projects on top of their full-time job, but if you want to keep a career as a software developer past 40 without going into management then you need to do side projects, you need to keep up to date with some of the latest fads (even if some of those fads are cyclical and recycle concepts from 20 years ago). Doing a job well and becoming an expert in an obsolete technology is no path to career growth.
> … "if you want to keep a career as a software developer past 40 without going into management then you need to do side projects,you need to keep up to date with some of the latest fad (even if some of those fads are cyclical and recycle concepts from 20 years ago). Doing a job well and becoming an expert in an obsolete technology is no path to career growth."
The sad thing about this is that some of us (like me, for one example) have known this since the days before personal computers were even a thing, but it doesn't really help to stay on top of modern tech when all the people in charge of hiring in the tech industry are twenty-somethings who automatically, instantly hate anyone with even a single gray hair or visible wrinkle, and instantly dismiss a lifetime of knowledge based entirely on ageism.
Even sadder is that the "decision makers" in many tech jobs still, to this day, actively ignore good advice that people like me have been giving since the early days re: things like network security being more important than they want to admit to themselves (just to name a simple and obvious example) -- and then, when their ignorance of such issues comes back to bite them in the ass, they always scapegoat the "new guy" somehow, and out he goes.
Yes, I should have elaborated. The two best engineers on our team are in their mid 40s and our QA lead is in his late 50s. So, I'm well aware that even great candidates who are really good at what they do are passed over just because of age.
But I did want to explain the typical issue I have seen when interviewing older software engineers that had a long career but shot themselves in the foot by being too tied to a specific technology. At no point did I want to say that it was the case of every engineer I interviewed. And I fully agree that it's often tough dealing with the ageism in this industry.
A track record of being right is useless for having your advice be seen as "good advice".
It takes some politics, and interpersonal trust to get your advice taken seriously. Sadly, this also means looking past the technical details and considering organizationally why certain decisions need to be made. The company might prefer the occasional breach and "we are sorry" PR over spending a lot of money on network security, for example.
> "The company might prefer the occasional breach and "we are sorry" PR over spending a lot of money on network security, for example."
Sure, but what about when the choice presented is more along the lines of "Hey, let's spend a little money on network security now so that we don't have to spend millions later cleaning up a huge mess", and is met with the mentality of "You're worrying about nothing." Ummm, no? I'm sharing the expertise for which I was hired?
This is true for anyone. I've interviewed young people with 5 years experience only to find out they had 5 years experience doing the same thing over and over that they learned in the first 3-6 months on the job.
While the pace of change has slowed down in some software areas in my last 20+ years of working, I never saw it as a field where a person could learn something in school and simply apply it for an entire career. For better or worse, it requires learning outside ones job, or at the very least, constantly moving jobs to keep skillset building.
For me, this has been fine because I've always loved tinkering on the computer. Learning new tech is something I've always done as a hobby as far back as I can remember.
Those going in only thinking programming is an easy way to make money, may struggle later in their career. Though, this is less of the case now as there is so much more older code out there to be maintained.
I just want to add that technology is political. The directions specific entities push for become de facto standards. I have heard and seen many technical hiring staff express the idea that some technology or practice is wrong or obsolete when they are simply unaware of either the full picture or how the underlying tech works. There was a Medium article recently that suggested developer competence be measured by a number of standards that are quite obviously the result of the author's misperception of how web browsers, JavaScript, modern front-end coding idioms, and modern front-end build toolchains work. What was worse was the condescending, disdainful tone the author took while they clearly had no idea what they were talking about.
Developers who understand the fundamentals of a particular area may not be ignorant of so-called developments in that area, but rather may be able to see where trends and modern approaches fall short. I know a Swiss developer who has never used React in his life, but he coded up a rough approximation in an hour in front of me. He has his own tool belt, naturally. Would he be a good front-end factory grunt? No. Smart companies use his services to design them tools.
Closing in on some ageism inflection points, I find I have the opposite problem: everything is still interesting! Maybe more so: things I thought were boring at 20 are things I now know more about the relevance of. And the thing that's terrifying is knowing I won't have time to learn it all.
Paradoxically, this does have the side effect that I'm more skeptical of learning some new way of doing the same thing, preferring to prioritize new learning for actual new capabilities vs. different arrangements of the same deck chairs. I see that behind some of the inertia in my experienced cohort -- just-in-time can be a reasonable approach to picking up specific tech.
OTOH, I've certainly worked with people who are aggressively disinterested in picking up anything new, even when offered the chance to do so on the job, and that's certainly no fun to work with. Or manage.
And that's the reason I would stop my younger self from doing software.
Compared to other careers, after, say, 15 years of experience you are not in a well-established, high-paying position like a lawyer or an engineer (the actual kind :) ).
After 15 years you are an out-of-date dev who didn't find the time to do their job, manage family and friends, and keep side projects going to stay up to date with technology.
This is a bit overdramatic, but SW devs are expected to ride the wave all the way through their careers, while other career paths offer stability and the protectionism of well-established fields.
What about the half million total comp and the part where you can retire in your thirties? If anything, I bet younger me would do even better than I did if he were to start today.
Sure. I bet those vocal people would be happy to explain in detail what they did to get those jobs.
It’s not easy, and you may need to actually go to the place where those jobs are. But each of those companies will hire thousands of people at those compensation level in the next year. And the people they hire were in no way exceptional (or different to you) before they started down the path to getting where they are now.
No need to buy "their book on how to get those jobs" or anything like that.
To get the interview, you just gotta have some years of experience, some side projects, and that's about it. If you went to a top-tier school or something like that, you don't even need the years of experience. Getting an interview at those companies is the easy part; pretty much everyone I went to college with got an interview with one of those companies before graduating (and I didn't go to some super well-known Ivy League school, it was just a pretty good public school in Georgia). The issue was that most people didn't pass those interviews.
To pass the interview, read up on systems design, read up on algorithms, practice some leetcode/hackerrank/etc., and you are good to go. All of those resources can be easily obtained for free. There is no secret or trick to any of that.
You mentioned "fashion" twice. I think as an industry we need to re-evaluate why it is so important for candidates to be "fashionable" compared to being smart, reliable, experienced, or efficient. Why is a new grad with 1-2 years of experience and only knows React so much more successful finding work than someone with 20 years of experience programming in eight different languages and in companies ranging from a 10 person startup to a 10K person mega-company?
You are absolutely right. Would you like your brain surgery to be done by a young kid straight out of college or a 20-year veteran who has experienced and seen it all before?
I have had the same experience when hiring people. However the problem was the young developers not the experienced ones. Very few young developers I interviewed showed any interest in learning anything new. They just wanted to find a job that matched the limited tools they had. The best developers I have ever hired were seniors.
Cue resume-driven development, and people asking "why do we hype so much that new worse-than-useless tool that is costing so much of people's time and reducing our software quality?"
In the Netherlands, permanent employment contracts are much harder for companies to break than in the US, so they often hire new people as consultants or temporary employees for a few months, then offer them a permanent contract if they work out.
It worked out well for me at TomTom: I kicked ass during the 3 month probationary period as a contractor, then when they decided to offer me a full time contract, I was in a very good position to negotiate a salary (and even a hiring bonus), much better than if they were hiring me cold out of the blue.
No, they don't. You'll generally be paid less and have fewer benefits. It truly only serves as a "pay your dues" period.
GP also doesn't mention that having a perm contract on its own grants an individual benefits you wouldn't otherwise have. Invert this, and it means a perm contract becomes a requirement for reaching a lot of milestones in life. The most notable are mortgages and even free-market rent.
> Why not hire a guy for two weeks? See how they do. And of course, be upfront. Tell them this is probationary. You don't send a guy to relocate, just to be let go in a few weeks.
By doing so, that cuts out anybody who currently has a job. Looking for a new job is much less stressful when your fallback option is "keep working at the current job and continue the search". It also gives you a much stronger negotiating position, since you don't have the deadline of bankruptcy hanging over your head.
If I heard that a company wanted me to work for two weeks on a probationary manner before deciding, that would be a very hard pass. That's long enough that it couldn't really be done without leaving a current job first. Best-case scenario, the transition becomes immensely more stressful for no good reason. More likely scenario, the hiring company decides to play hardball after the two weeks, and I'm over a barrel because I already left the safety net.
I graduated in 2003; did volunteering for 9 months and then got a very low-paid job in a startup in the Midlands in the UK.
The assumption I was given was that almost all the thinking and innovation had been done, and it was just a matter of working your way up the Java coder/senior/architect hierarchy over time.
I feel a bit cheated in retrospect - almost all other times the industry's felt amazingly exciting/fun. At that time it was a bit deflated. Luckily I just wanted to work with computers, so at the time I wasn't bothered.
I graduated in 2002. My peers had been boasting about how we wouldn't even entertain offers below $75k when we graduated in 2000. There was the general .com bust, but I was also literally working on my resume the morning of 9/11 which essentially froze up the market entirely.
Most offers which were "locked up" were rescinded. I was fairly fortunate that the small software consulting firm I had interned at landed a new contract and made me an offer -- for $37k. I swallowed my pride; the market was absolutely flooded with Java devs with a few years' experience that I was competing against. It seemed far better to be working and building a resume, even if for a non-living wage.
I received only one hit on my resume my entire first year working, despite sending it out to multiple places each week.
After about a year, things picked up a bit and I landed a job with a group that was impressed by my distributed systems undergraduate research work and curiosity and was building what would become known as a high frequency trading system. My salary was still lower than my 2000 self thought acceptable, but at 55k I could at least afford to move out of my parents house.
Things took off from there a bit, but it was a rough start. And it's a post 2010 thing that engineers can retire in their 30s. My expectation was that this would lead to a comfortable middle class lifestyle but the eye popping salaries of today were beyond the expectations of any of us. Tbh, when one of my coworkers left to go to Facebook in 2007, I looked at it with disdain- a website? How disappointing... I imagine his net worth is deep in the 8 figures- he is still there.
Oh yeah, 2001! Six months and dozens of applications to get a job offer. I would have done hundreds of applications had there been that many to apply to.
Low points included being rejected from a job maintaining Access databases in a small town, being rejected from another job because I hadn't suggested manually unrolling a loop as an alternative solution to a problem and I said "oh yeah, you could do that I guess" (feedback: bad communicator, I withhold information), and a 2-day hoop-jumping affair for a dev job involving fake meetings where you had to make up bullshit.
Fuck graduating 2001. Also probably heavily affected my lifetime earnings as I started on £18k.
Just an FYI to the Brits on this post - I've found US startups are happy to hire remote engineers from the UK. Might take a little bit of effort but it's surprisingly possible to get a six figure job quickly.
Make a UK company and they pay that to you as a contractor; you sort it out with HMRC yourself.
The Americans seem to be much higher paying, greater variety of roles, happy to let you remote.
Now is the time, by the way -- I was blown away by how hot the market is. Maybe jump in there before WFH starts to get retracted; you can do more interviews before you're back in the office.
Yeah, America seems to have the nice combo of reasonably priced places to live and high tech salaries. Since "remote, but US" is common, I don't see why you can't take your $150k, live off a quarter of that somewhere, and save most of it!
I recently got an offer that was $175k base, full remote, and felt comfortable enough declining it (though most of that is because I'm optimizing for liquid total compensation, so…). $200k liquid total compensation is still mediocre for a mid-level engineer, but at least I don't pay London real estate prices.
It's because some areas where you can get a house for $1,500 per month instead of $4,000 come with caveats like neighbors with guns and hellish weather
Isn't it deadly ironic that the country that has such terrible health care is also the same country where people get shot all the time?
I love it that San Jose is going to require gun owners to carry liability insurance if they want to carry weapons. It's about time they started paying for the health care and rehabilitation and funerals of all the people they shoot, and stopped whining that they're victims of government oppression, when they're perfectly content with everyone paying for licensing and registration and insurance of cars.
>San Jose to Require Gun Owners to Carry Liability Insurance
>San Jose officials have passed the first law in the country that requires gun owners to carry liability insurance and pay a fee to cover taxpayers’ costs associated with gun violence.
Bearing in mind that £30k a year goes a lot further given that there's no need for health insurance, and student loans are effectively just a 6% tax on earnings above £27k until either your loan is repaid or it's been 30 years since graduating.
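As a rough sketch of that framing, taking the 6% rate and £27k threshold exactly as stated above (not checked against the actual repayment rules):

    # "Effectively a 6% tax on earnings above 27k" -- figures from the comment above.
    def annual_student_loan_repayment(salary, threshold=27_000, rate=0.06):
        return max(0.0, salary - threshold) * rate

    print(annual_student_loan_repayment(30_000))   # ~£180 a year, roughly £15/month
    print(annual_student_loan_repayment(45_000))   # ~£1,080 a year, roughly £90/month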
While it's true that UK salaries are quite a bit lower than other countries if I'm honest if I earned £150k a year the only difference in how I live my life would be that I'd be driving a Tesla. 30k is more than enough to live comfortably on, even in areas with a fairly high cost of living.
Health insurance is generally covered by your company as a benefit, and if it’s not when you’re in your early 20s it’s generally only a few hundred dollars a month. On a salary of $150k it’s pretty insignificant. Average student loan debt in the US for a bachelors degree is $30k (too high of course) but when paid out over decades is also insignificant against $150k.
I’m not making a value judgement about whether health insurance or student loans should exist but just pointing out that your comment seems like a false rationalization about why a much lower salary in a different country is okay.
Outside London, sure, but a London one-bed apartment 40 minutes away from the city will run you ~£1,300/month; that's £18,000 a year with council tax and bills.
I didn’t have student loans because state schools are cheap. I pay like $75 out of pocket for my health insurance per month and most of that is my HSA contribution.
For what it’s worth I can’t afford a Tesla either making $200k now and live in a 450sqft studio.
>which is on the upper end of graduate starting salaries even for today
Do you have a source for this? Anecdotally, my impression is that graduates in any degree can reach £30k with reasonable effort. Specifically for graduates in tech, the top end is something like £100k nowadays, and the average would be more like £45k.
I don't beat myself up. I had 3 offers. 2 for 18k and one for 17k for the government. The fact I had 3 offers but nothing before that for 6 months was probably an artifact of getting better at interviewing. The job I eventually got I was accused of being "too polished".
The highest starting salary I applied for was 25k, and the highest I heard anyone get was 40k, but working on a trading desk for 12 hours a day.
>6 months and dozens of applications to get a job offer.
I honestly can't tell if this is supposed to be good or bad. Graduated in 2020 and spent over a year looking for work with several hundred applications sent. Ended up enrolling in a masters because of it, only got a job last month.
Back in those days, developers were still making most of the hiring decisions, so with a sane-looking resume you could get an interview, and if you could pass for knowing programming you could get a job (apart from the post-dotcom bust and 2008, when firms just didn't hire).
At some point during the 20 years since, orgs decided that HR was to handle hiring. HR tries to add value by holding out for the BEST candidate in their mind: nice-to-haves are promoted to requirements, anyone over 55 is culled (because they supposedly would not stay long enough, even while younger ones change jobs within 2 years), as is anyone with less than 5 years of experience (unless, possibly, you're below 25).
Then orgs go around complaining that it's hard to recruit people. Go figure.
I thought it was bad, since most people applied for jobs while studying (the big-companies-that-come-to-the-uni type of thing) and had jobs lined up. I decided not to disrupt my studies with this, so I waited until I graduated to look for work.
My boss loves to play with the idea of outsourcing, but he seems to be aware of the issues to some degree. In our situation, the idea is pretty hilarious - we as developers have to make sure our products don't become unmaintainable, product management doesn't like to think things through when specifying new features, and we already don't get time for refactoring. I think outsourcing (especially when done to save money) requires pretty fantastic product management and my intuition is that it's fundamentally incompatible with a codebase that is growing along with the company.
Programming used to be getting a computer to do something.
Now it's refining the idea someone has until it's code. People think they know what they want, but have no clue about the masses of details required. If the person with the idea and the programmer are far apart (in space, time, or culturally), it's pointless.
Most ideas don't require 100s of programmers, so the cost saving is minimal
> Now it's refining the idea someone has until it's code.
This is one of the most succinct ways I've seen this said. Thank you.
For the most part the days of handing a programmer a spec who then codes it up are gone. For many problems, programming itself is the easy part. Figuring out what to program, and the complete solution is where the majority of the work resides.
> the decision makers that are convinced GUI data engineering tools are the future.
The same train of thought leads to the idea that you can produce good software with mediocre developers by using processes, coding rules, checklists and a "development methodology" like agile.
To be fair, Agile is not about leveraging mediocre developers - it's about enabling a handful of your best, most experienced hackers and architects (the fabled "two-pizzas team") to work under the best possible conditions, because the proponents of Agile had internalized the lesson that trying to scale up large teams of mediocre devs is a fool's errand.
Agile is what managers make of it. To A LOT OF THEM it is about not doing any planning, writing zero specifications, using vague language for requirements, comparing the performance of apples and oranges, "self-help"/motivational talk, justification to pressure those below and not bother those above, and so on and on.
But to most, it is just putting lipstick on the same old pig.
Did I mention the whole industry of training, coaches and lingo flinging?
Agile in its original form - yes. Many everyday biz versions of agile et al - no.
Fundamentally, dev is something to negotiate with, cooperatively, versus commanding "do what I say"-style. Many businesses are unable or unwilling to do away with this top-down hierarchy, hence agile will never succeed there.
The agile they practise instead is "Agile, but without actually relinquishing top-down control".
TBH, I think what we are really missing is for the development team to be a contract team (à la Accenture) that can implement agile via official mediums, e.g. pushback via contractual negotiation, backlog prioritisation via contract pricing ("we can do that, but this is what it'll cost").
The missing piece is people forget that the bar for software also rises with the ease of creation.
We do have very capable drag and drop CRUD generators now (not quite the same, but I still have a soft spot for VB5). But, what's expected, even out of a simple CRUD form, has risen.
I experienced the exact same thing (graduating in 2005) to the extent that it dissuaded me from getting a CS degree at all. Outsourcing was supposed to be the end of programming jobs in America.
My career took a 6 year detour until I finally made my way to programming professionally. I wish I’d had your conviction - the jobs I did in between weren’t half as fun and my career started that much later.
Interesting. I was just starting CS course at university in 2003 in Poland and it was obvious to everybody it's the future and there will be enough jobs for everybody.
There were 20 candidates for each place in the course so you had to pass the entry exams really well (and it wasn't the best university in Poland, far from it).
I guess that's the difference between being the outsourcing source and destination :)
> I guess that's the difference between being the outsourcing source and destination :)
Personally I'd agree that this is good for individual employment prospects, but not necessarily good for the country itself.
For example, here in Latvia I've heard that there's a shortage of developers and hence many government projects can't be fully staffed easily, because they're having problems being competitive on wages compared to outsourcing companies like Accenture.
Of course, there's no reason why any developer should work against their own interests when it comes to saving and earning money, but it does seem interesting - maybe I'm wrong and it's just a regional thing?
Government development jobs just don't pay enough to be competitive for talent, so instead talent goes to outsourcers. Here in the Netherlands that has meant most big government development projects are done by outsourcers, which leads to bad incentive alignment: cost overruns, projects that are DOA, and projects with way too large a scope. And at the bottom line, the consultancies are better off keeping it this way than improving things.
But government is not agile enough to pay fair wage for development work. Besides, it would be unfair to all other government employees if these developers made so much more money with less time on the job and the same level of education. (sarcasm)
Government employee in Canada here. The pay itself is not necessarily the best, but top notch benefits, generous vacation (maybe a moot point in Netherlands but definitely not here), a rock solid pension make it a lot closer than the salary number alone.
There's also the fact that the number of hours worked is always fair, and that you can move within the public service very easily.
How many big projects are handled mostly "in house" by the government in Canada? And for the outsourced ones, how much input / leadership do the internal government people have?
Because it is largely the outsourced big projects where our Government fails.
The situation is not very different here: the outsourced projects often fail, and lots of the big ones are outsourced. Definitely not a good use of public funds (my opinion), especially since the funds spent for such projects are often stratospheric. Also, the amount of talent in government is not THAT small, so often these decisions seem very political.
Exactly. Even in Malta, most government IT projects are partially outsourced to local or international companies for many of those reasons. Developers don't usually want to work for the government at the start of their career.
I think for almost all countries the taxes paid by developers working for foreign companies outweigh the higher cost of hiring some of them to build software for the government.
Let's say government hires 1% of the developers in the country, foreign companies pay them 3 times more and if not for outsourcing 50% of these people would have to emigrate.
If X is base salary ignoring outsourcing then the country would earn
0.5 * X * tax_rate - 0.01 * X without outsourcing and
3 * X * tax_rate - 0.01 * 3 * X with outsourcing
So the change is (3 - 0.5) * X * tax_rate - 0.01 * X * (3-1) =
2.5 * X * tax_rate - 0.02 * X = X * (2.5 * tax_rate - 0.02)
So in this simplistic example for the country to benefit from outsourcing it suffices that tax_rate > 0.008 which is pretty much always true.
Let's say government hires 10% of developers, nobody would migrate and the salary difference would be 500%. Even with such unrealistic assumptions the tax_rate would just need to be higher than 10% for the country to benefit from outsourcing.
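If it helps to see the arithmetic in one place, here is the same toy model as a few lines of Python (purely a sketch: the 1% government share, the 3x pay multiplier and the 50% emigration figure are the illustrative assumptions from above, not real data):

    # Toy model from the comment above. X is the base salary,
    # gov_share is the fraction of developers the government hires,
    # pay_mult is how much more foreign companies pay, and
    # stay_share is the fraction who stay if there is no outsourcing.
    def gain_from_outsourcing(tax_rate, X=1.0, gov_share=0.01,
                              pay_mult=3.0, stay_share=0.5):
        without = stay_share * X * tax_rate - gov_share * X
        with_outsourcing = pay_mult * X * tax_rate - gov_share * pay_mult * X
        return with_outsourcing - without

    print(gain_from_outsourcing(tax_rate=0.008))  # ~0, the break-even point
    print(gain_from_outsourcing(tax_rate=0.20))   # clearly positive
    # Second scenario: 10% hired by government, nobody emigrates, 5x pay.
    print(gain_from_outsourcing(tax_rate=0.10, gov_share=0.1,
                                pay_mult=5.0, stay_share=1.0))  # ~0 again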
Depends how you define 'good for the country'. Upwards pressure on skilled wages due to increased demand rather than brain drain sounds like a decent place to be.
I remember a national best seller around that time was "The World is Flat" [1]. It talked about globalisation and made convincing arguments about outsourcing. Maybe your advisors were influenced by it.
Plus, there's also the "pork cycle" (https://en.wikipedia.org/wiki/Pork_cycle) aspect to consider: few CS graduates -> better employment chances/higher salaries for developers -> more people study CS -> (after a few years) more people graduate (if you're unlucky, coinciding with a recession) -> worse employment chances -> fewer people study CS and so on...
The failure of software outsourcing is really quite interesting, especially since telework is working decently well. That seems to eliminate the “remote work doesn’t work” objection.
It was not for lack of trying. The management class pushed it very hard. There are still many firms trying but it’s not really panning out.
From what I have seen the answer is that good software engineers are globally scarce and the good ones in India or Eastern Europe make enough that outsourcing isn’t that profitable once all overhead is included.
“Insourcing” is becoming a bit of a thing. You can get some salary benefits outsourcing to lower cost domestic places like Nebraska or Kentucky and not dealing with language, time zone, international currency exchange, legal differences, or other headaches. These folks can also travel to your office for an in person meeting in hours not days. Oh and their power grid and broadband are reliable too.
The jobs that have been most successfully outsourced are low to moderate skill manual jobs where basically what is needed is huge numbers of cheap hands.
Also a lot of the good software engineers in India end up relocating elsewhere, like the US, to get better pay. I've worked with a lot of really talented Indian software engineers, and some really terrible Indian developers (I hesitate to call them engineers).
All of the talented ones had relocated and were living in the US. I'm sure there are some talented software engineers that stay in India also, I just never end up working with them (probably because the companies I worked for were trying to save a few bucks by outsourcing to begin with, and went with the lowest bidder).
That’s strange. I started studying in 2005 and had been planning to go into Comp. Sci. for at least 2 years, yet I never heard a similar sentiment expressed. Germany.
Same here (2005), but in Austria. I think Europe still had to catch up to the US‘ pre-2000 level of „digitalisation“ such that the bubble burst didn’t do that much to our industries.
Good point. I guess the language barrier was another issue, with English obviously being more popular than German, so "Indians in India taking our IT jobs" wasn’t really a thing.
Except that Germany and Austria have an obsession with outsourcing, or more accurately nearshoring, everything to Poland, Romania, Hungary, Ukraine, the Czech Republic, etc., so dev wages there can't hold a candle to the ones in the US or even the UK, and the only path to decent pay is going into management.
I started university in 2000 in Spain, and I remember that at that particular point in time studying computer science wasn't seen as desirable. When people had that interest, parents typically pushed their children to study telecommunications engineering instead. It was the fad at the time: they were all employed, had a strong lobby and made a lot of money.
It didn't last long. By the time I graduated from computer science, everyone in computer science had a decent job (not paying a lot, but enough for a living) while telecommunications engineers were out of work in their specific field and mostly taking second picks in the CS job market. And to my knowledge, it has been mostly like that since then.
It really was the prevailing wisdom of the time. I entered the job market in 92, a few months after CERN released their browser. I knew what was coming when I saw the web for the first time, but the only jobs at the time were custom desktop software for businesses. It was the same after the .com bust, which is right around where you picked up. 04 and 05 were really bad years to be in the industry, as the only jobs left were internal corporate software gigs. Which meant most of the employment opportunities were not out in CA, but rather in sporadic areas around the country.
As well, there were waves of offshoring, so again the common wisdom was do not go into computing. With that being said, I had the fortune of working with a company that was early to mobile, and was working with Palm on mobile web apps around 2000, so when the crash came, though I lost that job due to the company closing, I knew mobile was right around the corner and went back to grinding out internal corporate software for a few years.
The offshore projects started to fail to deliver and then Steve came back and launched the iPhone. My experience is that software is a boom-bust economy. It's been more robust this run, and everyone learned that outsourcing was not the win-win they thought it was going to be. In the meantime software ate the world, so there are not enough hands to do the work that is out there and, to your point, a whole generation was discouraged from entering the market. If there is any market with a bad track record of employment forecasts, software dev has to be top of the heap.
I went to uni in 2004 and had always had a crush on civil engineering. A few days before signing up, my dad told me: "I feel construction is stalling, I think you should go for Electrical Engineering." Best advice ever. By the time I finished, construction was indeed dead and there were too many programming jobs.
Similarly to other users here from Europe, in 2010 the market was actually starting to take off. My dad was such a visionary... or I just had a total lack of it!
I graduated Electronic Eng in the UK in 1998, my best friend did Civil. I went into computers, he could only find low paid jobs in the UK so went to Hong Kong where construction was booming and now owns a significant part of a multimillion dollar company with hundreds of employees. I haven't really followed the money at all, so I'm not too sad.
Weird, around that time (a few years earlier actually) I was warned by the IT school we went to that they might try and pick us up to work for them before graduation. Didn't work out like that though, and plenty of my classmates couldn't find a job in IT and had to go for something else or move on to college.
I went on to college anyway because I wanted to do software development. Most of my classmates that joined didn't make it, or needed more time. A friend of mine ended up working in garbage collection and construction (remote crane operator) for a while, but he landed a job at a shipyard, and now a very cushy one at a generic IT supplier, traveling the country sometimes to go to jobs.
I had the exact same thing when I arrived at university in 2004.
There was a big Compsci building that had been completed a year or two before I arrived (started being built before the dotcom crash). It was scoped for the massive growth in students that they were expecting in computing.
By the time I arrived the thing was practically a ghost town. Student enrollment had cratered. I remember the huge fields of desktop computers available for students on multiple floors, and it was basically empty the whole time.
Actually, I was trying to find out the number of programmers worldwide the other day, and many US-based analyses were saying the same thing: "programmer jobs in the US will drop in the next 10 years as jobs move abroad".
For example [1], but many more with the same gist.
The BLS statistics have US-based “programmer” jobs declining by 9% but “software developer” jobs expanding by 23% on a higher base. The job description is largely the same, although software developers have more design responsibility.
Yes, and this thinking still pervades the engineers who came of age in that era and did avoid programming. One of my coworkers is obsessed with outsourcing and thinks of any programming task as something that should be done in India. He asks why I should do something when we could hire a team in India instead, which is a ridiculous thing to say for so many reasons.
I went to university a couple years after that and also felt a similar sentiment that it was basically too late to get in on a good career before the industry got outsourced.
Of course then social media happened and everyone and their grandma needed developers so it didn't matter that people were flooding in because there simply wasn't enough labor to go around.
I think there are also qualitative reasons why outsourcing didn't prove as successful as many companies thought.
One being, that business requirements are often complex, and contain ambiguity, which requires communication between developers and the business, which is hampered by differences in timezones and culture.
My biggest issue with education is the whole "study this, get a job doing that" idea. It's plainly wrong, and it sucks in people who drink the koolaid and think there's an easy route to work.
Very little is learned in college that is necessary for general employment. By general I mean just about everything that isn't licensed, like being a doctor.
What college does is it throws a bunch of hard but irrelevant subjects at you that you slog through, so that you learn how to learn stuff. The best demonstration of this is in subjects that are scientific or mathematical, because there are very deep trees of knowledge there that you have to trace, and there are relatively few ways to use common sense and cultural experience to answer questions.
This is why there are so many devs with no "relevant" degree. If you can sort through the mess of learning a stack, you can do the job. If you need to know some particular CS thing, you will find it and learn it yourself. There's no magic dust in any class, it's all publicly known information that's been digested by many many people before us, and these days you can find umpteen explanations for just about anything online for free.
The thing that sorts people is the ability to persevere through technical issues. You'll find a lot of people in this world are for some reason unable to deal with the mess. This is not a condemnation, I realise everyone has their own situation, economically and personally.
Thank you for articulating something I've experienced over the years so clearly.
It's one thing I wish recruiters/HR would understand. I have an advanced college degree in a field that has nothing to do with engineering and I've found that omitting it from my resume actually made the recruitment process a whole lot smoother as a candidate.
Interviews went from inquisition sessions about why I wasn't doing what my graduate degree had "destined" me to do, why I had "given up" on it, how I could relate my academic background to the current job (I don't because I haven't cared for it in years), etc... It's like you're inherently suspicious if you didn't follow the path you took in your 20s all the way to your retirement.
So I stopped making any mention of my academic creds, and the worst I get now is some condescending lines about how hard I must have worked. Other than that, response rate & interviewers' attitudes have improved markedly.
I have a music degree, and when people ask me why I do web development, the answer is a combination of "I enjoy it" and "I have children and therefore need to be able to afford things".
I was very into music since middle school. I loved performing, the adrenaline rush, the challenge. I wasn't that good, but I was passionate, so a few musicians encouraged me to make a career of it. My dad strongly disagreed, he wanted me to embrace my other love of STEM. Now I'm gainfully employed coding and I have multiple bands I play with. So damn, I think my dad was right.
Music is one of those degrees I think naturally translates to programming. My first programming boss, many years ago, was a classically trained professional musician. He ended up starting a software company with another person (who was crazy, so we all moved on) to create some kind of management software. Eventually he retired as the CTO of a large privately held company.
I don't think most music degrees are technologically grounded. I have a degree and courses in music from 2 major universities, and both of them were reluctant to teach about music software. Music is a traditional field; there are a lot of older professors hanging on to tenure who listen, write by hand, and generally rely on their ears and aural sense rather than tech.
These folks could definitely use software, but why? They largely don't need it, they are performers or educators, not composers. Even composers, some prefer to work at the piano or in their heads, writing down snippets in tech or on paper doesn't matter to them in my experience.
Well, in my experience as an educator and student over the last... 12 years or so... the degrees universities offer are overwhelmingly filled with students wanting to study electronic music/audio engineering or some hybridisation of those (sound composition/screen composition). Orchestral degrees are becoming far less popular.
That's entirely not the point of college. College isn't supposed to give you job skills. There are other educational avenues far superior for that purpose. College is supposed to instill in you a broad and reasonably thorough knowledge base in the general direction you chose, as well as shape your perception of the world and its underlying mechanisms.
It wasn't ever supposed to be a required qualification for modern-world employment, but here we are.
> It wasn't ever supposed to be a required qualification for modern-world employment
But somebody who had been through college (in the early days) was a good proxy for being intelligent and capable. It's easy to check for this qualification, rather than do an in-depth vetting or trial period.
But as soon as this is found to be a metric being measured, it ceases to be a good metric.
Surely that's not the case with proper engineering degrees where someone might be expected to spec out steel for a load-bearing structure that cannot fail without loss of life.
>Very little is learned in college that is necessary for general employment.
Obviously for a programming job you don't necessarily need a 4-year university degree, but I wouldn't diminish its value either.
>What college does is it throws a bunch of hard but irrelevant subjects at you that you slog through
No. That's not the right way to look at university education. The golden rule is "You get out what you put in". If you see college as being useless and irrelevant then that's what you'll get. You can certainly graduate with a 4 year degree and know nothing and have it be a complete waste of time. That's on you. I know many people like this. On the other hand, every CS class you take is valuable and does teach you something deep about our profession that you can take away. It's also especially valuable for a student because, as someone learning, you don't know where your interests lie, so getting a sampling of the entire field is very nice IF YOU PUT THE WORK INTO IT. So what you deem as 'slogging' is really 'learning challenging topics' - which is hard whether you do it in class or independently.
So tell me, which CS classes do you think are irrelevant and useless?
Looking back at my education, I had a Database class where we had to implement a database backed by a B-Tree, the cornerstone data structure of many modern databases. Was that useful? In your world, it isn't because I didn't have to implement a B-Tree in my career ever again and if I want a database, I can choose an existing offering ... but when I work with databases having this as part of my mental model of how databases work has been invaluable.
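For anyone who hasn't bumped into the idea, here is a minimal, read-only sketch in Python (purely illustrative - nothing like the actual course assignment or any production database's code): each node keeps its keys sorted, and a lookup touches only one node per level, which is the mental model that sticks long after the class.

    from bisect import bisect_left

    class Node:
        def __init__(self, keys, children=None):
            self.keys = keys          # sorted keys in this node
            self.children = children  # None for leaf nodes

    def contains(node, key):
        while node is not None:
            i = bisect_left(node.keys, key)
            if i < len(node.keys) and node.keys[i] == key:
                return True           # key sits in this node
            if node.children is None:
                return False          # reached a leaf without finding it
            node = node.children[i]   # descend into the i-th subtree
        return False

    # Tiny hand-built tree: root [10, 20] with three leaf children.
    root = Node([10, 20], [Node([1, 5]), Node([12, 15]), Node([25, 30])])
    print(contains(root, 15), contains(root, 7))  # True False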
I agree. If you're in college spending who-knows-how-much for your education, make it worth the money...
Take some electives on stuff you're into. Music theory, humanities, social sciences... Take the time to go deeper on your passions. Study a broad range of subjects: you're going to have more structure than with self-study.
I agree that university isn’t vocational training.
However, the knowledge is relevant. If you don’t know that some data structure/algorithm/whatever exists, would you know to look for it or what to look for?
Sure you can learn all this by yourself, but you can also learn it at the university.
Maybe the way I'd phrase it is the knowledge you learn at university, to the extent that it's applicable, is actually just indexing.
So for instance I figured out that numerical methods are a thing that you can use to approximately solve differential equations, which often don't have a nice analytical solution.
Can I solve differential equations? Not anymore. Can I solve one by the end of this week? Definitely.
What you do in college is a bit of intellectual tourism, looking at a wide variety of topics briefly. There are natives who depend on the tourists for part of their living, who know every nook and cranny of PDE solvers. But like in the real world, we're mostly at home in one place and tourists everywhere else.
The other answer to your question is that you build up an intuition for where the dragons are. The more tourism you do, the more you come to suspect there are other continents in certain places.
I largely agree with your points, but I'd say it's more "intuition building". To make a silly example, I don't know C# at all, in fact I have never written a single line of C#, but I'm very confident I could get up to speed relatively quickly because I know Java fairly well. This is simply due to the fact that after a while you just "get" how things work.
Similarly, even though I'm by no means a mathematician, I feel, at least to a limited extent, like I "get" math.
This kind of intuition, innate feel for how things work in a certain field of study lasts way longer than the details of a specific proof or the code for a specific algorithm, and I feel it's the actual output of higher education.
I was able to build a decent intuition in both maths and programming before going to university. It's not something specific to higher ed, just experience.
> numerical methods are a thing that you can use to approximately solve differential equations
Indexing of knowledge is useful for certain fields, but without in-depth understanding, you can never apply this knowledge in ways that haven't been imagined before.
I am reminded of the story of Feynman and the Connection Machine, where there was a need to figure out how much traffic CPUs would generate for the router they designed. If you weren't fluent in differential equations analysis, you would never have thought of using this method to perform the analysis.
Because the training in "deep diving into something" is evidence that the person can learn some other thing over the next few years.
It is a tenuous connection at best. It works somewhat in STEM because of the mentioned deep trees: some people are just naturally curious and will get to the bottom of anything, be it DS&A, circuits, complex analysis, or accountancy.
Because in most western(?) states, the entirety of education and (junior) hiring is just a load of bs.
I totally understand too, it’s super easy saying “oh for this position we require X, Y and a PhD in Z”. It’s guaranteed to filter out lots of candidates. Whether that filter is the one you need is another question altogether.
This insight is severely underappreciated. It may very well be the primary personality trait that separates engineers. It's not intelligence, reasoning, or problem-solving skills. Those are certainly needed, but they are needed in a lot of occupations. Engineers, for whatever reason, are unwilling to let a problem go no matter how frustrating it gets.
It can be all summed into reading, writing, math, and some grit.
Among US college graduates only a minority can demonstrate literacy proficiency. Roughly half demonstrate intermediate skills. Then there are the 10-15 percent who are "below basic" (the educational system failed them). Many other countries have similar problems with their education.
A university is not a programmers' boot camp.
(I can only talk about Western Europe here, I don't know academia in other parts of the world.)
A university is for learning the ways of science. Of scientific thinking and methods. And to learn the foundational facts of your field.
Engineering is the application of the science. It is not the science itself. You don't need a scientific background to engineer. It helps, but it's overkill.
You don't need to study to do development work, but a structured education helps you become a good software developer. Why? Because on the job you teach yourself only the bare necessities to solve the next problem. You never have the time to dive deep into the background of the problem. And deep and broad background knowledge is fun and helps train your brain in general.
I think you need to caveat where your experience is from. In the UK, and where I am, courses are practically grounded and you spend a lot of the final part of your degree on a practical placement. As a result I think you get a much less theoretical background.
I did my engineering degree in the UK, and it was mostly theory. I did do a placement as well, but the course itself seemed to be a tour of various science subjects, plus math.
I mean I built a working radio and a mini bridge in my first year, but you didn't get credit for anything other than the exams.
All in all it seemed like a whole bunch of vignettes. Here's how structures work, here's how electric circuits work, here's Navier-Stokes, here's some data structures, this is thermodynamics. Just a whole load of things that would be easy by themselves, individually, but piled high so that time management is the real issue.
The econ and management parts were much the same, just less hard because essay subjects are more bullshitty. Load of topics, there you go, now let's write an essay.
That's strange. I know students who had placements where it was all practical such as working for the BBC in one of their audio engineering teams for example or working as a programmer for a game studio. Didn't really seem like an extension of university at all.
> Very little is learned in college that is necessary for general employment.
College used to have two value propositions:
- enabling very smart people to do research and nothing else
- identifying / finding reasonably smart people that are needed for important tasks in society, but which might have languished in some village otherwise
Finding smart people has become a non-issue, and research has been corrupted by publish-or-perish. Thus, college today serves four purposes:
- cementing stratification, instead of loosening it: wealthy donors buy their way into prestigious universities for their offspring
- serve as a rite of passage: as we abolished all the others, this is the one that's left to signify a transition to adulthood
- enable a class of parasitic administrators and bureaucrats to extract as much money as possible from future students
- establish a caste of unproductive left-wing ideologues that spread ideas such as critical race theory among the future generation
Can't talk about academia, but why has finding smart people become a 'non-issue'? Without university I almost certainly would've gotten a job in my small rural town rather than moving to London with a lucrative and productive career.
> why has finding smart people become a 'non-issue'
The internet. Though somewhat complicated by misguided Civil Rights laws, admittedly.
> Without university I almost certainly would've gotten a job in my small rural town rather than moving to London with a lucrative and productive career.
If you leave aside the social conventions for a second, could there have been another road to that prestigious job in London other than university? I'm pretty sure the answer is "yes" for at least 80% of all jobs.
As if Oxford and Cambridge weren’t cementing stratification, enabling parasitic administrators, and establishing a caste of ideologues since the 13th century…
Nothing has changed; the elements of your second list have always been present. You just happen to have been told to disagree with the ideologues currently there by your own ideologues.
We're in full agreement. What is your point? "This idea is bad therefore all ideas are bad"?
Some ideas are healthy pro-social enablers of a functioning society, others lead to the death of millions. Christianity is far from perfect, but compared to the competition it has a stellar track record.
I have increasingly come to believe that part of the purpose of university education in the UK is to transfer money from taxpayers to the private owners of student housing via student loans.
I moved to Palo Alto in 1984 after the tech boom had collapsed. All the mortgages were under water. People talked about how exciting it had been. “But how many disk drives will companies ever want, really?”
In 1982 the MIT EECS department was trying to encourage students to pick other majors.* By ‘84 they didn’t need to.
* posters in the Infinite Corridor that read, “Course 6 got you down? Change majors!”
Was this the actual feeling at the time? Or just one reporter's perspective, spun into a story? I feel like the ad for $7.95 prime rib is probably a more relevant data point from the past than this article.
Entered college in 2001 just as the dot-com bubble was bursting. Freshman year graduating seniors were still pretty optimistic. Sophomore year I declared as a physics major, despite having worked at a dot-com and enjoying programming for fun, as a bunch of family friends started floating "Why would you go into software when all the jobs are being outsourced to India anyway?" Junior year I knew that I liked programming more than physics, but saying I wanted to go into software was still a dicey conversation to bring up. Senior year Facebook & YouTube came out, I switched to a CS major, and I started getting more "Oh, that's a lucrative career, you'll do fine" from older people.
For an older perspective, my father-in-law was an EE in Silicon Valley through the 80s and 90s. He said that the period from 1989-1994 were dark days - the job market was stagnant, nobody was really getting rich, housing prices in Silicon Valley were going down. If people were saying it in 1987 they were a little prescient, but there was definitely a mood in the early 90s that PCs were a fad that was over and the Next Big Thing would be some other form of tech.
My uncle dropped out of Waterloo in the early 70s because everyone figured there weren't enough computers for computer science to be useful.
Like Professor Frink said: "I predict that within 10 years, computers will be twice as powerful, ten thousand times larger, and so expensive that only the 5 richest kings of Europe will own them"
I entered Waterloo in 2002 and I took "Computer Science for Business" where a confidently incorrect lady repeatedly told us all our jobs would be outsourced to India so we should all be managers instead.
20 years later and that person was the worst fucking professor I ever had.
The only reason I knew she was full of shit was because I was a second generation 'good at math class' person. It doesn't matter where you live or what non-logarithmic things cost, dialectical materialism is real.
People who thought jobs would go to India probably never did anything (or managed any work) that couldn't be outsourced there, hence why they thought this would happen in 100% of cases.
> It doesn't matter where you live, dialectical materialism is real.
I'm going to take this as a positive comment that scientific evidence based theories of the world generally work out correct: You were right to ignore the fantasists.
If you meant it some other way, do please elucidate
"I think there is a world market for maybe five computers."
Thomas Watson, president of IBM, 1943
(of course in 1943 that was a totally reasonable thing to think considering they were the size of a small house and ran on thousands of vacuum tubes that needed constant replacing and started in the millions of dollars)
And it’s come full circle because today the majority of the Western world’s computing needs are served by five computers: Google’s, Facebook’s, Apple’s, Microsoft’s and Amazon’s.
Each of them consists of millions of individual devices, but the end user posting to Instagram or accessing their iCloud files doesn’t need to know any of that.
Sure, if you ignore the issue of how the user accesses them.
Even if you pretend the user has a dumb terminal, that's 5 computers that can each serve a hundred million users at once. Very very different from batch processing.
Also, the original quote was about preorders for a single model in the United States.
In the 80’s and 90’s in the U.K., in my experience, definitely.
My parents were always hugely dismayed by my interest in technology, said it was all just a flash in the pan, a waste of time, it’ll rot your brain. My boarding school from 7-12 (‘90-‘95) banned all computers and electronics because they were considered a distraction. You wanted to use a BBC micro, you sat on the now-imprisoned geography master’s knee, and he’d give you the key to his basement. I digress, but technology was so unpopular and unwanted you could only access it via pedophile.
Even at secondary school in the late 90’s the careers service told everyone who expressed any interest in tech than they needed to choose a realistic career, say a doctor or a lawyer, as technology is just a fad, the internet is just a gimmick, but people will always be having accidents and ending up in court.
As a kid, I honestly felt like I was the only person in the country who didn’t think computers were useless.
I started my career in the eighties and it did feel rather dismal in the UK. All the interesting stuff appeared to be happening on the other side of the Atlantic. There were pockets of development going on but most of the programmers I knew were writing COBOL on IBM mainframes.
I remember reading an article about employment rates for different degrees. Computer science was at the bottom of the list.
This is why I'm always sceptical when I hear people here talking about the current high demand for programmers carrying on forever.
I graduated from high school in 1982, and from college in 1986.
My mom was teaching CS courses at a satellite campus of a state university, mostly to older students looking to change careers. Many were trying to get out of assembly line or clerical jobs. Her students were getting programming jobs after one year of intro programming.
There were certainly a lot of misconceptions at the time. How could anybody know? How many programmers would really be needed? Did programming really require a college degree? My mom didn't think so. She thought the market for programmers would quickly saturate.
I loved programming, but majored in math and physics instead. At my college, the people who seemed to be doing cool things with computers (relative to my own interests of course) were in the physics department. I had a summer internship at a computer facility, and the work that the programmers were doing seemed boring.
The market for programmers did not rise monotonically in my region (the rust belt) because the entire regional economy was heaving up and down. By 1987, it was still not clear that computer science majors had a compelling advantage over self taught programmers in the job market, especially since the degree took 4 years.
What nobody anticipated is that while programming seems easy once you know how to do it, learning to program is prohibitive for most people, for reasons that I don't think we understand. The relationship between choosing to major in computer science, and achieving a successful career as a programmer, is tenuous at best.
I still love programming, but have never been employed as a programmer per se.
Honestly, it is you busting at the blackjack table that makes those prices so low. These restaurants are subsidized by the house via the majority of losing gamblers. About the only reason why they even charge for the food in the first place is to set a minimum threshold for the non-gambling clientele.
I originally went into chemical engineering rather than CS/software/computer engineering as many of my relatives thought that software engineering was a fad that wouldn't stick around. I graduated in 2019. So even several decades later some thought that.
Just make sure he's aware of what exactly chem engineering is. It's not discovering new chemicals; that would be chemistry PhDs. It's not even discovering new reaction pathways; that's still mostly chemistry people (process chemists).
Chem engineering is process engineering. Chemists give you the reaction they want to do, and how much of it. Then engineers will size boilers, heat exchangers, pipes, process controllers and a bunch of other industrial equipment to make this happen.
It's not a bad career, you just have to be aware that it mostly happens in a production/manufacturing kind of environment with everything that entails (ie possible oncall in case of plant emergencies on weekends, etc).
As a professional engineer, you are putting your ass and your career on the line every time you sign off on a design. If an accident happens with loss of human life, serious injuries or large financial loss, you can be sued and you can have your license revoked if it was caused by a mistake in your design. You have to be able to live with this weight on your shoulders every day you come in to work.
It's obviously not paid as much as programming, but what is? Programming is pretty much an outlier at the moment, chem eng is probably in line with other traditional engineering disciplines (civil, electrical, mechanical, etc).
I think the article is fine, the title is just pre-WWW clickbait.
This quote seems a fair assessment, but doesn't exactly match the headline:
"There certainly is going to be a need for more and better technical talent in the high-tech industry."
So enrollment dropped from 2.1% to 1.6% after the hot computer jobs market cooled a bit, and reality set-in for several of the students who weren't very technical, and just expected an easy path to a high-paid job. It's been known to happen.
I entered college in the fall of 1982. I thought about majoring in CS but the general consensus was that the field was overpopulated and there wouldn't be enough jobs.
I graduated college the first time in 1999, right before the Dotcom bubble popped. Outsourcing was definitely the big scare. So much so that I initially designed my career around avoiding outsourcing the best I could.
My plan was (and still is) to not just be a tech person (it has worked out ok, but may not be the best way). For all my electives I took business courses, and I started keeping up with the business side and finance. I wanted to be in meetings and offer solutions (at one job I had the title 'Solution Architect') to problems that may or may not be programming related.
I do see this a lot more today. Where some of the best programmers are ones who ask questions, and figure out that the best solution to a given problem is no coding at all.
Yeah. I intended to get a CS degree, but the parents insisted it wasn't worth the efforts and that I was better off with a real job.
I eventually got into IT anyway, but it's nice to know I have a proper education to fall back on when things go south; forklift driver licences don't expire...
I don't recall this being the feeling of people who I knew at the time. However, I have to admit, the quality of the writing in this article was much higher than I tend to see in today's news blurbs of similar length.
I find that I have to create my own content if I want to share news that's not the most negative spin one can reasonably make. To get any more negative, you would need to start fabricating stories and leave the land of plausibility.
Edit: Come to think of it, when I was interviewed for a news story about the quiet little hep A epidemic a few years back, they fabricated a quote from me and thereby left the land of plausibility.
By 2001, three companies I co-founded had gone under. One was going public in April 2000 (oops), plus a couple of others failed after 9/11. I wound up living in my car and writing code in a cafe. There were quite a few of us. The owner was sympathetic. You could buy a tea and share a table for hours. Refills were free. One web developer said he was changing careers to house cleaning. Oddly, it was quite liberating.
Ah, the cycle of life. We incorporated in 2001 and grew fast (Deloitte fast 50 for 4 years in a row) picking the people and clients from the companies that did not survive the dotcom bust.
When I was telling people I'd study computer science at the end of the 80s, everyone said: "Why? Everything has already been developed, word processors etc. Nothing left to develop".
I think CASE (computer-aided software engineering) was starting to come about around that time, iirc. Business folk would just drag and drop stuff in these new-fangled GUIs and everyone would go home early.
Wasn't CASE just a fancy word for what we now call IDEs? Business people might not do much dragging and dropping there, but developers do it quite a bit.
As others indicated, CASE was more about code generation rather than smart IDEs. My recollection from the time is that the industry really strongly believed that almost no-one would write actual textual lines of codes and that graphical programming was going to be where it was at.
Looking back, it’s a remarkable delusion: mankind went from pictures to pictograms to text, not in the reverse. The career path everyone expected (mentioned elsethread: go from a junior guy-who-writes-lines to an architect guy-who-draws-pictures) doesn’t really make sense. If anything, while a high-level architectural document needs to have some diagrams, even more than that it needs to have text: text to describe the problem being solved, the context, the potential solutions and the reasons to choose one solution over the rest. Likewise, it’s striking how many PowerPoint slides could be effectively replaced with short point papers … but that’s a discussion for another time.
I am not 100% sure of why the industry had this wrong-headed idea, as I was too young at one point in time and too junior at another to really have good context. Maybe it was the rise of GUIs which led folks to think, ‘gosh, if pictures can enable folks to use a computer easily, then they will enable folks to program a computer easily!’ Or maybe it was the unfortunate habit non-experts have of not really understanding an expert’s field. As I recall, it also had a very strong correlation with the rise of OOP.
I used Oracle CASE at the time, it had a "data dictionary" where you defined the database schema in terms of the business terms and you could match it to the physical elements in the tables.
I think there was also a tool that you could make business logic in and maybe map to SQL*Forms (the user screen painter). There was an idea of being able to join these together somehow, but I don't recall the mechanics.
I started college in 1986, EE, but after a year switched to CS in 1987, graduated 1991. Around 1990 or so, I can remember walking around the halls of the main CS building on campus on the way to one of my classes, passing a door labeled "Hypertext Laboratory". I remember thinking, "Hypertext... pffft. That's never going anywhere."
This was the time of the Turbo button. Programs were made for the 4.77 MHz IBM PC, so you had to slow down the 30 MHz machines by default. The Turbo speed was mostly useless, and so it seemed to be the ultimate end of computer development.
Actually, for 10 years nothing exciting happened. Until maybe 1992, when you could install your very own Unix.
> Actually, for 10 years nothing exciting happened. Until maybe 1992, when you could install your very own Unix.
What was happening in the span 1982-1992 was that everything imaginable was coming out as a peripheral for the IBM PC. It wasn't just stuff like disks and printers and graphics cards. Month after month you'd see something new, and realize "Yeah, I guess you could do that as a computer peripheral", but it would be something that you never dreamed of that way before.
And all that stuff enabled a ton of software. It was an explosion of new capabilities.
It's crazy to me that a tech forum poster would consider the incredibly innovative years of 1982-1992 as nothing because his preferred OS didn't exist yet.
1982: Commodore 64
1983: Apple //e
1984: Macintosh
1985: Amiga
1986: 386-based PCs
1987: Hypercard and Acorn Archimedes
1988: NeXT cube
1989: 486 released. Deep Thought defeats its first master, WorldWideWeb released
1990: POWER processor, Gopher, EFF founded
1991: PGP, AM386, Python
and that ignores the incredible BBS and shareware scene that blossomed at that time. A lot was going on in the mid to late 80s in the PC world.
I would add the Atari ST to your list. It could use workstation compilers and had a flat address space; you could port software to it a lot more easily than to a pre-386 PC.
And even a bit earlier, 1977 was the year of the "holy trinity" of early home computers -- the initial Apple II, the TRS-80 Model I, and the Commodore Pet. There were even earlier machines, like the Altair, but those were generally sold as kits and assumed knowledge of electronics, but the 1977 machines were consumer products that you could just buy and use.
It was a different time; when most of those new things came out, they were more expensive than whatever preceded them (in a sense). Individuals weren't buying $10,000 Macintoshes, $20,000 NeXTcubes, and $30,000 UNIX/RISC workstations. My communist professor was bragging that his SPARCstation was more expensive than a Mercedes!
The low-end got cheaper only very slowly. In 1990 you could go into a computer store and drop a couple grand on a slightly better version of a PC-XT or original Mac, ancient tech at that point.
This all changed very fast in the early 90s, when the 386/486 and Windows got cheap fast and crashed into the consumer market. Most of the above died or almost died in the process.
Home computers in the eighties weren't that expensive. That was the whole point – you could actually have a computer in your home now. Kids got them for Christmas.
Yes and no. "Computers are the future", so middle class parents felt like they HAD to buy their kids a cheap Commodore or something. I think 80% of them ended up in the closet, most of them were used like game consoles and maybe 1% of them created assembly language programmers. I know successful engineering people who had no interest in atari/commodore video game stuff.
Meanwhile, college students needed real work computers which were still quite expensive then. PC-XT/Mac Plus computers were not cheap in the late 1980s.
Do the math. They really were that expensive in today's money. Of course you had cheaper alternatives for the home like the C64 and even more so the ZX Spectrum, but that flimsy toy would still set you back several hundred dollars in inflation-adjusted prices.
Amiga was £399 in 1989 in the UK, which is about £1000 in today's money. I expect they were cheaper in the US, but £1000 is fairly reasonable for a computer.
Good deal from a company obviously swirling around the bowl who sold computers through the mall ninja store. Great games! Meanwhile, their university presence was just about nil.
If you weren't a gamer, you probably didn't know Amiga existed.
Back in '87, we're talking 386 territory for cutting-edge compute, so I think we were limited to 25 MHz. As has been pointed out, those machines were expensive.
Pulling out an old PC World magazine from '89, I see that a 386/25 MHz IBM PS/2 was selling for over £4k, and that was with a mono screen. More typical home machines were the clones, like Amstrads, which were typically 512k 286/10 MHz machines with either dual 5 1/4" floppies or, if you really pushed the boat out, a floppy/HD combo.
I stuck to home computers rather than business machines due to the prohibitive expense of the above, and only saw decent machines which would in any way reflect modern computers when I got to university in '89. When you've been struggling with an Amiga with a floppy drive, a room full of Sun 3s was a thing of wonder!
30MHz in 1987? Not many people had such luxury. My family bought a ~12MHz 286 brand new in ~1992, and our 25MHz 486 was still years away (but yes, they had turbo buttons, and the 486 even had an LCD display showing the current frequency).
One reason was the rise of the Internet. While there were a lot of neat things you could do with a personal computer, it wasn’t until you could connect easily to others that it really became fun and useful for most people.
I got my first computer, a Mac, in 1986. That was exciting for me, but I really felt that the revolution had happened when I got a modem in 1992 and connected first to a local BBS and then, a couple of years later, to the Internet itself.
In 1988 I watched as they tried to use the Smalltalk environment with its windowing system in software production. It ran robustly and well for a week, but then it just choked up. Not enough memory, and virtual memory was too slow.
For me it was Amigas instead of Macs but it seemed I could get another computer cheaper than I could a C compiler. Then there was this new BBS in town called "www" or something and it had this free C compiler ...
In 1981, I knew for sure there was no money to be made as a programmer, even though it was what I really wanted to do. So I studied Electrical Engineering instead. College didn't work out, and I did a fair bit of programming, and some electronics repair/design work. Clearly I was wrong. 8)
As hopefully many readers will realise, when it comes to computers and the internet, the "Optimists Archive" is orders of magnitude larger than, and equally as amusing as, the entire "Pessimists Archive" for all things. When the "fad" is also the means of publishing news of the "fad", it stands to reason the "news" will generally support the fad. The number of people who have used and continue to use the internet to publish their "predictions for the future" is enormous. The number of accurate predictions is infinitesimally small.
I started my CS degree in 79 and entered the workforce in 82 and at no time did anyone in the UK imply to me it was a shrinking field.
I think this may be a niche view of the US education system. I was working in UCL-CS in 87 and got no sense it was in contraction there, nor in Australia when I moved here in 87-88; in fact, at no time in my personal experience has CS or SW eng been in decline as a taught subject. Ok, maybe 1-2 year windows of reduced entry requirements, but shortages of jobs? Nope.
Nice find from 1973!
The second guy in the video is really chill, he is complaining about having little time to do actual work. You could have a similar interview in 2021.
In the movie The Karate Kid (1984), the mother moves to California because there are no more computer jobs in New York. It's stated a few times in the movie that this is why they moved.
If it was true in the 80s, it was certainly true in the early 2000s when I was getting my CprE degree. I met quite a few people who were studying ComS or CprE for reasons that can be summed up as, "I built my own computer, I like video games, people said I'm good with computers, so I should be a Computer Science/Computer Engineering major!" They rarely lasted more than a semester or two, though I think the most epic flameout I knew managed a whopping 0.34 GPA his first semester...
And if the goal is "Get a degree, coast on through life with the easy money from computers," well... good luck. There are some very wide open pathways where if you're good, you can make lots of coin, but they're rare, and most tend to involve an awful lot of years of hard learning before you're competent (embedded dev and firmware/OS dev are some fields desperate for competent hires).
If you can write solid C, ring 0/EL3 doesn't scare you off, or you can figure out how to cram 10 pounds of hard realtime code into a 5 pound microcontroller... yeah, you'll go far. Come out having done the minimum, knowing Python and Javascript, well... there's an awful lot of competition for most of those positions if you're just a basic coder, not skilled at using those to accomplish interesting real world tasks ("make those images slide across the webpage" is not particularly novel or interesting anymore).
I'm struggling a bit to find ComS majors as a percent of degrees, but some stuff I've found indicates it's only in the 3-4% range or so today, so while it's double the percentage referenced in the article, it's not an order of magnitude higher by ratio (though in terms of degrees issued, probably, with college being more popular).
> If you can write solid C, ring 0/EL3 doesn't scare you off, or you can figure out how to cram 10 pounds of hard realtime code into a 5 pound microcontroller... yeah, you'll go far. Come out having done the minimum, knowing Python and Javascript, well... there's an awful lot of competition for most of those positions if you're just a basic coder, not skilled at using those to accomplish interesting real world tasks ("make those images slide across the webpage" is not particularly novel or interesting anymore).
Go take a look at salaries and get back to me. Embedded does not pay as well as React/Rails, for the sole reason that React/Rails developers are closer to business problems.
I wonder which world you live in - around here the salaries for hard real-time embedded work are around 30-40k. Personal data point: a few years ago, after my PhD (revolving around some aspects of real-time), I did some interviews to stay fresh and was offered 35k to work on embedded chips used in Dassault missiles.
I told them that I was already making more from freelance consulting at that time, they said tough luck.
"Past performance is not a predictor of future results". The gravy train might keep going for as much as another decade but, as more and more people are flooding into web development every year, at some point demand is going to be saturated and then ...
There are also way more web jobs than embedded though, but you're right that another .com bubble will be bad for web devs (especially juniors, the seniors should do fine).
Is being "closer to business problems" really the sole reason? Or even a big reason?
From the numbers I've seen, DevOps people are the best paid of all types of developers, but they aren't closer to business problems than React/Rails devs.
I think it has more to do with the fact that most embedded developers are working on niche platforms and niche problem domains, so they don't enjoy the same level of job mobility as React/rails devs, who in turn don't have as much mobility as a DevOps person.
Moving companies is by far the easiest way to get a pay rise. Job mobility gives you more negotiating power, and thus a higher salary.
> Is being "closer to business problems" really the sole reason? Or even a big reason?
It's got to do with visibility. People who are more visible to the holders of the purse strings get paid more than those who aren't. The closer you are to the business side of things, the more visible you are to the profit line.
Compensation is very closely tied to how visible the role is to the company's profits and/or leaders. A highly skilled DBA who ensures the most profitable application's DB is up, running, and tuned is less visible than the business major who uses QlikView to send reports to the CEO. Guess who has more leverage to ask for money.
Sidenote: Skills have nothing to do with compensation. For a long time I used to feel guilty whenever I learned that I was making similar money to someone who was more skilled at my role than I was.
Now I don't feel guilty at all if I find out I'm making more than that person, and this is because I realised that the HR head, who has few relevant skills and can viably be replaced by a mollusc, makes more than I do.
I don't feel guilty making more than others anymore, because other people making more than me don't feel guilty about making more than me.
I get what you’re trying to say, but then how come frontend developers tend to get paid less than backend devs, who in turn get paid less than devops engineers?
Because front-end developers are the least visible to the CEO (the furthest away from the business data).
Backend developers are a little bit closer to the business data than frontend developers.
Devops are even closer to the data that the CEO wants.
The "visibility" is not "visibility relative to the product" but "visibility relative to the profit".
If you draw the entire business as a series of concentric circles, the centre of the circle (the heart of the business) is Accounting; the next layer would be HR, then sales, then marketing, then maybe advertising, then product development...
The only people less visible (further away from the business data) than the front-end devs are the data entry people.
There's also supply/demand in play. There are a lot of front-end devs out there. Normally when I see someone advertise themselves as full stack, they are backend heavy and can create passable front end code. Not often I see it the other way around which adds even more supply.
You're absolutely correct. I can write solid C :) but I make an OK salary (155 base), and the vast majority of my work is doing architecture diagrams and writing reference code, most often in JS. I also write a lot of SQL.
The pay people are getting with their basic Python skills still outstrips non-programming professions at the same level of experience by a large margin. I also would like to express some skepticism about the embedded development pay scale idea, because a lot of those jobs are at engineering-first firms that put software, and by extension programmers, last.
> because a lot of those jobs are at engineering-first firms that put software, and by extension programmers, last.
I may be wrong, but I'm also skeptical of ever applying to these sorts of positions because I feel they'd likely carry other aspects of engineering firms, like heavy credentialism.
I have an MIT degree with coursework in operating systems and a lot of ring 0 experience and my well-paying job has me writing Python and JavaScript. As far as I can tell I'd be taking a significant pay cut if I went into embedded dev.
Supply is higher for "basic coders" but demand is also much higher. There are a whole lot of images that need to be slid across web pages; there aren't that many CPU architectures that need Linux ported to them.
This is correct. At a hedge fund (presumably one like HRT, given the pedigree you mentioned) geofft makes on the order of $500-700k or more, unlike mere mortals like me who went to a state school and make just $200k. Of course, I didn't take any embedded classes and got an A- in my C class, so perhaps I deserve nothing to begin with.
Speaking of hedge funds: those are unlikely to hire embedded software developers, but they may need a lot of "basic coders", and they can afford to pay them a lot even if the job is simple.
Hedge funds are not uniform enough to have any predictable set of skill requirements.
Some need a Node.js script, some swear by Python because an area of computational finance has gathered around it, some swear by languages closer to the metal, some have use cases that are better off with low-latency stacks, others don't.
Some are modifying FPGAs; others have simply negotiated a payout from a court case and will teach you F to analyze a dataset faster.
If they need coders at all, they mostly need some level of quicker analysis, but it's not possible to predict what kind of coder a hedge fund will need.
It's useful to understand where you are on the totem pole.
You may come from a background where "college" was a given, or one where college was quite a dramatic feat because nobody else in your family had done it before. Either way, while they are praising you, it's important to know the limits of the "any college" experience early on.
Nobody has ever praised me. I have nothing thanks to my state school degree and its apparent I'll never amount to anything because of it.
I'm still not sure why I should even keep going, given that my life is basically already over because I went to "any college" after only getting a 1500/1600 on my SAT.
FAANGs don't care about that and will still pay you mid six figures.
There is also a global boom town in crypto, where you can build and release anything you want and earn more than FAANGs will pay you. You don't even have to speculate on any coin or token. Just launch products and earn.
Boost your serotonin levels and look at the possibilities again.
I already work at a FAANG and only make $200k 3 years in. I really have no hope of cracking any of the others (I work at the least impressive one, naturally).
Seems like being intellectually inferior will stunt me for literally everything - I have fewer positive life experiences than a Stanford pre-freshman
I would suggest toning this down; it's really painful to read your self-flagellation all over Hacker News. Your tone comes across as if people in non-Amazon FANGs are gods.
I’ve worked at both FB/Apple in the ML space and have been promoted multiple times. I can promise you people aren’t throwing themselves at my feet to bow down to my amazing intellect. I do not sit and laugh with my heaps of cash at the poor peasants who only make 100K a year. My life is not perfect! People in FANG are not perfect, a lot of the tech there is also not special at all.
Stop obsessing over your career; I can promise you no one is paying that close attention to where you work. Get some friends and you'll see how little this matters. I have many CS friends in non-FANG jobs, and I do not care where they work nor think less of them. They're some of the smartest and happiest people I know. Do you really judge people who don't work at FANG the way you judge yourself? Because that really makes you a rude, misguided person.
By the way, I went to a state school not even in the top 50. I wouldn't trade that experience for the world; I had an amazing time. No one cares where you went to school two years after you graduated. And definitely no one cares about your SAT scores - how do you even remember that anymore?!? Seek professional help, stop whining so much, and do something else besides freaking out about Amazon.
I mean, to people like me they might as well be. Even if people don't bow down to you, it's still better than being dismissed and ridiculed for only making $200k at Amazon from a state school with a < $300k net worth. FWIW, my smartest friends all work at better companies than me (Google, FB, Airbnb, trading firms) and make quite a bit more than me - somewhere between $80-250k more. If anything that makes me feel worse. I have friends who are students, and they're all miserable and poor, and I can't really do things with them for lack of money on their part. On the other hand, I can't do anything with my richer friends because I lack the money for expensive vacations or expensive activities. It's not great for me. It obviously doesn't help that everyone assumes people like me to be stupid, again, due to my lack of pedigree - every year of my (top 200? 250?) undergrad felt like a stab in the chest; I worked so hard to get so little (a resume that can't get me an interview at Jane Street or a shot at a top school's grad program). I honestly don't have much hope for the future anymore as a loser in meritocracy. There's just no point if anything I do is just going to lead to the same suffering and derision from the meritocratic elites.
You say: "It obviously doesn't help that everyone assumes people like me to be stupid, again, due to my lack of pedigree." I am telling you with 100% certainty this is false. People do not think like this.
You should seek therapy to challenge these assumptions you hold.
If I can give you one more piece of anecdotal evidence: my partner works at Amazon. I make more than them by over 100K. I do not care; I do not view them any differently. My partner is also proud of their job and likes it. They are smarter than me; it's just that through various circumstances I got a higher-paying job. They actually don't want to leave Amazon right now. They are not depressed that I make more than them. I also try to model my life after them because I view them as more successful than me at life.
Also, your friends sound crazy if they are going on trips that you cannot afford on a 200K salary. You can definitely afford a few trips to Hawaii a year. Going out to eat, movies, shopping, camping, hanging out, video gaming, etc. are all activities you could do year-round on your salary.
I'm done going back and forth, but if I could give you one last message: use Amazon's insurance to see a therapist (my partner does, and it's covered). Show them your Hacker News comments. Just see what they say. I promise you, you will feel better about these thoughts a few months in the future if you do.
By the way, the reason why I took the time to write out such a long response is that I worry about people like my partner who might read this. While they are quite mentally healthy and I'm sure would brush off your comments, I can see someone else slipping into your mindset and getting stressed.
Comments like “FANG gods, I make only 200K, etc.” are really out of touch. I actually hope you someday get a job at Google/Apple/etc. and see that it really is not all that it’s cracked up to be :)
> you can make lots of coin, but they're rare, and most tend to involve an awful lot of years of hard learning before you're competent (embedded dev and firmware/OS dev are some fields desperate for competent hires).
Embedded dev and firmware both seem to pay comparatively poorly, despite the expertise required to do them well. Last year I discussed a number of contracts in this area, and when I found out how much was on offer, they were all distinctly low compared with other contracts.
OS dev, I will grant you, can pay well, eg kernel development, graphics drivers, storage layers. I think that's driven by servers and mobile devices more than embedded.
> If you can write solid C, ring 0/EL3 doesn't scare you off, or you can figure out how to cram 10 pounds of hard realtime code into a 5 pound microcontroller... yeah, you'll go far.
Oh if only! Doing that stuff is fun.
I think cramming hard realtime into microcontrollers is best kept for hobby time if "coin" is what you're after though.
Perhaps it's different for big, serious controller projects (like say a Mars rover or satellite), but I think mostly, embedded/microcontroller type things seem to be paid closer to mid-level electronic engineering levels. I don't know why, because there's a lot of expertise involved, but electronic engineering seems to pay considerably less than software at the moment.
> (embedded dev and firmware/OS dev are some fields desperate for competent hires).
There's a Samsung R&D center in my country where they do very cool and tough stuff and they ARE desperate for hires - mostly because their salaries are a laughing stock of the entire city. Web and data/backend pay best and the "tough" fields are trying to get by on sub-web salaries.
I agree that knowing only basic Python or JavaScript isn't that useful today. In the companies I've worked at we've never hired a fresh grad or career change person whose skillset was described as "basic Python skills." We've only hired if they have professional experience, a compelling portfolio, or show some exceptional talent for a junior level. I think this is the norm.
Maybe I'm just traumatized by 00s coders, but I hear rejection of college as rejection of knowledge and critical thinking. I admit school ain't all that, but you CANNOT reject it all.
Not just academia. Google is jarringly restrictive in who they hire. If you haven't memorized the Data Structures textbook, you can't even talk to a recruiter. I've been pretty successful without even glancing at that book (since most business domain data structures aren't complex).
Now... I have improved my knowledge of more complex data structures and algorithms, like graph traversal, but it's rare that this knowledge is needed in everyday business-oriented development.
Similar story post Dotcom crash at my alma mater. There were either 21 or 26 incoming undergrads declared as CS majors. The dean exclaimed that it was great to have them and that they were "the true believers".
Fast forward to now, and there's about 250 incoming CS undergrads without much student body expansion.
I studied Physics but I think if I could do it all again CS would make more sense (in my job I do some CS stuff anyway and zero physics..)
At least in Europe, even Engineering fields (let alone sciences and god forbid, the arts/humanities..) have pretty dismal job prospects compared to CS.
It seems like everyone and their grandmother wants to "learn to code" these days and it really doesn't seem like a sustainable situation.
That said, it's quite a hard field to actually learn enough to be a useful, paid professional so at the moment I'd still say it is the best degree choice.
There will always be people who choose a field for success, and they are more likely to flee when it seems unsuccessful.
I was once connected with someone who was just getting started in his career. He was entering the tech industry as a software engineer, having graduated w/ a CS degree from a UC in 2000 or thereabouts.
This person had majored in CS because of the late 1990s tech bubble, yet felt strongly at the time that it was not a good field once the tech bubble burst. I was aghast, not only at the reason he chose his path and the waste, but also at the idea that a bubble pop would send him out of the field so easily.
He ended up going into real estate just as that field bubbled into insanity, God bless his soul.
The person you are talking about is following hype and trend, hoping to hit an easy career and ride the tide.
But the issue with this is that by the time you hear or know about a trend, it has already started and is probably well underway. Like trying to catch a wave on a surfboard, you could miss, or fall off as it crashes.
I think traditional CS programs could take a page from what I believe is the most successful aspect of coding bootcamps: the inspiration/excitement/desire to learn felt by a graduate in a very short period (weeks/months vs. years+).
It's not perfect - they are missing many fundamentals - but now the student can appreciate why those are needed, and the drive to learn more grows naturally.
I think other disciplines could benefit similarly, especially for students completely new to a field. Go through an intense/short program, then follow up with a more traditional, longer term program.
Maybe degree programs should start with a bootcamp, then a "real" project using those skills, then move into in-depth CS fundamentals so that students can start to unwrap the things they've learned: why things are done a certain way, problems they had with their post-bootcamp project, etc.
When I graduated in 1995, some of my friends were disappointed when they went into computer science for lack of a better option. By 1999, they were getting frantic calls from recruiters trying to outbid each other.
In the mid-80s, I went to a job fair, and two jaded programmers told me that there was no future in computer science, everything was already done and invented, and the job was mostly boring maintenance.
I made the mistake of studying Physics in uni because there wasn't any decent CS uni near my town. After interacting for a while with the bitter, skint postgraduate students that hung out in the labs, I dropped out, and I've been making software for a living ever since. Once in a while I wonder if those lab guys are already approaching minimum wage.
Well it depends.
There is and there will always be demand for "talented developers".
But I remember at University there were students who after years of studying were not able to write a simple program or app.
Later they became managers or technicians. But it looks like all of them found some kind of job.
It was 2011, a childhood friend visited me after 10 years. We talk about the old days, we laugh, I even tease her husband sitting next to me, smiling politely. She excuses herself leaving the men alone to bond.
"So, what do you do?" He asks after a long 5 second.
"Oh, I'm a programmer." I say with pride. To be sure I add, "A computer programmer." I make it a habit to add this because regular people have a hard time understanding what we do as hackers. The man smirks.
"Like, writing code right?" He asks. This is the first time I get this reaction. Usually, people are almost envious when I tell them what I do. They praise me and I act humble by belittling my profession just a bit. But not this man. This one is trying to hold his ribs together.
"Yes, you know I accidentally fell into programming, I actually went to school to become an Electrical Engineer." I say, but as I'm watching his reaction, my voice starts to break. Not only is he not impressed, he feels sorry for me. Before I am done he bursts out laughing.
"No, my friend, if you were chasing the future, you made a mistake."
This was the first time I ever heard this. I'm a programmer, am I not writing the code that powers the future?
"Nursing, my friend. Nursing. That's the future." He went on to explain why nursing was not only one of the most important jobs, but it was also one of the fastest growing. And then he told me something that I still can't believe I heard correctly.
"Even jQuery is becoming irrelevant"
The past few years, I've been working on incredible software. Some of it almost feels like magic. Distributed systems, embedded software, neural networks. But every once in a while, I remember his words. At the very least, they help me stay humble.
> "Nursing, my friend. Nursing. That's the future." He went on to explain why nursing was not only one of the most important jobs, but it was also one of the fastest growing. And then he told me something that I still can't believe I heard correctly.
Actually, I still see stuff like this these days, in regards to automation FUD. Something about the only jobs that will be left will be those that require human emotion and empathy.
"Find work that feels like play" apply here.
CS/engineering problems are not easy, but they always feel like a game to me. Chasing a fad (picking a major) or money is, to me, like the cart pulling the horse. I get it, but it's not for me.
There were bound to be some big spikes at the beginning. But software salaries have finally plateaued into an equilibrium; I don't think we'll see anything like that in the future.
This happens from time to time. There were about 60 people in my starting year of CS in 2003 (~30 completed); two years previously there'd been about 200.
That's when the smart people exited the field, because they felt that computers would soon stop being fun (which is indeed what happened in the early nineties). And now we are stuck in this bells-and-whistles industry (software and web), while real engineers and scientists are focused on the fields that matter, like transportation, space exploration, robotics, construction, food production, medicine…
The claims in this article, bluntly and simply, are not true: what is described as a 'fad' has grown into one of the hottest professional fields, for decades.
Gee, 1987, I was at IBM's Watson lab as a researcher in their effort at the time in artificial intelligence.
So, at the lab we had an ambitious computer services group, and they had for our general use six mainframes, I believe all single-processor machines, running the operating system software Virtual Machine/Conversational Monitor System (VM/CMS).
For access from home, I had a PC/XT with some special software to use a standard dial-up land-line telephone line to access the mainframes. As I recall, the data rate was about 30 characters per second.
So, the mainframes were likely water cooled. As I recall, the last of the water-cooled mainframes had a clock speed of 153 MHz.
In contrast, the last desktop I plugged together has an AMD (Advanced Micro Devices) processor, an FX-8350. It has 64-bit addressing and 8 cores. The standard clock speed is 4.0 GHz. To buy the processor, at Amazon I paid their usual price, quantity 1, ~$100. Tiny little thing; came in a tiny little box. The processor has been fine: it is the basis of my startup.
So, let's compare, counting clock ticks: the 6 mainframes had, best I can guess, at most in total 6 * 153 = 918 million ticks per second. The AMD has 8 * 4.0 * 1,000 = 32,000 million ticks per second. Then we can take the ratio (8 * 4.0 * 1,000) / (6 * 153) ≈ 34.86. Call it 35. Soooo, counting clock ticks, my desktop with the $100 processor from the tiny box is 35 times faster than all six of those mainframes combined.
With speculative execution, out-of-order execution, pipelining, etc., there is on average some number of instructions per clock tick. So, as a first cut here, and maybe a gift to the IBM mainframes, count the average number of instructions per clock tick as the same for the two processors.
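If you want to sanity-check that arithmetic, here is a tiny back-of-the-envelope sketch in Python. It uses the same assumption as above (instructions per tick treated as equal) and the clock figures as I recalled them; nothing else is implied.

    # Rough clock-tick comparison: six single-processor mainframes vs. one FX-8350.
    mainframe_ticks_per_s = 6 * 153e6   # six machines at ~153 MHz each (as recalled)
    desktop_ticks_per_s = 8 * 4.0e9     # one FX-8350: 8 cores at 4.0 GHz
    ratio = desktop_ticks_per_s / mainframe_ticks_per_s
    print(round(ratio, 2))              # ~34.86 -- call it 35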
So, this fantastic, we're talking not just world-class but unique-in-civilization-class, improvement in price/performance is much of what enabled Google, Facebook, Amazon, Apple, Microsoft, AMD, TSMC (Taiwan Semiconductor Manufacturing Company), smartphones, solid-state disks, and essentially all of the Internet.
E.g., for communications data rates for connecting from home: now 1 Gbps Internet versus then, 1987, 300 bps, ..., to the 56 Kbps of V.90 in the late 1990s.
So, how the heck in 1987 was a computer science student or early-career person to anticipate and actually believe in this fantastic, all-time-unique step up in price/performance of computing and communications, along with the associated infrastructure?
This progress blows away wood, wheels, stone, copper, iron, steel, steam, electricity, and broadcast radio and TV.
Now, what are we going to do with this suddenly cheap, powerful computing, communications, and infrastructure?
When I enrolled at the University of Maryland in 1985, it was still at the height of the fad, and the CS department actually had so many hopeful undergrads signing up to be CS majors that they had to filter them out and discourage them somehow.
They came up with this idea they called "Programming Calculus" that they taught with this language called "CF Pascal" (which we called "Crucially Fucked Pascal") that was lobotomized so it was simple enough to write "correctness proofs" (FWTW).
Hitting freshmen who have never programmed before with such a crucially fucked language, and then asking them to think about programming as calculus and write proofs that their programs are correct was an effective filtering technique!
(From a document I don't have the URL for, describing Programming Calculus:)
>Perhaps the most noticeable achievement in the Department's educational program was the restructuring of the first year of computer science courses. Unlike most other introductory courses which start by teaching a specific programming language, this first year is oriented around the calculus of programming. Concepts such as program correctness and formal verification are introduced, although at a simpler level, at the freshman level rather than at the advanced undergraduate or graduate level. The students write programs in a subset of Pascal, called CF-Pascal that includes character data and files only. "Advanced concepts", like integers and arithmetic, are delayed until the second semester. This new course was designed by Harlan Mills, aided by Victor Basili, John Gannon, Richard Hamlet, John Kohl and Ben Shneiderman.
(From "Maturation of Computer Science Research and Education at the University of Maryland: Evolution of the Department of Computer Science from 1979 through 2006", describing CF Pascal:)
>Educational activities: The first year of Basili's tenure as Department Chair was taken up with the problems of students and getting the University to agree to a restricted major for the Department. By the second year attention could be placed on other educational concerns.
>CF-Pascal: As part of the plan for the restricted major, the freshman Computer Science courses were redesigned. For most of the 1970s, the SIMPL family of languages by Basili, was the basic programming model. SIMPL-T was used in the freshman programming course and a systems programming variant, SIMPL-XI by Dick Hamlet, was written for the PDP-11 minicomputer.
>Spearheaded by Dr. Harlan Mills, Vic Basili, John Gannon, and Richard Hamlet identified a subset of Pascal called CF-Pascal. This was Pascal using only character and file data types, and essentially made programming in CF-Pascal equivalent to programming a Turing machine using denotational semantics as the programming model [Mills89]. This became the basis for Computer Science I – CMSC 112. In 1984 Marv Zelkowitz, as part of the IBM Fulcrum award (See Section 3.2), obtained a laboratory of 20 IBM PCs. A syntax directed editor was developed for CF-Pascal, called SUPPORT, and this environment was used in CMSC 112 during 1986-87 [Zelkowitz84]. CF-Pascal was used for about 6 years until the freshman course changed again in 1989. Pascal became the basic language, which later became C++ and finally Java as the freshman programming language.
HCI researcher Ben Shneiderman did a study comparing the CF Pascal "SUPPORT" programming tools with a mainframe batch programming environment, and it turned out that, despite their predictions, SUPPORT users didn't receive statistically higher exam and programming project grades, and users preferred the mainframe batch system over SUPPORT.
(From "Subjective user evaluation of CF PASCAL programming tools", describing the study comparing CF Pascal SUPPORT to a mainframe batch computing environment:)
>Subjective user evaluation of CF PASCAL programming tools; Journal Articles; 1987; Chin JP, Norman KL, Shneiderman B; Department of Computer Science and Human-Computer Interaction Laboratory Working Paper
>Abstract: This study investigated subjective evaluations of two programming environments: 1) SUPPORT, an interactive programming environment with a syntax directed editor on a personal computer and 2) a batch run environment on a large mainframe computer. Participants were students in a 15 week introductory computer science course. In Part 1, one group of 128 first used SUPPORT, while another group of 85 programmed on a mainframe environment. After 6 weeks they were given an evaluative questionnaire and then switched programming environments. In Part 2, 68 used SUPPORT and 60 used the mainframe. At the twelfth week of the course, they were given two questionnaires, one evaluating the environment they had used in the last 6 weeks and one comparing both environments. A measure of programming performance (exam and programming project grades) was also collected. SUPPORT was predicted to reduce the burden of remembering syntactic details resulting in better performance and higher subjective evaluations. Unexpectedly, the SUPPORT users did not earn statistically significantly higher grades. Furthermore, participants expressed a preference for the mainframe over SUPPORT. Specific items on the questionnaires were used to diagnose the strengths and weaknesses of each environment. Designers of syntax directed editors should focus on reducing the syntactic burden not only in programming, but also in the user interface of these tools.
So in the end, CF Pascal achieved its lofty goal of driving away more prospective CS majors than a mainframe batch computing environment could.
This corroborates my long-term suspicion -- all this talk about provable correctness of programs is basically a way to gatekeep people from entering the industry.
I mean, of the millions(?) of programmers out there churning out code, how many % of them actually prove their programs correct?
Despite what all the academia-types say, I firmly believe that programming is at most 50% math/logic, and the other 50% is literature or "writing skills". The ability to express ideas in text form that is easy for people to read and understand is one of the most crucial skills to have as a coder.
The point is, this aspect of programming is so neglected that people readily accept gatekeeping of aspiring programmers by scaring them away with advanced maths, but nobody ever told you that if your writing skills suck (even in English or your native language), you might end up a bad programmer...
"50% math/logic" can account for a lot of proving correctness. And it's not even hard, even writing your programs in a modern type-safe language will put you way ahead of the pack.
I'm not saying the math is insignificant. It's just that, if you agree in principle that a significant part of coding is making the code legible to other people, the question still remains -- why is math usually a required part of a CS degree, but you never hear of CS degrees requiring training in technical writing?
I wish this were the case now. The field has been eternal-Septembered.
Tools no longer empower; they constrict and patronize developers, setting up ceremonial barriers that are merely performative while restricting and shepherding users away from creativity, independence, and ownership into proscriptive systems.
The other disciplines actually take systems of qualifications seriously. Doctors, mechanical and civil engineers, lawyers - this is how they avoid crackpots, frauds, and other Dunning-Kruger cases.
We do no such thing. This damage-control tech is a way of bumper-laning everything while simultaneously kneecapping the serious practitioners.
Computer programming is trashed by the flood; nothing positive is coming of this. Like the Luddites who stood at the dawn of the March of Intellect, we ride toward the destination we are coursed for on waves of destruction. There are better ways possible.
I agree with your complaint about tools being designed to constrict and patronise us (my old manager used to always make reference to "hardware obstruction layers"). Perhaps software development as a whole is in an Eternal September, even, but don't think that using qualifications as gatekeeping tools will solve this.
I come from a "real" engineering background (EEE), as does my old man (Chem Eng) and things are no different there: the companies that care more about whether or not you're 6 Sigma certified than what actual, real experience you have at the coalface tend to repel highly competent problem solvers and attract the Process Eng equivalent of the "Microsoft Certified Solutions Developer".
The SWE "hacker" ideal where your code speaks for itself and they don't care what idiotic courses you've completed is one of the things that attracted me to the role.
But the code doesn't speak for itself. It's not meritocratic.
It's amenable to the same social ingratiations as all other institutions.
In the same way that, say, homeopathy presents itself as medicine when it is just expensive capsules of pure sugar, the reliance on the extraordinary popular delusions and madness of crowds has been a well-documented problem since Mackay wrote a book of that name in 1841.
That's why, when barriers were real, in days of inferior computing, only competence could pull through, while now it's been supplanted by conformance and orthodoxy.
Conformance to what? It's merely perfunctory. The levers to reality have been detached. We have become the homeopaths diluting belladonna in the hope that poison will become medicine. But all we get in the end is merely a sugar high.