Ask HN: Is my career doomed if I don’t become an AI/ML Developer?
100 points by thatsamonad on Nov 10, 2017 | 67 comments
A bit of background: I've been a software developer for about ten years now. I started out doing JavaScript development and have transitioned into more "back-end" development using Java, C#, and Python. However, I don't have a computer science degree, and my math education never progressed beyond trigonometry in college. That said, I keep up on technology nearly every day and try to keep my skills as up-to-date as possible.

One of the major things I’ve seen all over HN and elsewhere is the emphasis on the growing AI/ML industry. I’ve tried to dig into both areas in some detail and find that the math is just way beyond me, leading me to believe that maybe it’s just not my cup of tea given that I don’t have a ton of extra time to teach myself calculus and advanced statistics. However, I’m concerned that if I don’t learn those things my career is going to be dead in ten years.

How realistic are these concerns? Should I be as worried as I am or will there be a need for good Java, etc., developers in the relatively near future?



I would bet a small fortune that the ML hype will not live up to expectations on a 7-year horizon. It is true that there are cool things coming out of the field, and it is true that a lot of smart people are entering it. But the demand for researchers using actual mathematics day-to-day is going to be incredibly slim compared to what you'd think it'll be if you read the news about this stuff. Data science has been hyped up for close to 10 years now, and all that time there were fears that we'd all have to become statisticians in the not-too-distant future. But that day has not come. In fact, you could argue that there is even less reason to get dirty with pen and paper than ever, since a huge mountain of frameworks (e.g. TensorFlow) has been developed precisely with the aim of bringing math-afraid coders into the ML sphere.


Give it a year or two and you will see frameworks where the complex mathematics behind ML is further abstracted away, the model architectures for various applications will converge on their best uses, and any software engineer will be able to train a model to do what they want and apply it to their product. There's a lot of hype right now, but you probably won't need in-depth ML knowledge to create products in the near future. Right now there is still a lot of research (trial and error) on fiddling with architectures to get the best results. Personally I like to build products, and struggling with messy training code, running tests, and comparing results isn't my cup of tea.
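For what it's worth, here is a minimal sketch of what that abstraction already looks like, using Keras's high-level API and its bundled MNIST digits dataset; the layer sizes and epoch count are illustrative choices, not recommendations:

    # Train an image classifier with no visible calculus.
    import tensorflow as tf

    # MNIST ships with Keras, so this runs as-is.
    (x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
    x_train, x_test = x_train / 255.0, x_test / 255.0   # scale pixels to [0, 1]

    model = tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape=(28, 28)),    # image to flat vector
        tf.keras.layers.Dense(128, activation="relu"),    # one hidden layer
        tf.keras.layers.Dense(10, activation="softmax"),  # one output per digit
    ])

    # Gradients and backpropagation are hidden behind these two calls.
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(x_train, y_train, epochs=3)
    print(model.evaluate(x_test, y_test))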


I have a lot of sympathy for this view, but I think you should be a bit more specific about whose "expectation". ML has exceeded my "expectations" on the 5-year horizon, and I'm expecting saturation of the class of current-ML-solvable problems soon; but (barring some sort of paradigm shift that I can't see at the moment) the siloed nature of ML training will mean that actual applications still create strong demand for deployment solutions for another 5-7 years.


People have said this about apps and it's still going. I think eventually ML will become a grunt job like Android, PHP and so on.


What's cool for developers like you and me is that AI/ML technologies are being commoditized.

I have no idea how to build an image processor, or a database, or cloud infrastructure, but I can use Pillow, Postgres, and AWS to build products on top of those capabilities. AI is starting down the same route, where the number of developers who use it will vastly outstrip the number of developers who contribute directly to it.


This is not cool. The idea is to create very easily replaceable jobs, which makes sense for the creators of those platforms but not for their users. Even if it is hard and requires time to master, it's better to know the basics and aim for jobs that are relatively difficult to replace.


If you're worried about this, I would recommend learning about the "complements" to AI -- data preparation and distributed computing. Those both involve a lot of conventional software skills.

What you don't get from the flashy news articles is that a lot of ML work is programming and sys admin work: writing code to prepare the data, and a cycle of tweaking and re-running large jobs (including debugging them when they run out of resources, etc.).
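For a flavor of what that data preparation looks like in practice, here is a small, hedged sketch using pandas; the file and column names are made up for illustration:

    # Hypothetical cleanup pass before a training job; all names invented.
    import pandas as pd

    df = pd.read_csv("events.csv")                   # raw export from somewhere
    df = df.drop_duplicates()
    df["email"] = df["email"].str.strip().str.lower()
    df["signup_date"] = pd.to_datetime(df["signup_date"], errors="coerce")
    df = df.dropna(subset=["signup_date"])           # drop rows the parser rejected
    df.to_csv("events_clean.csv", index=False)       # hand off to the training job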

You can also read papers like the one below, which talks about the software engineering challenges of machine learning systems.

"Machine Learning: The High Interest Credit Card of Technical Debt"

https://research.google.com/pubs/pub43146.html

There's a big difference between "we got this thing to work once" (which is most of the news articles you read) and a system that does useful work over a long period of time. The latter is what you should know about as a software developer.


I do distributed computing and feel just as dark about my prospects as the OP does. You use TensorFlow and TensorFlow Serving and call it a day? Compared to the advances in ML, systems is pretty well mined out. The biggest excitements in the last 5 years were Docker and Kubernetes. There was a small blip of consensus systems around the time Docker was being rolled into production. I feel kinda depressed :'(


There are plenty of comments here saying no, your career is not doomed. I'll add that using Hacker News is not a strong way of grasping industry needs. If one took one's advice solely from Hacker News, one might think the only tech that matters is JavaScript and the associated framework of the season. This is simply not true: check out indeed.com; many, many languages are in demand. Lastly, AI might be the thing of the future, but the press, even the "tech" press, has a way of bubbling up and propagating hot stuff.

Remember "nanotechnology" from a few years ago? I recall reading, in a respected source, that soon we would all have nanotechnology-infused clothing off which all foreign particles would simply roll! Well... still wearing my cotton slacks today...


I was in "nanotechnology" during its heyday in academia and the inside joke was that everyone slapped "nanoparticles" onto whatever grant proposal they were writing, because the funding probably increased greatly with buzzword association.


Right: reading HN, you would think Django/Rails is what people are using for web frameworks. But actually Spring is more popular, if you check Stack Overflow Careers for example. HN is an echo chamber of ML and JS hype.


But if you look at enterprise trends over the past 2-3 years, it is also moving to JS. So if HN is an echo chamber of future trends, it's a good advantage for a developer who wants to become top notch, isn't it? That is how I see HN (on the technical side).


Enterprise fully embraced Java and Spring for the web, and their popularity is growing.


I think I've seen more JS bashing on Hacker News than anywhere else, so I disagree with your example. I'm not sure, though, whether this is a good place to grasp industry needs.


The difference is that nanotechnology never worked.


And machine learning doesn’t solve 90% of the problems people think it does


1) ML is still going to be a tiny niche, by the numbers, among all the software things that need to be done; it's just a comparatively new and trendy niche, so it gets talked about.

2) In most ML projects, the majority of the necessary code has nothing to do with ML: it's all the data transformation, the transformation pipeline, the service backend, the data management tools, etc.; so in a multi-person ML project, most developers usually don't touch the ML part.


100% correct.

I am a statistician and a developer, and my work falls into the domain of machine learning. The vast majority of my job is really painful data gathering, scrubbing, and fighting the toolchain. I have done this for YEARS, and the software is never just right: it requires tons of tuning, huge budgets, and understanding management. These variables just don't line up that often. OP has nothing to worry about.

Moreover, the math stuff just gets abstracted away into a library. It's never like what went on in graduate school.


If your toolkit is centered around "web" programming then I think you should consider widening your reach. If there are phyla of developers, they could arguably be defined as:

   * embedded
   * web
   * control systems
   * data analysis
   * finance
   * networking
   * systems
The more bases you cover, the more likely you are to find a job developing code during a transition. Note that for senior engineers it isn't about the programming language; it is about the domain knowledge around the programs.


Although having AI/ML expertise is obviously overkill for most engineers, understanding the concepts and limitations of AI/ML (specifically, that it is not a magic spell that can solve any problem no matter what) is a helpful skill for any engineering field.

But don’t say “I took Andrew Ng’s Coursera class and now I’m an expert on AI/ML!” like many engineers tend to do nowadays.


Agree on this point, but I'd add that you should do it only if you're genuinely curious and not just because you feel you have to. I'm finishing the Udacity nanodegree, and while I'm not sure I'd recommend that specific program, I've really liked getting exposed to the fundamentals and concepts.

It reminds me of when I first started programming: I started seeing everything a little bit differently. I'm old and this will likely have little to no impact on my career, but it's been a lot of fun.


I'll be signing up for that.


No, you're fine. What you're seeing is the emergence of another subfield of computer science.

Your question isn't too dissimilar from: Is my career as a back-end developer doomed if I don't become a JavaScript/React developer?


You know, being able to make a microprocessor from scratch and being able to supply the right instructions to get what you want are different domains. As long as you can understand what machine learning is _doing_, I think getting practice with some libraries and watching a lecture or two will give you the ability to play with models and train them.

Basically, you have a training set of data that says "this is the data in" and "these are the outputs I expect." So if you are training something to learn the different animals of the animal kingdom, your data set might be portraits of animals, and your outputs might be simple labels like "gorilla, snake, giraffe, toucan", etc. The neural net part of it is the set of parameters you can hone. You can hone how many layers deep it is (try a layer for each dimension; if you are tracking 5 facial features, you can try a network of roughly 5 layers). You have an "activation function" [usually a sigmoid] because the _neurons_ sum their inputs and only fire past an activation threshold (like your brain's neurons).

Anyway, don't let the maths hold you back. When you can find a Python library for neural networks and play with it well enough to get results, you have pretty much figured out the puzzle. Not many people have deep intuition on exactly how it works, because the parameters (the "brain" of "neurons") are really an abstract mathematical object that is adjusted, regression-style, over the course of learning [the training data].

Don't be discouraged. There is plenty of growth in software, and unless you are trying to help make a breakthrough in ML or "artificial intelligence" (which, in my opinion, is something we have not even come close to touching), you don't really need to know the nitty-gritty. The promising modern-day results in ML came about because someone wanted to model a brain with a computer and ended up learning how to pattern match.

      (Results match Data?  Accept : Adjust)
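As a concrete, hedged illustration of that recipe (scikit-learn is one such Python library; its bundled digits dataset stands in for the animal portraits):

    # Data in, expected labels out, two hidden layers, sigmoid activation:
    # the recipe described above, with nothing here claimed as optimal.
    from sklearn.datasets import load_digits
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    X, y = load_digits(return_X_y=True)        # inputs and expected outputs
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Each neuron sums its inputs and fires through a sigmoid ("logistic").
    net = MLPClassifier(hidden_layer_sizes=(32, 32), activation="logistic",
                        max_iter=1000, random_state=0)
    net.fit(X_train, y_train)                  # adjust until results match data
    print(net.score(X_test, y_test))           # accuracy on held-out examples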


No.

1997: Is my career doomed if I don't become a web developer? No.

2007: Is my career doomed if I don't become a mobile developer? No.

2017: Is my career doomed if I don't become an AI/ML developer? No.

2027: Is my career doomed if I don't become a/an [...]? No.


Best answer so far. Technology fads come and go; anyone in the business for any length of time has seen many of them. If you just get your news from the HN bubble, you're likely to believe AI/ML is all anyone does anymore, but I'd speculate that it's needed in less than 1% of all programming projects out there, now and in the near future.

If you're a programmer, focus on the fundamentals: Learn to write new code and maintain legacy code. Learn to write tests. Learn API design. Learn how to estimate projects. Learn to differentiate between a risky code change and one that isn't risky. Learn to refactor something without changing its functionality. Read/know the literature. Don't get hung up on any particular language, framework or technology.


The fads are cyclical, too. This isn't the first go-around for AI; the last time AI was hot was the '80s, and then there was the famous AI "winter". It's important to recognize hype for what it is. Like the thread on meteor.js the other day: after the hype subsided, a lot of people realized they didn't really need what meteor.js provided.


Seems like the "HN bubble" is a large part of this person's challenge. Like you say, "Technology fads come and go," and there's no place quite like HN to have your sanity picked apart by fad-feeding frenzies and, when all is said and done, by the often time-wasting arguments and ultimately irrelevant material that flies around here.


Are my finances doomed if I didn't buy bitcoin at USD 100 some years ago?


Probably not... but, do you kind of hate yourself for missing the wave? Probably...


If you didn't buy bitcoin back in 1978, then yes, you're doomed.


Exactly. The same thing happened with mobile a few years ago, and it just didn't click with me. There was a bubble of mobile developers, apps, and firms. I can recall at least 4 such firms that placed all their bets on mobile projects, and they're not around any more. And mobile has/had a much, much greater market-shaping influence than AI/ML will.

Actually, be wary of this sort of "specialization", because if the market shrinks or stops growing, you'll end up isolated from the work force.


Evergreen. I've been doing this professionally since 2000, with coding going back to the mid 90s when I was in school.

I'm at a client site that is still rolling things from their MAINFRAMES into web apps. I know some old LISP programmers who are freaking out that major systems will die because the youngest person on the team is pushing 60 with no code or programmer replacement in sight.

There's always a long tail, and always something new coming out.


2018: Is my career doomed if I don't become a blockchain/crypto developer? No.


Is my career doomed if I don't become a sysadmin?


2027... quantum developer


I think you'll be okay.

If you want to get a look at how AI and ML are going to end up being applied to a huge number of business problems, try out the Azure Cognitive Services APIs[1]. I'm not associated with them in any way, I've just played around with the APIs and enjoyed them.

A huge number of developers today wouldn't be able to re-implement the frameworks they build applications on top of. And that doesn't stop them from being productive and well paid. They add value by being able to recognize real-world problems and understand how various libraries and frameworks can be combined to solve those problems.

I think that AI/ML over the next 10 years will be like SaaS was over the previous 10 years. When frameworks like Rails became popular and gained wide acceptance, it became easy to very quickly come up with web-based solutions to a huge number of problems. And people didn't do this by being experts on how the whole tech stack worked. They did it by knowing how to solve problems with specific technologies, and then recognizing problems that the technology could solve. Lots of startups have made piles of money this way.

I'm a huge proponent of understanding as much as possible of the theory and math behind the technology you use. But being able to solve problems with the technology requires you to be out in the world, recognizing problems that can be solved. For example: take a look at the kinds of output the Azure Computer Vision API can provide when given an image[2]. Now think of all the businesses you've encountered during your career. Most of those businesses have problems you could solve using that API. And if you don't like MS or Azure, they're far from the only provider of such APIs.

There is, and will continue to be, lots of opportunity (and money) available for people who can understand how to use new technology and recognize how to apply it to problems in a way that will save companies money, or help them make more money. Be one of those people.

[1] https://azure.microsoft.com/en-ca/services/cognitive-service...
[2] https://azure.microsoft.com/en-ca/services/cognitive-service...
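As a rough, hedged sketch of what calling such an API looks like from plain Python (the region, API version, and parameter names below are assumptions; check the linked docs for the current contract):

    # Hypothetical call to a hosted image-analysis API; the key, region,
    # and image URL are placeholders, not verified values.
    import requests

    endpoint = "https://westus.api.cognitive.microsoft.com/vision/v1.0/analyze"
    headers = {"Ocp-Apim-Subscription-Key": "YOUR_KEY_HERE"}   # placeholder key
    params = {"visualFeatures": "Description,Tags"}
    body = {"url": "https://example.com/storefront.jpg"}       # placeholder image

    resp = requests.post(endpoint, headers=headers, params=params, json=body)
    resp.raise_for_status()
    print(resp.json())   # tags plus a machine-generated caption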


I've been a professional web developer for 8 years. I don't have a C.S. degree either, though I did go through calculus (not really relevant). Honestly, I think it's far more important to be a solid programmer than to learn the hype of the month. That's not to say you shouldn't keep up with developments in technology, though. I think it's just a fad that will fade out. Sure, it sounds like you can do some cool stuff such as image generation or natural language processing. However, I think those applications are not as numerous as the posts on HN would lead one to believe. Not everything needs some magical AI to synergize the widgets.


There is going to be a massive infrastructure built out in the way of AI services. You're going to be able to interact with those services using numerous existing languages, including Java, C#, Python, and on down the line to PHP. There will be several tiers of complexity; what we're seeing built now is the most difficult first tier. In the coming years, more and more layers of simplicity will be built, abstracting away the complexity. AI broadly is another tool, that's all it is, and we'll build a lot of ways to use that tool.


At my university the "intro to AI" undergrad course has exploded into a 500-student monstrosity (and they're struggling to find a prof to teach a second section). The department has rules in place to make sure that CS students get first dibs on the course because of how many non-CS majors want in on the AI gold rush.

Much of this is overhype. ML is no replacement for good software development. Your career won't be dead. ML will undoubtedly be integrated into toolsets that don't involve any math whatsoever.


There is still a significant technical hurdle to overcome: positioning a programming problem as a machine learning problem. If we can take a semi-formal, compilable spec snippet as a feature set, and an AST subgraph as a labeled point, and bash that into tensors, then yes, a lot of code monkeys will be out of a job. For all the hype about automating the jobs of waiters and truck drivers, software engineering will be the first job to be "roboticized": it has the fewest meatspace moving parts.


A lot of people are saying here that your career won't be doomed, and I think they make decent points. However, to play devil's advocate, I would like to argue that your career is in fact doomed if you don't learn AI/ML.

A lot of folks like to smugly predict the future based on their individual experience of past technology cycles. The advancements coming out of AI, however, are completely different from anything humans have seen before. It is entirely possible with deep learning that entire sectors of tech could be automated away. Maybe front-end development doesn't get fully automated, but perhaps we build tech that makes a single developer 10-100x more productive? Then the labor demand for front-end developers could drop in the same way that bank teller jobs have dropped due to ATMs and online banking. I would argue that while software is eating the world, AI is going to eat software.

Don't discount AI. Learn as much as you can about it and perhaps your career will be the last one to be automated away...


Bank teller jobs have actually increased since ATMs were introduced. I don't necessarily disagree with your point, but bad example.

https://www.wsj.com/articles/technology-isnt-a-job-killer-14...


According to a recent interview with OpenAI from YC's blog [1], they believe that an engineer without any AI/ML experience can be productive on their team from Day 1, and that AI/ML makes up 'virtually none' of the work. The vast majority of the work is apparently in building the infrastructure to support it.

In conclusion, it'll be an easy transition and you can provide most of the value an AI/ML company needs through pure engineering.

[1] http://blog.ycombinator.com/building-dota-bots-that-beat-pro...


You'll be fine.

Yes, it's true that if you happen to have expertise in ML or a related field then your career prospects right now are looking good. But not having ML absolutely doesn't mean that your career prospects are bad. To use medicine as a parallel: sure, it might be sexy and lucrative to be a Brain Surgeon (or whatever), but 1) only a tiny percentage of doctors can be brain surgeons, and 2) being a doctor is a solid and well-paid career no matter what you specialize in.

As for the people who say ML is going to make most software developers obsolete because machines are going to learn to write code: don't believe the hype. A large chunk of the work programmers do is translating vague specifications from other humans into precise technical concepts (and only then into code). The only kind of machine that could really do that kind of work well would be a general AI, and that is 1) not even on the horizon in terms of our current technical knowledge, and 2) basically a social and cultural singularity for humanity as a whole, so worrying about your own career prospects in those circumstances would honestly be kind of silly.


Not at all. You can still learn about blockchain.

Jokes aside, you don't have to learn AI/ML specifically to remain relevant, but you do have to learn something. You can learn:

- about a domain (energy, finance, medicine),

- a toolset (math, stats, programming, design),

- a "hot" new technology (bitcoin, drones, wearables, VR),

- or you can put together a network of talented people, and so on.

You have to do something, but not necessarily any particular thing.


Short answer: no.

Long answer: most if not all people who go on about AI/ML/data science etc. are people who don't really understand the nuances completely. They might have done Andrew Ng's course on Coursera and now call themselves AI/ML experts. So they are the physical manifestation of the proverb "empty vessels make the most noise".


It's not practical to do complex math every time there is a new application of AI, so the amount of math necessary will be limited. Devs will probably soon need to be able to use AI libraries and/or cloud systems. This may be complex and occasionally involve a little math, but it doesn't require a math degree. Less math will be necessary over time as off-the-shelf tools become more powerful.

The next part is controversial but I personally believe the first unimpressive but totally general AI systems will appear in 2019. They will be animal-like intelligences, with real or simulated embodiment, inputs and outputs that can serve any purpose, online learners that can reuse the same nets across tasks.

As AGI becomes more powerful in the early to mid 2020s it will become apparent that most of the existing jobs will either be replaced or have to change to something the AIs can't or aren't allowed to do. AIs will be able to code.


Why 2019?


2019 because it looks to me like some groups like Ogma, OpenAI, and DeepMind, as well as probably several others that aren't widely publicized, are starting to put various aspects of it together, but it seems too speculative to assume anyone will have all of the capabilities, at a strong enough level, next year. I think someone may put it together more or less in 2018, but it will be nailed in a way that most can't deny in 2019.


For most developers, AI/ML will be just another tool in their toolbox.

You probably use compilers and/or interpreters every day, but most programmers I know have never written one. Have you? Software security (web/internet security) is a concern, and you probably know the basics and how to use the libraries, but are you a security researcher who works on new encryption algorithms for a living? Probably not. A lot of major technology nowadays uses hardware acceleration; even browsers use OpenGL/Vulkan/DirectX for rendering. But do you need to know how to write a 3D engine?

What I'm saying is: it won't hurt to know how to use AI/ML. A few will earn their living as AI/ML researchers and developers. But will it be necessary? I don't think so. It'll be just another branch of specialization.


I'm seeing associate-level devops positions being completely replaced by ML. I was deeply skeptical, and still am, but I can't argue with the results. I'm a little worried about dev teams relying on hardware they don't understand, but that's happening with or without ML.


I have the same questions, but to my surprise, there are more software engineer/senior software engineer jobs in the Bay Area than data scientist/machine learning engineer jobs, according to LinkedIn. That's the scenario now! No idea how it's going to be 3-5 years from now.


I could easily be wrong, but I suspect the solution to AGI will not involve much in the way of complex math (I have been of the opinion for a long time that NNs are not the solution; see my recent Twitter thread for elaboration: https://twitter.com/gavanw/status/929085831104512001).

Analogously, my advanced calculus is pretty weak these days, but I have found that a strong understanding of basic algebra and vector math is all you need to be a good graphics programmer. There are many fields like this, where you do not need to read or understand a single whitepaper in order to be functional within them.


No. If you can just write scripts to clean up the data used in ML/AI models, you'll likely spin gold for the next decade or so.


This. I'm transitioning from being a full-stack web dev (because that's basically impossible now) to being a data integration developer.

Old-school ETL grew up alongside the ML folks into something sophisticated, essentially another tech stack that looks like back-end programming. Figuring out how to move data around reliably and quickly, and clean it up in the process, is an unsexy job that's in really high demand. Using code (often JS) to pull out data and do mockups for the BI people, before they plug in their fancy dataviz tools and do the pretty stuff, is a lot like front-end programming: also not sexy, but in high demand.
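A toy, self-contained sketch of that extract-clean-load loop, using stdlib sqlite3 so it runs as-is; the table and column names are made up:

    # Extract from one store, clean in the process, load into another.
    import sqlite3

    # Hypothetical upstream data, inlined so the sketch is runnable.
    src = sqlite3.connect(":memory:")
    src.execute("CREATE TABLE users (id INTEGER, email TEXT)")
    src.executemany("INSERT INTO users VALUES (?, ?)",
                    [(1, " Ada@Example.com "), (2, None), (3, "bob@example.com")])

    dst = sqlite3.connect(":memory:")   # stands in for the warehouse
    dst.execute("CREATE TABLE users_clean (id INTEGER, email TEXT)")

    rows = src.execute("SELECT id, email FROM users").fetchall()
    cleaned = [(i, e.strip().lower()) for i, e in rows if e]   # clean in process
    dst.executemany("INSERT INTO users_clean VALUES (?, ?)", cleaned)
    dst.commit()
    print(dst.execute("SELECT * FROM users_clean").fetchall())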


Think of it this way: how many useful tools can be described by a simple CRUD app? How many of those actually NEED AI? Does the HR software need AI to show an org list, or manage the company calendar? What about banking software? What about just about every other app out there? Sure, you can get creative and think of many ways these could be enhanced by AI, but realistically there's going to be a demand for simple AI-free software long after we are all dead. And even when you do have HR software with AI built in, 90% of the app is still going to be simple CRUD stuff.


Understanding AI/ML will soon become something like knowing how to use regular expressions, parser generators, and other such tools. The most common cases will fade into the background as more languages and libraries support them, and you'll no more consider yourself an AI/ML developer than you'd consider yourself a "regex" developer.


Most work will be done using environments like TensorFlow. It doesn't mean you don't need to learn how things work; it just means that you don't need that much maths beyond understanding the concepts.

There will be plenty of grunt work as well as work related to developing and maintaining the infrastructure.


You'll be good until we figure out how to make machines write enterprise-ready software and maintain it on their own. I think you have 10-20 years, and you should be in management by then, so you have nothing to worry about :)


There is still a need for COBOL developers. I think Java will be fine for a fair few years yet.


It seems like a lot of the AI stuff is being abstracted high enough to make a transition into it pretty easy. If you’re cool with learning something new, then you’ll be fine!


I'm a distributed systems person. Not worried at all. There are lots of specializations - just choose one that is interesting to you!


You're never doomed if you keep learning -- and humans will be writing a lot of code for quite a while yet I believe.


If you ask me, ML/AI jobs have a better chance of being automated than back-end/front-end jobs.


If you are interested in the field, it might be helpful to pay for a teacher and ask for advice. A teacher will be able to trace a path from the level you are at now and help you learn just enough math to get started. It's way easier, takes less time, and is less frustrating. In France it's possible to get a math teacher for €30 to €50 an hour.


No, your career is not doomed. Follow your insight. Keep reinventing yourself, stay curious, and keep learning, be it in your career, in AI/ML, or in anything else.

Try to make friends, get involved in projects.

Good luck!



