Learning at work is work, and we must make space for it (sloanreview.mit.edu)
1673 points by sarapeyton on Dec 11, 2019 | 453 comments



I’ve always learned on the job and have never asked permission. I guess I’m lucky that I haven’t worked in the type of places where somebody’s looking over my shoulder every minute of the day. I spend somewhere around an hour a day, every day; I’ve been doing it for years and nobody has ever said anything about it.

I subscribe to those weekly emails for the programming languages we use at work and I read them when they come in. I sometimes watch a conference talk about implementing something similar to whatever I’m scheduled to do next.

If I were running a company I’d expect this of all high-level employees. It’s your responsibility to be on top of whatever’s going on in your field.


I've started using Friday as a "personal development" day at work.

I do not write any code on Friday (unless it's a severe production-level issue). Instead, I spend the mornings reviewing PRs that I wasn't included on (to keep up with what's happening, but also to learn more about how other people write and review code) and the afternoons reading, researching, and taking online classes.

This has really helped me avoid burnout. I go into the weekend less exhausted and more motivated to return on Monday and implement new stuff. It has also helped generate some inspiration for weekend/personal projects.


This sounds like a good idea that I would really like to try, for my own sanity if for nothing else. For me, the issue—imagined or not—would arise in the Friday morning daily stand-up. I’m not sure it would go over well if I said that I intend to spend part of the day doing PRs (this is fine and expected) and the other part learning/researching (likely not).

Oh the joys of the JIRA sweatshop. We have JIRA pulled up on the big-screen TV and the product guys cycle through the status of each dev team member’s items during each daily standup. This is my first development job; surely it’s not like this everywhere?


The way we handle this is by padding our sprints enough so that there's always "extra time" at the end.

E.g., if we as engineers think we can do 12 tickets in a one-week sprint, we commit to 8. This leaves room to pick up any production-issue work, address tech debt, and have some breathing room so we're not rushing through JIRA tickets.
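
For what it's worth, that padding can be made mechanical instead of a fresh judgment call each sprint. A minimal sketch in Python, assuming you track how many tickets actually got finished in recent sprints (the function name, the numbers, and the 2/3 ratio are purely illustrative, not from any real tool):

    # Illustrative only: derive a padded sprint commitment from recent throughput.
    def padded_commitment(recent_completed, buffer_ratio=2 / 3):
        """Commit to a fraction of average completed tickets, leaving slack
        for production issues, tech debt, and breathing room."""
        average = sum(recent_completed) / len(recent_completed)
        return int(average * buffer_ratio)

    # If the team finished 11, 13, and 12 tickets in the last three sprints:
    print(padded_commitment([11, 13, 12]))  # -> 8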

It's also workplace-specific, but IMO product people shouldn't be engaging in daily standups beyond passive observation. If they are drilling through the JIRA board and asking for status updates, they're taking on the role of a micromanager, not a product owner/manager.

Standups should be engineers talking to each other and raising blockers/issues that would prevent them from meeting the sprint goal they committed to, not daily check-ins with product (again, IMO).


How does this fly in an organization where your promotion is based on sprints and not sustainable coding/quality?


If you want to contribute to the value of the organization because you're a substantial equity holder, then talk to the founders. A sweatshop with high turnover should be less valuable to an acquirer than a strong team. Your initiative may be well received as a sign of leadership, and you can get your promotion that way.

If you want to stay in the organization and get promoted by performance metrics, then figure out what numbers are being evaluated and work to maximize them, so that you appear as the model worker. It's not very hard to stay one step ahead of management if that's the game you're playing. If you have the best relationships and the best metrics, you should be able to get your promotions.

If you want to stay in the organization and coast, then who cares, just do what you want and they're almost certainly not going to fire you without plenty of warning. If you do get put on a PIP, then go find a new job immediately, and repeat. From what I can tell, most of the workforce does something similar to this for 30-40 years.

If you're not a leader, not a model worker, and not on autopilot, then you probably need to find a new organization where you can be one of those things.


You seem to have reiterated the Gervais Principle:

https://www.ribbonfarm.com/2009/10/07/the-gervais-principle-...

(although leader, model worker, autopilot are more polite terms than sociopath, clueless, and losers)


Ha, that's fantastic! I didn't know it had a name.


Not just a name, but a whole collection of essays characterizing modern corporate life. And you just managed to summarise its takeaway very eloquently in ordinary, everyday language.


What an interesting read. Thanks!


Eye-opening information. Thanks!


Such an organization is begging for quickly written code full of bugs. Then you do some overtime fixing it, and you are a hero for a moment. But before you get promoted, you get sick because your body cannot handle all that stress anymore.

Sometimes you learn to play around the rules. For example, if you are required to report your progress every day (and "sharpening the axe" is frowned upon), a possible solution is to report only half of Thursday's work on Friday morning, and keep the rest for the Monday report.


The beauty of imaginary numbers (sometimes known as story points) is that they are just that — imaginary. And as the dev, you know how to massage them to your liking.


It doesn't; find a better job.


Man, I wish this were how my last place did it. Instead, if we were getting through 12 we'd commit to 14, and then the dev manager would wonder why stuff wasn't getting done.


It's not like that everywhere.

> product guys cycle through the status of each dev team member’s items during each daily standup

Something has gone horribly off the rails. Standup is supposed to be a quick chance for every team member to raise blockers, primarily, along with a quick "here's what I did, here's what I'm doing" blurb. The benefit of standup is identifying blockers and spotting engineers who are getting mired in problems (so you can fix those issues outside of standup).


A place where I worked went off the rails when standups were used to let two people work through a problem while the rest of us sat there patiently. They grew to be 45 minutes long each day.

People brought chairs. To a standup!


That sounds like an opportunity to very politely explain the phrase "take it offline" to everyone. It would be awkward, but probably less awkward than standing there patiently doing nothing for half an hour.


This is where the Scrum Master should step in and move the conversation along.


Agreed. PMs (product or project) should rarely speak during standups; instead, they should be listening for blockers and other action items for them.

I've worked in environments where the PMs micromanage like this. It's both hellish and extremely inefficient.


It's almost like... *gasp* ...management used agile to implement micromanagement all along.


I logged in just to make this same comment but you beat me to it. Agile is micromanagement in disguise, and we fell for it hook, line, and sinker.


To be fair, we were already micromanaged. Then Agile was posited as a solution, so of course we were enthusiastic. Unfortunately the only thing that appeared to change was the name.

So we’re not really any worse off than before.


Are you sure you haven’t just always been micromanaged and this is another way for that to happen? All things can be abused for bad purposes.

I haven’t heard too many examples of “oh man we used to be free and ship great features on time but now that we have a backlog and talk to each other every day it’s a hellscape death march”


I've done my stint in "large enterprise". It's a whole new world there where management believe that if projects aren't on track, then the solution is more meetings, more agile training and in-house "coaches", and even more micromanaging. I'm no longer in that world, and I'll never go back.


As someone who has also worked for large enterprises as well as government, I’d say that agile can be another way to micromanage, but it can also be used in a way that improves product quality and impact while helping developers.

Generally, I’ve found that the degree to which agile approaches make life better depends on how much management is actually willing to let the team do its work and stay empowered. This can happen with trust and top cover in large enterprises, but it takes constant work at the PO / product manager level; otherwise regression to the mean takes over. It's also hard to avoid the inertia of making successful teams bigger rather than letting them stay small.

I wonder how the Amazons / Facebooks of the world avoid the trap, but then their enabling teams are likely a big percentage of their workforce because they understand how important software is to their business.


By top cover, do you mean managers shielding lower-level employees? That was my experience at a mid-size finance firm; it was only possible for my team to do decent work because my team lead was fairly competent at keeping the higher-ups at bay.


The best managers I've worked for have understood that one of their key roles is to be shit-umbrellas.


Precisely.


I stopped going to my team's standups because of this. For a few weeks I was occasionally asked about what I was doing, I just always said I was busy on whatever I was working on. Now I'm the only member of my team who never joins the standups. I wonder if they resent me for that, but it's not worth going back.


Probably they do resent you. What feels like a better approach is discussing this in the retrospective, or having an informal conversation about it. Then again, team members need to be open to discussing it and viewing it from another angle.

Also, by discussing the issue you might discover others feel the same way and are open to changing the process.


If this were the case then standups would be totally useless unless the org was so cooked that nothing gets done unless someone is made accountable in front of the entire team.

If I run into a blocker why would I wait until the next morning to try and get it resolved?


Usually the idea is that you move on to another task, since you have multiple allocated at a time, then ask briefly for guidance during the standup, when everyone has put time aside, rather than interrupting everyone throughout the day.

Blockers are not just people and usually it's not done for accountability. They can be undocumented APIs, an unfamiliar requirement ("do we already have a way to extract images from PDFs before I download this new thing?"), etc.

If it is urgent and your only option is to sit there and do nothing then asking the team for help during the day is fine. Just weigh up the cost of interrupting everyone (or that high-performer who has all the answers) against what you gain.


> If it is urgent and your only option is to sit there and do nothing then asking the team for help during the day is fine.

Sitting there doing nothing for a while may just be the right thing to do!

I'm thinking about the Theory of Constraints and optimizing a system versus its individual parts. This is a concept you would want your PM to know, so that when they mentally map the workflow they hear during standups, they can think about optimizing the entire production system.

Next step is the importance of communicating this to everyone on the team, so that they understand that there may be times when the right thing to do is nothing.


"Hey, I spent all yesterday trying to get X to work but I think I'm stuck"

"Hey I was trying to use Mary's api but I couldn't figure it out, can someone help me find who to contact?"

If you immediately ask for help all the time, you aren't being self-sufficient. At the same time, if you are spinning your wheels, just a second pair of eyes can be helpful.


I had a new manager start his first weekly meeting by asking if there was anything on fire. People filled the ensuing silence with recent annoyances. Of course nothing is on fire right now (and I get it’s a metaphor). If something is “on fire”, I’m not waiting till the Monday meeting to bring it up. I’m not even going to attend the mandatory meeting while I put it out, if that’s when it happens. That’s the definition of on fire.


To be fair, your new manager doesn't know that's how you operate. It seems a reasonable question to me, with the expected answer being no. But why not ask just in case? You wouldn't want to be the new manager who just launches into new business not realizing that one of your developers is too timid to interrupt you with the major breaking bug that's currently live, or whatever. Once they get to know the team and can trust that you'd be on top of that kind of thing, it's another story.


Maybe I’m too sensitive to the phrase “on fire”. I can’t imagine someone at a factory starting a meeting the same way.


He was perhaps trying to get a feel for the team?


Yep, that's a long way from ideal.

You're unlucky; there are a lot of software teams out there where the developers are pretty much autonomous. That's not to say they do absolutely whatever they want, but micromanagement is usually out of the question.

A couple of months ago I spent about half my working hours in a week watching everything that happened at .NET Conf. I didn't ask anyone; I just said I was taking training time, and since I don't take much, no one cared.

If I were in your position, I'd take this as a sign that there are better places to work. Wait out your current job until it's added value to your CV (a few good projects and contributions you can talk about), then move on. You'll get more money, and you'll probably have a better idea of what to look for.


Any idea what to look for when searching for such a job? I have asked about micromanagement in interviews before but I usually just get bullshit answers.


Brainstorming a bit on this. Curious for other ideas:

Start by asking questions about how work is assigned and doled out. Where does all of their work come from: the Agile board? Someone stopping by and asking "Can you do X?" or "Can you help Jill with Y?"

Follow that up by digging into how they talk to others about their progress on that work. Ask whether they're interrupted or allowed to progress independently. How often does someone ask them "Is that done yet?"

Ask about how often they're asked about the status of the same piece of work by different people. Dig into how they keep everyone else apprised of what they're doing.

Ask them about their relationship with their Scrum Master, Project Manager, Product Owner, Dev Manager, etc. Ask what they could change about it if they could.

That line of questioning may uncover a micromanagement pattern. Even if it doesn't, it will go a long way toward helping you get a feel for how a team works.


I'm not sure about other places, but the teams I've worked on and worked with at Google work like this.


Where I work standups are for devs only. The product guys are involved in planning and reveals, but otherwise they stay out of our hair. We're expected to complete what we've committed to on time or promptly notify the product guys if that's not going to happen for whatever reason.

About once a week (usually Friday) I tell my team I'm going to spend the day researching or exploring some new idea that doesn't have any stories yet. It's never a problem. A decent chunk of our big leaps forward for our projects have come from us working on random stuff nobody told us to work on.


By the way, this is the official Scrum way. Product people should be there for planning and for the demo. If at the end of the sprint they got what they wanted, they should be happy and do their own work... or relax if they have nothing else to do.

Micromanaging means you have nothing useful to do, and you are needlessly making other people angry. Such people should be fired first. (Yeah, I know, they are often the last ones to stay, because people who have better options leave first.)


You may want to describe part of your work on a task as reviewing solutions (for a given PR) and then, after the standup, spend some time learning about tech related to whatever PRs you have assigned.

If the product guys are any good, they will care about having 'options' and will therefore welcome devs who take a broader look at the solution space.


It isn’t like that everywhere but it’s quite common, especially in companies where development is not the primary focus of the company. You’re more likely to end up in that kind of company than not unless you are in a place like the Bay Area or similar tech hub.


Replying purely for reinforcement of the parent.

This is not just common, but the norm in most organizations. I think the only time in the past 12 years (working with probably 20 organizations) where this didn't occur was the one engagement where I was charged with managing the team.

There are far too many places where a team of even just four or five developers will spend 30 minutes (sometimes an hour!) every morning in "standup" being grilled on every single item committed to in the sprint.


I recently had my first encounter with "agile" and "scrum", and this was my experience too. I left after 6 months for unrelated reasons, but one of the things that made it an easy decision was the daily fucking standup.

I convinced the team to do without the standup and just put the status updates in Slack for a week. At the end, I was the only one who thought it was better. To quote one developer: "I don't read it when it's in Slack." That indicates to me that the information isn't necessary to do your goddamned job.

I dislike agile now as a result of that experience.


It is like this in two out of the four places that I’ve worked.

Those two were both large corporations. It is my theory that it happens as soon as you have a separate scrum/project manager for the project you work on.


Stand up for what you believe in. Tell them you are spending the day on research.

Better yet, cancel that standup because it's garbage. Only one representative needs to go to the product meeting.


Well, you don't have to mention that you are doing your own research...

Do they look over your shoulder every minute?


This is what I encourage my staff to do. We have a company-wide bagel breakfast in the morning, and then no more meetings. We also skip our daily “async slack stand-up” on Fridays. My expectation for the day is that you prioritize personal development, long-term planning, or just “deck clearing” if you’ve had a long week or desire something less mentally demanding for the day. Obviously, things happen that can interfere with the ideal, but it’s pretty rare and, when it happens, I make it a point to figure out if there’s a way to avoid it going forward.

What we don’t do, is “no deploy Fridays”. We deploy all the time, and maybe somebody wants to do some minor bug or light UI cleanup. I’m certainly not going to stop them. Requiring an “off-day” for deployments is a major red flag for me.


Except when the Friday deployment breaks production and one of your employees needs to fix the issue on Saturday instead of relaxing with their family.


You seem to be getting downvoted, but this is important. Unless the company is paying staff to be available and on call over the weekend, people have lives to be getting on with and wouldn’t be able to respond to an issue as quickly as during the week.


I understand this isn't a perfect world and problems happen but we also shouldn't live in constant fear that a deployment will break production. There should be processes in place that prevent broken code from even getting to the point where it's merged into master.

The most extreme case I've faced was a team where there was a 36-hour window between Wednesday and Thursday when deployments to production were allowed, and it was a nightmare getting any code out, especially with most deployments involving 10-20 new commits. If something did break, we had to roll back a lot and rack our brains to figure out what went wrong.


I agree. Employers, teams, and employees need to have clear expectations and communication around availability, including how that is related to compensation, but also how it relates to the design of the systems themselves, the planning that goes into rollouts, and a culture of quality and accountability.

Also, sites "go down" for more reasons than deployments, and often, a deployment is one of the fastest or only ways to fix or work around the issue.


The company I'm at sets aside every other Friday for all engineers to learn, contribute to open source, hack on an idea they may have (that may or may not be related to work), etc.

So far the only problem I've seen is that sometimes it's hard to convince engineers to actually use the time instead of still working on tickets.


Very cool. Does your company retain IP ownership of anything made during these periods?


I would expect so. Doesn't it seem a bit much to expect the company to give you paid time to work on your own personal side projects? They don't own the knowledge and skills you gain doing it though.


My team when I first joined had something like this; we called it a 'spike' day. Basically a full day of time where we could learn about something new, maybe hack on a non-necessary feature or even personal project for a few hours a week.

But sadly over the past few years we've kind of just done away with it. We don't allot time for it specifically anymore. We think about it from time to time and wish we still had the bandwidth for it.

But it just doesn't happen anymore.

It's a tough thing to build into a team's structure, I think, and in such a way that it's sustainable.


Go in tomorrow and tell everyone it's time to start doing spike days again.


I run a small consultancy, and Development Fridays are almost mandatory for my developers. Everyone is given Friday afternoon to learn new skills or improve existing ones. I even tell my clients that this happens (we charge by the week), and in the last 4 years I've only had one complaint; we quickly cancelled the contract with that client afterward.


I love this, and that a major part of its value is feeling refreshed. I think I'm going to try this a bit, time permitting, and ask for forgiveness later if there's ever an issue. I'm constantly finding gaps in my knowledge, and my smarter team members would certainly benefit if I spent more time developing my skills and getting on the same page.


'Great Thoughts Friday' from Richard Hamming

https://youtu.be/a1zDuOPkMSw


Slightly better video and sound (from the same source material):

https://www.youtube.com/watch?v=e3msMuwqp-o&list=PLctkxgWNSR...


This is pretty much what I do at the company I work for. We also spend part of that time having roundtable discussions on different topics.


This is a brilliant idea! Hopefully it becomes popular.


As an engineering manager, this is what I expect my top performers to do and encourage my junior folks to do the same. It benefits the IC, the delivery team, and the company at large.

You can't heads-down slam out code for 8 hours a day 5 days a week. Be responsible with your time and use it to push forward the vision & mission of the company by taking your professional development into your own hands.


Continuing education in software engineering has always been a challenge for me. While my current employer allows 20% time to learn new things, I find that I'm just unable to use it. Many employers (not all) place constraints on what one can do with that time; typically the biggest constraint is that it must relate to the business in some way. As such, it can be hard to justify why you're spending your 20% time learning how load balancers work in nitty-gritty details since it's unlikely you'll be writing one from scratch or helping the company with it.

Not only that, but if you choose to learn on your own time, finding a lesson that fits into your daily routine is also tricky, especially if you're caring for a family or have other commitments. Couple that with uncertainty about what to learn next, and it can become overwhelming just to get started.

Very recently, I started working on a project[1] to address this exact issue. It helps established software engineers progress their careers, learn new concepts, and refresh their existing knowledge with daily bite-size software engineering lessons designed to fit into their daily routines.

[1] https://www.dailyswe.com


> Typically the biggest constraint is that it must relate to the business in some way.

This is so short-sighted, because it means you can't learn anything unless your boss is 100% sure it will be immediately useful (at which point, someone else is probably already assigned to do it). Most things I learned in my life were not immediately useful when I learned them, but many became useful later. Programming itself is a good example of this; when I was a kid, computers were considered just an expensive toy. By this logic, I should never have learned programming in the first place.

These constraints do not allow you to explore. If there is a new framework or a new programming language which MIGHT improve your productivity, but also MIGHT be a useless fad, you are not allowed to find out which one it is. No one in your team is. Thus you get stuck with the old technologies forever (or someone breaks the rules, or someone studies the new technology in their free time).


"As such, it can be hard to justify why you're spending your 20% time learning how load balancers work in nitty-gritty details since it's unlikely you'll be writing one from scratch or helping the company with it."

Unless you use load balancers in non-trivial ways at scale and really need to understand the ins and outs of how they work to utilize them effectively.


Well, you know that, and I know that, and OP knows that. But it’s likely that OP’s boss not only doesn’t know that, but is mentally incapable of comprehending it. And even if he does (my boss is actually a sharp developer who does understand what’s going on), remember that we all have 8 bosses at any time.


You can't generally predict how that knowledge will be useful. Load balancers don't exist in a vacuum; he might learn about packet structure, failover, and all kinds of other directly applicable stuff.


You don't have time to learn, but you have time to build an app backed by a service?

Or was that a faux story for your sales pitch?


Procrastination is a powerful motivator


110% agreed

I've always felt a level of entitlement around that. You (the business) want me to learn and get better. You get more value over time that way.


I think we're lucky that many companies have this attitude.

I've experienced the polar opposite at a Japanese bigco, where we were given extensive material to learn (generally about the industry, or certifications), and it was very clearly understood that we were to learn the material off company time. As an American, I naturally had an allergic reaction to this culture.


A friend of mine is a senior exec at Fujitsu Europe. I met him doing the Haute Route, a two-week ski tour in the Alps.

He says the amount of persuading he had to do with the global execs that being off-grid for two weeks might have its benefits almost made it futile.

I can't help but wonder if this attitude has inhibited Japanese progress.


To be honest, it's impossible to survive in tech if you are not constantly keeping up to date.


What's IC?


Individual contributor. It's a fancy management word for SWE or anyone else in a non-management role.


Ugh... OK, I'll be that guy: what's an SWE?


Software Engineer.

I have no idea who started using these acronyms but they definitely forgot they're new and not universal.


I guess the W is to differentiate it from Systems Engineer?

We use SE here in Japan for both software engineers and system engineers. How are they different? The latter is older and seems to have originated from people doing architectural, consulting-type work.

Fun fact: consulting/contracting businesses here are also known as SES, System Engineer (as a) Service.


I haven't heard SE outside of a Japanese context, so this may just be a case of different countries randomly picking different acronyms.


A simpler, more universal word might be grunt.


The higher levels of IC are usually more noncom than grunt.


> never asked permission

That's a key thing.

Even in the most high pressure environments, where "project management professionals" are breathing down your neck for "deliverables" it is your responsibility to learn stuff and evolve.

It's nice if there's time and resources for people to better themselves, but these are typically limited to activities which are "related" to one's work function and are subject to "approval" from people whose interest in your career stops hard at what you can do for them in the roles they assume you're best-suited for.

The good news is most places are NOT rank-and-yank hellscapes where every minute you spend on something that's not on somebody's Gantt chart will cause you harm in the long term. In most places you can lift your nose off the grindstone... but you have to be willing to impose your own self-direction and structure.


Same, but I've also been in places where they do look over your shoulder, and I ended up fighting the tide. Everyone around me was in constant panic mode, so I slowed my work output for two weeks (the most I could get away with without ending up in a room talking about my work performance) while I studied and optimized the tasks consuming most of my time. Then I automated them (this involved working what felt like two jobs at once, pushing well over 80 hours a week)...

When I finished, I had so much free time that I could finally start doing the interesting projects, and my work output was consistently 2x my co-workers'. I considered explaining this to management, but they had just admonished a co-worker for not doing the thing they'd asked and instead trying to improve the process. So I kept it to myself.

Obviously this was a bit of a toxic work environment (early in my career) and I was luckily switched out of this group.

My point is that you can fight bad management to make a better working environment for yourself but it takes a lot of effort and you need to be strategic in what you tell them as it can backfire.


> look over your shoulder

I suspect that one of the drivers behind open offices is to make it easy to “catch” people who are “wasting time” learning on the job.


I used to do this, and then we "switched to be more agile" (this just meant using JIRA and tracking sprint efficiency wrong).

Companies do think "It’s your responsibility to be on top of whatever’s going on in your field" and they might even claim that learning and development on the job is important, but when it comes time to log JIRA hours, they tend to show how much they really believe in this.


If you want to track this through JIRA, you need to make time for both learning and teaching. I've found that management is reluctant to allow for "a day of learning", but if the output of that is a document or a small seminar where you share that knowledge with the team, it goes down much more easily.


It also takes all the fun out of it. I can’t properly get my mind in a state to learn if I’m going to be held accountable for how much I pick up.


> I can’t properly get my mind in a state to learn if I’m going to be held accountable for how much I pick up.

That is school and university in a nutshell though.


Maybe make tickets for your learning activities?


Heh, that would not go over well. Now that we have JIRA, tickets are not just watched by my boss, but by my boss's boss, my boss's boss's boss, and a few dedicated project managers of some kind.


Yuck. When my company started pushing for more "agility" and my team moved to a sprint structure, I was lucky enough to have personal leverage as a senior team member. Early on I made it clear that tickets and sprints exclusively exist for the team to plan and manage expectations, and that I wasn't going to hold with upper management using them as leverage over anybody. To support that, I had to interrupt stand-ups a few times when someone would start iterating through tickets for progress instead of letting people report.

I wish misuse of these tools wasn't so widespread, but it was to be expected.


> ...tickets are not just watched by my boss, but by my boss's boss, my boss's boss's boss...

Are all these people really hired for their competency? The amount of wasted time this describes is staggering. Even at the worst of times, ticket interaction by even a skip-level was a sign of mounting org-wide desperation.


Use code words.


We didn't use JIRA (very much an NIH company with our own job tracker), but we always had a job number to log training/learning hours against.

IIRC, management actually questioned people who didn't log much time on it. There were a lot of problems at this place but I liked that.


“using JIRA and tracking sprint efficiency wrong” is the working definition of agile at most companies AFAICT.

This makes me sad. I had a great experience with what I now think of as “authentic agile” at my second job, now I doubt I’ll ever get to do things that way ever again.


Very much this. If the management culture at your workplace is so sick they feel the need to make this behaviour a problem, there are probably other things wrong with the culture there.


Yes, it is a huge red flag in general. If your company discourages learning on the job, run, because the company is only heading downhill.


You're correct that it's our own responsibility to be on top of whatever is going on and to train ourselves. However, plenty of companies think that, except for training scheduled and provided by the company, it's not appropriate to train ourselves on the clock.

Personally I agree with you, and find that discouraging people from training themselves during business hours just results in most employees not fully applying themselves.


Someone who is great at developing good processes at work recently said something that made me think a lot about this as well.

Why have WIP limits? Well, one of the reasons is to create downtime in which you can go watch a video or read a newsletter. So I've been trying to spread this message in my team: when someone says "The WIP limit is blocking me and I can't really help anywhere at the moment", I answer "Go watch a video, or read, or take a walk for 10 minutes".
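
To make the mechanism concrete, here's a minimal sketch (Python; the names and the limit of 3 are invented for illustration) of the pull policy a WIP limit encodes. The interesting branch is the one where you deliberately don't start new work:

    # Illustrative only: the decision a WIP limit encodes.
    WIP_LIMIT = 3

    def next_action(in_progress, backlog):
        """What to do when you free up under a WIP limit."""
        if len(in_progress) >= WIP_LIMIT:
            # The limit is doing its job: don't pull new work.
            return "help finish existing work, or take learning time"
        if backlog:
            return "pull: " + backlog[0]
        return "take learning time"

    print(next_action(["A", "B", "C"], ["D"]))
    # -> help finish existing work, or take learning time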


Since you weren't explicit about it: having a PM or manager who a) understands this concept and b) can communicate it to the team is important, and that's what you did here!

I mentioned in a comment above that with the Theory of Constraints in mind, sometimes the right thing to do for an individual contributor at a point in time is, nothing.

You went ahead and explicitly turned that into time to learn for them. Bravo.


> I subscribe to those weekly emails for the programming languages we use at work

--

Are these an internal thing for your organization? Is it just joining a mailing list for X language? Or is there some aggregation service you could share?


I found this good curated list: https://github.com/zudochkin/awesome-newsletters


Folks over at CooperPress maintain several dev weekly newsletters with a pretty good overview of the week's news: https://cooperpress.com/publications/


Me too. The company benefits in the long run. I've never had any complaints. In general I've found no one ever complains if you spend time learning.


As a remote company, we dedicate a 1-hour video call every week to learning. On top of that we have 2-3 voluntary video calls that we call MJ Talks (similar to TED Talks), covering all kinds of topics: technical and software-related subjects, but also health, fitness, personal finance, etc.

We also dedicate one week every year to something we call MJ University: we rent a hotel somewhere and fly everyone in. If you're interested in all the details, we've written them up here: https://mobilejazz.com/blog/mj-university-staying-ahead-of-t...


I really think this is the way to do it, and it's also how I've always approached this. So much so that the other members of the team I'm on now joke about how I'm the guy who likes jumping down rabbit holes - which they're perfectly fine with!

Something very important to this approach IMHO is paying your learning forward. Mentor other devs, write guides for your team/organization, etc. Make the benefit of your learning visible to everyone else.

That means that yes, your learning has to have value to the business and to the team. If that aligns, I can't imagine many problems arising. Your learning becomes an act of giving and taking, rather than just taking, which is a win-win for everyone involved.


What does "learned on the job" mean, though?

Were you reading Stackoverflow? Taking a digital course and watching lecture videos? Reading a textbook with a pencil and paper out? Attending in-person trainings?

All of these things are scrutinized very differently in the workplace.


Exactly. Lots of places won't bat an eye if you are reading a blog post on something you'll be able to apply within a week. Having a textbook with pen and paper out to learn something that may not bear fruit for several months is a different story.


Same here - I keep all the deadlines and use the spare time to better myself, or even implement new ideas in production code.

Also, the latter half of Friday is officially, by company policy, set aside for such tasks - learning, and finishing internal tasks that always get pushed back.

The more I learn, the better the value/cost ratio for my employer.

I would immediately change jobs if they didn't challenge me, or at least provide an environment where I can learn.


This over anything else. I wouldn't have gotten to be a CTO multiple times if I weren't in this mindset.


I sort of felt embarrassed when my manager told me to stop solving so many tickets and spend more time learning and improving my skills. He was so right.


This is research; it's a valid part of anyone's job.


That you should “always be learning” is absolutely true. It helps with neuroplasticity and keeps you engaged.

That said, as a manager I sometimes find it hard to get direct reports to accept that it is not only okay but required by me that they learn new things. I do what I can to encourage it: offer to buy books for people, give time to do online coursework, etc. They often complain that they don’t feel like they are “working”, even though I explain that as long as it's work/business related, I will expect to be able to call upon them in the future with this new knowledge.

So what can I do as a manager to make it more “okay” to spend time at work learning?


It's because when annual reviews happen, they aren't talking about the books they read; they're talking about a project they contributed to.

In my experience, formalizing "learning" in the workplace doesn't work, because it requires the performance of learning for management types rather than real learning, which involves working through real new problems over an extended period of time.

The real way to get employees to learn is to hand them responsibilities they are not fully prepared for, along with the pay that goes with them, and see how it goes. Right or wrong, managers are rarely comfortable doing that.

Employees need to be comfortable failing in front of you, and few are, because there are few good examples of that turning out well. When it does go well, all too often the raise they were promised doesn't come through.

This isn't to say you're a poor manager, just that it's unusual to have a healthy environment for on-the-job learning.


I agree, I personally have similar problems with formalised learning. Nothing sticks until I use the knowledge practically, and toy projects don't seem to count. Solving a real issue is the perfect time to learn (especially if someone else is around who can mentor or verify the result).


At one big company I know of, learning is part of performance reviews - agreed upon in advance by the manager and engineer. It didn't usually help (at least AFAIK); partly the fault of the engineers, and partly a team culture of always fighting fires.


I don't know the exact situation you're talking about, but my guess is that this is exactly the type of formal "learning" I would argue does not work.


This is spot on. Thank you.


Part of it is just people. I once had a co-worker whose (remote) supervisor did the same thing: bought training courses for them, instructed them to make the courses part of their schedule, assigned another developer to mentor them, etc. But this person was also overloaded by said supervisor with day-to-day operational work and felt a responsibility to accomplish that work at the expense of their own personal development (and had a personal life that made it hard to spend extra time outside office hours).

They sat next to me for a few months, so once, on the afternoon they had blocked off on their calendar for training, I went over to their desk while they were gone, unplugged their phone, and, when they came back, reminded them this was their training time and they needed to spend the time the company was giving them on it. They did that day, but then got some flak from another, non-co-located employee for not getting enough operational tasks done, and I don't think they did much training again after that.

If the entire org doesn't encourage learning, grant the time for it to happen, and protect employees from the operational repercussions of spending time learning, it is hard for individual employees to make it happen.


Reduce pressure. When I don't have 7 customer deadlines, I'm much more inclined to spend the daily hour I'm meant to on training.

Reward learning with career advancement.

Send your people on relevant training courses out of the office as often as is useful/practical.

Oh and if you have timesheets make sure you can "bill" to training!


I think scrum, JIRA, backlogs, and always-tight deadlines condition people to never sit back and systematically learn something. Instead you feel the urge to always “produce” and feel guilty if you don’t. At least this happened to me. Lately I am making a conscious effort to work from home for a day and take a Udemy class or read a book about something I feel I don’t have enough background in. The modern frenetic workplace discourages sitting down and systematically learning something.

So as manager take a look and see if your environment makes people into ticket closing machines or into people who have the freedom to allocate their time where they feel it’s most useful.


In theory, scrum is supposed to provide intentional non-ticket time after the retro and before the next IPM/scrum meeting. That time can and should be used for learning, hacking, contributing, etc.

Reality of course differs.


The first rule of software methodology: no matter what the methodology proposes, professional “managers” will turn it into a micromanaged waterfall process.


I was so stoked when I read about that part of scrum back in 2006. Not once has it materialized since.


Are there still places left that don't do scrum? Is it hopeless to put "I am looking for a workplace that does not practice scrum/sprints" in my cover letter?


Actually almost nobody does Scrum. They use some artifacts from Scrum but leave out the important parts.


It’s actually way, way worse than that. Nobody does Scrum, and everybody seems to be getting into SAFe.

https://www.scaledagile.com/safe-5-preview/


Probably Parkinson’s Law in action.

You are providing the “option” but not literally scheduling the time. Employees need cover; otherwise you are “learning” but nervous that something will “slip”, because “learning” isn’t business critical.

https://en.m.wikipedia.org/wiki/Parkinson%27s_law


>I find it hard to get direct reports to accept that it is not only okay but required by me that they learn new things

Does this apply to skills that might be more applicable to another team?

My current manager encourages learning, but shoos me away from topics that are handled by other managers. Frustratingly, he's blocked my transitions to other teams whose work I've found interesting and educated myself on.

How do you prioritize learning, and balance it against your interests as a manager to keep your resources focused on your objectives?


You can't, and you shouldn't. Your "resources" (what a derogatory way to refer to people, by the way) will be more valuable, more competent, if they know more than just what's needed for your team's direct focus.

Being able to place your own work in broader context is very powerful.


>Your "resources" (what a derogatory way to refer to people, by the way) will be more valuable, more competent, if they know more than just what's needed for your team's direct focus.

Would that I could safely direct your ire to my manager.


You mean to your management resource? :)


The second a manager blocks a transition it’s time to leave the company.


> So what can I do as a manager to make it more “okay” to spend time at work learning?

Be seen doing it yourself.


Several companies that friends of mine work at have instituted a 10% personal-project time policy, where 10% of the week is devoted to personal projects. People pick any topic of interest related to programming, learn something new, and when they are done they show the project to the team and present what they've learned. I don't think there's a time limit per se. Some people I know have done work with the Raspberry Pi, learned a new framework, or implemented something they were doing at work in a different language. Leaving it open-ended lets people pick something that interests them.

The problem with this approach, I think, is that you get more buy-in, but it might arguably be less directly applicable to work.


> ...might arguably be less directly applicable to work.

Why is this a problem?

IC continuing education isn't (primarily) about having them finish reading the RFC even though they've already gleaned what they needed for their immediate problem. Rather, it's about drawing in whole new areas of knowledge. It's about keeping your deck stacked with wildcards, so that when you get blocked by something hard that isn't covered by your standard 'best practices', you have enough diversity of experience to actually have a hope in hell of having something to draw upon for inspiration.


>Why is this a problem?

I don't see it as a problem; my boss sees it as a problem. I've tried reasoning, but without that direct connection to "what am I paying you for", it just falls on deaf ears.


Put it in expectations, quantify it, and reward it at performance reviews. In a self-review, I should be able to write that I took Andrew Ng's Coursera ML course and be rewarded for how that makes me a more valuable team member. On the employee end, I should be able to explain what I learned, how it increases my value to the team, and how I've applied or will apply it.

Too often, companies "expect" their employees to take advantage of the fact that they allow them to learn on the job, but only reward short-term performance.


As an employee, I’ve felt in the past like it’s hard to take initiative and do things that aren’t immediately and clearly justifiable as strictly necessary work. Reading a book might or might not be useful, but doing something boring but mandatory is certainly not slacking. That said, any time I took more initiative and more risks (while making a sincere effort to work on the thing that would move the group/project forward), and then just accepted correction when I made the wrong call, work was usually more enjoyable and I was more productive (by my own estimation, and that of my coworkers and my manager).


I recommend making it part of regular cadence. Every Friday or every other Friday (or part of the day). You can encourage them to share learnings or summaries with the teams. These days can also be used to attempt to apply learnings in ways that improve the team or org. These don't need to be formal courses or books either: it can be exploring and learning systems at work, or understanding system telemetry or customer use cases better.


> So what can I do as a manager to make it more “okay” to spend time at work learning?

Maybe instead of having them read something, make them give a presentation on it to the rest of the team?

Then they have deadlines and produce content. Sounds just like work to me.


If you want your reports to learn on the job:

1. Make it part of their annual goals (e.g. attend a conference, get a certification, compete in a competition).

2. Throw them in the deep end, a little beyond their edge, and be okay if they fumble around a bit.

3. Make them teach. Give them stuff to understand and present to your team or other teams. That forcing function will give them a mechanism to immediately exercise their newfound knowledge/skill.

4. Make room in your project schedules for it. They'll learn on their own if they have the time, or they'll invent something. The ones that don't are your bottom tier.

My first boss sent me to SIGGRAPH my first year to drink from the fire hose. He also asked me, on my first day, what I was worst at of all the areas of programming I was aware of (Windows UI) and assigned me three months of work writing custom controls... I did it in six months, and I've never been able to thank him enough for teaching me, right off the bat, that your people are your greatest investment, and that optimizing for the project is rarely the right choice.


Decrease expectations in other areas? If it takes me 8 hours a day to do all the work you expect, and you say “oh it’s fine if you spend time learning”, am I supposed to stay for 9 hours a day to do the learning?


If you are actually paying them to learn, which is the key issue here, then they shouldn't complain, and I'd have a hard time believing they would.

So the solution is to make it clear you pay them to learn.


Tie it to deliverables, like reporting back to the team about the content or giving seminar style talks.


> Tie it to deliverables

That's a great idea in theory, because tackling a real-world problem tends to motivate a real-world solution.

However, learning invariably encompasses failure, which rarely measures up well against the typical (if arguably useful/useless) performance metrics that developers contend with.


If the deliverable is a team report, then failures are super useful and still count. "I tried implementing X as a test after reading about method Y, but found that for our domain there are serious drawbacks."


I wholeheartedly agree with that outcome.

I was referring to the potential for a metric that could backfire in the context of a performance review. (And that'd be a relatively tame surprise in one of those god-forsaken events...)


I've just had a preliminary conversation about people's next set of objectives.

Training is on there (with a budget of 5% of their time), and if they don't achieve their objectives, their next appraisal will go badly. Pay rises and promotions and so on depend on good appraisals, and I anticipate some quarterly review meetings in which I tell people that they're on track to do badly because they're not meeting their training objectives.

If having their training objectives written in black and white, and being reviewed quarterly and appraised annually on whether or not they're meeting their objectives doesn't make it "okay", well, I guess I'll have to come up with something else, but I sure hope that will make it clear.


For me and my people, learning never works when it is a task independent of actual work, meaning I only learn something new when I see a place I can use it.

Learning must be optional, never required. And it must have an incentive: I learn programming because it makes my work faster. But if I can't use my program, I feel like I wasted my time.

Learning should tell me that I am increasing my value.

I can never learn new stuff just because I am supposed to. Nor will anyone suffer a book if there is no IMMEDIATE benefit.


I empathize and don't have any clear solution. One thing that has helped is actually setting a schedule and following up on it. If Tuesday mornings have two hours booked then you can actually just ask if that was followed any time you follow up with that person. It sort of hacks the categories by turning those moments into clearly work tasks with boxes that need to be ticked.


Agile teams can, and I have seen them, just create user stories for this, with estimates and acceptance criteria (yes, having your people pass a quiz can be an acceptance criterion). They even did an end-of-sprint demo where they had to present (without a deck) on the concepts they'd learned and answer questions.


I'm at a new place; the managers there did it by picking particular topics and instituting a group learning time (blocked off everyone's calendar for the time, let people select one of two topics, are soliciting new topics for future versions). It's working reasonably well.


Check out my comment re: learning spikes:

https://news.ycombinator.com/item?id=21763789

Having an actual story to back the learning process might help make employees feel more comfortable with it.


You give them a stake in the company's long-term success.

Once the goals align, it will become clear to everyone that employees must both have deep understanding of the current company and an eye towards the future.


Make sure they have time to do it. If they feel like they have a lot on their plate or a higher-up breathing down their neck, then new learning will not be a priority.


Hire me. I love to get paid to watch MIT lectures that will benefit my work.


I'm currently negotiating with two potential employers, one of them one of the bigs. It would be great if they would pay me to learn at work, but I don't feel like I'm in a position to ask for that — that's an industry-wide problem.

So I'm trying to hack the system to get fewer hours: 30-32 per week. That gives me enough time to self-study, handle some of my own training and overdeliver, yet still have a life.

I have extremely strong open source bona fides, I'm an inveterate organizer of study groups both at work and elsewhere, I speak regularly at meetups in order to force myself to learn new things well. Giving me the space to train myself is a great deal for potential employers.

But they aren't biting. 40 hours is what they are set up for and it is hard for them to figure out how to be flexible, even if they want to.

At my last job with a small growth-stage startup, I successfully negotiated for 32 hours, and it worked out great for all parties. But it seems harder than I think it should be to close such deals.


Voice of experience: Companies are incapable of thinking this way. Instead, take the 40 and simply siphon off 8-10 for your personal education.

As long as you're not actually working for a different company during this time, it's all good. If company doesn't like it, move on.


My experience has been that KPIs, metrics, and reviews don't matter a whole lot. Do what you want. Get the job done, and done well. Management knows who is worth keeping and the numbers will somehow magically show that.

And if that doesn't work out then move on.


Yep. Prove that you can do your job in less time, and use the slack for learning. Dip out early on days you need to speak at things. Work from home on days you know will be slow and do learning while no one is around.

It’s not difficult to pull that off at a big company.


> Prove that you can do your job in less time

If you prove it, they may just expect more from you in the same amount of time.


I’m not advocating for announcing completion and broadcasting your efficiency. I’m just saying, ensure you won’t fall behind working on your real job 32 hours a week. That shouldn’t be too difficult in most cases, given all the research on productivity.

If you can’t do that, they were right to refuse the 32hr/wk proposal. If you can, just do it, sneak in some studying, and act like it took you 40.


...some bigger learning projects require more time or different rhythms from what can be made available through siphoning off time, which seems to be what most people in this thread are saying they're doing.

For example, if you're doing daily standups, then one hour of an 8-hour workday is doable. But if you want to learn something that requires from time to time that you concentrate deeply for a longer period of time, say 4 hours, then that's half a workday, which may well show on daily standups.

If you have a manager who has a tendency to show up at your desk unannounced and they glimpse you watching a video, they probably won't say anything or ask what you were watching, but may walk away having mentally applied a discount factor to what they think your rate of productivity might be.

Also, while siphoning-off may work well in a good culture of consistent, non-rushed problem solving, it may not work so well in a culture of constant management-by-crisis. Like tomorrow may be the deadline for delivering something in the context of a crisis. The crisis may be entirely made up, but you would still end up looking really bad if you didn't deliver. Then you're probably not going to spend an hour learning that day if you know there's a risk you will have to stay at the office for an hour longer as a result.


You're right, and I would never work at a place with daily standups (for that reason among others).

Instead: (1) Here are your priorities. (2) Tell me if you run into any blockers. (3) Let the team periodically know what they need to about what you're working on.


“40 hours is what they are set up for and it is hard for them to figure out how to be flexible, even if they want to.”

I had the same experience. I asked for a 32-hour week with a corresponding pay cut. The HR people could barely comprehend the idea and certainly didn't have a process for dealing with such a request.


I also tried to get a 32-hour week, interviewed with multiple companies, and in the end got a deal for a mere 50% of my usual (40-hour) salary. And the company still felt they were doing me a big favor.

Within my probationary period I decided that 50% of the money for 80% of the time is too much of a discount; that I could simply work full-time for a year or two and then take a break for a year or two instead. But I never actually took that break.

There are many companies complaining that they can't find enough competent programmers. But almost none of them are willing to provide part-time work as a benefit. I believe that a software company that publicly declared a 4-day workweek would soon have hundreds of people begging for an interview.


I currently work 28 hours per week. This is the second time this has happened in my career. Both times the path was "be an FTE for a while at 40hr/wk, produce, wait for something bad to be suggested, quit, take contractor offer for same pay at reduced hours."

I don't imagine this works for companies with employees numbering over ~250 or so, so I've been very lucky.


I don’t think they would even lose much productivity. Given the number of useless meetings I have to attend, I could probably produce the same results with a 3-day week.


It was fairly common (when I lived there a few years ago) in the Netherlands to negotiate down to 80% time, and I even managed 60% once.

Of course it’s a pay cut, but it was fantastic :)


Someone who asks for reduced hours is probably shooting themselves in the foot wrt comp. It suggests you are negotiating against yourself. If you aren’t asking for more money you are leaving money on the table. Just work effectively for 30 hours and nobody is going to know you aren’t working 40. Measurement of productivity in software is very vague.


I would just take the job and spend 90 minutes a day doing the things you mentioned. If anyone objects, well that’s conflict about how the flow of work will proceed, and dealing with those conflicts is everyone’s job.

You’re presuming that you’re not allowed to do the things you think are crucial as part of your daily work, but that’s already conceding too much to your superiors.

You do have to wait for the right time to do things, so that it's justifiable as part of some near-term team goal. And be willing to wait when it's not the time.


> It would be great if they would pay me to learn at work, but I don't feel like I'm in a position to ask for that — that's an industry-wide problem.

It's a society-wide thing. I don't work in tech and I don't do anything CS-related. For certification/licensing that will assist us in our jobs (and that is used to prioritize applicants for positions), we are expected to do it on our own time, outside of work, on our own dime. If you actually get the license/certification, they will then refund a portion of a course you took, IF it was graded; and the only ones that carry grades require you to attend in person for 40 hours, so you'd have to take a week off and go put yourself up in another state... and then the exam for the one with a license has a 3-20% annual pass rate, with the average around 15%.


I'm currently a frontend engineer. I told my team lead a while back that I'd like to transition to the server side, especially since our server team is woefully understaffed and our frontend team is overstaffed.

He told me "Sure but you'll need to learn all that stuff on your own time". So I never did it because I'm interested in doing other things at home.


If there's one thing I've learned between being a new engineer and a senior eng is that you can just take a small slice of time in your work schedule to learn. You don't have to clear this with anyone. You're salaried, take an hour or two a week to learn something new. You're getting paid for your technical judgement, use it to invest in something relevant and deepen your ability to judge. It pays off for both you and your employers even if they're not willing to formally fund training in large dedicated chunks.

Edit: clarified schedule to work schedule.


I've honestly never found myself in a position where I didn't have some spare time whilst at work. Even at my current job, I've transitioned from frontend work to backend work (I already had backend experience in Node), and I've always been able to keep up with our sprint. Other engineers complain they don't have time to improve our code base, or that they can't learn anything new because they're always busy, but I think that's just an excuse. It's just a matter of organising yourself and looking for opportunities to learn.

I actively picked up stories I wasn't familiar with, just to learn. Other engineers pick up stories because they're already familiar with the subject; that's not how you learn new things.


I've shot down two companies in the past three days because they made it known during recruitment that they expected more than 40 hours a week, sometimes well in excess of it.

It's always couched in some doublespeak about "ownership". Trust me, if I were owner or part-owner in the enterprise, I wouldn't mind. But I'm not.

People complain about entitlement attitudes among the young. They need to look at employers if they really want to see entitlement.

It sounds like your boss feels entitled to your time without taking an ownership stake in your professional development.


The other thing about those places is that they only attract the desperate devs. So you know when they pull out that bullshit in the interview that if you were to take a job there you'd be on a team chock full of juniors with a few "seniors" that just fill a chair for 50 hours a week and haven't learnt anything new in the last decade.


I got that impression. At the first shop, the people I met didn't seem like they would be employable anywhere else.

The hiring manager also spent some time whining about the state of the developer market and said they couldn't afford the higher end of the range they had posted on their website.


Eugh, it annoys me that people talk about entitlement in this case, and then management is surprised when developers don't stay longer than 3 years.


Your lead made a mistake there. Someone with understanding of both FE and BE is worth as much as one of each, especially when they have intimate knowledge of your actual code.


>Someone with understanding of both FE and BE is worth as much as one of each...

Someone with an understanding of both FE and BE is valuable in certain situations. Companies love the idea of a "full stack engineer" because they believe they are getting two for the price of one. But in practice, and accounting for exceptions, that's not necessarily the case. A full-time FE with equal experience will be better at the front than a FE/BE, and a full-time BE with equal experience will be better at the back than a FE/BE. This is the reality of having deep experience in a specific domain. As an attempt at an example, if you went to college and got an English minor and a CompSci minor, you'd be capable in both areas but you would likely not be as knowledgeable as an English major nor a CompSci major within the respective domain.

This divide becomes much more visible when you see a FE/BE architect a solution versus a full-time FE or BE. When fixing bugs and implementing straightforward features, a FE/BE has a handy skill set. But when you move past that point, deep experience within an area starts to show its value.


> This is the reality of having deep experience in a specific domain.

This seems like it should be true abstractly, but concretely I've never met anyone with deep experience in a specific domain who didn't in the process gain enough experience in other domains to have basic competence in multiple domains. Not necessarily as much as specialists in those domains, but it's practically required to achieve the depths of experience in the domain you think they specialized in.

(Which is not usually the domain they thought they were specializing in.)

This is particularly relevant where there isn't a clean separation of concerns and the involved skills are highly transferable.


Basic competence is just as you stated, basic. If you are BE with basic competence in FE then I would argue you're not a BE/FE. You're a BE who has basic knowledge of how FE work is done.


> As an attempt at an example, if you went to college and got an English minor and a CompSci minor you'd be capable in both areas but you would likely not be as knowledgeable as an English major nor a CompSci major within the respective domain.

I don't think this analogy works, because CompSci and English are pretty unrelated to each other as fields, whereas frontend and backend by definition work together to create a product. That makes a person experienced in both worth more when working on either, because they're more likely to have in mind the entire system when they do design decisions pertaining to their side of the network.


I did say it was an attempt. :)


I need to disagree with you here. I consider myself full stack, but if you were to ask, FE would be my deep experience. And for me this is perfect: I have no problems with any FE work, and the same goes for BE work. Sure, we have BE engineers with very deep experience, but I've never found myself in a position where I couldn't keep up with them. I acknowledge I lack some knowledge in BE, especially when it comes to deeper CompSci domains, but it has never held me back.

But in general, I think this is just a different mindset. I love being able to be independent and complete a story from FE to BE. I used to be in a position where I could only do FE work, and it really scared me. I had no clue where critical data was coming from; if something went wrong, I couldn't debug our APIs; I was in the dark constantly. Now if something goes wrong, I can switch over to our backend, debug, write extra test cases to cover the bug, and push code. I also understand what BE engineers are talking about, and when designing a new feature, it's much easier to come up with new APIs.


Knowledge of how the other side of the fence works doesn't suddenly make you a full-stack engineer. I'm sure if we actually came up with a ratio of the work (80/20 vs 50/50) it would provide a clearer understanding of when a person is FE, BE, or FE/BE. Alternatively, we could look at the type of work the individual is doing. For example, in your role are you going to be an architect designing solutions for the back-end?


It’s not just knowledge, I can actually do all the work, and yes, I could design solutions. It’s just that my real deep knowledge would be FE. But I consider myself full stack because I can do both sides without a problem.

In my team I’m just the only FE, so naturally most of that work falls on me. There are months where I only did BE stories.


It's often the case that a company decides to filter down the infinite possibilities of the programming world to a strict subset. What's left can often be a little boring, and that's on purpose.

So in a world where your choices are 1) learn other tech within our toolbox or 2) learn stuff in your domain but outside of the scope of your current work... only one of those is okay to do much of on work time.

All those ES2018 features and CSS variables may be cool things a front-end engineer should be learning, but if we have to support older browsers then I don't get to use any of it.
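For anyone who hasn't looked at them, here's a rough sketch of the kind of thing I mean (browser-side code, and exactly the sort of thing an old-browser support matrix rules out):

  // Object rest/spread (ES2018): merge defaults with user settings
  const defaults = { theme: 'light', lang: 'en' };
  const settings = { ...defaults, theme: 'dark' };

  // Named capture groups (ES2018)
  const m = '2019-12-11'.match(/(?<year>\d{4})-(?<month>\d{2})/);
  console.log(m.groups.year); // '2019'

  // Async iteration (ES2018)
  async function* pages() { yield 1; yield 2; }
  (async () => {
    for await (const p of pages()) console.log(p);
  })();

  // CSS variables, set from JS (browser API)
  document.documentElement.style.setProperty('--brand-color', '#639');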


Until suddenly the requirement to support older browsers is dropped, or you get shifted to another project, and then these ES2018 features would come in handy.

Learning only the stuff that covers your immediate working needs isn't a winning strategy, neither for you nor for the company.


If you are 'learning' stuff open-loop (with no feedback from users or peers), then you are deluding yourself about how much you've actually learned. You're one important but terribly insufficient step above book learning.

No battle plan survives contact with the enemy. There are tons of strategies I know to solve problems that I never use because it confuses other people (and some of them confuse me too when I look at them six months later). That's experience. That's wisdom. You won't find that in a private github repository, and only those who already know what they're looking for will find it in a book.


> If you are 'learning' stuff open-loop (with no feedback from users or peers), then you are deluding yourself about how much you've actually learned. You're one important but terribly insufficient step above book learning.

Book learning ain't bad; in my experience, most of the developers I know would do well if they picked up an actual book every now and then. It's not as good as hands-on experience, but still miles ahead of "learning" by StackOverflow-driven development. Experience is an important piece of the puzzle of wisdom, but it's not the whole thing. You also need understanding, and reading (and thinking about what you read) is a good way to acquire insight.

WRT closing the loop, your users and peers aren't the only way to close a loop (and arguably, they can only provide certain kinds of feedback that are very context-specific). Trying things out for yourself is also a good way to close the loop through feedback from reality itself (e.g. whether something works, or how difficult you find it, are such pieces of feedback). On top of that, peer/customer feedback makes no sense for a lot of useful knowledge - for instance the new ES2018 and CSS features you've mentioned upthread.


I disagree, there is a point when having too much 'deep experience' in either BE or FE (usually BE) becomes counter-productive to performing useful work.


Then why do Google (and others) hire specifically for FE and BE roles?


I have no opinion on the issue being discussed, but the fact that Google does things a particular way is not evidence that that way is correct. It is entirely possible that Google would be better off hiring full-stack developers.


I double majored in English and Economics and consider myself an amazing full stack developer.


"But in practice, and accounting for exceptions..."


Unfortunate approach.

My company not only encourages transitions, we actually "pay" for them. That means we will assign a senior developer who takes responsibility for the growth of the new mentee, and we purchase any educational materials the person requires.

We have had quite a decent number of transitions within the last few years, and all were successful. They do require commitment and investment from both sides.

I do believe we are stronger for it, as people appreciate the opportunity for growth and the support we provide. We, on the other hand, retain a skilled colleague who is not looking elsewhere.


Man, I wish I actually had a job where I got mentored.


Bad. When I went from JS front end to iOS, I was put in a learning period on my own, plus an internship in the iOS team, and that worked great. It's how companies like Spotify also did it. They paid companies like the Big Nerd Ranch to travel to Stockholm and give workshops to future iOS devs.

They benefited and I also did, as it helped me evolve into a full-stack (web, mobile & server) developer. This was essential for me to transition into building my own company [1] at a later point, as I could build the product on my own both for web and mobile.

[1] https://standups.io


We have a formal mentorship program and employees get 4 hours a month for professional development. We still feel we fail them.


Using the word "still" there is uncalibrated. 4 hours per month is barely anything.

Imagine yourself trying to learn a real skill in that allocation of time.

If anything, you may even be reducing your tech employees' "professional development" with that metric, because I bet the average tech employee already spends greater than 4 hours per month on active learning with or without management actively allowing them.


>4 hours a month for professional development

So half a day a month? That's pretty much nothing. Maybe time to work through a couple chapters of a book on some new topic.


Yeah, it's supposed to be on top of their existing 1-1s, but we still feel we fail them.


Make that 4 hours a week and maybe you'll feel better.


Unfortunately, HR is in charge of 'professional development'. No one wants to fight HR; they'd rather keep their cushy job, get paid, and let this thing crater.


Unfortunately, that is when, as a team, you have to stop classifying things in certain ways. Still meet your commitments, but bake in learning time. It just becomes part of what you do during the sprint. Likewise, you should not have a "testing time", "quality", or "security" budget - you just bake those into how the team operates.


^ This is the way. Learn how to sell it or hide it from those HR parasites.


Calling an entire field of personnel "parasites" is an ad hominem attack.

Even if you were to justify your statement, it would be an ad hominem attack.


No shit. Cheers for pointing that out for us.


We do 4 hours a week, and I still feel it's too little.


We've just given up on it; I just give people tickets with stuff they want to learn about. "Oh, you want to learn Selenium? Here's a ticket that has Selenium."
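For example, a ticket might start from something as small as this sketch (using the Node selenium-webdriver bindings; the URL and selector are placeholders, not from a real ticket):

  // npm install selenium-webdriver (plus a chromedriver on your PATH)
  const { Builder, By, until } = require('selenium-webdriver');

  (async () => {
    const driver = await new Builder().forBrowser('chrome').build();
    try {
      await driver.get('https://example.com'); // placeholder URL
      // Wait up to 5s for the heading to appear, then read it
      const h1 = await driver.wait(until.elementLocated(By.css('h1')), 5000);
      console.log(await h1.getText());
    } finally {
      await driver.quit();
    }
  })();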


How did it work out for you? I'm also trying to do the same.


It worked out great, but you have to want to do this. You have to set aside time for it. It's no wonder people from Lambda School get well-paying jobs: you have to be motivated to reach the goal.

Educational materials/courses are out there: it's all about you wanting to smash it and work hard for the goal.


It looks like you have a pretty inexperienced lead.

I'm extremely happy when my team members want to expand their horizons, it keeps them engaged in their work and makes the team as a whole stronger.

Your lead should have paired you with a backend developer and had simple backend tickets assigned to you. This works really well in turning Windows developers into Web developers (together with some coaching).


I was going to say 'jealous', which is orthogonal to experience. Either is bad, but only one tends to sort itself out over time.


The middle ground here is that I'd try to grasp the basics on my own time, then ask for the switch.

I can understand your boss though: would you swap a competent employee for an incompetent one, paying the same amount? Probably not. Yes, you retain your knowledge about the front-end, but you wouldn't be using that.

My advice would be to work out the basics on your own time and then ask somebody on the backend team to get you involved. Baby steps.

If you actually become useful for the backend team, they will advocate themselves for your switch, and bring you up to speed once there.


> would you swap a competent employee for an incompetent one, paying the same amount?

But this is the glass-half-empty boss. The glass-half-full boss is happy to have a competent employee who can be called upon when needed for important front-end stuff and is picking up backend work at a faster rate than the intern the team would have otherwise, because they're already an experienced engineer (with knowledge of the front-end no less!) and there is skill beyond just writing code that transfers between domains.

And that's to say nothing of swapping a competent employee who's already onboarded for an empty seat and a talent search, when the employee quits to expand their skills elsewhere.


Same could be said for the employee. If the employee invests in his own competency he can negotiate a higher salary or quit and work elsewhere.


They’re not incompetent. They have domain knowledge in both the general industry of the company and the fine details of the company itself. They've probably been near the backend developers while those devs talked through issues, and have even offered insight in the lunch area about specific algorithmic problems some backend developers were having.

Languages tend to be super easy.

Frameworks are a little bit of work.

Domain knowledge is hard.


If that competent employee wants to switch career tracks and you won't facilitate that, they'll just go get hired somewhere else that will.


>He told me "Sure but you'll need to learn all that stuff on your own time". So I never did it because I'm interested in doing other things at home.

When somebody says that to you, your responsibility is to either a) change jobs or b) learn during work time anyway.


Currently in the process of (a) ;)


So what about the server team lead? What did they say? Seems to me it doesn't matter what your existing frontend guy has to say about it.


They just hired more people but were severely short staffed during a crunch, so the server team just worked 80+ hour weeks until those new guys came on


I'm utterly dumbfounded at someone who manages people stating this. I budgeted for training on a per-employee basis when I created my yearly budgets.


If you are interested in doing other things at home, then why transition at all? I mean, how much time will you need to become good at "server side" before the company benefits from it?

Imho it's about 50/50: the company should not expect employees to gain new skills only on their own time, and the employee should not expect to learn only from 9 to 5.

Technically, searching the internet for examples is already learning.


Your team "leader" sounds more like a team "director".


Oof. I would honestly look for another job and tell them you want to be able to learn new stuff on the job. If you have a job, you can go as far as saying, "If I'm not learning anything new, I am wasting my time".

Companies should encourage people to learn stuff on the job, because no tutorials or classes can really teach you the day to day struggles.


What if someone learned something on their own time, do they get to be reimbursed from the employer?

Or, can I ask my employer to reimburse my under-graduate and graduate tuition?


They aren't going to reimburse you after the fact, but if you get approval in advance many companies will reimburse for graduate programs or other courses.


There are no hard-and-fast rules. No, a company will never retroactively pay for your learning, but yes, a number of companies will reimburse you if you attend graduate programs while employed with them.

It's a balancing act. If a company wants employees to keep learning, they need to invest in that. If I, as a salaried employee, am expected to be available outside of the explicit working hours for emergencies, then it's also fair that I be allowed to do things that are directed at my own career during work, bonus if it proves useful to the company.


In my company, if you transfer to another team, you're a traitor. Your current mgr can block your transfer, so it's suicide to try to switch teams.


I worked on a small project where it was just two people: one back-end developer, and one front-end developer. The project was not important enough to hire more people, but important enough that the manager complained about the "bus factor" and asked us to plan our vacations responsibly (i.e. not more than one week at a time).

Because both of us were interested in the other part of the work, we offered a solution that we could spend some time teaching each other. At the end, both of us would know both parts of the work, and be able to substitute for each other during the vacation.

"No, that would be a waste of time."


From the POV of your team lead, this is perhaps not an advantageous move on his side, so I do understand his concern; after all, he is not the company, just an employee like you. And from your description it seems that you are a competent employee, which makes it even less appealing for him to let you go.

Nevertheless, he should have given a more encouraging answer as a team lead. And if you still want to make the change (it looks like you are not very interested in it anyway), maybe just ask to spend a little more time on server stuff and see what the reply is.

It's all negotiation, and sometimes it's difficult to ask the right question.


Maybe the lead made a mistake. Or, I hate to say it but...maybe OP isn't viewed as HiPo and worth the perceived investment. There's always another side to a story.


Company won't help you advance your career = they don't think you can, or for some reason don't want you to = you should start interviewing elsewhere.


> HiPo

I thought this was a new battery technology and had to google it.

https://hr.toolbox.com/articles/hr-hipos-identifying-hipos-h...


Where I work (mechanical engineering business, not just software) there's a training course that people are encouraged to take up in their own time. However, the company does support staff taking the course with teaching time from senior engineers - and they make a big deal of having completed it with a certificate, bonus and pay rise. That's in addition to the apprenticeship scheme which represents a more sensible approach to industry funded education.


You should have said you already did and would like to start today.


At my employer:

--

"Get your brokers license"

ok provide me with a course

"well, we've got a limited number of people that can do it at a time"

ok so make the material available to me for learning at home and I'll do it on my own time

"We appreciate your feedback. You could purchase a course on your own, with your own money, and study on your own at home"

-- This is an actual event. The person parroting the 'we value degrees' line was the head of HR for our opco at the time, on the phone with me, cutting me off every time I started to say anything, to the point where I said a vulgarity and "would you give me some respect and allow me to finish a sentence":

applies to position above me that finally opened up and interviews

"We just wanted to let you know we went with another candidate, we suggest you get a bachelors degree and then an MBA if you'd like to move up"

at entry level position for 13 and a half years I don't want a degree, what sort of a degree

"Oh business is good"

How would a business degree benefit me in any role in this office, at all?

"We prioritize education"

I don't want to take on tens of thousands of dollars of debt to make an extra 10% so that would mean I could pay the loans off in a decade from the raise

"With tuition reimbursement you could get a degree in as little as 6 years"

In 6 years I'll be 40 and you'll give the job to a 22 year old that's been here 6 months, why don't you create some sort of system for advancement internally that is equal to degree work

"We prioritize education, if you want to move up you should really work on a 4 year degree and then an MBA"

Why

"We prioritize education, if you want to move up you should really..."

That's classism

"We prioritize..."

I don't want your defacto dues card, I guess I'll just keep working for people above me that have been at the company a fraction of the time that I have because they have a 4-year degree in Artisan Baking Social Media Marketing with their minor in Mayan Whittling from 2000BC to 1990BC.

--

Sigh. There isn't even a degree in what I do, and the university my employer has partnered with only has one degree you can do, some random super-specific business degree that has literally nothing to do with what I do.

Even if I wanted to get a degree, by the time I do 8 hours of this work my brain is shot. Processing large amounts of data, with all sorts of legal and fiscal ramifications, while being under pressure to not only meet but exceed production... yeah do that all day and the last thing you want to do is go home and study/research/write papers that have to be in some asinine format scheme or you fail outright.

Then even if they did come up with some way to teach us on the job, it wouldn't even work because due to the nature of the job if someone leaves it can take 6-12 months to replace their processing capacity and business volume grows so we're always chugging along with no downtime and we hot bunk desks so when your shift is up, someone else will be along in a couple of hours to sit down and keep going.


Follow the money.

Lots of companies blow smoke about a lot of shit, especially after a merger. What they actually value is what they're willing to invest in. If an initiative has no funding, it's not an initiative.


It sounds like their system is meeting its design goals.


Or just switch? The knowledge will come once you fix your first bug.


Better to ask for forgiveness than permission.


You ever think about how much we subsidize businesses by paying for college ourselves? The older I get the more I think the university to corporation pipeline is a fucking racket. It also sucks for employees because here comes these kids who were allowed to learn about all the new technologies you wish you knew with zero distractions and other obligations, now comin' in hot on your heels to take yer jobs en masse, and because of the larger labor pool they all undercut one another's bargaining ability, lowering wages. If I owned a company and I was lookin to hire I'd be callin that a twofer: free employee training and hiring discounts!


University is a total racket. We pay these institutions to be treated like their low level employees to provide job training for companies. Corporations have successfully offloaded their training responsibilities onto a process that used to be much broader than vocational training.


And if you try to study some of those broader topics, you're a sucker - don't study philosophy when an extra accounting or STEM course would be a "better" use of your time. So you super-specialize, and then after 4 years of college and 5 years in industry you're burnt out with few other skills, so the only option is to get back onto the education treadmill to bet on another highly specific vocation where you once again start your career as a junior.


Most STEM courses aren't vocational training and aren't super specialized. I think you might have a biased negative view for what STEM courses are?

Edit: Let me state it a different way that might shed more light on my point. A CS major can pass all of his/her classes with a perfect GPA and still be incapable of writing software ready for a production system (even at small scale).

The STEM degrees emphasize fundamentals that are rarely (if ever) used in day-to-day "real jobs".


For some value of 'super specialized' you are correct. For a value that includes the bigger perspective on our culture and what it means to live a good life, an exclusive focus on STEM is indeed 'super specialized'.


>For a value that includes the bigger perspective on our culture

Sure

> and what it means to live a good life

That's just self-aggrandizing bullshit. There is no class that will tell you what it means to live a good life. Anyone who thinks so is dearly lacking perspective.

>an exclusive focus on STEM is indeed 'super specialized'.

An exclusive focus on STEM will include the philosophy of science and what it means to seek truths about the physical world. IMO that has immensely more value in a philosophical sense than you seem to imply.


I believe you are proving my point. For example, the goal of much ancient philosophy was exactly this question of what it means to live a good life, and the theory of it was very well developed. Most of the culture you take for granted as 'common sense' is directly based on this philosophical development.

STEM at best tells you how to do something, but can never tell you what to do, or why to do it. For that you need philosophy, much more than philosophy of science.


Hell, most of S&M is about as far from "vocational" training as you can get.


Sorry, yes S&M courses are generally poor to include in that list. A calculus or chemistry course or two aren't going to affect your career potential.


This is a very pessimistic world view. I found that in the engineering faculty the subjects we were taught were very broad. However, I continued to learn on my own in my free time after I got my degree. I've been doing this consistently for the last 8 years or so. I now have completely new skills and much more depth of knowledge on CS topics than I had coming out of university.


The point is that while for you it is your choice and pleasure to spend your free time doing something with a direct career benefit, for others there are often other valid and important uses of their free time, so there is a cost to that.

The question is whether the cost an individual incurs by choosing not to spend their free time on career-related skills is ethical or good for society.


I believe this is the crux of the problem - it favours people with minimal external life factors or responsibility, and they’re quite often the ones to rise to power, therefore creating a “well it was good enough for me” sentiment lacking empathy.

While by contrast, there are some people who want to kick back and simply collect a pay check, there is a whole segment of people in the middle ground who are hungry to learn, but are stretched so thin that they can’t outside of work — a whole segment that isn’t being catered for, and therefore an opportunity exists to tap into this.


It's not realistic to expect to go to school for 4 or 5 years and then work for the next 30 without learning anything new. Or rather, not if you care about advancing and making more money. Maybe I'm lucky that I actually enjoy it, so it doesn't feel so much like work to me.


I'm a bit confused what you're arguing against in my point - you continue to super specialize in your off time. If you wanted to switch vocations from something in the CS domain, how much of what you now know and have self-taught would apply?

At my university engineers were offered two elective courses in the faculty of arts or sciences. That's not a particularly broad education.


A B Eng covers so much "basic" knowledge that you never really have time to specialize at anything. Doing physics and math courses is hardly becoming specialized in SE. It takes a long time and years of work afterwards to become specialized at something.

The good news is, you can do it without going back to school. IMO an SE will get little benefit out of going back for another degree. You'll get much more ROI spending your time contributing to OSS projects and making a name for yourself.


I think this is really dependent on your choice of field to enter. In the tech space, so many engineers didn't major in CS or necessarily even take CS courses in college. I have interviewed and hired plenty of people with diverse collegiate focuses.

Sure, CS or STEM courses are probably really solid to pair with a non-CS major because it teaches you stuff that helps extend your abilities in your own field. So I can see why my friend that was going to school to be a nurse might have wanted to take a CS course or two instead of minoring in sociology.


CS, being still a bit of a wild west, still has flexibility, but you definitely couldn't take some nursing courses (which, let's be honest, aren't even offered) and switch to that after burning out in comp sci.


fair point.


At my school I was required to take many courses on broader topics. Personally I didn't like it, I took as many CS courses as I was allowed.


Yes, due to the vocational focus, the broader topics are often watered down introductions.


The courses I took were the same that anyone getting a major in that field would take, and I was required to take them beyond just an introductory level.


They were total jokes at my (admittedly) second-rate-at-best state school. Most of the gen-ed courses failed to go beyond material we'd covered back in 10th grade or so. The English courses were probably the nearest to being remotely "serious" since they at least expected writing and critical reading on a slightly-above-high-school level, usually pretty early in the course.


Yes, by ultra-focusing on tech we can provide a decent living, but we lack the bigger picture and become the peons of those who know better and can articulate their view clearly and logically.

E.g. look at any PG essay that tried to talk about broader philosophical or political issues and you'll see this limitation. His frame of reference is stuck in a recent enlightenment framing of the world. Granted, PG is indeed a great communicator in technical fields.


We were supposed to take "History of Technology" which I guess is supposed to be the corollary to "Business Math" classes or whatever. I really enjoy the humanities so I took all the real electives I could.


I found that studying philosophy has had an extremely useful application to software development and architecture.


I disagree. I went to a state university and it really made me a better person. I worked in groups with a diverse set of people, and while it was tough I enjoyed the experience. Plus, it wasn't really that expensive considering my income now versus before.

However, I guess non-technical/engineering degrees have different results.


I had a good experience at undergrad, but that was due to its divergence from the norm. It actually gave me the broader perspective by having me read thousands of pages of the primary literature for the Western canon, along with in depth critical group discussions of the texts, and learning to write coherent papers. Nothing in my CS degree impacted my life, except making me marketable. Much of the CS I could have picked up on my own, and very little have I actually used in day to day jobs, except the programming experience. On the other hand, the literature program has indeed changed my life.


This seems to be a very unpopular thought within CS culture and I find that really unfortunate. It feels like people are rushing to reduce their education, their lives, to optimized market interactions and that's a terrible lens for a human life. There may be an argument that it's a method of successfully navigating our society, thus enriching one's personal or familial existence, but it seems to me that would just lead to a poor societal structure with few common bonds among the people within it.


It's a side effect of no safety net, knowing that the slightest mishap could put you into crippling debt. To stand still but for a moment is to be trampled by the masses.

By the time you are in a financially stable situation, old habits are ingrained.


I like how a liberal arts education is "divergent from the norm" now. The primary function of university is to make people read for four years. Business degrees, CS degrees, essentially job training programs are a bastardization of the institution.


I think the classic ideal of a liberal arts degree is awesome... as a second or mid-life degree. The option to read and think in-depth and breadth seems to have more potential once you've lived a little more than the average 18-yr-old, just because you tend to have more experiences and viewpoints than a high school grad heading to uni.


On the other hand, by that point, you'll have a bunch of habits and ways of thinking hardwired that you did not choose for yourself. It also becomes something of a Sapir-Whorf dilemma, where it becomes very difficult to even realize one's thinking has been shaped in this way.

My experience of interacting with older, more stable 'intellectuals' who do not have a broad background of reading is an acquired indolence towards foreign ideas and older ideas, subsisting on a shallow 'tolerance' as a sign of their broad mindedness.


My particular liberal arts program is "divergent from the norm," including modern liberal arts programs. Just about all programs, liberal arts or otherwise, are completely framed within an enlightenment view of reality, largely due to dogmatic materialism. Classes that do diverge from materialism have lost a coherent way to talk about an alternate worldview, leaving their terminology sounding very wishy washy and illogical, like a woo woo Deepak Chopra.


Off topic but that still sounds like it's framed within the enlightenment period. If you're reading books and valuing literacy, (ie individual interpretations of texts, as opposed to being told what a book means) then you're still framed within the "enlightenment view of reality."


By 'enlightenment' I'm referring to a particular worldview perpetuated to denigrate the Western tradition and broader philosophical outlook in favor of a focus on empirical sciences and radically egalitarian social mores. The basic idea of 'enlightenment' is there is no objective and learnable purpose to the natural world and human society, and instead once we learn how to manipulate the natural world we can subject it to whatever ends we desire. In general it results in an implicit rejection of the ontology and teleology discovered by philosophers like Plato and Aristotle. This rejection may be valid, but students are not even given a clear view on the matter so that they know what they are rejecting. Instead, they tend to be educated in the criticisms offered by enlightenment writers, and filter the rest of history through that very limited lens.


It's strangely refreshing to see this particular criticism of the Enlightenment; I'm much more accustomed to hearing criticisms from the postmodernist direction. I disagree with your statement in a previous post that their arguments are incoherent, particularly the early exponents like Foucault or some of the Frankfurt school. I'd also point out that much of the Enlightenment tradition is not ontologically materialistic; in particular, German Idealism embodied in Kant, Schopenhauer etc. stands against materialism.

Based on what you're saying here, are you arguing for a kind of Scholasticism?

Finally, your criticism of a "radically egalitarian" view is somewhat perplexing to me. Would you mind expanding on that point?


In my opinion the idealistic variant of the enlightenment is conceptually not significantly different from materialism. The big thing is rejection of teleology, which also results in the radical egalitarianism since there is no longer a purposeful ordering to reality and no longer a natural law.

And yes, a teleological philosophy like scholasticism makes the most sense if we are trying to figure out the best way to live. Otherwise we just end up with the specious word-game philosophy that everyone hates.


That's sort of what the enlightenment was about... The enlightenment period was a decentralization of information caused by the reinvention and widespread use of the printing press in Europe. During the dark ages, Europe's literacy rate was comparable to pre-Mesopotamia. The fall of the Roman Empire led to a fracturing of European civilization and the near-total loss of literacy; Latin fractured into a dozen languages because priests wrote and read at a first-grade level, misspelling words, reading with one finger slowly scrolling the text, mouthing each word phonetically... Ancient Greek texts were completely lost for a time...

Because nobody could read and copies of the Bible were sparse, the Catholic Church was the single source of the word of God. The printing press changed things. The Bible became widespread and people read it for themselves. With that came an important shift: that one's own interpretation of a text was a valid interpretation. Tons of important literary works became widespread. The middle class valued literacy, saw it as a ticket to wealth, and began teaching their kids to read and write competitively at younger and younger ages. They invented the education system we have today: the entire idea of a sequential learning system based around books, and of becoming an adult when you could read at a certain level (as opposed to the Catholic belief that you were an adult when you were old enough to fight, at age ten). That was also the enlightenment and romantic period. Protestantism came about because people valued individual interpretations of the Bible, which the Catholic Church had serious qualms with, since that was their entire claim to authority...

So the fact that you grew up in a family which valued literacy, which sent you to a university where you spent four years reading books, and then came out of that with your own valid and rational ideas about what those texts mean, and your rite of passage into adulthood is based on your ability to read and write at a university level, that is still very much framed in the values of the enlightenment.


The precise narrative you just articulated is that of the enlightenment in the 18th century, which is a period much later than the invention of the printing press and Protestantism.

A good book for you to check out is Rodney Stark's "For the Glory of God", written by a secular historian debunking much of the above narrative.

The fact that many educated people today take your narrative for unarguable fact also illustrates the problem. The 'enlightenment' narrative is ironically very self-limiting.


The brutality of the dark ages has been debunked. The timeline I just gave you about illiteracy, the printing press, the enlightenment, and our 400-year-old education system remains intact. Neil Postman is a good source for the history of education (see "The Disappearance of Childhood"), or you can simply wikipedia it. https://en.wikipedia.org/wiki/Dark_Ages_(historiography) There are some graphs that show how the enlightenment coincides with an exponential growth in mass publication.

As for Protestantism and the printing press being invented prior to the enlightenment: yeah. Without widespread use of both you don't get the enlightenment, for reasons I mentioned previously. And neither was really a new idea, either. Ancient Greece had the printing press, high rates of literacy, and a belief in interpreting texts for yourself, but these ideas were lost during the dark ages.


Hmm, I've received different information than you. There were more printed books, but that doesn't mean there was significantly less learning and literacy before, although a different proportion of the population was literate. And a lack of general literacy does not necessarily entail a lack of learning or understanding. For example, much of the iconography comes from that era, and the lay person was taught through imagery and liturgy, not necessarily to their detriment. As far as I know, the university system we know today came into being mostly within the context of the Catholic Church's clericalism, and much of the great philosophical synthesis came about during that time, especially with Thomas Aquinas.

At any rate, we are obviously referring to different things by the term 'enlightenment', definitely different historical epochs.


People going to university in order to get a good job are playing the wrong game. Anyone looking primarily at the "cost vs. future earnings" for any particular program or course in a university setting is

1. not going to find a good economic deal

2. not going to get any true value

3. not going to have fun

You're better off taking a 2-year programming diploma or going into the trades if you want the highest short/mid-term pay-off.

If you're playing a longer term strategic game and actually enjoy learning for the sake of growing (i.e. you do it on your own regardless) look at the career-long pay-off.


I went to community college but didn’t get my Associates because the final class was me paying to work in the computer lab. It was so mind numbing I just couldn’t bring myself to do it.


Yes, paying so much money is absurd. At least in many European countries higher level education is free.


In France, companies all pay quite a lot of money into a fund. The fund's purpose is to finance employee education if they want to switch careers.

For instance, my friend was a mechanical engineer at Total, and after 3 years left to go study ML for a year. Not only was the school all paid for by this fund, but the now-student got a decent share of his salary every month. Best of all, he can go work for another company after his degree without any problem.

I thought that was a great idea. Although in practice, not everyone is eligible for this particular program. I forget what it's called; CIFRE maybe.

This particular degree isn't paid for by the government like the rest of French education, although it is taught at a very famous engineering school in France.


The name got changed several times. CIFRE is a PhD course paid for by a company and subsidized by the state, so you get to do real work and get a PhD.

The thing you are thinking of is probably the Compte Personnel de Formation (CPF), which was called the Droit Individuel à la Formation (DIF) a few years back.

Note that this is not some incredible sum of money; however, a year at a classic STEM university costs a few hundred euros in France, so what you get from the CPF is quite enough.

For five years of engineering school I spent about 3500 euros, which included insurance. A full pension with a private room costs a bit more than 300 euros per month. The difference with US education prices is just staggering.


You're correct, but the difficulty is finding an alternative. Training employees who are then free to take their skills elsewhere gives companies that didn't bother with training the ability to lure workers away with higher salaries. There is a bit of a tragedy of the commons in the skilled labor pool.


There is an old adage about this:

Q: What happens if we provide education and training to our employees and they leave?

A: Well, what happens if we don't, and they stay?

Growing a company's knowledge and skill base is an investment, not charity. Companies that don't do it reap exactly what they sow -- they're the same companies whose CEOs will otherwise loudly complain about how difficult it is to find skilled employees, especially at a senior level, and decry the terrible state of universities. As if everyone else just stumbles upon people with twenty years of experience in a particular niche on the street.

Yes, some people will leave. The smart thing to do is to convince as many of them to stay and to stay on good terms with those who leave. Keeping a loyal employee base whose knowledge and skills remain largely unchanged after joining the company doesn't provide any kind of meaningful growth.


HR loves to trot out this saying, attributing it to some enlightened CEO or such. In my experience it's about 50-50 whether someone stays because we offer advanced training or leverages their new skills to get a new job.


Convincing them to stay means people need to be given competitive pay once they have upped their market value.


“Train people well enough so they can leave, treat them well enough so they don't want to.” — Richard Branson


Well, it's also a pool, meaning it rotates: you get some, lose some, get some, lose some, etc. Maybe a dev will take 2-3 jobs to learn the trade (PHP here, JS there, some fundamental web stuff to top it off, and here's your "professional-grade developer". Great.) But you get to hire equivalent devs at each step, so then it's just about $/skill.

Now, if all companies made it part of their "offer" to train people "enough" (say, 1d/w), then you'd expect the whole workforce to become more qualified, getting better over time and with age.

You could actually pay/recover the "investment" of training the equivalent of university/grad/postgrad/etc for all employees simply by the fact that everyone else would do it too (and it would certainly lower wages a bit for the early years of these newcomers, since they'd skip the idling 20's decade of many youths currently).

I don't know, it's clearly not something you could do overnight or even over a generation, it's likely to be deeper and more 'revolutionary' than that in people's minds; but mathematically, economically, it tends to make sense (we've done that for years with "guilds" and "companions" in the medieval ages and actually since forever in some trades).

I think the current mainstream massive education (take in hundreds or thousands and graduate them each year) is just the result of industrialization's need for an educated workforce, a novelty of the late 19th and 20th centuries.

I think the cursor is moving and the explosion of alternative means and times/ages of learning is a strong indicator of that.


One approach that reduces the tragedy of the commons:

Some places require spending a certain percentage of payroll on training by law, failing which the employer must pay the difference to a government training fund. I live in the Canadian province of Quebec which is such a place. I think at least one major tech city in the US has a similar law, though I'm not sure.


Won't it mechanically increase skilled people's wages?


Up until the moment you realise that your very specific skill is not worth that much more elsewhere.


Plumbers and electricians don’t go to school for multiple years to learn a trade, they apprentice - learning on the job, until they get to the point where they propel themselves forward.


Well, no (or at least I guess it depends which educational system you are talking about).

In a few educational systems, you have dedicated curricula for technicians and artisans/craftsmen, with theory in classes, practice in labs, and internships on the job.

Even if learning on the job is a big part (roughly half of the training), it's not purely that.


They apprentice after becoming a [trade] apprentice. If you're brand new, you go to school and do a mix of lab work and on-the-job placements.

The key (which I think you're making) is that once you're an apprentice, you learn by doing and get paid.


> You ever think about how much we subsidize businesses by paying for college ourselves?

1. Generally, education leads to productivity.

2. College education is typically transferable -- learning CS topics improves worker productivity regardless of who they work for.

3. Education is sticky to the individual; it can't be repo'd (or confiscated by fascists / nazis).

4. Employers generally try to match wages to productivity. Even if they pay as little as possible, in a fair market they will have to be prepared to bid close to worker productivity or lose out to a competitor who will.

Given these factors, I think the status quo will do a better job of optimizing things than requiring employers to pay for training. When deciding who should bear the costs of training, it's appropriate to remember who the benefits accrue to. It's not only fair, but also ensures the incentives are aligned. When the topic is general and transferable, the benefits of training largely accrue to the trainee.

Which is why I'm sitting in a training class about how our software works and code review culture, and not one about Python or Go.


The state de-funding of the university is great for corporations in a lot of ways. First, as you note, it's a way of getting you to pay for your own job training.

It's also a clever way for tech companies to externalize a lot of their R&D costs—because academic labs often rely on private grant funding more than the state, corporations can determine research priorities by extending grants, and then have the costs of that research partially subsidized by tuition-paying students!

This is an interesting talk on the subject. Haven't listened to it in a while but it really shaped my thinking about the university while I was a student: https://wearemany.org/a/2014/06/fall-of-faculty


Not to undercut your point too much, but no one comes out of university learning about the latest and greatest tech unless they went to a graduate program where they did research in tech. They may have played around with it more on their own time, but most universities aren't teaching cutting edge stuff.


And yet so many dev interviews are focused on the things they grill you on in CS classes for four years, things most devs will then hardly touch for the rest of their careers.


You are correct. But that also frees us to study what we want. Which is why some people end up studying Liberal Arts, for which there is very little market demand.

When corporations start sponsoring degrees you will see a lot less "fluff" in those forms of higher education.


I’m amazed to hear this from programmers. You can learn this for free, anywhere, at any time.

I’m no longer in software and when I want to upgrade my skills, I’m doing it on my own time and my own dime. Often doubly so if I have to take time off work (hourly) for training.


There is a difference between:

  1. learning outside of work for your own personal enjoyment;
  2. learning for professional growth; and
  3. learning for professional needs.
Category 1 is where learning for free, at any time, is valuable; you are doing it as a hobby or out of personal curiosity, so spending your own time on it is fun and enjoyable.

Category 2 is murky. I've definitely done a lot of studying on my own dime to improve myself or to learn technologies I could apply in my day-to-day work, but when I could, I did it on the company's time. In this category I'm spending my time on something that will have a direct return for the company, so if I can get paid for it, I'll always choose to.

Category 3 SHOULD NEVER be done on your own time.

This mentality of needing to use your own time to do any kind of professional development only fosters the race to the bottom, where you are required to use your own time to improve yourself because the company you work for is only interested in extracting maximum value out of your billable hours...

I agree that in the parent comment's case they could have used some of their free time to improve their backend skills and then try a lateral move within their job, but it shouldn't make you "amazed" that someone isn't willing to sacrifice their free time to earn a little more salary while their company extracts a higher multiple out of that same work.


> I’m amazed to hear this from programmers. You can learn this for free, anywhere, at any time.

Some people are more passionate about their jobs than others no matter the discipline.

I love my job but it's still just something I do to fund my actual dreams/passion. Why would I spend more time on it than I need to when I already feel like I don't have enough time for my own interests?


Your time isn't free. Doubly so when you are spending it to benefit someone else.


Well, not doing it on your own time is the exact point of the article linked. I mean, if you want to pivot into a completely different career in a completely different company you should do it on your own, sure, but GP wants to change position within the same company, and wants to move in an understaffed department. Why should they do it on their time and dime?


Well, maybe talk to the backend lead at the same time. His own team lead is not the same thing as the company, and the company might not even realize what's going on. From the POV of his own team lead, it might be a purely damaging move (has to find a replacement, has to explain to upper management, etc.).

TBH I wouldn't expect any manager (especially if it's just a lead, not even a manager) to make sacrifices for my own benefit. I thank them and reward them materially if they do, but I hold nothing against them if they don't (I'll just study on my own and then leave the company). It has to be mutually beneficial, or at least look that way. So my feeling is that both OP and his team leader are bad at communicating.


I do too, but in a large number of cases the skills I want to learn in my own time relate only tangentially to the skills I should perhaps acquire to become better at my current job. Some of it is unrelated stuff like painting, modeling, and playing an instrument; most of it is still software related, just not specific to my job. Getting work out of your head from time to time seems to help. Of course there are compromises, but certainly not most of the time.

If companies want efficient and capable workers, they have to be willing to do the training themselves. We are < 500 people and are currently building a training center just for that purpose (not software focused). Granted, in a culture where job hopping is common, that can come at a steep price.


Isn't capital expenditure usually the company's responsibility? If you wanted to have nothing but opex you would rent all of your equipment and hire contractors - but that would get expensive quick.


Though paying an engineer to learn a new skill is not going to show up as capex on a financial statement. If it were, companies might be more willing to do it.


You can do stuff on your own time, especially if it's pretty much orthogonal to your day job. But I expect any job to include material time for relevant/adjacent training. Otherwise, it's in the same category as hiring managers expecting you to work on open source GitHub repos in your spare time.


I think spending about a third of my waking hours thinking about stupid computer shit is already plenty. If an employer doesn't want to help me use some of that time to get better at what I do, I bet someone else does.


I don't want to spend my time outside work hours developing skills needed for my job. That's effectively the job stealing my time and money.


And those who do, have a competitive edge over you. It's an investment in yourself that qualifies you to leave and find a better paying job.


Leaving and finding a better-paying job is already easy enough after a few years at one place regardless.

I've got better ways to spend my time for my career and life than learning things for the benefit of my current employer.


If you end up in that spiral you will never get out of it, there will always be someone else working harder than you on improving some skill.

Time is the only thing you have in life, use it wisely and balanced.


Maybe I just had a different career path, but so far it seems that my job is learning, and the work is a byproduct of that.

I spent a bunch of years in the USAR doing basic computer stuff, both in training, and then learning from others as I went along. I was the US Army version of a general purpose "IT" fellow. Once I got out of the military and I interviewed at a few private sector jobs I realized I was not overly well equipped for IT, but if I could show I wanted to learn they would usually bring me on-board. I did desktop support for ~2 years before I landed a veryyyy basic SysAdmin job. I did that for a year, and rolled it into a real SysAdmin job. Did that for a year and turned it into a Server Engineer job (Super sysadmin?) I did that for ~2 years....you can kind of see where this is going.

I've been out of the military for ~10 years now, and every job I've had has been about learning, (and then learning that my current job wasn't going to pay me what I was actually worth) while trying something new. I got paid to "learn" every step of the way. It just wasn't a direct "learn X and you get Y" process, but even then, most companies offered some level of college reimbursement (I never really did that college thing) or will pay for certifications/training, and really, IT is one of the few fields I can think of where all of the knowledge needed to do it is open, accessible, and available to everyone.


This is how I approach it too and I think it's a healthy approach. I get into a job mainly for the learning opportunities it provides. I'll happily take a pay cut if it means I get to grow in a direction I need. Once I feel like the opportunities become fewer and farther between, and management cannot change that, I'll find a new job that can provide me with the right learning opportunities again.

Of course, I'm not even in my thirties, so I can afford to optimise for learning now to have the compound interest pay off for it when I'm older.


It's all a holdover from assembly-line factory jobs. You need to start at a specific time and work hard the whole shift. If you're 3 minutes late, the line is delayed by 3 minutes. Software development is a very different profession.


If you ever find yourself in a position or company where your boss says to you "You're not here to learn, you're here to work", and measures productivity by how many hours you sit behind a screen or LOC pushed, then you're more likely than not working in a sweatshop.

The problem, in my experience, is when they bring in non-technical managers, often from completely different environments (production / manufacturing / industry).

Luckily it's getting rarer these days, but I've seen managers hired on the basis that "a good manager is a good manager, no matter where he/she comes from", only to run the team like a factory floor.


My employer encourages using around 20% of our time for learning. Lower-level developers can manage this pretty easily, but most of the level 3's are kind of expected to just know stuff, on top of the heavy workload. We still manage to spend about 10% of our time learning. And if we want to take Udemy classes, paid tutorials, or something else, they'll gladly reimburse us for the costs. They prefer their developers to "be on point".


20% time is a day of the week. Why not just say "Wednesdays are learning and development day" or something?


My boss 'encourages' us too, but then they pack the schedule with other stuff. I think I get maybe 4% (a Friday afternoon, a couple of weeks a month).


I agree that making time to learn at work is important in a job you've had for a while, but we should probably make a distinction for new hires. Software has a low barrier to entry, and I'm sure many of us have been victim to people who earnestly, but mistakenly, believe they are capable programmers. As a result, they're constantly "learning", in effect turning the job into a classroom and leaving embittered peers to pick up the slack indefinitely. This type of "learning" can be toxic. (Of course, good hiring practices should be able to filter those sorts of people out, but hiring practices can be their own can of worms.)


This is a good point: the tradeoff for learning vs doing depends on how good a learner the individual is. You're describing people who are not efficient at learning. Therefore allocating a lot of learning to them is a bad idea.

Maybe this is why programming interviews are infamous for arbitrary puzzles: the more unexpected and weird, the more the interview tests adaptability rather than current skill.


Frankly, if your workplace doesn't already think so, it's a red flag.


Many managers believe that learning is something that you should do on your own time. Even when "learning" boils down to memorizing idiosyncrasies of some framework/protocol/micro-controller they force you to use.


How is this possible? I'm often asked to do something I don't know how to do. Figuring it out ("learning") is part of the process of accomplishing. I don't know how you can separate learning from doing.


Yeah, I don't understand that either. As a developer, learning is my job. Writing code that I already possess all the prerequisite knowledge to write requires almost no time at all, so 99% of my time is necessarily going to be spent learning how to do things I don't already know how to do. If I had to wait until I got home to learn those new skills, I'd never get any non-trivial amount of work done.


Yes it is. Another big red flag is micromanaging: managers that want to control even the amount of coffee you drink, how many times you go to the bathroom, whether you go out too much, etc. Super annoying.


That happens?


I've seen it be a concern at a couple companies where the person takes 30+ minute "coffee breaks" and has to leave the office to go to Starbucks. Not surprisingly they were generally lower output team members as well. (He says, on HN, in the middle of the workday.)


It happened to my wife with a past manager (customer support). It was that manager's first supervisor role and they were trying to be "metric focused" which led to extreme micro management. That management style led to most of the team quitting, including my wife. We heard that the manager was demoted about a month later.


I've seen it most in call centers, where the call balancing is so sensitive that they try to stagger bathroom breaks across the entire call floor's forty-hour week.

Then that policy, which is a boring dystopia in and of itself, gets applied to other individual contributors "to make things fair". I've seen this in more than one customer-service-focused company (though I thankfully avoided being in that category).


Maybe they don't make quotas but if they comment on how many coffee breaks or bathroom trips you make, they obviously care about it.


Number/duration of coffee breaks is hardly "amount of coffee you drink". And I find it hard to believe any software engineer would catch grief for any amount of time spent away from the desk if their output was measurably above average. So this complaint sounds like it's actually, "I don't get anything done, don't even bother trying to appear busy, and my manager tried to talk to me about it".


It's not unheard of for contractors in non-NA & non-EU locales.


Maybe at Foxconn or something


This is why I love my current employer and won't leave unless I'm forced to.

We have a policy of "Chargeable work always comes first but if you've got nothing to do go learn something, we'll call you when something comes in"

This has led to times where we go dead around the same time every year and you spend two months being paid to independently upskill. Come the busy November-February period, it's really obvious who spent their time wisely and who goofed off.


Yeah, at my work, and everywhere else I've worked, if you say that you don't have anything to do because you finished it all, people panic.


>but if you've got nothing to do

I wish that was ever the case with me! One of the downsides of being part of a very small team.. there is always too much to do. But it is important to take advantage of any times where it's a little less crazy than others to continue to learn new stuff.


I think it's down to my sector and my team's place within it. 90% of sales get made in w46-w52, and my team is first in a long chain to take action after a deal is signed.

This means that by July/August it's a mad scrap to find work.


I've been lucky enough to work in two, maybe three, places with implicit or explicit policies like this. It's something I shop for now when job-hunting.


I've been noticing recently that my employees will walk away from their desk and find a nice cranny to read some kind of business book to help them improve in their personal and professional lives. I LOVE it. The craziest thing is we have never had any discussions about continuing education, but we do keep a library of great books and resources readily available. My take: when your employees continue to learn on the job, they are investing in themselves, which in turn is an investment in your company, and will ultimately make them a better asset to the org. It's a win-win. I love it.


The thing that annoys me the most is that tech companies, mine included, will gladly reimburse books, but no one is ever allowed to be seen reading a book at their desk on company time, whereas visibly being on Hacker News, Reddit, or the NYT is okay. I've just started reading PDFs on my computer screen, but I'd prefer a real copy of the same book most of the time.


Really? I’ve seen coworkers reading, but it’s rare. I occasionally encourage people to read at work. People tend to not follow through, since it’s often hard to find good books on niche subjects.

At one point I was in a reading group on Quantum Computing (we were working through some classic text book). I eventually stopped since I was bad at making time to do the exercises.


I'm so lucky to work at a company where every Friday is dedicated to study. My company pays even for English lessons (it's an Italian company).


I love the distinction between incremental and transformative learning. Personally, I've experienced the most transformative learning through working with a coach - a 3rd party who offers the space for reflective engagement, new perspectives, and experimentation. Much of this is tough to do on our own because we get stuck in the same mindsets and patterns of behavior. A coach helps articulate what's going on in a new light, uncovers blindspots, and holds us accountable for taking action, ultimately fueling transformative learning.

(Side note: I'm on a mission to spread the power of coaching by making it easier to find the right coach: https://uplevel.coach/ Happy to chat with anyone interested in learning more!)


Folks, there's so much cynicism here. If you want to learn and invest in yourself, do it! If you want to sit back and wait for your employer to invest in you, don't be surprised when it doesn't happen. Orgs need to make smart investment decisions. If you're not feeling supported, it may be that the org doesn't view you as HiPo (high potential), and that's clear feedback on your current value to the organization.


The problem with your argument is: what if the organization doesn't support anyone in learning (just like 99% of companies and society at large)? Would you then say that you are a low performer? Because based on your argument that's the only logical conclusion.

You seem to have jumped to a conclusion but I don't think this has anything to do with being high performer or not.


I divide my working time something like 50/50 between actually solving tasks and learning. My company doesn't understand this, but I work remotely and no one notices. Soon, though, I'm back in the office.

I joined my current company two years ago. We are a small (but constantly growing) company - about 10 programmers at the moment. 10 separate programmers, with no notion of any "team". The time (and money) lost on problems arising from that fact is XXX% or even XXXX% per project.

The most important thing about learning is not the accumulation of knowledge by an individual, but the accumulation of knowledge by the group: making knowledge common, and therefore creating a higher-level being called a "team".


I have also been working remotely for multiple years now. Since I started treating learning new things as part of the work, my efficiency has increased drastically. That has added value both to myself and to the companies I have worked for.


A better way of thinking of this is in terms of freelancers learning on the job and charging customers for this openly. Often there is a mutual benefit where even having access to a person with the ability and willingness to learn is worth something to customers.

I recently did a Python project for a customer where I was very open about my limited Python skills and the fact that I hadn't really used the language in over a decade. But they paid for me anyway because I had other skills that they needed (building search backends using Elasticsearch). In the end, I brushed up my Python skills on the job and had a happy customer. It wasn't a big deal, but I definitely spent some billable time googling the basics of Python (I literally had to google the syntax for, e.g., for loops, lambda functions, and a few other things).
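
To give an idea, the basics in question were roughly at this level (a made-up snippet for illustration, not anything from the actual project):

  # A for loop over a list -- the kind of syntax I had to look up.
  words = ["elastic", "search", "backend"]
  for word in words:
      print(word.upper())

  # A lambda used as a sort key, another thing I had to google.
  print(sorted(words, key=lambda w: len(w)))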

I would argue my ability to learn and adapt is actually what makes me valuable. I can wrap my head around complex tech stacks, new languages, etc. while applying skills I already have.


When I was a dev lead I came up with the concept called a "learning spike," where a developer would pick a focused topic to dive into for a set period of time[1] and then do a short[2] presentation of learnings to the entire team. This would be backed by an official story in Jira or whatnot. The goal would be for the developer to choose something that sparked their interest (within the loose boundaries that it be at least marginally related to the team's work.)

I figured this would be great in a number of ways -- developer deepens their knowledge of some topic, possibly even has fun, practices their communication skills by sharing with others, and the entire team learns something. Although the idea was technically backed by management, it didn't really catch on. I think the problem was that the short term opportunity costs were too high when everything was on fire (which was often).

[1] 2-3 days max, probably.

[2] "Short" is the key word here.


“I think the problem was that the short term opportunity costs were too high when everything was on fire (which was often).”

That's such a sad way of thinking, but pretty common. In addition, a lot of people seem to think that if everything isn't on fire all the time, they're moving too slowly.


I found that a sustainable pace with short bursts of fire makes for the fastest progress.

Fires keep you on edge, show you what's actually important, and help trim fat. But without long periods of calm in between, I can't consolidate what I've learned, nor build upon that wealth.

Quite like endurance training, actually.


My job gives us about 5K a year for individual training in addition to hosting meetups after work etc etc.

The 5K can be used for training, conference and travel tickets etc etc.

This is in Norway so everything is mostly nice all the time, but to be honest it is the first time I've had anything like this. Feels awesome.


Dang. I asked my job for a PluralSight account or if they have one and they said they'll "look into it."


Makes me want to work in Norway.


The full article is paywalled, but it looks like the writer has given a talk in the past with (hopefully) similar content:

https://www.youtube.com/watch?v=ellIRUe2mpU


My entire professional "office" career, if not my entire working experience since starting full time as a bicycle mechanic at 16 years old, has been wholly reliant on the opportunities to do something I wasn't perfectly 100% qualified for (it should be noted that I am a white male.)

I wouldn't knowingly join an employer that pigeonholes anyone across their organization. I'd also propose that a role with little or no horizontal or vertical professional mobility is probably a good candidate for automation and a human shouldn't be subject to it for longer than necessary anyhow.


I call it "sharpening the saw".

If you're working at a factory, the time you spend sharpening the saw so you can continue to work is paid, and no one questions that. Factory workers also don't learn how to use new tools at home; they get training paid for by the company, and that is normal.

So why would learning on the job in the computer industry, as a way of sharpening your skills, be treated any differently, and (I dare say) almost be stigmatized in some places?


The abstract looks great. How do we access the full article without an account?



I was able to access the full article by searching "Learning for a Living" on Google then clicking on the top stories link.


True, but because competition drives relentless increases in productivity, businesses will cherry-pick people who managed to learn on their own dime rather than train people themselves. It's simple. Only force applied by people or governments changes the rules enough to open space for these kinds of things.


This is a great point, but it poses an inherent competitive problem, not unlike the prisoner's dilemma: every company is strongly incentivised to cheat (not train) because it gives them an advantage. This is pernicious, because it all but forces other companies to do the same.

Silicon Valley has, in a way, made an industry out of 'cheating' by encouraging super long hours far beyond normal labour requirements. As a personal choice, this is fine of course... but it's never just personal or localised, because it 'forces' everyone else to go to that level 'or die'.

I wonder if it might require legislation, e.g. '3 weeks of training / learning'. Or maybe it would be better for the powers that be simply to apply pressure.


One caution though; learn how to limit the learning part, it can become a trap. I have often in the past fallen victim to the pitfall of spending too much time researching tooling and not enough time jumping in.

My method for avoiding this is to keep a minimal core set of criteria. That widens the list of candidate tools and makes it harder to get caught up in minutiae. Then, I have found that actually spinning up each toolset in a test environment, to find real-world implementation issues first-hand, is a great first-pass filter.

This is just a particular subset of learning at work though, so in general what I would say is learn how to get better at learning in general. Learning at work and elsewhere will get better if you do.


I've worked at a few places. Something I wished existed at all of them is a library. It need not be filled with books; just reserved room space for people to come and read without interruption. I really wish people would give this a thought someday.


I write C++ at work, and that means I wait a lot for compilation to finish. Sometimes there is also a big file to download from the other end of the world over FTP. Recently I've started spending this time reading books. That's how I went through "The Little Schemer" and "The Seasoned Schemer" so far. Now I'm going through "Real World Haskell".

I used to have second thoughts about it, but after some time I just thought fuck it, other people spend this time scrolling Facebook on their phones or whatever, there is no reason to feel guilty about it. Since I started doing it, nobody made a problem of it.


Kind of disappointed with this article. I thought it would have more practical advice vs. just defining categories of learning (i.e. incremental, transformational) and talking about the need for learning. On this topic, a great read is Scott Young's "Ultralearning" https://www.scotthyoung.com/blog/ultralearning/. Well-researched, practical and the author has been successful following his own advice. Highly recommend.


Learning I don't mind much; I actually enjoy it. Required training is a huge pain. We're supposed to do our normal job duties along with the required training. The training required by my company isn't connected to our customers, so it sort of looks bad when we're at our location and the customer sees us doing something weird. Worse yet is when both the customer and our company require the same training, so we end up doing the exact same training twice from two different training companies.

Yeah not gonna lie my job is lousy compared to most people here.


It's weird to see the emphasis on workers rather than emphasizing employers. My employer lets me learn and study during the 9-5. My previous employers didn't. It makes an incredible difference.


I have never been a fan of learning at work. It seems to impose a sort of groupthink, where employees learn by following the patterns in place under the shadow of work culture and procedure. That has always struck me as highly uncritical and non-disruptive.

Instead if you need new skills or want to advance your current skills my recommendation is to write an original application that solves some problem. Then you are forced to make original decisions and own the resulting consequences under risk of failure.


I tend to disagree. You are hired AND compensated for your skillset. How you acquire it is up to you. This is why experience and seasoning result in senior roles and higher comp.


One of the big reasons I left my last job was because management talked a lot about PD (professional development). However, the approval process for learning resources was vague and long. On top of that, we were told to learn on our own time and that at work, you were to work. I realized that to them, PD was something they encouraged but did not support. I eventually found out that the avg tenure was much shorter than the number I was told during interviews. I wonder why...


I was an intern at a relatively large retail company this past Summer and there were quite a few points where I had nothing to do except wait on database access to be approved. This took a few hours minimum.

I bookmarked the Rust book and started to learn the language in order to keep me stimulated. I also read a lot of Wikipedia articles on older computer scientists because I love history. Overall a good experience, but definitely would have liked a more engaging job overall.


I've noticed that some companies implicitly do this. Instead of hiring from the pool of people with the skill they need, they hire someone else with potential and whom they like better.

Then when it comes to the job, they don't explicitly tell you to learn, but subconsciously they do know this, and as a consequence they judge the quality of your work rather than "looking over your shoulder".

Of course, it's also cheaper to hire younger people.


This factor was almost the sole reason I left one company. They demanded billing time to clients and required a working day filled with billable hours.

Where was I to learn? On whose dollar? Multiple clients need me to learn the same thing? Who takes the hit?

In the end the system is set up to pressure me into taking the hit out of my own time. I consider this one of the great evils of consulting companies.


Mods is it ok that this user ONLY posts links from paywalled sloanreview.mit.edu links? Seems like spam to me. They never comment, they only post every article from that website.


I've just quit my job after almost a decade. The fact that my employer did absolutely zero to improve my skills, aside from a conference once a year, makes me really sad. The longer I think about it, the more abusive it seems. In these times there is no excuse for leaving your employees behind. But it should be well planned, not just jumping on every hype train.


Totally. And more companies should widen their budget for this. It’s pretty clear that it’s also an investment for employers too.


Ten years ago the sentiment was not to (https://news.ycombinator.com/item?id=567115), and I found that very offensive for some reason at the time. It is good to see the HN community very much for learning at work now.


We are "Knowledge Workers", for crying out loud. Our role is to create knowledge and spread it; that's all we're supposed to be doing. All the issues that arise come from people not knowing what learning really is and not understanding the value of the knowledge being created.


I was once publicly told by my immediate supervisor to stop reading, on the job, a technical book for a degree my company had paid for. Apparently that was unacceptable behavior. I quit the company after I earned that degree. My experience is that most people quit because of poor management.


I was once publicly told by my immediate supervisor to stop reading a book about XML (back when XML was new) when we started using XML for new web services.


Something like 70% quit because of a shitty boss.


When I took my current job my manager told me that learning on company time is encouraged, even if it's not directly applicable to what you're doing (within reason). I didn't really expect that to be the case when I started, but it actually is and it makes me insanely happy.


I would tend to agree, but from what I've read in Robert Cecil Martin's book "The Clean Coder": it's not the job of your employer to keep your CV up to date; when you pay a musician for a performance, you don't pay them to practice scales.


> when you pay a musician for a performance, you don't pay them to practice scales

A lot of professional musicians (e.g. permanent orchestra members) are definitely paid to practice scales. A lot of instruments even require training specific muscles to manage playing at all.

For gig-musicians (of any kind), the practice must be factored into the gig payment.

I'd say your analogy is appropriate, but for the opposite point. I don't see any reason for programmers to keep their relevant work skills up to date for free.


In addition to having space to learn on a daily basis, more tech companies should consider offering a week of study leave in addition to what they offer in their vacation policy.

I'm imagining the type of week that Bill Gates and John Carmack take.


> a week of study leave

Nice thought, but if my wife can see me, that week will become "fix stuff around the house leave" or "visit her parents and don't read anything leave". If my coworkers can see me, that week becomes "fix these bugs for me leave". They'd have to fly me somewhere for that to work.


yep, should include a hotel room


Some companies claim to own any skills you develop that you would not have earned at a usual job. I don't really want some former employer claiming I can't work at my next job because of the time they gave me to do online courses...


If you're speaking of a non-compete agreement, I understand them to be largely unenforceable. In the US it depends on what state you're in.


Employees are valued by businesses more for their expertise than for their work. I’d rather have one dev who knows how to solve difficult problems than a hundred who don’t. Roughly.


“The moment you stop learning is the moment you begin to die.”

This quote MADE the article for me. I've been saying something similar for a long time now.


I even set up my reports with a skills matrix which shows the skills they need to reach. I still find they don't really push themselves.


Well, I've been in situations where I was expected to learn this or that, but I was also expected to complete 30 or so "story points" per two-week sprint, be available to troubleshoot production problems, and attend planning meetings. And since the only things anybody ever asked for status on were the bug fixes and new features, I always ended up having to choose between my mandatory learning goals, my mandatory feature implementation goals, my mandatory bug fixing goals, and my mandatory meeting attendance goals; "learning" always ends up taking a back seat.


Yes. Very true. Scheduling learning is hard. And incentivizing it is even harder.

I personally love to learn. It’s what I do best. If I don’t do it I get bored.


I got the same idea during my last team lead gig.

I introduced an "interests & responsibilities" matrix, meant to align team members' skills with their interests.

Goal was to:

* ensure all needed skills are covered within the team

* foster collaboration between knowledgeable persons and those wanting to gain more expertise.

But it's quite hard to use if you don't tie it to the team's processes. It has to be visible and integrated into the team's workflow.
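
For what it's worth, a minimal sketch of what such a matrix can look like in practice (the names, skills, and structure here are invented for illustration):

  # Hypothetical interests & responsibilities matrix: for each skill,
  # who currently owns it and who is interested in learning it.
  matrix = {
      "ci_pipeline":     {"owners": ["alice"],      "interested": ["bob"]},
      "search_backend":  {"owners": ["carol"],      "interested": ["alice", "dan"]},
      "frontend":        {"owners": ["bob", "dan"], "interested": []},
      "on_call_tooling": {"owners": [],             "interested": ["bob"]},
  }

  # Skills with no owner are coverage gaps for the team.
  gaps = [skill for skill, row in matrix.items() if not row["owners"]]

  # Each (owner, learner) pair on a skill is a mentoring opportunity.
  pairings = [(skill, row["owners"][0], learner)
              for skill, row in matrix.items() if row["owners"]
              for learner in row["interested"]]

  print("coverage gaps:", gaps)
  print("mentoring opportunities:", pairings)

The gaps and pairings are only useful if they actually feed back into planning, which is exactly the "tie it to the team's processes" problem above.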


No, it's not work. It's an investment by the employer, who takes the risk that it will pay off in the future.


An investment with no guarantee that the employee will stay to provide a return.


“We” assumes a level of cooperation and mutual goals that I think doesn’t exist even for most high end jobs.


What is the distinction between learning and developing? Can you develop software without learning?


With more menial jobs, you can definitely develop without learning much or at all.

I have quit a job because it felt like my day had devolved into being a glorified "copy-paste monkey". We had a basic template we used for clients; we just did a bit of copy-paste-modify crap to it and gave it back to them. I rarely, if ever, learned anything, but it was technically development at some level.


Working and learning at the highest level: What's the difference?


By "learning" you mean "reading Hacker News"? ;-)


This is one of the reasons I quit programming. I could have been spending time improving as a developer, but instead they had me doing some totally inconsequential "bug" fix (spend the next two hours updating this translation for the third time). I just got sick of someone rationalizing my doing non-work by saying it was better for the business, as though I had no broader perspective on what might be better for the business. It was just bullshit.


Can I ask what you do now? I've been programming for a decade and a half and wonder about moving to something else. But it's a bit scary, as software is the only thing I know.


Not OP, but I feel the same way as them. I'm taking an EMT class and planning on applying to PA programs. It's scary doing something totally different but it feels so much better.


Ha I’m in the same boat at my finance job. Looking to move into software dev and go back to school, but scared because I’m paid well and I have to start over again.


Great question, this is what I was wondering as well.


Well, I dropped the golden handcuffs. Right now I'm flipping cars (replacing a clutch, a head gasket, etc.) and driving for Lyft, and I might get a part-time job. I'm definitely not making as much money as I used to. But for me the traditional strategy of working for a company and saving up for retirement wasn't working anyhow. It was like: I'm either going to have to deal with this now, when I can do something about it, or when I'm older and won't be able to do anything about it.

So I'm kind of where I was when I was 20 except now I have a whole lot more skills and experiences. I'm not actually sure what I'm going to do long term. All I really know is I was headed down the wrong path and I've been lying to myself for far too long.

But actually I do kind of know. The only thing that really makes sense is some kind of tech entrepreneurship. I mean I know success isn't guaranteed but it comes down to kind of a deep question which is what do you want to do with your life. Sitting behind a desk coding what someone else tells me to all day long is simply not going to get me where I want to be in life. So I kind of jumped off the cliff hoping I can learn to fly on the way down.

I totally might fail so don't follow my lead too much.

I will tell you that since I've been working on cars all day I'm so much physically stronger than I was as a coder, and it feels fantastic. It isn't really long-term, because technically you're not allowed to buy a car with the intention of fixing and reselling it unless you're a dealer, but succeeding and failing based on my own decisions feels so much better than being cemented underneath someone else in a corporate hierarchy.

I was a developer for about 10 years and even though I could get another development job I'd rather crawl through the mud and find a new way forward before I get too old and it's too late.


That's awesome you took the plunge; jumping off the cliff is probably the hardest part. It's not easy, but people who are smart and motivated are really good at figuring things out when they need to; it's just that so many take the safe but unfulfilling path.

I went the opposite direction, leaving a comfortable but soul-sucking job in a totally different industry to transition into programming. It was rough for a few years as I was struggling to build up my experience and make enough money to get by on, but now I have a great job that I actually enjoy going to.


That takes a lot of courage, kudos. I've been in tech since 2006. I started out as a min wage tech support agent and worked my way up to a lead software dev position now. I do both back-end (C#) and front-end (Vue, Angular, React) development. In the last 3 months, I've switched jobs twice in a fruitless effort chasing that sense of passion and excitement I had a decade ago.

I have it great; I make a 6-fig salary, I work remote full time, and never more than 40 hours. I feel guilty for complaining, but the work is just damn boring. I would rather be a janitor than code anymore. Some days I sit and just stare at my screen for an hour depressed as hell wondering how I'm going to get out of this and still provide for my family. We have chronic medical issues, so the insurance alone makes it hard to escape. The great recession also left me with severe anxiety about being broke and unemployed.

I recently rebuilt a manual transmission with minimal prior experience working on cars and it made me feel so damn good afterwards, so I really respect your decision to work on cars. I agree, it feels great to do manual labor! I spend my free time doing manual work around the house like yard work or building things like a green-house. I'm learning a lot and having a great time, but when I sit down at the desk it makes me even more depressed about coding.

I hope it all works out for you.


> I have it great; I make a 6-fig salary, I work remote full time, and never more than 40 hours. I feel guilty for complaining, but the work is just damn boring. I would rather be a janitor than code anymore. Some days I sit and just stare at my screen for an hour depressed as hell wondering how I'm going to get out of this and still provide for my family. We have chronic medical issues, so the insurance alone makes it hard to escape. The great recession also left me with severe anxiety about being broke and unemployed.

Jesus. Yeah, this entire thing hits home. I think a lot of mid- to late-career software folks are in this position of hating the work, feeling super guilty for hating the work, and not seeing a way out that won't risk their family's financial security.

My best job ever was a Summer job helping manage a campground. Outdoors, some social interaction, enough down time to read a few pages of a novel or do some pushups here and there, a bit of moving heavy things around, nothing whatsoever to think about when you leave for the day. If something like that paid 75% as much comp + bennies as my WFH software job and had similar future job prospects, I'd quit effective today and go do that. And probably cancel my home Internet service, sell my computers, and downgrade to a dumbphone. It's all just so incredibly low-value, once the shininess wears off.


Managing a campground sounds amazing. I worked at a gas station/food mart for 4 years before getting into tech. To this day I still have dreams where I'm doing that job and having a great time interacting with my coworkers and people while performing mildly physically demanding work. One of my buddies still begs me to come back and work with him. Just like you - if they paid me 75% as much comp/benefits - I would go back in a heartbeat. Ideally I would be a forest ranger; I live in the Pacific NW and I love the outdoors, so it's only fitting.


That sounds really cool. Have you thought about potentially going down the path of becoming a dealer? The auto space continues to be hot as it's highly lucrative (see Vroom for example), and you've found a niche that seems to work on a unit economics basis. Why not strap on some automation bit by bit and make your flipping process more efficient? By the time you get even 25% of the way through, you'll probably have stumbled on to a multimillion if not multibillion dollar company.


Good stuff and good luck. I totally identify with how you feel. I have to take the more measured route, try things after work. You are right about physical work, it is rewarding. Hydroponics is where I spend my spare time.


The only thing I'd caution you about is that, in my opinion, starting a successful company is 90% about relationships, and you mention feeling "cemented" in your past work relationships. There might be something there you need to practice, and improve on, before you start your company.

(I say, as someone who is currently staying full-time employed, paused on entrepreneurship, and trying, and often failing, to be better in my work relationships)

That said, best of luck to you. You’re clearly self aware, which will see you through a lot.


>All I really know is I was headed down the wrong path and I've been lying to myself for far too long.

Good for you, in a manner of speaking. Most people keep 'lying' to themselves their whole lives, not just about their sorry corporate jobs but about life in general (things like marriage, children, etc.).

I really hope you succeed in the long run. I'm cheering for you.


Same here. I moved into Agile/DevOps coaching, where it's expected you learn in order to help your teams grow and improve their practices and products. Haven't looked back. I now spend a lot of time advocating on behalf of my teams to get them in-sprint time to learn and mature what they're interested in.


Sorry for being too direct, but for the sake of clarity, and coming from my experience communicating with agile coaches - I think it is bullshit. I wonder if others have a similar experience/opinion...


Personally, I'm no longer turned off by the idea of having a job that's more or less understood to be entirely bullshit, the way I might once have been. I think that's because I've seen so little of the code I've written actually live long enough to seem worth the time it took to make it, including a bunch of small-potatoes projects for huge companies that took months of my work life and were then simply binned, because the company changed direction, or acquired some company with another solution, or the project was doomed from the start by an obviously inadequate budget.


How did you make that transition? I've been interested in making a similar one but have been frequently advised to stay in engineering and work my way into management instead.


The thing making me want out is having to change jobs every few years so as not to fall way behind on pay, and having, for interviews, to prep for a series of goddamn pop quizzes in a high-pressure environment. The quizzes can cover material from any or all parts of your education or career, plus a bunch of stuff you may well never have encountered in either, at practically any difficulty or granularity, usually with no way to meaningfully narrow the field you need to prep for even when the roles you're applying for are pretty specific, and usually with no clear understanding of how your performance will be judged. It's the only thing I've encountered in adult life that gives me stress dreams like school did. "Normal" job stress, even when shit's on fire, is about 10% as stressful, at worst. I hate, hate, hate it.


Who is "they"? Did you not have product teams writing clear business cases for everything you work on? Like if a developer doesn't understand why something is important it isn't going to get done right.


I think I'm not understanding something.

Is it usually the case to have product teams documenting clear business cases to explain why something is important?


At my first programming job, my performance suffered when management ran out of things for me to do because I was running the mobile department successfully while the web side of things was constantly on fire. When confronted about the lack of progress on my side, I explained that my sprint tickets were mostly pointless, with no business value. My manager said it's my job to do it anyway. Unfortunately I stuck around for a little while longer after that. I really shouldn't have.


You do not understand what it is to be a programmer. Programmers fix bugs. It might be boring but it's required. Programmers document. It also might be boring but it's required. Programmers write unit tests. We've hired people that don't want to do the non-fun stuff. They don't last long, they generally come in, cause problems for the team and then leave. The people that are willing to do what's needed even if it isn't fun are much better to have on a team.


You do not understand what the parent comment meant. It's not bug fixing he's opposed to. It's busywork and the idea that a programmer themselves have no idea how their time might be well spent. So much human potential wasted by corporate efficiency tyrants that do nothing other than burn out their employees.


I think you're confusing me with somebody else.


I think you mean accountant.


While I’m fortunate to work for an org that pushes training, my generation was not given much feature work at the start. We were expected to take over support and learn the systems and graduate into development/architecture some day. That’s how monoliths swung.

Today, that’s clearly out of fashion. Nobody seems to maintain anything and the next group wants to reimplement much of the same function in the new flavor of the month. Anecdotally, it does not all look more stable, bug free and performant but it does seem more fun.

/okboomer


Any way around the paywall?


If you aren’t learning on the job, you’re missing out on a better way to do things


I am still learning.



