The problem I experienced was that most companies require a bachelor of science or an engineering degree in their job postings. I think this is because the workshop kool-aid has proven ineffective at creating good programmers, producing instead code workers who are good at menial jobs but unable to provide good solutions. So if a foreign-policy major decides to go full-on coding by teaching himself, he gets slaughtered on job applications without ever being able to show his skills in a technical interview.
Secondly, the job politics at most companies are just dreadful. After all those humdrum interviews, skill tests, and certification badgering, what I actually ended up doing was writing code that generates an SQL query from parameters taken from the frontend, which had NOTHING to do with the interview questions or the skills the employer demanded.
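For the curious, the kind of task described above might look something like this minimal sketch. The table and column names ("users", "age", "city") are illustrative assumptions, not from the original comment; only the values are parameterized, and in real code the column names coming from a frontend would need to be validated against an allowlist to avoid SQL injection.

```python
# Hedged sketch: build a parameterized SQL query from frontend parameters
# (here, a plain dict standing in for a parsed request).

def build_query(filters):
    """Return a parameterized SELECT statement and its bound values."""
    clauses = []
    values = []
    for column, value in filters.items():
        # Only VALUES are parameterized with "?"; real code must validate
        # column names against an allowlist before interpolating them.
        clauses.append(f"{column} = ?")
        values.append(value)
    where = " AND ".join(clauses) if clauses else "1 = 1"
    return f"SELECT * FROM users WHERE {where}", values

sql, params = build_query({"age": 30, "city": "Berlin"})
print(sql)     # SELECT * FROM users WHERE age = ? AND city = ?
print(params)  # [30, 'Berlin']
```

Menial, as the commenter says, but it is the bread and butter of a lot of CRUD work.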
In sum, I have to emphasize that these are my own experiences and may be limited to where I live and tried to earn my keep by coding. So I hope it is a local thing, and that where you live has a better environment for employing programmers from outside the field.
Thanks for reading anyway.
Don't get me wrong. On paper it's attractive. Quite high salary. Casual dress. Often flexible hours. No heavy lifting. High demand for skilled workers.
On the other side: you have to concentrate at a level most people can't even understand, and you have to do it all day, every day. Nobody understands what you are doing, and most people underappreciate the difficulty (which is itself a huge understatement). Many managers believe that the reason things are going more slowly than they would like is that you don't have enough stress in your life. Unfortunately, the reality is that while you don't have much day-to-day stress (unless you are in dev ops), you have ridiculous long-term stress.

No matter what you are working on, you don't have enough information to do your job. As soon as you learn enough to do your job, the job changes. Your tools change on a yearly, if not monthly, basis, and the amount of learning necessary to get good at your new tools is a full-time job in itself.

On top of all that, the field is immature to the point that every single person on your team likely has a different way to solve every problem. And they are all convinced that everybody else's solution is crap. And they are all correct (pause to let the implication sink in). You're going to get it wrong almost all the time, and it will take you something like 30 years (if I'm any measure) to realise it. But even when you finally figure something out, nobody will listen to you, because they all think you are wrong (and they are probably correct about that too).
It's a terrible job. Only do it if you really, really want to do it.
On the other hand, I've known a few people who are able to clock in and clock out, do a competent job, and not get too stressed about stuff. I'm still trying to figure out how they do it.
This is an amazing skill I only learned during a PhD, and only after burning out halfway through.
Obviously, step 0 is clear boundaries and reasonable work hours. Clock out at 5 or 6 (or 7 or whatever -- point is, put in your 40 hours and leave).
After that, things get more difficult.
For me, the answer was to work out directly after work (so before the commute home even!) and to listen to an interesting audiobook/podcast during my workout.
That way, when I got home, I would have something on my mind other than work. Fortunately, my personality is such that as long as the audiobook/podcast was interesting, that's what I would focus on and think about for the rest of the night.
Back in the mid to late 2000s, I recall seeing and hearing management say, "we just need a few Google-quality engineers". There was this pervasive sentiment that if they could just find good enough software engineers, it would somehow make up for the fact that the company had no leadership or direction.
That, combined with the competitive nerd gene, zero objective basis for interview techniques, and an attitude of "any reason to boot this person" rather than "how can this person contribute to our company?" has gotten us to where we are in this toxic industry.
On the other hand, I also think it's not ideal for someone with only a little software engineering education to aim for a job that is full-on coding. I think those kinds of workshops (or single programming courses) are more useful for jobs where coding is just tangentially a topic, like occasionally creating/maintaining advanced spreadsheet files, or working together with programmers (if you're a graphics designer, product manager, or similar).
This may happen in other subjects as well. I taught college math as an adjunct for all of one semester, and discovered to my surprise:
1) None of my students knew the meaning of "show your work." Hint: It doesn't mean turning in your scratch paper along with the answers.
2) When I asked the math professors how to explain "show your work," some of them bluffed and blustered, but none of them could give me a straight answer. At worst, I got, "they should just know."
It sounds like you were teaching calculus or maybe college algebra ("they should know" is actually good advice for the higher level mathematics courses -- most departments have a whole course that's more-or-less about how to write proofs).
At research universities, the tenure-track professors rarely teach anything below Calc II, and might not teach any Calculus courses at all. Certainly nothing below Calc I.
If you adjunct again, consider asking the other adjuncts/instructors instead. They're more likely to have lots of exposure to that university's demographic in a relevant course.
In particular, for a TT professor at an R1, odds are extremely good that they've NEVER taught anything below calculus, and may never even have been the instructor of record for calculus courses! They were hired to do research and teach graduate/upper-level undergraduate courses.
Teachers who know how to explain and assess everything in a graduate analysis course with perfect clarity might not know how to handle a student who doesn't know what it means to "show your work" in the context of calculating a derivative or solving a linear equation. In fact, they might not have thought about learning or teaching that material themselves since early high school or maybe even middle school. This doesn't make them a bad teacher. It just makes them a bad Calc I/college algebra teacher. Which is okay -- teaching those courses is not their job and is not what they were hired to do.
So this is a bit like complaining that a web programmer isn't particularly adept at kernel hacking. That doesn't make them a bad programmer; it just means their focus is elsewhere.
Again, most universities do hire experts in teaching the lower-level courses. They're just not typically tenure-track professors.
Admittedly, I was a math major in college, and was unaware of the existence of college algebra until I was hired to teach it.
Oh and the argument doesn't have to be (exclusively) verbal. It can consist of diagrams and symbol manipulation.
The only method I've seen work for a large set of students is to show lots of examples of what "good" and "not good" versions of "show your work" look like. Preferably on problems that look similar to the homework problems. And emphasize this during lectures.
E.g. after doing a derivation on the board: "by the way, this is what I expect to see when I say 'show your work' on a <insert problem type> problem."
Nobody ever explicitly tells you this, but a few lucky students guess it at some point, preferably long before they get to college.
Teaching is an art. To some extent, "show your work" is an art as well. The techniques of art are never perfectly well defined, and are always somewhat arbitrary, but people do manage to learn it. Learning an art by imitating the masters is as old as the hills.
Perhaps my biggest frustration while teaching the algebra course was not that I was confused about what "show your work" means, but that there was such a gaping hole in the preparation of my students.
If they were CS majors, I would entertain having them learn the lower-level stuff first, but not for what they are trying to accomplish.
You are teaching coding, not computer science and electrical engineering on Day 1.
Setting variables, printing things on the screen, creating loops, making your first functions: that's Day 1.
With that, I would expect most people to become addicted to programming; when you make your first function, you can feel the power of it.
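A "Day 1" session along the lines described above might look like this sketch in Python (the specific values and the `greet` function are my own illustration, not from the comment):

```python
# Setting variables
name = "Ada"
year = 1843

# Printing things on the screen
print(name, "wrote about programming in", year)

# Creating loops
for i in range(3):
    print("iteration", i)

# Making your first function
def greet(who):
    return "Hello, " + who + "!"

print(greet(name))  # Hello, Ada!
```

Each step builds on the previous one, so by the end of the hour the student has written something that feels like a real program.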
After that, I encourage people to solve a problem in their own life. Most of the time, you can rig it in Python, but if they wanted to build an app, they are basically ready.
My objective is to give them instant gratification and feed into their personal goals.
Using C is not as bad as it seems. For one thing, embedded projects always start out a lot smaller than software projects on a big computer. As a beginner, you spend a lot of your time understanding the hardware side of things, such as interpreting the numbers associated with I/O pins and other devices. In the entire duration of a beginner course, you might never get past programs of maybe 25 lines.
I use microcontrollers in R&D, and have been quite happy sticking with Arduino based boards.
Sorry, but real-world problems like the ones foreign-policy practitioners encounter on the job don't respect _any_ boundaries, let alone boundaries about what topics one knows/doesn't-know/doesn't-have-to-know. For Christ's sake, these students were majoring in "Science, _Technology_ and International Affairs."
They had some coursework where they encountered a bit of programming and some hardware configuration, SO WHAT? It's good for them.
Now, if we can get computer science students to get their feet wet in some public policy...
>On the first day I soothingly assured them that they wouldn’t have to do any programming, which of course was a complete lie.
I don't have a strong stance whether his decision was ethical or not, just interested in seeing your opinion as you seem to be implying that it might not be :P
Is it still unethical if the professor argues that the lies were for the students' own good?
Is it still unethical if, a few months into the course, the professor tells the truth?
Is it still unethical if, a few months later, the professor tells the truth and one or more students says "you know, I understand now why you lied to us"?
This really isn't a challenging conundrum. The student-teacher relationship is a special one. There are expectations that need to be fulfilled on both sides. One of the most basic ones that can be expected of a teacher is don't lie to your fucking students.
Otherwise, presenting engineering or science students with informal proofs that would not be considered mathematically rigorous would also be a lie, and unethical.
That's a good point. I like to tell programming newbies (or just other people who don't know what software development is) that a computer program is like a cooking recipe: It defines step by step what to do, and someone (the computer) executes it.
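The recipe analogy can even be made literal in a few lines. A toy illustration (the recipe text is my own, and of course real programs replace the strings with actual operations):

```python
# The "recipe" is an ordered list of steps; the computer (here, the
# for-loop) executes them one at a time, in order.

recipe = [
    "crack two eggs into a bowl",
    "whisk until smooth",
    "pour into a hot pan",
    "fold and serve",
]

for step_number, step in enumerate(recipe, start=1):
    print(f"Step {step_number}: {step}")
```

The point for newbies: the computer never improvises, skips ahead, or interprets intent; it only follows the steps as written.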
That's why C or especially C++ is probably the worst possible language to teach programming.
Also, be frank with your students from the beginning about one thing: there are areas they should understand, and areas they should just treat as "magic" and not go into at their current level of development. Not because those areas are really magic, but to keep first things first.