I must say I am conflicted about writing on this subject. On the one hand, I have a humanities background and turned to coding (bachelor's in philosophy; master's in sociology); on the other hand, I am burned out now and unwilling to do anything code-related thanks to previous employers, so my outlook isn't a nice one.
The problem I experienced was that most companies want a bachelor of science or an engineering degree in their job postings. I think this is because the workshop kool-aid has proven ineffective at creating good programmers, producing instead code workers who are good at menial jobs but unable to provide good solutions. So if a foreign-policy major decides to go all in on coding by teaching himself, he'll get slaughtered on job applications, without ever getting a chance to show his skills in the technical interviews.
Secondly, the job politics of most companies are just dreadful. After all those humdrum job interviews, skill tests, and certification badgering, what I ended up doing was writing code that generates a SQL query from parameters taken from the frontend, which had NOTHING to do with the interview questions or the skills demanded by the employer.
In sum, I have to emphasize that these are my experiences, and they may be limited to the place where I live and tried to earn my keep by coding. So I hope it is a local thing, and that where you live has a better environment for employing programmers who come from outside the field. Thanks for reading anyway.
I've been programming for the vast majority of my life, professionally for something like 30 years. I've done other jobs as well. My general advice to people thinking of becoming a programmer: Go ahead if you like programming. If not, then there are almost certainly better jobs than this.
Don't get me wrong. On paper it's attractive. Quite high salary. Casual dress. Often flexible hours. No heavy lifting. High demand for skilled workers.
On the other side: You have to concentrate to a level that most people can't even understand -- and you have to do it all day, every day. Nobody understands what you are doing, and most people underappreciate the difficulty (which is actually a huge understatement). Many managers believe that the reason things are going more slowly than they would like is that you don't have enough stress in your life.
Unfortunately, the reality is that while you don't have much day-to-day stress (unless you are in dev ops), you have ridiculous long-term stress. No matter what you are working on, you don't have enough information to do your job. As soon as you learn enough to do your job, the job changes. Your tools change on a yearly, if not monthly, basis -- and the amount of learning necessary to get good at your new tools is a full-time job in itself.
On top of all that, the field is immature to the point that it is likely that every single person on your team has a different way to solve every problem. And they are all convinced that everybody else's solution is crap. And they are all correct (pause to let the implication sink in). You're going to get it wrong almost all the time, and it will take you something like 30 years (if I'm any measure) to realise it. But even when you finally figure something out, nobody will listen to you, because they all think you are wrong (and they are probably correct about that too).
It's a terrible job. Only do it if you really, really want to do it.
On the other hand, I've known a few people who are able to clock in and clock out, do a competent job, and not get too stressed about stuff. I'm still trying to figure out how they do it.
> On the other hand, I've known a few people who are able to clock in and clock out, do a competent job, and not get too stressed about stuff. I'm still trying to figure out how they do it.
This is an amazing skill I only learned during a PhD, and only after burning out halfway through.
Obviously, step 0 is clear boundaries and reasonable work hours. Clock out at 5 or 6 (or 7 or whatever -- point is, put in your 40 hours and leave).
After that, things get more difficult.
For me, the answer was to work out directly after work (so before the commute home even!) and to listen to an interesting audiobook/podcast during my workout.
That way, when I got home, I would have something on my mind other than work. Fortunately, my personality is such that as long as the audiobook/podcast was interesting, that's what I would focus on and think about for the rest of the night.
Working out before going home works great for me, too. And it has the added bonus of helping you skip some of the traffic on that commute home by pushing the start time of that commute back a bit. Depending on where you live and when you work, I suppose.
Nice to hear I'm not the only one who feels this way. I finished a bachelor's degree in CS back in '94, and have been programming professionally ever since. I went back and got a Master's degree ten years later, while I was working. At this point, I have a LOT of education, and a LOT of experience. More, I imagine, than most anybody out there. Yet I'm continuously hammered for not being fast enough, not catching all the problems before they happen, sometimes not knowing the answers to questions on the spot... If I didn't have a degree in CS, I might think, "well, maybe they teach you this stuff in college." If I didn't have much experience, I might think, "well, maybe you learn this stuff with experience." But I have all that and still find myself unable to solve every problem instantaneously with no margin for error. Either I'm REALLY stupid, or this is harder than they think it is.
> After all those humdrum job interviews, skill tests, and certification badgering, what I ended up doing was writing code that generates a SQL query from parameters taken from the frontend, which had NOTHING to do with the interview questions or the skills demanded by the employer.
Back in the mid to late 2000s, I recall seeing and hearing management say, "we just need a few Google-quality engineers". There was this pervasive sentiment that if they could just find good enough software engineers, it would somehow make up for the fact that the company had no leadership or direction.
That, combined with the competitive nerd gene, zero objective basis for interview techniques, and an attitude of "any reason to boot this person" rather than "how can this person contribute to our company?" has gotten us to where we are in this toxic industry.
I think the job posting issue is partially related to recruiters not knowing much about technology either, so they stick to "measurable" means like degrees and technologies, even though it's often stupid. Also, job postings are generally optimistic, and you might still have a good chance if you don't meet all requirements.
On the other hand, I also think it's not ideal for someone with only a little software-engineering education to aim for a job that is full-on coding. I think those kinds of workshops (or single programming courses) are more useful for jobs where coding is just tangentially a topic, like if you're occasionally creating/maintaining advanced spreadsheet files, or working together with programmers (if you're a graphics designer, product manager, or similar).
I've seen recruiters do some really weird stuff, not to mention HR feeling the need to insert themselves into every process, despite their vast unknown-unknowns.
> More broadly, this illustrates that there are hidden traps of confusion and misconceptions among first-time programmers that are very hard to see if you’ve been programming for a long time.
This may happen in other subjects as well. I taught college math as an adjunct for all of one semester, and discovered to my surprise:
1) None of my students knew the meaning of "show your work." Hint: It doesn't mean turning in your scratch paper along with the answers.
2) When I asked the math professors how to explain "show your work," some of them bluffed and blustered, but none of them could give me a straight answer. At worst, I got, "they should just know."
> 2) When I asked the math professors how to explain "show your work," some of them bluffed and blustered, but none of them could give me a straight answer. At worst, I got, "they should just know."
It sounds like you were teaching calculus or maybe college algebra ("they should know" is actually good advice for the higher level mathematics courses -- most departments have a whole course that's more-or-less about how to write proofs).
At research universities, the tenure-track professors rarely teach anything below Calc II, and might not teach any Calculus courses at all. Certainly nothing below Calc I.
If you adjunct again, consider asking the other adjuncts/instructors instead. They're more likely to have lots of exposure to that university's demographic in a relevant course.
I'm not conceding that it's reasonable for a professor to answer something with "they should just know," but I think the issue here is that they seemed incapable of explaining it. It's one thing to say to a class "you should know how to $x, so I'm not going to teach it to you right now" but another entirely to be either unwilling or unable to explain exactly what they mean to a colleague.
Keep in mind that teaching is its own skill. Especially in universities, that skill can get awfully specific awfully fast.
In particular, for a TT professor at an R1, odds are extremely good that they've NEVER taught anything below calculus and may have never even been the instructor of record even for calculus courses! They were hired to do research and teach graduate/upper-level undergraduate courses.
Teachers that know how to explain and assess everything in a graduate analysis course with perfect clarity might not know how to handle the situation where a student doesn't know what it means to "show your work" in the context of calculating a derivative or solving a linear equation. In fact, they might not have even thought about learning/teaching that material themselves since early high school or maybe even middle school. This doesn't make them a bad teacher. It just makes them a bad calc I/college algebra teacher. Which is okay -- teaching those courses is not their job and is not what they were hired to do.
So this is a bit like complaining that a web programmer isn't particularly adept at kernel hacking. That doesn't make them a bad programmer; it just means their focus is elsewhere.
Again, most universities do hire experts in teaching the lower-level courses. They're just not typically tenure-track professors.
Indeed, the professors are either surprised or offended that the university even offers college algebra, which they obviously never had to take, and which they consider to be a few levels below remedial.
Admittedly, I was a math major in college, and was unaware of the existence of college algebra until I was hired to teach it.
It means "write down a good argument for your solution being correct". And good means roughly that every step of the argument follows trivially from the last. And what counts as trivial depends on which course your are taking.
Oh and the argument doesn't have to be (exclusively) verbal. It can consist of diagrams and symbol manipulation.
This isn't a very helpful answer for students at that level. It's necessary to say these words at some point, but without lots of examples students will have no idea what "good argument", "follows trivially from the last", or "counts as trivial" mean.
The only method I've seen work for a large set of students is to show lots of examples of what "good" and "not good" versions of "show your work" look like. Preferably on problems that look similar to the homework problems. And emphasize this during lectures.
E.g. after doing a derivation on the board: "by the way, this is what I expect to see when I say 'show your work' on a <insert problem type> problem."
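To make that concrete, here is one possible illustration (a made-up example, not taken from any particular course) of the difference between a bare answer and shown work when solving a simple linear equation:

```latex
% Bare answer (does not count as showing your work):
%   x = 2
% Shown work: each step follows from the previous one, with the reason stated.
\begin{align*}
  2x + 6 &= 10 && \text{given}\\
  2x     &= 4  && \text{subtract 6 from both sides}\\
  x      &= 2  && \text{divide both sides by 2}
\end{align*}
```

Students mostly learn what counts as an acceptable step by seeing worked examples like this placed next to ones that would be marked as insufficient.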
I think an even broader generalization can be made, that I only learned by accident as a student: A huge part of classroom education is learning to communicate knowledge by imitating the teacher.
Nobody ever explicitly tells you this, but a few lucky students guess it at some point, preferably long before they get to college.
Teaching is an art. To some extent, "show your work" is an art as well. The techniques of art are never perfectly well defined, and are always somewhat arbitrary, but people do manage to learn it. Learning an art by imitating the masters is as old as the hills.
Perhaps my biggest frustration while teaching the algebra course was not that I was confused about what "show your work" means, but that there was such a gaping hole in the preparation of my students.
Starting off with C? Ow. R or Python seem like they'd be much better choices for something students would actually use in the future, given their data analysis tools.
I agree, this doesn't make sense. Python especially strikes the perfect balance between learning programming concepts and being productive almost immediately in real life. It seems like most of the problems the students experienced were with lower-level CS details like types and pointers that didn't translate directly to the kinds of data analysis problems they would likely be able to apply to their studies.
If they were CS majors I would entertain them learning the lower-level stuff first but not for what they are trying to accomplish.
You are teaching coding, not computer science and electrical engineering on Day 1.
Setting variables, printing things on the screen, creating loops, making your first functions: that's day 1 (something like the sketch below).
With that, I would assume most people become addicted to programming; when you make a function, you can feel the power of programming.
After that, I encourage people to solve a problem in their own life. Most of the time, you can rig it up in Python, but if they wanted to build an app, they are basically ready.
My objective is to give them instant gratification and feed into their personal goals.
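For concreteness, a day-one lesson along those lines might look something like the following Python sketch (purely illustrative; the names and the temperature-conversion example are made up, not anything prescribed above):

```python
# Day 1: variables, printing, loops, and a first function.

name = "Ada"                 # setting a variable
print("Hello,", name)        # printing something on the screen

for i in range(3):           # creating a loop
    print("loop iteration", i)

def fahrenheit(celsius):     # making your first function
    """Convert a temperature from Celsius to Fahrenheit."""
    return celsius * 9 / 5 + 32

print("20 C is", fahrenheit(20), "F")
```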
They don't start with C, they start with Arduino, which incidentally requires C++. The big difference is that you can easily have blinking LEDs, sensors and what not, which makes the act of programming more "concrete".
Yes. To bring others up to speed, the world of low-cost "hobbyist" microcontroller boards has converged on the Arduino development environment, which is based on GCC. In fact, I think that offering a widely available, free, and tolerable firmware development environment was a major breakthrough for hobbyist embedded systems.
Using C is not as bad as it seems. For one thing, embedded projects always start out a lot smaller than software projects on a big computer. You spend a lot of your time as a beginner understanding the hardware side of things, such as interpreting the numbers associated with I/O pins and other devices. In the entire duration of a beginner course, you might never get past programs of maybe 25 lines.
I use microcontrollers in R&D, and have been quite happy sticking with Arduino based boards.
Why isn’t it a good introduction to programming? It has a really consistent syntax, and polymorphism via multiple dispatch is a lot simpler to learn than polymorphism via classes imo.
I understand why the teacher in this case hid the coding requirements of the course. His reasoning makes sense. But I'm not sure it was ethical to misrepresent the course, even for the students' own benefit.
So many students, it seems, are grade-hounds who try too hard to work around topics they find difficult or uninteresting.
Sorry, but real-world problems like the ones foreign-policy practitioners have to encounter on the job don't respect _any_ boundaries, let alone boundaries about what topics one knows/doesn't-know/doesn't-have-to-know. For Christ's sake, these students were majoring in "Science, _Technology_ and International Affairs."
They had some coursework where they encountered a bit of programming and some hardware configuration, SO WHAT? It's good for them.
Now, if we can get computer science students to wet their feet in some public-policy...
In older use, "coding" meant only assembly and machine language programming, which would be a separate process from writing a high-level algorithmic version. A paper on FORTRANS claims that it will "virtually eliminate coding and debugging" Thus the statement that the class is no coding could be considered truthful.
Sorry, I must have misremembered and thought they said "no coding" instead of "no programming". But people would probably call such a usage a lie if it is intended to mislead even if strictly speaking it isn't.
Is it unethical to do something if all parties are happy? Does it only become unethical when a party has been tricked and become unhappy, or is tricking somebody even for their own good unethical? In that case, what would be your definition of ethics?
I don't have a strong stance whether his decision was ethical or not, just interested in seeing your opinion as you seem to be implying that it might not be :P
Is it unethical for a professor to lie to their students?
Yes.
Is it still unethical if the professor argues that the lies were for the students' own good?
Yes.
Is it still unethical if, a few months into the course, the professor tells the truth?
Yes.
Is it still unethical if, a few months later, the professor tells the truth and one or more students says "you know, I understand now why you lied to us"?
Yes.
This really isn't a challenging conundrum. The student-teacher relationship is a special one. There are expectations that need to be fulfilled on both sides. One of the most basic ones that can be expected of a teacher is don't lie to your fucking students.
It could be considered a non-lie under the concept of mental reservation. In older usage, coding a computer meant writing assembly or machine language only; thus, a report on FORTRAN indicates that "FORTRAN virtually eliminates coding and debugging". This is a standard, if slightly archaic, usage of the word "coding".
Otherwise, presenting informal proofs to students in engineering or science that would not be considered mathematically rigorous would also be a lie and unethical.
Lying always being unethical is actually a relatively standard idea in classical ethical theories e.g. Summa Theologica, Question 110, Article 3. However, the statement that the class involves "no coding" could be considered truthful in the social context or as part of a use of mental reservation.
Lying can be perfectly okay under consequentialism, which also goes way back. Personally I'm not very convinced of ethical theories where lying is unethical in all circumstances (sorry Kant), and I suspect I'm not the only one. Stating that there exists an ethical theory that supports your point just isn't very convincing for people who don't subscribe to that particular theory.
Their job is to educate, and stimulate the mind. They aren't telepathic, any "for your own good" they engage in is born of assumptions unless they know the student well.
> Lesson #2: the basic paradigm of code (strictly sequential execution, with flow control and function definitions) isn’t always intuitive and should be explained clearly to first-time programmers.
That's a good point. I like to tell programming newbies (or just other people who don't know what software development is) that a computer program is like a cooking recipe: It defines step by step what to do, and someone (the computer) executes it.
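As a toy illustration of that recipe analogy (an invented example, not from the comment above), a short Python script really is just a list of steps the computer carries out in order, top to bottom:

```python
# A program as a recipe: the computer performs each step in order.
def step(description):
    print("Next step:", description)

step("Boil the water")
step("Add the pasta")
step("Wait 10 minutes")      # a real program might call time.sleep(600) here
step("Drain and serve")
```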
> Another problem was that I used several different ways to do the same thing in the code, without being clear about why. For example, I declared some parameters as constants and others (which weren’t being changed) as variables — mostly because I was being sloppy.
That's why C, and especially C++, is probably the worst possible choice of language for teaching programming.
Also, be frank with your students from the beginning about one thing: there are areas that they should understand, and then there are areas that they should just treat as "magic" and not go into, on a certain level of development. Not because they're really magic, but just keep first things first.
I think the professor is vastly overestimating how much his students benefited from this exercise. What they need are data analysis and modeling skills; R + RStudio would be the perfect tooling for them. Laudable goal nonetheless.