This is why languages like Pascal (and its descendants) use := for assignment.
There are contradictory opinions about syntax among programmers. Many programmers don't like the example assignment syntax above, but then many programmers also say that syntax is a relatively minor aspect of learning a language (even though the syntax of a language shapes the way you think about solving problems).
And many programmers do avoid certain languages precisely because they dislike the syntax.
Niklaus Wirth, who created Pascal, Modula-2 and Oberon, had this to say about using the equal sign for assignment:
"A notorious example for a bad idea was the choice of the equal sign to denote assignment. It goes back to Fortran in 1957[a] and has blindly been copied by armies of language designers. Why is it a bad idea? Because it overthrows a century old tradition to let “=” denote a comparison for equality, a predicate which is either true or false. But Fortran made it to mean assignment, the enforcing of equality. In this case, the operands are on unequal footing: The left operand (a variable) is to be made equal to the right operand (an expression). x = y does not mean the same thing as y = x."
As contradictory as it sounds, syntax is both important and unimportant at the same time.
It's important because it affects how code is read and how code is written. Programmers naturally lean towards constructs with light syntax and away from constructs with heavy syntax: just think about defining maps in Python or JavaScript (with nice literals) vs in Java, or think about higher-order functions in Haskell (with currying and tiny lambdas) vs JavaScript vs Java. And, as the article pointed out, the syntax can affect how somebody learns a programming language.
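As a rough illustration, here is the light-syntax version in Python (the Java equivalents of both would need noticeably more ceremony):

    # A map defined with a literal...
    ages = {"ada": 36, "grace": 85}

    # ...and a higher-order function applied with a tiny lambda.
    doubled = list(map(lambda n: n * 2, [1, 2, 3]))

    print(ages["ada"], doubled)  # 36 [2, 4, 6]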
On the other hand, syntax is unimportant because we learn to see beyond syntax, to the concepts underneath. Syntax is ultimately just notation and notation is not the most important thing. Semantics are important! There are much deeper differences between languages than mere syntax. At this point, I don't even think of code in terms of concrete syntax most of the time, although this might be exacerbated by my background studying programming language design and theory.
Personally, I think different people tend to overemphasize one side of this or the other, with more people overvaluing syntax than not. In my perfect world the solution to confusing syntax would be to teach people to dissociate syntax from semantics early on, but that's a tall order. Perhaps people could read Gödel, Escher, Bach, which engenders the right mindset :).
Could part of the problem be that programming languages hijacked ASCII characters which partially hijacked typewriter functionality?
Pascal was always my least favorite language, because it never felt practical. For instance, its assignment operator requires two shifted keystrokes, whereas the equal symbol requires one non-shifted keystroke.
In my ideal world, ISO would just standardize programming symbols. Then, languages could build around those common symbols, and then better input devices could be designed.
It amazes me that for as much money we spend on software (How many millions of programmers are there in the world?), we're still using a modified typewriter as our primary input device. Look at all the nicely designed specialized input devices we have for video games; surely we can come up with something better for programming.
I had a wise teacher early on who without fail used "becomes the same as", "is assigned to", and "points at" when reading code out loud. "Is equal to / equals" was only for comparisons.
The funny thing is that there is a mathematical symbol for assignment, the left-arrow, "<-". At least it's widely used in algorithmic notation to show assignment, and also in R.
I program a decent amount in R, and this left-arrow assignment operator is incredibly annoying. I believe it's a holdover from the days when there was a single key that would input '<-'.
I'm just starting out[1] and recognise all of these points, especially number 4, even though at first glance it might seem the wrong attitude.
Every time I run into something new I feel the need to 'fully' understand it before moving on. With such a vast corpus of information out there and a lifetime's worth of skill development in just one particular branch (database admin, anyone?), I have certainly felt that sinking feeling when yet another new thing pops up.
Learning part time has not helped this, as it becomes easy to say 'oh, I'll catch up on that later but I won't start anything new yet'. That leads to a 6-month gap where nothing appreciable gets done, at least for me!
Tomorrow I start full time in this learning endeavour (very lucky I know) and these 8 points probably won't be far from my mind when trying to actually get stuff done.
Even though I've been programming/developing software for 10+ years, number 4 is still relevant to me also.
What I've found is taking on the attitude of "make it work, then make it right" works well for me.
With anything moderately complex and new, you are never going to fully understand everything and get it right the first time. The best strategy is just to hack something together until it "works", then constantly be revising and refactoring from there. Instead of spending hours architecting, reading, and planning, just jump in and start coding.
Or else you just end up in analysis paralysis or over engineering.
I'm working on a project which uses Go, Swift (iOS) and PostgreSQL, which are all new technologies for me. I haven't read a single book on any of them; I just jumped straight in and started developing. And bit by bit, line by line, I have made good, measurable progress.
I think the tricky thing about #4 is essentially just the leaky abstraction issue.
Each one of us is working and building on mainly one layer of abstractions. It helps our puny human minds to comprehend and visualize whatever limited functionality we are trying to build. Unfortunately, most abstractions are leaky, and when things spill over, you have to learn how to clean up (and when the next one spills over, we get the yak-shaving situation that any developer will encounter at least once...).
It's not unique to programming. But in other subjects, when the abstraction is leaky you probably don't know it yet until you learn more. Here, when abstractions break, things stop working immediately. I guess that could be both a good and a bad thing?
Leaky abstraction is an interesting way to put it. That makes sense to me. I really do feel like I'm building something on a certain layer of abstraction.
I guess when something new comes up it has the potential to push you down a few more layers than you intend and you can get caught trying to race to the bottom of the stack and learn everything.
A certain amount of curiosity for what's going on 'underneath' is surely a healthy thing and probably helps long term progression, but for learning it can be a bit distracting!
To expound on your last point: I agree that you have to be able to stop yourself from always getting drawn by curiosity into the rabbit-hole, but in my experience one of the things that has separated the mediocre developers from the great ones is their ability to cross those layer boundaries. You don't have to know everything about every layer, but it helps to know enough about each one to have a general sense of how it works, and to know where to start digging if you do need to go deeper. Obviously this kind of experience takes time to build up.
For example, if you are doing front-end JavaScript development it can be very good to understand HTTP, and have at least a vague idea of how things like TCP/IP and DNS work. Nothing kills productivity like hitting a bug that you can't even investigate without someone else's help.
Question: I'm curious to hear other people's experiences with "4. Stop asking questions". I ask questions quite a bit. It does slow me down. But I feel like if I understand something, I can move faster in the longer term, besides the value derived from just understanding how the system/abstraction works.
I think the point is not to get paralysed by the desire to know everything at once.
Each question tends to just give rise to ten more. Which is fine for someone who has a conceptual understanding of programming-in-general (whether from studying it academically or just being more experienced/talented), enabling them to contextualise all that information. But for the beginner who is trying to self-teach (self-learn?), even if they understand the words, they don't really mean anything.
>Django is a application framework, written in Python, which follows the model–view–controller (MVC) architectural pattern
I may know what all those words mean and thus understand that sentence. I may know what MVC is (in that, again, I understand the words used to describe it). But until I've done a bit of Python and implemented something following the MVC pattern (and compared it to the alternatives) I don't really get it. The knowledge has to become practical to be useful.
It's better to only ask questions that go one level deep at a time, practise what you learned until you actually understand it, then ask the next level of questions.
For self-taught coders who are never going to be amazing at it but just want to make stuff (like me), I think this is very good advice.
To elaborate, the goal should be to move forward by cycling between asking questions and doing something:
Ask question <-------> Do something (ie bang out code)
The "do something" will never be perfect, so don't waste hours trying to understand the theory perfectly. Use the coding time as a way to both test your knowledge and drive further questions.
This is why it's helpful to pick a motivating project while learning to code. It keeps you moving ahead and makes sure the "question time" is truly filling in the knowledge gaps you need to fill in.
>This is why it's helpful to pick a motivating project while learning to code.
Very much the case for me, yeah. I've tried learning from tutorials and books and I just get bored. But if I come up with an idea and try to make it, I love it, learning the bits I need as I need them.
Should note that I think that if one is capable of learning the academic way, that's probably a better approach. Understanding programming in the general and thus being able to apply it to the specific. You'll certainly make better code that way.
But for me, for whom learning that way is never going to work, the ad-hoc way is fine. It doesn't really matter if my code is scrappy so long as it works and doesn't do harm.
I generally agree. For me it just pissed off the people who were trying to help me. Seems like most practitioners (and this is in any industry, I think), know what to do and how to do it, but the logical "why" behind it has long since been archived. They learned it and accepted it, but don't think about it regularly. I would ask a lot of questions about language design, programming paradigms, and stylistic decisions. I read about theoretical and conceptual lessons that were difficult to directly apply to writing my own code.
The part where I disagree is the idea that beginners should stop asking just because they're advised to. Some foundation in theory can really help to frame the lessons you learn as you hack your first projects together. The understanding also helps you recognize patterns and learn some vocabulary. When you're naturally ready to say "screw the theory, let's just do it" -- that's when you'll make measurable progress.
There's a line where you have to stop asking questions (lest you end up at the big bang). Experience will guide you in narrowing down where that line exactly is - over time, you better know what you don't know and don't need to know.
I'm also a proponent of constantly asking "why? how?", but at the beginning, you have to at least pause between your questions to process what's happening, and sometimes trust that you have enough to work with, or trust others to tell you what you don't need right away.
I agree with this, there is a line at which you have to stop. I need to know 'why' when I'm learning in order to put what I'm learning in context. This can often lead to chasing rabbits down innumerable holes.
Learning to stop asking why and at what point to just accept a certain level of understanding and move on has been tough for me.
Stopping asking right now doesn't preclude coming back later to flesh out your knowledge, of course, which I've also found myself doing and it works well enough.
I think the point is to stop asking broad question like "Why doesn't this work?" before trying to figure it out yourself. In the process you will learn much more than just have the question answered in an equally broad form of "Oh there's a semicolon missing here". You still don't know why that semicolon was required in the first place afterwards.
#4 is phrased badly. Don't stop asking questions. But do identify which questions are relevant to getting your stuff done and which are sidetracking into irrelevant stuff. You don't need to know how a webserver works as a newbie. You should know how it works when you're starting to tune for performance.
Both work: asking questions or not asking questions.
It mainly depends on the person.
Some people learn by doing: they just want to go out there, break stuff, break it, break it, break it again, and then they understand it.
Other people want to know the subject even before writing the first line of code, so they ask questions and more questions until they feel they have the thing figured out and can start.
Yet other people make it a point of honour to never ever ask a single question; they would rather spend hours reading docs, specifications, RFCs, and whatnot.
Who is to say there is a best way? I don't think there is.
As a dev, what I don't like is beginners asking overly obvious questions when all the answers are in the documentation, if they would only take a few minutes to read it. And what I really don't like is people asking questions while feeling entitled to an answer, like you owe them one.
But again, that's my personal view on the thing; it's different for everyone.
For example, I'm mentoring a friend (he is learning JavaScript) and he often opens with "I will bother you again with a question", and it does not bother me at all, whatever the question is: 1. I already agreed to help him, and 2. it's easy to answer.
I would be more worried about someone who wants to learn programming and who doesn't care whether they understand stuff, or isn't curious. The "asking questions" part is more about being curious and/or motivated enough to understand things than about the act of asking a question; "asking questions" is just another tool alongside "reading documentation", "reading previous answers to other people's questions", etc.
My point: the pace at which you learn things is totally personal. If you think asking a question is faster and you are concerned about speed, then ask a question; but then I could argue that in some cases you can ask a question and never get an answer (Stack Overflow), and maybe it is faster to read the docs.
I disagree. Keep asking questions, but do it only when you don't have an answer. If you deconstruct the question to its simplest form, chances are that you'll find the answer yourself.
Personally, I've always read '=' as meaning the left hand side was equal to the right. So if I see
x = 5
then I'm already assigning the value of 5 to x, because they're equal! Does that make sense? Curious to see what others think, because before reading this article, I didn't even realize that was a particular tripping point for anyone looking to start programming.
Then you read it as assignment. It's a bit unfortunate that it means something else in mathematics. Pascal used := as the assignment operator, and used = as the equal test. It could also do 1-based indexing of arrays, which is another classic pain point when you start programming with just a tiny bit too much knowledge.
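For what it's worth, a quick sketch of that indexing trip-up in Python, which (unlike Pascal) counts from zero:

    langs = ["Pascal", "Python", "R"]

    print(langs[1])  # "Python": the second element, not the first
    print(langs[0])  # "Pascal": index 0 is the first element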
All I know is that R's assignment notation, `<-`, bugged the hell out of me after years of equals-sign assignment...but after playing around with R, I naturally and accidentally use `<-` in Python and other traditional assignment languages.
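Amusingly, `x <- 5` isn't even a syntax error in Python: it parses as a comparison of x with -5, so the habit fails silently. A small sketch:

    x = 10
    x <- 5         # parses as x < (-5): a comparison whose result is thrown away
    print(x)       # still 10; nothing was assigned
    print(x <- 5)  # False, i.e. the value of 10 < -5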
When a good friend of mine started to ask about learning programming, I answered: "Be ready to do it every day, and be ready to keep learning for the next 10 years." I asked him to seriously think about that and said we'd talk again in a week.
There is much more to it, but IMHO those are the two points that I've seen stay valid for the last 20 years or so.
And when I read "I've been on again off again with it" I thought "first mistake". Really, if I had two quick pieces of advice to give:
Do it every day: because it tests how much you can take. If you feel programming two days in a row is "too much", that's probably a good sign it's not for you.
Be ready to do it for 10 years: search for "Teach Yourself Programming in Ten Years" by Peter Norvig and you will understand why.
A lot of people, when it comes to "learning how to program", underestimate very basic stuff like the involvement and the time it takes. I've seen people give up just because it took too much time, or because they thought they would be good at it much sooner.
So yeah, staying in front of a computer for hours, for days, solving problems with code may not fit everyone; but to me, the longer you can keep at it, or forgetting to eat and sleep while doing it, are in general good signs of "you can do it".
All 8 points are very valid, even if some could be argued. I personally remember the fear: back in the 90s, for a full month, I was paralysed by the fear of doing wrong, the fear of breaking stuff. If it was not for an older guy who kept encouraging and reassuring me, I would probably have given up, so I would say: try to get a mentor.
But the opposite advice would work too: don't get a mentor and try to understand everything by yourself. Or in another context: resist the urge to go ask your question on Stack Overflow and try to solve as much of it by yourself as you can. Solved it? Then keep not asking questions on Stack Overflow as long as you can.
IMHO there are many ways to learn programming; some advice won't fit everyone, and there is no single best way to do it.
I think I'm going to use this comment for friends asking if they should start programming. Especially the 'do it every day' thing is quite important, I believe. Thanks for sharing this.
One of the reasons I love Hartl's Rails Tutorial <https://www.railstutorial.org/book> is because he integrated a completely free c9.io account which (with the commands he gives you) is guaranteed to set up Rails and a shell environment identical to what the book expects. I have seen so many aspiring programmers stymied by things like installing a text editor, let alone the right version of Ruby or incompatible gems. And, people with less money are more likely to have a cheap Windows laptop, which makes configuration even more difficult. By contrast, anyone with a computer or even an Internet cafe should be able to get an up-to-date browser, and c9 is plenty good enough for getting started.
I've started taking my 9-year-old son through the book, and though it's crazy how large a vocabulary you need if you really wanted to understand everything, that's not actually necessary and he's able to follow the directions and make progress.
> Getting things installed and set up is where 70% of your time goes when you’re starting out. (...) I remember whole days dedicated to just installing a damn module. This is normal.
It's not normal to struggle with installing modules / libraries. It's just Python sucking. Normal programming languages rarely give you this kind of headache.
I absolutely agree about the words 'just' and 'obviously'. I've been trying to cut them out of my vocabulary. The word 'just' has this feeling of being short with someone. And that they are a fool for not knowing what to do already.
"Just put the book down over there".
"You just need to..."
#2 The Fear.
This is what I have struggled the most with. During all of my programming classes I worried that I was not cut out for it, because I am a business student minoring in comp sci in a world of students who live for programming. But after a while I really learned that everyone in class struggled with the same problems that I had; everyone was just too proud, in a way, to ask each other for help. Eventually everyone broke down, and that was when the most fun happened in class, and with the programming. It was really cool struggling with other programmers.
By the end of all my programming classes my fear was not that bad, but I still fear doing any hack-a-thons or anything like that, because I shoot myself down.
I am working on getting over that fear, but it's good to know I am not the only one.
The opposite has happened to me while learning mathematics. As a programmer who has been learning fundamental mathematics for the past year, the whole assignment/equivalence thing has screwed me up so many times.
A good way to look at it is not as two different symbols, but as the difference between a definition and a theorem. So if you are creating a new variable, function, etc. for convenience you assert "x = 5" (or something similar). Since you have no other constraints this variable has to satisfy, this is totally safe. However, if "x" already shows up in one of your formulas you need to be careful because you can't merely assert it is equal to whatever you want.
In pure functional programming, '=' really is '=' (bindings are unmodifiable, and it's easier to substitute equals for equals).
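A tiny sketch of that "substitute equals for equals" idea, written in Python with a made-up pure function for illustration:

    def area(r):                   # pure: same input always gives the same output
        return 3.14159 * r * r

    a = area(2)

    # Because nothing is ever mutated, the binding behaves like an equation:
    # anywhere `a` appears you may substitute area(2), and vice versa,
    # without changing the program's meaning.
    print(a + 1 == area(2) + 1)    # True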
The assignment operator should have been <- or something other than = from the beginning. I wonder if there was some discussion about that, back in the day?
Back in the day a large proportion of languages used = for equality and something else for assignment. That something else might have been e.g.
* := (Algol-style :=, not Go-style).
* A non-ASCII arrow character (surprisingly not always a <- 1, but sometimes 1 -> a)
* A keyword (SET a TO 1)
So yes, there was certainly a lot of discussion about it, but in the end this is also just a minor design detail. It's just an accident of history that we ended up with =-as-assignment being the norm.
Catch up? Catch up to whom? You will never really catch up to the people with a ton of experience today. Of course, at some point you might know something more in depth than they do. And you will be ahead of people who start after you. Don't worry about catching up; worry about learning what you are doing. That is all you can really do.
I went from basically zero (a week of C in college) to where I am now by working on it pretty much every day. And the only way that was possible was because I actually like programming. She should say something about how her motivation levels are doing.
In terms of development as a coder, I think there's a number of plateaus. It's easy to recognize those below you, hard to see those above. So here are those I came across:
- Able to make simple modifications to other people's code. Everyone starts here. If you see backgroundcolor = red, what happens if you make it blue? You have no idea of what the platform is, how the code executes, what slightly unusual syntax means, etc. At some point, you will exhaust the places where your guesswork works, because there aren't that many straight replacements in most code.
- Able to implement your own functions. Some Excel spreadsheet doesn't have a function that you want. You type it out, and testing it a few times will uncover some corner cases that you can fix. Still no idea of how to organise things, especially since the trading floor mentality is that time is money. So you have an ungodly amount of VBA functions, all over the place. There's an enormous number of guys in finance at this level. I went to see some guys from a well-known bank running several billion dollars of funds with this kind of spaghetti.
- Realised that Excel is a crappy way to organize code. Discovered you can write your own standalone programs without knowing everything about how the screen is drawn (and I still don't know the nitty gritty of it). When you realise a great deal of code is already written for you, that's massive. Of course I take the screen pixels as an example, but it's true in just about everything that you write. For the most part, you are glueing things together that already exist. So now I could write VB6 programs and .NET as well. Realised there was a lot I didn't know that was already working out of the box. I wrote a poker probability calculator back during the poker boom to help me give money away more efficiently (guess what, it's not just the odds that matter).
- Spent ages learning about SQL and databases. Looks easy from a distance: rows go in, rows come out. But why does it take so long? What's an index? etc.
- Started to realise there's a whole load of interesting ways to organize code. Object oriented of course sticks out, and you can't avoid it. It will take a while to write classes that are not just oddly collected functions. Maybe run into the GoF patterns. At this point, a large rewrite might occur as various Eureka moments happen. Also you may be doing a LOT of backtracking, because OO isn't all that obvious. Maybe look at functional, or at least figure out what first-class functions means.
- Started to look at performance. The vast bulk of code up to now needed zero performance optimization, despite being inefficient. At some point, you wonder why a calculation that can be explained in a few sentences can take a non-negligible amount of time. Cue big-O notation. Memory profilers, timers. Threading issues. Cache coherence. Network stack.
- Started to look outside of finance. Web, apps, etc. You realise there's a value to being polyglot. Also that if the foundations are there, switching between technologies is not hard. Looked again at architecture for scale.
Anyway, I could write a book about this journey. Or I could just continue coding.
> Setting up is something that fully fledged developers still battle with.
Amen to that. I have made a perfectly successful, 10+ year career of development in PHP with some JS sprinkled in there. But I still do it in FileZilla and Notepad++. Lately, I have been trying to learn about newer technologies, so I have taken up learning JS more seriously using WebStorm and Python using PyCharm. I've been picking up all kinds of things bit by bit, but in the last few days in particular I have taken up a small project and gone down a road of:
* Python
* PyCharm (and IDEs in general - anything beyond syntax highlighting)
* React
* Node
* npm
* Babel
* Git
* Github
* Gulp
* Browserify
* File watchers in PyCharm
* Bootstrap
* React-bootstrap
* sqlite
* Database integration in PyCharm
* jQuery (the only thing on this list I have any previous experience with)
* Moment.js
In the time it has taken me just to understand how all these things work together and the way I need to use them, I would've been able to fully complete the project I am working on if I were just using PHP and FileZilla, hah. As it stands, I'd say I'm about 5% in to it.
Absolutely worth it though. I've made many revelations along the way and am definitely seeing how amazingly useful these tools (the IDEs in particular) can be once you have a grasp on them.
I started getting trapped into what you're complaining about, and now say it's worth it, but what happens when something better comes along or something gets abandoned? What happens if three of those things on your list become a problem? Is it a house of cards?
My company always followed this statement I made up years ago, "Everything should be as simple as possible", but we started bending to large contracts that insisted we use "just one thing", some framework or library or tool that they used and we found no use for. That made it "easier" for the next client to say, "Well, since you use that, you can use this", and all of a sudden we're getting bogged down wondering which tool is the best, learning those tools, keeping track of the latest changes, dealing with incompatibilities, integrating and interfacing, and on and on.
All of a sudden I realized we had a list of things we had never worked with before, or even needed, for our small $2.x million web dev company, and it felt like we spent far too much time dealing with them.
So we were in a meeting, a meeting we put together just to solve the updating and integration problems, when it hit me what was going on and I yelled, "STOP!". It was a "wtf are we doing?" moment.
We had lost our direction. We had almost gotten to the point where we couldn't nail two pieces of wood together cause we didn't have an air hose for our pneumatic nail gun.
(I just realized I'm writing a book here so I'll get to the end.)
We went back to the fundamentals. We're a web dev company and we use ONE programming language, ONE source control, with HTML/CSS/Javascript. Almost always the tools we need are available on FreeBSD/*nix. We're more relaxed, faster, quick to adapt, and if a customer really insists on using something we don't, then it's either on them to deal with it or we hire a freelancer but then treat him as a black box interface to that with no contact otherwise.
Agree totally on the struggle being worth it. I'm not really complaining (at this moment, anyway, hah).
It is amazing how fragile it makes everything seem. I think a lot of those tools are safe, but yeah seemingly at any moment something could be abandoned or there could be some other drastic change in the chain.
Example: I can't even remember the specifics of this, but somewhere along the line I was having a hard time working with some npm module and while searching for answers I discovered multiple sources saying the module was "blacklisted". This eventually led me to an easy working replacement, but I still have no idea what the "blacklisting" means - it was still available on npmjs.com.
Also, building my Gulpfile.js was very interesting (and time consuming!). Seemingly so many different ways to approach it and I'm honestly not even quite clear on how I got it to work right so hopefully that doesn't break on me.
The only part of all this I expect to act on at my actual work, for now, is the IDE. I think I could really benefit from getting myself up and running on PHPStorm instead of Notepad++/FileZilla.
It goes for lots of technologies. There are a few dozen filesystems around and new ones being developed all the time. But I say XFS was pretty good in the 90s, it's pretty good now, and I don't see the need to revisit the "what filesystem shall we use" question 'til at least 2030. Let's get some real work done instead.
That seems only to be a problem for native English speakers. Programming, coding, hacking, tinkering, etc... is all pretty much the same. Also, personally I think it has an elitism smell ('software engineering' is the worst). We shouldn't pretend that a 'professional software engineer' fresh from university can in any way produce better code than an 'amateur' who's been at it for 20 years, because in my experience the experienced amateur is always a better programmer/coder/whatever...
I don't agree (and I don't agree with the whole cooking analogy). Writing code that is readable, maintainable, and simple yet high-performing is a very important, if not the most important, part of the whole craft. We don't need high-flying words like 'engineering', 'art' or 'architecture' for writing good code. A great high-level design is worthless if the actual code is an unreadable mess, and I have seen plenty of examples of this in the real world.
"Software engineering" certanly feels better but I think generally our work is closer to a craft than an engineering practice; imagine if we built bridges, tunnels and planes as we build software - catastrophes every day!
As someone who learnt to code as a kid, then studied and worked in civil/structural engineering, then ended up later on back in software, I do find comparisons (somehow always involving bridge analogies) between software engineering and 'traditional' engineering to always be off somehow.
Most traditional engineering projects are not magical bastions of rigor and certainty. It is really only in very large budget or critical projects that a lot of analysis or rigour comes into play to create that certainty.
At least in the civil/structural world you'd be surprised how much comes down to just gut feel by experienced engineers, who then throw down some very quick calculations (budgets don't allow for much more than that) to back up their choices. Most numbers are looked up in tables. They then hopefully get run past another experienced engineer's gut feel for validation, then get signed off. And most of the time the public agency will just accept that uncritically - as they no longer have the resources to double-check designs themselves.
The main difference with software as I see it, is that software is binary and the real world is analogue. Civil/structural engineering standards have factors of safety for materials and loading codes etc to cope with inconsistencies or minor mistakes/oversights or unforeseen circumstances etc. Even most failures are not catastrophic and can be detected and fixed before a catastrophic failure eg things can yield or crack rather than snap.
Software being binary though means for a certain set of circumstances it either works or it doesn't. You don't have the luxury of factors of safety and just overspeccing components to be sure.
Thanks. You are right, the comparison is off probably on both sides.
As with critical civil/structural engineering projects, there are software development projects built with correctness in mind. I've worked on a few of those, mostly in aerospace and VLSI simulation in the late 70s til the early 90s, and the level of engineering was very high.
Fast forward two decades to the world of web development and corporate IT and, wow, the lack of rigour is simply abysmal. I can imagine the same being true in the housing/building sector.
"software engineering" feels broader, like it includes knowing the right aspects of project management and risk for the given project and business needs, beyond just getting something to work.
I'm not a fan of that term. It sounds so stiff and formal. When people ask what I do, I make it a point to always say "programmer" rather than "developer" or "software engineer".
For me, coding is the act of physically working in my editor. Software development is the whole thing - coding, writing docs, troubleshooting, support, etc.
This sounds like a statement from the 90's when 'software architects' and UML were still a thing. Thankfully we are over this. "Designing software" away from the code only works in theory. You will only find out whether your design works during coding and testing, or even later when the software is already in the hands of the user. The only way to good software is through constant rewrite. Make a rough plan, but be prepared to throw it away with the first line of code. Making up the recipe while cooking sounds like a perfectly fine strategy to me.
Thank you for saying that. The number of times I've written something down, coded it, and then realised a better way had me thinking I was missing a lot.
Although don't get me wrong, there is still a truckload I need to learn just to finish my current project! (Although finally I feel like I'm getting somewhere!)
That's a good sound-bite statement but it does not really follow from the original post, nor does it seem to be very accurate. What exactly is the relation between a chef's work and a programmer's, and what is the equivalent of a recipe in programming?
Both professions involve applying techniques, management and processes to ensure that a product is created to a predictable standard. I believe there are numerous significant practical similarities between the two professions. Experienced individuals may be able to take an idea directly to code, but for software projects developed by teams of mere mortals, there are well-established engineering processes that are shown to give a project the best chance of success.
> what is the equivalent of recipe in programming?
Probably, a chef's work is not that similar to programming. Construction would be a more fruitful analogy. Note the use of construction-related language in the software world, e.g. "development", "architecture".
That's true when programming for a job or otherwise trying to get serious work done.
But programming can (and also should) be play. For me it's explorative and playful, I'm just trying to make a fun thing happen or see what things I can connect together in what interesting ways. I imagine it's the same for kids taking their first steps by making lights flash with a RPi and Python and then wondering what they can do next (led by imagination).
The latter is very much like being in a kitchen and throwing ingredients together, trying things out to see what tastes good.
I think limiting one's thinking of programming to the former sort devalues it.
I agree. While learning, it is (or should be) playing with the code and having the most fun; somewhere along the career path you find yourself being an engineer first and everything else second.
Well, good chefs do, it's called author's cuisine I believe. Someone, somewhere, has to come up with some recipes - and hold on there, I know it sounds like crazy talk, but why wouldn't these people be chefs, who already cook for a living?
Asking for mentoring and asking someone to do your homework are two different things. There always have been and always will be freshmen who attempt to shortcut their way through classes by trying to make someone else solve their homework. It is and should be frowned upon.
Politely asking smart questions about how to progress in your homework, on the other hand, is just fine.
From my experience it's the channels that don't have a lot of active members, maybe like 10 or 20 people logged in at all times. The value of conversation is increased majorly in those channels.
Are you seriously coming into a thread to ask what it's doing on HN, then in that same post you start talking about something completely unrelated for the sake of bringing up the subject and stirring up trouble?
You're not a misogynist, you just seem like a bit of a prick. Your instinct wasn't telling you you'd be downvoted, it was telling you not to post something utterly meaningless.
Anyway it's on HN because it's interesting. The perspective of a fresh coder is certainly interesting and this is a better written account than you commonly find on the web. It was a good read, a bit short maybe.
As someone who teaches programming, this was an interesting read for me, because it helps me understand the issues faced by people who are just trying their best to learn "software", but get stuck in lots of things including "Jargon".
Well, perhaps some HN users are interested in computers/tech but are not well-versed programmers. For them it might help. Furthermore, people who do believe they know how to program may identify with the story, hence it's an interesting read to some.
In college, I experienced: 1, 2, 3, 6, 8
and between 12 and 14 y/o I experienced the other numbers.
And she basically stated the article was about her building confidence in one of the first few sentences. So the article/author is quite upfront what this is about.