I first encountered programming in the late 70s, when I learned Fortran in a college-level course. I did not have much opportunity to use it then, but I found it fairly straightforward. I hardly used it afterward, nor did I do any other programming, until the late 80s.
In the late 80s, I had to relearn Fortran to do numerical work, which I did for a couple of years. Again, I found it very straightforward.
After I moved to a new job (solitary university prof with no research budget) I needed to do some numerical work and could not get a (free) Fortran compiler. I learned C and could easily translate (in my brain) algorithms that I knew how to program in Fortran; it was not quite as straightforward ... but it was fairly easy. (In other words: C was not the best tool for the job.) Then I got a copy of Numerical Recipes in C ... and it was (at the time) as though it were written in a completely different language, one that was definitely not suited for its purpose. Thankfully, I got the results I needed and could put that behind me.
In the mid 90s, I learned Java to write applets for teaching concepts to students. Using Java was not pleasant: I found it rather verbose, a bit like programming with a straitjacket on.
I did not program for 10 years. Then, around 2004, I stumbled upon this language called Python ... and found that programming in it was incredibly straightforward. Fortran made numerical code easy to write (by design); Python made everything easy to write. It was so much fun that I picked up programming as a hobby.
Since then, I've learned a few other programming languages and found that having the right tool for the job makes the job easier (duh!) ... and that there are a lot more useful tools (read: programming languages) today than there were back then. So you don't have to be as good today as you had to be back then to write programs that actually do stuff.