I've been programming for years, and I still find pieces of it hard. And it's almost always the dressing: the fundamental algorithm and approach get solved, and you think you're done... but then having to deal with all the "frameworks" just to make a site or an app becomes an exercise in pain. And those were supposed to be the easy parts, right?
I have to admit, I do long for the days of PILOT, where I could tell the Turtle to just turn left and go, and it would just do it. No windows, no nested frames, no viewports: just describing what I wanted done and having the machine do it.
(Yes, I know that a modern app, say the machine-learning, Bayesian-based Twitter sentiment analyzer that mashes up weather forecasts, stock movements and movie ratings, parallelized and built with enterprise-grade user rights management, is a long way from drawing a rotating box... but for the sake of argument, wouldn't it be nice to just say "do it" every once in a while?)
Maybe I tried to bite off far more than I could chew, I don't know. I guess the trouble is that so much of web development relies on prepackaged tools that the ramp-up learning curve is significantly more challenging than merely learning a single language's vocabulary. As a relative beginner (albeit with prior exposure to programming concepts), it took longer than I expected to get to my current point. Web development isn't just programming; it involves learning a new environment, a whole set of concepts and issues, and a toolbox that just grows and grows and grows... I've started storing my knowledge in a Freemind mindmap just to keep track of it all. I even save my IRC chatlogs whenever I manage to get advice. (If I had a more experienced partner to pair-program with, things would probably go a lot faster, but I'm mostly on my own.)
Now I'm looking at websocket libraries and other Node.js frameworks like DerbyJS and Meteor for new functionality, and it's analysis paralysis all over again. It never ends for a beginner... I miss programming simple stuff.
It never ends for a programmer. Constant learning and reinventing is what defines us. It gets easier to pick up new stuff, but you never end the process of picking up new stuff.
That cycle of layering (abstraction, addition, complexity, and then another layer of abstraction) has been going on since we invented the programmable computer. Machine language got us started, but it was a bit on the hard side, to put it mildly, so we came up with assembly language. That was fine on simple chips with small instruction sets, but then features were added and platforms multiplied, so we abstracted away the complexity with C. Then we started to solve problems for end users, so we added text- and graphics-mode libraries and the whole thing grew again. Then graphical OS's came along and suddenly object orientation and event handling made a lot of sense, and we evolved various languages to do that, including C++. All built on the layer below and the layer below and so on; underneath it all we're directing electrons with little gates made from transistors.
The point is, everyone at any stage of the evolution of computer programming has wished for what you want: a simpler, more expressive language with less 'busy work' (for want of a better term.) Someone creates it, we're happy for a bit and we make amazing things with our new toys, then the pain starts again as the more visionary people among us (like your good self) start to think about what problems they could solve if only they didn't have so much dreadful housekeeping to do and all this stuff to wrangle.
Think about what your world - the web - is built on!
The tension between simplicity, abstraction, and just throwing it all away is fascinating and ever present.
"If people do not believe that mathematics is simple, it is only because they do not realize how complicated life is." - John von Neumann
"Any darn fool can make something complex; it takes a genius to make something simple." - Albert Einstein
One point the speaker, Rich Hickey, argues is that we have a bad habit of focusing too much on our experience sitting at our desks ("look how quickly I can do this one thing") at the expense of complexity in the code.
It's a short view as opposed to a long view, since over time your project inevitably becomes larger and more complex. When you're trying to add a feature to a large, possibly mature product, you're seldom doing the same kind of work you see for a framework or language's demo code. Complexity will dominate everything else, to the point where it probably won't matter as much how easy it is to change the color on a div or whatever.
That said, if you can isolate complexity behind an API (not just hide it, but truly abstract it), it's probably better for your software. The problem is that software only seems to get more complex over time, and after a while the framework that ostensibly abstracts adds its own complexity -- complexity incidental to the problem at hand.
Anyway, I sympathize. :) As I see it, programming as we know it tends to involve gluing together frameworks and APIs more often than writing raw code. It's a huge stack of abstractions, and I wonder at what point it's futile to try to understand more than a certain subset.
That said, some people like being able to write code which ships almost instantly ~everywhere, or to a great many tiny computers in people's pockets. It's not all bad, right? It could be simpler, but it just isn't, so the decision is to take it or leave it. (Or try to replace it, but now you have two problems and/or N+1 frameworks.)
I still remember a distinct moment when I was 20 or so that everything magically clicked into place in my mind. Suddenly I grokked the nature of abstraction and what it meant to translate a mental model into a computer program irrespective of the underlying language. Trying to put it into words always sounds trite, and it's not as if I was magically a better programmer—more like the beginning of a journey, but it was a clear point where I feel like I crossed the steep part of the learning curve, and it was all downhill (at least on the first derivative) from there. I'm not sure if anyone else has ever experienced it this way, I suspect this type of (seemingly) grand epiphany is more likely to happen when you're younger and the brain is more plastic, or maybe I just took too much LSD. In any case, I'm convinced programming belongs to a class of skills (perhaps music is another) that require some higher level foundational concepts that do not occur naturally in most common human experience.
But funnily enough, I find your description extremely accurate: I suddenly grokked the nature of abstraction, the beginning of a journey.
For me when learning a new programming language, I find that I move through three important phases. First I'm copying what other people are doing and consulting books and manuals constantly for everything I do. After a while I'm consulting other things I've done myself and imitating those. Then finally I get to where I'm not consulting anything and the code is just flowing, often with my own interesting twists.
Incidentally, feeling like it's too hard for you and that you're not smart enough just goes with the territory. Also try to resist the temptation to feel like it's all very easy for everyone else. I find that the feeling of failure is often at its worst right before I make a breakthrough. After about the fifth round of thinking "this is too hard for me, I can't do it" and then doing it anyway, I realized this is a pattern for me and learned to ignore the feeling that it's too hard.
After learning Python for a couple of weeks, I found that problem disappeared. With Python, I can just THINK and then write what I'm thinking, which is a huge upgrade for a newbie like me.
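To give a trivial example of what I mean (my own, nothing fancy): a one-line list comprehension reads almost exactly like the thought behind it:

```python
# "Collect the squares of the even numbers below ten."
even_squares = [n * n for n in range(10) if n % 2 == 0]
print(even_squares)  # [0, 4, 16, 36, 64]
```

The sentence and the code line up almost word for word, which is exactly the "think it, then write it" feeling.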
I'm going to stick with learning design (including jQuery stuff, which I didn't mention but have already learned and actually implemented to some extent) and Python, to limit the scope of my learning and progress faster toward some degree of mastery.
But here is a fair warning: this stuff is hard because it's genuinely tricky. Computers are unforgiving in a way the human mind can never be. So be warned: there are other walls like the one you just vaulted over, and the other side of them is worth reaching. But, it will feel like what you just experienced all over again.
One famous wall my college, UCSB, used to weed out students between the lower and upper divisions was pointers. Recursion also really shocks some people. Some people get distracted by the high-level modeling used to build systems. Some have a really hard time learning how to incorporate how the computer actually works into their programming model.
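For anyone reading along who hasn't hit the recursion wall yet, the classic minimal illustration (a sketch in Python, since that's the language mentioned elsewhere in the thread) is a function defined in terms of a smaller version of itself:

```python
def factorial(n):
    """n! computed recursively: shrink the problem until the base case."""
    if n <= 1:                       # base case: stops the descent
        return 1
    return n * factorial(n - 1)      # recursive case: same problem, smaller n

print(factorial(5))  # 120
```

The conceptual hurdle is trusting that the nested calls unwind correctly; once that clicks, the wall is behind you.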
I don't mean to discourage you! I mean to say that you are not alone. Many people find this stuff hard and work diligently to learn it throughout their careers. It's a fact often lost in the culture of dominance and bravado that has grown up around the computer science world.
That's when I realized the answer. It will never be "easy" in the sense that learning these new elements will always be difficult in some way, but there will come, and is already starting to come, a time when I can do it in a way that fulfills me while I'm doing it.
The trick to staying motivated is to directly acknowledge its difficulty. Then proceed to learn it anyway.
Things stop being hard when I know enough so immediate satisfaction is possible. Then things seem easy.
+1 for self-awareness progress.
Keep in mind:
1. We are not scientists. We do not use the scientific method to discover what is already out there. Our creations didn't exist before we wrote them.
2. We are not engineers. Our creations are not bound by what science discovers. In some rare cases a discovery might be a requirement, but it is never a binding constraint. This is a curse and a blessing.
3. We are closer to mathematician-painters. Not because we use math (that is irrelevant to this definition), but because we think in symbolic terms, which is the essence of mathematics. (Incidentally, this makes it arguable that IP over algorithms is like IP over colors.)
4. Society is being built on top of it... and "they" are not very aware of it yet.
By the way, I totally agree with you, but by your definition the above seems like a logical conclusion to me.
I don't believe that our solution space is limited by the hardware we run it on, because our solution can be intended to run on different hardware, and when it's adequate enough it should produce the same results.
Let's take math as an example.
Let's take the axiom a + b = b + a.
We know that the expression is true in its essence, regardless of the alphabet we use to express it. One could argue that this expression does not make sense to a Chinese person because the Chinese writing system does not have the letters 'a' or 'b'.
But we know that if we use valid Chinese characters, the concept is still true.
You see? It is the concept we express that remains valid, not the symbols we use to express it. And this allows us to talk about commutativity regardless of the language. The symbols are used to communicate an idea that is beyond the meaning of any particular symbol.
Same thing with software. Software development is not about the computer it runs on, any more than astronomy is about a particular telescope. Thus we need to separate the software solution from its implementation environment. The software is not bound by the architecture; a particular implementation is bound by the hardware architecture and the developer's ability to think outside the box, but the software itself isn't (which is why we can do things like calculate the big O, which is hardware independent).
Like in math, a particular implementation is just one way to express that software on that hardware using a particular programming language. For different hardware (Chinese, in the analogy) you will use a different implementation of the same software concept.
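One concrete way to see the hardware-independence claim (a sketch of my own): big O counts abstract steps, not seconds. A linear search performs the same number of comparisons on any machine, and that count is exactly what the O() describes:

```python
def linear_search(items, target):
    """O(n) search: the number of comparisons depends only on the input,
    never on the hardware the loop happens to run on."""
    comparisons = 0
    for index, item in enumerate(items):
        comparisons += 1
        if item == target:
            return index, comparisons
    return -1, comparisons

# Found at index 4 after 5 comparisons, whether run on a phone or a mainframe.
print(linear_search([3, 1, 4, 1, 5], 5))  # (4, 5)
```

A faster machine makes each comparison cheaper, but never changes how many there are; only a different algorithm (or genuinely different hardware model) can do that.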
I cannot address how or why it is the logical conclusion for you, but I can tell you that it is not for me.
Quantum computing may change the O() of some functions: maybe some O(n^2) problems become O(n) on a quantum computer, or NP problems become P; I'm not familiar enough with quantum computation to know for sure which classes are known to be reduced to smaller classes. But in principle you can always simulate a quantum computer on a classical computer given enough time -- of course, in practice "enough time" may be many orders of magnitude out of the realm of possibility. So quantum computation isn't a "new" form of computation as far as the Church-Turing thesis is concerned.
There are purely theoretical machines called "oracle machines" -- exactly like a regular computer/Turing machine, but equipped with a (theoretical) hardware device called an "oracle" which is able to solve some problem that is known to be impossible for Turing machines, like the halting problem. An oracle machine can be more powerful than a Turing machine, but its violation of the Church-Turing hypothesis is sort of a technicality.
Saying "Build magic hardware that does something a Turing machine can't, and attach it to a computer" is too vague about how the non-Turing computation is to be achieved to really constitute a fair definition of non-Turing computation.
To me it stands to reason that different hardware has different capabilities, and if an implementation can use those characteristics then the implementation can have a different O. But my point still stands: as a theoretical concept, the O does not depend on the hardware, in the sense that the hardware is not a necessary piece of information for calculating the O.
Also, I would like to point out that in my original definition the math-painting has a purpose of utility for society. To build magic hardware would be like telling Picasso: here is an infinite canvas and an infinite range of colors, what can you do with them? The magic is not in the resources or the tools; the magic is in thinking independently of those tools, and then using the tools to implement. I understand that people might come from the other end of the spectrum and think we need to begin with the tools; I just don't share that view, and mine has worked for me just fine.
If you're intending to be a tech startup, why don't you code something useful for yourself, like something that turns an Excel spreadsheet into HTML tabular data?
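That particular exercise is also a nice size. A minimal sketch of the idea in Python, assuming the spreadsheet is first exported to CSV (reading .xlsx directly would need a third-party library like openpyxl, so this sticks to the standard library):

```python
import csv
import html

def csv_to_html_table(path):
    """Render a CSV export of a spreadsheet as an HTML table,
    escaping cell contents so markup inside the data stays inert."""
    rows = []
    with open(path, newline="") as f:
        for row in csv.reader(f):
            cells = "".join(f"<td>{html.escape(cell)}</td>" for cell in row)
            rows.append(f"<tr>{cells}</tr>")
    return "<table>\n" + "\n".join(rows) + "\n</table>"
```

It's small enough to finish in an evening, but it touches real concerns (parsing, escaping, output formats), which is what makes it a good beginner project.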
The reason I wrote a simple game is it was an exercise I thought of to do that seemed hard. I did it quickly, correctly, and in my opinion cleverly. It made me happy. For the first time, my technical co-founders also agreed I had written the code the way they would have written it.
I disagree. Games are a great starting point. Creating your own world where the user can immerse themselves and make a variety of choices is way more fun, on a visceral level, than converting Fahrenheit to Celsius or using jQuery to put a few special effects on your website.
That fun is part of a reward feedback psychological loop which helps get you motivated, and the motivation helps you study harder and get past frustration. So the positive effect on the learning process may be large enough to justify such an "impractical" application.
> games have no practical use
Let's not forget that computer/video games are a large, visible and highly successful part of the entertainment industry that essentially didn't exist before the computer era. We could also say that art, music, movies and fiction "have no practical use" -- but a large segment of humanity still finds them compelling enough to spend enormous amounts of time, money and other resources creating and consuming, and these things are widely considered to be a core part of what makes us human.
> beginners need to focus on the pure code concept
Depends on the person. I would argue that many people learn better when they can apply their learning to a non-trivial problem. This is why introductory university science courses have labs. This is why classes at all levels have large projects in addition to the tiny problems on homework and exams. In fact, an inability to apply your learning to a non-trivial project implies that the learning process is incomplete.
> requires spending attention on entirely orthogonal concepts
Most projects in the modern world require interfacing multiple languages, libraries and API's. Any web app, for example, will likely have -- at minimum! -- parts in HTML, JS, CSS, database, backend language, and a template engine. That's six orthogonal components.
If you're going to be dealing with multiple orthogonal concepts in the "real world" of programming, the "academic world" of programming (whether a formal college curriculum or just self-learning) would be remiss if it didn't give you some preparation for that aspect through exposure to projects involving integrating different technologies.
Even single-language projects involving only core libraries often require putting together different API's (for example, many applications can be roughly divided into dealing with files, dealing with a user interface, and internal core functionality.) In fact, not being able to modularize a large problem into manageable pieces that mostly treat other pieces as black boxes is a recipe for failure of any and all attempts at large projects.
While you will have to learn multiple API's and libraries to do anything money-generating, I think that's not an efficient route when first learning programming... why not do a simple project that will be immediately useful to you as soon as you complete it?
I remember understanding the DOM for the first time, understanding server side code for the first time, chaining methods, JOINs, scope, regexp, MVC, a lot of individual things.
My most practical advice is that you ignore hot new frameworks and languages for now and stick to what you know until you know it much better.
But, as soon as you feel comfortable with what you're doing you should take the next step.
I'm really getting tired of this "programming is hard" meme. It's not.
Maybe programming is not hard -- for you. Maybe software development is "as intuitive as it gets" -- for you. But that is an extreme generalization that's simply not fair.
Programming is not trivial and something that everyone should by definition be able to intuit. It requires a very specific and logical way of thinking. It requires intellect. It requires practice. I'd venture to guess that a vast majority of people out there couldn't "just figure it out".
Programming requires interest, practice, intellect and a very specific way of thinking. It can be taught, it can be learned, and it can even be mastered with enough effort (by some people). But let's not dismiss the hard work of the OP and of the many people out there trying to learn (or, frankly, failing to learn) by calling it intuitive and easy.
Programming really doesn't come naturally to everybody.
Take a very basic task like "Apply this text replacement to all .txt files in this folder and all subfolders." If you are a beginner programmer, in Python for example, you have to search for the right module and get a basic understanding of how it works. Even though you never run into any conceptual difficulty, this can still take a lot of time, and you will make strange errors along the way.
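Once you do find the right pieces, the code itself turns out to be short; the search for those pieces is the slow part. A sketch of that exact task with the standard library (the folder path and the old/new strings are of course placeholders):

```python
from pathlib import Path

def replace_in_txt_files(root, old, new):
    """Apply a text replacement to every .txt file under root, recursively."""
    for path in Path(root).rglob("*.txt"):   # rglob descends into subfolders
        path.write_text(path.read_text().replace(old, new))
```

Which is precisely the beginner's frustration: three lines of final code, hiding hours of discovering that `pathlib` exists, that `rglob` recurses, and that strings have a `replace` method.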
I will admit that many beginners find programming hard and counter-intuitive. This fact, however, is a reflection of the utter failure of our education system to teach a few important things:
Learning on your own. A K-12 curriculum is highly structured and guided, especially K-8. Consulting resources outside the curriculum and figuring things out for yourself is a valuable life skill that's seldom taught in a typical classroom.
Thinking rigorously. A typical curriculum involves memorizing facts, not figuring out the consequences of known facts. Knowing something well enough to make detailed, testable predictions simply isn't taught. (When I learned the scientific method in school, it was simply memorizing the list ["Formulate hypothesis", "Test hypothesis with experiment", "Write paper"]. Even intro college physics was basically applying known theory to measure numbers in the lab, more engineering than science.)
Tenacity. In school, when you learn your answer to a question is wrong, it happens at the end of the learning process, and it means you've failed at mastering that particular tiny corner of the curriculum, but need to move on anyway. However, in the real world -- particularly programming -- when you get a traceback or incorrect output, it's the beginning of a dialog between you, the language, and your mental model. Making a mistake doesn't reflect on your ability until you give up on fixing it. People simply don't realize how humbling programming is, in terms of the sheer number of mistakes that even the best programmers make. (Beginners don't use or can't yet understand tools like IDE's which instantly notify you of many errors -- especially syntax errors -- and incorrectly highlight many others, like missing quotes.)
Differential diagnosis. "What I expect isn't happening. Situations alpha, beta, or gamma might conceivably be happening based on my mental model of how things work. I already observe y, which eliminates situation alpha. Now z would be different between situation beta or gamma, so I just need to print z out and rerun...aha! Now that I know z, I can eliminate gamma, so the problem must be beta..." This sort of intense critical thinking is absolutely vital to functioning in technical subjects, and also enormously helpful in everyday life. But it's a skill that has to be trained, and our education system completely misses it.
That was just quick and dirty.