Not to mention that computer science and software engineering are such young fields that it seems unhealthy to take readily available abstractions as absolute givens. Everything stands to improve, even products and concepts that have been around for decades and that everyone uses.
Yes there is: watch someone else do it. In fact there are three ways to learn, as the saying goes: trial and error, copying, and insight. I'd be hard pressed to explain the difference between trial and error and insight, but I wouldn't confuse them either, because only one of them is painful.
This is not as good. There's a lot of evidence from neuroscience that this leads to an illusion of competence. That is, it's much easier to follow along through a sample solution than to craft a solution yourself. Until you craft the solution yourself, the knowledge won't actually be chunked as firmly in your brain.
I've seen this time and again, as a teacher, and it was mentioned explicitly in Coursera's Learning How To Learn course: https://www.coursera.org/learn/learning-how-to-learn
NOTE: watching someone is a good way to get started, but until you do it on your own, you haven't learned it deeply, and it may be difficult to recall in a real situation.
I agree that there isn't enough time in a life to learn everything you'd want to firsthand or from a low level of abstraction, but school should be a place to do as much of it as possible. Just my 2c.
Whether or not you succeed in solving it on your own, the attempt emotionally invests you in the problem and its solution, shows you what didn't work, and gives you a better handle on the shape of the problem. This lets you get more out of seeing someone else work out the solution.