My problem with learning electronics, and, to a lesser extent, electricity, was that most of the guides took an ad-hoc approach, giving "rules of thumb", recipes, etc. without really going into the reasons behind them. They would start off with an (imo) overly technical explanation of quantum effects, then jump to the more fundamental Ohm's law, etc., then jump into all the tips-n-tricks of circuit design.
For me, the two major factors in learning electronics were getting enough math sophistication to do calculus and linear algebra, and being able to program (microcontrollers). Calculus and linear algebra give you the tools for 'passive' circuit analysis, and once you realize that most 'practical' electronics nowadays is basically routing power and signals, being able to program is the "meat" of it.
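To make the linear algebra connection concrete, here's a minimal sketch (my own toy example, not from any particular guide): DC nodal analysis of a resistor chain reduces to solving a small linear system G*v = i, where G is a conductance matrix built from Kirchhoff's current law. The component values are made up for illustration.

```python
# Toy example: a 5 V source feeding R1 = 1k into node 1, R2 = 2k between
# nodes 1 and 2, R3 = 3k from node 2 to ground. Kirchhoff's current law
# at each node gives a linear system G*v = i in the node voltages.

def solve_2x2(g, i):
    """Solve a 2x2 linear system via Cramer's rule."""
    det = g[0][0] * g[1][1] - g[0][1] * g[1][0]
    v1 = (i[0] * g[1][1] - i[1] * g[0][1]) / det
    v2 = (g[0][0] * i[1] - g[1][0] * i[0]) / det
    return v1, v2

V, R1, R2, R3 = 5.0, 1000.0, 2000.0, 3000.0

# Conductance matrix and source-current vector from KCL:
#   node 1: (v1 - V)/R1 + (v1 - v2)/R2 = 0
#   node 2: (v2 - v1)/R2 + v2/R3      = 0
G = [[1/R1 + 1/R2, -1/R2],
     [-1/R2, 1/R2 + 1/R3]]
I = [V/R1, 0.0]

v1, v2 = solve_2x2(G, I)  # node voltages in volts
```

Since this particular circuit is just three resistors in series, you can sanity-check the answer with the voltage-divider rule; the payoff of the matrix form is that it scales to networks you can't reduce by hand.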
After understanding how to do passive steady-state circuit analysis, I briefly looked at how non-passive simulation (transistors, etc.) is done, just to see how it works (i.e., I learned how SPICE et al. do it).
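The core trick SPICE-style simulators use for nonlinear devices is Newton-Raphson iteration on the circuit equations. A minimal sketch, with a made-up circuit and assumed diode parameters: a 5 V source, a 1k resistor, and a diode to ground modeled by the Shockley equation.

```python
import math

# KCL at the diode node gives one nonlinear equation to solve for v:
#   f(v) = (V - v)/R - Is*(exp(v/Vt) - 1) = 0
# SPICE-like solvers find the root by Newton-Raphson iteration.

V, R = 5.0, 1000.0
Is, Vt = 1e-12, 0.025   # saturation current, thermal voltage (assumed values)

def f(v):
    return (V - v) / R - Is * (math.exp(v / Vt) - 1.0)

def df(v):
    # Derivative of f with respect to v, used for the Newton step.
    return -1.0 / R - (Is / Vt) * math.exp(v / Vt)

v = 0.6                  # initial guess near a typical silicon diode drop
for _ in range(100):
    step = f(v) / df(v)
    v -= step
    if abs(step) < 1e-12:
        break
# v now holds the diode's operating-point voltage (roughly 0.55 V here)
```

A real simulator does the same thing on a whole matrix of node equations at once (linearizing each device around the current guess), but this one-unknown version shows the idea.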
Anyway, I found the "Practical Electronics for Inventors" book to be one of the few books that was practical from the outset and actually went into the theory, even if only briefly, without assuming I would get frightened by complex numbers.
There's obviously a path that doesn't involve calculus, linear algebra and programming, because people do it and have been doing it for many years, but these were the tools that helped me understand.
I would also recommend not doing this in the abstract. Arduinos are, in my opinion, one of the better places to start: you can get an LED blinking within 5 minutes of unboxing. Adafruit has many tutorials, but they're more focused on using pre-built modules (and programming, to a lesser extent) than on the underlying theory.