I got a Radio Shack 50-in-1 electronics kit for Christmas when I was a little kid. I thought it was the coolest thing ever, but when I tried to go further with my interest, all I got was equations and theory. For years I just wanted to understand how a transistor, with only 3 connections, could be like a relay, which had two pairs of connections and made logical sense to me. Keep in mind this was pre-Internet, so resources were somewhat limited. In any case, I eventually gave up on hardware and focused my attention on programming.
It wasn’t until the advent of the Arduino that the spark was rekindled and I found a new on-ramp to electronics. Start with a breadboard and cookie-cutter circuits that you don’t really have to understand at first, and control everything with software, because you know software. Gradually phrases like “current-limiting resistor” and “bias voltage” start making sense. Watch people like Dave Jones and Big Clive take things apart on YouTube and reverse engineer them. The circuits get bigger and more complex. The next project needs an op-amp, and to your surprise, you can understand the theory now. The data sheet parameters make sense. You can pick out which 4 of the 5 answers on the electronics Stack Exchange are wrong, and adapt the right one to your needs.
I’m having a lot of fun making things, and to answer the question in the book’s introduction, “have you considered how electron collisions lead to Ohm’s Law’s linearity”, the answer is a resounding “no”. And if people hadn’t told me I needed to consider that to play with electronics, I might not have missed out on so many years of fun.
Thanks for the feedback. I had a similar experience starting with one of those 50-in-1 kits! And for me it didn't entirely click until I studied more of the theory and then looped back to the practical.
I've decided to mix both the theory and the practical here, and if "conventional academic textbook" is on one end of the spectrum and "Arduino" is on the other, then I'm aiming somewhere in the middle.
For a recent anecdote: this week, a coworker asked me at lunch, "How does the wall receptacle know to send more power to a 1500W space heater than to my MacBook charger?" And that's a great question. And in 15 minutes we ended up talking about fields and forces and resistive materials and the microscopic origins of Ohm's Law and hydraulic analogies... and somewhere in there, it seemed to start to click for him. And that foundation opened the door to his next question, "So why can't we hear the electrons colliding into the wire?" And that opened the door to a new topic: of course we can "hear" them if we just "listen" at the right frequencies, as analog electronic noise.
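That lunch-table answer has a satisfying back-of-the-envelope version: the receptacle holds a roughly fixed voltage, and each device's effective resistance determines how much current (and power) it pulls. A quick sketch, assuming ideal resistive loads and a nominal 60 W for the charger (both simplifying assumptions):

```python
V = 120.0  # nominal US mains voltage (RMS)

def load_from_power(watts):
    """For a resistive load, derive current and effective resistance."""
    current = watts / V       # P = V * I  =>  I = P / V
    resistance = V / current  # Ohm's law: R = V / I
    return current, resistance

heater_i, heater_r = load_from_power(1500.0)  # space heater
charger_i, charger_r = load_from_power(60.0)  # laptop charger (assumed)

print(f"Heater:  {heater_i:.1f} A through ~{heater_r:.1f} ohms")
print(f"Charger: {charger_i:.2f} A through ~{charger_r:.0f} ohms")
```

The outlet doesn't "send" anything: the heater's much lower effective resistance simply lets about 25x more current flow at the same voltage.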
I'd love to be able to communicate that sort of intuition on a wider scale than the lunch table!
I could only view it through a mobile device earlier, and now that I’ve had a more detailed look, I see how the simulation and experimentation aspect adds another dimension to the theory. I intend to work my way through the whole thing and see how knowing more of the math will influence my future tinkering.
This is clearly an engineering textbook, starting with theory and moving on to analysis, understanding a system through mathematical modeling. If you really want to know everything (or at least most things) quantitatively about a circuit before you build it, these are the tools you need.
The parallel skill set is electronic design, sticking components and circuits together to make things. As you build bigger and more complex things, you learn more rules of thumb about different components and how to combine them into something useful.
Both of these skills together are needed in some measure in order to really be good at electronics, but everyone is more suited to one approach or the other, and learns best when they lead with that approach and fill in with the other one. For myself, I tried to learn electronics with those 50-in-1 kits but just 'didn't get it' because I didn't know what the different components did. With first year EE under my belt it all made sense and from then on I was able to learn the tinkering side.
Knowing rules of thumb and how to "combine components" into something useful is not a parallel skill set to electronic engineering, it's a part of electronic engineering. Rules of thumb, models and reference schematics (from manufacturers or whatever) are pretty much how you come up with the first draft of a circuit.
Also, electronic engineering consists precisely of designing circuits. Quantitative analysis is a big part of it because without it you can't always know whether the circuit you've designed works according to specs (and in practice, yes, experimentation and testing plays a big role, but there are only so many prototypes that you can blow up before you run out of time, and only so many things that you can test for in a regular lab). The whole point of one's activity as an engineer is to combine components into something useful.
Sure, you don't sit in a lab drawing schematics and building things all day, because there are a lot of other activities that go into building a good product. There's a lot of validation work and a lot of analysis and a lot of planning. Some engineers focus their time only on one of these things, especially because they really are so complex, and so complicated, that you can spend a lifetime studying just one of them and you're still left with a lot of stuff to know.
But at the end of the day, designing electronic gizmos is pretty much what you do, regardless of what role you're playing there.
You can certainly make things by sticking components and circuits together without a thorough understanding of how they work. But the idea that engineering is somehow mostly about understanding a system through mathematical modelling before building it, while design is mostly about making things by sticking components together by intuition, is absurd.
The ur-book for EE is Horowitz and Hill's The Art of Electronics. It can be found online with a little scrounging or for $100+ on Amazon. H&H is extremely dense, fairly comprehensive and more like a handbook than an introduction. If you are very math-oriented maybe it will work for you, but be warned that this is over a thousand pages of formulas, greek letters, graphs, and subscripts and superscripts. It's... taxing.
As for a higher-level friendly introduction like taneq was talking about, there are a ton of resources that all cover small but essential parts. Unfortunately that means a ton of repeating things and difficulty in bringing everything together. Arduino resources are great, the adafruit/sparkfun articles/blogs are great, whitepapers from TI and others are great. I don't know of a single atlas to bring these together or say them in a single place, which sucks. I may try giving it a shot- I'm certainly math-dumb enough to understand how to translate.
The EEVblog and sparkfun youtube channels are excellent, particularly for PCB design. IMO PCB design is essential to transition from tinkering to a true hobby. Most sophisticated components only come in PCB-only packages. PCBs are far cheaper than breadboards, and mandatory for any project with more than a dozen parts. PCBs make debugging far easier. They're required for anything operating over a few MHz, and most digital stuff. Unfortunately the software still kind of blows: KiCad is the best, but it's still a huge pain.
I can't recommend electronics enough as a hobby! It's more intense than brewing beer, but the scene has blown up exponentially in the past two decades and is incredibly accessible. Electronics is more affordable than any other engineering discipline: PCBs are simple and incredibly cheap to order in single lots, and 80% of components can be ordered in single units. Compare e.g. metal prices, where bulk easily costs 20% of the small-quantity price; bulk electronic components still cost 50-75% of the single-unit price. Entire industries are dedicated to making cheap, simple modules that handle incredibly sophisticated tasks like location tracking, video, wireless communication, or battery power. You can do anything you can think of.
AoE isn't really supposed to be a textbook in itself. It was written for physics students, typically at the graduate level, who need to design experimental apparatus without a formal engineering background. It's not an ideal introduction for newbies, and unfortunately it's recommended for that role way too often IMO. But it's a great sophomore resource, so to speak.
What's needed is something between the Forrest Mims "cookbook" level and AoE... something that gives you the theoretical underpinnings needed to know what chapter in AoE to turn to. People who are interested in the RF and communications side have always had the ARRL Handbook as a resource, but that book has limited appeal to those who are more interested in microcontrollers and other electronics topics. This page looks like it might make a useful contribution there.
Anyway, it's a great resource, and was a big part of why I decided to do EE instead of CS. It's also got pictures; lots of pictures!
and for the specific example you mentioned, their section on the bipolar junction transistor is simple and accessible:
> The motto of this book is “one level deeper.”
Really we need both. Lots of people do learn by experimentation. There are (now) a lot of tutorials for them. But on the other hand there are people who really aren't satisfied until they've pinned down all the details - what is an electron really etc - and this is more suitable for them.
The more the merrier. We just need a bit of signposting too.
Fast forward to my 20s when I could afford to take a short sabbatical to apprentice for a practicing electrical engineer (self-taught as well). Within four months I went from basic breadboarding and soldering skills to designing high speed digital PCBs from start to finish and began contracting as an EE immediately. There was a little math but mostly just a bunch of specialized calculators for impedance matching high speed traces, capacitance, and length matching high speed buses.
I went back to software for the pay but when I designed a small control board last year, the whole exercise from inception to sending off to fab and assembly took four days for an 8 layer PCB. Between all the online parts databases (with footprints and schematic components!), reference and open source designs, and software like Altium and TopoR, electrical engineering has never been easier and involved so little actual theory taught in schools.
It hurts my heart that I haven't yet found a resource that just dumps people into the deep end with a proper focus on the engineering instead of the theory.
I think it's exactly what I need. I feel really lost about where to go with respect to electronics learning, and I think a mentor could help me a lot.
The next step is to realize that solid-state devices usually control current instead of voltage, but that we can add voltages, currents and resistances to the circuit that let us work with voltage in a linear region of a device's response curve. So our normalized 0-1 control corresponds linearly to 0-1 on the main circuit. From there, it's straightforward to build amplifiers where the control voltage or current is thousands or even millions of times smaller than the main.
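The "thousands or even millions of times smaller" part falls out of cascading: in the linear region each stage multiplies the signal, so overall gain is the product of the stage gains. A toy sketch with made-up stage values:

```python
# Three modest amplifier stages, each with a voltage gain of 100
# (illustrative numbers, not a real design).
stage_gains = [100.0, 100.0, 100.0]

overall = 1.0
for g in stage_gains:
    overall *= g  # gains of cascaded linear stages multiply

# A 1 microvolt input ends up controlling a 1 volt output.
print(f"Overall gain: {overall:.0f}x")
```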
After that it gets.. complicated. It took me 4 years of math and physics to finally understand solid state theory and be able to analyze large circuits by subdividing them into simpler linear sub-circuits for my degree. Then the really interesting stuff happens when we abandon all of that and convert to the frequency domain using the Fourier, Laplace or Z transform. So discrete signals have periodic frequencies and periodic signals have discrete frequencies. Which lets us analyze the transient and steady state portions of a signal separately and gain valuable insights about what a circuit will do.
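One half of that duality can be seen in a few lines: a sampled periodic signal concentrates its energy in discrete frequency bins, with bin 0 carrying the steady-state (DC) level. This is a textbook DFT on a toy signal, not a production FFT:

```python
import cmath
import math

N = 64
# Toy discrete signal: 2 V of DC offset plus a cosine at bin 5.
x = [2.0 + math.cos(2 * math.pi * 5 * n / N) for n in range(N)]

def dft(x):
    """Naive O(N^2) discrete Fourier transform, normalized by N."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N)
                for n in range(N)) / N
            for k in range(N)]

X = dft(x)
dc = abs(X[0])  # the steady-state (DC) component lives in bin 0
peak_bin = max(range(1, N // 2), key=lambda k: abs(X[k]))
print(f"DC level = {dc:.2f} V, dominant AC component in bin {peak_bin}")
```

All the signal's energy lands in exactly two places (bin 0 and bin 5), rather than smearing across the spectrum.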
Of course I've mostly forgotten all of that. One word of advice: keep your college textbooks. I still find concise explanations there that haven't made it onto the internet over 20 years later.
Edit: IMHO, the above way of translating between abstraction and application is the primary advantage of getting a degree from a university. For anyone young reading this, I highly recommend applying to the best schools you can. If you just settle, then you might miss out on the underlying theory behind each discipline. You'll want the theory later when you've forgotten everything like everyone else, because you'll be able to re-derive everything you've learned from first principles when you need it.
Also, for what it's worth, "Practical Electronics for Inventors" is one of my favourite books on electronics. It's well balanced between theory and practical applications.
1. if you disconnect it from power, you'll get a huge voltage spike
2. V = -L (d/dt) I
1. seems like a practical explanation. 2. seems like a theoretical explanation. But an expert can play with both, and (under the assumption of first-order linear ODEs for these primitive components, which is an assumption you can only make after the fact, so... bear with me for the following BS statement) they are both equivalent. (That is, I can't think of another relationship that achieves 1.)
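A quick numeric illustration of how explanation 2 produces explanation 1 (the component values here are made up): interrupting an inductor's current in a microsecond implies a huge dI/dt, and v = -L * dI/dt turns that into a huge voltage spike.

```python
L = 1e-3   # 1 mH inductor (illustrative)
I0 = 1.0   # 1 A flowing before the switch opens
dt = 1e-6  # current forced to zero in ~1 microsecond

di_dt = (0.0 - I0) / dt  # -1,000,000 amps per second
v_spike = -L * di_dt     # v = -L * dI/dt

print(f"Voltage spike across the inductor: ~{v_spike:.0f} V")
```

That kilovolt-scale spike is exactly why flyback diodes get placed across relay coils.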
Your idea of a "trick" has mostly to do with 1, I think. And it's an essential part of understanding an inductor, and it's also most likely how humans discovered inductors.
Maybe 'trick' is the wrong way of looking at it, I guess the real issue is that from seeing a single component in a circuit diagram you can't know which of the effects of a component is being utilised.
Otherwise I think your explanation is great. I can't claim to understand transistors super well, but your explanation touches on a key point for someone who is trying to understand the relationship between relays and transistors. Transistors can't do everything a relay can: with the relay's 4 pins, you get electrical isolation between the switcher and the switchee (and can still share a "ground" if you want to). A transistor cannot do that.
This might also be a high-level way to see why we need two "kinds" of transistors. Because suppose the electromagnet in the relay has some polarity, and only activates if current flows "north to south". Well, there are two possible choices of what side of the coil you tie to a pin on the switch, and what side of the coil you expose as the base/gate.
It's been said many times before but I'd like to recommend The Art of Electronics to anyone who's graduated from the YouTube School of Electrical engineering and has some motivation to keep going. It has some mathematical theory (which it insists is optional) but I've found it useful for filling in the holes in my education and has loads of practical advice like the EE mentor I've never had.
Like magnets and EMF, the force is explained by showing lines of force, but what are they, really? From what I understand, the force is energy exchanged via virtual photons between the magnetic poles.
I've also read that EMF can be explained by relativistic effects: length contraction changes the apparent density of moving charges relative to stationary ones, and since like charges repel, the difference creates a force.
Take all that with a huge grain of salt.
> Michael F. Robbins holds the S.B. in Electrical Science and Engineering and the M.Eng. in Electrical Engineering and Computer Science degrees both from the Massachusetts Institute of Technology. Mike is the co-founder of CircuitLab, Inc. and developer of the CircuitLab circuit simulation software used by universities, hobbyists, and practicing engineers in 196 countries.
It looks like an introductory book to electrical circuits with interactive simulation exercises built into the book. This should give you an intuitive understanding of how the math that was presented actually works at the circuit level, which, I think, is very cool.
More circuit knowledge = more need for simulation resources. Also serves as a pretty good SEO resource for CircuitLab.
There are plenty of great analog electronics tutorials out there already that mostly differ in that they don't point to CircuitLab's material. My personal favorite is Analog Devices electronics course.
I'll add that it's also great documentation / example material for how to use CircuitLab. In fact, that's how it started.
In many ways, circuit simulators are a power user tool, and so I started this project by just making a larger library of example circuits. It then became clear that I needed to put the circuits together in context, and the idea for the textbook popped up since the context many of our customers share (at least the hobbyist and education segments) is in learning electronics.
Now, I'm inspired by communicating in this part-theory, part-practical style that seems hard to find in conventional resources.
It wasn't until I learned current and voltage laws that the way current flowed actually made sense.
Analogies were super helpful too. A capacitor is like a rubber membrane that pressure can build up on. Voltage is pressure. Current is rate of flow. An inductor is a wheel in the pipe with inertia. A transformer is two mechanically linked inductors. A Mosfet is a pressure-activated switch. A BJT is a current-activated switch. A diode is a check valve.
It wasn't until I took Physics in high school/college though, that it all finally "clicked". All of electronics could be described quite well by those finicky electrons that repel each other. How a capacitor actually works. How a charged pointy surface draws the electrons to the point, sometimes enough to escape. How magnetic fields are actually produced.
Digital electronics was actually much easier especially coming from a background in computing. It's all logic, you don't really need algebra or calculus.
Anyways, I would like to offer this as a counterpoint and say that learning the fundamentals and physics made the world of electronics way more satisfying than always tinkering without really understanding.
I do think, though, that these topics are really quite difficult to understand and internalize. Even most practicing EEs, I think, struggle to really translate the physics and Maxwell's equations into their day-to-day work. It is just quite abstract. I know I still struggle with it.
The truth is most EE work is not so different from software where you can get by most of the time by reading datasheets, basic laws, following working circuit examples, and relying on what's worked in the past. You don't have to derive everything from first principles all the time. There is just a wealth of well understood designs out there to pick from. Additionally a lot of the work is plumbing and being careful, such as basic PCB design.
Unfortunately I could not go too far because I am missing an "electronics cookbook".
As an example, I wanted to build an internet radio off a Raspberry Pi, hooking up a small amplifier "chip" (a pre-made circuit with an IN from the RPi and an OUT to small loudspeakers).
There was a buzz from the loudspeakers, and I am sure an electronician (if there is such a word) would immediately say "you need a 1 pF capacitor here and a 200 ohm resistor there because this is a basic Schmidt-Landau-Trump bridge, obvious in loudspeakers".
I will never understand why a parallel resistor and a capacitor in series is the way to go, but would love to have a cookbook for such circuits, with some explanation such as "if it buzzes, increase the resistance to between 100 and 1000 ohm" - for the typical circuits one would make in IoT (say, a plant humidity detector where I have the sensor and the NodeMCU and need to know what electronic parts to use to hook them up).
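For what it's worth, the cookbook entry for that kind of fix is often a first-order RC low-pass, and its single design number is the -3 dB cutoff frequency f = 1/(2*pi*R*C). A sketch with hypothetical component values:

```python
import math

def rc_cutoff_hz(r_ohms, c_farads):
    """-3 dB cutoff frequency of a series-R, shunt-C low-pass filter."""
    return 1.0 / (2 * math.pi * r_ohms * c_farads)

# Hypothetical values: 1 kOhm and 1 uF roll off everything much above
# ~159 Hz; picking R and C moves the cutoff to taste.
f_c = rc_cutoff_hz(1000.0, 1e-6)
print(f"Cutoff frequency: ~{f_c:.0f} Hz")
```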
Which brings me to another way to learn, which is by hands-on tinkering and picking up theory where it is needed. I am pretty sure the guy that ran our labs didn't have a degree in electrical engineering, but he was amazing at building things and troubleshooting them. Building up intuitive knowledge and experience also takes years though.
I could never go far learning just theory because I seem to forget soon after I think I have understood it.
Studying/tinkering with existing electronics is fantastic -- there is so much knowledge in most products that I feel drunk with excitement when I see how something can be implemented far better and more efficiently than I thought. It is very interesting to see how different designers approach their problems, and it builds my repository of solutions I can implement.
This is no substitute for actually trying to solve the problems, though. I think only after you have really tried to solve a problem can you actually appreciate alternative designs.
Now, rinse, lather, repeat.
You're kidding, right? Electrical engineering from University and never heard of bypass capacitors?
Same experience in my university. I think the curriculum has parallels with the CS/SW dichotomy. CS programs are not meant to teach you programming. Similarly, EE programs are not meant to teach you how to build stuff.
In fact, the electronics for physicists course often had more practical material than you'd get from an EE course.
Similarly CS is required to have labs which force students to be able to program. If someone gets a CS degree but can't write a program, they either cheated their entire way through every programming assignment, or got that degree at some international university that would never be accredited here.
I have no idea how an EE curriculum doesn't cover bypass capacitors. They're in virtually every real-world circuit. Even if it's not an item that's specifically covered in a course, lab or seminar, there's no way you can put a real-world schematic on the projector and not run into one.
I can't point at a specific course I took where they were covered but I am sure everyone who made it to the third year knew what they were. I definitely remember talking about them extensively in at least three courses (Circuit Theory, Digital Circuits and Digital Instrumentation).
Let me summarize almost all my engineering lab courses:
"Design and build X" where X could be some kind of amplifier, etc.
The "design" part is identical to a HW problem. You already know in advance the circuit (one of the standard ones in the textbook), and you just need to figure out R/C/L values to get the desired output. Then you build it and show it to the lab TA who'll check it is actually behaving as desired.
This isn't a good lab assignment: It's just a theory problem masquerading as a lab exercise. After the first semester, any idiot can put the circuit together on the breadboard if they already know the circuit topology.
And yes, while I was there, the accreditation board actually reaccredited the program. And yes, they looked at the lab assignments we were getting.
After graduating, I visited my department a number of times, and I did give them the feedback that "your lab assignments are useless".
> And from what I've seen, it's a lot easier to get an engineering degree by being able to build stuff but struggling with the math, versus learning a bunch of math but not being able to build anything.
Definitely not the case at my university. If you struggled with the math, you'd get really poor grades. And we had a mandatory requirement to get a B in second semester circuits (and pass the final with a score of 8/12 or better). Until you did that, you were not allowed to take junior level EE courses. The labs were trivial, but the exams were tough.
The only time when not being able to build anything was a barrier was for a Senior Design assignment we all had. And lo and behold everyone either got an electronics book (notably not a textbook), or searched the Internet.
> If someone gets a CS degree but can't write a program, they either cheated their entire way through every programming assignment, or got that degree at some international university that would never be accredited here.
Eh. It's not so much that they couldn't write a program, but that they would forget the stuff fairly quickly. In my university there definitely was a fair amount of nontrivial programming required in some CS courses. But on the EE side most of the lab assignments were just trivial.
I would quibble a little with you on the EE/CS comparison (even though I'm the one who introduced it to this thread). IMO, EE is a lot broader a discipline than CS. Stuff that is considered part of EE: Electromagnetics, control theory, acoustics (believe it or not), information theory, semiconductors, signals (e.g. Fourier transforms, etc), power, electronics, and others. There are quite a few professions that fall within the aegis of electrical engineering but have little to no circuit aspect. This is less true of CS. So it is a tad bit more understandable that someone gets an EE degree but sucks at electronics.
I think this is not really possible in electronics as there are maybe too many cases (though some are probably quite common)
There are at least two O'Reilly "Cookbooks" for electronics.
The best way to do this as a hobbyist, if you don't want to read textbooks, is just to mess around and soak up information from blogs. There is an awful lot of "do this because it works" in electronics. Most of it is backed up by theory somewhere - e.g. why is 100nF so often used as a bypass cap value? - but really you don't care as long as you follow the recommended application circuit in the datasheet.
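The theory behind that 100 nF rule of thumb is first-order simple: an ideal capacitor's impedance magnitude is 1/(2*pi*f*C), so 100 nF looks like a low-impedance path to ground across the frequencies where typical digital switching noise lives. (Real parts add parasitic inductance and ESR; this sketch ignores both.)

```python
import math

def cap_impedance_ohms(f_hz, c_farads):
    """Magnitude of an ideal capacitor's impedance at frequency f."""
    return 1.0 / (2 * math.pi * f_hz * c_farads)

C = 100e-9  # the ubiquitous 100 nF bypass capacitor
for f in (1e3, 1e6, 10e6):
    z = cap_impedance_ohms(f, C)
    print(f"{f:>10.0f} Hz -> {z:10.3f} ohms")
```

At 10 MHz the part is a fraction of an ohm, which is what lets it shunt supply noise around the chip.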
I think your second example summarizes it quite nicely.
I am a physicist by education and an IT guy by career, so to speak. When my son asks me about basic physics I usually give him a simplified, slightly erroneous answer, good enough for him to go ahead. This is what I would love to have in electronics.
But as you say (and with which I agree), years of tinkering is probably what builds a cookbook in your head.
I tinkered with circuits as a kid in the ’60s and subscribed to Popular Electronics back then; it was my favorite magazine when I was 16 years old.
I always viewed electronics as only a hobby, kind of like being a Ham Radio enthusiast and went to college to be a mathematician instead. However, I had a great elective that covered how to build electronic apparatus for scientists. After all the years of Math, it awakened my old interest in electronics and I stayed in college for a second degree in EE.
Popular Electronics was a hobby magazine and it was a lot of fun. On the other hand, a book like this one is more like what one learns when studying for an engineering degree. It teaches at a level where you can really understand the circuits you see and gives you the ability to put together your own, not just wire up projects someone else designed. I wish the author good luck with it and look forward to future chapters. It could be a nice complement to Paul Horowitz’s The Art of Electronics.
By the way, I know that many HN readers are computer scientists. That’s what I ended up being, but having a deeper understanding of electronics than most of the other software guys has opened many doors for me.
I think interactive books are destined to go to the graveyard. I admire the effort and the medium in every other way, except for the aforementioned issue.
This is absolutely true for me
Studying electronics did not enhance my understanding of calculus, but maybe I didn't study it enough. On a related note, I only grokked logarithms when I needed them to implement a graphing module.
EDIT: see https://wiki.archlinux.org/index.php/List_of_applications/Sc... and, especially, https://wiki.archlinux.org/index.php/List_of_applications/Sc... for other simulators.
Side question: I would like to get to the point of being able to design PCBs which I could then have produced and populated in very small quantities, just as a hobby. What fields of knowledge should one approach before designing PCBs? Any good books? After that, I would need to choose between gEDA and KiCad. The latter seems to be quite a bit more popular, but gEDA's supposed UNIX-philosophy design (a suite of tools that the user drives through programming/scripting) appeals to me, so does somebody have any experience with gEDA to share (pros, cons, etc.)?
Another question, more on-topic: could somebody clarify in what cases we can simulate electronic circuits while representing values with phasors/complex numbers? As far as I understand there are some limitations: the representation is exact if all components are linear, or approximately valid if the AC parts of the signals are "small" compared to the DC parts. Am I mixing stuff up? What are some of the real-world situations in which simulating with phasors breaks down?
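On the phasor question: for linear components driven by a single-frequency sinusoid in steady state, each component becomes a complex impedance and the whole circuit reduces to complex arithmetic. Here is a sketch for a series RC voltage divider (values chosen arbitrarily); outside that regime -- nonlinear parts, large signals, transients -- the representation stops being exact:

```python
import cmath
import math

f = 1000.0  # drive frequency, Hz
R = 1000.0  # series resistor, ohms
C = 100e-9  # shunt capacitor, farads
w = 2 * math.pi * f

Z_C = 1 / (1j * w * C)  # capacitor impedance: -j / (wC)
H = Z_C / (R + Z_C)     # divider transfer function, output across C

gain = abs(H)                             # amplitude ratio out/in
phase_deg = math.degrees(cmath.phase(H))  # negative: output lags input
print(f"|H| = {gain:.3f}, phase = {phase_deg:.1f} degrees")
```

Small-signal analysis extends the same trick to nonlinear devices by linearizing around a DC operating point, which is where the "AC small compared to DC" condition comes from.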
I want to build my own electric car, well, convert an ICE car to an electric one. I have the fabrication skills but I want to learn more about of the _electric_ stuff (the motor, AC vs DC, the controller, differences in batteries, etc). Does anyone have a good resource for that?
I hope it gets completed someday.
When you flip that switch, the drift velocity of charges in (on) the bare copper is only a fraction of a millimeter per second, but the field propagates at the speed of light, between the wire and its return (mathematically convenient ground). The energy is around the wires in the fields. The only true ground is at infinity.
If you can get that drilled into your head from early on, it will be beneficial to high speed circuits and RF. Introductory books never do that.
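As an order-of-magnitude check on that contrast (assuming 1 A through a 1 mm^2 copper wire, with copper's free-electron density), the drift velocity v = I / (n*q*A) really is astonishingly small next to the field's near-light-speed propagation:

```python
n = 8.5e28     # free electrons per cubic meter in copper
q = 1.602e-19  # electron charge, coulombs
A = 1e-6       # wire cross-section: 1 mm^2, in m^2
I = 1.0        # current, amperes (illustrative)

v = I / (n * q * A)  # drift velocity, meters per second
print(f"Drift velocity: ~{v * 1000:.3f} mm/s")
```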
Thanks for the feedback. I agree so much with you about the importance of fields and keeping track of energy! And so much confusion about ground... "The only true ground is at infinity." is perfect.
When I get to inductance and capacitance, I'll definitely be looking at them from a field and energy perspective.