the book Digital Design and Computer Architecture by Harris and Harris (a RISC-V edition is due out in 2-3 months; buy that one)
For electronic circuits choose Microelectronics by Behzad Razavi. Instead of Purcell, go for Engineering Electromagnetics by Ida; it is more intuitive.
And the first book you should start with is "Foundations of Analog and Digital Electronic Circuits".
Then you go for DDCA and Ida in parallel.
The list does leave me scratching my head; it is too broad ever to be accomplished. My recommendation is to redo the list: first find out what piques your interest in EE (digital hardware, analog hardware, control systems, or embedded systems) and focus your self-study on that concentration.
The way I see it with this plan you are setting yourself up for failure.
Edit: Removed the ditching recommendation, as I now see that material as relevant to OP's goals.
Horowitz and Hill is a LOUSY textbook. And, personally, I find its usefulness as a cookbook overrated.
Going through the Forrest Mims notebooks/cookbooks/etc. for a solution to your problem is generally a way better idea than Horowitz and Hill. The Mims stuff does a really good job of pointing out the pitfalls you're likely to hit as well as the basic pedagogy.
One might think that it's historically closely tied with the development of the radio and the telephone; see Tesla, Edison, Bell Labs.
The common wisdom about electromagnetics with regard to EMI and such things seems to be that it's pretty much black magic. Similar for inductor selection in SMPS design.
It's pretty much physics, so a good deal about radiation can be found in medical physics, for example, which is simply a different course of study with electronics as a sideshow, and maybe better as a consecutive program for sophomores.
Applied control theory for embedded systems
It is less math intensive and more intuitive and aims at folks with a software background like yourself.
Razavi's RFIC is a good one too, but that's really getting too specialized. Pozar is good for undergrad microwave.
The OP wants to study EE because he has a specific goal. My suggestion was that instead of trying to study everything EE focus only on those subjects that are relevant.
For example: if I were interested in robotics, I would not bother with digital, RF, analog, or even communication systems. I would primarily focus on Control Systems and Embedded Systems.
This is something that I've seen often in self study plans for software development - the "learn everything and then try to use it" rather than "learn what you need to start solving the problem... and start solving it."
This is where a university class (and degree) have an advantage - they've got a set of problems for the student to solve (homework and labs) and then take the student through learning specific knowledge to solve those problems.
This also shows what self teaching often lacks - those small problems that can be accomplished as part of learning how to solve the big problems.
In this case, though, I think OP's approach to this shows that they're serious about keeping with it, which is really cool to see.
Hard disagree. Much of the page involves what normally would be electives. You need exposure to some subset, but not all of it.
To give you an idea, my undergrad in EE did not require a course on materials (although it was an elective).
Everything in "Phase 2" was an elective - none was required (although many universities do require the "electronic devices" course).
Needless to say, if everything in Phase 2 was an elective, so was everything in Phase 3.
Also, when I look at pretty much any job requiring EE, and intersect it with the courses I took as an EE undergrad, I find that most courses are not needed. EE (and an EE curriculum) is often quite broad. For any given course, there are plenty of jobs that will need that course, but most EE jobs will not. If the submitter has some specific goal in mind, he won't lose much by skipping courses not related to that goal.
To give you an idea, when I worked as an EE, I had to use basic circuit theory 2 or 3 times, digital logic only once, and the physics of electronic devices a lot. The level of EM I needed was satisfied by high school physics, so I won't even count EM. Everything else I took: Control theory, communications, electronic circuits, power/machines: Never used it.
There are two ways to learn an existing technical-ish subject: you can spend a lot of time reading textbooks, then do some projects (the "slow-fast" approach); or you can dive in to projects and refer to textbooks when you get stuck (the "start-stop" approach). In the slow-fast approach you will go slowly through a lot of textbooks for a long time, and then in theory you will be able to do projects very quickly once you are done. In the start-stop approach you will start a project, quickly get stuck and spend a while searching for and understanding the answer, then go back to your project.
In my opinion electrical engineering, being a subject where fast feedback is generally possible, is very well suited to project-first learning. I would recommend grabbing a few textbooks (Horowitz and Hill's Art of Electronics holding pole position for a practically-oriented learner, in my opinion), reading their introductory material (table of contents, preface, etc.; enough that you know what each book has in it), and then setting all the books aside until you need them. Avoid books targeted at "makers"; most are fine but a sizeable fraction are written by people with no clue what they are doing, and they will actively set you back. (It is very difficult to learn from an author who does not themself understand the subject, and all the worse if they do not realize that they do not understand. Since there are plenty of better sources out there, it's little trouble to just avoid the whole class.)
Trying to work on brain-computer interfaces is challenging because it blends biology with electrical engineering. The biology will naturally drive things, because you cannot really control it like you can the electronics. So learning EE in this context is about two things: 1) What can I do with circuits? and 2) What do organisms behave like and respond to electrically? Your project is then using your knowledge of circuits to solve R&D problems relating to bioelectric signals.
This isn't easy (I think you know that), but the benefit is that you can quit with "just" EE skills and still come out ahead.
You don’t fall for marketing gimmicks.
Another thing is you know the relative price (ballpark figure) of the technology, as in how much it costs to make something, often just by eyeballing the actual product or by looking at its specifications. Sometimes this translates to more abstract and somewhat unrelated fields such as medications (if you read the patents and study them).
It's an easy leading indicator of who has their shit together.
It's probably at the same level of complexity, give or take.
Not to dunk on the rest of your reply, which I agree with, but there is a humongous overlap between your typical undergraduate physics and electrical engineering degree, and I think you're able to be successful because they're so similar. Academically, the required courses are mostly identical until your 3rd year and if you choose an RF, microwave, or semiconductor physics specialization it's just more of the same applied physics, so it would make sense you would easily be able to pick up the concepts necessary with experience.
New graduates in EE also can't do the things you listed :)
That was the spirit of my point, yes :)
More than once I've heard a coworker complain that "I wish I had learned about that in school instead of [filler course so useless that I forgot what they said]."
The Z-transform is much more related to the others than is clear at first glance. This post on transforms from The n-Category Café is fascinating, and my go-to for understanding what the Laplace transform really is, even if I don't quite grasp many things in the post. (And I also have a math degree! But not a graduate one in active use, as most of the people around there do.)
I understand some of these words... they're very familiar to me...
I'm saying this as someone who's dealt with the discrete and continuous time Fourier transforms, and Z-transform, and wants to get into Laplace transforms.
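To make the family resemblance concrete, here is a small numeric sketch (the signal and frequency are my own illustrative picks, not from the post): the DTFT of a finite sequence is literally the Z-transform evaluated on the unit circle, z = e^{jω}.

```python
import numpy as np

# An arbitrary finite signal, made up for this sketch.
x = np.array([1.0, 0.5, 0.25, 0.125])
n = np.arange(len(x))

def z_transform(x, z):
    """Evaluate X(z) = sum_n x[n] * z**(-n) for a finite sequence."""
    return np.sum(x * z ** (-n))

omega = 0.7                                     # an arbitrary frequency, rad/sample
dtft = np.sum(x * np.exp(-1j * omega * n))      # DTFT at omega, by definition
on_circle = z_transform(x, np.exp(1j * omega))  # Z-transform at z = e^{j*omega}

# The two agree exactly: the DTFT is the Z-transform on the unit circle.
```

The Laplace transform plays the same role for continuous-time signals: restrict s to the imaginary axis, s = jω, and you get the Fourier transform back.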
it might as well be from a random text generator
“Avoid for new designs”
I even went a step further with how involved with physics I was during the EE degree and specialized in RF; hardly any of that was covered in my physics degree.
As someone who has a degree in both: Hard disagree. The physics curriculum had one course on circuits/electronics combined, and they covered almost nothing practical when it came to tools like oscilloscopes, etc. Few physicists I know have heard of the "3 dB point". No statistics in the physics curriculum (no, quantum mechanics and stat mechanics don't count). Absolutely nothing with regards to digital, communications, or control theory.
The only real overlap was math, EM and semiconductors. Most people who get an EE degree are not targeting that world.
Like others have said, there's a lot of overlap, especially for experimental physicists, which is what I studied. The stuff that makes studying engineering hard for a lot of students is the math and physics.
There are subjects that we don't learn in physics, such as control theory. Yes, that's worth learning. I defer to engineers for really hard feedback control problems. My approach is instead, to design the hardware so its physical characteristics make the control problem easy. That's not always possible.
One reason why we can find a way to fit in, is the huge diversity within engineering itself, leaving some niches that look a lot like what physicists do. When I taught in an engineering department for one semester, the professors always had their latest papers posted outside their office doors, and I noticed that one prof seemed to publish everything in Physical Review.
Out in the work world, a lot of people with engineering job titles don't really do engineering: They can be quite busy and productive, and rewarded, for basically arranging things, fitting things together, troubleshooting, dealing with vendors, and so forth. In fact, they can get so busy at that stuff that they forget their math and theory, leaving the physicist as the go-to "math person" when a quantitative problem needs to be solved.
Then there are what I call the real engineers, for whom the engineering skill is accompanied by an attitude and discipline about making things safe, reliable, maintainable, and traceable to documented and published information. These are the ones who won't accept a measured value, but need to see it guaranteed on a data sheet. I'm not that kind of engineer, and I admit it. And we definitely need that kind of engineer for systems that potentially involve public safety or massive economic liability.
In my work, the "manufacturing engineer" title goes to people who work all day (and at a hard pace) doing nothing other than working in the PLM system, orchestrating ECO bureaucracy, and BOM work. To them, the actual products are nothing more than a collection of part numbers and rules applied in a cumbersome framework. I almost feel sorry for them. The sad thing is, there's an increasing population of these types, along with product/project managers and supply-chain specialists, while at the same time a decrease in engineers and techs.
I also have a physics educational background and make my living doing a weird mix of EE, software, and failure analysis work. I love my job, I see myself as a kind of general purpose problem-solver. Unfortunately actual hands-on technical generalists, IMHO, are in a downward spiral these days as far as status within large organizations goes.
The OP, I hope, is aware of this. He might be happier specializing in his interests and teaming up with other specialists who focus on EE.
Outside of engineering, a lot of people with "manager" titles are similarly engaged. Their supervisory work, while important, is about 4 hours of work per week. The rest of the time is spent on tasks assigned to them, such as creating a new process for replenishing the hand sanitizer, or approving documents.
It's just that we believe that by now we should have eliminated clerks, so to make ourselves seem modern, we re-title them engineers and managers.
I'm not saying that's a bad development. It's just what it is; it probably follows a smooth bell-curve distribution of expertise. The hard stuff is just, like, really hard (as is English!)
This makes total sense to me. The bulk of time I’ve spent on many projects goes into supply chain management and factory coordination. I can easily see how the work of one engineer can keep 10 people like this busy full time.
The more recent versions bring it up to date well, it's a dense book but one I find myself coming back to more than any other for the incredible depth of practical engineering knowledge.
Here's the website for the book https://learningtheartofelectronics.com
EE is a vast field that encompasses everything from high power transmission to designing semiconductors. Even full course work from undergrad to PhD in EE is going to be fairly specialized.
All that being said, I agree that if you just want to learn how to build a catalog of reasonably simple circuits, learning academic EE is a waste of time.
Can you expand on how analog electronics benefits in particular from a formal EE education? I build analog circuits (amplifiers, filters, power supplies mostly) very frequently in my job as a physicist. We have to care about noise so I've picked up a knowledge of how to deal with it in analog circuits. Is there some other area of analog electronics that "hackers" like me might not get exposed to, compared to an EE undergrad? I'm thinking of moving into EE and would like to work out the gaps in my knowledge. I also ask because I can see obvious reasons why your other example - RF electronics - would benefit from formal training but none for analog electronics.
More advanced stuff that you probably lack vs. a practicing EE or an EE graduate education is going to be edge cases, advanced stability analysis, translinear logic, and exposure to all the different types of component design. There are tons of different types of, say, amplifiers used in specific applications, whereas most people working in a lab just slap op-amps on everything. A lot of advanced analog design is just applied control theory. Also keep in mind that these days digital, RF, and analog all blur a lot in a cutting-edge design environment.
Quick Edit: A lot of the more traditional EE design companies will consider someone with a physics degree to be equivalent to someone with an EE degree unless they are looking for a very specific niche.
But if you can design an amplifier or power supply, you probably already understand how to think of all the basic circuit elements and write down a differential equation modeling the circuit behavior.
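As a concrete instance of that "write down the differential equation" step, here is a minimal sketch (component values invented for illustration) integrating the first-order RC step response, dV/dt = (Vin - V)/(RC):

```python
import numpy as np

# Hypothetical values: 1 kOhm, 1 uF, 5 V step input.
R, C, Vin = 1e3, 1e-6, 5.0
tau = R * C                      # time constant = 1 ms

dt = tau / 1000                  # forward-Euler time step
t = np.arange(0.0, 5 * tau, dt)
V = np.zeros_like(t)
for i in range(1, len(t)):
    # The circuit's ODE, stepped forward: dV/dt = (Vin - V) / tau
    V[i] = V[i - 1] + dt * (Vin - V[i - 1]) / tau

# After one time constant the capacitor sits near 63.2% of Vin (~3.16 V).
V_at_tau = V[np.searchsorted(t, tau)]
```

Any SPICE engine is doing a much more sophisticated version of exactly this kind of numerical integration.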
With respect to RF, it's also a large field. In lower frequency regimes you can model everything as a lumped circuit element. As you get into higher microwave frequencies, you start needing to worry about modeling things as a distributed circuit. If you are focusing on things like antennas then you need to know more about electromagnetics. These days practicing engineers dealing with things like antennas and feedlines typically model them with computers. In some ways RF analog circuitry is disappearing as ADCs and associated digital circuitry are becoming advanced enough to swallow large bandwidth signals.
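The usual rule of thumb for that lumped-vs-distributed boundary fits in a few lines (the velocity factor and trace length here are assumptions for illustration): once a conductor is longer than roughly a tenth of a wavelength, you should start treating it as a transmission line.

```python
C0 = 3e8          # free-space speed of light, m/s
VF = 0.5          # assumed velocity factor for an FR-4 microstrip

def needs_tline_model(freq_hz, length_m, vf=VF):
    """Rule of thumb: model as distributed once length > lambda/10."""
    wavelength = C0 * vf / freq_hz
    return length_m > wavelength / 10

# The same 5 cm trace is safely lumped at 10 MHz but distributed at 2.4 GHz.
lumped_ok = not needs_tline_model(10e6, 0.05)
distributed = needs_tline_model(2.4e9, 0.05)
```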
In many ways EE is pretty close to "applied physics", just focusing more on emag and less on mechanics.
My main concern with EE is that once I get to the brain-computer interfaces, I'll be in a situation where there aren't many off-the-shelf components/solutions available, and at the same time I'll likely need to know how I can push physics closer to the edge. I suspect I may need a better theoretical foundation to do that.
That said, I definitely like the idea of focusing a lot on hands-on projects.
First make your thing do something, anything at all. Second, make it do something useful. Third, make it do the right thing, the thing you need, your goal from the beginning. Only then should you optimize it, making it smaller or cheaper or lower power or prettier or.... This is the road to success in the "R" phase of R&D.
And +1 for Art of Electronics, it's the bible for physicists working with electronics.
Brain-computer is going to be very tough, even ignoring all the safety and physically wiring into a brain stuff. The signals are irregular, weak and fast which makes them very difficult (but not impossible obviously) to measure.
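To give a feel for the "weak and buried in noise" part, here is a toy lock-in-style detection sketch in plain NumPy (all the amplitudes, frequencies, and durations are invented): a 1 mV tone at a known frequency is pulled out of noise five times larger by correlating against quadrature references.

```python
import numpy as np

fs = 1000.0                                   # sample rate, Hz
t = np.arange(0, 10.0, 1 / fs)                # 10 s of data
rng = np.random.default_rng(0)

amp_true = 1e-3                               # 1 mV "biosignal" at 40 Hz
noisy = amp_true * np.sin(2 * np.pi * 40 * t) \
        + 5e-3 * rng.standard_normal(t.size)  # buried in 5 mV rms noise

# Lock-in detection: project onto sin/cos references at the known frequency.
i_part = 2 * np.mean(noisy * np.sin(2 * np.pi * 40 * t))
q_part = 2 * np.mean(noisy * np.cos(2 * np.pi * 40 * t))
amp_est = np.hypot(i_part, q_part)            # recovered amplitude, ~1 mV
```

Real neural signals are of course not clean sinusoids at a known frequency, which is a big part of why this problem is hard.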
I studied EE, and got on better with the more theoretical textbooks than I did with practicals ('huh, ok.. why?!').
You don't know what you don't know. It's better to at least read through all of the coursework, even quickly and without full understanding, than to go in green and make basic "I didn't know that existed" mistakes.
Optics. Optics optics optics.
A tremendous amount of neural interfacing, especially in non-human primates and other organisms, is done via optics. ~All the advances in neural data acquisition over the past decade have been optical. Microscopy is the future for a tremendous amount of neuroscience and more and more people are considering it seriously for human-scale BMI.
I know optics isn't always thought of in an EE context, but it should be! Many people doing amazing computational imaging and optics work are in EE departments. Computational imaging is the new hotness and can let you combine your existing CS skills with signal processing and optics to do things like build a lensless camera! https://waller-lab.github.io/DiffuserCam/
If I were you I would ditch the RF part of your plan and study optics. Yeah, it's all EM, but the order-of-magnitude differences in the frequencies involved makes the underlying engineering quite different.
Maxwell would have agreed.
That's a good intro into real optics.
It's much more than the optics chapter you'll get in a physics textbook. It goes over the classical ray optics in good detail, does a great job with traditional matrices and that formulation of optics (the one that the design programs like Zemax use), goes well into the real meat-n-potatoes of wave optics (including birefringence, a huge part of biological optics), gives you a good accounting of how lenses and other optical devices are actually Fourier transformers, and also dives into the more esoteric optical devices (a must for practical neuro-optics).
It's an upper-division/graduate level book, fyi. So I'd back-load it in your study course. Though in terms of neuro-optics it's more of a keyhole book.
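The "lenses are Fourier transformers" point can be demonstrated numerically: in the Fraunhofer regime, the far-field pattern of an aperture is the squared magnitude of its Fourier transform, so a single slit gives the familiar sinc² with nulls at predictable spots. This sketch (slit width and grid size are arbitrary choices) checks the first null:

```python
import numpy as np

N = 4096
aperture = np.zeros(N)
aperture[N // 2 - 32 : N // 2 + 32] = 1.0    # a slit 64 samples wide

# Far-field (Fraunhofer) intensity = |FT of the aperture|^2, a sinc^2 here.
intensity = np.abs(np.fft.fftshift(np.fft.fft(aperture))) ** 2
intensity /= intensity.max()

center = N // 2                 # zero spatial frequency after fftshift
first_null = center + N // 64   # a slit of width w has nulls every N/w bins
```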
If you are particularly interested and really want to know what's actually going on with EM, then you need to go through Jackson:
This is the book on EM, but it is very much physics-graduate-student level. And honestly, I don't think you'd need it for BMI stuff. But if you don't go through it, you'll just be trusting other people when they say your ideas won't work and they can't really explain it to you. Just going through Jackson is a bit of a hazing experience and will earn respect.
Yes, you do.
"Control" is another name for "optimisation" or "systems with feedback".
It is the theory covering any system that has a closed loop in it. Optimisation is a mindbogglingly broad field with application to nearly everything in the physical world. Other branches of engineering, science and maths study this area but give it their own name.
Examples of systems with cycles:
* Any system that does optimisation: Deep learning, adaptive systems, ...
* Error control decoders in digital communications systems.
* The majority of non-trivial circuits.
* Pretty well every circuit operating at high frequencies.
* Echo cancellers in telecoms.
* Computer networks (eg. TCP congestion control)
* Systems of chemical reactions
* The brain (your area of interest) is a seething mass of feedback paths.
Optimisation (a.k.a. Control Theory) and Information Theory (some of which is covered under the name Communications Theory) are fundamentals. "Digital" in their title doesn't mean they have narrow application, as Information Theory (Shannon, ...) treats everything, including analogue, in terms of bits.
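To make the "closed loop" idea concrete, here is a toy discrete PI controller steering a first-order plant to a setpoint (the plant model and gains are invented for illustration); the integral term is what removes the steady-state error a pure proportional loop would leave:

```python
a, b = 0.9, 0.1           # plant: x[k+1] = a*x[k] + b*u[k]
kp, ki = 1.0, 0.2         # PI gains (chosen so the loop is stable)
setpoint, x, integ = 1.0, 0.0, 0.0

for _ in range(300):
    err = setpoint - x    # the feedback path: measure, compare
    integ += err          # integral action accumulates residual error
    u = kp * err + ki * integ
    x = a * x + b * u     # plant responds to the control action

# x has converged to the setpoint (within numerical noise).
```

Swap the plant for a neuron model, a network queue, or a chemical reaction and the skeleton is the same, which is the point being made above.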
Given your background in maths, one of the first things you should do is to try to construct a "Rosetta Stone" to relate a complete list of Electrical Engineering topics back to what you already know. For example, you will have already done a lot of control theory, but have learned it as optimisation. Part of your task is to recast your existing knowledge in terms of EE jargon, identify the gaps, then fill them in. Unlike an undergraduate, you're not starting from the bottom.
A suggestion: Along with the list of EE areas you want to learn, why not add to your post a list of all the areas you already know in maths? HN readers may be able to link the areas that you want to learn with what you already know. It's hard to make such links yourself, as you don't yet know what each EE topic contains.
All these were full semester courses. These courses were actually needed if somebody wanted to properly understand the whole theory of electrical engineering (signals, em transmission, antennas, microwaves, optical fibers, theory of electronics, electrical machines, electric power systems, etc).
Depending on which subject you want to focus on, you may not need all this mathematics and physics, but you will definitely need some theoretical knowledge to actually understand it!
So of course you can't go back in time and replicate that, but you can do this: get SPICE, or better a real lab, and start messing around with the absolute basics until you can dream in them, the way you eventually do when learning a second language.
Without that primal understanding all the advanced stuff will just be rote memory learning.
It will feel slow and you sound like someone who wants to move fast. But getting a feel for voltage and current and their basic interaction in a hands on way will set you up to absorb the more advanced stuff like a sponge.
Just because you 'know' Ohm's law etc. doesn't mean you can 'think' in Ohm's law the way you do when you speak your first language.
So go deep on the basics. Give it lots of boring hours. Then start getting into the more esoteric stuff.
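For what it's worth, "thinking in Ohm's law" on something as small as a two-resistor divider is exactly the kind of boring-hours drill meant here (the values below are arbitrary):

```python
Vs = 9.0              # supply voltage
R1, R2 = 10e3, 5e3    # series resistors, ohms

I = Vs / (R1 + R2)    # one current flows through a series chain: 0.6 mA
V_R2 = I * R2         # Ohm's law per element: 3.0 V across R2

# Same answer from the divider shortcut: Vs * R2 / (R1 + R2)
```

The goal is to reach the point where you see the 3 V without doing the arithmetic.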
Enjoy the trip. It's been a wonderful one for me, and I hope you get the same joy.
Also, I'd mention that the use of cgs in Purcell can be a bit annoying as you move on (it's very physics based) since most constants (permittivity, dielectrics, etc) are usually in mks instead. Those are used in the EE books.
One thing you will definitely want to learn is SPICE for simulation (any real job will probably be using Spectre or something built into your tool set), and luckily there are quite a few free ones. I'd recommend LTSpice for simple projects. Similarly there are "free" tools for building and testing FPGAs for the digital simulation side.
I ended up reading through "Foundations of Analog and Digital Electronics" and was quite happy with it. Though I believe there are other textbooks that are more commonly used for learning basic circuit analysis.
And I am going through Oppenheim & Willsky now and have no complaints. The last chapter is on linear feedback systems, so you'll get a bit of the control theory background there.
Not sure what I want to read next... maybe digital signal processing or control theory. I have no real goal in mind here (outside of an interest in RF), just reading for fun.
Same here. "Practical electronics for inventors" ended up being way more approachable.
https://wiki.analog.com/university/courses/electronics/text/... - An excellent free "Intro to Electronics" course from Analog. In fact, all of the courses on their website are pretty good, I would say comparable to what you would get at a university minus the TA support when things don't work. But /r/ECE or Stack Overflow can probably help if you ever end up really stuck.
https://www.analog.com/en/education/education-library/softwa... - Another from Analog, it's a great resource for learning SDR. It assumes you're coming from an EE background though, so it would be helpful to do the fundamentals first.
http://freerangefactory.org/pdf/df344hdh4h8kjfh3500ft2/free_... - Free Range VHDL is what my FPGA class used, and it's free!
I would also suggest playing with some ECAD software like DipTrace or KiCAD. It's generally not part of a normal EE curriculum, but really should be! Being able to draw a schematic or lay out a circuit board will be useful if you have any advanced projects you want to try at home. Especially with cheap fabs like OshPark it's a good skill to have.
Purcell is a physics book, but I think with your math background it might be fine? From there I'd suggest Griffiths E&M, as far as setting up more complicated problems goes. I don't really like the EE-oriented E&M books, but if you need some of the "calculate this value" style of problem maybe you'd want to take a look at them.
Circuit design is kind of unsatisfying these days since on the professional side there's a lot of throwing stuff in the simulator, especially with IC design. I'm an advocate for more hands-on stuff. For the absolute basics I feel there's no substitute for getting some LEDs, resistors, breadboard, and multimeter, and doing some kid level projects. Then there's audio projects, and RF projects, since once you've learned the textbook fundamentals of amplifiers, there's no substitute for building some. Pozar and the ARRL RF project book will take you a long way, though you'll have to buy some test equipment...
But honestly, do you really want to get distracted from your main focus? You may have lost interest by the time you're done with the curriculum. There's a lot you can get done by forging ahead and just learning what you need to as you go along. Why learn amplifier design when the industry is all too happy to sell you a black box gain block? Why learn digital design when microcontrollers are getting faster and cheaper all the time? ;)
I recommend "The Art of Electronics" by Horowitz and Hill. It strikes the right balance between theory and practice. You will need to dig deeper in some theoretical areas later, but this will give you a very good starting point.
1/ A complete set of tutorials on computational neuroscience as Jupyter notebooks: https://github.com/NeuromatchAcademy/course-content/tree/mas... This is the material from last year; I think they will be running a summer school again this year so you might be able to join and learn as part of a group.
2/ If you need a review of linear algebra, you can check out my book No Bullshit Guide to Linear Algebra. In particular the Applications chapter contains a summary of everything I used most often from back in my EE days (Fourier transforms, circuits, least-squares, etc.) See a preview of the book here: https://minireference.com/static/excerpts/noBSLA_v2_preview.... (note it's not a free book, but not expensive either)
For basic circuits and electronics, I'd recommend Electronics with Professor Fiore. It's comprehensive with free lecture videos, textbooks, and lab manuals.
For digital, Introduction to Logic Circuits & Logic Design with Verilog by LaMeres (not free, sorry).
For electromagnetism, Electromagnetics Volume 1 by Ellingson is free.
I don't have a signals & systems reference that I actually like and would recommend to anyone, unfortunately.
That's pretty much the core of EE. Everything else is a specialization.
More importantly - many of those lab projects require:
1. Expensive equipment. Spectrum analyzers, big motors/transformers/generators, etc.
2. Bespoke/boutique setups for things like automation projects that simulate real-life processes. Schools spend a lot of money on these things, and have in-house engineers who perform maintenance, updates, and repairs on them. These are normally not things you can just build over a weekend and then use for self-learning. Hell, in many cases Bachelor's thesis projects consist of building stuff like that, and then validating the generated data (measurements, etc.) against theory.
Sure - one can simulate A LOT of things today, but there are things you need to work on with your hands, in order to learn something useful.
I’ll add, I was somewhat surprised, given the explosion in MOOCs over the past few years, to find very few courses equivalent to introductory undergrad EE classes.
Just an FYI, a lot of BCI companies are running stuff like repurposed audio analyzers, à la the U8903B, for lab work and bench testing their designs. Parallelism is the name of the game, and analog performance requirements aren't super strict, so you won't be designing custom ICs any time soon unless you want to work on the probe interfaces themselves (which are more MEMS than circuit design but need a little of both).
Something like Medical Instrumentation: Application and Design by Webster is a great place for a beginner who wants to toy with human interfacing circuits. Back it up with something like The Art of Electronics and that will get you to professional lab tech territory.
It looks like the article author is just "guessing" textbooks. I'm browsing the U. Waterloo curriculum and it doesn't specify any books.
The article is still useful, don't get me wrong, but I would love to see a list of the textbooks that are actually used at a university program. I've Googled it many times and I only find the names of the courses.
The higher the course, the lower the confidence of having the right book.
The plan looks quite complete, similar to the list of courses I did in university. I remember I also did a power electronics course which I didn't see in your list.
Fabrication of a chip is not really feasible to do at home. The chemicals you might be able to get, but not the equipment.
The Art of Electronics by Horowitz and Hill has a permanent place on my desk. It is quite simply the bible of electronics engineering, the EE analogue of the famed Machinery’s Handbook.
I also recommend Signals and Systems by Oppenheim for any aspiring EE.
Assume that workable consumer BCIs will come within ~2 decades and focus on only a small part of it, that's the only way you can contribute meaningfully.
> I am almost certainly missing something important, but I don’t know what.
You will know once you start. Don't plan too much - pick a realistic goal and just start. Build a clock. Program a microcontroller. Log your heartbeat. Measure your brainwaves with OpenBCI. Build a feedback loop of some kind. Get a feeling for it.
Specifically with the "Getting Started in Electronics" and his "Engineer's Mini-Notebook" series.
Open Source Society University
Path to a free self-taught education in Computer Science
also bioinformatics and data-science
Buy some BCIs and reverse engineer them, possibly. Maybe try to improve them. You might want to reach out to the authors of the papers you've been reading for advice. Neuralink put a BCI in a pig so try figuring out how they did it, and maybe they'll give you a job? Even Elon's pitch for recruitment during that presentation was "we don't know much about the brain anyway", and mostly just want you to have solved hard problems. There likely won't be a straight-forward path since this stuff isn't commercialized yet.
I also had a hardware interest later in my career, and my approach was slightly different. I found an embedded systems job that pays about 3 times less than what I used to get paid (since I have no experience). It is definitely fun and I learn a lot, but I definitely don't have the financial freedom that I used to have. I'm not sure which is the correct approach, but surely there is no "easy" way of getting there.
Please have in mind that this is a very serious time (and financial in my case) commitment that you are about to make.
I sometimes wonder how much of the practical and theoretical know-how related to designing modern cutting-edge silicon is actually buried in the brains of private-sector workers, and how much of it makes it back to academia.
I've implemented all the standard things, AM, FM, SSB radios, etc.. I had a lot of fun figuring out how to decode and display the local VOR beacon near my house.
You can also play with audio frequencies, and your microphone and speakers... it's fairly easy to get an intuitive idea of what a negative frequency really means if you have an IQ channel.
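To make that concrete, here's a tiny NumPy sketch (mine, not from the original comment): a complex IQ tone e^{-j2πft} has its entire spectrum at a negative frequency, something a single real channel can never show you, and an FFT makes it visible directly.

```python
import numpy as np

fs = 1000                      # sample rate in Hz
t = np.arange(fs) / fs         # one second of samples
f = 100                        # tone frequency in Hz

# A complex (IQ) tone: I = cos(2*pi*f*t), Q = -sin(2*pi*f*t),
# i.e. e^{-j 2 pi f t}, which rotates clockwise in the IQ plane.
iq = np.exp(-2j * np.pi * f * t)

spectrum = np.fft.fftshift(np.fft.fft(iq))
freqs = np.fft.fftshift(np.fft.fftfreq(len(iq), d=1 / fs))
peak = freqs[np.argmax(np.abs(spectrum))]
print(peak)  # -> -100.0: the tone lives entirely at negative frequency
```

Flip the sign of the exponent (counterclockwise rotation) and the peak moves to +100 Hz; with only a real channel the two cases are indistinguishable.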
But I find GNU Radio somewhat intimidating (not having a Signals/DSP background). Are there any books/articles/videos etc. which will ease my learning curve? Note that I already know of Michael Ossmann's course with HackRF (https://greatscottgadgets.com/sdr/)
The thing is you can take an existing flowgraph, modify it, and see what happens in about 30 seconds.
This video seems to be a good starting point: https://www.youtube.com/watch?v=ufxBX_uNCa0
Would it be cruel to suggest that you might want to advance a bit more before weighing in?
I'd say that semiconductor physics, real math, control systems, real mixed signal, and a couple of others should get a go ... but my eldest child didn't get much past this, so maybe that's the state of the art today?
Again, I mean no cruelty in my comments, but it seems as if modern curricula are not teaching a person what they need to know to go into any related industry job...
(And I could be wrong - as I often am)
I'm not sure they ever did. They should imho be teaching the ability to learn and adapt to changing and emerging technologies, and to think critically. I'm still using the mathematics I learned in college, to understand things that didn't exist back then such as elliptic curve cryptography.
Practical Electrical Engineering by Makarov et al.
Electronic Circuits: Handbook for Design and Application by Tietze, Schenk et al.
Sensors and Signal Conditioning by Ramon Pallas-Areny et al.
Introduction to Embedded Systems: Using Microcontrollers and the MSP430 by Jimenez et al.
Patterns for Time-Triggered Embedded Systems by Michael Pont.
Spending just a week or two talking to all the experts could save a lot of wasted effort.
I am currently investigating more efficient forms of study (which would involve creating a new language) for compressing academic text (which is often very long and not very accessible to inexperienced people).
Whoever is interested in studying with me, my email is: email@example.com (I speak Spanish)
This is extremely broad and ambitious. Younger me would have said go for it as I loved to learn everything, but older me has forgotten much of the stuff that I so much loved to learn, so I moved to the camp of learning what you need.
Unfortunately I don't know too much about brain-computer interfaces, especially if it's cutting edge research.
At a high level, these are my recommendations:
The basic ideas about how circuits work are presented in any introductory book; the E&M book (Purcell) would mostly be useful for device physics and transmission lines, plus other RF topics (mostly EMI, crosstalk, and other things that can go wrong). Some purists might argue about which side of the equation an inductor voltage should be on, but it has zero practical effect. Also, this is a book for what is usually the second physics course in college, so you might have done that already and just need a refresher.
Similarly, unless you expect to be either developing novel devices, or be involved in fabricating existing devices in new nodes/conditions, you can skip anything about devices (types, structures, fabrication, materials, electron bands, doping concentrations, diffusion, drift, etc) and just the voltage/current behavior between pins should be plenty (these are covered in any introductory book). The chemistry book is mostly irrelevant for EE, although in the neuroscience case it's more applicable if we are talking about invasive electrodes (but still, probably too general and broad).
Books on integrated circuits depend a bit on whether you need to learn about some other topics that are not usually presented on their own, such as fast amplifiers, mixers, oscillators, etc with CMOS technology. I'd say though that RF/MW integrated circuits differs considerably from discrete RF/MW work, so again most likely you'll get away with treating various parts as opaque building blocks, connected by transmission lines. And I'm going to guess that for BCIs the frequencies involved are quite low, so this whole branch might be irrelevant.
Probably you'll need to learn the basics of data converters to digitize the brain signals, but again I'm not sure this warrants going through a course versus just the wikipedia page and a datasheet of a specific part you want to use. As with the other things above, courses are usually designed for people making converters, not people using them.
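As a back-of-the-envelope illustration (my numbers, not anything from a datasheet): even a common 12-bit ADC only resolves steps of several hundred microvolts, which is why microvolt-level biosignals need substantial amplification before the converter. A minimal sketch of an ideal ADC, with an illustrative gain:

```python
import numpy as np

def adc_sample(v, vref=3.3, bits=12):
    """Ideal ADC: map a voltage in [0, vref) to an integer code."""
    code = np.floor(v / vref * (1 << bits)).astype(int)
    return np.clip(code, 0, (1 << bits) - 1)

# An EEG-like signal: tens of microvolts, amplified before conversion
# (the 10,000x gain and mid-rail offset are illustrative, not a design).
t = np.arange(0, 1, 1 / 250)                # 250 Hz sample rate
eeg = 50e-6 * np.sin(2 * np.pi * 10 * t)    # 10 Hz, 50 uV tone
amplified = eeg * 10_000 + 1.65             # gain plus mid-rail offset
codes = adc_sample(amplified)

lsb = 3.3 / 4096
print(lsb * 1e6)  # one ADC step is ~806 microvolts at the converter input
```

Without the gain stage the entire 50 uV signal would disappear inside a single ADC step, which is the practical point a datasheet's resolution spec is making.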
Signals, systems, feedback, control systems are very fundamental "mathy" engineering tools that apply to more than just EE, so probably a good tool to have in general.
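A toy example of why that feedback math earns its keep (values made up for illustration): a discrete-time system that diverges on its own can be stabilized by choosing a proportional gain that moves the closed-loop pole inside the unit circle.

```python
# Discrete-time first-order plant: x[k+1] = a*x[k] + b*u[k].
# With proportional feedback u = -K*x the closed-loop pole is a - b*K.
a, b = 1.05, 0.1   # open loop is unstable since |a| > 1
K = 2.0            # gain chosen so the pole becomes 1.05 - 0.2 = 0.85

x = 1.0            # initial disturbance
for _ in range(100):
    x = a * x + b * (-K * x)   # apply the feedback law each step

print(abs(x) < 1e-3)  # -> True: the disturbance has decayed away
```

With K = 0 the same loop grows as 1.05^k without bound; the control-theory courses are essentially about picking K (and its generalizations) systematically rather than by trial and error.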
I see your questions about wireless systems. Again as above. Usually these books are designed for people wanting to develop these things professionally, and if you just want to communicate wirelessly it's mostly learning the "API" that some chip has to do what you want. Not to mention the compliance nightmare to roll your own if it's beyond a handful of prototypes.
I think you get the theme. Sadly, EE outside the companies making ICs has become very similar to software, where you are basically plumbing black boxes together. And if you don't have a standard application, lots of time is spent figuring out hacks to use existing parts in non-standard ways, because if you can't find the perfect part, the barrier to rolling your own is much steeper than it is in software.
So in a way, The Art of Electronics is very applicable. Unfortunately, I think it's terrible to learn from unless you already know the stuff, and (unless it has been refreshed to the point of a major rewrite) the copy I have is so outdated that I never really recommend it to anyone; I haven't opened it in a decade.
Unfortunately I don't know of such a thing, but if anyone here knows a course from the Neuroscience side doing experimental work, you could see what the prerequisites for that are, and go from there.
But if you are not like me and can still learn a lot of new things without forgetting too much, go for it all and live the dream!
I want to underscore the *engineering* in electronics engineering. Engineering everywhere is very hands-on, and you cannot be an "engineer in theory only" if you want to perform on a job.
Learning from mistakes in a class setting is much easier, and *cheaper*, than casually failing a USD $1M design in a very simple way, but a way not taught in any textbook.
Not to disparage you, I know many people who were similarly dragged into electronics engineering by necessity, and got to the level of degreed engineers over many years.
But those guys had years and years to perfect their skills, in a time when the industry was more forgiving and was growing along with their skill.
I would say that today nobody will hire an 18-year-old who was just an electronics hobbyist into a factory; that was not the case 10-12 years ago.
What I can say against modern electronics engineering education is that excessive focus on producing "workplace ready" cadres makes for worse workers past the basic level.
I know people who are quite adept with digital electronics but can't understand how anything beyond textbook versions of SMPS power supplies works, because universities decided that analog circuits are no longer what people pay for. The same goes for many other fields in electronics.
I believe a properly taught EE can figure out just about anything with the right approach and time, and this attitude is the best thing an education can give you, unlike mass-produced engineers who keep finding the lame excuse "I'm not a logic/power/high speed/rf/motion control/asynchronous circuit/metrology/network/audiovideo engineer! I did not study this at school!"
Electrical Engineering 101: Everything You Should Have Learned in School