My self-study plan for electrical engineering (i-kh.net)
369 points by bucket2015 on March 20, 2021 | 134 comments



As an EE graduate, I have some recommendations for you. You can combine Digital Circuits and Systems, Embedded Systems, and Digital Hardware into one continuous course with a single book:

Digital Design and Computer Architecture by Harris and Harris (a RISC-V edition will be released in 2-3 months; buy that one).

For electronic circuits, choose Microelectronics by Behzad Razavi. Instead of Purcell, go for "Engineering Electromagnetics" by Ida; it is more intuitive.

And the first book you should start with is "Foundations of Analog and Digital Electronic Circuits".

Then you go for DDCA and Ida in parallel.

The list does make me scratch my head; it is too broad to ever be accomplished. My recommendation is to redo the list: first find out what piques your interest in EE (digital hardware, analog hardware, control systems, or embedded systems) and focus your self-study on that concentration.

The way I see it with this plan you are setting yourself up for failure.

Edit: Removed the ditching recommendation, as I now see it is relevant to OP's goals.


The one problem I see is "The Art of Electronics".

Horowitz and Hill is a LOUSY textbook. And, personally, I find its usefulness as a cookbook overrated.

Going through the Forrest Mims notebooks/cookbooks/etc. for a solution to your problem is generally a way better idea than Horowitz and Hill. The Mims stuff does a really good job of pointing out the pitfalls you're likely to hit as well as the basic pedagogy.


Totally agree it’s a lousy textbook, but it’s not a cookbook either. It’s more like the Cliff’s Notes version of many many many application notes. One needs to be somewhat experienced to take advantage of it. For example, you might be decent at amplifiers but now you need to do a low noise measurement. Read the low noise chapter in AoE first. Now you know the lay of the land and you can understand the vocabulary of the area and navigate the actual datasheets and application notes.


Videos of Anant Agarwal's lectures that closely follow his "Foundations of Analog and Digital Electronic Circuits" book are online: https://ocw.mit.edu/courses/electrical-engineering-and-compu...


Yes, it is too broad. You should look at a school like Berkeley and look at their concentrations. Maybe RF isn’t important if you’re doing systems (unless you’re interested in RF).

https://m.box.com/shared_item/https%3A%2F%2Fberkeley.box.com...


I don't think you can call yourself an EE if you don't know anything about RF.


RF is not a required course in most universities. Some don't even have it as an elective. Were you thinking of electromagnetics? RF is a very specific subfield.


That's true, maybe because it became such a big and deeply complex field that there's no space for it in the basic curriculum?

One might think that it's historically closely tied to the development of the radio and the telephone; see Tesla, Edison, Bell Labs.

The common wisdom about electromagnetics with regard to EMI and such things seems to be that it's pretty much black magic. Similar for inductor selection in SMPS design.

It's pretty much physics, so a good deal about radiation can be found in medical physics, for example, which is simply a different course of study with electronics as a sideshow, and maybe better as a consecutive program for sophomores.


These are awesome recommendations - thanks!


Also for Digital Control Systems, check out this book by Tim Wescott

Applied control theory for embedded systems

It is less math-intensive and more intuitive, and it is aimed at folks with a software background like yourself.


Why is it too broad to be accomplished? It seems to cover your typical BSEE, and you need an exposure to all of it.

Razavi's RFIC is a good one too, but that’s really getting too specialized. Pozar is good for undergrad microwave.


As someone who survived a BSEE, the range is too huge. I work with digital design as my day job; I only intuitively use EE101, digital logic, and computer architecture, and occasionally analog when dealing with post-silicon issues.

The OP wants to study EE because he has a specific goal. My suggestion was that instead of trying to study everything EE focus only on those subjects that are relevant.

For example: if I were interested in robotics, I would not bother with digital, RF, analog, or even communication systems. I would primarily focus on Control Systems and Embedded Systems.


> The OP wants to study EE because he has a specific goal. My suggestion was that instead of trying to study everything EE focus only on those subjects that are relevant.

This is something that I've seen often in self study plans for software development - the "learn everything and then try to use it" rather than "learn what you need to start solving the problem... and start solving it."

In software development this often takes the form of a self-learner learning Java, JavaScript, Python, C, and C++. Once in an interview, it becomes apparent that they don't know enough about any one language to solve a problem in that language.

This is where a university class (and degree) have an advantage - they've got a set of problems for the student to solve (homework and labs) and then take the student through learning specific knowledge to solve those problems.

This also shows what self teaching often lacks - those small problems that can be accomplished as part of learning how to solve the big problems.


OP has a formal background in applied mathematics and is an experienced software engineer, I don't think he will have any problems with generalization.


I did a basic project and I don't see how you can do robotics without analog systems unless you don't intend to build custom actuators and just buy very expensive off the shelf parts. The digital portion is absolutely trivial.


There's a difference between Analog Systems and Analog Integrated Circuits: Analog Systems deals with using analog chips, while Analog Integrated Circuits deals with designing such chips. A robotics engineer needs the former, not the latter. And the former requires EE101 knowledge plus the signals and systems knowledge for filters etc., which the OP is already covering in other courses.


I think it's a bit of a pessimistic take, to be sure, but I do agree that most people will not make it through this in a self-directed manner. Sure, some may, and I wish the OP the best and hope it all works out, but it's hard for most people to tackle much simpler, straightforward topics in a self-learning environment.

In this case, though, I think OP's approach to this shows that they're serious about keeping with it, which is really cool to see.


> It seems to cover your typical BSEE, and you need an exposure to all of it.

Hard disagree. Much of the page involves what normally would be electives. You need exposure to some subset, but not all of it.

To give you an idea, my undergrad in EE did not require a course on materials (although it was an elective).

Everything in "Phase 2" was an elective - none was required (although many universities do require the "electronic devices" course).

Needless to say, if everything in Phase 2 was an elective, so was everything in Phase 3.

Also, when I look at pretty much any job requiring EE, and intersect it with the courses I took as an EE undergrad, I find that most courses are not needed. EE (and an EE curriculum) is often quite broad. For any given course, there are plenty of jobs that will need that course, but most EE jobs will not. If the submitter has some specific goal in mind, he won't lose much by skipping courses not related to that goal.

To give you an idea, when I worked as an EE, I had to use basic circuit theory 2 or 3 times, digital logic only once, and the physics of electronic devices a lot. The level of EM I needed was satisfied by high school physics, so I won't even count EM. Everything else I took: Control theory, communications, electronic circuits, power/machines: Never used it.


Why push yourself through a degree-style path? So much of what EEs learn in their coursework is of low utility. (I'm a physicist who transitioned to working as an EE. I have never had a single EE course, and yet I find myself with no obvious deficits compared to my colleagues who have.)

There are two ways to learn an existing technical-ish subject: you can spend a lot of time reading textbooks, then do some projects (the "slow-fast" approach); or you can dive in to projects and refer to textbooks when you get stuck (the "start-stop" approach). In the slow-fast approach you will go slowly through a lot of textbooks for a long time, and then in theory you will be able to do projects very quickly once you are done. In the start-stop approach you will start a project, quickly get stuck and spend a while searching for and understanding the answer, then go back to your project.

In my opinion electrical engineering, being a subject where fast feedback is generally possible, is very well suited to project-first learning. I would recommend grabbing a few textbooks (Horowitz and Hill's Art of Electronics holding pole position for a practically-oriented learner, in my opinion), reading their introductory material (table of contents, preface, etc.; enough that you know what each book has in it), and then setting all the books aside until you need them. Avoid books targeted at "makers"; most are fine but a sizeable fraction are written by people with no clue what they are doing, and they will actively set you back. (It is very difficult to learn from an author who does not themself understand the subject, and all the worse if they do not realize that they do not understand. Since there are plenty of better sources out there, it's little trouble to just avoid the whole class.)

Trying to work on brain-computer interfaces is challenging because it blends biology with electrical engineering. The biology will naturally drive things, because you cannot really control it like you can the electronics. So learning EE in this context is about two things: 1) What can I do with circuits? and 2) What do organisms behave like and respond to electrically? Your project is then using your knowledge of circuits to solve R&D problems relating to bioelectric signals.

This isn't easy (I think you know that), but the benefit is that you can quit with "just" EE skills and still come out ahead.


As a practicing EE / RF comms engineer, I will say that it is very obvious when you're working with someone who thinks that their EE coursework wasn't useful for the real world.


As a practicing EE (power systems), I agree. It often becomes clear when someone is unable to distinguish a practical limit (this equipment is not rated for X, our operating procedures prohibit doing X) from a physical one (X is not possible because of underlying physical principles).


This. There are so many real world, practical, and pragmatic uses for EE. This example, which basically amounts to knowing that the hardware specifications cannot accurately or reliably meet the claims being made about a product, is one I use on an everyday basis.

You don’t fall for marketing gimmicks.

Another thing is you know the relative price (ballpark figure) of the technology, as in how much it costs to make something, often just by eyeballing the actual product or by looking at its specifications. Sometimes this translates to more abstract and somewhat unrelated fields such as medications (if you read the patents and study them).


Yeah I have run into plenty of EE's who don't understand how to model a simple filter, for instance.


The first interview question is always an RC filter.

It's an easy leading indicator of who has their shit together.
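
For calibration, the whole question boils down to one formula; a minimal numeric sketch (part values are made up, not from any actual interview):

    import numpy as np

    R, C = 10e3, 1e-9                          # 10 kOhm, 1 nF (assumed values)
    fc = 1 / (2 * np.pi * R * C)               # cutoff of the RC low-pass, ~15.9 kHz
    f = np.array([fc / 10, fc, 10 * fc])
    H = 1 / (1 + 1j * 2 * np.pi * f * R * C)   # H(jw) = 1 / (1 + jwRC)
    print(fc, 20 * np.log10(np.abs(H)))        # ~[-0.04, -3.01, -20.04] dB

A candidate who can write down H(jw), point at the -3 dB frequency, and explain the 20 dB/decade rolloff is in good shape.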


RC filter is like fizzbuzz for EE.


One of those is a basic building block used in pretty much everything, the other is software bullshit.


Looping through a list of items, and doing different operations based on their value, is an extremely common occurrence, and you’d be hard pressed to find a codebase that doesn’t use that pattern somewhere.
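
For anyone who hasn't seen it, the canonical formulation fits in a few lines (Python):

    # Classic fizzbuzz: loop over 1..100, branch on divisibility.
    for i in range(1, 101):
        if i % 15 == 0:        # divisible by both 3 and 5
            print("FizzBuzz")
        elif i % 3 == 0:
            print("Fizz")
        elif i % 5 == 0:
            print("Buzz")
        else:
            print(i)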


Welp, this is why interview fizzbuzz exists, because some people think of it as "software bullshit" instead of the simplest possible program you can imagine that should come out as fast as you can type it.


I feel attacked. I'm a DSP engineer and I haven't implemented an analog filter since I graduated. If we got into detail on RC filter gain and phase shift I'd fail. Do I have to give my EE degree back?


No, but you at least realize an analog filter is a thing that exists. You don't remember the specifics, but you could study them if you needed to.


As an EE you should know what goes in front of an ADC to avoid aliasing.
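
A minimal numeric sketch of what goes wrong otherwise (made-up frequencies): a 7 kHz tone sampled at 10 kHz produces exactly the same samples as a 3 kHz tone, and no amount of downstream DSP can tell them apart; only an analog filter ahead of the converter can.

    import numpy as np

    fs = 10_000                             # sample rate (assumed)
    t = np.arange(0, 0.01, 1 / fs)
    x = np.sin(2 * np.pi * 7_000 * t)       # 7 kHz input, above Nyquist (5 kHz)
    alias = -np.sin(2 * np.pi * 3_000 * t)  # the 3 kHz tone it masquerades as
    print(np.allclose(x, alias))            # True: the samples match exactly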


I think it depends on what is meant by details. For a DSP engineer, if the details are not knowing the difference between an RC made with a ceramic cap versus a film cap, I’d give a pass there. If the details are not being able to solve continuous-time equations in the form 1/s, that’s probably a fail.
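
For reference, the continuous-time math in question is just the first-order transfer function of the RC low-pass (a standard result):

    H(s) = \frac{1}{1 + sRC} \qquad \left|H(j\omega)\right| = \frac{1}{\sqrt{1 + (\omega RC)^2}} \qquad \omega_{-3\,\mathrm{dB}} = \frac{1}{RC}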


or you buy a basic building block ADC and follow instructions.


Don't the software methods have very similar problems?


fizzbuzz is literally a filter, albeit digital, and it involves a circuit, well a loop in most realizations, too.

It's probably at the same level of complexity, give or take.


And, like fizzbuzz, lots of people fail it.


yep exactly, even with training and degrees and claims of competence


> (I'm a physicist who transitioned to working as an EE. I have never had a single EE course, and yet I find myself with no obvious deficits compared to my colleagues who have.)

Not to dunk on the rest of your reply, which I agree with, but there is a humongous overlap between your typical undergraduate physics and electrical engineering degree, and I think you're able to be successful because they're so similar. Academically, the required courses are mostly identical until your 3rd year and if you choose an RF, microwave, or semiconductor physics specialization it's just more of the same applied physics, so it would make sense you would easily be able to pick up the concepts necessary with experience.


No. I disagree, and I’m a Physics SB and PhD (lasers). EE is a discipline unto itself. A stack of physics textbooks and coursework is useless when you need to build a circuit that does what you need. Indeed, even writing down the desired circuit specs in a reasonably professional manner has zero overlap with physics.


If it's just circuit design, the difference between an EE and a physicist is 6-9 credit hours of courses. You will both share all the necessary prerequisites in mathematics (mostly Fourier analysis, linear algebra, basic statistics, and differential equations) and electromagnetics.

New graduates in EE also can't do the things you listed :)


6-9 credit hours of courses and you’re a circuit designer? You are dreaming. Which capacitors are used for what function? What kind of trouble can you get into if your comparator is too fast? Do they tell you that a 7800 series regulator needs a load? Going to school is good. But if the alternative is 6-9 course hours, I’ll get further with scope, meter, soldering iron, app notes, and LTSpice in the same time as that student. And I assure you, from the bottom of my heart, as a near-expert in both, that the Fourier analysis in quantum mechanics, solid state physics, optics, etc. bears little resemblance to that used in signals and systems and DSP. LITTLE. RESEMBLANCE.


> New graduates in EE also can't do the things you listed :)

That was the spirit of my point, yes :)

More than once I've heard a coworker complain that "I wish I had learned about that in school instead of [filler course so useless that I forgot what they said]."


Very good point. Often I think the hardest part about being an EE is using our CAD tools.


Also perhaps units conventions etc may be different in EE and Physics, like use of Gaussian units in electrodynamics in Physics, and also things like the mysterious 'Z'-transform that seems to pervade much of EE.


Gaussian units seem to be on their way out. I, for one, won't miss them. (Or cgs.)

The Z-transform is much more related to the others than is clear at first glance. This post on transforms [0] from the The n-Category Café is fascinating, and my go-to for understanding what the Laplace transform really is, even if I don't quite grasp many things in the post. (And I also have a math degree! But not a graduate one in active use, as most of the people around there do.)

https://golem.ph.utexas.edu/category/2019/07/what_is_the_lap...


> the Laplace transform is really just a generalization of the familiar Laurent series representation of complex analytic functions, but where the exponents are allowed to be non-integers and to “vary continuously” rather than discretely.

I understand some of these words... they're very familiar to me...

I'm saying this as someone who's dealt with the discrete and continuous time Fourier transforms, and Z-transform, and wants to get into Laplace transforms.
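
A standard way to see the kinship (textbook material, not from the linked post): both transforms project a signal onto the same family of exponentials, one in continuous time and one in discrete time,

    \mathcal{L}\{f\}(s) = \int_0^\infty f(t)\, e^{-st}\, dt
    \qquad
    \mathcal{Z}\{x\}(z) = \sum_{n=0}^{\infty} x[n]\, z^{-n}

and sampling f at t = nT with the substitution z = e^{sT} turns the integral into the sum. The Fourier transform is the Laplace transform restricted to the imaginary axis, just as the DTFT is the Z-transform restricted to the unit circle.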


https://golem.ph.utexas.edu/category/2018/02/mlab.html

it might as well be from a random text generator


Gaussian units make physics prettier.

“Avoid for new designs”


Only now did I notice that in the Gaussian system, the unit of capacitance is centimeters.


Exactly. Quick now, how many farads in a cm?
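
Working it out (the standard conversion): a Gaussian capacitance of R centimeters is the SI capacitance of a sphere of that radius,

    C_{\mathrm{SI}} = 4\pi\varepsilon_0 R \approx 4\pi \cdot (8.854 \times 10^{-12}\,\mathrm{F/m}) \cdot (10^{-2}\,\mathrm{m}) \approx 1.11\,\mathrm{pF}

so one centimeter is about 1.11 pF, or roughly 9 \times 10^{11} cm to the farad.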


I have undergrad degrees in EE and physics. There are barely any similarities besides a class or two in electrodynamics.

I even went a step further with how involved with physics I was during the EE degree and specialized in RF; hardly any of that was covered in my physics degree.


I think it’s eye opening that physicists cannot explain how a computer operates physically lol


> but there is a humongous overlap between your typical undergraduate physics and electrical engineering degree

As someone who has a degree in both: Hard disagree. The physics curriculum had one course on circuits/electronics combined, and they covered almost nothing practical when it came to tools like oscilloscopes, etc. Few physicists I know have heard of "3db point". No statistics in the physics curriculum (no, quantum mechanics and stat mechanics don't count). Absolutely nothing with regards to digital, communications, or control theory.

The only real overlap was math, EM and semiconductors. Most people who get an EE degree are not targeting that world.


I'm a physicist too, and learned electronics on my own. I don't have an engineering job title, but have done fairly extensive design work. In my workplace, I'm the go-to person for anything analog and quantitative, such as figuring out a noise budget for a measurement system, as well as for figuring out how to prove that it actually works. Horowitz and Hill had a chapter entitled "digital meets analog," and there should be another chapter, "analog meets physics."

Like others have said, there's a lot of overlap, especially for experimental physicists, which is what I studied. The stuff that makes studying engineering hard for a lot of students is the math and physics.

There are subjects that we don't learn in physics, such as control theory. Yes, that's worth learning. I defer to engineers for really hard feedback control problems. My approach, instead, is to design the hardware so its physical characteristics make the control problem easy. That's not always possible.

One reason why we can find a way to fit in is the huge diversity within engineering itself, which leaves some niches that look a lot like what physicists do. When I taught in an engineering department for one semester, the professors always had their latest papers posted outside their office doors, and I noticed that one prof seemed to publish everything in Physical Review.

Out in the work world, a lot of people with engineering job titles don't really do engineering: They can be quite busy and productive, and rewarded, for basically arranging things, fitting things together, troubleshooting, dealing with vendors, and so forth. In fact, they can get so busy at that stuff that they forget their math and theory, leaving the physicist as the go-to "math person" when a quantitative problem needs to be solved.

Then there are what I call the real engineers, for whom the engineering skill is accompanied by an attitude and discipline about making things safe, reliable, maintainable, and traceable to documented and published information. These are the ones who won't accept a measured value, but need to see it guaranteed on a data sheet. I'm not that kind of engineer, and I admit it. And we definitely need that kind of engineer for systems that potentially involve public safety or massive economic liability.


> a lot of people with engineering job titles don't really do engineering: They can be quite busy and productive, and rewarded, for basically arranging things, fitting things together, troubleshooting, dealing with vendors, and so forth.

In my work, the "manufacturing engineer" title goes to people who work all day (and at a hard pace) doing nothing other than working in the PLM system, orchestrating ECO bureaucracy, and BOM work. To them, the actual products are nothing more than a collection of part numbers and rules applied in a cumbersome framework. I almost feel sorry for them. The sad thing is, there's an increasing population of these types, along with product/project managers and supply-chain specialists, while at the same time a decrease in engineers and techs.

I also have a physics educational background and make my living doing a weird mix of EE, software, and failure analysis work. I love my job, I see myself as a kind of general purpose problem-solver. Unfortunately actual hands-on technical generalists, IMHO, are in a downward spiral these days as far as status within large organizations goes.

The OP, I hope, is aware of this. He might be happier specializing in his interests and teaming up with other specialists who focus on EE.


Something I keep thinking about is that 100 years ago we had a huge cadre of workers called "clerks," whose job was basically to gather, organize, and transfer information. You'd think those people would be replaced by computers, but there's always a bit of complexity in each transaction that needs the human touch: Does this ECO make sense, for instance.

Outside of engineering, a lot of people with "manager" titles are similarly engaged. Their supervisory work, while important, is about 4 hours of work per week. The rest of the time is spent on tasks assigned to them, such as creating a new process for replenishing the hand sanitizer, or approving documents.

It's just that we believe that by now we should have eliminated clerks, so to make ourselves seem modern, we re-title them engineers and managers.


they are titled engineer if that's what their diploma says, which is indeed not that rare, and maybe pairs well with software engineers fresh out of college who fail fizzbuzz (as mentioned before in this thread), precisely because so much of software engineering is gluing packages together.

I'm not saying that's a bad development. It's just what it is, and probably follows a smooth bell curve distribution of expertise. The hard stuff is just, like, really hard (as is English!)


I think it's an inevitable outgrowth of complexity. If the number of pieces grows by O(n), then interactions between pieces grows by O(n^2). It doesn't take much complexity before gluing pieces together becomes the dominant activity in an enterprise.


> The sad thing is, there's an increasing population of these types, along with product/project managers and supply-chain specialists, while at the same time a decrease in engineers and techs.

This makes total sense to me. The bulk of time I’ve spent on many projects goes into supply chain management and factory coordination. I can easily see how the work of one engineer can keep 10 people like this busy full time.


I can't recommend The Art of Electronics highly enough - it's way more intuitive and informative than any other resource on the subject I've found. It's exceptionally great when paired with the self-paced lab manual Learning the Art of Electronics[0].

[0] https://www.digikey.com/en/resources/edu/harvard-lab-kit


I'll second this; despite the first edition being written over 40 years ago, it's still one of the best books I've seen on the subject.

The more recent versions bring it up to date well, it's a dense book but one I find myself coming back to more than any other for the incredible depth of practical engineering knowledge.


Unfortunately these kits are unavailable from either digikey or mouser, because they include various obsolete parts which are no longer sold.

Here's the website for the book https://learningtheartofelectronics.com


It definitely depends on what you want to do. If you really want to design RF or analog circuits, having a mastery of undergrad EE signals and systems courses would be helpful. But if you only want to make digital logic work, you only need some basic knowledge of circuitry.

EE is a vast field that encompasses everything from high power transmission to designing semiconductors. Even full course work from undergrad to PhD in EE is going to be fairly specialized.

All that being said, I agree that if you just want to learn how to build a catalog of reasonably simple circuits, learning academic EE is a waste of time.


> analog circuits

Can you expand on how analog electronics benefits in particular from a formal EE education? I build analog circuits (amplifiers, filters, power supplies mostly) very frequently in my job as a physicist. We have to care about noise so I've picked up a knowledge of how to deal with it in analog circuits. Is there some other area of analog electronics that "hackers" like me might not get exposed to, compared to an EE undergrad? I'm thinking of moving into EE and would like to work out the gaps in my knowledge. I also ask because I can see obvious reasons why your other example - RF electronics - would benefit from formal training but none for analog electronics.


Physics and EE have a lot of overlap and exposure to analog circuits in undergrad is pretty shallow. There will typically be a class that covers RLC and switched circuits in time and frequency domain and the basic uses of amplifiers and filters. After that it's theory applied from Fourier analysis and control theory and then they'll have a class on semiconductor physics and another that covers basic amplifier design. These days a lot of focus is on integrated chip design instead of board design. As far as most board level design these days, someone coming from a physics background will pick it up just fine on the job or in the lab as long as you have solid fundamentals.

More advanced stuff that you probably lack vs a practicing EE or an EE graduate education is going to be edge cases, advanced stability analysis, translinear logic and exposure to all the different types of component design. There are tons of different types of say amplifiers used in specific applications whereas most people working in a lab just slap opAmps on everything. A lot of advanced analog design is just applied control theory. Also keep in mind that these days Digital, RF, and Analog all blur a lot in a cutting edge design environment.

Quick Edit: A lot of the more traditional EE design companies will consider someone with a physics degree to be equivalent to someone with an EE degree unless they are looking for a very specific niche.


Thanks for the reply! Happy to hear they might consider my physics background. Stability is another thing I have to deal with a lot.


Probably not. I'm thinking of things like Laplace transforms and their relationship to differential equations, and stability analysis for things like amp feedback. Most of that you should have gotten academically in physics, just with a more specialized application when applying it to how you model an inductor or capacitor for instance.

But if you can design an amplifier or power supply, you probably already understand how to think of all the basic circuit elements and write down a differential equation modeling the circuit behavior.
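
The amp-feedback stability analysis mentioned above, for concreteness (the standard closed-loop result): with forward gain A(s) and feedback fraction \beta(s),

    A_{\mathrm{CL}}(s) = \frac{A(s)}{1 + A(s)\beta(s)}

and the amplifier rings or oscillates as the loop gain A\beta approaches -1, i.e. unity magnitude at 180 degrees of phase. Phase margin is just the distance from that point.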

With respect to RF, it's also a large field. In lower frequency regimes you can model everything as a lumped circuit element. As you get into higher microwave frequencies, you start needing to worry about modeling things as a distributed circuit. If you are focusing on things like antennas then you need to know more about electromagnetics. These days practicing engineers dealing with things like antennas and feedlines typically model them with computers. In some ways RF analog circuitry is disappearing as ADCs and associated digital circuitry are becoming advanced enough to swallow large bandwidth signals.
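
The lumped-vs-distributed crossover follows a simple rule of thumb (a common guideline, not a hard law): compare feature size to wavelength,

    \lambda = \frac{c}{f} = \frac{3 \times 10^8\ \mathrm{m/s}}{1\ \mathrm{GHz}} = 30\ \mathrm{cm}

so at 1 GHz anything beyond roughly \lambda/10 = 3 cm, a modest PCB trace, already behaves like a transmission line rather than a lumped element.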

In many ways EE is pretty close to "applied physics", just focusing more on emag and less on mechanics.


I don't tend to dig down into equations and just use the simple heuristics I've built up from examining other designs. I guess with a bit of practice I could do it though. Thanks for the info!


That's a good point. The project-first approach worked very well for me when learning new software frameworks/libs/etc.

My main concern with EE is that once I get to the brain-computer interfaces, I'll be in a situation where there aren't many off-the-shelf components/solutions available, and at the same time I'll likely need to know how I can push the physics closer to the edge. I suspect I may need a better theoretical foundation to do that.

That said, I definitely like the idea of focusing a lot on hands-on projects.


You are not going to start anywhere near the area where you have to go "full custom". That is, you won't be spinning custom ICs, you'll be assembling custom PCBs from off-the-shelf components. Possibly expensive ones. But that $1000 OTS part is an insane bargain compared to any chip fabbed just for you.

First make your thing do something, anything at all. Second, make it do something useful. Third, make it do the right thing, the thing you need, your goal from the beginning. Only then should you optimize it, making it smaller or cheaper or lower power or prettier or.... This is the road to success in the "R" phase of R&D.


I'm interested in how you found the transition from physics to EE. Did you target any particular "entry" jobs in the industry that were more genial towards physicists for instance? I'm a physicist with some electronics experience and I've got the idea to move into some EE-related field one day.

And +1 for Art of Electronics, it's the bible for physicists working with electronics.


Also, there are big differences within the field of just something like analog signal processing, which will be very important for this project. The textbook linked (Sedra&Smith) is great if one plans to pursue analog in VLSI but not so great if one wants to do it on a PCB (I think Franco’s amplifier text is way better here).

Brain-computer is going to be very tough, even ignoring all the safety and physically wiring into a brain stuff. The signals are irregular, weak and fast which makes them very difficult (but not impossible obviously) to measure.


I think it's way more about what suits the learner than what suits the subject.

I studied EE, and got on better with the more theoretical textbooks than I did with practicals ('huh, ok.. why?!').


>or you can dive in to projects and refer to textbooks when you get stuck

You don't know what you don't know. It's better to at least read all of the coursework, even quickly and without full understanding, than to go in green and make basic "I didn't know that existed" mistakes.


I used to build BMI systems in graduate school, from the lowest level (mixed-signal analog design for 70 uV extracellular signals) to DSP (128 DSPs doing real time analysis) to the network (built my own ethernet MAC, foolishly!) to all the vis and RT-linux-based analysis. I left the area and switched into ML in grad school, but if I had to do it all over again there's one thing I think is missing:

Optics. Optics optics optics.

A tremendous amount of neural interfacing, especially in non-human primates and other organisms, is done via optics. ~All the advances in neural data acquisition over the past decade have been optical. Microscopy is the future for a tremendous amount of neuroscience and more and more people are considering it seriously for human-scale BMI.

I know optics isn't always thought of in an EE context, but it should be! Many people doing amazing computational imaging and optics work are in EE departments. Computational imaging is the new hotness and can let you combine your existing CS skills with signal processing and optics to do things like build a lensless camera! https://waller-lab.github.io/DiffuserCam/

If I were you I would ditch the RF part of your plan and study optics. Yeah, it's all EM, but the order-of-magnitude differences in the frequencies involved makes the underlying engineering quite different.


OP I highly recommend you listen to this guy here, among all the replies here they have the most experience with what you are aiming for.


> Optics. Optics optics optics.

Maxwell would have agreed.


OP here - Thanks! I'll put back the optics courses I deleted then


Bit late to the party, but this person is very much correct. I also do BMI work and optics is undoubtedly the future for neuroscience. The optics book you want to get is Hecht:

https://www.amazon.com/Optics-5th-Eugene-Hecht/dp/0133977226

That's a good intro into real optics.

It's much more than the optics chapter you'll get in a physics textbook. It goes over the classical ray optics in good detail, does a great job with traditional matrices and that formulation of optics (the one that the design programs like Zemax use), goes well into the real meat-n-potatoes of wave optics (including birefringence, a huge part of biological optics), gives you a good accounting of how lenses and other optical devices are actually Fourier transformers, and also dives into the more esoteric optical devices (a must for practical neuro-optics).

It's an upper-division/graduate level book, fyi. So I'd back-load it in your study course. Though in terms of neuro-optics it's more of a keyhole book.

If you are particularly interested and really want to know what's actually going on with EM, then you need to go through Jackson:

https://www.amazon.com/Classical-Electrodynamics-Third-David...

This is the book on EM, but it is very much at the physics graduate student level. And honestly, I don't think you'd need it for BMI stuff. But if you don't go through it, you'll just be trusting other people when they say your ideas won't work and they can't really explain why to you. Just going through Jackson is a bit of a hazing experience and will earn respect.


Feel free to email / dm me (HN and twitter handle are the same) if you ever have any questions.


> Digital Control Systems. Do I need this?

Yes, you do.

"Control" is another name for "optimisation" or "systems with feedback".

It is the theory covering any system that has a closed loop in it. Optimisation is a mindbogglingly broad field with application to nearly everything in the physical world. Other branches of engineering, science and maths study this area but give it their own name.

Examples of systems with cycles:

* Any system that does optimisation: Deep learning, adaptive systems, ...

* Error control decoders in digital communications systems.

* The majority of non-trivial circuits.

* Pretty well every circuit operating at high frequencies.

* Echo cancellers in telecoms.

* Computer networks (eg. TCP congestion control)

* Systems of chemical reactions

* The brain (your area of interest) is a seething mass of feedback paths.

Optimisation (a.k.a. Control Theory) and Information Theory (some of which is covered under the name Communications Theory) are fundamentals. "Digital" in their title doesn't mean they have narrow application, as Information Theory (Shannon, ...) treats everything, including analogue, in terms of bits.
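
To make the closed loop concrete, here is a minimal sketch (hypothetical plant and untuned gains, nothing from the article): a discrete PI controller driving a first-order plant to a setpoint, the same structure that reappears in echo cancellers, TCP congestion control, and gradient-style optimisation.

    # PI controller on the first-order plant tau*dy/dt = K*u - y.
    dt, tau, K = 0.01, 0.5, 2.0         # timestep, plant time constant, plant gain
    kp, ki = 1.5, 4.0                   # controller gains (assumed, untuned)
    y, integral, setpoint = 0.0, 0.0, 1.0
    for _ in range(500):                # 5 seconds of simulated time
        error = setpoint - y
        integral += error * dt
        u = kp * error + ki * integral  # control law: u = kp*e + ki*integral(e)
        y += dt * (K * u - y) / tau     # one forward-Euler step of the plant
    print(round(y, 3))                  # settles at 1.0; with ki = 0 it would sit at 0.75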

Given your background in maths, one of the first things you should do is to try to construct a "Rosetta Stone" to relate a complete list of Electrical Engineering topics back to what you already know. For example, you will have already done a lot of control theory, but have learned it as optimisation. Part of your task is to recast your existing knowledge in terms of EE jargon, identify the gaps, then fill them in. Unlike an undergraduate you're not starting from the bottom.

Edit:

A suggestion: Along with the list of EE areas you want to learn, why not add to your post a list of all the areas you already know in maths? HN readers may be able to link the areas that you want to learn with what you already know. It's hard to make such links yourself, as you don't yet know what each EE topic contains.


I don't see any mathematics or physics books. During the first two years of our 5-year ECE degree we mainly did theoretical courses. Some examples: single-variable analysis, multivariable analysis, differential equations, numerical analysis, algebra, mechanics, electromagnetism, waves.

All these were full semester courses. These courses were actually needed if somebody wanted to properly understand the whole theory of electrical engineering (signals, em transmission, antennas, microwaves, optical fibers, theory of electronics, electrical machines, electric power systems, etc).

Depending on which subject you want to focus on, you may not need all this mathematics and physics, but you will definitely need some theoretical knowledge to actually understand it!


This gentleman has an applied mathematics degree, so he lacks only physics. To work on that brain-computer interface, this gentleman needs basic chemistry and basic biology. For interaction on that interface, a solid understanding of analog electronics is needed. These topics aren't trivial; I wish him good luck and strong motivation during this few-year-long journey.


I took up EE professionally at around age 38. I learned really quickly, and am currently quite productive. But there is a catch: I learned the basics when I was a kid, and got into ham radio. A basic knowledge and 'feeling' for what's going on in the electromagnetic space meant that 25 years later I could learn at a rapid pace.

So of course you can't go back in time and replicate that, but you can do this: get SPICE, or better a real lab, and start messing around with the absolute basics until you can dream in it, like you eventually do when learning a second language.

Without that primal understanding all the advanced stuff will just be rote memory learning.

It will feel slow and you sound like someone who wants to move fast. But getting a feel for voltage and current and their basic interaction in a hands on way will set you up to absorb the more advanced stuff like a sponge.

Just because you 'know' ohms law etc doesn't mean you can 'think' in ohms law like you do when you speak your first language.

So go deep on the basics. Give it lots of boring hours. Then start getting into the more esoteric stuff.

Enjoy the trip. It's been a wonderful one for me and I hope you get the same joy.


I didn't get a degree in EE although I've done quite a bit of it. I really liked The Art of Electronics by Horowitz and Hill, which has an associated lab book you could use for little projects. It covers a lot of real world issues that are missing from the more ideal academic books.

Also, I'd mention that the use of cgs in Purcell can be a bit annoying as you move on (it's very physics based) since most constants (permittivity, dielectrics, etc) are usually in mks instead. Those are used in the EE books.

One thing you will definitely want to learn is SPICE for simulation (any real job will probably be using Spectre or something built into your tool set), and luckily there are quite a few free ones. I'd recommend LTSpice for simple projects. Similarly there are "free" tools for building and testing FPGAs for the digital simulation side.
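
Under the hood, a SPICE transient analysis amounts to stepping the circuit's differential equations through time. A toy illustration of the idea (forward Euler on an RC charging circuit; real simulators use implicit integration and Newton iteration, so this is only a sketch):

    # Toy ".tran": integrate an RC charging circuit step by step.
    R, C, Vin = 1e3, 1e-6, 5.0   # 1 kOhm, 1 uF, 5 V step (assumed values)
    dt, v = 1e-6, 0.0            # 1 us timestep, cap starts discharged
    for n in range(5000):        # simulate 5 ms = 5 time constants
        i = (Vin - v) / R        # current through the resistor
        v += dt * i / C          # C*dv/dt = i, forward Euler
        if n == 999:
            print(round(v, 2))   # ~3.16 V after one time constant (1 ms)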


There are newer versions of Purcell.


YMMV, but I tried to self-study out of "Art of Electronics" and I ended up giving up because I didn't feel like I was getting the fundamental basic circuit analysis skills that I needed to actually comprehend everything they were doing. The difficulty level ramped up very quickly, at least for me. I suspect that it might be better as a reference for a project than a self-study textbook.

I ended up reading through "Foundations of Analog and Digital Electronic Circuits" and was quite happy with it. Though I believe there are other textbooks that are more commonly used for learning basic circuit analysis.

And I am going through Oppenheim & Willsky now and have no complaints. The last chapter is on linear feedback systems so you'll get a bit of the control theory background there.

Not sure what I want to read next... maybe digital signal processing or control theory. I have no real goal in mind here (outside of an interest in RF), just reading for fun.


> YMMV, but I tried to self-study out of "Art of Electronics" and I ended up giving up because I didn't feel like I was getting the fundamental basic circuit analysis skills that I needed to actually comprehend everything they were doing. The difficulty level ramped up very quickly, at least for me.

Same here. "Practical electronics for inventors" ended up being way more approachable.


I've got some good free resource recommendations if you want:

https://wiki.analog.com/university/courses/electronics/text/... - An excellent free "Intro to Electronics" course from Analog. In fact, all of the courses on their website are pretty good, I would say comparable to what you would get at a university minus the TA support when things don't work. But /r/ECE or Stack Overflow can probably help if you ever end up really stuck.

https://www.analog.com/en/education/education-library/softwa... - Another from Analog, it's a great resource for learning SDR. It assumes you're coming from an EE background though, so it would be helpful to do the fundamentals first.

http://freerangefactory.org/pdf/df344hdh4h8kjfh3500ft2/free_... - Free Range VHDL is what my FPGA class used, and it's free!

I would also suggest playing with some ECAD software like DipTrace or KiCAD. It's generally not part of a normal EE curriculum, but really should be! Being able to draw a schematic or lay out a circuit board will be useful if you have any advanced projects you want to try at home. Especially with cheap fabs like OshPark it's a good skill to have.


Hi, I have degrees in EE and Physics. It's good that you want to get a well-rounded education, but I think focusing on E&M and circuit design will probably pay the most dividends.

Purcell is a physics book, but I think with your math background it might be fine? From there I'd suggest Griffiths E&M, as far as setting up more complicated problems goes. I don't really like the EE-oriented E&M books, but if you need some of the "calculate this value" style of problem maybe you'd want to take a look at them.

Circuit design is kind of unsatisfying these days since on the professional side there's a lot of throwing stuff in the simulator, especially with IC design. I'm an advocate for more hands-on stuff. For the absolute basics I feel there's no substitute for getting some LEDs, resistors, breadboard, and multimeter, and doing some kid level projects. Then there's audio projects, and RF projects, since once you've learned the textbook fundamentals of amplifiers, there's no substitute for building some. Pozar and the ARRL RF project book will take you a long way, though you'll have to buy some test equipment...

But honestly, do you really want to get distracted from your main focus? You may have lost interest by the time you're done with the curriculum. There's a lot you can get done by forging ahead and just learning what you need to as you go along. Why learn amplifier design when the industry is all too happy to sell you a black box gain block? Why learn digital design when microcontrollers are getting faster and cheaper all the time? ;)


Electrical engineering is very hands-on. Reading the textbooks should not be #1 on your list. You do need a solid base, but once you have it, you'll get more from building and testing circuits than from reading more textbooks.

I recommend "The Art of Electronics" by Horowitz and Hill. It strikes the right balance between theory and practice. You will need to dig deeper in some theoretical areas later, but this will give you a very good starting point.


@Iouri I have two links for you that might help you on your quest to learn EE:

1/ A complete set of tutorials on computational neuroscience as Jupyter notebooks: https://github.com/NeuromatchAcademy/course-content/tree/mas... This is the material from last year; I think they will be running a summer school again this year so you might be able to join and learn as part of a group.

2/ If you need a review of linear algebra, you can check out my book No Bullshit Guide to Linear Algebra. In particular the Applications chapter contains a summary of everything I used most often from back in my EE days (Fourier transforms, circuits, least-squares, etc.) See a preview of the book here: https://minireference.com/static/excerpts/noBSLA_v2_preview.... (note it's not a free book, but not expensive either)


Some of your reference materials are overkill and better suited to guided classes than self-study.

For basic circuits and electronics, I'd recommend Electronics with Professor Fiore. It's comprehensive with free lecture videos, textbooks, and lab manuals.

https://www2.mvcc.edu//users/faculty/jfiore/index.cfm

For digital, Introduction to Logic Circuits & Logic Design with Verilog by LaMeres (not free, sorry).

https://www.montana.edu/blameres/

For electromagnetism, Electromagnetics Volume 1 by Ellingson is free.

https://vtechworks.lib.vt.edu/handle/10919/84164

I don't have a signals & systems reference that I actually like and would recommend to anyone, unfortunately.

That's pretty much the core of EE. Everything else is a specialization.


My main concern is that you won't get anywhere near enough lab time. It is a huge part of EE education; I honestly probably spent 30% of my time working on lab projects.

More importantly - many of those lab projects require:

1. Expensive equipment. Spectrum analyzers, big motors/transformers/generators, etc.

2. Bespoke/boutique setups for things like automation projects that simulate real-life processes. Schools spend a lot of money on these things, and have in-house engineers that perform maintenance, updates, and repairs on them. These are normally not things you can just build over the weekend and then use for self-learning. Hell - in many cases, Bachelor's thesis projects consist of building stuff like that, and then validating the generated data (measurements, etc.) against theory.

Sure - one can simulate A LOT of things today, but there are things you need to work on with your hands, in order to learn something useful.


In phase 2, for analog circuits and DSP, you should brush up on discrete math and calculus, and most importantly learn Laplace transforms. That's the only thing I see missing.


OP here. Nice, thanks! I didn't put math down because my undergrad was in applied math and I got to use it a lot early in my career. But sounds like I'll definitely need a refresher.


I’m kind of self-trained in EE. It is important to have basic courses in Signals & Systems and logic design. The former, in particular, is not something that you are likely to acquire on your own. You and I share a love for Purcell. However, it has little to do with EE. It is a physics book, and EE is not physics. You will never use or need to even know that the magnetic field arises from a relativistic transformation of the linear charge density of a current. I have a recommendation: become an expert in LTSpice. Also, for about $2k, you can assemble a nice lab of a Chinese scope, power supply, multimeter, soldering station, and a selection of ICs, resistors, and caps. And if you are going digital, you need to learn VHDL.


I thought about doing this a while back, but I quickly gave up. I don't know if I'm just an idiot or something, but I quickly found myself way out of my league, unable to understand a lot of the foundational theory and physics. I also picked up a handful of textbooks in the particular domains I'm interested in (boy were those expensive) and those were even worse, filled to the brim with notation I'll never understand. Part of it is likely due to my extremely poor math education, in fairness.

I’ll add, I was somewhat surprised, given the explosion in MOOCs over the past few years, to find very few courses equivalent to introductory undergrad EE classes.


The core of EE is Fourier analysis and BCI is no exception but you should be strong there because you've studied applied math. Most of your target books are good after your HN update, but one thing you are missing that will probably make a big difference are some books on measurement science. You'll want to get some focused both on EE and biomed/biology because they will focus on different things and both are relevant to your interests. Eventually you'll also want to touch on some non-linear and statistical control which play into the implementation of the more modern cutting edge BCIs. DSP overlaps a lot with this but still has some uniqueness you'll need to learn to put things into practice.
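
As a flavor of what that Fourier work looks like in practice, a minimal sketch (made-up numbers, not from any real BCI pipeline): pulling a weak 10 Hz alpha-band tone out of noise three times its amplitude.

    import numpy as np

    fs, seconds = 250, 4                 # 250 Hz is a typical EEG sample rate
    t = np.arange(fs * seconds) / fs
    rng = np.random.default_rng(0)
    x = np.sin(2 * np.pi * 10 * t) + 3 * rng.standard_normal(t.size)
    mag = np.abs(np.fft.rfft(x))         # magnitude spectrum
    freqs = np.fft.rfftfreq(t.size, 1 / fs)
    peak = 1 + np.argmax(mag[1:])        # skip the DC bin
    print(freqs[peak])                   # ~10.0 Hz, despite 3x noise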

Just an FYI, a lot of BCI companies are running stuff like repurposed audio analyzers ala the U8903B for lab work and bench testing their designs. Parallelism is the name of the game, and analog performance requirements aren't super strict so you won't be designing custom ICs any time soon unless you want to work on the probe interfaces themselves (which are more MEMS than circuit design but need a little of both).

Something like Medical Instrumentation: Application and Design by Webster is a great place for a beginner who wants to toy with human interfacing circuits. Back it up with something like The Art of Electronics and that will get you to professional lab tech territory.


I've wanted to do the same for a while now, but I'm unable to find a curriculum that lists the textbook for each subject.

It looks like the article author is just "guessing" textbooks. I'm browsing the U. Waterloo curriculum and it doesn't specify any books.

The article is still useful, don't get me wrong, but I would love to see a list of the textbooks that are actually used at a university program. I've Googled it many times and I only find the names of the courses.


About 1/2 of the textbooks came from the old course outlines (e.g. https://ece.uwaterloo.ca/~ece207/) and class shopping lists on UWaterloo book store. About 1/4 were guesses, and the other 1/4 I don't know where to even start looking.

The higher the course level, the lower the confidence of having the right book.


For Integrated Analog Electronics I'd suggest "Design of Analog CMOS Integrated Circuits" by Razavi instead of "Analog Integrated Circuit Design" by Carusone, Johns, Martin.

The plan looks quite complete, similar to the list of courses I did in university. I remember I also did a power electronics course which I didn't see in your list.

Fabrication of a chip is not really feasible to do at home. The chemicals you might be able to get, but not the equipment.


Onur Mutlu's Digital Design & Computer Architecture course at ETH Zurich seems to be quite well regarded, and all the lectures & materials are openly available online. It uses Harris & Harris as a textbook, IIRC.

https://people.inf.ethz.ch/omutlu/lecture-videos.html


Why don't you go to Amazon and buy some books on brain-computer interfaces, start reading them and when you get stuck read the relevant electrical engineering information?

https://www.amazon.com/s?k=brain-computer+interfaces&i=strip...


I would suggest leveraging MIT's OpenCourseWare [1]. You can filter for courses that have lecture videos, notes, etc. These courses are usually very well organized and taught by some of the best professors in the world.

[1] https://ocw.mit.edu/


Electrical engineer turned software engineer, here.

The Art of Electronics by Horowitz and Hill has a permanent place on my desk. It is quite simply the bible of electronics engineering, the EE analogue of the famed Machinery’s Handbook.

I also recommend Signals and Systems by Oppenheim for any aspiring EE.


Don't try to do too much at once; you won't get anywhere. You probably got inspired by the recent growth of interest in BCIs, right? Please realize how complicated that subject is - it dwarfs AI and its closest applications, like self-driving, by one or two orders of magnitude, depending on who you talk to.

Assume that workable consumer BCIs will come within ~2 decades and focus on only a small part of it, that's the only way you can contribute meaningfully.

> I am almost certainly missing something important, but I don’t know what.

You will know once you start. Don't plan too much - pick a realistic goal and just start. Build a clock. Program a microcontroller. Log your heartbeat. Measure your brainwaves with OpenBCI. Build a feedback loop of some kind. Get a feeling for it.


Depending on the level of electronics background, I've had great luck getting people to fall in love with the subject using books authored by Forrest M. Mims III.

Specifically with the "Getting Started in Electronics" and his "Engineer's Mini-Notebook" series.


For anyone that's interested: OSSU

Open Source Society University

Path to a free self-taught education in Computer Science https://github.com/ossu/computer-science also bioinformatics and data-science


Before any of that, read the book "Div, Grad, Curl, and All That: An Informal Text on Vector Calculus" by H. M. Schey. It will make math of EE, particularly the interaction between electricity and magnetism (Maxwell's equations) a lot easier to understand.
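
For reference, the payoff of that vector calculus is being able to read Maxwell's equations directly (SI units, differential form):

    \nabla \cdot \mathbf{E} = \rho/\varepsilon_0 \qquad
    \nabla \cdot \mathbf{B} = 0 \qquad
    \nabla \times \mathbf{E} = -\partial \mathbf{B}/\partial t \qquad
    \nabla \times \mathbf{B} = \mu_0 \mathbf{J} + \mu_0 \varepsilon_0\, \partial \mathbf{E}/\partial t

Every div and curl in Schey's title appears here, which is exactly why the book is such good preparation.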


>Once I’m done, then what? At the present, I don’t have a clear picture of how to transition from studying to working with brain-computer interfaces.

Buy some BCIs and reverse engineer them, possibly. Maybe try to improve them. You might want to reach out to the authors of the papers you've been reading for advice. Neuralink put a BCI in a pig, so try figuring out how they did it, and maybe they'll give you a job? Even Elon's pitch for recruitment during that presentation was "we don't know much about the brain anyway"; they mostly just want you to have solved hard problems. There likely won't be a straightforward path since this stuff isn't commercialized yet.


Given that you already have a job and a family to support (at some point?), I would say that this is realistically a 15+ year plan that you've built yourself.

I also had a hardware interest later in my career, and my approach was slightly different. I found an embedded systems job that pays about 3 times less than what I used to get paid (since I have no experience). It is definitely fun and I learn a lot, but I definitely don't have the financial freedom that I used to have. I'm not sure which is the correct approach, but surely there is no "easy" way of getting there.

Please have in mind that this is a very serious time (and financial in my case) commitment that you are about to make.


35 years since I graduated. A little surprised to find that many of the books are the same.


Noticeably absent from the curriculum is anything to do with gate-level electronic design, which makes up the vast bulk of 2021 electronics.

I sometimes wonder how much of the practical and theoretical know-how related to designing modern cutting-edge silicon is actually buried in the brains of private-sector workers, and how much of it makes it back to academia.


Radio - Get an RTL-SDR dongle kit, and install GNU radio. Being able to get signals from the real world, and manipulate them via a flowgraph lets you do far more hands on than years of labs used to.

I've implemented all the standard things, AM, FM, SSB radios, etc.. I had a lot of fun figuring out how to decode and display the local VOR beacon near my house.

You can also play with audio frequencies, and your microphone and speakers... it's fairly easy to get an intuitive idea of what a negative frequency really means if you have an IQ channel.
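
The negative-frequency point is easy to see numerically; a minimal sketch with numpy (standing in for a GNU Radio flowgraph):

    import numpy as np

    fs = 1_000
    t = np.arange(fs) / fs                     # 1 second of samples
    x = np.exp(-2j * np.pi * 100 * t)          # complex (IQ) tone at -100 Hz
    spectrum = np.fft.fftshift(np.fft.fft(x))
    freqs = np.fft.fftshift(np.fft.fftfreq(fs, 1 / fs))
    print(freqs[np.argmax(np.abs(spectrum))])  # -100.0: a real peak below 0 Hz

With a single real channel the spectrum is mirror-symmetric and the sign is unrecoverable; with both I and Q the two halves are independent, which is why the SDR hands you complex samples.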


Neat idea. I have been wanting to do something like this for a long time to really understand Wireless RF.

But I find GNU Radio somewhat intimidating (not having a signals/DSP background). Are there any books/articles/videos etc. that will ease my learning curve? Note that I already know of Michael Ossmann's course with HackRF (https://greatscottgadgets.com/sdr/)


Everything in GNU Radio is a flowgraph (a fancy flowchart; technically, a directed acyclic graph).

The thing is you can take an existing flowgraph, modify it, and see what happens in about 30 seconds.

This video seems to be a good starting point: https://www.youtube.com/watch?v=ufxBX_uNCa0
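It may also demystify things to know that a flowgraph is just a small Python program underneath - the GRC GUI generates code of roughly this shape. A minimal sketch, assuming a standard GNU Radio install with working audio (block names are from the gnuradio Python API):

    from gnuradio import gr, analog, audio

    class Tone(gr.top_block):
        def __init__(self):
            gr.top_block.__init__(self, "tone")
            samp_rate = 48000
            # 440 Hz sine source wired straight to the sound card
            src = analog.sig_source_f(samp_rate, analog.GR_SIN_WAVE, 440, 0.3)
            snk = audio.sink(samp_rate)
            self.connect(src, snk)   # the "edges" of the flowgraph

    tb = Tone()
    tb.start()
    input("Playing 440 Hz; press Enter to stop...")
    tb.stop()
    tb.wait()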


Nicely written. For certain.

Would it be cruel to suggest that you might want to advance a bit more before weighing in?

I'd say that semiconductor physics, real math, control systems, real mixed-signal work, and a couple of other topics should get a go... but my eldest child didn't get much past this, so maybe that's the state of the art today?

Again - I mean no cruelty in my comments, but it seems as if modern curricula are not teaching a person what they need to know to go into any related industry job...

(And I could be wrong - as I often am)


> not teaching a person what a person needs to know to go into any related industry job

I'm not sure they ever did. They should imho be teaching the ability to learn and adapt to changing and emerging technologies, and to think critically. I'm still using the mathematics I learned in college, to understand things that didn't exist back then such as elliptic curve cryptography.


Well, I don't know how old you are, but ECC has been taught in college for a long time - though it may not have been so useful to a lot of engineers building machines at the time.


You might find the following books (in addition to those listed by others here) helpful;

Practical Electrical Engineering by Makarov et al.

Electronic Circuits: Handbook for Design and Application by Tietze, Schenk et al.

Sensors and Signal Conditioning by Ramon Pallas-Areny et al.

Introduction to Embedded Systems: Using Microcontrollers and the MSP430 by Jimenez et al.

Patterns for Time-Triggered Embedded Systems by Michael Pont.


I would spend more time understanding your motivation for brain-computer interfaces. It seems more like a research project with some applications to neurodegenerative diseases, but to actually get anything done you'll need to master several fundamental texts, which takes most people years.

Spending just a week or two talking to all the experts could save a lot of wasted effort.


And why not build a team with professionals in each field? Create a cooperative.

I am currently investigating more efficient forms of study (which would imply creating a new language) for compressing academic text (which is often very long and not very accessible to inexperienced people).

If anyone is interested in studying with me, my email is fabricioteran06@gmail.com (I speak Spanish).


The most interesting EE class I took was on MEMS, NOEMS, and MOEMS; I highly recommend you learn how that stuff works, especially if you're going to work on human/brain interfaces. Then there's also the biomedical engineering stuff, i.e. learning about strength of materials, microfluidics, etc.


I would suggest a dive into neural networks from the bottom up. Getting the electrical interface right is the physical part. For a real brain interface, though, the software side of the interface is probably going to look like a neural network.


I would suggest adding Communication Systems by John Proakis. It is the seminal text for digital comms and should be included, given the plan's foray into SDRs.


Since the Raspberry Pi came out, I've read about electronics, and one thing that surprised me is that physics and electricity are only one side of it; you may also need electrochemistry, materials science, mechanics... it's a wide field.


How much time does this person have to devote to this?


I think a lot depends on what your goal is. Is it to better understand the methods sections in the papers you are reading, or do you want to do some experimental work, or do you have a job you are going after, or...

This is extremely broad and ambitious. Younger me would have said go for it as I loved to learn everything, but older me has forgotten much of the stuff that I so much loved to learn, so I moved to the camp of learning what you need.

Unfortunately I don't know too much about brain-computer interfaces, especially if it's cutting edge research.

At a high level, these are my recommendations:

The basic ideas about how circuits work are presented in any introductory book; the E&M book (Purcell) would mostly be useful for device physics and transmission lines, plus other RF topics (mostly EMI, crosstalk, and other things that can go wrong). Some purists might argue about which side of the equation an inductor voltage should be on, but it has zero practical effect. Also, this book is usually used for the second physics course in college, so you might have done that already and just need a refresher.

Similarly, unless you expect to be either developing novel devices or involved in fabricating existing devices in new nodes/conditions, you can skip anything about devices (types, structures, fabrication, materials, electron bands, doping concentrations, diffusion, drift, etc.); just the voltage/current behavior between pins should be plenty, and that is covered in any introductory book. The chemistry book is mostly irrelevant for EE, although in the neuroscience case it's more applicable if we are talking about invasive electrodes (but still, probably too general and broad).

Books on integrated circuits depend a bit on whether you need to learn about some other topics that are not usually presented on their own, such as fast amplifiers, mixers, oscillators, etc. in CMOS technology. I'd say, though, that RF/MW integrated circuit design differs considerably from discrete RF/MW work, so again most likely you'll get away with treating various parts as opaque building blocks connected by transmission lines. And I'm going to guess that for BCIs the frequencies involved are quite low, so this whole branch might be irrelevant.

Probably you'll need to learn the basics of data converters to digitize the brain signals, but again I'm not sure this warrants going through a course versus just the Wikipedia page and the datasheet of a specific part you want to use. As with the other things above, courses are usually designed for people making converters, not people using them.
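For a feel of what the data-converter basics boil down to, here's a toy model (my own sketch, with hypothetical parameters) of an ideal n-bit ADC; real parts add noise, nonlinearity, and anti-aliasing concerns on top of this:

    import numpy as np

    def adc(v, n_bits=12, vref=3.3):
        # Map voltages in [0, vref] to integer codes 0 .. 2^n - 1.
        levels = 2**n_bits - 1
        return np.round(np.clip(v, 0, vref) / vref * levels).astype(int)

    print(adc(np.array([0.0, 1.65, 3.3])))   # -> [   0 2048 4095]

Note that a ~100 uV EEG-scale signal spans only about a tenth of one code at 12 bits over 3.3 V, which is why BCI front-ends amplify before digitizing.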

Signals, systems, feedback, and control systems are very fundamental "mathy" engineering tools that apply to more than just EE, so they're probably good tools to have in general.
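As a flavor of what those courses formalize, here's a toy discrete-time feedback loop (entirely illustrative numbers of my own): a proportional controller driving a first-order plant toward a setpoint. It settles with a steady-state error - exactly the kind of behavior control theory teaches you to predict and remove (e.g. with an integral term):

    # Plant: y[k+1] = a*y[k] + b*u[k]; controller: u = kp * error
    r, y, kp = 1.0, 0.0, 0.5        # setpoint, initial output, gain
    a, b = 0.9, 0.1                 # plant dynamics
    for _ in range(200):
        u = kp * (r - y)            # proportional control law
        y = a * y + b * u
    print(round(y, 3))              # -> 0.333, not 1.0
    # Closed form: y_ss = b*kp*r / (1 - a + b*kp) = 0.05 / 0.15 = 1/3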

I see your questions about wireless systems. Again, as above: usually these books are designed for people wanting to develop these things professionally, and if you just want to communicate wirelessly it's mostly about learning the "API" that some chip exposes to do what you want. Not to mention the compliance nightmare of rolling your own if it's beyond a handful of prototypes.

I think you get the theme. Sadly, EE outside the companies making ICs has become very similar to software, where you are basically plumbing black boxes together. And if you don't have a standard application, expect lots of time spent figuring out hacks to use existing parts in non-standard ways, because if you can't find the perfect part, the barrier to rolling your own is much steeper than it is in software.

So in a way, The Art of Electronics is very applicable. Unfortunately, I think it's terrible to learn from unless you already know the stuff, and (unless it has been refreshed to the point of a major rewrite) the copy I have is so outdated that I never really recommend it to anyone; I haven't opened it in a decade.

Unfortunately I don't know of such a thing, but if anyone here knows of a course on the neuroscience side doing experimental work, you could see what its prerequisites are and go from there.

But if you are not like me and can still learn a lot of new things without forgetting too much, go for it all and live the dream!


As somebody who self-taught my way into electronics engineering on the job, I'd say self-learning is the hard way.

I want to underscore the *engineering* in electronics engineering. Engineering everywhere is very hands-on, and you cannot be an "engineer in theory only" if you want to perform on the job.

Learning from mistakes in a class setting is much easier, and *cheaper*, than casually failing a USD $1M design in a very simple way - a way not taught in any textbook.

Not to disparage you - I know many people who were similarly dragged into electronics engineering by necessity and got to the level of degreed engineers over many years. But those guys had years and years to perfect their skills, in a time when the industry was more forgiving and was growing along with their skill.

I would say that today nobody will hire an 18-year-old guy who is just an electronics hobbyist into a factory; that was not the case 10-12 years ago.

What I can say against modern electronics engineering education is that its excessive focus on producing "workplace ready" cadres makes for worse workers past the basic level.

I know people who are quite adept with digital electronics but can't understand how anything beyond textbook versions of SMPS power supplies works, because universities decided that analog circuits are not what people pay for anymore. The same goes for many other fields in electronics.

I believe a properly taught EE can figure out just about anything given the right approach and enough time, and that attitude is the best thing an education can give you - unlike mass-produced engineers who keep finding lame excuses: "I'm not a logic/power/high-speed/RF/motion-control/asynchronous-circuit/metrology/network/audio-video engineer! I didn't study this at school!"


I would recommend this cheap book to anyone interested in EE or just fooling around with circuits. It basically goes back to Ohm's law, but I guess there are EEs who graduate but forget the basics?

Electrical Engineering 101: Everything You Should Have Learned in School


I disagree; this book is too shallow and uneven. It is one of the books that I bought when I started on electrical/electronics self-study, and I was disappointed.


Autodidacts > Students. This is fantastic. I love it.


Study backend SWE, higher ROI.



