Do you have any suggestions for people who want to self-study in OR? I would like to know more in general but am specifically interested in applications to healthcare
My recent previous employment was in healthcare (on the insurance side) so I can give you some relevant insight
I do want to mention that on top of any self-studying, try to attend a talk or two, or start following some feeds online that are close to actual healthcare operations. Operational teams are the ones who have to figure out what to do even when there is no good answer, and the easiest way to keep up with which topics are most valuable right now is inside knowledge.
There is no purity in the OR field outside of PhDs doing their research - it is entirely about getting shit done efficiently, however possible, to the extent that the operations team can understand. That last part is a big catch. For example, if your 'solution' has interns making judgement calls on data entry (because moving the work upstream is efficient!), you are fucked if you assume that data will be accurate.
BUT there's obviously plenty of skillset stuff you can learn to help you in a general way, so here are some important areas:
1: Linear & Nonlinear programming (tool: AMPL)
2: Markov Chains (good knowledge)
3: Statistics & probability (necessary knowledge)
4: Simulation (start with Monte Carlo [it is easy and you will be surprised it has a name])
5: databases: SQL / JSON / NoSQL
6: data structures and algorithms (big O notation / complexity)
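To make #4 concrete: a Monte Carlo "model" can be as dumb as a for loop. Here's a sketch (all numbers made up) estimating the chance that more scheduled patients show up to a clinic than it can handle, assuming each patient shows independently:

```python
import random

def prob_over_capacity(n_scheduled, show_prob, capacity, trials=50_000, seed=1):
    """Monte Carlo estimate of P(shows > capacity), assuming each of
    n_scheduled patients shows up independently with probability show_prob."""
    rng = random.Random(seed)  # fixed seed so reruns are reproducible
    over = 0
    for _ in range(trials):
        shows = sum(rng.random() < show_prob for _ in range(n_scheduled))
        if shows > capacity:
            over += 1
    return over / trials

# e.g. 20 booked slots, 90% show rate, room for 18:
risk = prob_over_capacity(20, 0.90, 18)
```

Change `show_prob` or `capacity`, rerun, watch the estimate move - that loop-and-count pattern is the whole idea.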
OR work in general overlaps a lot with business analysis. The core stuff they teach you in school is listed above.
Healthcare right now has a big focus on Natural Language Processing and applying standardized codes to medical charts - and then working with those codes. The most common coding standard in the US is ICD-10, I believe.
Other than that, it is mostly solved logistics items - like inventory control systems - that need a qualified operator. You do not want your hospital running out of syringes. You do not want your supplier to be unable to fulfill your order of syringes because you need too many at once. You do not want to go over budget on syringes, because there are a hundred other items that need to be managed as well.
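The textbook starting point for that kind of inventory problem is a reorder point: reorder when on-hand plus on-order stock drops to your expected demand over the supplier's lead time, plus a safety buffer. A minimal sketch (the numbers in the usage line are hypothetical):

```python
def reorder_point(daily_demand, lead_time_days, safety_stock):
    """Stock level at which a new order should be placed."""
    return daily_demand * lead_time_days + safety_stock

def should_reorder(on_hand, on_order, daily_demand, lead_time_days, safety_stock):
    """True once current + incoming stock falls to the reorder point."""
    return on_hand + on_order <= reorder_point(daily_demand, lead_time_days, safety_stock)

# 40 syringes/day, 5-day lead time, 60 as a safety buffer -> reorder at 260
trigger = should_reorder(200, 50, 40, 5, 60)
```

Real systems layer demand forecasting and service-level math on top of this, but the trigger logic is that simple.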
Now the important thing to keep in mind is that almost all existing companies/facilities have already solved their operations problems to SOME extent - if they hadn't, their operations teams would have fallen apart and failed. So in practice it is rare that you are going to implement some system from scratch or create some big model. You're probably going to work with a random assortment of tools and mix data between them on a day-to-day basis to keep track of when you need to notify people of things. With a lot of moving parts, you will have to task yourself with finding improvements and justifying them. Expenses are very likely to be higher than optimal, and you can earn extra value for yourself by finding incremental improvements.
No one is going to say: "work up a statistical model for me". They are just going to be doing something inefficiently with no idea how to do it better, and you are going to have to prove to some extent why doing it another way will be better - and also be worth the cost of training people to do it a new way. It will be monumentally difficult to convince anyone to make any sort of major change unless the operations team is redlining, so your best skill will be being resourceful, adapting to the way things are, and improving THAT mess - not creating a new way of doing things.
Databases house the data you need to make your case. SQL was the norm, but a lot of data comes through as JSON now. You might need to work with an API; add Django to the requirements list.
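Here's the flavor of SQL-plus-JSON work with nothing but Python's built-in sqlite3 and json modules - a claims table where each row's payload is a JSON blob (the ICD-10 codes and costs below are made up for illustration):

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")  # throwaway in-memory database
conn.execute("CREATE TABLE claims (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany(
    "INSERT INTO claims VALUES (?, ?)",
    [
        (1, json.dumps({"icd10": "E11.9", "cost": 120.0})),
        (2, json.dumps({"icd10": "I10", "cost": 80.0})),
        (3, json.dumps({"icd10": "E11.9", "cost": 200.0})),
    ],
)

# Pull rows with SQL, then dig into the JSON on the Python side.
total = 0.0
for (payload,) in conn.execute("SELECT payload FROM claims"):
    rec = json.loads(payload)
    if rec["icd10"] == "E11.9":
        total += rec["cost"]
# total is now 320.0
```

A depressing amount of day-to-day OR work is exactly this: pull from one store, reshape, sum, and hand someone a number.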
Simulations let you test multiple scenarios of resource allocation on a complex system with moving parts (for example, resources include the # of hospital rooms as well as employees and their work schedules). Statistical analysis lets you verify whether the output of your simulations is meaningful or not. There are proprietary simulation programs that do the A-Z of this for you if you know how to configure them (Arena), and there's SimPy + numpy + pandas + ...
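If you don't have Arena handy, you can get the idea with the standard library alone. This toy sketch simulates patients arriving at a clinic with a fixed number of rooms - exponential gaps and service times are an assumption for illustration, not gospel - and reports the average wait:

```python
import heapq
import random
import statistics

def avg_wait(n_rooms, n_patients, mean_gap, mean_service, seed=7):
    """Average wait (same time units as the inputs) in a first-come,
    first-served clinic with n_rooms identical rooms."""
    rng = random.Random(seed)
    free_at = [0.0] * n_rooms          # when each room next becomes free
    heapq.heapify(free_at)
    t = 0.0
    waits = []
    for _ in range(n_patients):
        t += rng.expovariate(1.0 / mean_gap)       # next arrival time
        room_free = heapq.heappop(free_at)          # earliest-available room
        start = max(t, room_free)                   # wait if all rooms busy
        waits.append(start - t)
        heapq.heappush(free_at, start + rng.expovariate(1.0 / mean_service))
    return statistics.mean(waits)

# Compare resource-allocation scenarios by rerunning with different n_rooms:
one_room = avg_wait(1, 500, 10.0, 5.0)
ten_rooms = avg_wait(10, 500, 10.0, 5.0)
```

Rerunning across seeds and room counts, then doing stats on the results, is exactly the simulate-then-verify loop described above.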
Markov chains are related to building a model for your system. It's theory stuff, but it helps wire your brain for it. Laplace transforms are "relevant" somewhere in this category.
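A tiny hand-rolled example: track a probability distribution over made-up patient states (ward / ICU / discharged) and push it through a transition matrix day by day. The transition probabilities here are invented, purely to show the mechanics:

```python
def step(dist, P):
    """One Markov transition: new_dist[j] = sum_i dist[i] * P[i][j]."""
    return [sum(dist[i] * P[i][j] for i in range(len(dist)))
            for j in range(len(P[0]))]

# States: 0 = ward, 1 = ICU, 2 = discharged (absorbing). Numbers are made up.
P = [
    [0.70, 0.10, 0.20],
    [0.30, 0.60, 0.10],
    [0.00, 0.00, 1.00],
]

dist = [1.0, 0.0, 0.0]   # everyone starts on the ward
for _ in range(30):       # evolve the population 30 days forward
    dist = step(dist, P)
# after 30 days, almost all of the probability mass sits in "discharged"
```

That's the whole trick: a system model is just "where can each state go, and with what probability" - which is why it rewires how you think about processes.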
(Non)linear programming is the calculator for the field's distribution problems. In practice you create 2 files: a model file and a data file. The model file expects the data file to be in a certain format, and is written in a modeling language for processing that dataset.
For example, if you manufacture doors and windows at 3 locations, and sell them at 10 other locations, and you have a budget of $10000: how much wood and glass do you buy for each manufacturing location, and how many doors and windows do you make and ship to each selling spot from each factory? The answer depends on data - the price each sells at, the cost for each plant to produce, the capacity of each plant, the cost of transporting from factory to selling location, etc. So you make a model file for your problem and then you put all the numbers in a data file. You can change a number, run the model again, and see if the result changed. You can script this to test many different possible changes.
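You would normally hand that to AMPL plus a real solver, but the model-file/data-file split can be faked in a few lines of Python. Here's a brute-force toy with one factory and made-up prices - not a real LP solver, just the change-a-number-and-rerun workflow:

```python
def best_plan(data, step=10):
    """Toy 'model file': maximize profit subject to a budget by brute-force
    search over production quantities. Real work uses AMPL + a solver."""
    best = (0.0, 0, 0)  # (profit, doors, windows)
    max_doors = int(data["budget"] // data["door_cost"])
    max_windows = int(data["budget"] // data["window_cost"])
    for doors in range(0, max_doors + 1, step):
        for windows in range(0, max_windows + 1, step):
            cost = doors * data["door_cost"] + windows * data["window_cost"]
            if cost > data["budget"]:
                continue  # over budget: infeasible plan
            profit = (doors * (data["door_price"] - data["door_cost"])
                      + windows * (data["window_price"] - data["window_cost"]))
            best = max(best, (profit, doors, windows))
    return best

# The 'data file': tweak a number and rerun to see how the plan shifts.
data = {"budget": 10_000, "door_cost": 50, "window_cost": 25,
        "door_price": 80, "window_price": 45}
plan = best_plan(data)
```

With these numbers, windows earn more profit per budget dollar, so the optimum spends the whole budget on windows - and you can see the plan flip by editing one price in `data` and rerunning, which is the scripted what-if loop described above.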
Data structures and algorithms:
There are a lot of different optimization algorithms, all with different use-cases, and there's no real upper limit on learning them, so this area can be a good time sink... Since someone else will have already coded the implementation, your value is in knowing how to use it. Therefore, you don't need to learn how to build the algorithms or what magic they perform beyond whatever helps you understand what each is good at. Outside of research, it's unlikely this stuff will get you anything other than maybe being able to impress at an interview - BUT who knows, maybe you find a use-case for some random alg that is career-defining.
I know I ranted a bit, and I didn't proofread, but I hope there was some helpful info in there.