That's cool. I had to learn some intricacies of PHP for a job once quite a while ago and the docs weren't quite sufficient. Could have used that. Language specs are typically harder to digest than other material, but ultimately you can get all the details you need out of them, and they're typically worth reading.
Languages that are in popular use often take decades to get there. Much of the software in the world has been there a while. The only thing surprising to me on this list is how popular Go is after only a decade.
The Asus Zephyrus G14 is a 14-inch thin-and-light laptop which has the 4900HS, so, not quite a MacBook Air, but still very compact for an 8-core laptop.
I recently presented my master's thesis exactly on a process involving kanbans. I implemented a discrete event simulator that compares different dynamic scheduling agent policies, one of which is a MIP optimization model. All very cool, but the company still uses paper kanbans and the whole process is far from the so-called "industry 4.0". All in all, change has a cost, and it needs to bear enough fruit to justify it.
There is perspective, so it's 3D. Unless you are talking about the fact that your display can show only 2D images then you are technically correct since most displays are not 3D displays.
I think you're right, it's a sort of pseudo-3D. It's a 3D presentation of a 1D perspective of a 2D environment. Cut a horizontal row of pixels from the center of the screen and you have its flatland point of view.
Having developed a discrete event simulator myself for my master's thesis I can guarantee that the usefulness of tools like this one depends heavily on the veracity and quality of the time parameters and the random distributions.
This one is a very cool exercise, but applicability is dubious.
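Not my thesis code, but here's a toy single-server queue in Python to show what I mean: the arrival and service rates are made-up numbers, and the only thing that changes between the two runs is the assumed service-time distribution (same mean), yet the predicted waits differ by roughly a factor of two.

```python
# Toy illustration: the same queue model gives very different answers
# depending on the service-time distribution you assume.
import random

def simulate(service_time_fn, n_jobs=50_000, arrival_rate=0.9, seed=1):
    random.seed(seed)
    clock = 0.0           # time of the current arrival
    server_free_at = 0.0  # time the server finishes its current job
    total_wait = 0.0
    for _ in range(n_jobs):
        clock += random.expovariate(arrival_rate)  # Poisson arrivals
        start = max(clock, server_free_at)
        total_wait += start - clock
        server_free_at = start + service_time_fn()
    return total_wait / n_jobs

mean_service = 1.0
print("exponential service:", simulate(lambda: random.expovariate(1 / mean_service)))
print("constant service:   ", simulate(lambda: mean_service))
```

Both runs use a mean service time of 1.0 at ~90% utilization; the exponential case waits roughly twice as long on average. Now imagine the input distributions are guessed rather than measured.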
Anyway, does anybody have suggestions for job positions that involve developing this kind of stuff (simulators, process optimization, heuristics)? Being a fresh graduate in this time period is pretty bad, but it doesn't hurt to do some research on interesting job positions. I only see web-related and "machine learning" job positions lately.
If you're interested in MIP/LP optimization models and have either AIMMS or GAMS experience and either CPLEX or GUROBI knowledge, the vendors of most power systems simulators probably have some openings. They're large codebases, but actively saving billions annually in the US. The problems of unit commitment and economic dispatch are well understood, but the business rules framework is massive for all the US RTO/ISOs and is always changing.
No problem. Vendors are GE, Siemens, and ABB. I'm sure at least one is hiring. They generally prefer some industry experience, but a Master's or PhD would likely go far, especially with some research projects.
These models are among the most difficult out there as far as size and time requirements go. One of the founders of GUROBI got involved with the industry recently to try to help some researchers speed things up.
If you want a decent example/starter model to analyze, there is a free unit commitment model someone made in Xpress online (Google for it) that shows the fundamental formulation, although production models are much larger.
Unit commitment is the MIP problem that combines linear and integer constraints and tries to determine the least-cost set of units to bring online for each hour of the day. Constraints include things like the minimum amount of time a unit has to remain offline before being started up again, the minimum amount of time it has to run once online, how fast it can move (ramp), how much capacity it has, etc. You try to minimize the cost of starting up each unit, the cost of the unit just being online, and energy costs.

The economic dispatch problem is much simpler and asks: given the set of units that unit commitment handed me, where should I set each one? The commitment problem runs day-ahead at hourly granularity for the next day and is also re-run periodically throughout the day. The dispatch problem generally runs every 5 minutes, 24/7, 365 days a year. There are also other constraints, like not burning down the transmission grid.
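If it helps to see the shape of the formulation, here's a minimal sketch in Python with PuLP. The unit data, costs, and demand curve are invented, and it leaves out ramp rates, min up/down times, reserves, and the transmission constraints mentioned above; it's just the commitment/dispatch core.

```python
# Minimal unit commitment sketch using PuLP (pip install pulp).
# All numbers are made up; real models add ramp rates, min up/down
# times, reserves, and network constraints.
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary

hours = range(24)
units = {
    # name: (min_mw, max_mw, energy_cost $/MWh, no_load_cost $/h, startup_cost $)
    "coal":   (150, 400, 20, 500, 4000),
    "ccgt":   (100, 300, 35, 300, 1500),
    "peaker": (10,  100, 80,  50,  200),
}
demand = [500 + 150 * (8 <= h <= 20) for h in hours]  # crude daily load shape

prob = LpProblem("unit_commitment", LpMinimize)
on    = {(u, h): LpVariable(f"on_{u}_{h}", cat=LpBinary) for u in units for h in hours}
start = {(u, h): LpVariable(f"start_{u}_{h}", cat=LpBinary) for u in units for h in hours}
gen   = {(u, h): LpVariable(f"gen_{u}_{h}", lowBound=0) for u in units for h in hours}

# Objective: energy cost + no-load (online) cost + startup cost
prob += lpSum(
    units[u][2] * gen[u, h] + units[u][3] * on[u, h] + units[u][4] * start[u, h]
    for u in units for h in hours
)

for h in hours:
    # Meet demand each hour
    prob += lpSum(gen[u, h] for u in units) >= demand[h]
    for u in units:
        lo, hi = units[u][0], units[u][1]
        # Output stays inside the unit's operating range when committed
        prob += gen[u, h] >= lo * on[u, h]
        prob += gen[u, h] <= hi * on[u, h]
        # Startup logic: start = 1 when the unit goes from off to on
        # (units assumed off before hour 0)
        prev = on[u, h - 1] if h > 0 else 0
        prob += start[u, h] >= on[u, h] - prev

prob.solve()
for u in units:
    print(u, [int(on[u, h].value()) for h in hours])
```

The real formulations add the inter-temporal constraints (ramping, min up/down time) that couple hours together, which is a big part of why the problems get so large and slow.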
The US military (and others too) use simulation, modeling, and analysis tools to help design all sorts of activities and equipment. Examples at http://jasp-online.org/model-simulations/
* I used ESAMS (Enhanced Surface-to-air Missile Simulation) and CODER (COnceptual DEsign Representation) to help flesh out design details for various US Air Force avionics platforms. ESAMS is a tool for evaluating missile / aircraft scenarios. CODER is a discrete event simulation tool for designing and evaluating aircraft cockpit automation.
The RAND Corporation is a group that uses games and simulation to better understand many subjects (policy, healthcare, education, etc). See https://www.rand.org/search.html?query=games
No idea if they are hiring but a friend of mine works for a company called Sandtable and they mostly do advanced agent-based modelling. Based in London.
I guess. But we already optimize these things. Without any specific domain in mind, I wonder how much better we could get. Like maybe Amazon could find the true minimum-cost routing, but right now it's already getting pretty darn close.
Or maybe that's entirely naive and the potential for finding faster solutions to real-time problems is huge.
It is huge. That's the only reason people are motivated to implement QC: because in principle we could then break crypto, do QM simulations much faster, solve intractable problems, etc.
Day-to-day, we're not really blocked on QC to get important work done.