meta comment: the website you're viewing when you click that link is generated using Jupyter Book, which is like the best thing ever: https://jupyterbook.org/intro.html
Imagine all the beauty of .Rmd (easily generate books by combining markdown explanations with executable cells), but adapted for the Python ecosystem (jupyter notebooks).
One of the coolest things is the "Launcher" option, which gives readers the option to "run" any notebook interactively (using the rocket button in the top right). It's a one-line config: https://github.com/tum-pbs/pbdl-book/blob/main/_config.yml#L... A similar config would enable launching on Binder, a free ephemeral Jupyter provider, see https://mybinder.org/
This "execute anywhere" option is nicely abstracted away as the `thiebe` library, and there is even POC work to run a pyodide kernel (https://github.com/executablebooks/thebe/issues/465) so soon all of this goodness will work offline in your browser!
As an educator, it's hard not to get excited about the future, given the pace at which learning/teaching tooling is developing!
Because the Python/NumPy/TensorFlow/PyTorch ecosystem is the deep learning ecosystem. You write a book on what you know and what the audience wants. And everything they do here has tools that work just fine in Python, no need to switch to Julia.
And as someone who does this sort of work as my job, I can tell you the tool chains we use for what could be considered “industrial-scale” applications do not yet have appropriate Julia-based replacements.
That's because it's keeping things fairly simple. Implementing the more involved/advanced techniques would quickly outpace what's possible in Python, now and for the foreseeable future.
https://www.youtube.com/watch?v=HKJB0Bjo6tQ (Interpretable Deep Learning for Physics - I don't think there's any Julia in the video itself, but Miles Cranmer uses Julia for this work - he created SymbolicRegression.jl)
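(For the curious: SymbolicRegression.jl is also reachable from Python via the PySR wrapper, so you don't have to leave the Python ecosystem to try it. A minimal sketch on toy data; the hidden formula, the `niterations` budget, and the operator lists are just illustrative choices:

```python
import numpy as np
from pysr import PySRRegressor  # Python frontend to SymbolicRegression.jl

# Toy dataset with a hidden closed-form relationship
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 2))
y = X[:, 0] ** 2 + np.cos(X[:, 1])

model = PySRRegressor(
    niterations=40,                    # illustrative search budget
    binary_operators=["+", "-", "*"],
    unary_operators=["cos"],
)
model.fit(X, y)
print(model)  # prints the table of discovered expressions
```

If the search works, one of the discovered expressions should be close to x0^2 + cos(x1).)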
Yes, pretty much. Not just deep learning, pretty much all data-related work. And, much like Rust, it's a good question, worth asking, and asking it repeatedly helps the world arrive at a more refined answer, and improves Julia in the process.
None of the guts of these packages is written in Python anyway. It's mostly C, maybe with some Cython or Fortran. Python is where the API is presented, because that makes it easily accessible to lots of users.
Yeah, the Julia-language equivalent is referred to as "scientific machine learning" but I don't like that either. I think "simulation learning" would be snappier and maybe more appropriate.
I was working on a project that simulates projectile motion with drag and the Magnus force. This has no analytical solution, and I currently solve it with a numeric solver.
Does anyone know if I could approximate a closed-form solution with a simple MLP?
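One common approach is a surrogate model: generate trajectories with your numeric solver and fit an MLP to map (time, launch parameters) -> position. You won't get a true closed form, but you do get a fast, differentiable approximation. A minimal sketch, assuming PyTorch + SciPy, drag only (no Magnus term), and made-up constants:

```python
import numpy as np
import torch
import torch.nn as nn
from scipy.integrate import solve_ivp

g, k = 9.81, 0.05  # gravity; k is a made-up drag coefficient

def rhs(t, s):
    # state s = [x, y, vx, vy]; quadratic drag opposing the velocity
    vx, vy = s[2], s[3]
    speed = np.hypot(vx, vy)
    return [vx, vy, -k * speed * vx, -g - k * speed * vy]

# Training data: numeric trajectories for random launch speeds/angles
rng = np.random.default_rng(0)
X, Y = [], []
for _ in range(200):
    v0 = rng.uniform(10, 50)
    theta = rng.uniform(0.2, 1.2)
    sol = solve_ivp(rhs, (0, 3), [0, 0, v0 * np.cos(theta), v0 * np.sin(theta)],
                    t_eval=np.linspace(0, 3, 50))
    X += [[t, v0, theta] for t in sol.t]
    Y += list(zip(sol.y[0], sol.y[1]))
X = torch.tensor(np.array(X), dtype=torch.float32)
Y = torch.tensor(np.array(Y), dtype=torch.float32)

# Small MLP mapping (t, v0, theta) -> (x, y)
net = nn.Sequential(nn.Linear(3, 64), nn.Tanh(),
                    nn.Linear(64, 64), nn.Tanh(),
                    nn.Linear(64, 2))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for _ in range(2000):
    opt.zero_grad()
    loss = nn.functional.mse_loss(net(X), Y)
    loss.backward()
    opt.step()
```

If you want an actual closed-form expression rather than a network, symbolic regression (e.g. PySR, mentioned upthread) is probably a better fit.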
"Physics-based" Deep Learning seems like a misnomer. From the abstract "Deep Learning Applications for Physics" sounds more apt.
There definitely is value in transferring standard terminology and methods from physics to deep learning. But from the preview it's unclear if that is the focus.
This does not appear to be the usual approach of training the neural network on tons of data from the physics simulation. Instead, they use the actual physics equations to form the loss function, which is a far more robust way of creating such an emulator. So the "physics-based deep learning" title is appropriate.
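To make that concrete, here's a minimal sketch of the idea in PyTorch, with the toy ODE du/dt = -u, u(0) = 1 standing in for real physics (not the book's actual code): the loss is the residual of the governing equation evaluated via autodiff, plus the initial condition, with no simulation data at all.

```python
import torch
import torch.nn as nn

# network approximating the solution u(t)
net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

t = torch.linspace(0, 2, 100).reshape(-1, 1).requires_grad_(True)
t0 = torch.zeros(1, 1)

for _ in range(5000):
    opt.zero_grad()
    u = net(t)
    # residual of the governing equation, via automatic differentiation
    du_dt, = torch.autograd.grad(u, t, torch.ones_like(u), create_graph=True)
    pde_loss = ((du_dt + u) ** 2).mean()     # enforce du/dt = -u
    ic_loss = ((net(t0) - 1.0) ** 2).mean()  # enforce u(0) = 1
    (pde_loss + ic_loss).backward()
    opt.step()
```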
Indeed. I see where everyone on HN is coming from. But if you're a physicist, and you've come across a lot of "deep learning applied to physics but the model has no physics in it" (and there's plenty of that), then the title may make perfect sense.
Yeah, many scientists in my field have been justifiably skeptical of black-box machine/deep learning applications -- it just sounded like the latest meaningless buzzword. I think this approach is potentially a big deal.
[Edit: by "this approach" I mean what the article is calling "differentiable physics" -- but I don't love that moniker. The "physics informed neural network" approach doesn't seem that great to me. It's much slower than doing an actual simulation, the resulting errors are larger, and you can't re-use results -- it's a one-off solution. The fact that you can use it to interpolate isn't that much of a selling point. The only nice thing is that you can throw any system of equations you want at it without having to design a numerical solver.]
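For contrast with PINNs, a minimal sketch of the "differentiable physics" flavor (PyTorch; toy explicit-Euler projectile with made-up numbers, not the article's code): the solver itself is written in autodiff-able ops, so gradients flow through the time-stepping and you can optimize inputs, or network weights embedded in the loop.

```python
import torch

def simulate(v0, theta, steps=100, dt=0.01, g=9.81):
    # explicit Euler time-stepping, written in differentiable torch ops
    x = torch.zeros(())
    y = torch.zeros(())
    vx = v0 * torch.cos(theta)
    vy = v0 * torch.sin(theta)
    for _ in range(steps):
        x = x + vx * dt
        y = y + vy * dt
        vy = vy - g * dt
    return x, y

# gradient descent *through the simulator*: find the launch angle
# that puts the projectile at x = 8 after one second of flight
theta = torch.tensor(0.3, requires_grad=True)
opt = torch.optim.Adam([theta], lr=1e-2)
for _ in range(500):
    opt.zero_grad()
    x, _ = simulate(torch.tensor(10.0), theta)
    loss = (x - 8.0) ** 2
    loss.backward()
    opt.step()
```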
Eh, they may put more emphasis on that technique, but it's only a subset of the scope:
> This document contains a practical and comprehensive introduction of everything related to deep learning in the context of physical simulations. ... Beyond standard supervised learning from data, we’ll look at physical loss constraints, more tightly coupled learning algorithms with differentiable simulations,...
The question is who this targets. If it is for physics, shouldn't it be neural networks using physics, for physics (by physicists)?
Jupyter Book is a really well thought-out project. You define the book's structure in a _toc.yml file: https://github.com/tum-pbs/pbdl-book/blob/main/_toc.yml and all the config is in one file: https://github.com/tum-pbs/pbdl-book/blob/main/_config.yml (the build system leverages Sphinx, which is the docs workhorse in the Python world)