Beyond message passing: A physics-inspired paradigm for graph neural networks (thegradient.pub)
94 points by andreyk on May 9, 2022 | 16 comments



Can anyone recommend any arXiv/paper links on the subject for someone with reasonable prerequisites (e.g. neural ODEs, physics-informed neural networks, and message passing)? The number of references in the article is a bit of an overload! Looks like a fascinating field.


From a quick glance, the blog post seems to be based on the following paper involving the same author: https://arxiv.org/abs/2106.10934

The point of the post is how graph NNs are limited in capability, so I'm not sure graph NN surveys are the best references (sibling comment) for the main punchline, though they may well contain many important details.


A survey paper is usually a good way to go, such as "A Comprehensive Survey on Graph Neural Networks" (https://arxiv.org/abs/1901.00596) or "Graph Neural Networks: A Review of Methods and Applications" (https://arxiv.org/abs/1812.08434).


Can someone explain the downvotes here? If you think these are bad papers, maybe recommend some better ones? Or what is wrong with this post?


The reply is not addressing GP's concern. A survey of the literature of GNN fundamentals will contain a lot of repetition for someone who is already comfortable with message passing and neural networks (particularly neural ODEs) generally.


This is good stuff. One really cool use of sheaf theory is predicting race conditions between two states in switching electrical circuits by showing that the H_1 cohomology ("line-like", if H_0 is "point-like") is isomorphic to Z_2: https://www.drmichaelrobinson.net/sheaftutorial/20150826_tut... (slide 29)


Are implementations of belief propagation considered message passing GNNs?


Pretty sure that's the case, yeah
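
Sum-product BP has the same aggregate-and-update structure as a message-passing GNN layer: each node collects messages from its neighbors, combines them with its own potential, and sends updated messages back out. A minimal numpy sketch (the graph, potentials, and iteration count are made up for illustration):

    import numpy as np

    # Sum-product belief propagation on a small pairwise MRF, written in
    # the aggregate-and-update style of a message-passing GNN layer.
    # The graph, potentials, and iteration count are illustrative only.

    edges = [(0, 1), (1, 2), (1, 3)]                     # a small tree
    unary = {i: np.array([0.6, 0.4]) for i in range(4)}  # node potentials
    pair = np.array([[0.9, 0.1],
                     [0.1, 0.9]])                        # edge potential psi(x_i, x_j)

    # msgs[(i, j)] = message from node i to node j, initialized uniform
    msgs = {(i, j): np.ones(2) for a, b in edges for (i, j) in [(a, b), (b, a)]}

    for _ in range(10):                                  # fixed-point iteration
        new = {}
        for (i, j) in msgs:
            # aggregate: product of messages into i, excluding the one from j
            incoming = np.ones(2)
            for (k, l) in msgs:
                if l == i and k != j:
                    incoming *= msgs[(k, l)]
            # update: weight by unary potential, push through edge potential
            m = pair.T @ (unary[i] * incoming)
            new[(i, j)] = m / m.sum()                    # normalize for stability
        msgs = new

    # node belief = unary potential times product of all incoming messages
    for i in range(4):
        b = unary[i].copy()
        for (k, l) in msgs:
            if l == i:
                b *= msgs[(k, l)]
        print(i, b / b.sum())

The main difference is that a GNN learns its update functions from data, while BP's updates are fixed by the probabilistic model.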


So when will we have message-passing processor architectures again?


They do actually exist. Caltech has (had?) a group doing asynchronous microprocessors, which were most naturally expressed as channels between computation units. (They used a subset of CSP they called CHP -- communicating hardware processes -- for a lot of the high-level designs.)

As usual with nice things, the market didn't actually want it. Though Intel did buy a startup they spun out and used it mostly for switch fabric purposes.


Anyone running graph neural networks in production? What framework do you use?


Check out DGL (https://github.com/dmlc/dgl). A lot of papers and algorithms are implemented in its examples section.
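
For a sense of the API, here's a minimal two-layer GCN in DGL (the toy graph and dimensions are made up, not from any real workload):

    import torch
    import torch.nn as nn
    import dgl
    from dgl.nn import GraphConv

    # Minimal two-layer GCN; the graph and sizes below are illustrative.
    class GCN(nn.Module):
        def __init__(self, in_feats, hidden, num_classes):
            super().__init__()
            self.conv1 = GraphConv(in_feats, hidden)
            self.conv2 = GraphConv(hidden, num_classes)

        def forward(self, g, x):
            h = torch.relu(self.conv1(g, x))
            return self.conv2(g, h)

    # a 4-node toy graph given as (src, dst) edge lists
    g = dgl.graph(([0, 1, 2], [1, 2, 3]), num_nodes=4)
    g = dgl.add_self_loop(g)      # GraphConv raises on 0-in-degree nodes
    x = torch.randn(4, 8)         # 8-dim node features
    model = GCN(in_feats=8, hidden=16, num_classes=2)
    logits = model(g, x)          # shape: (4, 2)

Most of the examples in the repo follow this same graph-plus-features forward pattern, just with bigger models and real datasets.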


Isn't this used for predicting protein folding/interactions?


That's Google's experimental work, though, and Google & Facebook using something is kind of moot.

Is anyone non-FAANG using it?


Absolutely. Protein folding has been worked on by a large part of the scientific community for a long time, and they're not going to look at a breakthrough like that and just ignore it. Pharma companies will be putting significant resources towards this research as well.

Here is an open source implementation of AlphaFold from an academic at Columbia.

https://github.com/aqlaboratory/openfold


My apologies for my phrasing - I was still referring to GNNs in general production use. I do see a lot of cool research coming out, but I still don't see much in production.

Even Alipay uses plain-vanilla word2vec + xgboost for its graph-based fraud detection.
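
For reference, that baseline is essentially DeepWalk: random walks over the graph become "sentences", word2vec embeds the nodes, and the embeddings feed a boosted-tree classifier. A rough sketch with synthetic data (I have no knowledge of Alipay's actual pipeline):

    import random
    import numpy as np
    from gensim.models import Word2Vec
    from xgboost import XGBClassifier

    # DeepWalk-style baseline: random walks -> word2vec node embeddings ->
    # xgboost classifier. The graph and labels are synthetic and purely
    # illustrative; this is not Alipay's actual system.

    adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}  # toy adjacency list
    labels = np.array([0, 0, 1, 1])                     # toy fraud labels

    def random_walks(adj, num_walks=20, walk_len=10):
        walks = []
        for _ in range(num_walks):
            for node in adj:
                walk = [node]
                for _ in range(walk_len - 1):
                    walk.append(random.choice(adj[walk[-1]]))
                walks.append([str(n) for n in walk])    # word2vec expects tokens
        return walks

    # embed nodes by treating the walks as sentences
    w2v = Word2Vec(random_walks(adj), vector_size=16, window=5,
                   min_count=1, sg=1, epochs=5)
    X = np.stack([w2v.wv[str(n)] for n in sorted(adj)])

    # train a boosted-tree classifier on the node embeddings
    clf = XGBClassifier(n_estimators=50, max_depth=3)
    clf.fit(X, labels)
    print(clf.predict_proba(X))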

Is anyone running anything GNN-related in production?

This is a critical question, because if the tooling isn't available, all this cool research won't actually go live. And that's what I'm seeing today.

Not a ding on the researchers... but something to think about.



