
I am familiar with NEAT; it was very exciting when it came out. But NEAT does not use backpropagation or train any single network at all: the genetic algorithm combines whole networks, weights and topology alike, in an ingenious way.
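
The trick that makes crossover between different topologies work is the historical innovation number attached to each connection gene. A rough Python sketch of that idea (my own simplified names, not the reference NEAT code):

    import random
    from dataclasses import dataclass

    @dataclass
    class ConnGene:
        in_node: int
        out_node: int
        weight: float
        enabled: bool
        innovation: int  # historical marker, assigned globally when the gene first appears

    def crossover(fitter, other):
        # Align genes by innovation number. Matching genes are inherited
        # randomly from either parent; disjoint/excess genes come from
        # the fitter parent, per the NEAT paper's convention.
        other_genes = {g.innovation: g for g in other}
        child = []
        for gene in fitter:
            match = other_genes.get(gene.innovation)
            child.append(random.choice([gene, match]) if match else gene)
        return child

    # Toy genomes: innovations 1 and 2 match; 3 and 4 are disjoint.
    a = [ConnGene(0, 2, 0.5, True, 1), ConnGene(1, 2, -0.3, True, 2),
         ConnGene(0, 3, 0.9, True, 4)]
    b = [ConnGene(0, 2, 0.1, True, 1), ConnGene(1, 2, 0.7, True, 2),
         ConnGene(2, 3, 0.2, True, 3)]
    print([g.innovation for g in crossover(a, b)])  # -> [1, 2, 4]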

Several years prior, in undergrad, I talked to a professor about evolving network architectures with a GA. He scoffed that squishing two "mediocre" techniques together wouldn't make a better algorithm. I still think he was wrong; I should have sent him that paper.

IIRC NEAT wasn't SOTA when it came out, but it is still a fascinating and effective way to evolve NN architectures using genetic algorithms.
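
The architecture evolution itself comes from structural mutations. Here is a hedged sketch of the "add node" mutation, reusing ConnGene and random from the snippet above (again simplified; the innovation counter here is a toy stand-in for NEAT's per-generation bookkeeping):

    from itertools import count

    _innovation = count(100)  # toy global innovation counter

    def mutate_add_node(genome, new_node_id):
        # Split a random enabled connection: disable it, then route the
        # signal through a new node. Per the NEAT paper, the incoming
        # link gets weight 1.0 and the outgoing link keeps the old
        # weight, so the network's behavior is initially unchanged.
        conn = random.choice([g for g in genome if g.enabled])
        conn.enabled = False
        genome.append(ConnGene(conn.in_node, new_node_id, 1.0, True,
                               next(_innovation)))
        genome.append(ConnGene(new_node_id, conn.out_node, conn.weight, True,
                               next(_innovation)))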

If OP (or anyone in ML) hasn't studied it, they should.

https://en.m.wikipedia.org/wiki/Neuroevolution_of_augmenting... (and check the bibliography for the papers)

Edit: looking at the follow-up work on NEAT, it seems they focused on control systems, which makes sense: the evolved network structures are relatively simple.



