
Ensemble methods have basically nothing to do with neural networks. The output of a NN is not some kind of "average" or "best pick" taken from the outputs of individual neurons. Rather, there are multiple layers each of which performs a kind of generalized multivariate regression on the outputs of the previous layer, and the parameterization for the whole hierarchy of layers is fitted as a whole. Very different.
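
To make the contrast concrete, here is a minimal numpy sketch (shapes, sizes, and names are all illustrative, not from any particular source): an ensemble averages the outputs of independently fitted models, whereas a network composes layers whose parameters are fitted jointly against a single loss.

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.normal(size=(8, 4))          # batch of 8 inputs, 4 features

    # Ensemble: average the predictions of independently fitted models.
    models = [rng.normal(size=(4, 1)) for _ in range(5)]   # 5 toy linear models
    ensemble_out = np.mean([x @ w for w in models], axis=0)

    # Network: compose layers. W1 and W2 would be fitted *together*
    # by backpropagating one loss through the whole stack.
    W1 = rng.normal(size=(4, 16))
    W2 = rng.normal(size=(16, 1))
    hidden = np.maximum(0, x @ W1)       # layer 1: affine map + ReLU
    network_out = hidden @ W2            # layer 2: regression on layer 1's output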



NNs with dropout are, trivially, a form of ensembling. And I think it's not hard to show that NNs by default meet a similar criterion -- namely, if we have something like batch normalization between the layers, so that the layer outputs are something PMF-like, then each layer is taking an expectation.
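
A minimal numpy sketch of that reading of dropout (the keep probability and sizes are made up for illustration): each training step samples a random subnetwork, and scaling the activations at test time matches the expectation over all those subnetworks -- exactly for a linear layer, approximately once nonlinearities are involved.

    import numpy as np

    rng = np.random.default_rng(0)
    p_keep = 0.8                          # probability a unit survives dropout
    x = rng.normal(size=(8, 16))          # batch of hidden activations
    W = rng.normal(size=(16, 1))

    # Training: each step runs a randomly sampled subnetwork.
    mask = rng.random(x.shape) < p_keep
    train_out = (x * mask) @ W

    # Test time: scaling by p_keep equals the expectation over masks,
    # i.e. an average over the exponentially many subnetworks.
    test_out = (x * p_keep) @ W

    # Check the ensemble view empirically via Monte Carlo over masks.
    mc = np.mean(
        [(x * (rng.random(x.shape) < p_keep)) @ W for _ in range(10000)],
        axis=0)
    print(np.allclose(mc, test_out, atol=0.1))   # True: matches the scaled pass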

Either way, the technique has absolutely nothing to do with the biological cells we call neurones -- about as much as decision trees have to do with forests.

It is metaphorical mumbo-jumbo taken up by a credulous press and repeated in research grant proposals by the present generation of young jobbing PhDs soon to be out of a job.

The whole structure is, as it has ever been, on the verge of a winter brought about by this shysterism. Self-driving cars, due in 2016, are likewise "just around the corner".


I sympathize about the overhyping. I certainly don't know if it's a good idea or not, but if you work for Google, driverless cars are already on the road. https://www.theverge.com/2022/3/30/23002082/waymo-driverless...


I don't know if you can use the word "already" for something that's been nearly here for so long.



