
Oh, I should have mentioned this somewhere. Credit for putting the notebook together goes to Elliot Saba (https://github.com/staticfloat) :).



This is brilliant work. I'm not strong in NNs yet, but I am good at spotting prerequisites and blockers. This demonstrates:

* Working in a rapid application development (RAD) fashion by operating on vectors in a language like Julia/MATLAB/Octave/Scilab, which lets you focus on abstractions instead of implementation details and other distractions (see the sketch after this list).

* Running code that is automagically optimized for GPUs/TPUs/etc.

* Sharing work over the web in a standard fashion (a Jupyter notebook on colab.research.google.com).
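
To make the first two bullets concrete, here is a minimal sketch of the kind of vectorized, device-agnostic code that style enables. It is not taken from the notebook itself; the layer shape, the logistic activation, and the CUDA.jl usage are all my own assumptions:

    # A dense layer as pure vectorized Julia: no explicit loops, no device details.
    sigma(z) = 1 / (1 + exp(-z))           # logistic activation
    layer(W, b, x) = sigma.(W * x .+ b)    # the dots broadcast elementwise

    W = randn(16, 8); b = randn(16); x = randn(8)
    y = layer(W, b, x)                     # runs on the CPU

    # Hypothetical GPU variant, assuming the CUDA.jl package is installed:
    #   using CUDA
    #   yg = layer(cu(W), cu(b), cu(x))    # identical code, now on the GPU

The point of the first bullet is exactly this: layer is written once against abstract array operations, and the type of the arrays you pass in decides where the work actually happens.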

It's not clear to me where in this process the code actually runs on the TPU (maybe someone has a tutorial?), but that doesn't really matter. The specific machine learning algorithm used isn't all that important either.

The important part is that this enables amateurs to tinker with machine learning, see results quickly, and share their work, which means we'll finally see the evolution of machine learning accelerate.

Each of these blockers alone hindered the evolution of AI for decades, so seeing all three knocked down in one fell swoop is pretty astonishing, at least to me. I favorited this as a watershed moment in the history of AI! Congrats to him.



