Facebook launches PyTorch 1.0 (fb.com)
245 points by jimarcey on Oct 2, 2018 | hide | past | favorite | 71 comments



It's amazing to watch PyTorch and TF slowly converging and offering more or less the same capabilities. TF started in graph mode and is now adding eager mode; PyTorch started with eager mode (prototyping) and is now moving to graph mode (production). TF evolved from Google's production environment, so it's no surprise that it was geared toward production-first deployments. PyTorch evolved from researchers, which explains why it was tailor-made for tinkering and prototyping. Good to see the two converging. I'm guessing a year from now there won't be any marked difference between the two.


Does PyTorch come with tensor comprehensions built in, or do you have to install them yourself?


They're not built in yet; you still have to compile them from source, and I'm not sure if they support PyTorch 1.0 yet. More info: https://pytorch.org/blog/tensor-comprehensions/


As part of the partnerships announced at today's livestream event, Udacity will launch a new Intro to Deep Learning with PyTorch course, taught by Soumith Chintala, the creator of PyTorch and still a lead contributor on the project. They intend to offer scholarships toward the Deep Learning specialization/nanodegree to 300 students who successfully complete the free course. The official announcement, which links to a sign-up page for the course starting in early November, is here: https://blog.udacity.com/2018/10/introducing-the-pytorch-sch...


Looks like the Intro to DL in PyTorch course isn't released yet; there's just a "notify me" link. I searched around but couldn't find a release date. Do you know when it might be?


Awesome! Thanks for giving us the heads-up!


Awesome announcement. It seems they updated the website as well. Sadly the contrast is so low that I can barely read anything :( https://pytorch.org/features


Should be better now. Thanks for reporting.


Thanks for chiming in!

Note that these low-contrast, tiny font designs are almost universally hated. What were the design decisions that led to that? I'm seriously interested. Here is a UX.SE thread on it, I'm the OP:

https://ux.stackexchange.com/questions/67891/what-is-the-rea...


> What were the design decisions that led to that?

I'm guessing "It looks really great on designer's retina display"


@pesenti, the font still looks grey and very thin on my computer, quite hard to read, especially with the grey background for the code. I did a full refresh.


We just did a quick fix but we will rethink the color scheme to make it better.


+1. The code is easy to read, but the main text is hurting my eyes.


I just updated the text on the features page to straight #000000. I hope that ends up being more readable for you.


Wow that is so much better! Thanks a lot.

Btw, the color on other pages is still #6c6c6d. Given how readable features page is, it'd be nice to see #000 on others as well :)


I doubt it's the color so much as it's "font-weight: 300;"... that's pretty thin. Removing that makes it a lot more pleasing to look at, in my opinion.


Thanks @Alupis. Got a tip for that internally as well. I just pushed the fix up with 400 instead of 300.


Awesome! It seems such thin and/or light fonts are okay to read on retina displays (which is where I assume they were tested) but not on other displays. I happened to be reading on a non-retina display.


I don't think the colors are the problem; the font is. It's like the opposite of bold, it's so thin...


A little concerning that AMD is not one of the new partners; we really need good competition on the deep learning GPU front to push Nvidia.


I made a comment on HN some time ago now that I didn't see much PyTorch being used in the wild or in research, and so was pessimistic about its wide adoption.

Happily, I was thoroughly mistaken, and PyTorch has gone from strength to strength. It's a real joy to use, and I'm excited to see its further development.


Pytorch - a beautiful, elegant, well thought out framework.

Tensorflow - a mess.

Anyone who has used both knows just how much of a struggle TensorFlow is at every step of the way as you fight it to perform even the simplest of tasks. I just hope more people migrate to PyTorch.



Will this release change the landscape of deep learning frameworks and lessen the Tensorflow lead? In other words, for someone that has not touched DL before, is it worth starting with PyTorch instead of TF?


I don’t know. I use both. I contribute to both.

If you’re interested in supervised neural networks, Keras, in my opinion, is the best option. You’d use CNTK, or more likely, TensorFlow for your Keras backend.

If you’re interested in unsupervised or semi-supervised neural networks, TensorFlow and PyTorch both work. However, they both have noteworthy issues.

Until eager execution was added to TensorFlow, TensorFlow models needed to be compiled. It was difficult to spice your network with dynamic behavior that couldn't be easily constrained to a TensorFlow graph, especially if you weren't strong at programming with common parallel primitives. Eager execution has made this easier, but it's still more cumbersome than PyTorch's execution methodology.
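To illustrate what "dynamic behavior" means here, a minimal eager-mode sketch (the function and the values are made up for illustration): the number of loop iterations depends on the data itself, which is trivial in PyTorch's eager execution but awkward to express as a single static graph.

```python
import torch

def halve_until_unit(x):
    """Halve a tensor until its norm drops to 1.0 or below.

    The loop condition reads a value computed from the data, so the
    amount of computation is only known at run time.
    """
    steps = 0
    while x.norm() > 1.0:  # data-dependent loop condition
        x = x * 0.5
        steps += 1
    return x, steps

x, steps = halve_until_unit(torch.tensor([4.0, 3.0]))  # norm 5.0, halved 3 times
```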

PyTorch sacrifices many of the benefits of compilation for usability, and this is most obvious when deploying to, for example, the cloud or mobile devices. PyTorch also, inexplicably, breaks many of the conventions present in the scientific Python ecosystem, making it pretty cumbersome to integrate into existing workflows that rely on a package like scikit-image.

It’s complicated. In fact, I believe it’s way too complicated. Thankfully Keras solves most problems for most people in this area.


That breaking of compatibility is why I often recommend Chainer [1] over Pytorch, as one of the back ends to Chainer is literally numpy. Plus, Chainer is very fast and easy to work with.

[1] https://chainer.org


Yes! It was a mistake not to mention Chainer since it’s an exceptional package. Honestly, it presently fulfills the PyTorch mission better than PyTorch!


What are the scientific Python ecosystem conventions that PyTorch breaks?


The most obvious is API compatibility (e.g. function names and signatures) with NumPy, SciPy, scikit-image, etc.
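A tiny concrete illustration of the naming mismatch: NumPy spells concatenation `np.concatenate(..., axis=...)`, while PyTorch spells it `torch.cat(..., dim=...)`, so code written against one API doesn't transfer to the other unchanged.

```python
import numpy as np
import torch

a = np.ones((2, 3), dtype=np.float32)
b = np.zeros((2, 3), dtype=np.float32)

# NumPy: `concatenate` with an `axis` keyword
np_cat = np.concatenate([a, b], axis=1)

# PyTorch: `cat` with a `dim` keyword -- different names for both
# the function and the axis argument
t_cat = torch.cat([torch.from_numpy(a), torch.from_numpy(b)], dim=1)

assert np_cat.shape == tuple(t_cat.shape) == (2, 6)
```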


I wrote a whole article about that: "Keras or PyTorch as your first deep learning framework" (https://deepsense.ai/keras-or-pytorch/). And it got somewhat popular here: https://news.ycombinator.com/item?id=17415321 (if you are curious of others' opinions).

In short: don't learn (raw) TensorFlow as your first framework. Both TF-via-Keras and PyTorch are viable options, with their pros and cons.


Learning DL with PyTorch is way easier than with TensorFlow because of PyTorch's natural place in the Python workflow. Just write a simple one-layer feed-forward NN in both PyTorch and TF, and you will see the difference.
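For the PyTorch side, that exercise is only a few lines; a minimal sketch on made-up random data (layer sizes and hyperparameters are arbitrary):

```python
import torch

# One linear layer plus a sigmoid: about as simple as a network gets.
model = torch.nn.Sequential(torch.nn.Linear(4, 1), torch.nn.Sigmoid())
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = torch.nn.BCELoss()

x = torch.rand(8, 4)                    # 8 samples, 4 features
y = torch.randint(0, 2, (8, 1)).float() # random binary targets

for _ in range(10):
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()  # autograd computes gradients eagerly, no session/graph setup
    opt.step()
```

No session, placeholder, or graph compilation step is needed; it's just ordinary Python.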


Once you know one, learning the other is very easy. I'd start with PyTorch, as it's much easier to reason about and debug.


What's your source for Tensorflow being in the lead? I'd love to have some actual data on this. In my experience, for a long time now, I've barely seen anyone outside Alphabet doing anything with TF. Everywhere I look and everyone I talk to uses PyTorch. So from my perspective, PyTorch is (IMO deservedly) miles ahead in usage, and TF 2.0 will probably do only little to anything to close that gap. Disclaimer: I'm in research, things might be different in production environments.


I don’t know your area of expertise (or familiarity) but TensorFlow is extremely present in both academic and industrial computer vision communities. I don’t have data to support my claim, but, from my experience, TensorFlow is likely the most popular framework. Hell, I’d bet Caffe (1) is still more popular than PyTorch (but I’d expect that to change)!


I'm in deep learning research (e.g., you'd see me at NIPS, ICML or ICLR), so I don't have too much insight into the CV community. Thanks for sharing; interesting to know that they still rely on TF so much. As far as deep learning research goes, PyTorch is the clear winner by now. At least that's my subjective impression, but I also lack data. One recent data point is this one about frameworks used for ICLR 2019 submissions compared to the previous year: https://www.reddit.com/r/MachineLearning/comments/9kys38/r_f... , which would suggest about parity in usage: TensorFlow went from 228 mentions last year to 266 this year, PyTorch from 87 to 252. Considering that a large number of these submissions come from Alphabet (at this NIPS, 10% of the papers were from either Google or DeepMind, and ICLR will be similar), and they naturally use TF, PyTorch actually has a slight lead (less than I expected, but the growth numbers are still quite impressive).


I've had to use both PyTorch and TensorFlow for various purposes, and I felt PyTorch was more Pythonic. In the end, you can use either to implement an architecture, but I liked PyTorch quite a bit.


Congrats to the PyTorch team!

These guys are doing wonderful work, have wonderful taste, and care deeply about their users.

Big fan!


Is there a tensorflow serving equivalent for pytorch?


I have a naive question. What is the tensorflow serving service?


It lets you take a model you have trained and run it as a web service.
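For context on the PyTorch side: 1.0's deployment story centers on exporting a model to TorchScript, which can then be loaded without Python (e.g. from a C++ server). A minimal sketch (the model and filename are placeholders, not a full serving setup):

```python
import torch

# Any traced/scripted module works; a bare linear layer keeps the sketch small.
model = torch.nn.Linear(4, 2)
model.eval()

# Trace records the ops executed on an example input into a TorchScript graph.
traced = torch.jit.trace(model, torch.rand(1, 4))
traced.save("model.pt")  # hypothetical filename

# Later (or in another process / a C++ binary via libtorch), load and run it.
loaded = torch.jit.load("model.pt")
out = loaded(torch.rand(1, 4))
```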


They make it sound as if Facebook owns PyTorch, which is not the case afaik.


For the past few years, the main developers of PyTorch have been working at Facebook with the copyright on the code assigned to Facebook. For years before that, though, it was developed by people working at other companies. It's open source of course so typically this doesn't matter all too much. But you can see the history reflected in the LICENSE file: https://github.com/pytorch/pytorch/blob/master/LICENSE


Tbh, I was trying to avoid projects by FB for ethical reasons, but I guess I have to give up on that now (for a while at least) since one of my projects depends on pytorch.


You may want to avoid quite a few large websites then, lest you end up executing code that Facebook employees wrote (React).


This misses the point. If you want to understand my reasoning, then read this comment: https://news.ycombinator.com/item?id=18125829


It doesn't miss the point at all. If you don't want to support this supposed implicit "R&D" machine, don't support websites that support it. Why wouldn't the support be transitive?


For what it's worth, Facebook doesn't make any money when you use PyTorch, nor does it help any of their business goals to have more people use PyTorch. If I were you I would just use the open source projects you want to and not worry about it.


I appreciate your suggestion, but that's not how I think it works. Using a company's open-source projects directly supports that company's "R&D machine" by acknowledging the output of its researchers. The reason is that many researchers would not work for a company if there were no possibility of publication and acknowledgement. So by avoiding these projects, I implicitly vote for those researchers to work for companies with better ethical standards.


They are the creator of PyTorch and have a team dedicated to driving its development. They implicitly own it.


Yep. As much as Google owns Tensorflow.


I didn't know that. Strange that this isn't mentioned on PyTorch's website home page.


https://en.wikipedia.org/wiki/PyTorch

It's based on Torch, initially developed by the now VP of AI @ Nvidia.

For some reason they don't tend to publicize much directly. I have observed that with other initiatives they have taken up.

E.g. the @Scale conference: no mention of FB on its about page.


Damage control?


Watching the livestream now, looks pretty impressive


PyTorch 1.0 is just fantastic. A shameless plug: we (fast.ai) are also releasing fastai 1.0 for PyTorch today. fastai makes a lot of stuff in PyTorch easier to use, whilst allowing full customization with PyTorch code.

http://www.fast.ai/2018/10/02/fastai-ai/

edit: just noticed someone has posted the above link to HN and it's on the front page, so I guess follow-ups should go there instead https://news.ycombinator.com/item?id=18123587


Since TensorFlow released eager execution a while ago, have you ever considered switching back to TF for your course (or offering a version of it)?


I think PyTorch is still ahead of Tensorflow for interactive coding. But the next version of Tensorflow (2.0) looks like it could be a lot better, so there's a chance we'll add support for that later. No promises though!...


Fair enough.


Well said ..

> I think PyTorch is still ahead of Tensorflow for interactive coding.

> But the next version of Tensorflow (2.0) looks like it could be a lot better


Fast.ai marketing in every single ml article


Can you please stop posting this? If you'd please stop posting unsubstantive and/or uncivil comments generally, we'd appreciate it.

https://news.ycombinator.com/newsguidelines.html


From https://news.ycombinator.com/newsguidelines.html : "Please don't impute astroturfing or shillage. That degrades discussion and is usually mistaken. If you're worried about it, email us and we'll look at the data."

The OP is about the exact platform that the software I linked to uses. The OP even links to the exact announcement that I linked to. It is about the first launch of software that's been under development for 2 years.

This seems like an entirely appropriate and relevant HN comment.


[flagged]



This isn't reddit. If you're not going to contribute intelligently to a topic, don't comment.


[flagged]


In this case, yes. They develop and maintain some of the best deep learning tools out there, completely free and open source. They have also published a ton of high-quality research in the field of AI.

Facebook may have its moral and ethical failings, but that's a stupid reason to blindly attack everything they do, especially when, as in this instance, it is objectively a positive influence. They're promoting open-source tools and open-access research.


Thanks for the constructive response to someone who cannot look beyond facebook.com.

I think this OP's expletive is more appropriate to some applications of FB's AI research and tools in their products.

If they weren't investing heavily in AI research & tools, they probably would've leveraged Google's TF, Microsoft's (since they like them) CNTK, etc.

So, on net, they have a positive impact with PyTorch, which is the subject here, not facebook.com.


Absolutely! PyTorch and React are both awesome.


Look, okay, "Fuck Facebook" might be a bit antagonistic.

However, in my very personal opinion, no amount of technical prowess can be considered sufficient compensation for the mental health issues brought about in young people by all social media platforms, especially Facebook.

Add to that the malicious use of data, add to that Facebook's failure to keep data secure, and I struggle to see why we would back their cause, technical or otherwise.


No need to incite.


[flagged]


Please stop. We're hasty to ban accounts that go off like this.


At the least, this is funny.



