Reading through the top-level comments, I see that they are all a form of surface-level aversion to the unfamiliar. It really highlights that many industry trends are based on hot takes and not on any sort of deep analysis. This explains why JavaScript and C++ will remain entrenched favorites despite their technical flaws.
For those who have actually spent time with Swift and realize its value and potential: consider yourselves lucky that large portions of the industry have an ill-informed aversion to it. That creates opportunity that can be taken advantage of in the next 5 years. Developers who invest in Swift early can become market leaders, or run circles around teams struggling with the slowness of Python and the over-complexity of C++.
Top three comments paraphrased:
> 1) "huh? Foundation?"
but you have no qualms with `if __name__ == "__main__"` ?
> 2) "The name is pretentious"
is that an example of the well-substantiated and deep technical analysis HN is famous for?
> 3) Swift is "a bit verbose and heavy handed" so Google FAILED by not making yet another language.
You're completely misinterpreting this (I'm the author of (1)).
C++ and JavaScript are languages of professional software engineers, a field with many, many more languages, each with various pros and cons.
Python has been the de facto standard in scientific/data/academic programming for decades. The only other language you could say rivals it would be MATLAB, which is even more simplistic.
My point is that simplicity and clarity matter to people who don't care that much about programming and are completely unfocused on it; they are just using it to get data for unrelated research.
`if __name__ == "__main__"` is not in the example code, nor is it a required part of a Python program, so I'm not really sure what your point is here.
> Python has been the de facto standard in scientific/data/academic programming for decades
In my experience (genomics) this is simply not true. Python has caught on over the last 5 or so years, but prior to that Perl was the de facto language for genetic analysis, and it's still quite heavily used. Perl is not a paragon of simplicity and clarity.
I feel like trying out various languages/frameworks would affect compsci labs a lot less than other fields, since the students probably have some foundational knowledge of languages and have already learned a few before getting there. Might be easier for them to pick up new ones.
(a) While I was honest that my observations are based on the fields I have experience in, there is no such justification in your comment for the claim that "it is true broadly for computation in academia."
(b) Interpreting "niche" as "small" (especially given your "true broadly" claim): Computational genetics is huge in terms of funding dollars and number of researchers.
I have. My impression from doing an applied math degree more than 10 years ago was that Python was by far more prevalent than R. I know through my wife that Python is much more prevalent in bioinformatics and bioengineering too.
That doesn't really change my argument, though: R is also a slow but simple language that is popular among academics but not professional software engineers. My whole point is that Swift is never going to be popular with academics because the syntax isn't simple enough.
That hasn't been my experience. I was programming mostly in Python in 2007 in applied math, oceanography, and physics classes. It had already been established in those fields (at least at my university) by 2007, so it's been at least 15 years.
>4)"Google hired Chris Lattner and he forced Swift down their throat."
Does anyone force anything on Google? This seems to express little confidence in the competence of Google and their people. Perhaps Google chose Swift and brought Lattner in for his obvious expertise.
The biggest drawbacks of Swift are its legacy ties to the Apple ecosystem.
Yet Swift is open source, and Apple and the community can fork it if they so choose. This is great news for me personally as an iOS developer and an ML noob who doesn't want to write Python. I can't comment on Julia because I have no experience with it, but I applaud the efforts to build the Swift ecosystem to challenge Python.
I think a lot of the criticisms so far are that it's early days for Swift in ML, and that's one point the author is emphasizing.
I have spent 2 years of my life with Swift, and I would say that I have a very well-informed aversion to the language.
Some ideas in the language and parts of its core concepts are really good: first-class optionals and sum types, keyword arguments, and so on. I liked all of those.
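For anyone who hasn't used Swift, here is a minimal sketch of the features I mean (the types and names are purely illustrative):

```swift
// A sum type: an enum whose cases carry associated values.
enum Shape {
    case circle(radius: Double)
    case rectangle(width: Double, height: Double)
}

// First-class optionals: absence is part of the type, not a null pointer.
func area(of shape: Shape?) -> Double? {
    guard let shape = shape else { return nil }  // explicit unwrap or bail out
    switch shape {
    case .circle(let radius):
        return Double.pi * radius * radius
    case .rectangle(let width, let height):
        return width * height
    }
}

// Keyword arguments make the call site self-documenting.
let shape = Shape.rectangle(width: 3, height: 4)
let a = area(of: shape)  // Optional(12.0)
```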
Unfortunately, by and large, Swift is lipstick on a pig.
I have never used any other language that regularly gave me type errors that were WRONG. Nowhere else have I seen the error message "expression is too complex to be type-checked", especially when all you're doing is concatenating a bunch of strings. No other mainstream language has such shoddy Linux support (there is an official binary that works on Ubuntu, but not even a .deb package; parts of Foundation are unimplemented on Linux, and others behave differently than on macOS; the installation breaks system paths, making it effectively impossible to install Python afterwards [1]). Not to mention, Swift claims to be memory-safe, but that all flies out of the window once you're in a multithreaded environment (for example, lazy variables are not thread-safe).
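To make the first and last complaints concrete, here is a minimal sketch of both failure modes (exact compiler behavior varies by Swift version, so treat this as illustrative):

```swift
// 1) `+` on strings is heavily overloaded, so type inference has to search
//    a combinatorial space of overloads. On several compiler versions, a
//    plain concatenation like this could die with the "too complex to be
//    type-checked" error quoted above.
let n = 42
let message = "a" + "b" + "c" + String(n) + "d" + "e" + "f" + "g"

// 2) Memory safety stops at thread boundaries: a `lazy` property is
//    initialized on first access with no synchronization, so two threads
//    hitting it for the first time concurrently is a data race.
final class Cache {
    lazy var table: [String: Int] = ["answer": 42]
}
```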
In addition, I regularly visited the Swift forums. The community is totally deluded: instead of facing the real problems of the language (and the tooling), it continues to bikeshed minor syntactic "improvements" (if they even are improvements) just so the code reads "more beautifully", whatever that is supposed to mean.
But the worst thing is how the community, including your post, thinks Swift (and Apple) is this godsend, the language to end all language wars, and that everyone will eventually adopt it en masse. Even if Swift were a good language, that would be ridiculous. There was even a thread on that forum called "crowdfunding world domination". It has since become awfully quiet...
> Developers who invest in Swift early can become market leaders, or run circles around teams struggling with the slowness of Python and the over-complexity of C++.
Meanwhile, other people will do the sensible thing and learn Rust, because it runs circles around Swift: it offers many paradigms and can be used in almost any industry, operating system, and product, not just for developing apps for Apple's ecosystem.
Swift will take over the world when Apple takes over the world, and it is safe to assume that will never happen.
I am not saying at all that it is bad to learn Swift and use Swift, but have correct expectations about it.
I think you misunderstood the criticism of the name "differentiable programming" and of the idea that building a gradient operator into a language is somehow a breakthrough that warrants the label "software 2.0".
This is not really about Swift. Swift seems to have been chosen because its creator was there when they picked the language, even though he has since left.
I think my point stands that the criticism in this thread is mostly a surface-level reaction, hung up on meaningless slogans like "software 2.0" or "breakthrough".
Your use of the word "seems" is very apt here.
Have you considered that Google might have hired Lattner precisely because he is the founder of LLVM and Swift, and they hoped to leverage his organizational skills to jump-start next-generation tooling? We know Google is heavily invested in LLVM and C++, but dissatisfied with the direction C++ is heading [0]. They are also designing custom hardware like TPUs that isn't supported well by any current language. To me it seems like they are thinking a generation or two ahead with their tooling, while outside observers can't imagine anything beyond 80s-era language design.
I'm a deep learning researcher. I have an 8-GPU server, and today I'm experimenting with deformable convolutions. Can you tell me why I should consider switching from PyTorch to Swift? Are there model implementations available in Swift and not available in PyTorch? Are these implementations significantly faster on 8 GPUs? Is it easier to implement complicated models in Swift than in PyTorch (after I spend a couple of months learning Swift)? Are you sure Google will not stop pushing "deep learning in Swift" after a year or two?
If the answer to all these questions is "No", why should I care about this "new generation tooling"?
EDIT: and I'm not really attached to PyTorch either. In the last 8 years I switched from cuda-convnet to Caffe, to Theano, to TensorFlow, to PyTorch, and now I'm curious about JAX. I have also written CUDA kernels, and vectorized multithreaded neural network code in plain C (Cilk Plus and AVX intrinsics), when it made sense to do so.
I've taken Chris Lattner / Jeremy Howard's lessons on Swift for TensorFlow [0][1]. I'll try to paraphrase their answers to your questions:
There aren't major benefits to using Swift for TensorFlow yet, but (most likely) there will be within the next year or two. You'll be able to do low-level research (e.g. deformable convolutions) in a high-level language (Swift), rather than needing to write CUDA or wait for PyTorch to write it for you.
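For flavor, here's a minimal sketch of what that workflow looks like on the Swift for TensorFlow toolchain; the API was still in flux when the lessons were recorded, and the swish-style activation is my stand-in for a genuinely novel op:

```swift
import TensorFlow  // requires the Swift for TensorFlow toolchain, not stock Swift

// A custom op written in plain Swift; the compiler derives its gradient.
@differentiable
func myActivation(_ x: Tensor<Float>) -> Tensor<Float> {
    return x * sigmoid(x)  // swish-like, standing in for a novel op
}

let x = Tensor<Float>([1.0, -2.0, 3.0])
// First-class autodiff: ask the language for the gradient directly,
// with no hand-written CUDA and no framework-supplied backward pass.
let grad = gradient(at: x) { x in myActivation(x).sum() }
print(grad)
```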
You can't. It won't be available for at least a year, I'm guessing.
Even then I'm not sure what granularity MLIR will allow.
On the other hand, you can do it in Julia today. There is a high-level kernel compiler and array abstractions, but you can also write lower-level code in pure Julia. Check out the JuliaGPU GitHub org.
If it's not ready, I don't see much sense in discussing it. Google betting on it does not inspire much confidence either. Google managed to screw up TensorFlow so badly that no one I know uses it anymore. So if this Swift project is going to be tied to TF in any way, that's not a good sign.
As for Julia, I like it, other than the fact that it counts from 1 (that is just wrong!). However, I'm not sure it's got what it takes to become a Python killer. I feel like it needs a big push to become successful in the long run: for example, if NVIDIA and/or AMD were to adopt it as the official language for GPU programming. Something crazy like that.
Personally, I'm interested in GPU-accelerated NumPy with autodiff built in, because I find pure NumPy incredibly sexy. So basically something like ChainerX or JAX. Chainer is dead, so that leaves JAX as the main PyTorch challenger.
I was looking around for a language in which to write my own versions of convolution layers, or LSTMs, or various other ideas I have.
I thought I would have to learn C++ and CUDA, which from what I hear would take a lot of time.
Would this be difficult in Julia if I went through some courses and learned the basics of the language?
This would really give me some incentive to learn the language.
You could just use LoopVectorization on the CPU side. It's been shown to match well-tuned C++ BLAS implementations, for example in the pure-Julia Gaius.jl (https://github.com/MasonProtter/Gaius.jl), so you can follow that as an example of getting BLAS-speed CPU-side kernels. For the GPU side, there are CUDAnative.jl and KernelAbstractions.jl, and benchmarks from NVIDIA show that they at least rival directly written CUDA (https://devblogs.nvidia.com/gpu-computing-julia-programming-...), so you won't be missing anything by learning Julia and sticking to pure Julia for researching new kernel implementations.
In that benchmark, was Julia tested against cuDNN-accelerated neural network CUDA code? If not, is it possible (and beneficial) to call cuDNN functions from Julia?
That wasn't a benchmark against cuDNN, since it was a benchmark about writing such kernels. However, Julia libraries call into optimized kernels whenever they exist, and things like NNlib.jl (the backbone of Flux.jl) and Knet.jl expose operations like `conv` that dispatch on CuArrays to automatically use cuDNN.
I'm not telling you to switch. I don't think the S4TF team is telling you to switch anytime soon. At best you might want to be aware of, and curious about, why Google is investing in a statically typed language with built-in differentiation, as opposed to Python.
Those who are interested in machine learning tooling or library development may see an opportunity to join early, especially when people have such an irrational, unfounded bias against the language, as evidenced by the hot takes in this thread. My personal opinion, which I don't want to force on anyone, is that Swift as a technology is underestimated outside of Apple and Google.
Please read the article. It answers your question pretty straightforwardly as "no, it's not ready yet."
But it also explains why it shows signs of promise.
So you should get involved if you are interested in contributing to and experimenting with a promising new technology, but not if you're just trying to accomplish your current task most efficiently.
Google hopes you will be using their SaaS platform to do ML, not just your own server. This is one of the reasons they are pushing hard to develop these tools.
> To me it seems like they are thinking a generation or two ahead with their tooling, while outside observers can't imagine anything beyond 80s-era language design.
Given the ML and Modula-3 influences in Swift, and the Xerox PARC work on Mesa/Cedar, it looks quite like 80s-era language design to me.
But I'd like to point out that Google has some of the top C++ experts working for them, is heavily involved in the C++ standardization and compiler-writing process, and in 2016 claimed to have 2 billion lines of C++ running their infrastructure...
...and yet they don't suffer from the familiarity bias or the sunk-cost fallacy I hear in your comment.
Instead, Google's C++ developers are sounding an alarm over the future direction of the language and its crippling complexity: