I thought, yeah, that's just feature-gated stuff.
It's just marketing, but my code will break even if I don't use the new edition, because my dependencies will use it.
Turns out that's false. What's amazing is that the compiler ensures that projects using the legacy edition can still use libraries relying on the new edition.
This means that if you're using Rust 2018, you can use dependencies that use Rust 2015 with zero problems. If you're sticking with Rust 2015, you can use libraries that use Rust 2018. It all works together.
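For reference, a crate opts into an edition in its own Cargo.toml (the crate and dependency names below are hypothetical), and crates on different editions link together freely:

```toml
[package]
name = "my-crate"      # hypothetical example crate
version = "0.1.0"
edition = "2018"       # this crate uses Rust 2018...

[dependencies]
old-crate = "1.0"      # ...while this dependency can stay on Rust 2015
```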
> Turns out that's false. What's amazing is that the compiler ensures that projects using the legacy edition can still use libraries relying on the new edition.
On top of this, there's the amazing work that went into making this function even for macros!
Not really, because libraries have headers that have to be textually included into your source. Because they're pasted in as far as the compiler is concerned, you have to pick one version of the language to compile your app and all the included headers with.
Yes, most C++ compilers let you specify your language version.
One of the problems with C++ is that there is less of a dependency-management story around it.
- Maybe static libraries will let you mix language versions?
- Maybe DLLs will (granted, you probably shouldn't have C++ in your ABI anyway)?
- Header-only or vendored libraries won't allow mixing language versions.
Contrast this with Rust where all of the standard approaches for declaring a dependency allow different language editions.
I'm really loving Rust. I've been a 100% C++ dev for work, and Rust is starting to gain internal traction. Go fell over as a replacement for our less resource-intensive applications because the lack of generics killed interoperability between it and C++. Rust wins in that area. My only fear would be Rust continuing to tack on features without ever removing anything and turning into C++ 2.
C++17 is still really nice; the language is just old, and C compat plus lots of historical bloat makes it crusty. A better type system with more FP concepts and more safety is a win-win in my book.
What do you mean by C++ interoperability (re Rust)? As far as I was able to determine a couple of months ago, there isn't really a good way of doing interop except by going through a C shim.
The case I was analysing was Qt C++ UI plus Rust library. There's a Rust text editor (xi?) which is using JSON to communicate between the UI and the core. In that case I believe it was designed like that to allow for different UIs, but it looks more appealing than the C interface.
Can't speak for the parent, but things like `..=`, impl Trait (especially for arguments), the new try/catch syntax, and so on were pretty controversial (and that's among the accepted RFCs only).
These controversies were mainly bikeshedding names and syntax (vs `...`, `any Trait`, `do/fail`), but there was little opposition to the features themselves.
There are some often-requested features (specialization, const generics, box syntax, placement new) that even got implemented, but are still unstable, because they aren't good enough. The bar is high.
I think Rust is getting better over time in making ideas polished. The few warts it has so far (e.g. Error description() and cause(), struct literal parsing in `if`) are from before 1.0, and probably wouldn't get past the RFC process used today.
It's interesting that Rust is starting to use the same year-based version-naming scheme that ECMAScript and Ubuntu did. Personally I think Ubuntu's naming scheme and release schedule is the best: release $N times per year and name each release $YY.$N, where N is the same each year. This has a host of benefits: it's obvious how old any given version is, it's predictable when a new version will be released, it helps prevent development from stalling for any reason (especially unnecessary rewrites and way-too-big features), and more!
What's funny is, it kinda goes both ways: I've recently spent a bunch of time talking to some TC39 members, and they have been talking a lot about a roadmap process that kinda looks like ours, while we've been talking about making our RFC process be a little closer to the "staged" concept that they have. Very interesting times!
Most importantly though, it had a lot of influence because they have similar stability constraints as we do; I mentioned C++ and Java in the post, but maybe should have mentioned ECMAScript as well.
Rust's version numbering is semver-based and will remain so. The year-based scheme is for epochs, which are mostly orthogonal to versions (each new epoch will require the latest version, but each new version will support every existing epoch).
Rust does follow a time-based release schedule, though (a new stable release every six weeks), and it does have many of those advantages.
We don't just add things for the sake of adding them. Most new features are being driven by two things:
1. Making the language friendlier for beginners and easier to understand.
2. Addressing pain points reported by production users.
That being said, I'd push back a little on "number of features" as a measure of complexity. There are a few ways in which this is a problem.
For example, the "waterbed theory of complexity", that is, if you make the language simpler, you push the complexity elsewhere. This can be good or bad, depending. I generally hesitate to compare Rust to other languages, but there was a good illustration of this the other day, about Rust and Go: https://news.ycombinator.com/item?id=17618918
Basically, Go has kept the language incredibly simple. Rust has added many features that Go does not. But that means that error handling in Go is significantly more verbose than in Rust. You can't just wave away the inherent complexity of properly handling errors; it has to go somewhere. Both choices are 100% valid, just different.
The other big issue with simply enumerating features is that cohesion and orthogonality is important. C++ did something truly impressive; they changed the fundamental model in which you write code. Idiomatic C++98 and idiomatic C++17 look and feel very different. The cost of this is that many features don't quite fit together as well as you would like. Or at least, that's what people say. We try to really make sure that features fit together in a way that makes sense.
>We don't just add things for the sake of adding them.
I hope you realise that the C++ design committee also doesn't add things for the sake of adding them. They aren't morons. Often there is a very real tradeoff in every decision, but generally the motivations seem to also be those two you mention.
I honestly disagree. When Bjarne writes a paper [0] saying that C++ is going to crumble under the weight of disparate and incoherent features, then the language and its wider community has a problem.
To quote:
> The foundation begun in C++11 is not yet complete, and C++17 did little to make our foundation more solid, regular, and complete. Instead, it added significant surface complexity and increased the number of features people need to learn. C++ could crumble under the weight of these – mostly not quite fully-baked – proposals.
I was hoping that my comments about C++ specifically would make that clear, but yes, I also very much agree that the committee doesn't do things just because. I have a very deep respect for their work.
Yes, web view and standard library are being used in the same sentence. How on earth a web view might be considered for inclusion in a standard library is beyond me.
In-context, this makes a bit more sense, though I'm not sure I'd vote for this proposal if I were on the committee. The introduction does a decent job of explaining the motivation; this is an alternate to the long-going discussion about putting 2D graphics in the standard.
Getting outraged at proposals, especially from the outside, doesn't make for a healthy process; not every proposal becomes accepted. Off-the-wall proposals can sometimes help explore a problem space with a new outlook. That doesn't mean that every single proposal is worth taking equally seriously, but Hal is a well-known name in this space, and has done a lot of good work.
(Incidentally, this kind of situation is why we're interested in adding stages to Rust's process; we want clarity around the maturity of a proposal. Some proposals are just for brainstorming. Some are more mature. It can be hard to tell sometimes from the outside which is which.)
> 1. Making the language friendlier for beginners and easier to understand.
Make documentation a priority! With Elixir or Golang you can access docs super easily from the command line. Some Sublime Text plugins show you the docs for highlighted std functions as well. These are what make a language awesome, IMO.
It is literally my job to make documentation a priority. We're on the same team here :) I spent the last 18 months throwing out an entire book I wrote to write a new one that’s oriented around how real users learn the language in real life. We’ve invested a lot.
(Some editors do have inline doc showing support; we don't have terminal doc access but we do have local html doc access)
`cargo doc --open` opens documentation for the current project, including all its dependencies. Rust generates docs in HTML; while it doesn't stay purely in the command line, this enables cross-linking, collapsible sections, and built-in search.
Why do you prioritize making things easier for beginners? I mean, it's a systems language many people are going to choose for coding advanced, performant stuff. Wouldn't it make sense to prioritize programmers who are going to build important stuff in it?
Beginners to Rust, not people who are new to programming, sorry that was ambiguous! We are putting basically zero effort into "learn programming via Rust."
The Rust 2018 edition isn't bells and whistles, but usability improvements.
For example, many users were confused by Rust's module system, because modules behaved differently than in other languages. The 2018 edition added the "missing" features to the module system to meet expectations that new users have.
The end result is that Rust 2018 is easier to learn.
Easier to learn as of now... but look below (or at other Rust threads) and see how people who want to jump into Rust (from C++, for example) always ask about some missing feature X and when it's going to be added...
C++ has fucked up expectations people have from language evolution. Don't project C++'s mistakes on Rust.
For Rust the evolution is users asking "Why is doing X so hard?" and the Rust team saying "OK, we'll see how we can make X easier".
Rust has been adding features for 30 releases now, and every new release has been easier to use and easier to learn.
There is nothing scary in new features if they are built on the same basic principles, carefully elaborated, sound and consistent.
People added features to languages like Scheme, OCaml, and Lisp for decades, and it was fine. There are type systems for Racket, an object system for OCaml, and pattern matching for Lisp, all of which are simple and fit into the design of the language well.
The problem with C++ is that it was based on the C language, which was already an example of terrible design (by modern standards), and the new features were also half-baked or badly designed (SFINAE, accidentally Turing-complete templates, 666 *values, too much implicitness, and unnecessary entities like constructors).
So far rust is very nice, concise, elaborated and explicit language. Hope that the new features would be as elaborated and neat, not just monkey patches.
Template metaprogramming is awful, but that particular type of Turing completeness isn't a problem at all. Mere arithmetic and some ability to loop gives you that kind of Turing completeness. The typical compiler limits template recursion depth to a couple hundred, and the problem is solved. Such a construct isn't a notably slow use of templates, either; it's more trouble to avoid it than to have it. Compare that to macros, which can't loop and need a bunch of extremely repetitive lines for different sizes.
You don't need infinite loops; all you need are simple inductive types and reduction rules, so it doesn't have to be a Turing-complete language. You can have simple type-level functions in Haskell, while its typechecker is not Turing-complete.
Having not started on learning rust yet, I am worried about the complexity increasing as well. However, I think the idea of editions is that you can stick with one if you like, for as long as you like.
If anything, the complexity is getting less, not more, as the team as well as the ecosystem are very much focused on making things simpler and more approachable, and are very aware of any existing, potentially unnecessary complexity.

As an example, the borrow checker is currently undergoing a major (internal) revision and will allow you to write some things in the future (and even now, at least on nightly Rust) that its old implementation was not so sure about and thus complained about. Another example: async/await syntax is coming, along with improved futures, which will make it much easier to get started with async I/O programming.
I got into Rust recently and I have to say I like it. The syntax is a bit noisy to my taste (generics, lifetimes and macros) but overall I really like it.
It seems like it's maybe not a good idea to get too deep into it, though; the language seems to move a lot, and there are still a lot of things that are not there (a rand library, benchmarks) or are subject to change. I'm not sure if I should give it more of my time.
The thing keeping me from Rust is the weaker metaprogramming relative to C++. Do you have plans to expand Rust to things like compile-time expressions, template templates, and perfect forwarding?
We do! We've been pretty quietly developing it, but right now, the Rust compiler contains a full interpreter as well, specifically to do compile-time evaluation. But, we also do not want to have unlimited compile-time facilities, as that has serious soundness problems. "const fn" is the name of the user-facing feature, and MIRI is the underlying interpreter, if you want to do some searching.
That said, that work tends to fall under "compile time expressions"; "template templates" are more likely to be served by some sort of eventual HKT, or maybe even by ATC. I always forget what perfect forwarding is so I probably shouldn't comment on it.
I am willing to bet that this stuff is a major part of next year's roadmap, but we'll see!
For the uninitiated, HKT means higher kinded types, which is the type theoretic term for what C++ calls template template parameters. That is, the possibility to pass around type-level functions. I don't know what ATC refers to.
Yes, thank you! ATC is "associated type constructors". I believe Haskell calls them "type families"? It's sort of a similar idea; a form of higher-kindness. We are likely to get ATC in the medium term, but full HKT is farther out.
I have a question about that, actually: macros in C/C++ are disgusting, so why isn't Rust trying to improve on that? Rust's are better, obviously, but still not a nice thing to parse.
Well, macro patterns are a very different language than Rust; Rust has no way to express "one or more of this thing", for example. I wasn't involved when the syntax was chosen, and that was well before our RFC process, so I don't have any real, deep explanation for you, sorry!
Const fn is being worked on (available in unstable rust). C++ style templates (if that's what you mean by template templates) are not something that is planned for Rust, but some limited forms of specialization are being considered (a simple version is available in unstable).
For powerful meta programming Rust aims for procedural macros, not for templates, but I'm not too familiar with the details.
That's totally fine. Most importantly, I want it to work well with functors, generic functions, and compile-time evaluation (for things like manual loop unrolling and so on).
I value expressivity, and I appreciate your efforts! I don’t think it will be long before I try it out.
I have one sore spot about Rust: The Rust build tools are almost unusably slow on small ARM devices like the Raspberry Pi Zero. I can compile a simple C++ program just fine on such a device, but Rust will take ages building dependencies and requires a swap file because it eats up all the physical memory. Meanwhile, in C++ land, my distribution's package manager includes prebuilt C++ libraries and headers. I tried cross compilation but that was a nightmare to set up, and got even worse when Cargo packages depended on C libraries, which then also had to be cross compiled. In the end, I was unable to get it working, and had to give up my attempt at targeting that platform with Rust.
Cross-compilation should be much easier when Rust starts using the lld linker. You can try it out now by building a statically-linked musl binary in nightly.
An interesting approach to fixing and changing without breaking backwards compatibility. Time will tell if it lives up to its promises, but I'm intrigued...
The closest I've seen to this model is "use strict", but having this at the package level is a nice quality of life improvement.
IME we put far too much gravity on 1.0 and 2.0. If you’re building a team to last, you’ll be at 3.0 before you know it. If you didn’t, you’ll be out of business before it happens. I know which team I want to work with...
> Turns out that's false. What's amazing is that the compiler ensures that projects using the legacy edition can still use libraries relying on the new edition.
> This means that if you're using Rust 2018, you can use dependencies that use Rust 2015 with zero problems. If you're sticking with Rust 2015, you can use libraries that use Rust 2018. It all works together.
That's great!