GCC 11.1 (lists.gnu.org)
208 points by lelf on April 27, 2021 | 94 comments



Good to see the experimental C++23 features[1] starting to get support. Looks like literal suffixes for size_t are first. [2]
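
A minimal sketch of what that looks like, assuming GCC 11's -std=c++2b mode (the `uz` spelling comes from P0330):

  #include <vector>

  int main() {
      std::vector<int> v{1, 2, 3};
      // 0uz is a std::size_t literal, so the index type matches v.size()
      // without sign-conversion warnings or a verbose static_cast.
      for (auto i = 0uz; i < v.size(); ++i) {
          v[i] += 1;
      }
  }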

[1] http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2019/p059...

[2] https://gcc.gnu.org/gcc-11/changes.html

[3] https://en.wikipedia.org/wiki/C%2B%2B23#cite_note-1


Oh, reflection! <3


I'm naive about the language development process. I have no area of expertise in C++. But coming from a background in Python I've seen people complain about Python getting bigger and bigger, and people have been complaining about C++ being huge for even longer.

In my naive opinion, it seems to me that both C++ and Python have reached a point where most people are satisfied with the features the language has. Most of the complaints are around warts in the language itself that people wish could be fixed (but can't due to backwards compatibility).

My question is why the need to keep adding features? Sure if something comes out that C++ is desperately lacking, add it in. But it seems like that hasn't been the case for a while.

My other question. Would it be possible for a language to work like an OS? Where there is a LTS version of the language that is supported for X years, and then a new version comes out that contains potentially (but not always if it's unnecessary) breaking changes? I guess Python was kind of an example of that with Py2 to Py3, but no one who started using python after 1.0 expected there to be a shift like that. But if from the outset there is that expectation that after 5-10 years there will be a new version that removes warts in an old version would people accept that?

It seems to me that the lifecycle of a successful language is: 1. Be the new hotness and solve a problem in the programming space 2. Gain traction and users 3. Release version 1.0, become bound by decisions that might haunt you for the rest of the program's life 4. Accumulate features, bloat and warts 5. Have people complain about warts that you can't fix due to breaking changes 6. A new language develops that fixes your warts. 7. Repeat

It seems like the cognitive load from "upgrading" a language would be a lot lower than learning an entirely new language from scratch (even if the new one fixes a lot of your gripes).

Having been playing around with Rust for a bit, I've seen the conversation come up about warts in Rust, and I think people fear it will eventually become another C++. I think the answer is probably yes, if it endures for as long as C++ has.


Lots of answers to this, but

(1) Computing is growing as a whole. What was good enough in 1980 isn't good enough now. Computing is used for more problems, and by more people. Programs are bigger, so there is pressure toward compressing common code and idioms. Hardware is more capable and more complex: it's more heterogeneous, dynamic, has more sensors, etc.

(2) Different problems have different use cases. Your "wart and bloat" is another person's feature, and vice versa.

Multi-core, multithreading and GPUs are major reasons that features have been added to C++, and those were less of a concern in 1980 when C++ was conceived (GPUs especially).

The "cycle" you lay out isn't accurate because languages in fact do make progress. We're not going around in a circle. Rust fixes a lot of things wrong with C and C++; Go and Swift are also improvements, etc. (I don't even use those languages, but I can recognize the improvements as someone who's been using C and C++ for a long time).

I recommend writing some C from scratch to get a feel for this. I do this on occasion, because there are advantages. But you will also be extremely hard pressed to do anything that's interesting for a user. The gap to bridge is very large.

For example, try writing a web app in C (not for production). You can save a lot of code by using CGI, but even then it's not fun. (It IS still done; look at the source code to the modern cgit UI if interested.)

Also try to modify and understand source code from the 90's like GNU bash or CPython. (Interestingly, there is a pretty big difference between the codebases, despite being from similar eras.) Nonetheless it should be clear from that experience that we've made progress.

The progress isn't perfect, but freezing languages in time isn't a reasonable option, given the massive change in the problems being solved, and the environment.


> people wish could be fixed (but can't due to backwards compatibility)

This is actually not completely true. The only example I can think of right now is how C++20 deprecated some uses of volatile, but the standards definitely do deprecate features and break backwards compatibility. This is done with care, only when it makes sense. The standards and individual papers/proposals are public, so feel free to take a look, they include rationales in such cases.

> My question is why the need to keep adding features? Sure if something comes out that C++ is desperately lacking, add it in.

Well, that already is how it works. Just because you don't know about some change doesn't mean it wasn't worthwhile.

There are often complaints regarding the standardization process itself, though; I think, e.g., the coroutines situation, with multiple competing proposals, was pretty controversial. But that's just how it is with C, C++ (and Ada, Pascal, Basic, ...). It seems like the days of ISO reigning over programming languages are over though, in that newer programming languages aren't given over to ISO.


When not ISO, it is ECMA.

Many industries won't touch a language if it doesn't come with a standard and multiple implementations.

Even languages like Java, C#, F#, have their standard documents, or foundation with working groups that are hardly any different from either ISO or ECMA, as means to be taken into those industries.


>My question is why the need to keep adding features? Sure if something comes out that C++ is desperately lacking, add it in. But it seems like that hasn't been the case for a while.

C++20 does address legitimate pain points of the language:

- Modules alone can cut 30% off compile times, as well as solving some other corner cases.

- Coroutines will make writing network code easier.

- constinit and consteval greatly simplify the horrible TMP hacks C++ programmers are already using.

- The spaceship operator removes a lot of boilerplate when defining comparison operators (see the sketch below).

- Library additions like <format>, <bit>, <numbers>, std::jthread, etc., standardize widely used operations and libraries.
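
To illustrate the spaceship point, a hedged sketch (assumes -std=c++20; pre-C++20 this would take six hand-written operators):

  #include <compare>

  struct Version {
      int major, minor, patch;
      // A defaulted operator<=> also gives a defaulted operator==, so all six
      // comparisons (==, !=, <, <=, >, >=) are generated member-wise.
      auto operator<=>(const Version&) const = default;
  };

  static_assert(Version{1, 2, 3} < Version{1, 3, 0});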

In contrast, it seems like most of the language features proposed for Python are to "keep up" with other languages, e.g. the pattern matching proposal. It's made even worse in Python's case because it has historically been sold as a simple and easy to understand language.

>I think the answer is probably yes, if it endures for a similar amount of time that C++ did

Maybe. But Rust is in a much better position to avoid C++ style complexity. Editions allow Rust to "clean up" the language without breaking existing code (C++ is trying to do the same with Epochs). And Rust doesn't try to maintain broad compatibility with another language like C++ does with C89. Other features, like powerful macro definition facilities, also help offload language features to libraries. And most of all, Rust can learn from the mistakes of C++, e.g. destructive moves are definitely the way to go.

Anyway, it does seem like programming languages are either destined to ossify, like C, or sprawl out of control, like C++. Although, this problem can be largely avoided if the language lets code leverage the compiler, such as Lisp macros.


I still don't buy into epochs at the industrial scale at which languages like C and C++ are used today.

For the whole idea of mixing epochs to work out, it relies on using the same compiler for the whole project, no use of binary libraries, and no changes to language semantics.


It happens to be the case today that Rust doesn't support binary libraries built by different compiler versions, but I don't see any way that editions could interfere with this.

Editions only include changes that are crate-local, and crates using different editions must always be interoperable. For example, the module system path changes in Rust 2018 change how you organize code within a crate, but there's no way for any user of the crate to possibly know or care about this. Rust 2018 adds the ? operator, but again there's no way for users of the crate to know or care about whether code in your crate uses ? or not.

Am I misunderstanding you, or could you share more details of how crates specifying different editions could rely on them all using the same compiler?

Are there any changes in the 2018 edition, or planned for the next edition, that would interfere with using different compilers, or binary libraries? What's a possible change that you imagine could be introduced as part of a Rust edition that would cause interoperability problems here?


Sure, imagine Rust a couple of years from now, with its own edition equivalent of "C++20", so let's come up with 2015, 2018, 2021, 2024, 2030 as possible epochs.

Now imagine a scenario where everyone uses binary libraries, like plenty of corporations do with C, C++, Java, .NET, Swift and other compiled languages.

How can epochs ensure that a binary library compiled in edition 2018 will be able to take a callback using a lambda written in an edition 2024 main application, calling into a function available in an edition 2015 crate, and then statically linked into a common runtime?

The current answer is it can't, unless all libraries happen to be compiled with the same compiler, and linked with the same runtime version.

Yes, it is a hard problem to solve, which not even a stable ABI fully fixes, because keeping it stable restricts how the language can evolve in what can be exposed at library boundaries and what the runtime library can expect.

Long term, editions won't be much different from -source in Java or -std= in C, C++ and so on.

It works for the time being because 2018 is basically the only edition available, with 2015 being pre-1.0, the same compiler gets backports to have a runtime compatible with both versions and there are no breaking changes where the language semantics have changed.


Hey, thanks for the response. I still feel like I'm missing an assumption, or we're otherwise talking past each other.

Specifically, it sounds like you're assuming there is (or could be, or should be?) some relation between editions and ABI?

I agree that Rust's current answer to your scenario is that all of the crates must have been compiled with the same compiler, but that has nothing at all to do with editions, and it's the same answer for exactly the same reasons if all of the crates were using the same language edition.

Editions aren't relevant here because editions are very limited in scope. Consider these quotes from the in-progress "2021 Edition" RFC[1]:

  Editions are used to introduce changes into the language that would
  otherwise have the potential to break existing code, such as the
  introduction of a new keyword.

  Editions are never allowed to split the ecosystem. We only permit
  changes that still allow crates in different editions to interoperate.

  The most important rule for editions is that crates in one edition
  can interoperate seamlessly with crates compiled in other editions.
  This ensures that the decision to migrate to a newer edition is a
  "private one" that the crate can make without affecting others,
  apart from the fact that it affects the version of rustc that is
  required, akin to making use of any new feature.

  The requirement for crate interoperability implies some limits on
  the kinds of changes that we can make in an edition. In general,
  changes that occur in an edition tend to be "skin deep". All Rust
  code, regardless of edition, is ultimately compiled to the same
  internal representation within the compiler.
Whatever stable ABI Rust might eventually get, I expect it to be completely edition-oblivious. There should not be any way to tell by inspecting a compiled artifact in a stable Rust ABI which edition its code was written in.

Having written all this, I'm now wondering if you might have instead been trying to express something about how Rust's editions are insufficient to handle the requirements that a stable ABI would address? If so, I completely agree, although I'd be confused by the comparison, as they're very different kinds of things.

[1]: https://github.com/nikomatsakis/rfcs/blob/edition-2021-or-bu...


> Editions aren't relevant here because editions are very limited in the scope.

Which is exactly my whole point: as the language evolves, editions won't be able to cover all possible language changes, and compromises will be required, especially regarding possible incompatible semantic differences across editions.

The ABI example is just one way to make this issue more visible.


In theory, for big source trees with lots of small files and long include lists, modules can reduce compilation time by a factor of five, since that time is dominated by repeated parsing of headers.
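
For reference, a minimal sketch of the kind of unit involved (module syntax per C++20; GCC's still-experimental support is enabled with -fmodules-ts, and file naming conventions vary by compiler):

  // math.cppm -- module interface unit
  export module math;

  export int square(int x) { return x * x; }

  // main.cpp -- consumer: no header to re-parse, just the compiled module interface
  // import math;
  // int main() { return square(3); }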


I don't think Python and C++ are really comparable here. Python was meant to be fairly minimalist from the start. I think if Python "feature froze" right now it'd still remain relevant for a long time. As you mention for its core use case you could argue that it's feature-complete already.

C++ on the other hand was never small. It was never an objective. C++ started as C with classes so it was really just a way to have proper OOP in C. Then it grew and grew and grew, adding more paradigms along the way.

C++ is in a relatively tough spot because it has to maintain backward compatibility all the way back to C89 while at the same time attempt to stay relevant when pitted against Rust, Zig, D and friends. You can't simplify anything because you'd lose backcompat and if you just feature freeze you're doomed to become irrelevant in the not-so-far future.

Unlike Python you can't really argue that C++ as it is now is feature complete. Dependency management is still a nightmare (as is the entire build system really, given that you don't have any standard C++ build system). Concepts are also a great addition to the language IMO that I've been missing for as long as I've used the language.

Coroutines are also fast becoming a standard tool for system programming (for better or worse, but that's a different discussion), and C++'s support in the standard is still experimental.


> C++ is in a relatively tough spot because it has to maintain backward compatibility all the way back to C89

C++ does not maintain backwards compatibility with any version of C. Almost any relatively large idiomatic C program will fail to compile as C++ because of a few simple differences (most commonly, in C it is idiomatic to write `int *c = malloc(sizeof(int))`, while this is a type error in C++).


> Dependency management is still a nightmare (as is the entire build system really, given that you don't have any standard C++ build system)

CMake sucks but it’s OK that CMake quickly became so dominant, because at least it’s everywhere. It’s also peculiar, unhelpful, and dangerous in the same sort of general way as C++. So, I find CMake to be very comfortable, and I like it, and I wouldn’t be entirely surprised if Bjarne announced one day that CMake is now the standard C++ build system.


I don't think that's true at all for Python. It's explicitly created as a "batteries included" language with a very large standard library. This is a quote from PEP 206:

"The Python source distribution has long maintained the philosophy of "batteries included" -- having a rich and versatile standard library which is immediately available, without making the user download separate packages."


Except Python is quite huge as well, and introduces breaking changes even across minor versions. People think it is simple, but that is actually quite deceptive.


I don't know what the situation is with the C++ development. But in general, based on my experience on standardization work, it's really hard to stop a committee from inventing new features when the members get paid to participate and make an impact. Maybe exaggerated but sometimes I got the feeling that some proposal existed only so that the presenter was able to justify a week in a nice meeting location like Hawaii.


> it's really hard to stop a committee from inventing new features when the members get paid to participate and make an impact.

This could also explain most awful, unnecessary GUI updates, unprovoked by any customers, throughout software.


The main issue I have with C++ is that you basically have to relearn it from scratch every time a standard comes out. C++11 threw away decades of best practices, and C++17 again changed so many things it's hard to keep track of them (for instance, stuff like `string_view` has had a _huge_ impact on how I write code and how I reason about strings, making `const char *` almost disappear from some of my codebases).
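
To illustrate the string_view point with a small hedged sketch (the function name is made up):

  #include <cstddef>
  #include <string_view>

  // Accepts string literals, std::string, and char buffers alike, without copying
  // or calling strlen; the old-style signature would take const char* (plus a length).
  std::size_t count_spaces(std::string_view text) {
      std::size_t n = 0;
      for (char c : text) {
          if (c == ' ') ++n;
      }
      return n;
  }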

C++20 with concepts and ranges will revolutionise everything, again, because those features are so big and pervasive they make lots of patterns and "older" parts of the language suddenly feel like "legacy". For instance, I see concepts vastly reducing the need for inheritance in lots of cases, massively simplifying what lots of folks like me were already doing with SFINAE and templates.
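
A hedged sketch of that simplification (the concept and function names here are made up for illustration):

  #include <concepts>
  #include <string>

  template <typename T>
  concept Printable = requires(const T& t) {
      { t.to_string() } -> std::convertible_to<std::string>;
  };

  // Pre-C++20, the same constraint would typically be spelled with std::enable_if
  // and a decltype expression buried in the template parameter list.
  template <Printable T>
  std::string render(const T& value) {
      return value.to_string();
  }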


As far as I understand, that is the whole point of many of these changes: there are patterns that people have to bend their code into and unsafe constructs that need to be used with care, and the newer standards seek to replace patterns with language features (e.g. concepts more or less instead of SFINAE), or bring safe alternatives to the previously necessary unsafe ones (string_view instead of `char*`).

Now, the major concern that some have is that this will not be successful, and that instead of replacing X with Y, you'll be left with both X and Y in the language as still-necessary constructs for new code, making the language ever more complicated even if you can disregard the legacy.

C++ already suffers somewhat from the strong ties between many of its features (for example, many people would like to use C++ without exceptions, but then you can't use `new` or constructors that can fail, and without constructors you can't use RAII, and so on).


Yes, that's definitely a possibility, but I think that when two patterns keep existing, instead of a new one just replacing the old one, it is most of the time because people refuse to change the way they write code.

For instance, `new` is basically nothing short of obsolete in modern C++, except when used as "placement new". There are literally zero reasons for ever writing `new T`, unless you are implementing `std::make_unique()` in the STL. Disabling exceptions is also IMHO a very poor choice - first, because there is no valid performance reason to do so, and second because it basically cripples half of the STL (and lots of other libraries). It then becomes tricky to use safely, because basically everything might just call std::terminate() anywhere, without any formal specification or being declared `noexcept`, just by changing compiler options.

C++ exceptions, when used sparingly and for, well, exceptional events, are not as bad as people think they are. What's very unfortunate is that they got a very bad rap in the 90s, and then countless developers were misled by Java's unfortunate design choices, so they often fail to understand that handling failures should always be considered part of the ordinary code path, and not treated as exceptional at all.

I personally use exceptions as if they were a lighter version of assert, like `panic` from Rust or Go, in order to indicate that some basic code assumption has been violated and thus executing a certain action is no longer possible (like, some parameter is malformed when you specifically asserted it's not, and such). Using them to represent normal runtime failures, such as IO errors, is sloppy design, and something much better accomplished by returning error values (which might be enum classes or an error struct).
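
A hedged sketch of the error-value style meant here (the names are illustrative, not from any real API):

  #include <cstdio>

  enum class IoError { None, NotFound };

  // An ordinary, expected failure is reported as a value, not thrown.
  IoError open_config(const char* path, std::FILE** out) {
      std::FILE* f = std::fopen(path, "rb");
      if (!f) {
          return IoError::NotFound;
      }
      *out = f;
      return IoError::None;
  }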


Just FYI

> but then you can't use `new`

You can certainly use new without exceptions, see https://en.cppreference.com/w/cpp/memory/new/nothrow or https://stackoverflow.com/a/15292148

> and without constructors you can't use RAII, and so on).

I think RAII is more about destructors (which must never throw exceptions). You can certainly write RAII style without throwing constructors. Generally, exceptions depend on RAII, but not the other way round.


> You can certainly use new without exceptions, see https://en.cppreference.com/w/cpp/memory/new/nothrow or https://stackoverflow.com/a/15292148

You can use new without exceptions, but you can't detect OOM then, if I'm not mistaken.

> I think RAII is more about destructors (which must never throw exceptions). You can certainly write RAII style without throwing constructors. Generally, exceptions depend on RAII, but not the other way round.

The whole idea of RAII is that you Acquire a Resource in the constructor (during Initialization), so that the destructor can guarantee it gets de-allocated.

If you construct your object and then acquire a resource in it later, from a function that can return an error without throwing, then you introduce complexity in the destructor and risk more errors.


> The whole idea of RAiI is that you Acquire a Resource in the constructor (during Initialization), so that the destructor can guarantee it gets de-allocated

Yes, that's what the acronym means, but unfortunately it's really a misnomer.

When we say RAII style programming, we rather mean scoped destruction to avoid manual resource management. A classic example would be using std::lock_guard over a raw std::mutex. Some RAII objects don't throw on resource acquisition failure (std::fstream), or they expect that resource acquisition can fail (std::unique_lock with the std::try_to_lock tag). Other RAII objects might not acquire any resource in the constructor at all; instead their only job is to release the resource that is given to them, like std::unique_ptr.
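
To make the lock_guard example concrete, a minimal sketch:

  #include <mutex>

  std::mutex m;
  int counter = 0;

  void increment() {
      std::lock_guard<std::mutex> guard(m);  // lock acquired in the constructor
      ++counter;
  }  // guard's destructor unlocks here, on every return path and on exceptions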


> You can use new without exceptions, but you can't detect OOM then, if I'm not mistaken.

AFAIK `new (std::nothrow) T` returns nullptr if allocation fails, like std::malloc.
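
For instance, a minimal sketch of checking for failure without exceptions:

  #include <cstddef>
  #include <new>

  int* try_allocate(std::size_t n) {
      int* p = new (std::nothrow) int[n];
      if (p == nullptr) {
          // handle the failure here instead of catching std::bad_alloc
      }
      return p;
  }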


Exactly! It's worth pointing out that `new (std::nothrow) T` itself also performs a null check, so it doesn't call the constructor if the allocation failed.


> You can use new without exceptions, but you can't detect OOM then, if I'm not mistaken.

You can detect the same errors with and without exceptions - the nothrow variant returns nullptr on error. Due to overcommit, neither can be relied on to detect OOM in practice.

> If you construct your object and then acquire a resource in it later, from a function that can return an error without throwing, then you introduce complexity in the destructor and may risk errors more.

Not necessarily, e.g. free and delete are both no-ops for nullptr.


> In my naive opinion, It seems to me that both C++ and Python have reached a point where most people are satisfied with the features the language has.

Most definitely not. C++ still doesn't have reflection, ffs, which I am reminded of every time I have to type the name of an enum or class twice.
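
For anyone wondering what "twice" refers to, a hedged illustration of the usual hand-written duplication that reflection would remove:

  #include <string_view>

  enum class Color { Red, Green, Blue };

  // Without reflection, the enumerator names have to be repeated by hand.
  constexpr std::string_view to_string(Color c) {
      switch (c) {
          case Color::Red:   return "Red";
          case Color::Green: return "Green";
          case Color::Blue:  return "Blue";
      }
      return "unknown";
  }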


Idle question: if you could toss everything added to C++ in the last 20 years in return for reflection, would that be worth it?


I'm of the opinion that powerful enough reflection + metaclasses (e.g. with the ability to synthesize almost arbitrary code) would be able to transform a lot of language features (noexcept, coroutines, ...) into library features. But it's not possible for all of them, and you'll pry auto, range-based for, atomics semantics, lambdas, structured bindings, and brace-init... from my cold dead hands.


Reflection is also one of the things I miss the most.


New features get added to C++ only when people vote for them. I believe anyone gets to cast a vote. You have to write a paper that explains the rationale for your idea.

So, features get added because they're considered good ideas.


I fear this is the road C# is on.


If anyone is looking for a build system to try C++20 modules with GCC, there is build2: https://build2.org/blog/build2-cxx20-modules-gcc.xhtml

There is also a repository of module examples (some trying to imitate real-world usage like distributing modules as part of a library): https://github.com/build2/cxx20-modules-examples/


Where can I find build2 binaries to try this out?



There's no binary package for Windows apparently


I wish there was a C++ standard that didn't add any features, but only fixed bugs or deprecated old features, and possibly improved existing features, giving compiler writers a chance to actually catch up and developers a chance to stabilize their codebases. C++26?


Pick a C++ version and use that, then. Compilers are not dropping old versions of the standard, but the new versions are just that: new. No deprecation affects an older version of C++, and no one is forcing developers (as seen by the 20-30% that still use C++98/03) to upgrade their code bases.


Yes, all you need to do, with both g++ and clang, is to specify the version of the standard that you want. Picking a mature standard makes it likely that most bugs have been fixed.


Compiler writers are pretty quick to add support for new features--I'd say it's about a year-ish between being added to the standard (which is not the same as the actual release it's in!) and actually being usable in a compiler.

Where the delay comes into play is that most projects require supporting a compiler that's several years out of date. And if you're required to support 4-year-old compilers, then C++17 features aren't going to be available, despite them being available in the newest versions of all compilers. Back in 2018, I was working on a project where I gave up and used C++17 features (constexpr if) because I knew I could get away with only supporting a Clang that was a few weeks old.
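
For reference, a small sketch of the `constexpr if` feature mentioned (assumes -std=c++17; the branch not taken is discarded at compile time, which replaces a lot of tag dispatch and SFINAE):

  #include <string>
  #include <type_traits>

  template <typename T>
  std::string describe(const T& value) {
      if constexpr (std::is_integral_v<T>) {
          return "integer: " + std::to_string(value);
      } else {
          return "something else";
      }
  }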


Compiler writers aren’t that far behind? C++17 is mostly supported. C++20 is one year old!


"C++20 is one year old!"

The ISO standard was published in December of last year. So, technically, C++20 is only four months old.


> C++17 is mostly supported.

For essentially any other language this would be an absurd thing to say. For those languages you wouldn't consider a feature added until it was supported by the reference compiler


If you are using GCC, Clang, or MSVC then the standards support is excellent[1].

But putting that aside

> For those languages you wouldn't consider a feature added until it was supported by the reference compiler

Well, there is no reference compiler for C++. I don't think any language with either an ISO or ECMA standard has a reference compiler or interpreter.

There are gaps in support here and there. But I would still say those mainstream compilers "support" C++17 for the same reason I would say that Chrome "supports" CSS3, despite it not implementing the full specification.

[1] https://en.cppreference.com/w/cpp/compiler_support


Only for languages that lack multiple implementations.


Some of the additions, like coroutine support, are roughly bug fixes.


Nope, Python 3 showed what happens.


What happens? You become the most-widely used language on the planet? The Python 2-3 wars are over. Python 3 won.


I just installed Python 2.7 on my new project because some critical libraries used by the customer don't care Python 3 exists.


Open source ones?


To some extent yes, and no I am not going to mention which ones.


I just can't think of a major open source python lib that hasn't added support for 3 if the functionality wasn't replaced by a new feature in 3.


GitHub is a tiny drop in the universe of enterprise computing.


You get 10 years of headaches for everyone who gets close to the language. Even today, the python executable in Ubuntu is called `python3`, not `python`.

To be fair though, it would be less of a problem for C++ than it was for Python, since you wouldn't have to depend on the compiler from the end-user system.


backwards compatibility is a bitch.

But yes, there's a reason why high-visibility C++ projects like Chrome basically white-list 30% of the language and keep it that way, to some small sub-set they feel is "good enough".

The problem is the fragmentation this causes. I pick these 5 features of the language, you pick another 7, I can't use your lib, etc, etc.

C++ is becoming extremely bloated. Bjarne said as much in one of his recent criticisms at the highly-specialized use-case proposals that people wanted to make "standard".

I enjoyed his C++11 book but now that's probably all outdated "oh we don't do it that way anymore" stuff and I can't afford to just buy 2-3 1000-page books every year to keep up. Got better things to do.


Sadly, I really doubt it. The standards committee seems mostly interested in adding new template metaprogramming features at this point. What I /really/ wish they would focus on is modules. 100% of C++ developers suffer from horrible compile times, and yet it seems to always get punted.

(Not to mention that forcing developers to write header-only libraries to use templates pretty much makes them way less attractive anyway)

Edit: maybe I'm wrong? It looks like modules are in C++ 20. No idea if any compilers are supporting them yet.


Modules doesn’t even really speed up the build if you have a highly parallel build... and it’s unclear how build tools should support it in a uniform way. I think modules will go unused but I hope I’m wrong


Modules have been released with C++20, but no compiler has a complete implementation for them yet. Also the standard library hasn't been modularized yet.


How is it possible that modules were released, if nobody has a working implementation? I was under the impression that proposals needed at least a proof-of-concept implementation from a sponsor. My exposure to this is searching the CMake tracker for this last year and seeing that it wasn't even working there yet.


> How is it possible that modules were released, if nobody has a working implementation?

There were plenty of experimental implementations over the years; some people have been using some flavour of "modules" with MSVC and Clang for something like 5 years... hell, technically it has always been possible to compile code as Objective-C++ to leverage the Objective-C module system, if you could afford to use Clang everywhere.


As the other answer said, nobody has (yet) an implementation of the _released_ standard, but both GCC and especially Clang have had modules very close to what was stabilized in C++20 for years. You can enable them by specifying a flag (-fmodules-ts) on any recent release of Clang.


I know that but I'm just baffled that the text of the standard could be released at a point in time when nobody has succeeded at implementing it fully. That seems like it would almost guarantee that no one implements the standard as it's written, as there is no proof-of-concept that it's truly viable... I'd expect to see several revisions to this before C++23 as the compilers work out the issues.


MSVC is expected to have C++20 feature completeness in VS 2019 16.10 Preview 3.

https://github.com/microsoft/STL/wiki/Changelog#expected-in-...


Watch the Visual C++ virtual day coming next week.


The standardization work for modules is done. Now all build tools and code bases need to start supporting it in a good way.


Isn't modules already a part of C++20?


There is. It's called Zig[1]. Usually I would have written Rust, but it's good to give Zig more publicity.

[1] https://ziglang.org/


This evangelism stuff has really got out of hand on HN - a whole new language is literally the opposite of what the parent asked for. To make matters worse, the languages you choose to evangelize are new/experimental, making your comment even less on point.


Yeah. I like Rust but I feel like it's pushed way too hard. If it's really 'that good', people will slowly pick it up.

I personally don't like C++, but I wouldn't enjoy someone going on a Rust thread and shilling C++. So I won't shill Rust on a C++ thread. Seems similar for Zig.


As a huge fan of Rust, I am very interested in people writing comments about what C++ does for them that I'm missing in Rust.

I want more C++ shilling in Rust threads, and I don't think I'm alone.

Please, give me detailed examples of how you can use SFINAE or whatever to express useful misuse-resistant abstractions that can't be expressed with Rust's feature set. Please tell me about classes of bugs you can prevent in C++ that can't be similarly prevented in Rust.

Or maybe C++ isn't safer, but it's more performant? Easier to use? Easier to learn? Easier to troubleshoot and debug?

I agree that the Zig comment that started this sub-thread didn't contribute to the conversation. If we disagree, although I'm not sure we do, it's in that I'd rather call for higher-effort comments about the benefits of other languages instead of fewer comments about them.


- GPGPU programming ecosystems like CUDA and SYCL (note: ecosystem, not just a compiler that generates PTX code)

- Game engines like Unreal and Unity

- GUI frameworks like wxWidgets, Qt, MFC, WinUI

- Being the language of choice to contribute anything to GCC or LLVM

- Being the language of choice for the native libraries ecosystem and runtime extensions to plug into Java, .NET, nodejs.

- Being the language to write drivers in macOS (IO Kit/Driver Kit), and Android alongside Java (Project Treble)

- Being available out of the box in the macOS, iOS, Windows, Android, PlayStation, Xbox and Switch SDKs and their respective IDE tooling

So you can decide to deal with some of C++'s flaws and enjoy 40 years of history, or spend part of your application development budget on building a Rust ecosystem.


> Windows

The C++ SDK is definitely not available out of the box on Windows. You have to install it (and update it manually!).


Only for those who choose the path of not using the OS vendor tools and don't install Visual Studio.

Not only is it automatically selected in the respective workloads that depend on it being available, it also gets updated.


Eigen, although often criticized for its long build times, is still the only linear algebra library that can lazily evaluate and optimize expressions at compile time, thanks to C++'s template metaprogramming.

Overall, Rust's current type system isn't good enough to match what C++ has for numeric computations, although there have been some recent improvements such as the min-const-generics feature. I'll probably try the language again once const generics are finished.
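
A small hedged sketch of the Eigen point (assumes Eigen 3 headers are available):

  #include <Eigen/Dense>

  // a*x + y is built as an expression template and fused into a single loop when
  // it is assigned to the result; no temporary vector is materialized for a*x.
  Eigen::VectorXd axpy(double a, const Eigen::VectorXd& x, const Eigen::VectorXd& y) {
      return a * x + y;
  }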


> If its really 'that good', people will slowly pick it up.

Empirically, that's been happening?


It's just shilling.

OP asked for:

> only fixed bugs or deprecated old features

When you get right down to that, you do end up with something like Rust. E.g. C++'s move semantics are weird because for historical reasons copying is the default. Get rid of those historical reasons, and there's no reason not to do it the Rust way.

40 years of "no big breaking changes" really is a lot of cruft.


> and there's no reason not to do it the Rust way

memcpy everywhere is definitely not the best answer to every problem


If you want a copy constructor, it would work like this:

Copy : Clone :: Move : Relocate

I.e. there would not be the presumption that everything is automatically movable or copyable; there would be magic traits to indicate memmove / memcpy and friends, and then plain old stdlib super traits for user-defined cloning and relocating.

This is the right design for move constructors, full stop.


It's not like C++'s move semantics magically get rid of the necessity of copying data either.


I meant to write it's not just shilling.


OP didn't ask for a new language that is stable, they asked for a stable C++.

But yes, it's true that Zig aims to be a minimal, stable language - when they go to 1.0, which hasn't happened yet. It's still changing frequently.


It took this mail a bit over 3 hours to be sent to my mailbox. (I subscribe to gcc-announce.) I'm curious what the reason for such a delay is. (Nothing wrong with the delay, just curious about the technical reason.)


My guess: Sending email to a lot of recipients when you're not a large email provider can end up with you getting throttled / needing to apply throttling to individual servers to prevent blocking.

In the past when I still operated my own email servers I often saw list emails minutes before on e.g. gmail.


They might be using a cron job or some other batch process to send the email to each member of the mailing list. There are limits to CC and BCC, I believe.


Do you use Gmail? Gmail restricts the number of emails that a single sender can send, so sometimes mailing lists have to retry until Gmail lets them send (this is a pain point for the Linux kernel mailing list).


I use Fastmail.


The changelog isn't final yet, so I'm not sure if I should link it or not, but there are some nice additions to the D frontend in this one.

"Reports of my death are greatly exaggerated" - GCC


__attribute__ ((malloc (mydealloc, 1))) looks interesting. Didn't know about that one.
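
A hedged sketch of how it might be used (assuming GCC 11; the function names are made up):

  #include <cstdlib>

  void my_free(void* p) { std::free(p); }

  // Tells GCC that pointers returned here must be released with my_free (whose
  // 1st parameter takes the pointer), so -Wmismatched-dealloc and the static
  // analyzer can flag calls that release the result with the wrong function.
  __attribute__((malloc(my_free, 1)))
  void* my_alloc(std::size_t n) { return std::malloc(n); }

  void demo() {
      void* p = my_alloc(16);
      // std::free(p);  // would be flagged: not the declared deallocator
      my_free(p);
  }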


Likewise, this caught my eye: As in C++, function definitions no longer need to give names for unused function parameters.

Goodbye UNUSED() macro!


The improved static analyzer is one of the features I am looking forward to.


The MODE_CC conversion for VAX is in this version. Yay!



