Hacker News

I tried to build a simple web app in C++ recently... nothing against the language, but deciding which framework to use and then integrating it into my project and build took me a couple of days. Ultimately I gave up and did the same thing in Python in 30 minutes.

I don't understand how such a popular language can have such a weak story in dependency management and integration with third party libraries.

Even something that should be simple, like using gRPC / protocol buffers in C++, took me way more time than I intended, and my final CMake file ended up looking extremely hacky with some other dependencies...

IMO C++ needs something akin to the Rust book (https://www.rust-lang.org/learn) or the official Python docs (https://docs.python.org/3/), plus a built-in dependency manager.

Every C++ project's Make/CMake setup seems to end up as its own unique dumpster fire that is non-trivial to add dependencies to...




> I don't understand how such a popular language can have such a weak story in dependency management and integration with third party libraries.

For one thing, C++ predates the common notion of package managers, and widely available internet access. So it was just never designed as part of the language.

Tooling as part of the language is a relatively "new" concept.

C is 48 years old, C++ is 35 years old, the first web browser is 30 years old, and Java merely 25 years old.


Java wasn't designed with it in mind either, yet Maven's dependency resolution scheme has been the de facto standard for over a decade at this point (despite all of its warts, like being completely binary-oriented).


This is one of the reasons that header-only libraries have become quite popular. They're easy to deal with: just add to your include path and you're ready to go.

I put off starting a C++ project I'd wanted to start for months because I couldn't bring myself to deal with the dumpster fire known as CMake, and I now try to limit myself to projects that only require header-only libraries, if possible (and use tup [1] to build, since it's simple). Since I only really do C++ for fun these days, that's worked out OK.

[1] http://gittup.org/tup/


The reason is called laziness and unwillingness to learn how things work.

If I, as an '80s high school student, was able to figure out how to deal with dependencies in the MS-DOS world, with several commercial compilers, in an age where information came only from books and magazines, then anyone in the Internet age can learn how to do it.

I will avoid them as much as possible.


I disagree. CMake is a freaking mess, but it's won, so most non-header-only libraries are set up for CMake. I've tried to learn CMake a number of times and have written non-trivial CMake configs. I spent a lot of time on it that I could have spent on my actual project instead, and in the end it was still not quite right (breaking after moving to a different system and trying to compile there). Multiple attempts to learn how to do it properly all failed.

So I gave up; I have better (more fun, more productive) things to spend my time on. Life is too short to waste on such things.


Didn't know tup. Seems to be Makefiles done right (at least at first glance...).


Take a look at DJB Redo, as well. There are many implementations, so here's a well-documented one: https://redo.readthedocs.io/en/latest/


I haven't used it for anything particularly complex, and I haven't used it with anything that has a ton of non-header-only third-party libraries, so I don't know how it holds up in those cases, but for my smaller projects it's been really pleasant. The tupfiles are easy to write and the tool is super fast. It took me from dreading starting a new project with CMake to enjoying using C++ again.

I used to just use Qt Creator and let it set up the build system. I could just go back to that, but for now I'm very happy with tup.


C++ is a terrible language for small, one/few-person projects.

It still sucks for bigger projects that cannot reinvent the ecosystem around them and need to rely on third-party libraries. Frameworks try to alleviate the problem and to provide a saner ecosystem sandbox, with varied success.

Once a project becomes large enough to define its own ecosystem (Android, Chrome, Firefox, KDE/Qt, google3, etc.), C++ becomes remotely sustainable: the language ecosystem is not a bottleneck anymore, compilers get fixed and extended when needed, libraries are heavily morphed or rewritten, and build systems are usually custom anyway. It takes a gigantic snowflake to fit the C++ niche, and even then it requires a lot of internal engineering and in-house experience to make the developer experience bearable.

Rust is eating the small-scale niche like wildfire, and it is proving successful in the middle-scale niche. It will be interesting to see the growing pains and language/ecosystem adjustments when the generation that started with Rust grows up and large projects start to get written and maintained in Rust.


As a one-person team, I found C++ quite good for avoiding touching C as much as possible, while still getting the safety and abstractions I enjoyed so much in Turbo Pascal.

Rust isn't eating any small-scale niche of GUIs and shipping binary libraries any time soon.


To be fair, Google libraries are famously hard to build and integrate. It doesn't help that they use their own build system, though it seems gRPC can be built with CMake now. It's been a good number of years since I last did it, but integrating gRPC was one of the worst experiences I've ever had integrating a dependency (on Windows, at least).


>I don't understand how such a popular language can have such a weak story in dependency management and integration with third party libraries.

>All of it could literally be replaced by something akin to: cppvenv --depend-on grpc-1.40 zeromq json

C++ existed for roughly 20 years before the practice of "automatically reach out to the internet and download dependencies" became common.

As a result, C++ doesn't yet have a Schelling Point[1]. Your proposed packager syntax requires a Schelling Point (i.e. canonical http repo source as a Focal Point to download "grpc-1.40,zeromq,json"). My previous comment about this: https://news.ycombinator.com/item?id=22583139

Tldr, no vendor with enough sway to influence the fragmented C++ community (commercial shrinkwrap software vs open source, embedded vs server, HPC/scientific, games, etc) has created such a Schelling Point.

[1] https://en.wikipedia.org/wiki/Focal_point_(game_theory)


This is a very good point.

In fact, I would add that no vendor exists with enough sway to influence even a significant part of the C++ community (not even the committee can).


Historically the dependency management tool for C++ is the system's package manager, and/or a shell configure/provision script.
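For the common case, that historically looked something like this (a Debian-flavored provisioning sketch; the package and pkg-config names are illustrative):

```shell
# The distro's dev package provides headers and the shared library;
# pkg-config then supplies the right compiler and linker flags.
sudo apt-get install libzmq3-dev
g++ main.cpp $(pkg-config --cflags --libs libzmq) -o app
```

It works, but you get exactly one version of each library: whichever one the distro ships.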

I don't see any value add in a "built-in" dependency management tool for C++. Not that it wouldn't be a useful thing to have, just that it can't solve the big problem, which is that most C++ projects would never support it.

There's also the fact that long dependency chains are frowned upon in many C++ code bases.


What about supporting multiple versions of a library on one computer? npm, rvm, virtualenv, cargo, Go modules, Maven, etc. all support it.

A very common use case is to use some newer version of something than what shipped with the OS but at the same time you don't want to potentially cause instability in the OS...

Or, you know, developing multiple projects with different versions from a single OS.

Also, in case I really do want to do it, I have to go to some obscure website, download some payload and read some build instructions to see how to best integrate it into my messy CMake/Make system...

All of it could literally be replaced by something akin to

    cppvenv create myproject
    cppvenv --depend-on grpc-1.40 zeromq json
    cppvenv build
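FWIW, Conan gets reasonably close to that workflow today. A sketch with a conanfile.txt (package names and versions are illustrative, not checked against Conan Center):

```ini
# conanfile.txt -- declarative dependency list for Conan
[requires]
grpc/1.50.1
zeromq/4.3.4
nlohmann_json/3.11.2

[generators]
CMakeDeps
CMakeToolchain
```

followed by `conan install . --build=missing`, which fetches or builds the packages and generates the CMake integration files.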


You would have to go to some obscure website to download some payload and read build instructions to see how to best integrate it with your new cppvenv program instead. That's the problem to solve: C++ is old, most of it was written before those solutions existed, and most of it doesn't use the same set of tools.

Like I said, I'm not saying this isn't a problem or that a solution wouldn't be nice. It's just not functionally different from the myriad of existing solutions.


You edit your generated Makefile, or tweak the autoconf (or whatever you used) config, to point at a specific version of the library, I guess.
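Something like this, as a Makefile fragment (paths hypothetical):

```make
# Point the build at a privately installed libfoo 2.1 instead of the
# system copy; FOO_PREFIX is wherever `make install` put it.
FOO_PREFIX := $(HOME)/opt/libfoo-2.1
CXXFLAGS   += -I$(FOO_PREFIX)/include
LDFLAGS    += -L$(FOO_PREFIX)/lib -Wl,-rpath,$(FOO_PREFIX)/lib
```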


Nix does what you want.
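For example, a minimal per-project shell.nix (the attribute names are my guess at the nixpkgs spellings):

```nix
# Entering this with `nix-shell` gives a pinned C++ dev environment,
# independent of whatever the host system has installed.
{ pkgs ? import <nixpkgs> {} }:
pkgs.mkShell {
  buildInputs = [ pkgs.cmake pkgs.grpc pkgs.zeromq pkgs.nlohmann_json ];
}
```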


Yes, I love Nix. The only caveat is that it does not work on Windows...


> dependency management tool for C++ is the system's package manager

The system package manager has a different purpose than a "development" package manager.

The system packages are supposed to give you a consistent set of libraries needed to support the application shipped by the distro.

If a package you depend on for daily work, say Firefox, requires libfoo-1.0 and your project needs libfoo-2.1, you're screwed.

Similarly if you're supporting an old release myapp-1.0 for some customers and it needs libfoo-1.0, but at the same time you're working on myapp-2.0 which needs libfoo-2.1, you're out of luck.

The system package manager just doesn't cut it for development work.


You can install and link libraries by their full versioned name alongside the system default version.


> Historically the dependency management tool for C++ is the system's package manager, and/or a shell configure/provision script.

I agree with you here. System package managers were invented to solve system-global dependency problems between libraries/executables, as well as to provide headers and static libraries for development on the current machine.

> I don't see any value add to a "built in" dependency management tool to C++. Not as in it isn't a useful thing to have, just that it can't solve the big problem which is that most C++ projects would never support it.

Even if you don't see a need, you have to acknowledge that many other people do see the need for some sort of local dependency management, be it at compile time, like cargo or the various cross-build environments (OpenEmbedded, Buildroot and co.), or at runtime, like chroots, containers and virtual machines.

It can be argued that the local dependency management solutions are in their infancy.


> Not as in it isn't a useful thing to have, just that it can't solve the big problem which is that most C++ projects would never support it.

For a living language the solution is never: "throw your hands up in the air and give up".


Unlike Rust and Go, C++ has a long history and a lot of baggage, so there is no standard tooling for C++ such as build systems and package managers. In the past, every operating system had its own set of preferred tooling for C and C++: on Linux and Unix-based systems, lots of projects still use GNU Autotools or Makefiles; on Windows, many projects still use the XML-based MSBuild from the Visual Studio IDE. Many IDEs have their own XML-based build systems as well. Nowadays, major IDEs support CMake and allow using a CMakeLists.txt as the project file by just opening the directory containing it. Some of those IDEs are Visual Studio, Visual Studio Code, CLion, Eclipse (via plugin), KDevelop and so on.

Regarding library dependencies, CMake has a FetchContent module which allows downloading source code directly from HTTP, FTP or Git servers and adding that code as a project dependency. This approach can avoid wasting time installing and configuring libraries. However, it is only feasible for lightweight libraries, where the compile time and library object-code size are not significant. For large libraries and frameworks where the compile time is significant, such as the Boost libraries, Gtk or Qt, Conan or vcpkg are the best solutions, as they can cache and reuse library object code across many projects, which reduces compile time and disk-space usage.
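For reference, the FetchContent route looks roughly like this (repository and tag shown as an example):

```cmake
include(FetchContent)
FetchContent_Declare(
  json
  GIT_REPOSITORY https://github.com/nlohmann/json.git
  GIT_TAG        v3.11.2
)
# Downloads at configure time and add_subdirectory()s the dependency
FetchContent_MakeAvailable(json)
target_link_libraries(myapp PRIVATE nlohmann_json::nlohmann_json)
```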

C++ is not the best solution for web applications; in most cases you will not gain anything by using C++ there, unless the web application is running on an embedded system such as a router or a network printer. C++ is most suitable for cases where you need high performance, access to low-level operating system features, or to implement an operating system, an embedded system or a triple-A game.


It is not ideal, but for the subset of libraries that use CMake it can be as easy as dumping the library in a subdirectory, and you are done. Things only start to go sideways when you have to integrate with Google libraries, since they typically have insane build tooling (Skia).


Aren't things like NuGet supposed to solve this? https://devblogs.microsoft.com/cppblog/nuget-for-c/


> I don't understand how such a popular language can have such a weak story in dependency management and integration with third party libraries.

At some point 40 years from now, someone will complain about Rust not having full integration with whatever IPFS/blockchain distributed computing package managers/sharded source control/vulcan mind meld the internet turns into. There really was a world before the internet was universal. People used tools like Fortran on these things called Crays (or CDC 6600s or IBM doodads before that); you'd bring your data on a VHS cassette or reel-to-reel tapes, because your university only rented a single T1 line (1.5 Mbps, FWIW), which was seen as plenty of bandwidth for all the scientists in the university. Nobody but the scientists used the internet in those days.

That universe is still around; matrix math is still ultimately Fortran (matrix math is why people invented computers in the first place, as amazing as this may sound to people with ipotatoes in their pockets). That's '50s and '60s tech, with gotos and line numbers: back then, using multiple files for your software was considered getting fancy. You were usually submitting jobs as a stack of punched cards. I never did, but my older roomies in grad school certainly did.

Similarly, the OS you're using is basically mid-1970s technology, written in mid-1970s programming languages. Back then, package management wasn't so much of a problem; the whole source code for the OS could be published in a not-too-big book (and was, FWIW). Just a reminder: the runtime for modern Rust is absurdly bloated compared to those days; "Hello World" in Rust is literally larger than the entire hard drive of 1970s computers whose OS was written in C; hell, there were '80s computers with 4 MB drives that were seen as fairly adequate. Anyway, that's why C++ (basically PDP-11 assembler with Bjarne's deranged ideas about objects... and some later, better ideas) has a shitty package management story. It's not so bad really; it's just a skill that n00bs don't have any more. I personally never needed anything fancier than Makefiles, and I saw CMake as a regression from autotools.


> Just a reminder: the runtime for modern Rust is absurdly bloated compared to those days;

Just a reminder: the runtime for Rust is effectively the same as C's; the smallest binary rustc has produced is 137 bytes: https://github.com/tormol/tiny-rust-executable

The defaults do not focus on size reduction because we aren't in the 1970s anymore, though, it's true.


> basically PDP-11 assembler with Bjarne's deranged ideas about objects

C++ was originally pretty much Simula-67 with C instead of Algol-60 as the foundation. What's "deranged" about the Simula object model?


With Bjarne's goal of never having to do BCPL-style programming ever again.



