fouronnes3
2 days ago
This is a good article but it only scratches the surface, as is always the case when it comes to C++.
When I made a meme about C++ [1] I was purposeful in choosing the iceberg format. To me it's not quite satisfying to say that C++ is merely complex or vast. A more fitting word would be "arcane", "monumental" or "titanic" (get it?). There's a specific feeling you get when you're trying to understand what the hell is an xvalue, why std::move doesn't move or why std::remove doesn't remove.
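For the uninitiated, the last one in miniature (std::remove only shifts the kept elements forward and returns an iterator; nothing is erased until you ask):
std::vector<int> v{1, 2, 3, 2};                // needs <vector> and <algorithm>
auto it = std::remove(v.begin(), v.end(), 2); // nothing erased yet: v is now {1, 3, ?, ?}
v.erase(it, v.end());                          // the erase-remove idiom actually shrinks v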
The Forrest Gump C++ is another meme that captures this feeling very well (not by me) [2].
What it comes down to is developer experience (DX), and C++ has a terrible one. From syntax all the way up to package management, a C++ developer feels stuck in a time before they were born. At least we have a lot of time to think about all that while our code compiles. But that might just be the price for all the power it gives you.
jandrese
2 days ago
In Linuxland you at least have pkg-config to help with package management. It's not perfect but neither is any other package management solution.
If I'm writing a small utility or something the Makefile typically looks something like this:
CC=clang
PACKAGES=libcurl libturbojpeg
CFLAGS=-Wall -pedantic -std=gnu17 -g $(shell pkg-config --cflags $(PACKAGES))
LDLIBS=$(shell pkg-config --libs $(PACKAGES))
.PHONY: all
all: imagerunner
imagerunner: imagerunner.o image_decoder.o downloader.o
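(No recipes needed above: make's built-in rules fill them in. Roughly, each step expands to something like
$(CC) $(CFLAGS) -c image_decoder.c -o image_decoder.o
$(CC) $(LDFLAGS) imagerunner.o image_decoder.o downloader.o $(LDLIBS) -o imagerunner
with the exact flags depending on your make implementation.)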
duped
2 days ago
Consider that to do this you must:
- Use a build system like make, you can't just `c++ build`
- Understand that C++ compilers by default have no idea where most things are, you have to tell them exactly where to search
- Use an external tool that's not your build system or compiler to actually inform the compiler what those search paths are
- Oh also understand the compiler doesn't actually output what you want, you also need a linker
- That linker also doesn't know where to find things, so you need the external tool to use it
- Oh and you still have to use a package manager to install those dependencies to work with pkg-config, and it will install them globally. If you want to use it in different projects you better hope you're ok with them all sharing the same version.
Now you can see why things like IDEs became default tools for teaching students how to write C and C++, because there's no "open a text editor and then `c++ build file.cpp` to get output" for anything except hello world examples.
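Spelled out as raw commands, that pipeline looks something like this (file names illustrative):
$ cc $(pkg-config --cflags libcurl) -c main.c        # compile: told where the headers live
$ cc main.o $(pkg-config --libs libcurl) -o app      # link: a separate tool, told where the libs live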
jlarocco
2 days ago
It's really not that big of a deal once you know how it works, and there are tools like CMake and IDEs that will take care of it.
On Windows and OSX it's even easier - if you're okay writing only for those platforms.
It's more difficult to learn, and it seems convoluted to people coming from Python and JavaScript, but there are also a lot of advantages to not having package management and build tooling tightly integrated with the language or compiler.
scrubs
a day ago
I agree -- I've been at it long enough -- CMake etc. makes stuff pretty darn easy.
But in industrial settings where multiple groups share and change libs, something like dpkg may be used. You add caching and you can go quite deep quickly, especially after bolting on CI/CD.
One must cop to the fact that a go build or zig build is just fundamentally better.
jlarocco
a day ago
Yeah, I definitely agree the newer tools are better, but sometimes the arguments against C++ get blown out of proportion.
It definitely has a lot of flaws, but in practice most of them have solutions or workarounds, and on a day-to-day basis most C++ programmers aren't struggling with this stuff.
rapidlua
a day ago
Go build is fundamentally better? How so? Go build is so light on features that adding generated files to source control is a norm in go land.
scrubs
a day ago
Generated files are noise.
Newer languages' build systems have built-in version resolution to resolve dependencies, together with smarter ways to reference dependencies than #include.
And that's better.
saghm
a day ago
"It's not really that big a deal once you know how it works" is the case with pretty much everything though. The question is whether the amount of time needed to learn how something works is worthwhile though, and the sheer number of things you need to invest the time to learn in a language like C++ compared to more modern languages is a big deal. Looking at a single one of them in isolation like the build system essentially just zooms in one problem far enough to remove the other ones from the picture.
duped
a day ago
So I come from the C/C++ world, that's part of why I disagree with these takes. I wouldn't say any process involving CMake is "not that big of a deal" because I routinely see veteran developers struggle to edit cmake files to get their code to compile and link.
Defletter
a day ago
This is pure Stockholm syndrome. If I were forced to choose between creating a cross-platform C++ project from scratch or taking an honest to god arrow to the knee, the arrow would be less painful.
AlienRobot
a day ago
I don't want any arrows in my knees but I agree.
The main reason I don't want to use C/C++ is the header files. You have to write everything in a header file and then again in an implementation file. Every time you want to change a function you need to do this at least twice. And you don't even get fast compilation speed compared to some languages, because your headers will #include some library that is immense, and then every header that includes that header will have transitive header dependencies. To solve this you use precompiled headers, which you might have to set up manually depending on what IDE you are using.
It's all too painful.
jstimpfle
a day ago
It gets better with experience. You can have a minimal base layer of common but rarely changing functionality. You can cut down on static inline functions in headers. You can avoid putting data structure definitions in headers, exposing only forward declarations. (Don't use C++ methods, at least don't put them in an API, because they force you to expose your implementation details needlessly.) You can separate data structures from functions in different header files. Grouping functions together with types is often a bad idea, since most useful functionality combines data from two or more "unrelated" types -- so you'd rather organize function headers "by topic" than put them alongside types.
I just created a subsystem for a performance-intensive application -- a caching layer for millions or even billions of objects. The implementation encompasses over 1000 LOC, but the header only includes <stdint.h>. There are about 5 forward struct declarations and maybe a dozen function declarations in that API.
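A minimal sketch of what such a header can look like (illustrative names, not my actual code):
/* cache.h -- the only thing clients ever see */
#include <stdint.h>
struct cache;                                     /* forward declaration: layout stays private */
struct cache *cache_create(uint64_t max_objects);
const void *cache_get(struct cache *c, uint64_t key);
void cache_put(struct cache *c, uint64_t key, const void *obj);
void cache_destroy(struct cache *c);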
To a degree it might be stockholm syndrome, but I feel like having had to work around a lot of C's shortcomings I actually learned quite a lot that helps me in architecting bigger systems now. Turns out a lot of the flexibility and ease that you get from more modern languages mostly allows you to code more sloppily, but being sloppy only works for smaller systems.
einpoklum
a day ago
If you were forced to choose between creating that cross-platform project in one of the trendy languages or in C++ - but where, of course, it must also work on tiny hardware with weird custom OSes on some hobbyist hardware, and with 30-year-old machines in some large organization's server farm - then you would choose the C++ project, since you will be able to make that happen, with some pain. And with the other languages - you'll probably just give up or need to re-develop all userspace for a bunch of platforms, so that it can accommodate the trendy language's build tool. And even that might not be enough.
Also: If you are on platforms which support, say, CMake - then the multi-platform C++ project is not even that painful.
nxobject
a day ago
> but where, of course, it must also work on tiny hardware with weird custom OSes on some hobbyist hardware, and with 30-year-old machines in some large organization's server farm - then you would choose the C++ project, since you will be able to make that happen, with some pain.
With the old and proprietary toolchains involved, I would bet dollars to doughnuts that there's a 50% chance of C++11 being the latest supported standard. In that context, modern C++ is the trendy language.
CyberDildonics
a day ago
Why? There are lots of cross platform libraries and most aspects are not platform specific. It's really not a big deal. Use FLTK and you get most of the cross platform stuff for free in a small package.
einpoklum
a day ago
It is a big deal even after you know how it works.
The thing is, languages like Rust only make this easier within their controlled "garden". But with C and C++, you build in the "world outside the garden" to begin with, where you are not guaranteed that everyone has prepared everything for you. So it's harder, and you may need third-party tools, or to put in some elbow grease, or both. The upside is that when rustaceans or go-phers and such wander outside their respective gardens, most of them are completely lost and have no idea what to do; but C and C++ people are kinda-sorta at home there already.
steveklabnik
a day ago
What is "outside the garden" for Rust?
einpoklum
11 hours ago
Oh, say, use some binary C library with some header, that you found on some system.
hiimkeks
10 hours ago
That shouldn't be too tricky, assuming the binary is built for the sort of device you want to run on. At least not much more complicated than calling any other C code, using bindgen.
steveklabnik
7 hours ago
Yep, Rust made design decisions to make this case as zero overhead as it can be.
imtringued
a day ago
I used to write a lot of C++ in 2017. Now in 2025 I have no memory of how to do that anymore. It's bespoke Makefile nonsense with zero hope of standardization. It's definitely something that doesn't grow with experience. Meanwhile my Gradle setups have been almost unchanged since that time, except for the stupid backwards-incompatible Gradle releases.
gpderetta
a day ago
> It's bespoke Makefile nonsense with zero hope of standardization
technically, make is standardized (by POSIX), contrary to most alternatives.
/extremely pedantic
pjmlp
a day ago
I would rather deal with Makefiles than Gradle.
saghm
a day ago
I think we can afford to strive for more than just "not quite the absolute worst" (for however we decide to measure quality).
einpoklum
a day ago
> I used to write a lot of C++ in 2017... It's bespoke Makefile nonsense
1. Makefiles are for build systems; they are not C++.
2. Even for building C++ - in 2017, there was no need to write bespoke Makefiles, or any Makefiles. You could, and should, have written CMake; and your CMake files would be usable and relevant today.
> Meanwhile my Gradle setups have been almost unchanged since that time
... but, typically, with far narrower applicability.
feffe
11 hours ago
CMake has become the de facto standard in many ways, but I don't think it's that easy to deal with. There's often some custom support code in a project (just as with makefiles) that you need to learn the intricacies of, and also external third-party (3pp) modules that solve particular integration issues with building software that you also need to learn.
For me, base CMake is pretty easy by now, but I'd rather troubleshoot a makefile than some obscure 3pp CMake module that doesn't do what I want. Plain old makefiles are very hackable for better or worse [1]. It's easy to solve problems with make (in bespoke ways), and at the same time this is the big issue, causing lots of custom solutions of varying correctness.
[1]: Make is easy the same way C is easy.
einpoklum
11 hours ago
I didn't say "easy to deal with", I said it's not bespoke nonsense, and that you could keep it mostly unchanged today, 8 years later.
Plus - the "obscure third party modules" have been getting less obscure and more standard-ish. In 2017 it was already not bad, today it's better.
account42
a day ago
> Use a build system like make, you can't just `c++ build`
This is a strength not a weakness because it allows you to choose your build system independently of the language. It also means that you get build systems that can support compiling complex projects using multiple programming languages.
> Understand that C++ compilers by default have no idea where most things are, you have to tell them exactly where to search
This is a strength not a weakness because it allows you to organize your dependencies and their locations on your computer however you want and are not bound by whatever your language designer wants.
> Use an external tool that's not your build system or compiler to actually inform the compiler what those search paths are
This is a strength not a weakness because you are not bound to a particular way of how this should work.
> Oh also understand the compiler doesn't actually output what you want, you also need a linker
This is a strength not a weakness because now you can link together parts written in different programming languages which allows you to reuse good code instead of reinventing the universe.
> That linker also doesn't know where to find things, so you need the external tool to use it
This is a strength not a weakness for the reasons already mentioned above.
> Oh and you still have to use a package manager to install those dependencies to work with pkg-config, and it will install them globally. If you want to use it in different projects you better hope you're ok with them all sharing the same version.
This is a strength not a weakness because you can have fully offline builds including ways to distribute dependencies to air-gapped systems and are not reliant on one specific online service to do your job.
Also, all of this is a non-issue if you use a half-modern build system. Conflating the language, compiler, build system and package manager is one of the main reasons why I stay away from "modern" programming languages. You are basically arguing against the Unix philosophy of having different tools that work together, with each tool focusing on one specific task. This allows different tools to evolve independently, and for alternatives to exist, rather than a single tool that has to fit everyone.
juliangmp
a day ago
> This is a strength not a weakness
Massive cope, there's no excuse for the lack of decent infrastructure. I mean, the C++ committee for years said explicitly that they don't care about infrastructure and build systems, so it's not really surprising.
gpderetta
a day ago
The reality is that for any moderately complex C++ application, actually compiling C++ code is only a small part of what the build system does.
raverbashing
a day ago
Well yeah
We have autoconf/automake checking whether you're on a big-endian PDP-8 or whether your compiler has support for cutting-edge features like "bool".
palata
2 days ago
I can use pkg-config just fine.
Not sure how relevant "in order to use a tool, you need to learn how to use the tool" is as a complaint.
Or from the other side: not sure what I should think about the quality of the work produced by people who don't want to learn relatively basic skills... it does not take two PhDs to understand how to use pkg-config.
duped
2 days ago
I'm just pointing out that one reason devex sucks in C++ is that you need a wide array of tools, which are non-portable and require learning and teaching magic incantations at the command line or in build scripts; that doesn't foster what one could call a "good" experience.
Frankly the idea that your compiler driver should not be a basic build system, package manager, and linker is an idea best left in the 80s where it belongs.
menaerus
2 days ago
For most people this is a feature, not a bug as you suggest. It may come across as a PITA, and for many people it will, but as far as I am concerned, having also experienced the pain of package managers in C++, this is the right way. In the end it's always about the trade-offs. And all the (large) codebases that used conan, bazel or vcpkg introduced an order of magnitude more issues to handle than you would have had with plain CMake. Package managers are for convenience, but not all projects can afford the trouble this convenience brings with it.
soanvig
a day ago
Coming from a different background (TypeScript), I agree, in the sense that there is a line where apparent convenience becomes a burden. The JS ecosystem is known for its hype around build tools. Long term, all of them become a problem due to trying to be ever more convenient, leading to more and more abstractions and hidden behaviors, which turns into a mess that is impossible to debug or solve when the user diverges from the author's happy path. Thus I promote using only the necessities, and gluing them together by yourself. Even if something doesn't work, at least it can be tracked down and solved.
palata
a day ago
> For most people this is a feature, not a bug as you suggest.
Exactly: it makes many things nicer to use than the language package managers, e.g. when maintaining a Linux distribution.
But people generally don't know how one maintains a Linux distribution, so they can't really see the use-case, I guess.
palata
a day ago
> require learning and teaching magic incantations at the command line
That's exactly my point: if you think that calling `cmake --build build` is "magic", then maybe you don't have the right profile to use C++ in the first place, because you will have to learn some harder concepts there (like... pointers).
To be honest, I find it hard to understand how a software developer can write code and still consider that command line instructions are "magic incantations". To me it's like saying that calling a function like `println("Some text, {}, {}", some_parameter, some_other_parameter)` is a "magic incantation". Calling a function with parameters counts as "the basics" to me.
bluGill
2 days ago
The idea that packages and builds are simple belongs to simple problems; large projects need things like more than one language, and so end up fighting the language.
duped
2 days ago
Every modern language seems to have an answer to this problem that C and C++ refuse to touch because it's out of scope for their respective committees and standards orgs
nothrabannosir
2 days ago
On the front page right now:
Shai-Hulud malware attack: Tinycolor and over 40 NPM packages compromised (stepsecurity.io)
935 points by jamesberthoty 16 hours ago | 730 comments
Maybe obstreperous dependency management ends up being the winning play in 2025 :)
pclmulqdq
2 days ago
C++ has a plethora of available build and package management systems. They just aren't bundled with the compiler. IMO that is a good thing, because it keeps the compiler writers honest.
jimbob45
2 days ago
You say that as if Cargo, MSBuild, and pip aren’t massively loved by their communities.
jcelerier
2 days ago
Coming from C++, pip and Python dependency management is the bane of my life. How do you make Python software leveraging PyTorch that ships as a single .exe and can target whatever GPU the user has, without downloads?
jononor
a day ago
Funnily enough, a lot of the challenges in this particular case are related to PyTorch and CUDA being native code (mostly C++), combined with the fact that pip is not really adequate as a native/C++ package manager.
Perhaps if C++ had a decent standardized package manager, the Python package system could reuse that? ;p
gpderetta
a day ago
just wait for next week and python will get a better package manager!
pclmulqdq
2 days ago
"Massively loved" and "good decision" are orthogonal axes. See the current npm drama. People love wantonly importing dependencies the way they love drinking. Both feel great but neither is good for you.
asa400
2 days ago
Not that npm-style package management is the best we can do or anything, but I would be more sympathetic to this argument if C or C++ had a clearly better security story than JS, Python, etc. (pick your poison), but they're also disasters in this area.
What happens in practice is people end up writing their own insecure code instead of using someone else's insecure code. Of course, we can debate the tradeoffs of one or the other!
bluGill
a day ago
This isn't only about security. This is about interoperability, in the real world we mix (and should mix!) C, C++, Rust, python.... In the real world lawyers audit every dependency to ensure they can legally use it. In the real world we are responsible for our dependencies and so need to audit the code.
Defletter
a day ago
I'm getting the impression that C/C++ cultists love it whenever there's an npm exploit, because then they can gleefully point at it and pretend that any first-party package manager for C/C++ would inevitably result in the same, never mind the other languages that do not have this issue, or have it to a far, far lesser extent. Do these cultists just not use dependencies? Are they just [probably inexpertly] reinventing every wheel? Or do they use system packages like that's any better *cough* AUR exploits *cough*. While dependency hell on nodejs (and even Rust, if we're honest) is certainly a concern, it's npm's permissiveness and lack of auditing that's the real problem. That's why Debian is so praised.
pclmulqdq
a day ago
What makes me a C++ "cultist"? I like the language, but I don't think it's a cult. And yes, they do implement their own wheel all the time (usually expertly) because libraries are reserved for functions that really need it: writing left pad is really easy. They also use third-party libraries all the time, too. They just generally pay attention to the source of that library. Google and Facebook also publish a lot of C++ libraries under one umbrella (abseil and folly respectively), and people often use one of them.
bluGill
a day ago
STOP SAYING CULTIST! The word has very strong meaning and does not apply to anyone working with C or C++. I take offense at being called a cultist just because I say C++ is not nearly as bad as the haters keep claiming it is - as well I should.
palata
a day ago
> Or do they use system packages like that's any better *cough* AUR exploits *cough*.
AUR stands for "Arch User Repository". It's not the official system repository.
> I'm getting the impression that C/C++ cultists love it whenever there's an npm exploit
I am not a C/C++ cultist at all, and I actually don't like C++ (the language) so much (I've worked with it for years). I, for one, do not love it when there is an exploit in a language package manager.
My problem with language package managers is that people love them precisely because they don't want to learn how to deal with dependencies. Which is actually the problem: if I pull a random Rust library, it will itself pull many transitive dependencies. I recently compared two implementations of the same standard (C++ vs Rust): in C++ it had 8 dependencies (I can audit that myself). In Rust... it had 260 of them. 260! I won't even read through all those names.
"It's too hard to add a dependency in C++" is, in my opinion, missing the point. In C++, you have to actually deal with the dependency. You know it exists, you have seen it at least once in your life. The fact that you can't easily pull 260 dependencies you have never heard about is a feature, not a bug.
I would be totally fine with great tooling like cargo, if it looked like the problem of random third-party dependencies was under control. But it is not. Not remotely.
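If you want to check a project yourself, counting the full transitive set is roughly a one-liner from inside a checkout (it counts unique "name version" lines):
$ cargo tree --prefix none | sort -u | wc -l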
> Do these cultists just not use dependencies?
I choose my dependencies carefully. If I need a couple functions from an open source dependency I don't know, I can often just pull those two functions and maintain them myself (instead of pulling the dependency and its 10 dependencies).
> Are they just [probably inexpertly] reinventing every wheel?
I find it ironic that when I explain that my problem is that I want to be able to audit (and maintain, if necessary) my dependencies, the answer that comes suggests that I am incompetent and "inexpertly" doing my job.
Would it make me more of an expert if I was pulling, running and distributing random code from the Internet without having the smallest clue about who wrote it?
Do I need to complain about how hard CMake is and compare a command line to a "magic incantation" to be considered an expert?
Defletter
a day ago
> AUR stands for "Arch User Repository". It's not the official system repository.
Okay... and? The point being made was that the issue of package managers remains: do you really think users are auditing all those "lib<slam-head-on-keyboard>" dependencies that they're forced to install? Whether they install those dependencies from the official repository or from Homebrew, or nix, or the AUR, or whatever, is immaterial; the developer washed their hands of this, instead leaving it to the user, who in all likelihood knows significantly less than the developers, to make an informed decision, so they YOLO it. Third-party repositories would not exist if they had no utility. But this is why Debian is so revered: they understand this dynamic and so maintain repositories that can be trusted. Whereas the solution C/C++ cultists seem to implicitly prefer is having no repositories because dependencies are, at best, a slippery slope.
> "It's too hard to add a dependency in C++"
It's not hard to add a dependency. I actually prefer the dependencies-as-git-submodules approach to package managers: it's explicit and you know what you're getting and from where. But using those dependencies is a different story altogether. Don't you just love it when one or more of your dependencies has a completely different build system to the others? So now you have to start building dependencies independently, whose artefacts end up in different places, etc., etc. -- this shouldn't be a problem.
> I, for one, do not love it when there is an exploit in a language package manager.
Oh please, I believe that about as much as ambulance chasers saying they don't love medical emergencies. Otherwise, why are any and all comments begging for a first-party package manager immediately swamped with strawmen about npm, as if anyone is actually asking for that instead of, say, what Zig or Go has? It's because of the cultism, and every npm exploit further entrenches it.
pclmulqdq
a day ago
C++ usage has nothing to do with static/dynamic linking. One is a language and the other is a way of using libraries. Dynamic linking gives you small binaries with a lot of cross-compatibility, and static linking gives you big binaries with known function. Most production C++ out there follows the same pattern as Rust and Go and uses static linking (where do you think Rust and Go got that pattern from?). Python is a weird language that has tons of dynamic linking while also having a big package manager, which is why pip is hell to use and PyTorch is infamously hard to install.
Dynamic linking shifts responsibility for the linked libraries over to the user and their OS, and if it's an Arch user using AUR they are likely very interested in assuming that risk for themselves. 99.9% of Linux users are using Debian or Ubuntu with apt for all these libs, and those maintainers do pay a lot of attention to libraries.
palata
a day ago
> But this is why Debian is so revered: they understand this dynamic and so maintain repositories that can be trusted.
So you do understand my point about AUR. AUR is like adding a third-party repo to your Debian configuration. So it's not a good example if you want to talk about official repositories.
Debian is a good example (it's not the only distribution that has that concept), which proves my point and not yours: this is better than unchecked repositories in terms of security.
> Whereas the solution C/C++ cultists seem to implicitly prefer is having no repositories because dependencies are, at best, a slippery slope.
Nobody says that ever. Either you make up your cult just to win an argument, or you don't understand what C/C++ people say. The whole goddamn point is to have a trusted system repository, and if you need to pull something that is not there, then you do it properly.
Which is better than pulling random stuff from random repositories, again.
> I actually prefer the dependencies-as-git-submodules approach
Oh right. So you do it wrong; that's good to know, and it will answer your next complaint:
> Don't you just love it when one or more of your dependencies has a completely different build system to the others
I don't give a damn because I handle dependencies properly (not as git submodules). I don't have a single project where the dependencies all use the same build system. It's just not a problem at all, because I do it properly. What do I do then? Well exactly the same as what your system package manager does.
> this shouldn't be a problem.
I agree with you. Call it a footgun if you wish, you are the one pulling the trigger. It isn't a problem for me.
> why are any and all comments begging for a first-party package manager immediately swamped with strawmen about npm
Where did I do that?
> It's because of the cultism, and every npm exploit further entrenches it.
It's because npm is a good example of what happens when it goes out of control. Pip has the same problem, and Rust as well. But npm seems to be the worst, I guess because it's used by more people?
Defletter
a day ago
Your defensiveness is completely hindering you, and I cannot be bothered with that, so here are some much-needed clarifications:
> I am not a C/C++ cultist at all, and I actually don't like C++ (the language) so much (I've worked with it for years). I, for one, do not love it when there is an exploit in a language package manager.
If you do neither of those things then did it ever occur to you that this might not be about YOU?
> I find it ironic that when I explain that my problem is that I want to be able to audit (and maintain, if necessary) my dependencies, the answer that comes suggests that I am incompetent and "inexpertly" doing my job.
Yeah, hi, no you didn't explain that. You're probably mistaking me for someone else in some other conversation you had. The only comment of yours prior to mine in the thread is you saying "I can use pkg-config just fine." And again, you're thinking that this is about YOU, or even that I'm calling you incompetent. But okay, I'm sure your code never has bugs, never has memory issues, is never poorly designed or untested, that you can whip out an OpenGL alternative or whatever in no time and have it be just as stable and battle-tested, and to say otherwise must be calling you incompetent. That makes total sense.
> AUR stands for "Arch User Repository". It's not the official system repository.
> So it's not a good example if you want to talk about official repositories.
I said system package, not official repository. I don't know why you keep insisting on countering an argument I did not make. Yes, system packages can be installed from unofficial repositories. I don't know how I could've made this clearer.
--
Overall, getting bored of this, though the part where you harp on about doing dependencies properly compared to me and not elaborating one bit is very funny. Have a nice day.
palata
a day ago
> Your defensiveness
Start by not calling everybody disagreeing with you a cultist, next time.
> I said system package, not official repository. I don't know why you keep insisting on countering an argument I did not make. Yes, system packages can be installed from unofficial repositories. I don't know how I could've made this clearer.
It's not that it is unclear, it's just that it doesn't make sense. When we compare npm to a system package manager in this context, the thing we compare is whether or not it is curated. Agreed, I was maybe not using the right words (I should have said curated vs. uncurated package managers), but it did not occur to me that it was unclear, because comparing npm to a system package manager makes no sense otherwise. It's all just installing binaries somewhere on disk.
AUR is much like npm in that it is not curated. So if you find that it is a security problem: great! We agree! If you want to pull something from AUR, you should read its PKGBUILD first. And if it pulls tens of packages from AUR, you should think twice before you actually install it. Just like if someone tells you to do `curl https://some_website.com/some_script.sh | sudo sh`, no matter how convenient that is.
Most Linux distributions have a curated repository, which is the default for the "system package manager". Obviously, if users add custom, not curated repositories, it's a security problem. AUR is a bad example because it isn't different from npm in that regard.
> though the part where you harp on about doing dependencies properly compared to me and not elaborating one bit is very funny
Well I did elaborate at least one bit, but I doubt you are interested in more details than what I wrote: "What do I do then? Well exactly the same as what your system package manager does."
I install the dependencies somewhere (just like the system package manager does), and I let my build system find them. It could be with CMake's `find_package`, it could be with pkg-config, whatever knows how to find packages. There is no need to install the dependencies in the place where the system package manager installs stuff: it can go anywhere you want. And you just tell CMake or pkg-config or Meson or whatever you use to look there, too.
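Concretely, that workflow is something like this, using fmt as a stand-in dependency (paths illustrative):
$ cmake -S fmt -B fmt/build -DCMAKE_INSTALL_PREFIX=$HOME/deps   # build the dependency...
$ cmake --build fmt/build --target install                      # ...and install it into a private prefix
$ cmake -S . -B build -DCMAKE_PREFIX_PATH=$HOME/deps            # then point your own configure step at it
(For pkg-config-based finds, export PKG_CONFIG_PATH=$HOME/deps/lib/pkgconfig instead.)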
Using git submodules is just a bad idea for many reasons, including the fact that you need all of them to use the same build system (which you mentioned), or that a clean build usually implies rebuilding the dependencies (for nothing), or that it doesn't work with package managers (system or not). And usually, projects that use git submodules only support that, without offering a way to use the system package(s).
Defletter
16 hours ago
> Start by not calling everybody disagreeing with you a cultist, next time.
You'd do very well as a culture war pundit. Clearly I wasn't describing a particular kind of person; no, clearly I'm just talking about everyone I disagree with /s
palata
9 hours ago
So, not interested at all in how to deal with dependencies without git submodules, I reckon?
We can stop here indeed.
Defletter
2 hours ago
You misunderstand, I am already well aware. My comment about your lack of elaboration was not due to any ignorance on my part, but rather to point out how you assumed that and refused to elaborate anyway. The idea that I may have my reasons for preferring dependencies-as-git-submodules or their equivalents (like Zig's package system) never crossed your mind. Can't say I'm surprised. Oh well.
palata
a day ago
They are massively loved because people don't want to learn how it works. But the result is that people massively don't understand how package management works, and miss the real cost of dependencies.
pjmlp
a day ago
MSBuild also does C++.
palata
a day ago
Modern languages don't generally play nice with Linux distributions, IMO.
C and C++ have an answer to the dependency problem; you just have to learn how to do it. It's not rocket science, but you have to learn something. Modern languages remove this barrier, so that people who don't want to learn can still produce stuff. Good for them.
bluGill
a day ago
No they don't - at least not a good answer. It generally amounts to running a different build system and waiting - this destroys parallelism and slows the build down.
jchw
2 days ago
I'm not going to defend the fact that the C++ devex sucks. There are really a lot of reasons for it, some of which can't sensibly be blamed on the language and some of which absolutely can be. (Most of it probably just comes down to the language and tooling being really old and not having changed in some specific fundamental ways.)
However, it's definitely wrong to say that the typical tools are "non-portable". The UNIX-style C++ toolchains work basically anywhere, including Windows, although I admit some of the tools require MSys/Cygwin. You can definitely use GNU Makefiles with pkg-config using MSys2 and have a fine experience. Needless to say, this also works on Linux, macOS, FreeBSD, Solaris, etc. More modern tooling like CMake and Ninja works perfectly fine on Windows, doesn't need any special environment like Cygwin or MSys, and can use your MSVC installation just fine.
I don't really think applying the mantra of Rust package management and build processes to C++ is a good idea. C++'s toolchain is amenable to many things that Rust and Cargo aren't. Instead, it'd be better to talk about why C++ sucks to use, and then try to figure out what steps could be taken to make it suck less. Like:
- Building C++ software is hard. There's no canonical build system, and many build systems are arcane.
This one really might be a tough nut to crack. The issue is that creating yet another system is bound to just cause xkcd 927. As it is, there are many popular ways to build, including GNU Make, GNU Autotools + Make, Meson, CMake, Visual Studio Solutions, etc.
CMake is the most obvious winner right now. It has achieved de facto standard support. It works on basically any operating system, and IDEs like CLion and Visual Studio 2022 have robust support for CMake projects.
Most importantly, building with CMake couldn't be much simpler. It looks like this:
$ cmake -B .build -S .
...
$ cmake --build .build
...
And you have a build in .build. I think this is acceptable. (A one-step build would be simpler, but this is definitely more flexible; I think it is very passable.) This does require learning CMake, and CMake lists files are definitely a bit ugly and sometimes confusing. Still, they are pretty practical, and rather easy to get started with, so I think it's a clear win. CMake is the "de facto" way to go here.
- Managing dependencies in C++ is hard. Sometimes you want external dependencies, sometimes you want vendored dependencies.
This problem's even worse. CMake helps a little here, because it has really robust mechanisms for finding external dependencies. However, while robust, the mechanism is definitely a bit arcane; it has two modes, the legacy Find scripts mode, and the newer Config mode, and some things like version constraints can have strange and surprising behavior (it differs on a lot of factors!)
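For example, the two modes look identical at the call site except for one keyword (assuming the packages in question are installed):
find_package(CURL REQUIRED)          # Module mode: uses the FindCURL script bundled with CMake
find_package(fmt CONFIG REQUIRED)    # Config mode: uses the config file that fmt itself installs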
But sometimes you don't want to use external dependencies, like on Windows, where it just doesn't make sense. What can you really do here?
I think the most obvious thing to do is use vcpkg. As the name implies, it's Microsoft's solution to source-level dependencies. Using vcpkg with Visual Studio and CMake is relatively easy, and it can be configured with a couple of JSON files (and there is a simple CLI that you can use to add/remove dependencies, etc.) When you configure your CMake build, your dependencies will be fetched and built appropriately for your targets, and then CMake's find package mechanism can be used just as it is used for external dependencies.
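The manifest really is small; a minimal vcpkg.json at the project root looks something like this (names illustrative):
{
  "name": "imagerunner",
  "version": "0.1.0",
  "dependencies": [ "curl", "libjpeg-turbo" ]
}
You then point CMake at vcpkg's toolchain file (-DCMAKE_TOOLCHAIN_FILE=<vcpkg-root>/scripts/buildsystems/vcpkg.cmake) and find_package picks the dependencies up.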
CMake itself is also capable of vendoring projects within itself, and it's absolutely possible to support all three modalities of manual vendoring, vcpkg, and external dependencies. However, for obvious reasons this is generally not advisable. It's really complicated to write CMake scripts that actually work properly in every possible case, and many cases need to be prevented because they won't actually work.
All of that considered, I think the best existing solution here is CMake + vcpkg. When using external dependencies is desired, simply not using vcpkg is sufficient and the external dependencies will be picked up as long as they are installed. This gives an experience much closer to what you'd expect from a modern toolchain, but without limiting you from using external dependencies which is often unavoidable in C++ (especially on Linux.)
- Cross-compiling with C++ is hard.
In my opinion this is mostly not solved by the "de facto" toolchains. :)
It absolutely is possible to solve this. Clang is already better off than most other C++ toolchains in that it can select cross-compile targets at runtime rather than at build time. This avoids the issue in GCC where you need a toolchain built for each target triplet you wish to target, but you still run into the issue of needing a libc etc. for each target.
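For example, a single Clang binary can do (sysroot path illustrative):
$ clang++ --target=aarch64-linux-gnu --sysroot=/opt/sysroots/aarch64 -c foo.cpp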
Both CMake and vcpkg technically do support cross-compilation to some extent, but I think it rarely works without some hacking around in practice, in contrast to something like Go.
If cross-compiling is a priority, the Zig toolchain offers a solution for C/C++ projects that includes both effortless cross-compiling and an easy-to-use build command. It is probably the closest to solving every (toolchain) problem C++ has, at least in theory. However, I think it doesn't really offer much for C/C++ dependencies yet. There were plans to integrate vcpkg for this, I think, but I don't know where they went.
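For comparison, Zig's cross-compile story is a single flag, because it ships libc headers for each target:
$ zig cc -target aarch64-linux-gnu hello.c -o hello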
If Zig integrates vcpkg deeply, I think it would become the obvious choice for modern C++ projects.
I get that by not having a "standard" solution, C++ remains somewhat of a nightmare for people to get started in, and I've generally been doing very little C++ lately because of this. However I've found that there is actually a reasonable happy path in modern C++ development, and I'd definitely recommend beginners to go down that path if they want to use C++.
palata
a day ago
> Using vcpkg [...] When you configure your CMake build, your dependencies will be fetched and built appropriately for your targets, and then CMake's find package mechanism can be used just as it is used for external dependencies.
Yes! I believe this is powerful: if CMake is used properly, it does not have to know where the dependencies come from, it will just "find" them. So they could be installed on the system, or fetched by a package manager like vcpkg or conan, or just built and installed manually somewhere.
> Cross-compiling with C++ is hard.
Just wanted to mention the dockcross project here. I find it very useful (you just build in a docker container that has the toolchain setup for cross-compilation) and it "just works".
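From memory of the dockcross README, usage is roughly (image tag illustrative):
$ docker run --rm dockcross/linux-arm64 > ./dockcross && chmod +x ./dockcross
$ ./dockcross cmake -B build -S .     # runs in the container, cross toolchain preconfigured
$ ./dockcross cmake --build build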
jandrese
a day ago
You also don't do "rustc build". Cargo is a build system too.
The whole point of pkg-config is to tell the compiler where those packages are.
I mean yeah, that's the point of having a tool like that. It's fine that the compiler doesn't know that, because its job is turning source into executables, not being the OS glue.
I'm not sure "having a linker" is a weakness? What are we talking about?
It is true that you need to use the package manager to install the dependencies. This is more effort than having a package manager download them for you automatically, but on the other hand you don't end up in a situation where you need virtual environments for every application because they've all downloaded slightly different versions of the same packages. It's a bit of a philosophical argument as to what is the better solution.
The argument that it is too hard for students seems a bit overblown. The instructions for getting this up and running are:
1. apt install build-essential
2. extract the example files (Makefile and c file), cd into the directory
3. type "make"
4. run your program with ./programname
I'd argue that is fewer steps than setting up almost any IDE. The Makefile is 6 lines and is easy to adapt to any similar-size project. The only major weakness is headers, in which case you can do something like:
HEADERS=headerA.h headerB.h headerC.h
file1.o: $(HEADERS)
file2.o: $(HEADERS)
file3.o: $(HEADERS)
If you change any header it will trigger a full system rebuild, but on C projects this is fine for a long time. It's just annoying that you have to create a new entry for every c file you add to the project instead of being able to tell make to add that to every object automatically. I suspect there is a very arcane way to do this, but I try to keep it as simple as possible.
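(The arcane way does exist, for what it's worth: GNU make plus the compiler's dependency-output flags. A sketch, untested here:
CFLAGS += -MMD -MP
OBJS = imagerunner.o image_decoder.o downloader.o
-include $(OBJS:.o=.d)
Each compile then also writes a .d file listing the headers it actually included, and touching a header rebuilds only the objects that include it.)
steveklabnik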
a day ago
I'm not your parent, but the overall point of this kind of thing is that all of these individual steps are more annoying and error-prone than one command that just takes care of it. `cargo build` is all you need to build the vast majority of Rust projects. No need to edit the Makefile for those headers, or remember which commands you need to install the various dependencies, and name them individually, figuring out which name maps to your distro's naming scheme, etc. It's not just "one command vs five" it's "one command for every project vs five commands that differ slightly per project and per platform". `make` can come close to this, and it's why people love `./configure; make`, and there's no inherent reason why this couldn't paper over some more differences to make it near universal, but that still only gets you Unix platforms.
> but on the other hand you don't end up in a situation where you need virtual environments for every application because they've all downloaded slightly different versions of the same packages.
The real downside here is that if you need two different programs with two different versions of packages, you're stuck. This is often mitigated by things like foo vs foo2, but I have been in a situation where two projects both rely on different versions of foo2, and cannot be unified. The per-project dependency strategy handles this with ease, the global strategy cannot.
benreesman
2 days ago
clang++ qwen-3-nvfp4.cpp $(pkg-config --cflags --libs libtorch) -o ./qwen-3-infer
Your move.
worik
2 days ago
None of that is a problem
There are a lot of problems, but having to carefully construct the build environment is a minor one-time hassle.
Then repeated foot guns going off, no toes left, company bankrupt and banking system crashed, again
vlovich123
2 days ago
> There are a lot of problems, but having to carefully construct the build environment is a minor one-time hassle.
I've observed the existence in larger projects of "build engineers" whose sole job is to keep the project building on a regular cadence. These jobs predominantly seem to exist in C++ land.
TuxSH
2 days ago
> These jobs predominantly seem to exist in C++ land.
You wish.
These jobs exist for companies with large monorepos in other languages too and/or when you have many projects.
Plenty of stuff to handle in big companies (directory ownership, Jenkins setup, in-company dependency management and release versioning, developer experience in general, etc.)
josefx
2 days ago
Most of what I have seen came from technical debt acquired over decades. With some of the build engineers hired to "manage" it not being treated as programmers, and just adding on top of the mess with "fixes" that are never reviewed or even checked in. Had a fun time once after we reinstalled the build server and found out that the last build engineer had created a local folder to store various dependencies instead of using vcpkg to fetch everything, as we had mandated for several years by then.
pjmlp
2 days ago
I have been "build engineer" across many projects, regardless of the set of programming languages being used, this is not specific to C++.
jandrewrogers
2 days ago
I’ve only ever seen this on extraordinarily complex codebases that mixed several languages. Pure C++ scales really well these days.
wojciii
2 days ago
How is that even possible?
Wasn't CI invented to solve just this problem?
vlovich123
2 days ago
You have a team of 20 engineers on a project you want to maintain velocity on. With that many cooks, you have patches on top of patches of your build system, where everyone does the bare minimum to meet the near-term task, and it devolves into a mess no one wants to touch over enough time.
Your choice: do you have the most senior engineers spend time sporadically maintaining the build system, perhaps declaring fires to try to pay off tech debt, or hire someone full time, perhaps cheaper and with better expertise, dedicated to the task instead?
CI is an orthogonal problem but that too requires maintenance - do you maintain it ad-hoc or make it the official responsibility for someone to keep maintained and flexible for the team’s needs?
I think you think I’m saying the task is keeping the build green whereas I’m saying someone has to keep the system that’s keeping the build green going and functional.
AdieuToLogic
2 days ago
> You have a team of 20 engineers on a project you want to maintain velocity on. With that many cooks, you have patches on top of patches of your build system ...
The scenario you are describing does not make sense for the commonly accepted industry definition of "build system." It would make sense if, instead, the description was "application", "product", or "system."
Many software engineers use and interpret the phrase "build system" to be something akin to make[0] or similar solution used to produce executable artifacts from source code assets.
0 - https://man.freebsd.org/cgi/man.cgi?query=make&apropos=0&sek...
vlovich123
2 days ago
I can only relate to you what I've observed. Engineers were hired to rewrite the Make-based system into Bazel and maintain it, for a single executable distributed to the edge. I've also observed this for embedded applications and other stuff.
I’m not sure why you’re dismissing it as something else without knowing any of the details or presuming I don’t know what I’m talking about.
juliangmp
a day ago
> minor one-time hassle
I don't know if you're joking or just naïve, but CMake and the like are massive time sinks if you want anything beyond "here's a few source files, make me an application".
rendaw
a day ago
I don't plan on ever using C++ again, but FWIW in Rust there are lots of cases where you specify `move` and stuff doesn't get moved, or don't specify it and it does, and it's also a specific feeling.
einpoklum
a day ago
> why std::move doesn't move
Whenever someone asks you about std::move, or about value categories, I have a ready-made answer for that: https://stackoverflow.com/a/27026280/1593077
But it's true that when a user first sees `std::move(x)`, with no argument saying _where_ to move it to, they either get frustrated or understand they have to get philosophical :-)
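A minimal illustration of why the name misleads:
std::string a = "hello";       // needs <string> and <utility>
std::string b = std::move(a);  // the move happens here, inside string's move constructor
std::move(a);                  // by itself this is just a cast to an rvalue reference; nothing moves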
jjaksic
2 days ago
Wow, I don't understand what anything means in those memes. And I'm so glad I don't!
It seems to me that the people/committees who built C++ just spent decades inventing new and creative ways for developers to shoot themselves in the foot. Like, why does the language need to offer a hundred different ways to accomplish each trivial task (and 98 of them are bad)?
fouronnes3
2 days ago
The road to hell is paved with 40 years of backwards compatibility requirements.
account42
a day ago
Ignorance is not something you should be proud of.
TheOtherHobbes
a day ago
C++ is the only language which invests in archaeology over futurism.
You get to choose between 25 flint-bladed axes, some of which are coated in modern plastic, when you really want a chainsaw.
saghm
a day ago
Being wise enough to know what's worth spending time to learn is, though. Knowledge isn't free.