One thing I like about Rust is that it prevents you from doing stupid things at the compiler level. I write a little bit of C/C++ and Rust. If you don't do C++ on a daily basis, you will silently introduce problems in the code that are very hard to spot. You need a very good mental model of how to write good C++, and it requires constant practice.
For Rust, you just have to fight the compiler. This is especially useful when you have people on your team with some experience who also want to contribute, but you don't want to constantly point them in the right direction.
I actually have no idea how big teams work on large C++ codebases. Usually, you need to have a good idea of how the whole thing works. You can change one part of the code, and it will introduce bugs in the whole project because of how the memory is handled. Isolated changes are hard. And historically, a lot of C++ codebases lack good test coverage.
> If you don't do C++ on a daily basis, you will silently introduce problems
Even if you do, you still will. Just less often.
> I actually have no idea how big teams work on large C++ codebases... You can change one part of the code, and it will introduce bugs in the whole project because of how the memory is handled
Part of it is lots of tests, sanitizers, assertions, etc.
Part of it is keeping things modular and avoiding spooky action at a distance to the extent possible.
Part of it is unavoidable, and that's why people are moving to safer languages.
Your post reminds me of the old runtime vs. compile-time typing debates (old for me, anyway). Some would argue duck typing is all that we need, and that lots of tests can cover the missing types, etc. Eventually I realized that I'm just manually implementing compile-time typing by way of robust tests to cover interface requirements.
> Eventually I realized that I'm just manually implementing compile-time typing by way of robust tests to cover interface requirements
This, so much this!
Note that duck-typing can still be a compile-time thing: it's basically what you'd get in C++ if you use auto and templates for everything.
The trade-off between compile-time and run-time checking depends in large part on the time needed to address the issue. It's not really black and white. People just don't want to wait forever for static verification - I think that's kind of why clang-static-analyzer isn't used as much as clang-tidy.
> I actually have no idea how big teams work on large C++ codebases.
They choose a memory management strategy and stick to it. Of course, the problem, relative to something like Rust, is that the compiler doesn't enforce it. You can use linting tools and/or reviews.
> Usually, you need to have a good idea of how the whole thing works. You can change one part of the code, and it will introduce bugs in the whole project
That's not a problem with C++ specifically. That's a problem with organization. It's probably best known as the "Big Ball of Mud" architecture[1]. Rust has no particular defense against it, nor do other languages that I am familiar with. If you don't see it as much with Rust, it's only because it takes time to develop. (Counter-intuitively, it's an impressively successful architecture -- so many long-lived projects use it.)
[1] http://www.laputan.org/mud/
> I actually have no idea how big teams work on large C++ codebases
Well yeah, you don't. Most people who comment on these sorts of threads don't, which obviously colors their bias in favor of the solution they do understand.
> You can change one part of the code, and it will introduce bugs in the whole project because of how the memory is handled
Why would it do that?
I suppose because it will start writing to memory that is owned by other parts of the code, corrupting it.
Honestly after 6 months to a year of constant Rust development you don't even fight with the compiler anymore. Instead it's mostly just your friend.
There are still logical holes in the borrow checker, but they're mostly irrelevant.
This lists all the reasons to use Rust, then handwaves about some nonsense, then declares victory for C++. The maturity argument made a lot more sense when Rust was 3 years old, rather than 10; the libraries argument is plain silly because library management is horrible in C++ and all the listed libraries are by comparison essentially one-click to use in Rust; the entire article also seems AI-generated.
Good for them. I like C++. It is a language that supports both being close to the machine and abstraction. I studied Rust a bit, but it seems that its rules exclude some perfectly good software designs. If two classes need to work together as equals, so that class A holds a reference to B and class B holds a reference to A, that is not really possible, especially if both A and B have multiple instances stored in containers. This is common with the bridge design pattern.
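For the curious, here is a minimal sketch (my illustration, not the parent's code; the type names are invented) of what that A/B pairing tends to require in safe Rust, using Rc/Weak/RefCell instead of plain references:

    use std::cell::RefCell;
    use std::rc::{Rc, Weak};

    struct A {
        // The back-reference must be weak (and mutable via RefCell),
        // otherwise the A <-> B cycle would never be freed.
        peer: RefCell<Weak<B>>,
    }

    struct B {
        peer: Rc<A>,
    }

    fn main() {
        let a = Rc::new(A { peer: RefCell::new(Weak::new()) });
        let b = Rc::new(B { peer: Rc::clone(&a) });
        *a.peer.borrow_mut() = Rc::downgrade(&b);

        // Both objects can reach each other, and Rc lets them live in
        // containers too (e.g. Vec<Rc<A>>), but only through this extra
        // indirection; plain `&` references in both directions won't compile.
        assert!(a.peer.borrow().upgrade().is_some());
    }

It works, but it is noticeably more ceremony than the equivalent pair of pointers in C++.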
I would also choose C++
The language keeps improving. Some years ago it was way too difficult for the speed gain to be worth it.
But it has become easier and easier to write. Many of the safety arguments Rust has are still technically true, but 90% less true than 6 years ago.
The C++ community is also really friendly and open-minded.
It's hard to explain, but C++ also has this nice, relaxing feel when writing it. Like doing a puzzle. Maybe the cognitive load is very evenly spread? Or the header/hpp model forces you to think in data models and interfaces first? I have no idea; it's mainly a feeling.
I guess the Rust workforce is tiny, opinionated and mentally demanding.
I might be biased, but I found that the people who came to interview for the Rust roles at my company were noticeably better (or at least better at interviewing) than the applicants for the Java roles: more knowledgeable on the theory, struggled less on the hard things, more up to date on recent developments.
Isn't that pretty common in all languages that are not big?
They attract people truly interested in programming, not those going through the motions at a job. I have heard the same thing about Haskell, Lisp, and many other languages outside the Java/C#/JavaScript/Ruby mainstream.
Well, now you've named something: Java is the most bureaucratic language; only a matching personality, or a programmer who is not very good, would tolerate it.
Maybe 10 years ago. This too has changed.
This kind of generalization is so old. Imagine someone told you that "I've noticed that people of race A are noticeably better than people of race B". If you think it's different, just consider that people don't really have much of a choice in either case: there's a lot of Java developers because there's a lot of Java jobs, universities teach it, and it's been around a fairly long time when compared to Rust, at least... i.e. there's a lot of forces pushing people to using Java, and once you learn a language and get a job in such language, there's a lot of inertia that will keep most people on that same language for a long time.
Yes, they can eventually choose something else, but only if that's a thing in your region, which may not be the case at all... and in any case, why would you go out of your way to change? Java used to be pretty terrible, but these days, it's a decent language.
I can do Rust, but I still would pick Java for most projects given the choice, unless it was something where performance mattered more than the ease of hiring Java developers and finding Java libs (though I admit Rust is catching up, there's a lot of libs now! However, one missing lib and you may be stuck for months having to port or implement something yourself, while in Java, or C++ for that matter, chances are slim you'll be in that situation).
> This kind of generalization is so old.
I feel you're objecting to the wrong comment. sebstefan appropriately specified "the people who came to interview for the Rust roles of my company [...]", and was giving it as a counterpoint to the previous comment's broad generalization.
> there's a lot of Java developers because there's a lot of Java jobs, universities teach it, and it's been around a fairly long time when compared to Rust, at least... i.e. there's a lot of forces pushing people to using Java, and once you learn a language and get a job in such language, there's a lot of inertia that will keep most people on that same language for a long time.
The fact that one language is the mainstream default taught in many schools, whereas the other requires going out of your way to pick up, could well be a factor in the latter having more knowledgeable average applicants - in the same way I'd expect Gentoo users to be more technologically competent than Windows users on average.
> Yes, they can eventually choose something else, but only if that's a thing in your region, which may not be the case at all...
I presume the majority of Rust programmers learn it online.
I think that complexity cannot be eliminated, but it can be hidden and distributed using the right abstractions.
That being said, C++ being a big language adds complexity of its own (complexity stemming from the language itself, i.e. from the tool).
So you can use a complex tool to make a complex task simple, or a simple tool and keep the task more complex, requiring more steps, etc.
But C++ is a complex tool that, while it takes some complexity out of the task, adds enough complexity of its own that it can outweigh the complexity it reduces.
We need better languages; C++ is not it.
> That being said, C++ being a big language
Rust has become fairly big now, no? Is there some objective metric that can show Rust is a "smaller" language (I bet it is, but I don't think it's by a lot)?
There’s not really any metric of a language being objectively “big” or “small” but I could point at several features where the Rust feature is significantly less complex than the C++ one. Additionally, C++ has more features that Rust doesn’t than the reverse.
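One commonly cited instance (my example, not the parent's): moves. In Rust a move is a plain bitwise transfer and the compiler statically retires the old binding, whereas C++ needs rvalue references, user-written move constructors, and a "valid but unspecified" moved-from state:

    fn main() {
        let s = String::from("owned");
        let t = s; // ownership moves to `t`; no user-defined move logic runs

        // Uncommenting the next line is a compile error (E0382: borrow of
        // moved value), so there is no moved-from state to reason about.
        // println!("{s}");

        println!("{t}");
    }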
> memory unsafeness, can be significantly mitigated when developing with a certain modern subset of the C++ language.
Right.
> Most existing and popular databases are developed in C/C++, providing a wealth of resources and innovations we could leverage.
Right.
But two rights can make one wrong. How are you enforcing 'good part of C++' when you're interoperating with others' code?
Encapsulation.
This is our code. That is their code. Depend only on the interface of their code and not the implementation. You can look at their code out of curiosity, but don't depend on the implementation of their code in our code.
Then you don't care what subset of the language their code is written in.
This fails to address the question you replied to. When "our code" is memory-safe and "their code" isn't, it's still "Our Product™" that ends up with a CVE that costs us money and reputation, because users don't care about whose code it was.
In my experience, nothing bitrots and degrades worse than a C++ codebase.
The language standard has changed so much, as have the tooling, the trendy libraries, and the established conventions... It takes a herculean effort to keep a given source tree up to date.
Dive into a C++ repo started even 10-15 years ago and it can be a revolting experience, let alone one from back in the 90s.
And then, from company to company, conventions and expectations vary dramatically.
When I was at Google we had a large committee of very smart people who applied modernizations across the whole monorepository, introduced amazing tooling and analysis tools, and imposed a very strict style guide that kept people fairly disciplined. But that was a herculean effort that most other organizations can't afford.
Rust has all sorts of problems (including specific ones for DB internals or OS development). But what's amazing when I read these articles is they don't actually seem to mention those specific problems that I've encountered in my last 3 years of professional Rust work. Instead they read like rationalizations by people who have a certain hammer they've gotten really skilled at using, and don't want to give it up.
That's fine if it keeps your organization productive, but I see no reason to publish about it?
If I were to make a list of gripes about Rust for this kind of work it would primarily emphasize the continued lack of acceptance/conclusion of the allocator-api (or competing) proposals, and the rather chaotic and unprofessional (and potentially insecure) nature of the way Cargo project dependencies explode into a hard-to-reason-about mess.
But the list they make? io_uring, mimalloc, and performance oriented networking are... not problems to use in Rust, not complicated at all. I assume the same (or better) for Zig.
> the rather chaotic and unprofessional (and potentially insecure) nature of the way Cargo project dependencies explode into a hard-to-reason-about mess.
this is one of my biggest gripes, too. that alone has been enough to make me avoid Rust for projects where it would otherwise be a good fit. you can pull in "one" dependency and find yourself downloading hundreds of gigabytes of zillions of tiny dependencies, sometimes the same one at multiple versions. it's by no means a problem exclusive to Rust, but that's no excuse.
it's been a while, but my other major gripe was the way so many crates would require the nightly compiler. the rust devs have done a good job maintaining backward compatibility between stable releases, but afaik there isn't any such guarantee for nightly. keeping up with nightly is infeasible when each compiler release and all your dependencies need to be vetted by your security team.
i also long found myself disappointed by the lack of a real specification, but that one is relatively minor. less of a frustration.
It's worth pointing out that you can absolutely use Rust without Cargo and crates.io and the culture that comes with them.
You'll be swimming upstream. But arguably it makes sense for certain kinds of projects. I'd classify OS kernel and DB internals development as being those kinds of projects, TBH. Keep your dependency set extremely minimal, vendor it, and avoid crates.io entirely.
I don't actually run into nightly requirements... ever? These days.
Progress on language specification is good https://github.com/rust-lang/fls
I don't know if the stuff about the JVM is even true. I grant that Redpanda is written in C++, but it isn't clear that its performance advantages over Kafka are due to that rather than to the fact that Kafka was implemented in a performance-oblivious way by people who did not know anything about software efficiency. This doesn't reflect on the JVM. You can write a high performing system in Java and the modern JDK is a state-of-the-art toolchain that provides features that many C++ projects struggle with.
The by far biggest issue for Java is that they still haven't gotten their act together on value types (Project Valhalla). Going back in time, the one thing someone should have told the designers was not to ship erased generics in Java 1.5, and instead to go for something like what C# did with value-type structs.
Not really for "purity" reasons, but because the main-memory latency problems that started to emerge in the early 00s only got worse over time, and erased generics effectively cemented Java's memory access patterns.
The Java team has done some truly amazing things in terms of GC research, but much of it is needed simply because the Java and JVM memory model (while "simple") is very allocation-heavy compared to C#, which went for value types very early.
Take a peek at the QuestDB source code (Java) for heavy data-manipulation tasks: to avoid GC costs it's not really written in an idiomatic Java style (strongly reminiscent of the way some people coded for JavaME back in the early 00s). A C# port would not be entirely idiomatic either, but far more so than the existing code.
Fully agree. Value types are underrated. That's one of the best things about C# and Go: they increase performance (through contiguous memory) and reduce GC pressure. I also believe that dynamically typed languages like Python would gain a lot by introducing a form of value types/unboxed values. For example, Cinder, Meta's internal version of CPython, supports what they call "static classes": https://github.com/facebookincubator/cinder?tab=readme-ov-fi....
Unfortunate that this blog can't be read on mobile without considerable pain. Seems fitting, though, I guess.
The reasons to use C++ over Rust in this article are not good. The reason I would have picked is that C++ does a better job of matching mental models (partly because of its flexibility), and it's easier to say what you mean. Value semantics by default also make it easier to write functional-style code.
Am I reading too much into this, or did Mr. "EloqData Core Team" choose C++ because that's what they're comfortable with, but don't want to say it?
It sounds like the rationale is that existing database technology is already written in it, and they want to re-use some of it. That's reasonable, but I do think that it only makes sense with the assumption that the flaws in C++ aren't large enough to be worth using something else, at which point there isn't really much need to justify using C++ at all. If someone is concerned about the flaws in C++, the benefit of relying on existing C++ libraries isn't going to seem as compelling to them for the exact same reasons they don't want to use C++ for their own code.
At the end of the day, the choice seems to be a bit circular; if you don't have concerns about C++, you'll find plenty of reasons to use it, and the arguments against it aren't going to be compelling. If you have concerns about it, the reasons to use it won't be compelling, and you'll likely agree with the arguments against it. I have to imagine that whether someone agrees with this choice will be entirely consistent with their existing opinions of C++; it doesn't seem like there are any new arguments left to make on this topic, so debates on the topic will inevitably rehash existing arguments (regardless of which side they come from) and only appeal to the people who already have formed their opinions based on finding those arguments compelling to begin with.
I don't think it's quite as circular as you're making it sound. If someone has a prior constraint of needing to move quickly (which is common in startups), it can make sense to choose any arbitrary technology, if it allows them to do so. I don't think someone developing a new game in C++ necessarily has no concerns about C++; that's just the language that all the console SDKs use. I don't think someone doing data science in Python necessarily likes Python; that's just the language that most models and libraries use (and that person probably has a deadline to publish a paper!)
Another factor to consider is that, if one is indeed trying to reuse code from existing databases (regardless of the reason for doing so), code from projects like SQLite and FoundationDB is simply far less likely to contain serious bugs than any newer Rust-based option. There are way more mistakes one can make when writing a database than just memory safety mistakes, and the mistakes tend to be extremely subtle. Code having been run in production for long periods of time under significant amounts of load is basically a fundamental prerequisite for it to make any sense to trust the data of your users to it.
These articles can always be boiled down to "because we wanted to". Rarely is there some real reason.
As Richard Restak postulates in his book “The Naked Brain”[0]: the limbic system provides a gut feeling (usually from comfort) and we rationalise our way backwards from that without being able to really pinpoint “why”; usually the “why” is secondary and only added as justification for the feeling post-decision.
[0]: https://www.amazon.com/Naked-Brain-Emerging-Neurosociety-Cha...
Humans are really just 3 LLMs in a trenchcoat after all
They do list specific reasons in the article...
One, C/C++ interop is a priority since they will interoperate with a large variety of C/C++ APIs (sounds like one of the main points of their project is to integrate things that are largely implemented in C/C++).
Two, they say their aim is "building a lasting system that will support decades of continued improvements." You want confidence that 99.9% of the code you write today remains just as good 20, 30, 50 years from now. I don't think Rust is quite there yet (or maybe it is but hasn't yet proven it).
The list of reasons is very weak and no point is actually prohibitive against the usage of Rust.
It's really not clear on what basis you brush away their reasons. Just calling them "very weak" with nothing to support it doesn't mean much.
The lasting system bit is very weak. Those old languages are still getting regular releases. Also Rust barely changes from one version to the next, entirely unlike C++!
Rust interop to the libraries they list is not an issue.
If you are good at C++ and want to continue building software in C++, that is fine. Just be honest about it.
The "I drive an H-pattern manual transmission" of programming language discussions.
Is this like a dig? Does anyone disagree that manual is much more fun?
I think you can over examine metaphors like this but yeah, driving manual is more fun when you’re going on a fun drive. When you’re commuting in start stop traffic… less fun.
(and in this scenario you’re also usually sharing the car with other people so driving automatic would make everyone’s lives easier)
Fun isn't the point, and it's not (necessarily) a dig. I drive a manual transmission car as my daily driver. I enjoy it. I would never try to say it's better than a modern automatic, though - because that would be wholly incorrect at this point by any objective measure.
I'm noting that C++ vs Rust is basically this: every article that someone writes which goes over "we're still choosing C++" has the same vibe as people who choose manual transmissions in 2025. There's no real reason to do so at this point, other than if you want to.
Nah there are plenty of embedded contexts where you're only going to have an SDK for C/C++.
I worked on one recently.
These will be slowly eaten by Rust, but sure, we can agree that there might be a small carve-out for that sector of software. It's not a hill I'm willing to die on.
This seems like a bit of selection bias; people who learn it are more likely to be the type who anticipate enjoying it, and people who don't find learning it appealing probably wouldn't enjoy it if they did. I haven't driven in over a decade partially due to disliking it, and I have to assume that I'd dislike driving manual even more.
Think of it as: I'm a good driver and I like fun, so I chose a manual Ferrari as one of our two family cars; then I lend the car to one of my kids (a junior dev) who doesn't have experience with high-powered manual cars, and they crash it.
I suspect the H pattern they mean is where the gear shift is on the steering column, not on the floor. I long ago owned a 1945 Dodge truck with that shifting setup.
Nope, but I appreciate the effort. I threw the H-pattern reference in there without thinking about it too much mostly to differentiate the reference from situations where you're shifting but it's not with a lever.
(I've been stuck on planes for 20 hours with little sleep, so ignore it if it doesn't make too much sense lol)
I get your point and I like driving a manual car, but there’s a reason the whole world has switched to automatic
Perhaps not always for the better?
People eating, drinking and using mobile phones... I think the luxury of having a hand free with an automatic transmission is a contributing factor.
Stick shift requires a certain level of attention/engagement that might actually make for safer driving IMO.
I mean it's an analogy, so it's not directly the original comparison. I hate C++ but I concede it's possible to get the most performance from C++ if you are diligent. My argument there would be: Ok, use C++ and C (or even machine code) for the most performance sensitive parts, but then use a safer easier language everywhere else.
There's essentially no learning curve moving from manual to automatic. But moving from C/C++ to Rust, there's a big learning curve.
So not the perfect analogy--my point is the "manual" version isn't always better than the "automatic" version.
For one thing, manual transmissions require physical activity and coordination. That's not true with programming languages...
Not fun in San Francisco rush hour traffic on Russian Hill.
Lol yes.
I almost had a panic attack driving an _automatic_ up Lombard. (Sadly an old minivan with bad-ish tires.)
In stop-and-go traffic (is there any other traffic in SFO?) it rolled backwards so much and spun the wheels so hard every time I tried to move forward/upward... I swore never to return. I haven't been back on that road since.
20 years ago, sure. Now if you want fun torque you drive electric and blink with confusion at manual-snobs.
I learned on stick and I still feel a nostalgic appeal, sure. I test drove a used hothatch Volvo C30 T5 Polestar edition last fall before ultimately settling on an electric performance car (Polestar 2) to feed my midlife indulgences. And I have to admit a certain ... thrill... from the turbo lag and the process of shifting.
But it all seems a bit silly when compared to instant torque at any RPM.
Manual snobs are a tiring bunch, it's not an inherently better transmission or way to drive. That said, instant torque isn't really that special and feels like a novelty after the first few times you've floored it.
When an EV actually has a suspension setup and overall weight that doesn't feel like I'm piloting a boat at sea, then I'll probably care. Porsche & co seem like they're still aiming for it so I've got some hope we get there.
Speaking of snobs, I have had multiple arguments with automatic snobs who think I am a bit thick for still driving a manual.
They refuse to take 'I do it because I like it' as an answer.
I mean that's weird - "I do it because I like it" is, IME, the answer that makes everyone drop the discussion.
Personal preference is the most valid reason, lol
> Over the past several years, the EloqData team has worked tirelessly to develop this software, ensuring it meets the highest standards of performance and scalability. One key detail we’d like to share is that the majority of EloqKV’s codebase was written in C++.
This is a very interesting approach to marketing a new database
If Rust were like Communism, there would be dozens of examples of massive codebases that ported to Rust and ended up with massive infighting, culminating in 20% of the developers being assassinated by the moderation team.
and then someone would excuse that by saying "that wasn't real Rust"
> In particular, the most harsh arguments against using C++, i.e. memory unsafeness, can be significantly mitigated when developing with a certain modern subset of the C++ language.
So the quest for the one true “modern subset” of C++ continues.
How do developers continue believing in this after a decade of the standards committee proving over and over again that they’re not interested in this and won’t contribute toward it?
The article isn't claiming there is a "one true modern subset" of C++ that they use. It's merely pointing out that you can significantly mitigate the main criticisms of C++ by making certain sacrifices, which is pretty much true.
There are good reasons the standards committee doesn't make those sacrifices on your behalf, because ultimately there are tradeoffs there that the programmer is supposed to understand and have control over. However, there is an argument to be had about what the default "safety setting" should be and whether C++ makes a good choice. IMO that's actually the main difference between safety in Rust and C++, since you can make Rust just as unsafe as C++ if you want, only you need to explicitly mark your code as unsafe.
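A tiny illustration of that explicit opt-in (my sketch): the raw-pointer arithmetic below is the kind of thing C++ allows anywhere, while Rust only accepts it inside a block you have marked yourself:

    fn main() {
        let values = [10u32, 20, 30];
        let p = values.as_ptr();

        // Raw-pointer offset and dereference are rejected in safe code;
        // the `unsafe` block is the programmer taking responsibility.
        let second = unsafe { *p.add(1) };
        println!("second element: {second}");
    }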
Also, I believe the C++ standards committee does care about this, which is why Profiles [1] are being considered.
[1] https://github.com/BjarneStroustrup/profiles
> The article isn't claiming there is a "one true modern subset" of C++ that they use.
They’re arguing that they’ve found a sufficiently safe subset, which if true would be the first.
It would be a waste of time to relitigate the many programmer-hostile, unsafe decisions the committee has made over the years. I think your programmers who are supposed to “understand and have control over” C++’s arsenal of footguns are more or less fictional and a language designer shouldn’t take them as intended audience.
https://robert.ocallahan.org/2017/07/confession-of-cc-progra... (2017)
> Also, I believe the C++ standards committee does care about this, which is why Profiles [1] are being considered.
It was discussed at length why this proposal is insufficient: https://news.ycombinator.com/item?id=45234460
Because they are grasping for reasons not to learn Rust.
Their reasoning is sound. And besides, when your existing team is already proficient in C++, arguing that "they should learn X instead" seems very risky and unwise.
Then just say so and it is fine. "We are a team of C++ experts and we are able to build a high-quality program in C++" is a completely valid, unassailable reason, except to the most ardent Rust fanatic.
Rewriting an existing working codebase rather than committing to its incremental improvement is usually a sign of bad decision making. So I'd be concerned if a team with an existing mature C++ codebase, and engineers hired along that path, were taking on such a venture.
But that's kinda... not a bloggable topic, frankly.
Rust was designed to facilitate incremental rewrites of an existing C++ library, making cross-language testing and validation very ergonomic; why C++ developers rarely take advantage of this (or even account for it as an option in their analysis of alternatives) is an exercise left to the reader.
> Rust was designed to facilitate incremental rewrites of an existing C++ library
Do you have a source for this claim? Rust is a fine language (though its advocates can be a bit belligerent sometimes). But, as a matter of fact, Rust was not designed for easy interoperability with C++ or to make gradual rewrites easy.
One design constraint of Rust was to be able to be incrementally included in a large C++ codebase: Firefox.
It turns out that this kind of interop with C++ directly is extremely difficult, and so it isn’t super smooth right now. Interop with C was prioritized and you get zero overhead there. And various projects ease Rust <-> C++ via the C ABI.
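To make the zero-overhead C path concrete, here is a minimal sketch (mine, not from the thread) calling plain libc strlen from Rust; C++ bindings via projects like bindgen or cxx build on the same kind of extern "C" boundary:

    use std::ffi::CString;
    use std::os::raw::c_char;

    // Declare the foreign function by its C ABI signature; it resolves
    // against the libc that Rust binaries already link on most platforms.
    extern "C" {
        fn strlen(s: *const c_char) -> usize;
    }

    fn main() {
        let msg = CString::new("zero-overhead FFI").expect("no interior NUL");
        // The call compiles down to a plain C call; `unsafe` marks that the
        // compiler cannot check what the foreign code does with the pointer.
        let n = unsafe { strlen(msg.as_ptr()) };
        println!("strlen reported {n} bytes");
    }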
Mitigate having bad legs by walking on your hands
As far as I can tell, modern C++20/23 is as safe as (if not safer than) Rust. So much of Rust's advocacy compares it to C++98, whereas modern C++ doesn't use exceptions, has smart pointers (RAII), improved casting and array management, and an extensive suite of checking tools and flags. The conversations I've seen at my company for using Rust tend to be "well, it would be fun to do something different", which just isn't very compelling to me. I worry Rust is going to end up like Haskell in 5 or so years.
> As far as I can tell, modern C++20/23 is as safe as (if not safer than) Rust.
It is not. Rust will, for example, prevent the following memory-safety issue from compiling:
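A representative example of such a pattern (a stand-in illustration, not a quote from the parent) is reference invalidation on a growing Vec:

    fn main() {
        let mut names = vec![String::from("a"), String::from("b")];
        let first = &names[0];          // borrow into the vector's buffer
        names.push(String::from("c"));  // may reallocate and dangle `first`...
        // ...so rustc refuses: error[E0502]: cannot borrow `names` as mutable
        // because it is also borrowed as immutable.
        println!("{first}");
    }

The direct C++ translation (hold a reference or pointer into a std::vector, then push_back) compiles without complaint and is undefined behaviour once the vector reallocates.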
(This sort of pattern is responsible for nearly 100% of the C++ memory safety issues I know I've committed in the past several years.)
C++ is getting safer, but it has a long way to go to match Rust's safety guarantees. Google is doing a lot of work on spatial safety with hardened libc++, bounds checks for C-style arrays, and safe buffers; but temporal safety is much harder without more information in the source code.
Running sanitizers and such is quite expensive too. It burns a lot of cycles to run MSan, ASan, TSan, Valgrind, etc.
Whereas catching these bugs at compile time saves everyone a lot of time and money.
TBH I don't find the reasons in the article particularly compelling. Rust has a lot of industry backing now and is pretty clearly the way forward for systems programming. Writing Rust wrappers over the various libraries they use is largely a one-and-done issue, and they can publish them to Cargo and share the load of keeping them updated. If ISO or various governments get their act together with a real software liability regime or cybersecurity requirements, companies with big legacy C++ codebases will be in a tough spot. The second-best time to start writing safe code in your project is now.
The sanitizers and static analysis tools are not as good as the borrow checker for preventing data races.
You can easily introduce memory-related issues in "modern C++" and the compiler won't say a word, even with pedantic checks.
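As a concrete instance of the data-race point above (my sketch, not the commenter's): the snippet below is the kind of unsynchronized sharing a C++ compiler accepts silently, and which rustc rejects outright:

    use std::thread;

    fn main() {
        let mut counter = 0;

        // Two threads mutating the same local with no synchronization.
        // rustc refuses to compile this (the closures may outlive `counter`,
        // and it cannot be borrowed mutably by both at once); the C++
        // equivalent builds cleanly and is a data race at runtime.
        let a = thread::spawn(|| counter += 1);
        let b = thread::spawn(|| counter += 1);

        a.join().unwrap();
        b.join().unwrap();
        println!("{counter}");
    }

The usual fix is an Arc<Mutex<_>> or an atomic, which also makes the sharing explicit in the types.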