> Writing C++20 code without modules, ranges, or concepts, is like, what is the point.
To me C++20 has more to do with designated initializers than modules, for very obvious reasons. It's fine if you take a pass on an upgrade and prefer to take the hit of migrating through a bigger delta later, but framing this as "what is the point" is indeed missing the whole point.
If portability isn't a concern, VC++ with MSBuild, or clang with CMake (without header units), are pretty much quite usable.
However I do agree with the feeling for large scale adoption.
The way modules and concepts went down, the way the GC API got added only to be removed, and the ongoing contracts discussion have pretty much settled my opinion that WG21 really needs to adopt the same approach as other programming language communities.
Papers without working implementations for community feedback shouldn't be accepted at all.
> However I do agree with the feeling for large scale adoption.
I think that everyone has misguided and naive expectations on how such a radical change would roll out to production software.
Changing dependency management and updating build systems is a hard sell for professional projects delivering production software. It's the most radical change in how your software is built, with zero upside in terms of features. Best case scenario, your software works as it always did. Worst case scenario, you wasted tons of development effort retooling and revamping your whole CI/CD pipeline just to break your project. Hard sell. I mean, why do people think so many projects are still stuck with C++11?
> Drastically reduced compilation time is a huge upside for modules.
You can achieve that already by adopting basic architecture principles and incorporating tools like compiler caches such as ccache. Those who have an interest in it already do it. Modules are no silver bullet and change nothing.
That is generally the case. Modules had two different proposals, each with their own implementation, and after debate they chose the one Microsoft implemented, which is why MSVC had modules from the start.
The missing part of that story is that they ended up compromising on a third approach, without build tools support, with the hope everything would be quickly settled after shipping the standard.
Apple and Google kept using clang header modules, switched focus to their C++ replacements, and the clang transition to C++20 modules languished until a few heroes stepped in to do the work.
Meanwhile, GCC support is still a work in progress.
And build tools are still a mess; even CMake doesn't yet have a story for header units.
As for Microsoft, except for Office, there isn't a single Microsoft product, especially among the C++ SDKs, that makes use of modules in any form.
This is quite different from how other programming language ecosystems migrate features from preview into stable.
I propose a theory/rule that every new C++ version takes superlinearly longer than the previous to implement.
Currently the Standard is at C++23 and we are ~4 years behind (C++20 is not portable, as you say). At this rate, by the time we get to C++40 or 50, compilers could be behind to a comical degree, like 15 years.
Personally I am interested to see how many unimplemented features it takes before the Committee takes action. (I would find it superbly amusing if they simply did nothing and we got to a point where no new C++ features ever became available.)
This is why I have the very unpopular opinion that C++26 is going to be the very last standard that anyone cares about.
It will be good enough for the industry use cases where C++ matters, while other languages keep slowly eroding C++'s market share.
Example of such a scenario: LLVM, GCC, the JVM, V8, and the CLR are all currently settled on C++17, maybe eventually C++20. They don't need any additional features for their C++ use cases, other than having GCC and clang keep up with ISO.
How many people care about COBOL or Fortran 2023 standards?
GCC is still expected to bootstrap from a C++11 compiler. (For self-hosting compilers, language version choice is not only about useful language features.) The built compiler defaults to C++17. I think the remaining obstacle before moving to C++20 by default is more experience with (and fewer bugs in) the support for modules.
I've been writing in a certain C++17-like subset of C++20. I like designated initializers, and there are probably some other syntax and library conveniences from C++20 that I'm taking advantage of without knowing (std::string_view::starts_with?), but the rest is just C++17.
One notable exception is that I did a project with C++20's coroutines recently.
I'm with you, but in my experience you have to go further back still, to C++11, before it's actually compilable on most distros. And even there the atomics stuff is not really fully supported everywhere.
One major gripe I have with these C++ updates is that the proportion of codebases that actually use recent C++ features (cpp17, cpp20, cpp23) is very close to zero. The more esoteric this language becomes, the fewer people can actually master it.
I’ve been writing C++ for over 20 years. The language is a freak show, combining solid industrial tooling and a large user base with development efforts led by a clown car full of pretentious academics and people who just want to bolt on new stuff for no good reason except to ”keep the language fresh”.
C++ is not supposed to be fresh. It’s supposed to be portable, and to allow fine-tuning of programs to bare metal while allowing a sort of high-level implementation of APIs (though often fragile and badly designed).
Some new features are excellent, others are not, and the history is plagued with weird historical feature gaps obvious to anyone familiar at all with more consistent languages.
So if something feels weird, there is always a good chance it’s not you, it’s the language (committee).
C++ certainly suffers from somewhat of a kitchen-sink nature. However, if you consider two of its design goals being:
* Support for multiple, different, programming paradigms.
* Very strong backwards compatibility, all the way back to C.
... then some "freakness" is to be expected. And I do believe some of the additions (to the library and the languages) have been excessive. However, I disagree with your characterization of language development work.
1. Most people on the committee, AFAICT, are from industry rather than academia. And if you consider national bodies, I think the ratio is even higher.
2. "Keeping the language fresh" is not a goal and not what the committee does. Most of what's added to the language are things that people have been complaining about the lack of for _decades_.
3. Feature proponents are the ones who want to "bolt on new stuff". Committee members are tasked with preventing new stuff from being just bolted on.
4. Some new additions are necessary, and others are not necessary but useful, for "tuning programs to bare metal".
Finally - I agree that committee-work has the drawback of less consistency; and there are definitely warts. But for an established language with huge existing codebases and many stake-holders, and with the design goals I mentioned above in mind - an international committee and consensus-building is better than appointing some benevolent dictator.
> Most of what's added to the language are things that people have been complaining about the lack of for _decades_.
And they are useless now, because everybody who had that problem either already solved it (the solution could have been "use another language") or realised that it is not worth the hassle.
I guess the best examples are `std::format` and `std::thread`.
> But for an established language with huge existing codebases and many stake-holders, and with the design goals I mentioned above in mind - an international committee and consensus-building is better than appointing some benevolent dictator.
That depends, but yes, everything is better than letting Stroustrup "decide".
I don’t think the committee / proposals process is necessarily bad. It is a good way to develop a formal specification for a portable and highly complex language with many pitfalls and serious, industrial-level legacy compatibility requirements.
It might be better if it had a true BDFL, instead of a spiritual guide, and I do worry about the committee getting too far ahead of the industry and leaving it behind, plus what will happen when Stroustrup finally retires in earnest.
But yeah, now and then it does produce a turd, and there’s only so much turd-polishing you can really do.
I guess I’m just saying it’s a development model with pros and cons. The pros are necessary. The associated cons are inevitable.
To be specific, I was not criticizing or promoting any particular governance or design model. Just that this particular authority has had its more dysfunctional moments in its output - one should not presume all features of C++ are splendid examples of software design.
You can find std::string_view (C++17) in Google's WebGPU implementation [1], static_assert (C++17) in Protobufs [2], <bit> (C++20) in React Native [3], and std::format (C++20) in Cuda Core Compute [4]. So the big names in tech aren't afraid to add -std=c++20 to their build scripts. On the other hand, C++23 features aren't as common yet, but it's still very fresh and MSVC support is WIP.
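For the curious, here's a tiny sketch (not from any of those codebases, just a toy) of the kind of C++20 usage being name-dropped there, <bit> plus std::format, assuming your standard library already ships <format>:

    #include <bit>
    #include <cstdint>
    #include <format>
    #include <iostream>

    int main() {
        std::uint32_t mask = 0b1011'0000;
        // popcount and bit_ceil come from <bit>; std::format is the C++20 formatting library.
        std::cout << std::format("{} bits set, next power of two is {}\n",
                                 std::popcount(mask), std::bit_ceil(mask));
    }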
I'd venture a guess that string_view, static_assert and bit were already a part of respective codebases, just in-house versions. These are very commonly used. So seeing them getting adopted is completely unsurprising.
However, the adoption rates of features that are actually new are way lower. From what I see, lots of projects still use the language as C with Classes, basically, and that isn't going to change any time soon. The GP nailed it - C++ is adding a lot of esoteric stuff that very few people actually need or want.
Imagine how widespread use of Java 8, .NET Framework, Python 2, C89 is still around the industry and now apply it to C++ versions.
There is a reason why C++17 is the best we can currently hope for in what concerns portable code, given the actual support across industry compilers, and company project guidelines.
Many embedded shops might still be discussing between adopting C++11 or C++14.
I agree, but there's a big difference between saying some industries or companies are still targeting old standards and saying there's "near zero" adoption of new standards. The latter just isn't accurate from what I see.
I've been writing C++ for well over 30 years. I'm currently employed full-time maintaining the C++ toolchain, runtime, and standard libraries for a major commercial embedded OS. I see a lot of C++17 being used by my customers every day. It's there, running everything around you.
C++20 is still too fresh for my industry, especially for embedded where runtimes require certification for functional safety. Maybe in two years.
What can I tell The Committee? Stop. No, we don't need a single central ex cathedra library for networking. Or graphics. Or SIMD. Even the existing filesystem library is so broken it's dangerous (the standard specifies that if it's used on an actual filesystem it's undefined behaviour -- which means using <filesystem> means your program could provoke the legendary nasal daemons just by being run). Stick to generic basics and leave the specialized stuff that not everyone needs to third-party libraries. Nothing wrong with a marketplace of libraries to serve an entire economy of requirements.
Standard SIMD everyone can build on top of sounds like a great idea -- no unnecessary fragmentation due to using different, subtly (or not) incompatible libraries. SIMD instructions have been in desktop CPUs since the '90s. It is long overdue.
> the proportion of codebases that actually use recent C++ features (cpp17, cpp20, cpp23) is very close to zero
~Nobody uses all the recent features, but some new C++20 stuff does get adopted very quickly, like 3-way comparisons, constinit, abbreviated function templates, etc.
For C++23, support is severely lacking in MSVC at least, so that's going to significantly impact users.
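For anyone who hasn't run into them, a small illustrative sketch of two of the quickly-adopted C++20 items mentioned above, constinit and abbreviated function templates (toy names, not from any real codebase):

    #include <atomic>

    // constinit: guaranteed constant initialization, so no static-init-order surprises.
    constinit std::atomic<unsigned> call_count{0};

    // Abbreviated function template: an `auto` parameter declares a template
    // without spelling out template<typename T>.
    auto add_twice(auto x) {
        call_count.fetch_add(1, std::memory_order_relaxed);
        return x + x;
    }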
I'm puzzled by this statement. In all three places I've worked in the last 7 years, we actively pushed for the newest language standards. We're very eager for the C++23 switch to arrive so we can finally derive from std::variant. And we're using a good subset of C++20 currently.
In some ways, you’re not wrong. In other ways, there’s been extremely broad support for some major new features in the language in recent years, like coroutines and concepts.
>One major gripe I have with these C++ updates is that the proportion of codebases that actually use recent C++ features (cpp17, cpp20, cpp23) is very close to zero
It depends what industry you're working on. A lot of HFT shops keep up to date with the latest compiler and make extensive use of new features that improve the ergonomics and compile-time performance of template metaprogramming, which is important for achieving the lowest possible latency.
Same with cars, buildings made out of newly discovered building materials, and electronics. I would argue this is a good thing, for roughly the same reasons - rewriting software to use the latest and greatest language feature is usually not efficient.
- Can't update the compiler (eg, porting the code base to the new compiler is too complicated)
- No compiler support for the new standard on a specific platform that one still wants to support.
- Too much work to update the whole code base to work with the new standard.
- A 3rd party library is not supporting new standard yet.
- The team is reluctant to have to learn new technologies.
Some are somewhat valid reasons, some less so, and some are an indication of deeper problems.
(P.S: My C++ code base is using C++20. Didn't move to C++23 yet because I think some customers might not be ready for it yet for one of these reasons, but I'm going to push for it at some point.)
Compiler support for the platform is the general limit. C++ is very good about not breaking old code so old codebases are easy enough to port and anyone who refuses to learn can keep using the old ways.
> One major gripe I have with these C++ updates is that the proportion of codebases that actually use recent C++ features (cpp17, cpp20, cpp23) is very close to zero.
Anecdotally, all the stuff I do has been minimum C++20 for a few years. If you're using e.g. Qt 6, released in 2020, you're using at least C++17 features without knowing it; same for recent versions of Boost, which start depending on C++17 or C++20 depending on the features / libraries.
> One major gripe I have with these C++ updates...
"One major gripe I have with cars is the number of people that know how to drive one is very close to zero."
Where I work (big tech) everything is C++17. I don't know what the schedule is, but in a couple of years every Bazel and CMake config will get bumped to C++20. And so on.
Because ISO languages are usually designed on paper, with some of the features prototyped on whatever compiler the respective paper author feels like, compilers only rush to fully support a new standard once it is officially published.
With Python, there is no standard per se, it is whatever CPython does, and everyone else has to try to mimic CPython.
I don’t really get this argument. Large C++ codebases are generally divided to libraries. The internal libraries and vendor libraries should both be of high quality. I’m not familiar with industrial use cases where every C++ user would not be a library writer.
> The internal libraries and vendor libraries should both be of high quality.
From my limited experience - high-quality internal libraries are simply not the reality; less likely to be achieved than winning the lottery. Companies typically:
* are not able to identify candidates capable of writing high-quality C++
* do not try to attract SW engineers by committing to high-quality code.
* don't believe they should invest developer time in making a library more robust and bringing it to the level of polish of popular publicly-available FOSS libraries.
* do not have a culture of acquiring, honing and sharing coding skills and expertise, with the help of actual experts. Again, time and effort is mostly not invested in this.
Either you’ve worked with rookie developers (which is fine, but not ’expected industry baseline’) or in an engineering core lacking years of C++ development. Doing stuff ’the right way’ does not generally need extra resourcing - you simply do it the right way.
Quality gaps like those described above - I think this happens when you try to develop C++ without actual experience in C++. C++ is so weird that anyone trying to ”do the right thing in the language they are most familiar with” generally gets it wrong for the first few years. And then you end up with a quagmire nobody wants to volunteer to clean up.
This is not a skill issue as such, or a lack of talent. C++ simply is so weird, and there is so much bad ”professional advice”, that you are expected to lose a few limbs before being able to navigate a design landscape full of mines.
> And then you end up with a quagmire nobody wants to volunteer to clean up.
Not only that, but the rookie developers coming in get inculcated into that. That's what they're used to, and they have all the motivation to continue writing poor code, because they need to avoid their better code clashing with what's already written - clashing compilation-wise and style-wise.
Of course, it's not 100% all bad, there are gradual improvements in some aspects by some developers.
No, and they probably shouldn’t. Most internal libraries don’t have and don’t need to implement novel complex template-based specialization - not in their API at least. And the stuff that’s internal to a library only needs to implement what the API contract requires - which usually does not require the rigmarole of a fully generic ’modern’ template-based implementation.
But there are two variables being defined by the destructuring. I believe OP's question was whether there's a rule for which gets chosen for the condition, rather than about contextual conversion to bool in general (which happens even when there's no initialisation in the if statement at all).
Your comment seems to imply the condition is evaluated before initialising the variable(s) at all; is that what you meant? If so, this beast would work (even though it's undefined behaviour to construct a std::string from nullptr, and std::string is not convertible to bool):
const char* foo(); // may return nullptr
if (std::string s = foo())
Yes, exactly. My hunch is to remember that in `auto [to, ec] = std::to_chars(p, last, 42)` the two names `to` and `ec` are not "real" variables/objects, but names bound to parts of the object returned by `std::to_chars`. So fundamentally, `std::to_chars` returns a `std::to_chars_result`; that _is_ the return value and what is then contextually converted to bool for evaluation of the condition. It's then a separate C++17 mechanism that associates the two names `to` and `ec` with the two parts of that returned result object.
But I could be wrong, the paper for the feature is linked but I didn't read it (!).
What is returned (a std::to_chars_result) is what the if condition considers. The left-hand side of the initialisation is then split in two, just as if it were if (auto res = std::to_chars(p, last, 42)). The split into [to, ec] makes it convenient inside the if body.
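Spelled out by hand in plain C++17 (just a sketch of that reading, not the wording of the paper), it looks roughly like this:

    #include <charconv>
    #include <cstdio>
    #include <system_error>

    int main() {
        char buf[16];
        // The whole to_chars_result is what the condition is about; `to` and `ec`
        // are just names bound to its members afterwards.
        if (auto res = std::to_chars(buf, buf + sizeof buf, 42); res.ec == std::errc{}) {
            auto [to, ec] = res;  // the bindings the one-liner would introduce
            std::printf("wrote %d characters, ok: %d\n", int(to - buf), ec == std::errc{});
        }
    }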
> The result of the expression is the condition. Thus, in your example, the bool check would apply to "s", after the expression is evaluated.
This is a contradiction. There is no expression in my code that evaluates to s. foo() is an expression, and then std::string s = ... is a declaration with an initialiser, which is not an expression.
Edit: I suppose that if I used another form of initialisation, the answer becomes a bit more obvious:
if (std::string s(3, 'x'))
(Not that this makes sense, but the point is to use a constructor with more than one argument.) In this case it's clear the test has to be the just-initialised variable. In fact, there could be no arguments at all!
User-generated static_assert messages would make it easier to build games that can be played entirely using compiler error messages. Something like this old IOCCC entry but nicer:
This doesn't mention the most exciting thing coming: static reflection. Finally no need to manually implement printing or serialisation functions for every struct.
C++ is a funny chimeric creation that has absorbed some great modern ideas from Rust and other newer languages but needs to preserve compatibility with its antediluvian C heritage. You can write it in a very clean and somewhat safe modern style or in a familiar C-like style. We use modern C++ at work and, by embracing RAII, it really isn't so bad.
You could, but why, when we already know at build time that the function is deleted or deprecated and, better yet, know exactly where?
Runtime errors on a rare path are often never tested until a customer hits that rare case. This is one of the reasons I won't use Python for a large project: eventually you will have critical errors in production, because not only didn't you test the code, you didn't even get the minimum proof that it works that a compiler provides.
I for one started working on a new project in C++ rather than Rust. I think it's unclear whether Rust is going to be the successor at this point. It's probably never going to pick up in the games industry, Qt is C++ (and Rust bindings will always be second class, or they could end up unmaintained), it has better compile times, and it is said to be undisputed when it comes to high performance. Obviously the tool-for-the-job factor is most critical.
Career-wise, many people are picking up Rust and almost no one is picking up C++, while experienced C++ devs either retire or switch to a higher-level language due to the landscape change. I would trust supply and demand to be in favour of C++ 10 years from now.
There are also attempts to make C++ more memory safe, like Carbon [1] or the Circle compiler [2]. If they succeed, why would anyone want to switch to Rust? Also, Rust is not ideal for security from a different perspective - I think the lack of a package manager is one of C++'s strongest points. After working for 9 years with npm, you really appreciate that the dependency you install is usually just a single dependency (even an option for a single header file is common); when there's a package manager, people will abuse it (like installing a package with 3 dependencies of its own for something that could be 50 LOC copy-pasted), and managing the security of the supply chain is nearly impossible (it will be a legal requirement soon in the EU though).
Anyway, I wanted to ask: how is the contracting market looking in the C++ world? I'm guessing it depends heavily on the domain? I'm mainly asking about Qt and anything that would be desktop / mobile apps / systems programming except video games, but I'm curious in general.
Originally, it stands for "The Fucking Article", as in "RTFA" (= "Read the fucking article"). The expansion "The Fine Article" is a humorous reference to the former.
Note that over the years "TFA" has lost its profane meaning and is generally used in a neutral way.
Wow. Against my better judgment I will keep to the rules of this site and assume that was a good faith question.
Rust is already a good systems language and is getting adoption. D has been a great C++-alike for 20 (?) years already.
There is a mature C++ toolchain for any processor and OS you can imagine.
Simply adopting a different C++ compiler or a newer version of one you are already using can take many months for a large company. Migration to even Carbon would probably take 10x as much effort.
Carbonlang is meant to have a seamless migration path, that is, flip the compiler on the same codebase for starters, with no changes. It's not a superset language like TS/JS, but you can have both at the file level and compile them side by side.
First there is no Carbonlang, it is called Carbon.
Second, it is mostly a Google thing for their C++ use, it is still mostly a frontend implementation at this point, with semantics yet to be fully defined.
They are also open that Carbon is basically an experiment.
The frontend, in a compiler, is what converts the textual representation of the language into some intermediate format, usually a graph or intermediate language, which is then further processed for type checking and other semantic analysis, undergoing other transformations in the process, until it is fed into the backend, which takes it from there through the further phases required to generate machine code.
I don't understand why you provided that link to godbolt, though. What was it supposed to demonstrate?
Also, you said that one needs a backend for a frontend (to be useful, I guess). Do you mean to say that the "Carbon" frontend does not have any backend to work with?
No worries, I see some interest there, hence pointing out some literature on the subject, and the LLVM tutorial, as means for getting a better understanding than through plain HN comments.
If you replace your whole comment with ".", I'm just going to automatically down vote it, so the effect is worse than whatever was originally there.
These kinds of "masking" edits prevent good communication. If you want us all to disregard a comment you now totally disown, then just write an edit that prepends/appends that.
If HN decides to downvote a technical comment into oblivion that contains a minor technical oversight, I will not be part of that. And I can do whatever the fuck I want with my comments. That's what you get by (probably illegally) disabling a delete button
Writing that in an edit is completely valid -- we're grown up enough to weigh whether to bother reading the comment then. Replacing the whole thing with just "." is not acceptable, IMO.
If they're fine with accepting the down votes either way, I still want to register my complaint, pointless as it may be in practice.
It's perfectly okay that you downvote, as that supports the goal of demoting the comment. (The effects of up/downvoting on comment ranking trumps the effect on the author's karma, IMO.) However I don't agree that pseudo-deleting is unacceptable, if for example it contained an incorrect argument and the author thinks there is no value anymore in someone reading it.
So they presumably think they were fully wrong. Wrong about what? It's often useful to know about mistaken assumptions, etc. Now all the "corrections" in the replies are harder to comprehend. Everybody's work is devalued and made harder as a result of the pseudo-delete.
I mean, take a step back and look at what you’re talking about here: a minor subthread about an incorrect argument. When I see a “.” post, I usually think “okay, nothing to see here, I can skip this”. Which in all likelihood is the best for my and everyone else's time. And I grant the OP the freedom to make that call.
In terms of UX, it probably would be better if HN allowed a commenter to “dead” their comment, including the attached subthread. People who cared could still view it with showdead, but everyone else would be saved time.
It's just mildly annoying and rare enough I want to register the breaking of an implicit norm. I don't need to take steps forward or back, or anywhere else, I just disagree with you still, for subjective reasons that should be fairly clear.
std::ignore doesn't work in the context of structured bindings. And even if it did, "we already have this" has never stopped the C++ committee from adding something before :)
Yes, the "skill" of the commitee to discuss some feature for a really looooooong time and then come up with a solution which is going to be (half) fixed in the next standard always astonishes mé :).
I guess with C++38 we'll get an `always_definitely_ignore_this_without_any_diagnostic_whatsoever`.
Sure about that? It does happen to work on libstdc++/libc++/MS STL, but it's not specified to work anywhere but std::tie. The existing practice is to cast to void.
It will work for assignment/construction, but isn't part of standard C++, is what I mean. At least not yet. Structured bindings won't work with it, but in future C++26, _ should work there and in other places.
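For comparison, a quick sketch of the alternatives being discussed (the commented-out line assumes the proposed C++26 `_` placeholder, P2169; everything else is plain C++17, with a hypothetical pair() just for illustration):

    #include <tuple>

    std::tuple<int, int> pair();             // hypothetical function returning two values

    int demo() {
        int a = 0;
        std::tie(a, std::ignore) = pair();   // std::ignore is only specified for std::tie
        auto [value, unused] = pair();       // structured bindings must name every element today
        (void)unused;                        // ...hence the existing practice: cast to void
        // auto [v, _] = pair();             // C++26: `_` as a real name-independent placeholder
        return a + value;
    }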
Leading with "Specifying a reason for deleting a function" then following up with "Placeholder variables with no name" did make me check the date of the article. It wasn't April 1.
The standards committee is thorough in its mission to include everything and the kitchen sink in C++.
"Placeholder variable with no name" is the super-common feature from other language where you write, for example (not exact real syntax):
(foo, _, bar) = function_returning_a_triplet();
since you only want the first and third items; the underscore is the placeholder.
Useful feature, convenient feature, doesn't complicate your life as a programmer, no need to even remember it, it'll just come to you. Good thing to have in the language IMNSHO.
C++ has certainly had a lot added, but I don't get your point regarding these particular two features. They seem quite minor, useful, easily implemented and unlikely to interact problematically with other things.
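For reference, a minimal sketch of the two features being reacted to, assuming the C++26 proposals (P2573's = delete("reason") and P2169's `_` placeholder) land as currently specified; the names are made up for illustration:

    #include <string_view>
    #include <tuple>

    // Deleting a function with a reason (P2573): the reason shows up in the
    // compiler error if anyone tries to call it.
    void load(const char*) = delete("pass a std::string_view instead");

    std::tuple<int, double, std::string_view> function_returning_a_triplet();

    void use() {
        // The `_` placeholder (P2169): we only care about the first and third items.
        auto [foo, _, bar] = function_returning_a_triplet();
        (void)foo; (void)bar;
    }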
I often feel that when C++ posts come up, the majority of commenters are people who haven't deeply worked with C++, and there are several things people always miss when talking about it:
- If you're working in a large C++ code base, you are stuck working with C++. There is no migrating to something like Rust. The overhead of training your engineers in a new language, getting familiar with a new toolchain, working with a new library ecosystem, and somehow finding a way to transition your code so that it works with the existing C++ code, isn't buggy, and adapts to the new paradigms is all extremely expensive. It will grind your product's development to a buggy halt. It's a bad idea.
- Every time a new set of features (e.g. reflection, concepts, modules, etc.) is released, people bemoan how complicated C++ continues getting. But the committee isn't adding features for the sake of adding features; they're adding features because people are asking for them, and those people are spending years of their lives writing papers for the committee trying to improve the language so everyone can write better code. What you find horrifying new syntax, I find a great way of fixing a problem I've been dealing with for years.
- Yes, it's a gross homunculus of a language. If I could switch our team to Rust without issues, I would in a heartbeat. But this is the beast we married. It has many warts, but it's still an incredible tool, with an amazingly hard working community, and I'm proud of that.
It's basically like clockwork - you can assume that any post about C++ language evolution is going to have a number of people saying one or all of the following:
1) "CPP keeps getting complex in useless ways. Just use C (maybe C with classes style)". This viewpoint is correct in the sense that modern CPP is essentially a different language than C++98. But I disagree with the rest incredibly strongly - modern C++ is more expressive, safer, and often more performant than the old-school style. Things like unique_ptr, string_view/spans, RAII, etc are very useful and reduce boilerplate code as well as manage complexity.
2) "CPP is garbage, use Rust instead". I have not personally written in Rust, but I do find it to be a very interesting language. I would consider very strongly writing a new project in Rust. But most C++ projects are not new, and although us nerds always love rewriting perfectly working code, it's a good way to shoot your business in the foot.
3) "The template system is obscene". I mean, this is true. :) I do occasionally sprinkle metaprogramming into my code, because it solves some problems incredibly well. But it is essentially a different language grafted on at compile time. if constexpr, concepts, etc help enormously with this problem. And yes, those have all been introduced very recently...
Another point is ecosystem. Imagine you trained your whole team on Rust and got the entire codebase scrapped. Now you have to actually go and rewrite all your upstream dependencies. Each library, infra integration, algorithm, data structure, etc.
Then you've gotta hire new people when those leave, and the Rust hiring ecosystem is also just not there yet.
I've worked mostly in robotics. Literally everything is C++. All the grad students know it. ROS is there setting the mental models, for better or worse. The list goes on.
You have really strong points on hiring, but the notion that you need to rewrite all your dependencies is pretty weak. Nearly every modern language includes facilities for writing bindings, and I do believe that there are Rust to C++ binding tools.
This is true in theory, but in practice it is a nightmare.
YMMV, but even small bits of Python, Go, or (yes) Rust that have crept into the robotics stacks at the various places I've worked have created problems for incoming new hires, or for maintenance even for senior folks. Python less so than the others, but Python is challenging to deploy on a vehicle, b/c of various hard and soft problems.
In particular, Rust interop with CPP is poor. Should I recompile all ros packages to allow me to call a few things in Rust? Not at this time.
I'm hopeful for the "Rust is robotics" movement, however https://robotics.rs/
The state of the field is so entrenched in CPP though there's more hope in embedded land with the widespread use of C.
Case in point, until Rust is fully bootstrapped, even in an ideal world C++ would still be around.
Then there are all those industry standards whose definitions are only available in C, and eventually C++.
Likewise when I need to plug into JVM, CLR, V8, ART runtimes, I am reaching for C++, no need to introduce another layer into the sandwich, in terms of build tools, IDE tooling and stuff to debug.
> What you find horrifying new syntax, I find a great way of fixing a problem I've been dealing with for years.
To prove you are not 'boosting'... could you give a convincing example?
Not familiar with "boosting", but I'm definitely a fan of concepts and reflection.
Reflection is absolutely gonna feel completely alien to people for a while, but there are a lot of areas in our codebase where I wish I could simply describe a data layout and have the efficient code generated for me instead of writing tons of boilerplate. Take JSON serialization, for example. Currently you have to write your (de)serialization by hand, but with the new reflection stuff one could do it based on a struct's members, and with fewer errors. It'll be wonderful for writing new libraries that will make our lives easier.
Check out boost.pfr, it gets you there for a lot of cases. Here's a library I built with it: https://github.com/celtera/avendish
It's a proper quantum leap compared to pre-reflection
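To make that concrete, here's a small sketch of the "describe the struct, get the serialization for free" idea using Boost.PFR (assuming a reasonably recent Boost; names_as_array needs C++20):

    #include <boost/pfr.hpp>
    #include <iostream>
    #include <string>

    struct Point { int x; int y; std::string label; };

    // Print any simple aggregate as a JSON-ish object, with no per-struct boilerplate.
    template <class T>
    void dump(std::ostream& os, const T& value) {
        constexpr auto names = boost::pfr::names_as_array<T>();
        os << '{';
        boost::pfr::for_each_field(value, [&](const auto& field, std::size_t i) {
            os << (i ? ", " : "") << '"' << names[i] << "\": " << field;
        });
        os << '}';
    }

    int main() {
        dump(std::cout, Point{1, 2, "origin"});   // {"x": 1, "y": 2, "label": origin}
    }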
Thanks for your feedback! It would be great to have reflection in C++, I did not know it was on the radar! Is there a compiler that supports it currently?
C++ is a monster.
The 2026 proposal has some neat ideas (I like the ability for the developer to give a reason for modifying behavior that may otherwise create cryptic error messages, for instance); but the more things one packs in there, the uglier and more bloated the spec gets, and the more complicated and buggy compilers will be.
C with Classes was once an experimental pre-processor to try out bringing some Simula ideas into the C world. Today, C++ has become a language that changes dramatically every half a decade, where the main question when you receive someone else's code is "will it compile", and where even experienced developers cannot tell from compiler error messages what's wrong (g++). The undoubtedly clever people who have been working on it have nevertheless committed war crimes against orthogonality.
Tip: introduce a versioning mechanism like Rust has it, so that you are freed from the burden of having to be backwards-compatible.
> C++ is a monster.
Perhaps you meant monstrous?
The dev community (and software profession) is crying out for legible, parsable notation and greater safety. All the modern languages are drawing us towards some of these crucial goals. Python above all for its legible/usable notation, Rust for its compile-time and run-time characteristics; Go somewhere in the middle.
As a recovering C++ coder who discovered that there are better languages, I think that within the Tower of Babel that is the coding-language community, C++ has leased an entire floor for its ravings to the congregation.
I've been programming in C++ for 25+ years, and I would say the complexity exploded after C++11 with the introduction of rvalue references. The template syntax could have been simplified, but you get used to it.
People should use more typedefs to make C/C++ look sane. There is some pushback that it hinders readability, but I feel it is the opposite.
Rvalue references et al. are a valuable addition to the language if you're trying to avoid gratuitous copying. The performant alternative is a messy variety of functions that have the same effect as move semantics, except with ad hoc (or no!) compile-time checking. Using them correctly does require a lot more background knowledge and conscious decision-making from the user than the rest of the language.
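To make the copy-avoidance point concrete, a minimal sketch of the usual sink-parameter pattern that move semantics enables (toy class, just for illustration):

    #include <string>
    #include <utility>
    #include <vector>

    class Log {
        std::vector<std::string> lines_;
    public:
        void add(std::string line) {             // callers can copy or move into the parameter
            lines_.push_back(std::move(line));   // then it is moved into storage, no extra copy
        }
    };

Pre-C++11 you'd need a const& overload plus a copy, or an ad hoc swap trick, to get the same effect.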
Templates are not that bad as a user, but as a template author, they're a completely different programming language that makes it much harder to express even simple ideas. (Concepts may have changed that, but I haven't had a chance to use them.)
”Templates are not that bad as a user”
My take would be they are bad for both users and authors.
In 2024, the expected compiler output for a syntax error in a statically typed language is an as-specific-as-possible report of where in the source the syntax fails - not 40 lines of illegible template error messages.
There are some cases where templates are the best design option. But they should be used only as a last resort, when it's obvious that's the best way.
In 2024, I expect people implementing templates to make use of concepts and, if that's not possible, at the very least static_assert alongside enable_if.
Naturally, reality often doesn't match expectations.
The solution for move semantics before was specializing the swap function for container types. This was a much more pragmatic approach.
I absolutely agree. Move semantics exploded the language.
Gotcha covered: someone wrote a 264-page book about C++ move semantics.
https://www.amazon.com/Move-Semantics-Complete-Guide-First/d...
The problem with typedefs is that you have to find the definition. A compiler-aware IDE can help with this, but I keep cycling through intellisense, clangd, etc. and they only work if you configure things just right -- not good for reading through some unfamiliar code.
Consistent naming conventions help a lot, but then that just introduces assumptions that could be wrong.
As with most points of code style, it comes down to taste.
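For reference, the kind of alias being debated - a small sketch using `using`, the modern spelling of typedef (names invented for illustration):

    #include <functional>
    #include <map>
    #include <string>
    #include <utility>
    #include <vector>

    using Handler    = std::function<void(const std::string&)>;
    using HandlerMap = std::map<std::string, std::vector<Handler>>;

    // Compare with spelling the full map-of-vectors-of-functions type in every signature.
    void subscribe(HandlerMap& handlers, const std::string& topic, Handler h) {
        handlers[topic].push_back(std::move(h));
    }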
> C++ is a monster.
I haven’t fuzzed a C++ compiler myself, but our team recently tried fuzzing a relatively simple S-expression-based compiler and discovered several issues in a few weeks [1]. I can only imagine what could be uncovered in C++ compilers. If this hypothesis holds, it suggests a significant attack vector that might elude even the smartest security researchers who are only analyzing repository codes and dependencies.
[1] "Why the Fuzz About Fuzzing Compilers?": https://www.coinfabrik.com/blog/why-the-fuzz-about-fuzzing-c...
Compiler engineers like fuzz testing. You'll find a bunch of infra for it in llvm. That should mean the easy targets have already been hit, though I wouldn't be too confident of that stance.
Plus there are hordes of academics using Clang/GCC as targets for bug-finding papers. The Csmith [1] paper alone has over a thousand citations at this point. I'd assume most of the low-hanging fruit has been picked.
[1] https://www.cs.tufts.edu/~nr/cs257/archive/john-regehr/findi...
In my humble experience, both in academia and the cybersecurity industry, there are relatively few individuals and teams with the drive necessary to discover the most challenging bugs, especially compared to the sheer scale of the challenges. Fuzzing is just one example of this. Additionally, with billions of lines of code, it takes significant time for research to translate into real-world engineering practices.
One example of a higher order reasoning about this is [1] (includes metrics).
[1] "As TVL rises, so does the probability of being hacked" https://www.bittrap.com/resources/defis-growing-pains:-as-tv...
cppfront simplifies C++ a lot by introducing unifying syntax (that compiles to ordinary C++ -- same semantics in the end) https://github.com/hsutter/cppfront
That is like pretending that using Typescript or Objective-C doesn't require understanding everything else they build upon.
Herb Sutter always tries to sell cppfront in a different way, due to his position at WG21.
It would be rather odd if the chairman of WG21 would also be proposing a C++ replacement.
cpp2 is "A Typescript for C++" is how the author himself categorizes it https://herbsutter.com/2023/08/13/my-c-now-2023-talk-is-onli...
Yes, knowing underlying levels of abstraction may be useful e.g., godbolt.org is excellent, knowing CPU caches, pipelines, branch prediction, out-of-order execution may be essential for getting orders of magnitude improvement in performance.
It doesn't mean the abstraction itself is useless.
That categorisation is exactly the point: Objective-C and C++ didn't stop at being Typescript for C, they became their own thing, after using C compatibility and C code generation as their bootstrapping mechanism.
Also, plenty of languages compile to native code via C or C++ generation, Eiffel, Nim and some Scheme compilers being the best-known ones.
However, the WG21 chairman can't, in a politically correct way, assert that, like Objective-C and C++ did to C, cppfront, if successful, will trail its own path, effectively being yet another C++ replacement language.
cppfront is an experimental sandbox. Its goal is not to replace C++ nor offer an alternative. Its goal is to explore features and semantics in order to improve C++ itself.
Exciting as it may be, it will only be fully available for portable codebases maybe around 2030, given the current velocity of compiler adoption of ongoing standards, even among the big three.
As of today, C++17 is the latest one can aspire to use for portable code, and better not to make use of the parallel STL features.
I would argue C++20 is totally fine. MSVC does not yet have a C++23 flag, and it will be internally replaced with 'latest', aka some C++26 features, when you use it. This took us by surprise, because they deprecated some enum conversions and thus our clang-cl CI failed for OpenCV with the latest LLVM. I still fail to understand why enabling a specific C++ version automatically means that it is considered stable. At least give us C++23-experimental or something /rant.
Writing C++20 code without modules, ranges, or concepts, is like, what is the point.
Naturally when code portability doesn't matter, it is another thing.
All my C++ side projects are written against C++latest on Visual C++, and make full use of modules and concepts.
> Writing C++20 code without modules, ranges, or concepts, is like, what is the point.
* std::span is in! https://stackoverflow.com/q/45723819/1593077
* Designated initializers (like in C99)
* Spaceship operator and default comparison ops
* More language constructs can be constexpr'ed
* Better structured binding
* using on enums
* Don't need to say "typename" as much :-)
* Bunch of minor improvements to the standard library
Note I did not say coroutines. I still don't understand how that boondoggle made it into the language the way that it has.
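To make a few of the items above concrete, here is a hedged sketch (Point, Color, and sum are made-up names; assumes a reasonably current C++20 compiler):

    // A few of the C++20 conveniences listed above in one place.
    #include <compare>
    #include <cstdio>
    #include <span>
    #include <vector>

    struct Point {
        int x;
        int y;
        auto operator<=>(const Point&) const = default;   // spaceship + defaulted comparisons
    };

    enum class Color { Red, Green, Blue };

    int sum(std::span<const int> values) {                 // std::span: non-owning view
        int total = 0;
        for (int v : values) total += v;
        return total;
    }

    int main() {
        Point p{.x = 1, .y = 2};                           // designated initializers
        using enum Color;                                  // 'using' on enums
        std::vector<int> v{1, 2, 3};
        bool closer = p < Point{.x = 2, .y = 0};           // uses the defaulted operator<
        std::printf("%d %d %d\n", sum(v), closer, Green == Red);
        return 0;
    }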
std::span is a trap, gsl::span should be used instead, unless one is willing to enable checked collections on the respective compiler.
As for the rest, the point stands regarding portable code across various C++ compilers.
I agree on the coroutines. While I know C# coroutines relatively well, along with the C++/CX stuff that was used as inspiration for Microsoft's initial proposal, they are a bit of a mess, when key WG21 members also don't fully grasp how they have to be implemented and we need two-hour sessions at C++ conferences to go through "hello world" kind of implementations.
> std::span is a trap, gsl::span should be used instead
It's not a trap if you weren't expecting bounds checking.
> we need two-hour sessions at C++ conferences to go through "hello world" kind of implementations.
After one hour it became clear to me I am going to stay away from that stuff until either it becomes much more usable (unlikely TBH) or someone forces me to use it...
std::span is no more a trap than the rest of C++'s standard library in that regard. Are you also eschewing std::vector and std::string? std::string_view, std::array?
I suppose C++26 brings std::span::at, although exceptions are a different can of worms.
Some errors are too late to fix; span was originally bounds-checked when proposed.
Pre-C++98 compiler frameworks used bounds checking by default.
And yes, if I am calling the shots, bounds checking is enabled in release builds.
It has never been a problem, except for the performance cargo-cult folks.
Thankfully governments are making this less of a discussion.
> And yes, if I am calling the shots, bounds checking is enabled in release builds.
No disagreement there. But I'd prefer to turn it on across the board via compiler flag rather than pull in a special library for it and remember to use that library consistently. And if that's the case, I don't see std::span as any more problematic on this front than the rest of the standard library.
(Yes, I know, GSL isn't really a special library on Windows. But anywhere else, it is.)
> various C++ compilers
Which compilers? I'd bet there are compilers that are still stuck at c++98.
Concepts are well supported and have been for a while and they are so great. Those alone make C++20 worth it. But Coroutines also make it worth it, if you build software that can use them well. Designated initializers change how I write code (for the better - by a lot). And of course std::span.
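For anyone who hasn't tried them yet, a minimal sketch of what a concept buys over SFINAE boilerplate (Number and twice are invented names):

    #include <concepts>

    // A constraint instead of enable_if: readable at the call site and in error messages.
    template<typename T>
    concept Number = std::integral<T> || std::floating_point<T>;

    template<Number T>
    constexpr T twice(T x) { return x + x; }

    static_assert(twice(21) == 42);   // OK: int satisfies Number
    // twice("oops");                 // error: constraint 'Number' is not satisfied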
Portable code across various C++ compilers....
> Writing C++20 code without modules, ranges, or concepts, is like, what is the point.
The point is, obviously, use other features introduced in C++20 and not have to deal with artificial restrictions when you opt to onboard onto whatever feature you'd like.
To me C++20 has more to do with designated initializers than modules, for very obvious reasons. It's fine if you take a pass on an upgrade and prefer to take the hit of migrating through a bigger delta, but framing this as "what is the point" is indeed missing the whole point.
Assuming said C++20 features are actually portable across various compilers.
Do you understand what C++20 means in terms of compiler support? I don't know how anyone can post messages in this thread without understanding that.
Modules are DOA and won’t happen even by C++32
If portability isn't a concern, VC++ with MSBuild, or clang with CMake (without header units), are pretty much quite usable.
However I do agree with the feeling for large scale adoption.
The way modules and concepts went down, the way the GC API got added only to be removed, and the ongoing contracts discussion, have pretty much settled my opinion that WG21 really needs to adopt the same approach as other programming language communities.
Papers without working implementations for community feedback shouldn't be accepted at all.
> However I do agree with the feeling for large scale adoption.
I think that everyone has misguided and naive expectations on how such a radical change would roll out to production software.
Changing dependency management and updating build systems is a hard sell for professional projects delivering production software. It's the most radical change in how your software is built, with zero upside in terms of features. Best case scenario, your software works as it always did. Worst case scenario, you wasted tons of development effort to retool and revamp your whole CI/CD pipeline just to break your project. Hard sell. I mean, why do people think so many projects are still stuck with C++11?
> zero upside
Drastically reduced compilation time is a huge upside for modules.
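For reference, a minimal named-module interface unit looks roughly like this (greet is a hypothetical module; file extensions and build flags vary by compiler); importers consume the compiled module instead of re-parsing a header in every translation unit:

    // greet.cppm (or greet.ixx on MSVC): a C++20 module interface unit sketch.
    module;                  // global module fragment: classic #includes go here
    #include <string>
    export module greet;     // this translation unit defines the module 'greet'

    export std::string greet(const std::string& name) {
        return "Hello, " + name + "!";
    }

    // A consumer then writes:
    //     import greet;
    //     auto s = greet("world");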
> Drastically reduced compilation time is a huge upside for modules.
You can achieve those already with the adoption of basic architecture principles and the incorporation of tools like compiler caches such as ccache. Those who have an interest in those already do it. Modules are no silver bullet and change nothing.
Nope.
Following such an insightful response, I feel I should get back to my projects and ask them to build slower just to comply with your insight.
People with bigger projects have to use something like this: https://userver.tech/d8/d4b/classutils_1_1FastPimpl.html
That is generally the case. Modules had two different proposals, each with its own implementation, and after debate they chose the one Microsoft implemented, which is why MSVC had modules from the start.
The missing part of that story is that they ended up compromising on a third approach, without build tools support, with the hope everything would be quickly settled after shipping the standard.
Apple and Google kept using clang header modules, switched focus to their C++ replacements, and clang transition to C++20 modules languished until a few heroes stepped in to do the work.
Meanwhile GCC is still work in progress.
And build tools are still a mess; even CMake doesn't yet have a story for header units.
As for Microsoft, except for Office, there isn't a single Microsoft product, especially among the C++ SDKs, that makes use of modules in any form.
This is quite different from how other programming language ecosystems migrate features from preview into stable.
consteval is worth it alone.
I propose a theory/rule that every new C++ version takes superlinearly longer than the previous to implement.
Currently the Standard is at C++23 and we are ~4 years behind (C++20 is not portable, as you say). At this rate, by the time we get to C++40 or 50, compilers could be behind to a comical degree, like 15 years.
Personally I am interested to see how many unimplemented features it takes before the Committee takes action. (I would find it superbly amusing if they simply did nothing and we got to a point where no new C++ features ever became available.)
You may have missed that C++98 took six years to fully implement, due to export, and only a single compiler actually ever implemented it.
This is why I have the very unpopular opinion that C++26 is going to be the very last standard that anyone cares about.
It will be good enough for the industry use cases where C++ matters, while other languages keep slowly eroding C++'s market share.
An example of such a scenario: LLVM, GCC, the JVM, V8, and the CLR are all currently settled on C++17, maybe eventually C++20. They don't need any additional features for their C++ use cases, other than having GCC and clang keep up with ISO.
How many people care about COBOL or Fortran 2023 standards?
GCC is still expected to bootstrap from a C++11 compiler. (For self-hosting compilers, language version choice is not only about useful language features.) The built compiler defaults to C++17. I think the remaining obstacle before moving to C++20 by default is more experience with (and fewer bugs in) the support for modules.
I've been writing in a certain C++17-like subset of C++20. I like designated initializers, and there are probably some other syntax and library conveniences from C++20 that I'm taking advantage of without knowing (std::string_view::starts_with?), but the rest is just C++17.
One notable exception is that I did a project with C++20's coroutines recently.
I'm with you, but in my experience you have to go further back still to C++11 before it's actually compileable on most distros. And even there the atomics stuff is not really fully supported everywhere.
One major gripe I have with these C++ updates is that the proportion of codebases that actually use recent C++ features (cpp17, cpp20, cpp23) is very close to zero. The more and more esoteric this language becomes, the fewer people who can actually master it.
Source: I've been writing C++ for 8 years.
I’ve been writing C++ over 20 years. The language is a freak show, combining the solid industrial tooling and userbase, with some development efforts led by a clown-car full of pretentious academics and people who just want to bolt on new stuff for no good reason except to ”keep the language fresh”.
C++ is not supposed to be fresh. It’s supposed to be portable, and to allow fine-tuning of programs down to bare metal while allowing a sort of high-level implementation of APIs (but often fragile and badly designed).
Some new features are excellent, others are not, and the history is plagued with weird historical feature gaps obvious to anyone familiar at all with more consistent languages.
So if something feels weird, there is always a good chance it’s not you, it’s the language (committee).
C++ certainly suffers from somewhat of a kitchen-sink nature. However, if you consider two of its design goals being:
* Support for multiple, different, programming paradigms.
* Very strong backwards compatibility, all the way back to C.
... then some "freakness" is to be expected. And I do believe some of the additions (to the library and the language) have been excessive. However, I disagree with your characterization of language development work.
1. Most people on the committee, AFAICT, are from industry rather than academia. And if you consider national bodies, I think the ratio is even higher.
2. "Keeping the language fresh" is not a goal and not what the committee does. Most of what's added to the language are things that people have been complaining about the lack of for _decades_.
3. Feature proponents are those who want to "bolt on new stuff". Committee members are tasked with preventing new stuff being just bolted on.
4. Some new additions are necessary, and others are not necessary but useful, for "tuning programs to bare metal".
Finally - I agree that committee work has the drawback of less consistency, and there are definitely warts. But for an established language with huge existing codebases and many stakeholders, and with the design goals I mentioned above in mind - an international committee and consensus-building is better than appointing some benevolent dictator.
> Most of what's added to the language are things that people have been complaining about the lack of for _decades_.
And are useless now, because everybody who had that problem either already solved it (the solution could have been "use another language") or did realise that it is not worth the hassle. I guess the best examples are `std::format` or `std::thread`.
> But for an established language with huge existing codebases and many stake-holders, and with the design goals I mentioned above in mind - an international committee and consensus-building is better than appointing some benevolent dictator.
That depends, but yes, everything is better than letting Stroustrup "decide".
I was not criticizing the governance model.
I don’t think the committee / proposals process is necessarily bad. It is a good way to develop a formal specification for a portable and highly complex language with many pitfalls and serious, industrial-level legacy compatibility requirements.
It might be better if it had a true BDFL, instead of a spiritual guide, and I do worry about the committee getting too far ahead of the industry and leaving it behind, plus what will happen when Stroustrup finally retires in earnest.
But yeah, now and then it does produce a turd, and there’s only so much turd-polishing you can really do.
I guess I’m just saying it’s a development model with pros and cons. The pros are necessary. The associated cons are inevitable.
To be specific, I was not criticizing or promoting any particular governance or design model. Just that this particular authority has had its more dysfunctional moments in its output - one should not presume all features of C++ are splendid examples of software design.
there are definitely some total stinkers.
Any language under ISO doesn't have any spiritual guide, that role is gone the moment ISO takes over.
Everyone has one vote, and everything turns around politics to win mini-elections per feature evolution stage.
You can find std::string_view (C++17) in Google's WebGPU implementation [1], static_assert (C++17) in Protobufs [2], <bit> (C++20) in React Native [3], and std::format (C++20) in Cuda Core Compute [4]. So the big names in tech aren't afraid to add -std=c++20 to their build scripts. On the other hand, C++23 features aren't as common yet, but it's still very fresh and MSVC support is WIP.
[1] https://github.com/google/dawn/blob/40cf7fd7bc06f871fc5e4823...
[2] https://github.com/protocolbuffers/protobuf/blob/c964e143d97...
[3] https://github.com/facebook/react-native/blob/77b3a8bdd6164b...
[4] https://github.com/NVIDIA/cccl/blob/07fef970a33ae120c8ff2a9e...
I'd venture a guess that string_view, static_assert and bit were already a part of respective codebases, just in-house versions. These are very commonly used. So seeing them getting adopted is completely unsurprising.
However, the adoption rates of features that are genuinely new are way lower. From what I see, lots of projects still use the language as C with Classes, basically, and that ain't going to change any time soon. The GP nailed it - C++ is adding a lot of esoteric stuff that very few people actually need or want.
Imagine how widespread use of Java 8, .NET Framework, Python 2, C89 is still around the industry and now apply it to C++ versions.
There is a reason why C++17 is the best we can currently hope for in what concerns portable code, given the actual support across industry compilers, and company project guidelines.
Many embedded shops might still be discussing between adopting C++11 or C++14.
I agree, but there's a big difference between saying some industries or companies are still targeting old standards and saying there's "near zero" adoption of new standards. The latter just isn't accurate from what I see.
I've been writing C++ for well over 30 years. I'm currently employed full-time maintaining the C++ toolchain, runtime, and standard libraries for a major commercial embedded OS. I see a lot of C++17 being used by my customers every day. It's there, running everything around you.
C++20 is still too fresh for my industry, especially for embedded where runtimes require certification for functional safety. Maybe in two years.
What can I tell The Committee? Stop. No, we don't need a single central ex cathedra library for networking. Or graphics. Or SIMD. Even the existing filesystem library is so broken it's dangerous (the standard specifies if it's used on an actual filesystem it's undefined behaviour -- which means using <filesystem> means your program could provoke the legendary nasal daemons just by being run). Stick to generic basics and leave the specialized stuff that not everyone needs to third-party libraries. Nothing wrong with a marketplace of libraries to serve an entire economy of requirements.
Standard SIMD everyone can build on top of sounds like a great idea--no unnecessary fragmentation due to using different subtly (or not) incompatible libraries. SIMD instructions are in desktop CPU since 90s. It is long overdue.
You are saying that the standard specifies that the standard file system features themselves do not work?
If another program (or thread) is using the same filesystem, calling std::filesystem functions can be UB.
> Behavior is undefined if calls to functions provided by subclause [filesystems] introduce a file system race.
http://eel.is/c++draft/fs.race.behavior#1.sentence-2
> the proportion of codebases that actually use recent C++ features (cpp17, cpp20, cpp23) is very close to zero
~Nobody uses all the recent features, but some new C++20 stuff does get adopted very quickly, like 3-way comparisons, constinit, abbreviated function template, etc.
For C++23, support for it is severely lacking in MSVC at least, so that's going to severely impact users.
Other compilers are hardly any better.
There can't be full C++23 support when they are still busy adding C++17 and C++20 features.
I'm puzzled by this statement. In all three places I worked at in the last 7 years, we actively pushed for the newest language standards. We're very eager for the C++23 switch to arrive so we can finally derive from std::variant. And we're using a good subset of C++20 currently.
In some ways, you’re not wrong. In other ways, there’s been extremely broad support for some major new features in the language in recent years, like coroutines and concepts.
> One major gripe I have with these C++ updates is that the proportion of codebases that actually use recent C++ features (cpp17, cpp20, cpp23) is very close to zero
It depends on what industry you're working in. A lot of HFT shops keep up to date with the latest compiler and make extensive use of new features that improve the ergonomics and compile-time performance of template metaprogramming, which is important for achieving the lowest possible latency.
Same with cars, buildings made out of newly discovered building materials, and electronics. I would argue this is a good thing, for roughly the same reasons - rewriting software to use the latest and greatest language feature is usually not efficient.
Why do you think this is?
Some reasons I can think of:
- Can't update the compiler (eg, porting the code base to the new compiler is too complicated)
- No compiler support for the new standard on a specific platform that one still wants to support.
- Too much work to update the whole code base to work with the new standard.
- A 3rd party library is not supporting new standard yet.
- The team is reluctant to have to learn new technologies.
Some are somewhat valid reasons, some are less so, and some are an indication of deeper problems.
(P.S: My C++ code base is using C++20. Didn't move to C++23 yet because I think some customers might not be ready for it yet for one of these reasons, but I'm going to push for it at some point.)
Compiler support for the platform is the general limit. C++ is very good about not breaking old code so old codebases are easy enough to port and anyone who refuses to learn can keep using the old ways.
> One major gripe I have with these C++ updates is that the proportion of codebases that actually use recent C++ features (cpp17, cpp20, cpp23) is very close to zero.
That's verifiably not true.
https://www.jetbrains.com/lp/devecosystem-2023/cpp/
Anecdotally, all the stuff I do has been minimum C++20 for a few years. If you're using e.g. Qt 6, released in 2020, you're using at least C++17 features without knowing it; same for recent versions of Boost, which start depending on C++17 or C++20 depending on the features / libraries.
> One major gripe I have with these C++ updates...
"One major gripe I have with cars is the number of people that know how to drive one is very close to zero."
Where I work (big tech) everything is C++17. I don't know what the schedule is, but in a couple of years every Bazel and CMake config will get bumped to C++20. And so on.
Why is this? I use new Python features pretty much immediately.
Because ISO languages are usually designed on paper, with some of the features being implemented on whatever compiler the respective paper author feels like, so compilers only rush to fully support a new standard when it gets officially published.
With Python, there is no standard per se, it is whatever CPython does, and everyone else has to try to mimic CPython.
This is intentional.
Most of the new features are for library writers.
I don’t really get this argument. Large C++ codebases are generally divided to libraries. The internal libraries and vendor libraries should both be of high quality. I’m not familiar with industrial use cases where every C++ user would not be a library writer.
> The internal libraries and vendor libraries should both be of high quality.
From my limited experience - high-quality internal libraries are simply not the reality; less likely to be achieved than winning the lottery. Companies typically:
* are not able to identify candidates capable of writing high-quality C++
* do not try to attract SW engineers by committing to high-quality code.
* don't believe they should invest developer time in making a library more robust and bringing it to the level of polish of popular publicly-available FOSS libraries.
* do not have a culture of acquiring, honing and sharing coding skills and expertise, with the help of actual experts. Again, time and effort is mostly not invested in this.
Either you’ve worked with rookie developers (which is fine, but not ’expected industry baseline’) or in an engineering core lacking years of C++ development. Doing stuff ’the right way’ does not generally need extra resourcing - you simply do it the right way.
Quality gaps like described above - I think this happens when you try to develop C++ without actual experience in C++. C++ is so weird anyone trying to ”do the right thing in the language they are most familiar with” generally get it wrong for the first few years. And then you end up with a quagmire nobody wants to volunteer to clean up.
This is not a skill issue as such, or a lack of talent. C++ simply is so weird, and there is so much bad "professional advice", that you are expected to lose a few limbs before being able to navigate a design landscape full of mines.
> And then you end up with a quagmire nobody wants to volunteer to clean up.
Not only that, but the rookie developers coming in get inculcated into that. That's what they're used to, and they have all the motivation to continue writing poor code, because they need to avoid their better code clashing with what's already written - clashing compilation-wise and style-wise.
Of course, it's not 100% all bad, there are gradual improvements in some aspects by some developers.
The upshot is that generally relevant C++ codebases become decades old - there should be enough time to eventually become competent.
Take a look at the implementation of ..say.. std::tuple, and say whether most C++ users need to be able to write that kind of C++.
No, and they shouldn’t probably. Most internal libraries don’t have and don’t need to implement novel complex template based specialization - not in their API at least. And stuff that’s internal to library needs to only implement the things the API contract requires - which usually does not require the rigmarole of fully generic ’modern’ template based implementation.
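To make that concrete, here is a deliberately stripped-down sketch of the kind of machinery a tuple needs (nothing like production quality; real std::tuple implementations add empty-base optimisation, reference and allocator handling, and much more):

    #include <cstddef>
    #include <cstdio>

    template<typename... Ts> struct Tuple {};              // empty tuple

    template<typename Head, typename... Tail>
    struct Tuple<Head, Tail...> {                          // recursive case: one element + the rest
        Head head;
        Tuple<Tail...> tail;
    };

    template<std::size_t I, typename Head, typename... Tail>
    constexpr auto& get(Tuple<Head, Tail...>& t) {
        if constexpr (I == 0) return t.head;
        else return get<I - 1>(t.tail);
    }

    int main() {
        Tuple<int, double, const char*> t{1, {2.5, {"three", {}}}};
        std::printf("%d %s\n", get<0>(t), get<2>(t));
        return 0;
    }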
> if (auto [to, ec] = std::to_chars(p, last, 42))
I'm not into plusplus, however I'm curious. How does the tuple get evaluated as a condition? Is that lowered to `if (to && ec)`?
The std::to_chars function returns an object of type std::to_chars_result, which defines an operator bool() checking whether ec == std::errc{} [0].
The if statement determines which branch to take based on the value of the condition. This value is contextually converted to a bool and evaluated[1].
[0] https://en.cppreference.com/w/cpp/utility/to_chars_result
[1] https://en.cppreference.com/w/cpp/language/if#Condition
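If it helps, here is the same call written out the long way, in the C++17 form with an explicit check (to my knowledge, to_chars_result only gains an operator bool in C++26, which is what the quoted one-liner relies on):

    #include <charconv>
    #include <cstdio>
    #include <system_error>

    int main() {
        char buf[16];
        // The whole to_chars_result is what the if sees; [ptr, ec] just names its members.
        if (auto [ptr, ec] = std::to_chars(buf, buf + sizeof buf, 42); ec == std::errc{}) {
            std::printf("wrote %d chars\n", static_cast<int>(ptr - buf));
        }
        return 0;
    }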
But there are two variables being defined by the destructuring. I believe OP's question was whether there's a rule for which gets chosen for the condition, rather than about contextual conversion to bool in general (which happens even when there's no initialisation in the if statement at all).
Your comment seems to imply the condition is evaluated before initialising the variable(s) at all; is that what you meant? If so, this beast would work (even though it's undefined behaviour to construct a std::string from nullptr, and std::string is not convertible to bool):
Yes exactly. My hunch is to remember that in `auto [to, ec] = std::to_chars(p, last, 42)` the two names `to` and `ec` are not "real" variables/objects, but names bound to parts of the object returned by `std::to_chars`. So fundamentally, `std::to_chars` returns a `std::to_chars_result`, that _is_ the return value and what is then contextually converted to bool for evaluation of the condition. It's then some C++17 compiler thing that separately associates the two names `to` and `ec` with the two parts of that returned tuple object.
But I could be wrong, the paper for the feature is linked but I didn't read it (!).
> Yes exactly.
Yes exactly, my example would work?
> My hunch is to remember that in `auto [to, ec] = std::to_chars(p, last, 42)` the two names `to` and `ec` are not "real" variables/objects, but ...
Oh so my example wouldn't work after all (because std::string s is a "real" variable/object)?
Your example wouldn't work, yes.
In the case of a structured binding, what is assigned (the std::to_chars_result) is what the if condition considers; the left-hand side of the assignment is then split in two, just as if it were `if (auto res = std::to_chars(p, last, 42))`. The split with the `[to, ec]` makes it convenient inside the if body.
Ok but you've avoided saying whether my example would work, and I don't think what you've said even hints one way or another.
The result of the expression is the condition.
Thus, in your example, the bool check would apply to "s", after the expression is evaluated.
The fact that foo() may return nullptr at runtime and your "s" is UB is your fault for running with scissors.
so "this beast would work" for some definition of "work". But not because of order of evaluation.
Most modern C++ compilers would warn you about not using a bool in a conditional anyway.
> The result of the expression is the condition. Thus, in your example, the bool check would apply to "s", after the expression is evaluated.
This is a contradiction. There is no expression in my code that evaluates to s. foo() is an expression, and then std::string s = ... is assignment initialisation, which is not an expression.
Edit: I suppose that if I used another form of initialisation, the answer becomes a bit more obvious:
(Not that this makes sense, but the point is just to use a constructor with more than one argument.) In this case it's clear the test has to be the just-initialised variable. In fact there could be no arguments at all!
You are using definitions that I am not familiar with. Maybe it's because we speak different programming languages :-)
x = y is an expression statement in C++, which can be evaluated in an "if" for its side-effects.
https://en.wikipedia.org/wiki/Expression_(computer_science)
But Type x = y isn’t.
`ec` is an error code. What happens is a conversion to bool, see `operátor bool()` https://en.cppreference.com/w/cpp/utility/to_chars_result
And no, don't ask mé why somebody might think that a bool is a suitable type to check for success or error.
What is wrong with your keyboard lol
Autocomplete.
User-generated static_assert messages would make it easier to build games that can be played entirely using compiler error messages. Something like this old IOCCC entry but nicer:
https://www.ioccc.org/years.html#1994_westley
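For the curious, a rough sketch of what the C++26 feature looks like; the message just has to be a constant expression with data() and size(), and Riddle here is an invented example, not something from the article:

    #include <string_view>

    template<int Guess>
    struct Riddle {
        // The message is computed at compile time and printed only if the assert fires.
        static constexpr std::string_view hint =
            Guess < 42 ? "too low, guess again" : "too high, guess again";
        static_assert(Guess == 42, hint);
    };

    Riddle<42> solved;     // compiles quietly
    // Riddle<7> nope;     // would fail to compile, with "too low, guess again" in the error

    int main() { return 0; }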
This doesn't mention the most exciting thing coming: static reflection. Finally no need to manually implement printing or serialisation functions for every struct.
Probably, but it isn't fully baked in yet, and the same thing could happen to it as happened to contracts in C++20.
Some of those features look like features I've been seeing in all major languages I use. They're mostly ergonomic for the developer.
C++ is a funny chimeric creation that has absorbed some great modern ideas from Rust and other new languages but needs to preserve compatibility with its antediluvian C heritage. You could write it in a very clean and somewhat safe modern style or in a familiar C-like style. We use modern C++ at work and, by embracing RAII, it really isn't so bad.
https://web.archive.org/web/20240907061007/https://mariusban...
It seems to be down.
How long before you reckon it would be widely adopted? I feel like the pace of revision (~3 years per standard) seems too fast.
Regarding the delete feature, can one not just raise an error at runtime in C++ for a deprecated/deleted function?
The concept of doing something at runtime that could be done at compile time is anathema for c++ programmers.
You could, but why, when we already know at build time that the function is deleted or deprecated and, better yet, know exactly where?
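For context, the C++26 form being discussed looks roughly like this (the parse overloads are hypothetical):

    #include <cstddef>

    void parse(const char* text) { /* ... */ }             // the overload people should use

    // '= delete' can now carry a reason, which compilers echo in the diagnostic.
    void parse(std::nullptr_t)
        = delete("a null C string is never valid input; pass an empty string instead");

    int main() {
        parse("42");
        // parse(nullptr);   // rejected at compile time, quoting the reason above
        return 0;
    }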
Runtime errors in a rare path are often never tested until a customer hits that rare case. This is one of the reasons I won't use Python for a large project: eventually you will have critical errors in production, because not only didn't you test the code, you didn't even get the minimum proof that it works that a compiler provides.
> not only didn't you test the code
That's why I won't use C++ programmers in a large project
I for one started working on a new project in C++ rather than Rust. I think it's unclear whether Rust is going to be the successor at this point. It's probably never going to pick up in the games industry, Qt is C++ (and Rust bindings will always be second class or they could end up unmaintained), it has better compile times, and it is said to be undisputed when it comes to high performance. Obviously the tool-for-the-job factor is most critical.
Career wise, many people are picking up Rust and almost no one is picking up C++, while experienced C++ devs either retire or switch to a higher level language due to landscape change. I would trust supply and demand to be in favour of C++ 10 years from now.
There are also attempts to make C++ more memory safe, like Carbon [1] or the Circle compiler [2]. If they succeed, why would anyone want to switch to Rust? Also, Rust is not ideal for security from a different perspective - I think the lack of a package manager is one of C++'s strongest points. After working for 9 years with npm, you really appreciate that the dependency you install is usually just a single dependency (even an option for a single header file is common); when there's a package manager, people will abuse it (like installing a package with 3 dependencies of its own for something that could be 50 LOC copy-pasted), and managing the security of the supply chain is nearly impossible (it will be a legal requirement soon in the EU though).
Anyway, I wanted to ask: how is the contracting market looking in the C++ world? I'm guessing it depends on the domain heavily? I'm mainly asking about Qt and anything that would be desktop / mobile apps / systems programming except video games, but I'm curious in general.
[1] https://github.com/carbon-language/carbon-lang
[2] https://github.com/seanbaxter/circle
Reflection is awesome, it reminds me a lot of Zig’s comptime functionality.
There is no reflection feature in TFA.
Excuse my ignorance, but what is TFA?
I believe Reflection is being taken very seriously and will be included in standard 26.
The Fine Article, meaning what the HN submission links to.
I've always been mystified by that expansion. "The featured article" seems to me to make much more sense.
Originally, it stands for "The Fucking Article", as in "RTFA" (= "Read the fucking article"). The expansion "The Fine Article" is a humorous reference to the former.
Note that over the years "TFA" has lost its profane meaning and is generally used in a neutral way.
Planned to be included, it remains open if it will indeed.
Why don't they just invest in carbonlang instead?
Wow. Against my better judgment I will keep to the rules of this site and assume that was a good faith question.
Rust is already a good systems language and is getting adoption. D has been a great C++-alike for 20 (?) years already.
There is a mature C++ toolchain for any processor and OS you can imagine.
Simply adopting a different C++ compiler or a newer version of one you are already using can take many months for a large company. Migration to even Carbon would probably take 10x as much effort.
Carbonlang is meant to have a seamless migration path, that is, flip the compiler on the same codebase for starters, with no changes. It's not like TS/JS as a superset language, but you can have both at the file level and compile them side-by-side.
First there is no Carbonlang, it is called Carbon.
Second, it is mostly a Google thing for their C++ use, it is still mostly a frontend implementation at this point, with semantics yet to be fully defined.
They are also open that Carbon is basically an experiment.
If it is just a frontend, does it mean that it could be used everywhere the backend is used?
First you need to create a backend that understands what the frontend generates.
https://godbolt.org/z/rzMcs6nvh
I see...
Does it mean that the compiler is more than just a frontend? Or maybe I don't understand what a frontend is?
It definitely is.
The frontend of a compiler is what converts the textual representation of the language into some intermediary format, usually a graph or an intermediate language, that is then further processed for type checking and other semantic analysis, undergoing other transformations in the process, until it is fed into the backend, which takes it from there for the further phases required to generate machine code.
https://llvm.org/docs/tutorial/MyFirstLanguageFrontend/index...
Thank you for your explanations.
I don't understand why you provided that link to godbolt, though? What was it supposed to demonstrate?
Also, you said that one needs a backend for a frontend (to be useful, I guess). Do you mean to say that the "Carbon" frontend does not have any backend to work with?
I suggest learning about compilers in first place.
https://www3.nd.edu/~dthain/compilerbook/compilerbook.pdf
Oh, thank you...
I am sorry I bothered you with my annoying questions..
No worries, I see some interest there, hence pointing out some literature on the subject, and the LLVM tutorial, as means for getting a better understanding than through plain HN comments.
C++, the sharpest knife in the drawer.
Unfortunately it's the socks-and-T-shirts drawer...
.
If you replace your whole comment with ".", I'm just going to automatically down vote it, so the effect is worse than whatever was originally there.
These kinds of "masking" edits prevent good communication. If you want us all to disregard a comment you now totally disown, then just write an edit that prepends/appends that.
If HN decides to downvote a technical comment into oblivion that contains a minor technical oversight, I will not be part of that. And I can do whatever the fuck I want with my comments. That's what you get by (probably illegally) disabling a delete button
Maybe they are fine with being downvoted and just want to prevent readers wasting any further time with their comment.
Writing that in an edit is completely valid -- we're grown up enough to weigh whether to bother reading the comment then. Replacing the whole thing with just "." is not acceptable, IMO.
If they're fine with accepting the down votes either way, I still want to register my complaint, pointless as it may be in practice.
It's perfectly okay that you downvote, as that supports the goal of demoting the comment. (The effects of up/downvoting on comment ranking trumps the effect on the author's karma, IMO.) However I don't agree that pseudo-deleting is unacceptable, if for example it contained an incorrect argument and the author thinks there is no value anymore in someone reading it.
So they presumably think they were fully wrong. Wrong about what? It's often useful to know about mistaken assumptions, etc. Now all the "corrections" in the replies are harder to comprehend. Everybody's work is devalued and made harder as a result of the pseudo-delete.
I mean, take a step back and look at what you’re talking about here: a minor subthread about an incorrect argument. When I see a “.” post, I usually think “okay, nothing to see here, I can skip this”. Which in all likelihood is the best for my and everyone else's time. And I grant the OP the freedom to make that call.
In terms of UX, it probably would be better if HN allowed a commenter to “dead” their comment, including the attached subthread. People who cared could still view it with showdead, but everyone else would be saved time.
It's just mildly annoying and rare enough I want to register the breaking of an implicit norm. I don't need to take steps forward or back, or anywhere else, I just disagree with you still, for subjective reasons that should be fairly clear.
std::ignore doesn't work in the context of structured bindings. And even if it did, "we already have this" has never stopped the C++ committee from adding something before :)
Yes, the "skill" of the commitee to discuss some feature for a really looooooong time and then come up with a solution which is going to be (half) fixed in the next standard always astonishes mé :).
I guess with C++38 we'll get a `always_definetly_ignore_this_wothout_any_diagnostic_whatsoever`.
Sure about that? It does happen to work on libstdc++/libc++/MS STL, but it's not specified to work anywhere but std::tie. The existing practice is to cast to void.
can you show an example how? I can't find a case where std::ignore compiles inside a structured binding declaration.
https://godbolt.org/z/sjefeGvPf https://godbolt.org/z/8a7Ps4KdW
It will work for assignment/construction, but it isn't part of standard C++, is what I mean; at least not yet. Structured bindings won't accept it, but in C++26, _ should work there and in other places.
No, that's why `maybe_unused` has been needed.
Leading with "Specifying a reason for deleting a function" then following up with "Placeholder variables with no name" did make me check the date of the article. It wasn't April 1.
The standards committee are thorough in their mission to include everything and the kitchen sink in C++.
"Placeholder variable with no name" is the super-common feature from other language where you write, for example (not exact real syntax):
since you only want the first and third items; the underscore is the placeholder.Useful feature, convenient feature, doesn't complicate your life as a programmer, no need to even remember it, it'll just come to you. Good thing to have in the language IMNSHO.
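A minimal sketch of the C++26 flavour of this, assuming a compiler that already implements the placeholder paper (the tuple contents are arbitrary):

    #include <cstdio>
    #include <tuple>

    int main() {
        // Keep the first and third elements, ignore the middle one.
        auto [x, _, z] = std::tuple{1, 2.5, "three"};
        std::printf("%d %s\n", x, z);
        return 0;
    }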
C++ has certainly had a lot added, but I don't get your point regarding these particular two features. They seem quite minor, useful, easily implemented and unlikely to interact problematically with other things.