Hacker News

I do not see a double standard. There is a fundamental difference between PHP and C++. PHP was simply a bad design, needlessly ugly.

In contrast, the design of C++ was constrained by the goals of

- Compatibility with C (and later previous C++ standards)

- Zero overhead / you do not pay for what you do not use

- No GC

Those constraints are responsible for most of its ugliness, not poor design.




1. From a user's perspective it doesn't matter if it's "constraints" or "poor design".

2. Blaming C does not explain C++'s undecidable grammar, the zillion ABI incompatibilities introduced on top of C, the lack of an alternative to #include (which, coupled with features like inlining and templates, becomes a disaster it never quite was in C), or many of C++'s other faults. How is PHP's grammar a worse design than C++'s? I really want to know. I think PHP's is better.

3. Source compatibility with C does nothing for the user (who could have lived with ABI compatibility just fine) and everything for Stroustrup (who managed to make his language that much more popular by an Embrace, Extend and Exterminate approach).

4. Design goals inevitably resulting in a bad language are bad design goals. If I said that I designed an electric car that electrocutes you but that's because it's compatible with an electric chair for which there's a lot of existing demand, would you then publicly defend me?

5. PHP is a child of Perl which is a child of shell, awk, sed and C. A lot of heritage there to blame as well. So?

6. PHP doesn't bite one's behind nearly as badly as C++, and that's why the double standard exists: only a fraction of the people who try C++ ever manage to get anything done in it, and a large subset of those feel entitled to be treated as an "elite". PHP programmers cannot afford to delude themselves thusly, because everyone can be productive in PHP... A funny outcome, I find, given that it "should" have actually led to PHP being praised.


> 1. From a user's perspective it doesn't matter if it's "constraints" or "poor design".

Of course it does. If it's poor design, one just doesn't use the language at all. If it's constraints, any language that solves the same kinds of problems will be just as bad, and - life just sucks - one must stop complaining.


"If it's poor design, one just don't use the language at all."

For greenfield development, sure. Many and maybe most of us don't work in that space.


From a user's perspective it doesn't matter if it's "constraints" or "poor design".

Of course it does. I'm a "user" of C++ and all three of those constraints are very very important to me. There's pretty much no other language (in common use with lots of vendor support etc.) with the power of C++ that follows the same constraints. The constraints are the very reason that C++ is still so popular despite all the complexity (and seemingly an extremely hostile group of critics trying to warn everybody against using it.)


1. Wrong for reasons others have mentioned

2. PHP was never designed to even support objects in the first place. The API is super inconsistent in terms of parameter ordering and naming. Contrast that with the vast majority of the STL. Sources for the zillion ABI incompatibilities comment?

3. How about, oh, IDK, OpenGL being a C interface. Lua? The Linux kernel? X11? Driver interfaces in general?

4. The big presupposition here is that it's a bad language. Honestly, I'd like to see you design a language that does what C++ allows you to do much better.

5. C was arguably way better put together than Perl was. Perl is an abomination to be honest.

6. What are the use cases of C++ compared to PHP? C++ is used to do things like author renderers, or physics engines, or mathematical simulations. Trust me when I say that attempting to accomplish any of those at the same scale, on the same hardware, in PHP is madness.


1. Except for self-inflicted constraints that don't match the problem domain. Which leads to…

3. OpenGL, Lua FFI, Linux… have nothing to do with source compatibility with C. And everything to do with ABI compatibility. C-with-classes was a preprocessor that generated C code remember? That pretty much guaranteed ABI compatibility, since it was then compiled by the same C compiler.

But what does it matter whether the pre-processor was also capable of parsing plain C? It doesn't! We have a C compiler, why not just use that? If Stroustrup had wanted to, he could have replaced the braces and semicolons with mandatory indentation, Python style, and lost zero backward compatibility in the process. Or at least, no compatibility that actually matters.

Source compatibility was a useless constraint that led to a painful language. From the user's point of view, the difference between a useless constraint and a bad design choice is nil.

Now I will concede one advantage of source compatibility: the learning curve in the extremely short term. If you don't even have a few hours to learn a different syntax, it makes sense. But if you intend to use a language for more than two weeks, it's silly to suffer any long-term cost just because you don't want to pay that learning cost up front.


Source compatibility provides an upgrade path. This was useful when C++ was introduced, because it allowed you to take an existing C program and incrementally start using C++ features. You could claim this is less important today, but I have a counterexample. Look at gcc. It used to be written in C, but switched to C++ recently. Try that with mandatory indentation.


Upgrade path? Easy as pie! Just parse the C source, then pretty-print a mandatory-indentation version. As long as the new language has all the semantics of C, converting from one syntax to another is trivial. And if you insist on "incremental", you can do this one module at a time.

Seriously, stop listening to your guts and look at the facts.

As for gcc setting up some kind of example for upgrading to C++… C and C++ are low-level systems languages. They are not suitable for static compilers. While I understand bootstrap as a validation technique, it is not worth the hassle of suffering such a mismatch between the language and the problem domain.

Static compilers should be written in an ML dialect, or Haskell, or maybe Lisp. C and C++ have no business here. And if you really want the bootstrap thing, then you should also implement a DSL that compiles to C or C++, and use that to make your compiler.


Okay, if you literally have all the semantics of C, your idea works. But if you have ALL the semantics of C, why bother changing the syntax?

As for ML and company being a better choice than C++ for writing a compiler... There may well be convincing arguments for this position, and if I asked you, no doubt you'd give me some. But I'll ask something else instead. Most compilers I've heard of are written in C or C++. Name me a compiler written in ML/Haskell/Lisp. There's just one rule: no bootstrapping. Meaning that if you show me a compiler for ML written in Haskell, that's fine, but a Lisp compiler written in Lisp is not.


> But if you have ALL the semantics of C, why bother changing the syntax?

Two reasons: first, change what sucks (like the useless clutter with braces and semicolons, or the switches that fall through by default). Second, add more features on top of what we have.

As much as I hate C++, we do live in a C-dominated world, and C is… limited. The lack of a usable generics system is crippling. C++'s destructors really help with semi-automatic cleanup. And it would be good to have a module system, and maybe built-in support for algebraic data types. (I must not overdo it: if I end up turning C into ML, I should use ML. Or Rust.)
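To make the destructor point concrete, here is a minimal sketch (illustrative names, not from anyone's real codebase) contrasting C's manual cleanup with C++'s scope-based cleanup:

```cpp
#include <cassert>
#include <cstdlib>

// C style: every exit path must remember to release the buffer.
int c_style() {
    char *buf = static_cast<char *>(std::malloc(64));
    if (!buf) return -1;
    // ... work ...
    std::free(buf);  // easy to forget on early returns
    return 0;
}

// C++ style: the destructor releases the buffer on every exit path.
struct Buffer {
    char *p;
    explicit Buffer(std::size_t n) : p(static_cast<char *>(std::malloc(n))) {}
    Buffer(const Buffer &) = delete;  // avoid accidental double-free
    ~Buffer() { std::free(p); }       // runs automatically at scope exit
};

int cpp_style() {
    Buffer buf(64);
    if (!buf.p) return -1;  // buf.p is still freed by the destructor here
    // ... work ...
    return 0;               // ... and here
}
```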

> if I asked you, no doubt you'd give me some.

I'll give you one: Algebraic data types, which are virtually unmatched at representing abstract syntax trees, and basically any intermediate representation a compiler can come up with. Other arguments here: http://flint.cs.yale.edu/cs421/case-for-ml.html
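For readers who haven't used ML: the AST argument can be approximated even in C++ with std::variant, though it is far more verbose than ML's native sum types and pattern matching. A hedged sketch (the Expr/Lit/Add names are invented for illustration):

```cpp
#include <cassert>
#include <memory>
#include <variant>

// A tiny expression AST as a sum type: std::variant approximates what
// ML-family algebraic data types express natively.
struct Expr;  // forward declaration for the recursive cases
struct Lit { int value; };
struct Add { std::shared_ptr<Expr> lhs, rhs; };
struct Expr { std::variant<Lit, Add> node; };

std::shared_ptr<Expr> lit(int v) {
    return std::make_shared<Expr>(Expr{Lit{v}});
}

// Evaluation by case analysis: the moral equivalent of ML's "match".
int eval(const Expr &e) {
    if (const Lit *l = std::get_if<Lit>(&e.node)) return l->value;
    const Add &a = std::get<Add>(e.node);
    return eval(*a.lhs) + eval(*a.rhs);
}
```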

> There's just one rule: no bootstrapping.

Off the top of my head, I don't recall any compiler that doesn't bootstrap. In any case, I fully expect that most compilers are indeed written in C and C++, not in ML or Haskell: C and C++ are that popular.

I also happen to think our industry as a whole is mostly nuts and short-sighted.


You may well be right about that last part. But still, perhaps one day someone will write a C compiler in ML in record time and get people's attention.


I don't believe that C++ would have had undecidable grammar if it hadn't aimed for source compatibility with C.


What rubs me the wrong way is that it's not even C compatible. Trying to compile a C codebase with a C++ compiler is more likely than not going to give me hundreds or thousands of conversion errors thanks to the stricter conversion rules. It's a worst-of-both-worlds situation: neither the ability to use C code as-is nor a clean grammar.
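A concrete instance of those conversion errors: implicit void* conversions are legal C but ill-formed C++. A small sketch (the demo function is invented for illustration):

```cpp
#include <cassert>
#include <cstdlib>

// C lets void * convert implicitly to any object pointer type;
// C++ does not, which is one common source of the errors above.
int demo() {
    void *raw = std::malloc(10 * sizeof(int));
    // int *p = raw;  // legal C, ill-formed C++:
    //                // "invalid conversion from 'void*' to 'int*'"
    int *p = static_cast<int *>(raw);  // C++ requires an explicit cast
    if (!p) return -1;
    p[0] = 42;
    int result = p[0];
    std::free(p);
    return result;
}
```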

I also believe the grammar could have been made much better while maintaining C compatibility. The language design seems more like throwing stuff at the wall and rolling with whatever kind of sticks, than fully exploring the problem space. See the export keyword: implemented once, and then removed as worthless, not even bothering with the step of deprecation, at the recommendation of said implementation.


The undecidability arises from the way that C's lexer hack for types interacts with the way C++ chose to implement templates. There are other approaches to implementing generics that could have avoided this unfortunate interaction while remaining backwards compatible with C (as C does not have generics).
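A sketch of that interaction: whether "a * b;" declares a pointer or multiplies two values depends on whether "a" names a type, so the parser must consult the symbol table. Inside a template the same ambiguity attaches to dependent names, which is why C++ needs the typename disambiguator, and since template instantiation is Turing-complete, deciding what a token sequence means can require arbitrary computation. (Names below are illustrative.)

```cpp
#include <cassert>

struct HasType { using type = int; };

template <typename T>
int f() {
    typename T::type *p = nullptr;  // without "typename", "T::type * p"
    return p == nullptr ? 1 : 0;    // would be parsed as a multiplication
}
```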


There also isn't a real alternative to C++ for many of the things it does. If you're writing an OS, JIT compiler, or a garbage collector, you're writing it in C or C++ right now. And due to frustratingly solvable problems like symbol collision, C does a terrible job of scaling to large code bases.

In contrast, there are many languages that can do the job of PHP.

To be sure, there are languages that aim to challenge C++ on its home turf, but it will take a lot of time and a lot of work before you will see a relational database written in Rust (or insert your favorite C++-killer here).

EDIT: spelling and grammar


"C does a terrible job of scaling to large code bases"

Linux says otherwise.


That's partly an exception that proves the rule and partly not a typical case. Uniquely, OSs get to reserve names for themselves (like time, syscall, reboot, etc.). Third-party libraries and other code have to do funny prefixing to avoid name collisions. OSs also have relatively few source code dependencies and, when they do, they get to dictate them.

There are other reasons that C doesn't scale well to large software systems. For example, C's relatively limited compile-time error checking, which leads to using conventions (especially naming conventions) to enforce good practices where tools and good typing would do the job better.
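The prefixing point above can be sketched in a few lines (library names invented for illustration): C's single flat namespace forces the convention, while C++ makes the isolation a compiler-checked language feature:

```cpp
#include <cassert>

// C convention: one flat namespace per program, so every library
// defensively prefixes its symbols (poor man's scoping).
int mylib_buffer_create(void) { return 42; }

// C++: namespaces give the same isolation as a language feature,
// enforced by the compiler instead of by naming discipline.
namespace mylib    { int buffer_create() { return 42; } }
namespace otherlib { int buffer_create() { return 7;  } }  // no collision
```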


Exceptions don't prove rules; they refute them.

The actual origin of the saying is quite different:

http://en.wikipedia.org/wiki/Exception_that_proves_the_rule


I understand what the expression means, and that's what I meant (though perhaps I could have been more clear). *nix (and libc) are exceptional in that they got first dibs on the symbols they wanted due to being there first. So it proves (while justifying) the rule that you have to prefix all of your symbols unless you're Thompson or Ritchie in the 70s. Even now, in newer C features, extra acrobatics are done to avoid name collisions (_Bool in C99 for example).

But this is really a side issue to my original point: C++ and PHP are categorically different since there aren't strong alternatives to C++ for many types of problems. Though it is possible that other languages are eating at the edges of what C++ is good at.


You don't need to write large software systems in C or C++. Use Java. Or decompose it into components and employ Python.

But GC, VM, OS - all those TLAs can and should be written in kosher C.


Single data points never tell you a tremendous amount. People can and do write good software in languages that are poorly suited to the task all the time. No one is saying you can't do it; just that it makes your life significantly harder than it otherwise needs to be.

The success of C++ provides a wealth of evidence that C isn't really up to the task -- everyone seems to hate it for its complexity, but very few people are writing web browsers in C89 these days.


I would label your third point as a subcategory of the second, as it is impossible to have a garbage collector without having some sort of overhead.


> I would label your third point as a subcategory of the second, as it is impossible to have a garbage collector without having some sort of overhead.

It's impossible to have a heap allocator without overhead. How much overhead you have depends on the design constraints and the trade-offs. Particularly if you are dealing with more sophisticated constraints like memory fragmentation and other issues, a GC might actually be an easier and lower-overhead way of solving the problem.

More importantly, you missed the most important aspect of the second point: "you do not pay _for what you do not use_". You and copx would be surprised to know that pretty much since the standards committee started working on it, garbage collection has been on the table for C++, but they've never been able to get a proposal that everyone was happy with.

I also think that "No GC" is perhaps not the right name for the category. What C++ was really about was making UDTs first-class types and therefore supporting value semantics. A side effect of that is that GC isn't nearly the priority it might otherwise be.


How about we put it this way: all forms of dynamic memory management have overhead, including malloc(). The overhead for garbage collection is different from the overhead for malloc(); GC is worse in some respects (latency, space usage) but better in others (throughput, development time).

GC can be much faster than malloc() when allocating objects; depending on the GC scheme used and the heap profile, the allocation savings may outweigh the cost of collection.
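The allocation-speed claim typically refers to bump allocation, as used by copying and generational collectors: allocating is a pointer increment into a contiguous nursery, with reclamation deferred to collection time. A minimal sketch (no actual collection implemented here):

```cpp
#include <cassert>
#include <cstddef>

// Bump allocation: each alloc is an aligned pointer increment.
// A real GC reclaims the whole region at once when it collects;
// malloc() instead searches free lists / size classes on every call.
struct BumpArena {
    static constexpr std::size_t kSize = 1 << 16;
    alignas(std::max_align_t) unsigned char heap[kSize];
    std::size_t top = 0;

    void *alloc(std::size_t n) {
        const std::size_t a = alignof(std::max_align_t);
        n = (n + a - 1) & ~(a - 1);           // round up to alignment
        if (top + n > kSize) return nullptr;  // a real GC would collect here
        void *p = heap + top;
        top += n;                             // the entire "bookkeeping"
        return p;
    }
};
```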

So "No GC" is a completely separate point.


Garbage collection does not offer better throughput than manual memory management; in fact, it tends to need ~6x the memory to equal it [1].

[1] http://www.cs.umass.edu/~emery/pubs/04-17.pdf


I'd be wary of that paper: of the five garbage collectors they tested, only one appears to be generational. That makes me doubt they used sufficiently state-of-the-art garbage collection.


"Compatibility with C"

This is not a feature for a programmer to use to make their work easier, rather this is a hook to infect the code, spread the disease and live where other nicer things wither.

C++ is like mold. Yes, it is ugly, and you get it whether you want it or not, precisely because it is the most efficient life form compatible with your stale food.

Mold is the most popular pet people have, because it is so hard to avoid. Lice (PHP) were too, but today they have been mostly eradicated thanks to better hygiene.


>This is not a feature for a programmer to use to make their work easier

It makes it possible to use C definitions directly and inline with no conversion overhead or indirection. When your work is writing fast code, the ability to leverage existing fast code damned well is a feature.

>C++ is like mold

>you don't want it, precisely because this is the most efficient life form compatible with your stale food

You don't want mold because it causes respiratory and neurological problems. C++ does not cause respiratory or neurological problems, unless you count the insanity it inherited from C like, "A character is the same thing as a byte," "A pointer is the same thing as an array" (which isn't even actually true within the language, unless you count parameter declarations), and my personal favourite, "The mathematical value of false, the cardinality of the empty set, and the identity of nothing are all the same thing. After all, they have the same representation!"
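That last conflation can be shown in a few lines: in C++ (as inherited from C), the integer 0, the null pointer, and false all convert into one another implicitly. A hedged sketch (the demo function is invented for illustration):

```cpp
#include <cassert>

// 0, the null pointer, and false share one representation and
// convert implicitly into one another.
int zero_null_false_demo() {
    int *nothing = 0;     // 0 is a valid null pointer constant
    bool flag = nothing;  // a null pointer converts to false
    int n = flag;         // false converts to the integer 0
    if (!nothing && !flag && n == 0)
        return 1;         // all three "are" the same value
    return 0;
}
```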


> [Compatibility with C] makes it possible to use C definitions directly and inline with no conversion overhead or indirection. When your work is writing fast code, the ability to leverage existing fast code damned well is a feature.

That is achieved with ABI compatibility. You don't need source compatibility, however. At worst, you'd have to rewrite the prototype of the C function in your own syntax-incompatible language.

Think of it as a zero-overhead FFI.


>re-write the prototype

>ABI compatibility

In other words, conversion overhead and indirection, respectively. There is no substitute for having an inline definition available at the point of use.


Oh. I thought you were talking about runtime overhead. "Fast code" and all.

Well if you're worried about writing overhead, there are many FFIs out there that parse the header you want so you don't have to repeat anything.

But if you believe for a nanosecond that we need source-level compatibility to get the benefits of inlining… Just remember how C++ first worked: as a pre-processor that generated C code! So it's all C underneath, and GCC and LLVM and MSVC and ICC can all do their crazy optimizations across the whole project, should you need that.

And in any case, C++ is no longer source-compatible with C. And it no longer works as a pre-processor either. Yet I doubt it has the disadvantage you talk about. C++ has many faults, but overhead with the C FFI is not one of them. Which means we have an existence proof for the assertion that source-level compatibility is. not. needed.


>if you believe for a nanosecond that we need source-level compatibility to get the benefits of inlining

You do. Yes, there's LTO, but to do it effectively relies on storing source code or information derived from it in the intermediate object files. Without the source, the best you can do is try to inline what is hopefully a reasonably well-behaved hunk of object code without any of the high-level information associated with it that the compiler threw away when it emitted it.

>C++ has many faults, but overhead with the C FFI is not one of them [...]

I'm not sure exactly what you mean, but if you're talking about the runtime overhead of calling an extern C function, then that is very much a fault. Inlining can make all the difference in the world with respect to performance in an inner loop, the classic example being qsort and std::sort, and to take advantage of that, source-level compatibility is needed.
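The qsort/std::sort contrast mentioned above, sketched: qsort reaches its comparator through an opaque function pointer on every comparison, while std::sort's comparator type is part of the template instantiation, so the compiler can typically inline it:

```cpp
#include <algorithm>
#include <cassert>
#include <cstdlib>
#include <vector>

// qsort: the comparator is a function pointer, so each comparison
// is usually an indirect call the compiler cannot inline.
int cmp_int(const void *a, const void *b) {
    int x = *static_cast<const int *>(a);
    int y = *static_cast<const int *>(b);
    return (x > y) - (x < y);
}

void sort_c(std::vector<int> &v) {
    std::qsort(v.data(), v.size(), sizeof(int), cmp_int);
}

// std::sort: the lambda's type is baked into the instantiation, so
// the comparison typically inlines down to a single compare.
void sort_cpp(std::vector<int> &v) {
    std::sort(v.begin(), v.end(), [](int a, int b) { return a < b; });
}
```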

Now, if you said that source-level compatibility were better achieved using a fully-fledged FFI than by stuffing all but the worst (well, some of the worst) of C into your language, I would whole-heartedly agree with you. If C++ had done the former, it would be a much cleaner, much less popular language, and instead of using a warty language with templates, useful user-defined datatypes, and function overloading, we'd all be using a warty language without templates, useful user-defined datatypes, or function overloading. Hell, many of us still do anyway.


> C++ is like mold. Yes it is ugly, and you don't want it, precisely because this is the most efficient form for it to be able to colonize your food.

Why does any mention of C++ bring out these silly metaphors that contribute nothing to the conversation?



