
And yet... I think this critique gets weaker as time goes on.

The amount of productivity available to Mr. Kamp for free today is conservatively double or triple that available in 1999. Databases, web frameworks, scale know-how, IDEs, hosting platforms, the list goes on.

He harkens back, sadly, to an era in which codebases like Genuity Black Rocket cost $100k in licensing, and ran on $30k/month Sun servers. Seriously.

Languages are faster, development times are shorter, and chips are WAY faster. And code can be pushed out for tinkering and innovation onto GitHub for free. Combine that with his estimate that we have 100x more people in computing, and the result is a riot of creativity, crap, fascinating tech and everything in between.

The bazaar is messy, but I'm not aware of any solid critique showing that cathedrals deliver the multiples-of-efficiency gains we get from legions of self-interested, self-motivated coders.




> The bazaar is messy, but I'm not aware of any solid critique showing that cathedrals deliver the multiples-of-efficiency gains we get from legions of self-interested, self-motivated coders.

The article isn't about efficiency, it's about quality. The assertion is "Quality happens only when someone is responsible for it."


> Languages are faster, development times are shorter, and chips are WAY faster.

This is due to Moore's law, not the software design choices that the article bemoans. Those $30k/month Sun servers were many times faster and cheaper than the earlier machines they replaced as well.


While Moore's law helps, languages are more expressive, safer, more performant, and have more batteries included, yielding a whole bunch of improvements.

We've had software and hardware gains, massive ones, and they compound.


> While Moore's law helps, languages are more expressive, safer, more performant, and have more batteries included, yielding a whole bunch of improvements.

I have to disagree. Compilers may have gotten a bit better at producing faster binaries, but new languages are gaining expressiveness and safety, sure, while very rarely gaining efficiency. Go and Rust are not faster than C or C++ and likely never will be (for one thing, C has decades of lead time). Go and Rust may be faster than C was 20 years ago, but that doesn't matter.


If Rust is significantly slower than equivalent C or C++, that's a bug. Please file it.

(And yes, sometimes, it's faster. Today. Not always! Usually they're the same speed.)


My point is more like this chart [0]. C has such a head start that Rust will probably never fully catch up. Come close? Sure. But C has decades of lead time.

[0] http://www.viva64.com/media/images/content/b/0324_Criticizin...


> My point is more like this chart

As steveklabnik noted, that is old data (which you would normally be able to see from the date-stamp in the bottom-right corner, but that's been hidden).

This web page is updated several times a month, and presents the charts in context --

https://benchmarksgame.alioth.debian.org/u64q/which-programs...

(You might even think that you can tell which language implementations don't have programs written to use multi-core and which do.)


That chart is extremely old. We are sometimes faster than C in the Benchmarks Game, with the exception of SIMD stuff, since SIMD isn't stable in Rust yet. (And it can fluctuate depending on the specific compiler version, of course.)

For example, here's a screenshot I took a few months ago: http://imgur.com/a/Of6XF

or today: http://imgur.com/a/U4Xsi

Here's the link for the actual programs: http://benchmarksgame.alioth.debian.org/u64q/rust.html

Today, we're faster than C in one program, very close in most, and behind where SIMD matters.

> But C has decades of lead time.

Remember, Rust uses LLVM as a backend, which it shares with Clang. So all the work that's gone into codegen for making C programs fast also applies to Rust, and all the work that Apple and others are doing to improve it further, Rust gets for free.


True, and I'm playing devil's advocate here. I respect the Rust community (of all the nu-C languages I respect it the most; I even did a poster on Rust 0.1 for my programming languages class). I will be quite impressed if they can pull off what has so far been an insurmountable task: beating an old-guard language (Fortran, C, etc.) in general-purpose performance; languages that have every advantage but design foresight. In my opinion, they are the most likely to be capable of it. If they do it, it will be a great historical case study in how to build a new programming language.

As an aside: as someone who has used LLVM to build a compiler, it doesn't quite work that way. Yes, Rust has access to those gains, but it may not be able to use them effectively (due to differing assumptions and strategies).


Totally hear what you're saying on all counts :)


> languages are more expressive, safer

Not generally, no. Maybe the popular ones become so, but that's mostly by rediscovering the languages of old, which had better safety and more expressive power.


Moore's law is just an observation, and the only way chips can actually be made is through sustained, coordinated, and meticulous teamwork.


How many of those languages were developed in the bazaar style? All the ones I can think of came from a single person designing things from first principles and taking inspiration from other cathedrals around them. Lisp, Scala, Ruby, Smalltalk, Prolog, TypeScript, etc. are cathedrals. The one bazaar IDE I can think of is Eclipse, and it's terrible. Visual Studio and Visual Studio Code, on the other hand, are much more sensible, and again it's because they're cathedrals.


Another bazaar IDE is Emacs, and I'd say it's not terrible, although YMMV.

I'm not sure how you can say Lisp is a cathedral; it's not even "a" anything. There's Common Lisp, Racket, Clojure, Emacs Lisp, etc., many of which are themselves bazaars. Ruby, for another example, may have started as one person's vision, but now the canonical implementation is a big multi-sourced effort, and there are other implementations with lots of uptake that aren't directed or blessed by the mainline Ruby.


A cathedral is never built by one person. It's not even entirely designed by one person. But you have people responsible for quality, instead of "anything goes" ad-hoc development.

You mentioned Common Lisp - it's a great example of a cathedral. A language carefully designed by a committee, which took into consideration all the previous Lisps that were in popular usage. You can tell there was a lot of thought behind the process.

As for Emacs and the bazaar, I think this is a good case study of the good and bad aspects of bazaars. On the one hand, you have an incredibly flexible tool, which makes it a perfect environment for experimenting with and optimizing workflows for text-based tasks. You have people writing Emacs modes for anything, including the kitchen sink, and it turns out many of those experiments offer a better workflow than standard, dedicated applications (especially when it comes to interoperability and maintaining focus/flow).

On the other hand, Emacs often requires you to hot-patch stuff here and there, and its language support is usually worse than that of a cathedral-like IDE dedicated to a particular programming ecosystem. And I say this as an Emacs lover. I still prefer Emacs to IDEs, but that's because of the flexibility benefits, which are unparalleled. I'm not deluding myself that Emacs has better support for Java than IntelliJ, or better support for PHP than Eclipse, or whatever. For language ecosystems that require complex tooling to back them up, it's a PITA to set up your working environment in Emacs. Hence the negative side of the bazaar - you don't get as much focused effort to make something of high quality.


> You mentioned Common Lisp - it's a great example of a cathedral. A language carefully designed by a committee, which took into consideration all the previous Lisps that were in popular usage. You can tell there was a lot of thought behind the process.

Common Lisp was designed as a unified successor to Maclisp, in response to an ARPA request.

Not to Scheme, Interlisp, Lisp 1.6, Standard Lisp, Lisp 2, LeLisp, ....

Scheme was further developed. Interlisp died, Standard Lisp had a Portable Standard Lisp variant and then mostly died. Lisp 2 was dead before, LeLisp died later.

The core of Common Lisp was designed in 1982/1983, decided mostly by a small team of Lisp implementors (who had their own Maclisp successors), with a larger group of people helping out.

In 1984 a book on the language was published, and implementations followed.

Standardization then came later as a more formal process, with the goal of creating an ANSI CL standard - again it was mostly US-based, paid for by ARPA. Areas were defined (language clean-up, objects and error handling), .... Proposals were made (like CommonLoops by Xerox) and then subgroups implemented and specified those (CLOS, ...).

> You can tell there was a lot of thought behind the process.

There were a lot of people involved. Not just the X3J13 committee. It was also a community effort at that time.

https://www.cs.cmu.edu/Groups/AI/html/cltl/clm/node3.html#SE...


A fun thing to do when you find someone who really likes Common Lisp: ask them to explain the loop sub-language in detail.

JK. They might starve to death before they finish.

The greatest thing about CL is that it has so many features that you can use to make a CL-lover look like a deranged nutbar.


It's entirely unclear why I should explain all the features in detail to someone. I'd never do that, and I know nobody who would be interested in it.

You don't need to know all of it in detail. It's good to have an overview and look up details as needed.

In a numerics library, I don't need to know every function in detail. I just look it up on demand.

Hyperbolic tangent for complex numbers? I don't know the details. Learning it all up front is fruitless. When I need it, I look it up.
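
For instance, standard CL's TANH already accepts complex arguments out of the box, so a quick check at the REPL is all it ever takes (an illustrative snippet, not something from the thread above):

  ;; Standard Common Lisp: TANH works across the numeric tower,
  ;; including complex numbers, with nothing extra to load.
  (tanh #C(1.0 2.0))
  ;; => roughly #C(1.1667 -0.2435)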

> They might starve to death before they finish.

Teach Yourself Programming in Ten Years. Why is everyone in such a rush? http://norvig.com/21-days.html

Java JEE in detail? Oops.

The Haskell type system in detail? Ooops.

> The greatest thing about CL is that it has so many features that you can use to make a CL-lover look like a deranged nutbar.

Wait until you get 'Scheme R7RS large'. Ever looked at recent language specs, for languages like Scala, Haskell, Java 8, Fortress, Ada, the coming Java 9, Racket, C++, ...?

One thing you need to learn about Lisp languages: the language is not fixed. It can have an arbitrary number of features, even user-supplied ones.

What you need to learn is not all the features of one construct. What you need to learn is how to learn the details incrementally on demand, while having an overview of the concepts.

If you think LOOP is large and complicated, have a look at Common Lisp's ITERATE: even more features, even more powerful, and designed to be user extensible.

https://common-lisp.net/project/iterate/

And it's totally great.
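
To give a flavor of the user extensibility mentioned above, here's roughly what defining your own clause looks like; the MULTIPLY example is adapted from memory of the iterate manual, so treat it as a sketch and check the link above for the exact form:

  ;; Assumes the iterate library is loaded, e.g. via
  ;; (ql:quickload "iterate") and (use-package :iterate).
  ;; MULTIPLY builds a running product on top of the built-in REDUCING clause.
  (defmacro-clause (multiply expr &optional into var)
    `(reducing ,expr by #'* into ,var initial-value 1))

  (iter (for i from 1 to 5)
        (multiply i))
  ;; => 120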


Now look, I like Lisp. Common Lisp was the first language I fell in love with. I remember it fondly, even though at this point I prefer Scheme.

But you have to admit that there's something a little... off... about having an iteration sub-language with a 43-page (in PDF) manual. And I mean sub-language literally; one of the advertised features of ITERATE is that it has a more Lispy syntax, so your editor has a hope of indenting it correctly.
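
To make the syntax point concrete, here's the same small loop both ways (an illustrative example of mine, not taken from either manual):

  ;; Assumes the iterate library is loaded, e.g. via
  ;; (ql:quickload "iterate") and (use-package :iterate).

  ;; LOOP: clause keywords sit in one flat form, so the editor has
  ;; few parentheses to guide indentation.
  (loop for x in '(1 2 3 4 5)
        when (oddp x)
          collect (* x x))
  ;; => (1 9 25)

  ;; ITERATE: clauses look like ordinary forms, nest inside regular
  ;; Lisp code, and indent like any other s-expression.
  (iter (for x in '(1 2 3 4 5))
        (when (oddp x)
          (collect (* x x))))
  ;; => (1 9 25)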


I find it very handy and use it all the time. Luckily there are LOOP versions for Scheme and Racket as well:

http://wiki.call-cc.org/eggref/4/loop

https://planet.racket-lang.org/display.ss?package=loop.plt&o...

LOOP is actually not a CL-specific language construct and did not originate there. It was invented by Warren Teitelman for Interlisp, where it was called FOR. From there it was ported/reimplemented and extended in several Lisp dialects.


> You mentioned Common Lisp - it's a great example of a cathedral.

Sure, but I didn't say Common Lisp was a bazaar, either. I said it didn't make sense to say "Lisp" was a cathedral, because there are many Lisps, and some of them are bazaars.


You are confusing a concerted effort with a bazaar. Just because a committee or a group of volunteers works on something as an open-source project does not mean it is a bazaar. Others have already said this, but the distinction is about quality and vision, not how many people work on it.


Haskell is famously the work of a committee, with multiple implementations and multiple partially-compatible extensions.



