The Montreal problem: Why programming languages need a style czar (earthly.dev)
98 points by ingve 4 months ago | 197 comments



You don't need a benevolent dictator for each language, but a design/code guru for each system: someone who communicates clearly and builds standards into the process.

Bad style is just confusion: noise, but not necessarily error.

The real problem is history and architecture.

E.g., Swift has its new concurrency model, the old Grand Central Dispatch, thread-locals, atomics, and even C-style locks, and each of these can be used in different ways. But your application will die unknowable deaths if you don't pick exactly one and stick with it.

The same is true for data modeling and liveness.

And given the (testing) cost of migrating legacy code, the best thing is often to keep doing things the same way.

So to my mind, the most valuable person is not the language maven or even the design theorist who can make a greenfield work of algorithmic art, but the one who can migrate a system, knowing both the old and new ways and the incremental steps that can be taken while the system is running.

Recent examples are Google moving to Go, Apple moving from x86 to Apple silicon and from C/Objective-C to Swift, and even Java iterating much faster these last few years. In many cases it's as much a people problem: getting stakeholders to be patient, oldsters to be flexible, and everyone to be happy with limited improvements.


This reminds me of Paul Graham's Five Questions About Language Design (https://paulgraham.com/langdes.html):

"When languages are designed for other people, it's always a specific group of other people: people not as smart as the language designer. So you get a language that talks down to you."

The same applies to these style standards. I agree with his third point in that article:

"3. Give the Programmer as Much Control as Possible."


I think that points 3 and 4 of that article are somewhat refuted by the popularity of Go.

You could argue that Go ultimately does give you as much control as possible, but it certainly doesn't do it like the Lisp that Paul Graham admires, and Go definitely doesn't focus on brevity beyond avoiding Java-style magic incantations and boilerplate.

Interestingly, Go is designed for “yourself and your friends”, and they ended up in a very different place from the hackers that Paul Graham admires. It's a language spawned from the pain points of large corporate software projects that wins a lot of sympathy from smart engineers who are working on large corporate software projects. The fact that it also translates well to small projects is definitely an engineering feat.


The single biggest factor in Go's popularity is that Google is pushing it.

Then I'd say concurrency, native compilation, static linking...

The simplicity of its language constructs is IMHO one reason why Go is not more popular. The point seems moot anyway: Go is mostly used by experienced programmers, usually having experience in other, richer languages.


Go is a language made for people doing systems programming who like static typing. Compared to popular alternatives like C# and Java, Go is a breath of fresh air.


Go was intended for systems programming, but is rarely used for that.

(Java and C# don't compete in that area at all)


Go has a stupid definition of “system programming”. It uses it to mean “low-level network stuff”, and absolutely not as something where low-level languages would fit.


Go is marketed as being designed for system programming. But all of its features seem to be designed for web services.


Don't bundle C# with Java in this area. It is way more low-level. Give it a try and it will surprise you :)


In what way is an objectively more verbose language than C# and Java a “breath of fresh air”? Especially one so badly designed, with gotchas everywhere (defer being function-scoped; they even got loop-variable capture wrong at first, when that was a well-known problem 50(!) years ago; etc.)


What exactly is "systems programming"? I thought that had to do with manipulating hardware? Usually Java and C# are more associated with application programming...


Not sure of any official definition, but it usually means that you're creating a system that applications made by other people will be built on top of. For example, writing Postgres or NodeJS is doing systems programming, writing a web backend atop those two isn't.


I was thinking of Go when I re-read that article. Unlike many here on HN, I am not a fan of Go because I find it far too opinionated and untrusting of its users. While technically it might have been written for Pike's and Thompson's friends, it does feel to me like a language designed to prevent poor programmers from making mistakes.


> it does feel to me like a language designed to prevent poor programmers from making mistakes.

That's Rust. I would not call _any_ language containing some kind of nil pointers to be "designed to prevent poor programmers from making mistakes". Go is mainly designed to be "easy to read" (and excels at that if you are used to C-style syntax).


Easy to read? That would be Python. I would say Go is designed to be fast to learn for experienced programmers.

Null pointers are a free runtime check for invalid assumptions. I don't understand the fuss about them being bad at all.


I find Python quite hard to read, honestly. The lack of typing in most Python means that it isn't immediately clear what a given piece of code does without diving into its implementation. You have to rely on docstrings (good luck) or descriptive naming.

The issue with null pointers is that they are, as you said, a stand-in for an invalid state. However, most null pointers don’t prevent you from trying to use the underlying data, which can, among other things, cause crashes (think not checking the return value of malloc). Additionally, it doesn’t make formal the litany of possible forms that a value can take — if you codify possible states, then you don’t always need to check everything. I’m currently working to port security firmware from C to Rust, and I’ve found I make fewer state checks, because the data I’m working with has bounded state — the data given cannot be in an invalid state.


Types really only help when they're primitives. You're going to have a million random classes that are only used in a few places. Seeing `ConfigManagerFactory config_manager_factory` doesn't help. What helps is avoiding excessive OOP and making sure objects are printable for debugging, IMO something JavaScript does a lot better.

Re null pointers, in Java or Python they raise an exception. Isn't that ok? It doesn't end up doing something invalid like in C.


Agreed. However, IMO Go's designers have good taste. That's hard to quantify but you know it when you see it and I see it in Go. Not so much in most other languages. Again, IMO.


Arguably, Go is the only language I couldn’t even infer the syntax for before learning it.

I can quite easily follow any ML- or C-like language, even if it's the first time I see it, and then Go just writes some bullshit list of tokens where you can't even decipher which is a type and which is an identifier. And then it does some receiver syntax in function definitions that is absolutely unhinged.


I really dislike Golang, everything about it, even the name, and this is one reason. C, C++, and Rust can do systems stuff; Java, JS, Python, and even C# can do applications. All with pretty intuitive syntax, except for Rust kinda, but Rust has good reasons for that.

I don't get why Golang, ObjC, or Swift exist other than some company pushing them, and their guiding design principles are "be different to be different."


Go is popular, but so are Javascript and C++. I don't really know that you can draw any sort of conclusion from one language being popular.


> Go definitely doesn’t focus on brevity

It does, arguably to a fault.

* Single-letter variables

* Capitalization instead of private/public


Requiring explicit error handling instead of exceptions makes it not brief. Lots of things will fail, and every single time:

  f, err := Sqrt(-1)
  if err != nil {
      // handle error
  }
Probably async/await would make things briefer too, idk, haven't used Golang enough to tell.


It seems like you're conflating "options" with "control". The two Scala examples shown in the article are two ways (or options) of writing the same code, but having two ways of writing this doesn't really give you more control over what's happening.


Golang talked down to its users about generics for a long time, then they finally gave in. Give it some more years and they'll finally have exceptions and async/await.


I don't know why this goes against programmer culture so much, but for me rule number 1 should be: optimize for IDE integration. And probably number 2 should be readable stack traces.


In general, I care the most about it being easy to debug by a random person who's never seen it before and barely knows the language either. So #2 stack trace as you said. Add in simple syntax. Lots of languages, libs, frameworks, etc show off how nice they are when everything is happy and ok; that's easy.


PHP has:

1. https://www.php-fig.org/psr/psr-2/ - coding style guidelines that are then used in tools like phpcs for automated linting

2. phpcs allows for defining additional rules that can be used in automated linting - https://gist.github.com/andreyAndrienko/19941c4b621f3212976a...

3. phpstan used for static analysis of files, allowing for defining architecture rules like: "this method cannot be used" or "this class cannot be used in this folder"

4. phpat (https://www.phpat.dev/examples/) (and architecture plugin in pest - https://pestphp.com/docs/arch-testing) allowing for defining architecture rules


I work on a large PHP project; we've been using a combination of phpcs and a non-machine-enforced style guide, and the project is quite comprehensible. Some developers have different styles, but it's fine. Even gofmt will never force people to adhere more to functional or imperative programming styles, and that's a-okay - I love my functional code, but I understand that other people comprehend things in different manners, and as long as it works I'm not going to go in and edit someone else's approach.

Overenthusiastic machine-driven style enforcement can get pretty unreadable - do you have a "one arg per line" rule? Do you even enforce that in front-end validation where you have thirty parameters? Alternatively, do you compromise the strength of your type safety and pack them all into a params[] just so the code is less messy? I remain skeptical that there is a reasonable one-size-fits-all solution to styling, but I'm quite certain that Python isn't it, if it exists.


"front-end validation where you have thirty parameters"

When your hammer is a type system


Quality readable code does not depend on these sorts of styles. As a matter of fact, you can write totally opaque code, in perfect style. I have seen this.

If using getOrElse in one function and using an if in another is confusing, I am very sorry: things differ, things change, and it's best to just learn to read code.


> If using getOrElse in one function and using an if in another is confusing, I am very sorry: things differ, things change, and it's best to just learn to read code.

When engineers tell me these sorts of style changes confuse them, I think lesser of them. It's probably not socially acceptable to say, but there it is.


i don't think style-enforcers are 'confused' and unable to read code, it's that consistent style makes reading code a LOT easier.

if the same thing is being done in two different ways, you have to actually read and think about the both chunks of code to see that they're the same.

with consistent style you can essentially speed-read a codebase because things are predictable.

with inconsistent style, you're constantly reading slightly-different pieces of code and figuring out "is this different purely for style reasons, or was there a reason that one section if different?"


You speak of using a consistent, established pattern to improve comprehension, but you eschew the use of capital letters when writing English.

Interesting.


I also make sure to include typos.

Just to be consistent with the rest of the internet.


You're just misunderstanding. They're not saying they literally can't understand the code if it has inconsistent style and formatting; they're saying it's more work.

It's like reading a book with no paragraphs. Of course you can do it. Nobody is "confused" by the lack of paragraphs. But you're an idiot if you think paragraphs don't matter, or don't make reading easier.

I tend to think less of people that don't understand this, or don't care.


There's also the context to consider: if I'm scanning someone else's code in an unfamiliar subsystem, there's a high chance there is a bug or something time critical. Not the time for extra annoyance and work.

Finding code that suddenly zigs where the rest of the system's code zags takes extra mental overhead. It's annoying, and misleading. All too frequently bugs are coupled to such style violations, adding extra frustration around such misleading code.


> All too frequently bugs are coupled to such style violations

That's a really good point actually. Style violations can make the code look like it does something that it doesn't, `if (a=b)`, missing curly brackets in C/C++, missing semicolons in Javascript, mismatched brackets, etc. An autoformatter can make it more obvious when you've done something wrong.


You're not alone, I tend to think the same thing. I really don't notice a difference between reading my team's own code vs reading our dependencies code or other random open source projects. Despite the fact that none of these repos enforce the same style. My most charitable interpretation is that software development is filled with people who have a touch of OCD and all these inconsistencies affect them in ways I simply can't comprehend.


I tend to agree, although certainly, it would possibly be poor style to mix them together frequently in the same area in an egregious way


As a Montrealer, I'm not sure I'm understanding "the Montreal problem". Is it simply that we are a very bilingual city? If so, it feels like a pretty crummy analogy, because our bilingualism is a boon.

Is it an architecture thing? Because that's another great aspect of Montreal.

Maybe something like "the peanut-butter-and-anchovy" problem or the "circus at a funeral" problem.


Author here. It wasn't meant to be a knock on Montreal, but on code bases that have very apparent different eras and styles.

The idea actually came from a podcast interview with a C++ expert. We talked about some parts of any large C++ code base looking like Old Montreal and some looking like Art Deco or whatever other style.

To me it was a very accurate and memorable metaphor, but I get why it hits wrong for many.

Montréal is a beautiful city that has a mix of styles. That's great for the city, just not great in a big code base.

Clearly I need a better term.


Not a knock on Montreal, just a flavorful comparison to a real-world phenomenon.

> But as code grows, it’s like Montreal. Every part of the city is different.

---

It's like "Cathedral vs. Bazaar." No slight against cathedrals or bazaars; these are just colorful analogies for design/organization philosophies.


That's why I love the fact that Elixir has a built-in formatter and pretty much everyone in the community just uses it. It's also not very configurable, so there isn't even a point in having these kinds of discussions.


Elixir's formatting is one of the several things that annoys the hell out of me. I can't pull out an example at the moment, but it absolutely loves to move comments out of a block where they're relevant to the exact line, up several lines to where the block itself begins, losing a lot of the immediate context and forcing me to write something significantly more verbose.


Elixir mentioned!

Question: do you use Elixir professionally? If so, how did you get into that position?

The company I currently work at is planning to build a large messaging system using C# and a ton of memory-mapped files, but I think just leveraging Elixir/Erlang and the BEAM would make life easier.

Not really sure how to convince bossman of that, though.


IMO it's better to think of programming languages in the same way as frameworks in cases like this: because they are. Frameworks present a tradeoff in many areas, but the big ones tend to be solving issues specific to some domain, and the "framework cost" which is needing to work within the constraints and expectations of the framework. Typically the hard part about learning a new programming language is not learning the syntax, it's learning the semantics and the standard library, something that you will pay some amount of when adopting a framework. Maybe to a lesser degree, but most programming languages tend to have consistent semantics around core things like control flow.

If you want my personal opinion, I think you'll spend more time getting cut by the razor-sharp edges of `mmap(2)` (or whatever the Windows equivalent is) than picking up Elixir, much less the difficulty in handling edge cases such as extremely high load and network failures. Elixir (or rather, BEAM + OTP) is built specifically with the problems you will face in mind, and it would be naive to not at least take a serious look at it.

I'll also add, that while I don't know the specifics of your system, message broker/passing systems tend to have very hard boundaries with serialization in-between, making it easy to build polyglot systems.

Edit: also w.r.t. picking up a new language, I'm referring to the cost of a team picking up a new language, not you as an individual :)


You risk losing a lot of performance by switching away from .NET (assuming you're using anything that isn't EOL) to Elixir. mmap is... alright, but it's not a panacea, and the 1BRC challenge was a good demonstration that you can reach identical performance, often with less trouble, by leveraging `FileStream`, `SafeFileHandle` and `RandomAccess` instead (the latter is as low-overhead as it gets; reading into a stack buffer with a `RandomAccess` read is barely more expensive than calling `pread` from C directly).


It’s all greenfield right now, so we have no metrics.


I'm not the person you're asking, but we used Elixir professionally at work - still do, I guess, since it's still running - though we're moving off of it.

The biggest problem is that, for the most part, it's a pretty, and pretty cool, language - but it doesn't have a huge community you can hire out of. So if all of your Elixir experts leave, and you're faced with a problem you cannot solve, it becomes significantly harder to figure out what the correct solution is. And for us, especially, using Elixir didn't give us any huge benefits that other languages couldn't also provide.

So you don't need to convince the bossman that Elixir is the correct language to use; you need to convince the bossman that it's a solution that will still be maintainable when you and several of your coworkers who wrote the thing leave for greener pastures.


Same with Dart. You may not like the style too much, but at least it's consistent and everyone uses it.

Dart also has a kind of "standard lints" (one for Flutter, one for just Dart): https://dart.dev/tools/linter-rules

Even though you can write your own lints to verify every little detail of the code, the fact that standards exist is great.

Check out the rules enabled by the default lints, it's pretty amazing: https://github.com/dart-lang/lints/blob/main/rules.md


Perhaps a controversial opinion, but to me this really just shows the major flaw with our current text-file-based model for programming languages; it mixes the actual content of our programming with its presentation. It's time we begin to move past text files for storing our programs and move to directly using a shared intermediate representation (something along the lines of an AST). Then we can move past all this nitpicking about code style, indenting, etc.


There were many attempts at that; JetBrains MPS (Meta Programming System) comes to mind, but all of them feel clunky and constrained.

Many proprietary programming systems do follow this model, like Mathematica and visual IDEs like PLC programmers. But to bring it to wide programming audience we need to agree on an open format for representing syntax trees and implement several editors for typical use cases, from ultra-lightweight slow-connection-SSH-capable to more or less fully featured IDEs. Though I don't know much details about them, tree-sitter and language server protocol could be centers of crystallization for such software. I hope we'll see interesting developments in this space.


But if you are using if/else vs. .getOrElse vs. map to do something, the ASTs would be different.

Also the whole which package manager to use in Python, or which Async lib to use, or which part of std-lib is to be avoided in certain cases or so on.

I think those things need specific guidance and agreement. AST would help with code layout though, but lots more to do in a language with many ways to accomplish something.


I think that's an interesting solution. The thing I've always wondered about doing this, though, is how it would work for storing code that has syntax errors. dartfmt has the option of just giving up, as you're still left with text. What happens when you want to share code that has those kinds of errors?

Not saying this isn't a solvable problem.


If we assume code is shared in AST form, but people still end up editing a text representation of the AST (rather than directly editing the AST through a visual interface, as in Scratch et al), then you just wouldn't be able to share your syntax errors - realistically you'd put them in a comment or something, or exchange out-of-band.


Maybe if it has syntactical errors it’s text and not code?


Sounds appealing, but I think in practice this would be very unpleasant to work with. ASTs are almost always significantly more verbose than the code that generated them which makes both reading and writing more painful.

I say this from my own experience designing a language. Writing the AST directly was at least 2x as long as the language code.


I agree with this if the AST is a YAML or JSON document, but maybe if it was visual then you wouldn't have that problem (but I imagine it would exacerbate the style problem while also eliminating many of the advantages of text formats).


> using a shared intermediate representation (something along the lines of an AST).

This is more of a past idea (e.g. sexpr) than a new one, fwiw. Whether or not it's due for revisiting is unclear.


I like this idea, and I feel like it should exist for something like Racket. You would need to write an unparser that takes an AST (or some canonical dialect like lang/racket) and maps it back to text (a function of target language + style config) for the purposes of editing, then saves it as the AST again. While I don't think you can just invert a reader function, you could probably ask an LLM to do the inversion for you and check it. It's pretty much decompilation, after all.



Scratch was ahead of its time


thank you for this, I've always thought that I should be able to program in the way I like most and upon saving it would only produce an AST

that combined with a personal "stylesheet" would allow my editor to display it the way I like

only the AST would be version controlled and other users do the same

that's the dream


Strange to come up with a metaphor for this like ‘the Montreal problem’ but not have its proposed solution remotely resemble the ‘Montreal solution’. Does Montreal have a style czar? I don’t think so. Nor would most people say it needs one.

To the extent that these problems exist, they are per codebase problems and allowing per codebase solutions is fine.

Languages can support multiple idiolects where different idioms are prevalent. Competent developers fluent in a language might need to learn to code switch - the way data scientists speak Python behind closed doors calls for a wholly different level of formality from the way Django coders might speak it. C# devs who build enterprise apps on Azure don’t use the same inflections that their fellow C# speakers building games in Unity use.

The important thing is to read the room and choose how you express yourself in a way that’s appropriate for the people you’re with. Are these people going to appreciate your fluent continuation passing style free expression? Or are they more comfortable if you express yourself in classes and methods?


Beyond even that he takes a non-unique urban trait of Montreal and titles it a problem without ever explaining why it's a problem for Montreal, much less how Montreal did or did not solve for it in any way that is analogous to his Style Czar.


He talks about the Montreal effect, and says it's a problem when it shows up in code.

The title was changed, the one here on HN is wrong.

He doesn't say it's a problem for Montreal, so he doesn't need to explain that.

As a Montrealer I think the effect isn't really present here, but whatever, if it helps him make a point.


The section that delves into the ‘effect’ is under the heading:

> The Montreal C++ Problem

And indicates that having old ‘stuff’ around when new ‘stuff’ arrives is the cause of ‘the Montreal Problem’:

> C++20 had a lot of good ideas, but lots of code predates that standard. And so drift occurs. You either don’t adopt the new way or you end up with a code base with more than one style. If you do the latter, you end up with the Montreal Problem.

(Emphasis mine)

The next section makes it seem like the ‘problem’ is about culture and language rather than, say, architecture:

> If you are doing work in the old-Montreal code section. It’s like a different dialect. You now need to know multiple dialects of the language and when and where to apply each one.

And then suggests that the new-Montreal old-Montreal problem is a divergence of community, which is ‘splitting’ people ‘apart’:

> So, how do you evolve a language without splitting it apart? This gets trickier with a whole community involved. ... And with that, the community fractures.

Overlaid with the idea that the solution is a ‘czar’ of some sort the metaphor feels clumsy and kind of inappropriate.


> He doesn't say it's a problem for Montreal, so he doesn't need to explain that.

He certainly did before he changed the title and article. It would be nice if the author at least had a note on why and when he updated the article...


Exactly. The Montreal "problem" doesn't sound like a problem at all, but rather a defining feature that gives Montreal a distinctive and worthwhile character... part of what makes it a destination city for people of a wide variety of backgrounds.

Isn't this what we call "dynamism" and frequently a highly desired trait for a group?


I don't know why but my reaction to this is a strong negative. It seems like exactly the opposite of what we should expect out of programmers.

The whole Go philosophy rubs me the wrong way.


> exactly the opposite of what we should expect out of programmers.

I mean this only jokingly: Who gets to decide what programmers are supposed to be like? You?

As someone who tries on a new language every year or two, I love using languages with formatters. Worrying about how many spaces to indent with or other arbitrary things is the opposite of what this particular programmer likes to worry about--I want to jump right in and solve problems with a language, and let the formatter make the code look like it came from a textbook.


I'm arguing for a higher bar. A programming language doesn't need a style czar proposing what is and is not idiomatic. The grammar of the programming language already does that, and we need to trust people to do engineering and decide for themselves how to use that grammar.


  > we need to trust people to do engineering and decide for themselves how to use that grammar.
I agree with this very much, but one snag point in a lot of projects is pull-request reviews: when there are disagreements about "what is readable" or "easy to understand", a lot of time gets wasted going back and forth. Having a style-czar-approved format can make things smoother, because there isn't much to argue about any more.


I've enjoyed working on teams that use an auto-formatter, like black in python. Using a formatter that's not my favorite is much more enjoyable than not using a formatter. Ditto for as many style issues as you can automate. It sucks to spend code review making humans spend time (re)arguing things machines can say just as well.

So if someone proposes that their language should have stated idiomatic style, I'd go further and say that a language style guide should come with tooling to automatically fix and automatically detect issues in as much of the style guide's scope as reasonably possible.


> It sucks to spend code review making humans spend time

For some humans, they don't have to be "made" to do it. They love squabbling over style issues and rearranging deck chairs. I will never, ever understand caring about style issues, so auto-formatters are a gift from the heavens. Now the guy that used to waste time in code review can waste time tweaking the formatter rules, and everyone else can move on with their life.


> Now the guy that used to waste time in code review can waste time tweaking the formatter rules, and everyone else can move on with their life.

I worked with one such person before; unfortunately, after introducing formatters he found a new thing to be pedantic about and started annoying everyone with that instead (it was C++, and his second obsession was ensuring that all objects are always moved correctly and no CPU cycle is wasted while our enterprise app waits on the network to resolve tons of API calls).


C++ for high-level code tends to be this way. Waste tons of focus on saving a couple of CPU cycles, but the whole thing is waiting on RPCs anyway.

Also you probably used less efficient algorithms because the smarter ones were too annoying to implement in C++, and there's possibly a solid Python lib that does it with hyper optimized C code anyway.


Yeah I agree. The language lives in a weird niche where both intricate low-level details are visible, but also supports abstractions only visible from outer space.

You always have some low-level detail which can be nitpicked about, even though it does not matter in the slightest.

Admittedly, choosing C++ for such a high-level project wasn't a good idea at all, but that's what the first few devs knew best, and they stuck with it.


Oh and those measures that save 2 extra CPU cycles do things in a way more error-prone way, which some day causes memory corruption.


... which then costs you more CPU cycles to debug than you ever saved.


I care about style. If I’m reading the same code day in and day out I want it to look nice. Not nice code is a distraction, maybe it’s an OCD or a ‘visual misophonia’, but inconsistency is one of the first things I notice. I truly don’t understand why some people don’t care how their code looks.

I have noticed that some people will present tidy code during interviews and amazingly drop that habit once they’re employed.


It's not only about it looking 'nice' (although there's that component as well as an added bonus).

Once you've read tons of code formatted with the same formatter, passing the same linter rules and following the same general idiomatic rules, it becomes so much easier to read and review new code that adheres to all the same rules.


This is definitely true. Your eyes tell you that something's off long before your brain figures out what it is.


I passed the style test at work then instantly stopped following any of the rules. My teammates who also don't care approve the changes. We deal with lots of bugs each day, and none of them have ever been because things like using `auto` in C++ when we shouldn't have. There are far better things to spend time on.

And if someone does care about something in particular, often they will contribute to an auto-cleanup tool that makes it a certain way. So there was no reason for me to do it manually.


Which came first, the bugs or the nonchalance?


The bugs of course


Problem is even with auto-formatters, people will squabble over pointless stuff. I just do whatever comes naturally and assume someone is going to want it a certain other way, then I change it without arguing. That goes for designs too.


To reiterate OP's point about scala, in my idiomatic scala, I'd use a 3rd technique:

  def ABSOrSeven(maybeNumber: Option[Int]): Int = maybeNumber match {
    case Some(int) => Math.abs(int)
    case None => 7
  }
   
Somewhat more verbose, but I think it's the clearest, should involve fewer function calls than the getOrElse version, and has no unsafe operations like a (safely guarded) `maybeNumber.get` call.

The point though is that the author is correct that scala provides too many ways of doing the same thing such that disagreement is inevitable.
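For reference, the getOrElse style mentioned above can be sketched like this (function names are mine, not from the article):

```scala
import scala.math.abs

// Combinator style: map over the Option, then supply a default.
def absOrSevenViaGetOrElse(maybeNumber: Option[Int]): Int =
  maybeNumber.map(n => abs(n)).getOrElse(7)

// And yet another way: fold collapses both cases in a single call.
def absOrSevenViaFold(maybeNumber: Option[Int]): Int =
  maybeNumber.fold(7)(n => abs(n))
```

All of these are equivalent; which one reads best is exactly the kind of question a style guide could settle once and for all.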


That's nice. The 3.1 version they used felt hard for me to understand immediately; a little too abstract.


IMO the solution is to have the formatter provided by the compiler/CLI/LanguageServer. rustfmt, go fmt, terraform fmt, etc. And then the language authors are in charge of how it's supposed to be formatted too, and they should try to make that formatter 'strict' (IMO) in the sense of not allowing (accepting as input and outputting unchanged) two semantically equivalent styles. Don't leave it to third parties to fight over becoming the de facto style du jour. Don't be python, and certainly not javascript.


Language-provided formatters immediately end so many conversations about "correct" formatting that I consider them a requirement for a mature (read: "Would use in production on a project people are paying money for the success of") language these days, along with things like "IDE support," "static typing and/or sanitization analysis," and "a usable debugger."


This generation seems to love rules and forcing others to bend the knee. I don't get why, but it's highly offputting.


Some of us who prefer things more buttoned-down by tooling are probably older than the generation you’re thinking of, and are mostly just sick of having to figure out which parts of Haskell some hipster has decided to try to replicate in a language that’s very bad at being Haskell (so, most languages) before we can read their code.

I’ve seen entire language ecosystems get overwhelmed by fads of this sort, and dealing with these waves of pointless trend-chasing is exhausting.


How old do you think the author is? He’s not some “new generation” dev.


For whatever reason, the current zeitgeist is all about enforcing as many things as possible via tooling versus actually learning core principles.

I think this is because all of tech seems to be culturally downstream from FAANGs who have weird problems like "we have 500 devs making multiple commits a day and we can't trust them to do anything more than close tickets." Those are wholly self-inflicted problems, but the rest of the industry laps them up as if they are the secret to being as "great" as FAANGs. Couple that with tons of people who have fewer than 7 YOE in tech and/or are switching into it and you see there are very real political advantages to "keeping up with best practices."


I don’t see how using tooling prevents you from learning core principles. As in, what _core_ principle am I missing on because I don’t manually indent my braces? Note that this conversation is about _style_.

The zeitgeist re style is very much “don’t fret too much about it, use an automated tool and move on”.

Compared to the flame wars of the past about tabs vs spaces and other inconsequential things we’ve come a long way, and for good.


It doesn’t. But my snark comes from the fact that we seem to talk even less than before about actual engineering and more about the tools of software.


It's your choice to talk about something else. Nobody put a gun to your head - I hope - and forced you to read this story and comment on it.


> talk even less than before about actual engineering and more about the tools

Isn't that just a consequence of how much--or little--context is shared with (or easily-communicable-to) Random Strangers On The Internet?

For example, I've got some architectural issues going on, but it's a lot of work just to summarize it in a way that doesn't require local/domain knowledge. In contrast, talking about the weather or tabs-vs-spaces doesn't need much set-up.


that's also engineering.


Enforcing mundane things with tooling is 100% orthogonal to core principles of programming.

I say this with 20+ years of programming experience: a project that doesn't do that is a project run by amateurs.


I suspect FAANGs got infected by these ideas by hiring from more conservative institutions driven by their financial bottom line. I've never worked at FAANGs but I've seen that arc at other companies as they scale. Practices that foster speed and innovation are supplanted by ones that favor predictability and safety.


> the current zeitgeist is all about enforcing as many things as possible via tooling versus actually learning core principles.

That argument doesn't make much sense IMHO. By enforcing as many things as possible using tooling, we now have more time to actually learn "core principles" instead of figuring out how to configure/use/... $IDE/$EDITOR/$TOOL/$COMPILER/$OS to get comparable results (or anything to build at all).


scale means making the problems faced by 20% of your community a problem for all of your community


I think there's been a deskilling over the past decade and a bit.

Last job we were having trouble communicating and collaborating with some acquired teams, and I figured something out that helped a little - a culinary analogy - my team knew how to cook, let's not aggrandize ourselves too much but say we were at least short order cooks at the diner, while these new teams were McDonalds' workers. McDonald's workers aren't all idiots or lacking work ethic, but they know how to follow the laid down McDonald's burger production processes, not how to cook.

So there's gonna be a desire for McDonald's processes.


Ah yes, the old "my generation is the best generation" cliche.


Imagine you have a ‘style czar’ for your language, telling everyone what ‘idiomatic’ code looks like in their language. What do you do when, one release, they go completely crazy?

Or what if an apostate arises and promulgates a radically different vision of what ‘good style’ is in the language?

This isn’t a purely theoretical risk. The ‘Automatic Semicolon Insertion’ heresy is an active alternate reality in the JavaScript world - with some developers willing to argue to the death that semicolon-free style is the only professional way to write JavaScript code; while many developers carry on coding in the C-like syntax JavaScript was clearly intended to use blissfully unaware that they are actually on one side of a violent debate.


That's why it's helpful to have a czar or authority to consider opinions and state a preference: to converge the community and stop wasting time on these debates.

Also, nobody has to follow the guidance of the 'style czar'. Each team and community could deviate in whatever ways: "We follow the standards except for these changes about semi-colons." I think just having a standard can be helpful even when you have your own style, because you can contrast yours against something concrete.


Rubocop in Ruby has the great trailing commas in hash literals debate.

That is what has solidly convinced me that programming languages themselves, like the compiler, absolutely need to fuck off with the rigid style prescriptions.

There does need to be a solid, configurable linter so that any given codebase can have its own strongly held and automatically corrected opinions. I don't trust any one person or a cabal to determine style for everyone using the language though.

When it comes to something like that Javascript semicolon debate, the way to solve it is linters with a flag for either set of people, so they can be their own dictators over their own codebase.


If that happens, you fire them?


Cyclomatic complexity (and the other new kind of complexity that I forget the name of) are a good way to measure code readability/reasonability. We have a few rules of thumb for JavaScript that our junior devs have absolutely railed against, but some of them are from code standards that have persisted in the industry for 20+ years, and to me at this point are somewhat standardized.
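As a rough illustration (the exact counting rules vary by tool, and this isn't tied to any particular linter), cyclomatic complexity is roughly 1 plus the number of decision points, so two behaviorally identical functions can score very differently:

```scala
// One linear path through the function: cyclomatic complexity 1.
def clampFlat(x: Int): Int = math.max(0, math.min(100, x))

// Same behavior spelled out with branches: two decision points,
// cyclomatic complexity 3.
def clampBranchy(x: Int): Int =
  if (x < 0) 0
  else if (x > 100) 100
  else x
```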


> and the other new kind of complexity that I forget the name of

If you happen to remember, please update your comment or respond to mine.

[I believe that this doesn't exist OR it isn't useful, but I really want to be proven incorrect.]

EDIT:

Were you referring to the architectural complexity mentioned here: https://swizec.com/blog/two-types-of-complexity-and-their-im... ?


Cognitive complexity? Cyclomatic and Cognitive are the two that Revive reports on.


> Don't write custom operators. Just don't. Really.

Then how exactly am I supposed to pretend that I'm writing Haskell at my day job??


Well i've been coding professionally for over 30 years, and i've seen languages and coding standards come and go, and in all that time, i've never once felt that what was holding back a project was the coding standard, or lack of one.

Maybe i've got lucky? Or maybe this is a solution looking for a problem? All I know is that whatever you adopt as a standard today will look weird and outdated in 20 years time, so frankly, don't sweat it.


You got lucky.

I’ve seen a codebase where one guy wrote all code in Obfuscated C Code Competition style with all optional whitespace removed. Huge solid blocks of spaghetti code with the line breaks inserted only to make the code line up visually into rectangles.

Another guy at the same company thought that a good naming convention is to name all identifiers starting with ‘a’ alphabetically and then extending that with ‘aa’, ‘ab’, etc… when he ran out of single letters. I mean all identifiers including namespaces, class names and methods names. Code read literally like this:

    z = g.h.p( ab, j, y );
At least with the IOCCC style it was possible to simply use an IDE reformat to fix it, but this basically required decompilation to make sense to any other human.


How did the authors of either example maintain their own code? Is it possible that they had a readable local dev version and passed it through a minifier prior to pushing to production?


Some people's brains just work this way. Here's an example of a somewhat popular and regularly maintained library written in a similar style: https://github.com/enkimute/ganja.js/blob/6e97cb45d780cd7c66...

Another example is Jarek Duda's paper on the ANS compression algorithm. A brilliant guy by all accounts, but his diagrams are... well... impenetrable. Check out Figure 4 on page 10: https://arxiv.org/pdf/1311.2540.pdf

There's lines and arrows going every which way. There's about five different concepts layered on top of each other into one diagram, with nothing to obviously disambiguate them. Like... what is the black dot between C(s,x) and D(x), and why is it pointing in random directions!?

The most extreme and stereotypical version of this style are the billboards written by some homeless people. You can probably picture it already in your mind's eye: A wall of very dense text with little whitespace or structure, and a mix of fonts and colours seemingly at random.

I had a brilliant mathematician friend who wrote like this. He would squeeze an entire semester's worth of study notes into a single sheet of paper, on one side. It was impenetrable gibberish to everyone else, but the colours and 2D positioning let him build a mental mind-map.

For people like this, if you reformat their code even a tiny bit, their mental map is invalidated, and they lose track of it completely and become upset. I discovered this (the hard way) when applying automatic code formatting tools to the codebases I mentioned previously.

Personally, I find this type of thing to be absolutely fascinating, because it's the intersection of many scientific fields but belongs in none, and hence is under-studied. There's elements of pedagogy, psychology, literacy, neuroscience, computer science, etc...

It remains an open question how we can get large groups of neurodiverse humans to collaborate on a codebase when they don't even "read" or "think" in compatible ways!

Enforcing a single style may work for most developers, but clearly not all.

PS: Even on an anecdotal level, when I was a beginner programmer I could only read a single brace and indentation style. After two decades, I can now read almost any style, including inconsistent and outright crazy indentation without even noticing that something is amiss.


I usually see this style from people with a mathematics or physics background instead of CS. Their coding style doesn't take maintainability or collaboration with others into account, so these programmers produce code which no one else is able to change when they inevitably leave the company after some strategic technical direction change happens which they disagree with.

Allowing them to dictate your coding standards is putting your company at risk. If getting them to conform to (tooling-automated linter) coding standards is impossible, I'd personally isolate their code to specific packages with clearly defined interfaces so you have some hope of replacing it after their departure. Make a working agreement with them that coding standards apply to any PRs outside of their pet projects or they'll get rejected.


Nobody is really saying that this is going to make or break a project. That's a straw man.

I'm sure you've seen plenty of projects succeed without tests or CI or code review or comments or bug trackers or static types or linters or version control or ...

None of those are required for a project to succeed, but they're all best practices that we have learned make things better than not doing them.


There isn't much doubt in my mind that a consistent style helps with reading code, as you can spend less time parsing the structure of the code and focus more on the logic. That being said, the purely syntactical parts (i.e. the things taken care of by gofmt and dartfmt) seem like a limitation of storing code as text.

If code was stored as an AST, everyone could tailor the rendering (tabs vs. spaces, where brackets go, whether to put a return on the last line in Kotlin) to however they like, and tools like gofmt would be obsolete. That's not to say this approach is without its downsides, though.


> To highlight the problem, take Scala. A language I absolutely love, by the way. But it’s got this one big issue. There’s no idiomatic Scala. It’s way too flexible.

> I can write a single file, calculator class and start with a Java style:

  // Returns, braces and semi-colons
  def getResult(): Double = {
    return result;
  }
    
  def multiply(number: Double): Calculator = {
    if (number == 0) { 
      println("Multiplication skipped: number is 0"); 
    } else { 
      result = result * abs(number); 
    }
    return this;
  }
Scala has this problem, but this is a terrible example. Because literally no one writes like this.

More plausible is:

  def getResult: Double = result
    
  def multiply(number: Double): this.type = {
    if (number == 0) {
      println("Multiplication skipped: number is 0")
    } else {
      result *= abs(number)
    }
    this
  }
And even that's a bad example.

Because still no one writes Scala like this, with mutable chaining. (Java yes, Scala no.) It's either mutable, or immutable chaining.
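A sketch of the immutable-chaining variant (the class shape is assumed, since the snippets above only show the methods):

```scala
import scala.math.abs

// Immutable chaining: each call returns a new Calculator instead of
// mutating `result` in place.
final case class Calculator(result: Double = 0.0) {
  def multiply(number: Double): Calculator =
    if (number == 0) {
      println("Multiplication skipped: number is 0")
      this
    } else copy(result = result * abs(number))
}
```

So `Calculator(2.0).multiply(-3).multiply(0).result` yields 6.0, with no mutation anywhere.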


I do state that nobody writes code like that. The point was that the language supports all these ways of coding and never states an opinion on what should be preferred.

Maybe there is a style implied by Odersky's Scala course, but I'm not sure even that style is considered the in fashion style these days.

Somebody needs to write these things down. People can ignore them, but at least deviation would be explicit. See my diagram with the arrows.


I get the point about different idioms or patterns and having discussions and organizational guides around those “don’t catch bare Exceptions” kinds of things.

But for actual “tabs vs spaces” or “opening bracket on same or next line” discussions? These days those are a total waste of time. Just run Black, prettier, gofmt, whatever your language has; that removes like 80% of coding style-related discussion.


I would like to see evidence that style consistency matter __at all__. There is so much dogma around consistency being the most important thing, but nobody ever backs that claim up with evidence. Is there any real research which shows that consistency actually leads to either 1) better developer productivity or 2) a lower defect rate?


> Don't use explicit returns

this is not just a style issue, it's also a functional one: https://tpolecat.github.io/2014/05/09/return.html
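A minimal reconstruction of the pitfall the linked post covers (from memory, so treat the details as approximate): `return` inside a lambda exits the enclosing method, not the lambda, by throwing a NonLocalReturnControl under the hood.

```scala
// Plain fold: sums all the numbers.
def sum(ns: Int*): Int = ns.foldLeft(0)((acc, n) => acc + n)

// Same fold with an explicit `return` in the lambda: on the very first
// iteration the `return` unwinds out of sumR itself, so the fold never
// gets past the first element.
def sumR(ns: Int*): Int = ns.foldLeft(0)((acc, n) => return acc + n)
```

So `sum(33, 42, 99)` gives 174, while `sumR(33, 42, 99)` gives 33. (Scala 2 semantics; newer Scala versions deprecate non-local returns for exactly this reason.)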


That is an interesting read. I had no idea!


> We need someone in charge to step up. “Okay, everyone, we’re standardizing on Hatch for Python packaging. If you’ve got issues with Hatch, speak up. We’ll look into them. But just so you know, we’re aiming to make Hatch the go-to for Python 3.16.”

AKA the systemd approach.


I’d like to argue one point made in the article about Go- there are (at least) two styles of writing Go. One style uses embedded structs for something like JavaScript prototype chains (specifically, the Go compiler uses this technique to blend the scanner and parser together), and the other style pretends that struct embedding doesn’t exist at all.

I’m willing to say that I am (was?) in the doesn’t exist camp, and was shocked when I found it in inside the compiler and in a few other places.

The shock was similar to what I’d experienced writing in languages that don’t have a “style czar”. When reading Go, you usually can predict exactly where every variable is declared, but with struct embedding that certainty is lost.


This reads like it was written by someone who lacks a sense of good and bad style. Code style is not a consensus or arbitrary imposition any more than it is in a human language. There IS a right and a wrong way.


I think one can extrapolate this to make an argument that this problem is the reason we rarely see LISP take off in a big way in a business context.

LISP is incredibly flexible. The ability to use macros to build DSLs without ever leaving the language, and have all the tooling designed to work with the language able to work within that DSL, is a game-changer.

... but what it changes the game into is "If you're working solo you can bend the whole language around to fit your mental model, but if you're working in a group you now have every engineer creating dozens of (probably unsanitary) bespoke macros to bridge the gap between their problem and their mental model today and the end result can become indecipherable to a new user, which jeopardizes a business project."

Paradoxically, flexibility can kill a large project. Large-codebase projects with 100 engineers don't need multitools. They need what the army needs: an M4/M4A1 5.56mm Carbine with mass-produced ammunition. And they need it because when a fellow software engineer "falls in battle," it's in the company's best interest if you can pick up their weapon and start shooting immediately without having to learn the eighteen abstractions they hid the trigger behind, even if that makes your day-to-day work a little slower because there are upper bounds on how you can modify your weapon.

They're not planning for your needs; they're planning for your needs and the needs of everyone after you who will pick up your weapon and wield it when you're gone.


One can absolutely work in groups using Lisp. There are a lot of examples of that. Often the groups don't need to be that large when working in Lisp; code in larger projects tends to be more compact. I heard, for example, of a Lisp codebase 1/10th the size of a comparable large (1000-person) C++ project at Lucent many years ago.


I'm sure it can be made to work; I'm saying the extra flexibility means you need someone making sure everyone's either on the same page or knows how to get there.

I'd be interested to know what approach Lucent took to wrangling the complexity.


Let's assume they had 10 times the people and something between 5 and 20 times the code size. Going from 100 to 1000 people adds a lot of complexity. I would think that adding good and experienced managers is important. Keeping all these people aligned, avoiding code duplication, and reducing dependencies also needs to be addressed at the organizational level. One can make it work, but some of the consequences one can't avoid. Though it has some positive sides, too.

But then you'll need a lot of tools and architecture to deal with that. To reduce the amount of code to write, they probably would need to add frameworks, code generators, model-driven architecture, domain-specific languages... lots of tools. Say you need to persist C++ objects into a database: they probably used an OO database (at that time) for C++ with a special query language. Otherwise they would have used a more traditional relational database and either mapped things to an object-oriented persistence framework or written SQL. Tool building/usage is key, I would think.

Greenspun's tenth rule also applies.


Form is function in programming. We can dismiss using stdlib feature A vs. stdlib feature B as a "style choice," but the moment someone electively adds that second style, the mental complexity of the code increases. The new contributor needs to understand more abstractions for no particular reason.

No particular style is important, but design centralism and simplicity are important. Tools like gofmt and black/isort I make sure are baked into dev environments, and I aggressively question the addition of new dependencies. Why are you importing superl33thttp when we already have requests?


At the risk of being downvoted: I love Rust exactly for that. Or rather, for the combo of rustfmt and clippy. Both of which one could actually call "opinionated" not least on style.

I may not always agree but I go with clippy's suggestions anyway (which btw. just means passing --fix, most of the time, and they're done for me).

Because I care about other people reading my code.

One of the realizations the article touches on: if everyone uses the same idioms in a language, code becomes way easier to read for everyone.

This wasn't so obvious to me until recently. I was regularly still writing/touching a lot of C/C++ (25 years -- until ca. early 2021).

So everyone's lib/header/whatever uses their own style, and when you also need to modify or at least audit/understand 3rd party code, you're kinda used to dealing with that. I.e., I would have said: what is all the fuss about?

Or I guess I could also be saying: when you get kicked in the shin every day, it won't hurt so much any more after some time.

I never thought of this as making such a difference until I started using Rust professionally: my shins hurt way less; mostly not at all as everyone is using rustfmt & clippy. In fact, many Rust repos now have these two coming out clean as a condition for a PR to even make it through automatic checks.


One of the things that I wanted to try doing in emacs is the following: Considering a javascript project using prettier, when reading a file from disk, apply my personal prettier config before showing the contents in the buffer. Whenever I run the save action, apply the project’s prettier config before writing to disk, but leave the buffer contents the same.

I think this could be an interesting experiment. Maybe choosing a style to save that is better for reviewing/diff-ing, then being free to have your own personal preferences when editing.


What's good style in production code isn't necessarily good style in test code (and vice versa).

Good style in an SDK or a library is not necessarily the same as good style in a self-contained business app.

Just like suburbs and downtown can't be expected to look similar (whether in Montreal or elsewhere), style, too, is context-dependent, and there's more aspects to this context than just the language of choice.


Author here.

This is inspired by my love / hate relationship with Scala. I love the language, but it's so flexible that in a large enough code base, it becomes a mess.

It's not a Scala-specific problem, though. I think that languages that have a lot of powerful constructs need to be explicit about style, what's idiomatic, what approaches are no longer considered ideal, and when it's appropriate to reach for various features.

I see here that many disagree with me, which is ok I guess. But I'm pretty pro code formatters and style guides, and I wish there were someone in charge to break the impasse of package managers in Python. How do you overcome the XKCD 927 issue?

I think someone needs to be empowered to force convergence in languages when it's needed.


> it's so flexible that in a large enough code base, it becomes a mess. It's not a Scala-specific problem

it's not scala specific, but it is a function of the flexibility of a language.

scala and perl are the only two languages that i am aware of where this issue is a common complaint. other languages are either not popular enough for this issue to surface or they are not as flexible for it to be as big of an issue as it seems to be in scala and perl.


Agreed, reminds me very much of perl. I just consider that these two, especially perl (partly because I'm such a novice), require a lot of discipline in choosing a style that will scale well.


Great piece! I think of this as a feature. The idea of scala was to be a 'scalable' language you can do lots of things with: OOP, FP, etc. It does introduce the need for more discipline, which I think your article properly addresses.


>I wish their were someone in charge to break the impasse of package managers in python? How do you overcome the XKCD 927 issue?

At some point, a problem is unsolvable.

After setup.cfg and setup.py and pyproject.toml and pip and poetry and pyenv and pipenv, it's actually over for Python. There is no way to mulligan after 15 years of mediocre tools followed by mediocre tools.

The other day somebody tried to convince me that poetry's handling of index urls is safer and saner than pip. It's probably true, but I only used pip because it was available, not because I like to be unsafe. So when another group of experts comes along and says "lmao of course pip (the only tool you could use until now) is worse than poetry", I just can't care anymore.

tl;dr once the XKCD 927 issue lands, it's here to stay. A language has to come opinionated out the gate (like Go or Rust).


Why pick on MTL?


And is it a "problem" that Montreal has a mixture of architecture both traditional and modern? That is part of what makes it an interesting city! I understand that varying styles in the source code of programs may not be desirable, but the analogy to city architecture makes no sense.


It's an even stranger example to use when you consider Montreal's (and Quebec's) history. During the "Quiet Revolution" era of the 1960s and 1970s, government in Quebec basically became a "Style Czar", to use the article's terminology.

Québécois language and culture was forced upon Montreal's sizable and established populations of English-speaking and non-Québécois residents via Loi 101 and other policies.

Regardless of one's stance on the matter, the reality is that it was extremely disruptive to Montreal, socially and economically.

Throughout the 1970s and 1980s, many residents and businesses ended up leaving Montreal, and moving to cities like Toronto, Calgary, and Vancouver.

Toronto, Calgary, and Vancouver subsequently saw significant economic and social growth, while Montreal stagnated throughout the 1980s and 1990s, and even beyond then.

Today, Montreal still has a rather fractured society in many ways.

Maybe a "Style Czar" figure or mentality can bring a sense of consistency, but forcing it on existing and established communities will likely cause a whole new set of problems.


As a montrealer, I was super curious about what the author would do with that analogy, but it turns out it was very, very bad.


Those AI generated staircases make me sad. That's not at all what the city looks like!


Author here.

It wasn't meant to be a knock on Montreal, but on code bases that have very apparent different eras and styles.

The idea actually came from Kate Gregory, in an interview I did with her some time ago, where she talked about some parts of any large C++ code base looking like Old Montreal and some looking like Art Deco or whatever other style.

Edit: Yes, Montréal Effect seems better... changing.


Lifelong Montréaler here. I think having "problem" in the title is what is ruffling feathers, although I'm not particularly bothered by it. That's perhaps where the analogy is strained the most: stylistic divergence in code can be problematic, but in architecture and urbanism it's typically seen as an asset.

Perhaps calling it the "Montréal effect" would be more apropos?


It's just a low-value analogy. Wikipedia says there's 10,000 cities in the world. Montreal cannot possibly be the exemplar of whatever this post is trying to say.


I suspect part of the problem (and hence friction) is drawing an analogy to something that's actually a strength/net positive in Montréal as a city, but using it to claim a weakness/net negative in programming languages.

Widely homogeneous cities are not necessarily failures, but it's hardly an indicator of somewhere that functions well (as a city).


mf said get rid of explicit return statements


This seems like an awful lot of worrying about a problem that doesn't actually exist.


It's not that "style" doesn't matter, it's that you hit quickly diminishing returns after the bare basics of a formatter and maybe naming conventions. And most people who are enthusiastic about "style" push it way, way past this point of diminishing returns.

At the end of the day, I don't really care how you write your code as long as I can read and understand it, which comes 90% down to 1) did you choose clear names, 2) do you have clean abstractions / interfaces. If you did that, write your code however you want. And in fact, it's antisocial to demand that other devs (junior or otherwise) adapt to your own personal idiosyncrasies. Just as in writing prose, different authors have different, clearly intelligible "voices" that distinguish them. Stamping out someone else's voice just to match your own preferences kills the joy of programming for them, you should only do so in the direst of circumstances.


> And in fact, it's antisocial to demand that other devs (junior or otherwise) adapt to your own personal idiosyncrasies.

Most people see it as the exact opposite of this: it's you who are trying to impose your own personal idiosyncrasies on the project if you don't just adopt the project's style (which is hopefully automated so all you need to do is press a shortcut once you're done, or run a command before committing) and insist on doing your own thing.


Style which can reliably be achieved by running a script is not really what I'm talking about. I like Go's auto formatting. People go way, way beyond that.


exactly this.

last week in a codereview i got asked to clean up indentation of some html structures that i had added.

my response was: i duplicated an existing structure and made mine look exactly the same. i agree that proper indentation would be nice. but if i did that, my structures would look different from the original. i could clean up the original too, but then the diff would be harder to read because it would contain unrelated changes.

copying the original unclean structure in this case is best for readability, because consistency is more important than any particular style.

if anyone cares about the style, they can submit a style cleanup separately.


> my response was: i duplicated an existing structure and made mine look exactly the same. i agree that proper indentation would be nice. but if i did that, my structures would look different from the original. i could clean up the original too, but then the diff would be harder to read because it would contain unrelated changes.

The idealist part of me wants to say: Clean up the original code, submit that as the first diff, then rebase your additions on top as a separate diff.

The pragmatist part of me recognized long ago that a lot of software engineering now is like what you describe. The days of elegant, carefully-maintained codebases where someone sweated over every semicolon are long gone.


> carefully-maintained codebases where someone sweated over every semicolon

this is not a thing that has ever existed. what on earth are you talking about.


I’ve been writing software for 20 years and I’ve never seen anything like that either.


in a pull request all commits get merged into one when you review the PR, so it would have to be two separate pull requests to be reviewed separately, which is what i suggested.

when working by myself without PRs then i would do almost exactly as you describe. i would probably put the cleanup after the code change though mainly because i want to focus on solving the problem first, and worry about cleanup later. but i can see the benefit of doing the cleanup first.


My feelings about that stuff have shifted over time - these days I figure fuck it, let the linter worry about the formatting. As long as everyone on the team is using the same config, things like indentation pretty much take care of themselves.


A guy I know at $currentjob would have agreed to the criticism, then created at least 2x Jira tickets to clean up the original and then the structure you mention.


I must be the only person who honestly doesn't care a whole lot. At the end of the day, good programming comes from good design. If you think about what's going on and the interfaces it ought to have, then the style of the implementation is not super important. In fact, insofar as different teams / groups may be quicker at one style than another, it would be better to just go along with it.


I'm thinking that it is largely psychological/OCD tendencies that affect programmers at different magnitudes. It can sometimes be difficult to not be distracted by the minor style inconsistencies before being able to focus on the larger system design.


Code style is easy to discuss, thus it sucks up the oxygen in the room that should be used for design discussions.


It's another example of bikeshedding.


In my experience a large project needs, at the very least:

* Indenting consistent with the curly braces, and well enough defined that when you remove an outer loop and want to unindent the code from that block, you don't have to do it by hand.

* A one-line code change doesn't bring with it a 500-line style change because the file passed between people with different style tastes.

Unfortunately, it's often difficult to achieve these modest desires without someone also wanting to limit line lengths, ban the ternary operator, ban nulls, and so on...
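
Those two modest desires can often be met with nothing more than a minimal shared config. A hypothetical .editorconfig sketch (file contents are illustrative, not from the thread) that pins down indentation and nothing else:

```
# .editorconfig - only the basics: consistent indentation, nothing more.
root = true

[*]
indent_style = space
indent_size = 4
trim_trailing_whitespace = true
insert_final_newline = true
```

Because editors apply this automatically, re-indenting a block after removing an outer loop is a single keystroke, and nobody's personal editor settings rewrite the whole file on save.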


Just use a style formatter and the debate is done. Everyone is unhappy now. Or very happy due to not arguing.
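
Part of why handing the debate to a formatter is safe: layout carries no meaning to the parser. A small Python sketch (illustrative, not from the thread) showing that two very differently formatted versions of the same function parse to the identical syntax tree:

```python
import ast

# Two formattings of the same code: one compact, one expanded.
compact = "def f(x):\n    return {'a': x, 'b': x * 2}\n"
expanded = (
    "def f(x):\n"
    "    return {\n"
    "        'a': x,\n"
    "        'b': x * 2,\n"
    "    }\n"
)

# Stripped of layout, both produce the same abstract syntax tree,
# which is why a formatter can rewrite one into the other without
# changing behavior.
print(ast.dump(ast.parse(compact)) == ast.dump(ast.parse(expanded)))  # True
```

Whatever style the formatter picks, the program is the same one you wrote; only the argument about whitespace goes away.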


In my career the only people I saw arguing against auto style formatters were people who were simply unable to systematically follow simple rules, and whom you're better off without.


Why in god’s name would programmers of all people resist letting computers handle “systematically [applying] simple rules” for us automatically?


Naming and layout are part of how experts communicate intent with each other. Auto formatters are literally blind so they mostly make poorly motivated naming and layout decisions.


I would love some examples of this. I’ve yet to see a case in 20+ years of programming that was even close to being more-valuable than not having to think about formatting, but maybe it’s because none of the ten or so languages in which I’ve been paid to write code have been the sort where that’s a big deal.

[edit] naming, yes. That’s important. Static types are enormously important—machine-checkable documentation is the best sort. Comments are excellent if names and types aren’t enough. Separate documentation if all of those fail. Where my braces go or tabs-v-spaces or what have you? Not so much, but again, maybe it’s because we write different languages.


Spark offers a fluent DSL with infix operators for data transforms, but after a company-imposed auto formatter got done mangling it I was only allowed to check in a wall of method chaining that none of my teammates would write.

    things
      .where($"key" === lit(key))
      .select("value")
      .as[String]
      .repartition(N1, $"value")
      .flatMap(new ThingIterator(_))
      .repartition(N2)
      .write
      .parquet(
        s"s3a://bucket/things/key=$key")
As for naming, I see stuff like “every method must take one arg whose name ends with Request” even when envelope or notification would make more sense.


programming is about keeping control of the way the infrastructure operates


Are you also against makefiles, on the same grounds? “If they can’t be bothered to run each build step by hand, you probably don’t want them around anyway”

That’s much closer to being “the way infrastructure operates” than formatting is.

I suppose 100% of IAC is right out, then.


Source code is something that you author; binary artifacts aren't.


I'm afraid anyone who uses Tailwind is almost certainly using a style formatter to auto-arrange the CSS.


> * A one-line code change doesn't bring with it a 500-line style change because the file passed between people with different style tastes.

In general, one should refrain from drive-by refactoring.


If style is not important, it seems like all the more reason to standardize it.


I am pro-standardize, but this isn't a great argument. Enforcing a standard style takes resources from both the language and the people using it. You don't spend resources on something that isn't important.


Coming up with and agreeing on a standard takes work, but in most cases unofficial standards already exist - one merely needs to be blessed. (And if some project wants to not use it, that's also free)


Yes, as you say, because the style is not important. Consider the time wasted on arguments between tabs and spaces, and over how many spaces; the noise in PRs from style conversions that hides the actual additions and alterations; and the increased efficiency that comes from learning the shapes of a codebase and using them to navigate more quickly.

It doesn't matter what the style convention is so long as you have one, you have automated tooling that creates and validates it, and that enforcement is done by machines as part of the delivery process.

Not because style is valuable but because time is a valuable resource and spending it on preference differences and dealing with variances is wasteful.
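
As a sketch of "enforcement done by machines as part of the delivery process", a hypothetical CI step (job names and the choice of gofmt are illustrative, not from the thread) that fails the build whenever formatting drifts, so it never comes up in review:

```
# Hypothetical GitHub Actions job: the machine checks formatting,
# so humans never argue about it in a PR.
format-check:
  runs-on: ubuntu-latest
  steps:
    - uses: actions/checkout@v4
    - name: Verify formatting
      # gofmt -l lists files whose formatting differs from gofmt's;
      # an empty list means the tree is clean.
      run: test -z "$(gofmt -l .)"
```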


Well... no. Engineers are more efficient in their own styles. Realistically, I've wasted hours setting up my editor at each new job / each project to get on board with the style. This really doesn't matter. Maybe it's just me, but I feel that a good engineer ought to be able to read code regardless of the indentation or particular language dialect (within reason).

When I've led teams, I've never enforced things like indentation or bracketing styles. Things I do enforce are (1) explain complicated code with comments, (2) make it readable (for whatever definition of readable), and (3) reasonable naming conventions (i.e., don't name every variable i, j, k). These guidelines are useless for automated tooling. Perhaps in the age of LLMs this will change (who knows).

I've found the strict requirements and automated tooling absolutely terrible. I often align things to keep parallel structure to show how different lines are the same or to create 'tables', etc. Formatting tools destroy this. For me, using my emacs rectangular editing, the alignment greatly increases productivity. This is one example. I'm much faster at editing personal projects / projects I've led than ones with strict style guidelines.

For me, I always tell my guys: edit in the style in which you are fastest and most efficient. I believe this greatly increases productivity rather than having a team 'nosy neighbor' that cares whether you use two or four spaces for indentation.

My two cents.


Your approach seems like the worst of both worlds: a single codebase with mixed styles.

I work at a large tech company, and while I don't really participate in style wars, I do care that there is a single enforced standard across any particular codebase.


That is my take, after having written in many languages over many years.

I also think this is more of a Zeitgeist thing, to be honest.


> Well... no. Engineers are more efficient in their own styles. Realistically, I've wasted hours setting up my editor at each new job / each project to get on board with the style

I disagree for two reasons:

1. Optimizing for what each dev is used to is optimizing for a local maximum. If they’re THAT good it won’t take them long to adjust to new defaults. Good engineers should be able to pick up a new lang quickly so some changes in the context of a language should not bog them down significantly in the long term.

2. The overly long time it takes to set up tooling at a particular company or for a particular language does not outweigh the bigger advantage of being able to quickly get a grasp of what a piece of completely unfamiliar code does.


So you don't somehow end up with mismatched curly braces?

I might paste something, but mess up and forget to select one of the curly braces or select too many. I am seriously considering closing all of them on the same line in this style

    if blah {
        thing; }


This is why you have a CTO: someone is in charge of making all those design decisions.


If your CTO gets involved in your code formatting discussions you need a new CTO


The CTO is the one in charge of choosing the programming language, putting in place the initial framework, processes and rules.


I am quite sick and tired of all these style guides, best practices and so on. People who enjoy telling other people what to do are just disgusting. And the 'standards' are quite indistinguishable from a random idiot with a random idiot opinion. Following a standard in no way guarantees that the code is readable; a lot of standards come close to guaranteeing that it is not. E.g., whoever came up with the idea that 2 spaces of indentation is enough. Another thing is that auto-formatting rules are just not smart enough to create good-looking code, because one formatting style may be more readable in one circumstance and another in a different one.


I tend to agree. My formatting guide is the default formatter in IntelliJ, then I choose my whitespace, brackets, newlines etc. based on each situation. There is no real reason to get any more complicated than that.


I once had a team discussion about something style related. I told them "I don't care either way. To me, it is like Tabs vs. Spaces: Both are fine, let's just pick any and all is well." A more senior developer added: "Yes, as long it is Spaces."

He lost a lot of credibility in my eyes that day. Really, you are so very much more experienced than me, but you cannot cope with tabs?

Since that time, I am even more tired of style discussions. Choose whatever you want, as long as you don't make it annoying to me (e.g. if you want a specific formatting, give me a tool; don't force me to do it).

I would rather have an Abstraction Czar who ensures that all code is properly abstracted (not too much, not too little) instead of the nightmares I have to deal with every day.


Okay, downvotes but no replies. In my eyes, what I am saying here is mostly in line with what some others said. My unique points were mostly my anecdote and my note about abstraction.

Well, I was new here, but it seems that contributing is not for me. I will go back to just reading then. You people do you ...



