Rome: A Linter for JavaScript and TypeScript (romefrontend.dev)
303 points by acemarke on Aug 8, 2020 | 136 comments



I feel so torn on this project. On the one hand, I want to root for sebmarkbage, who has done so much for the field.

On the other, as someone who used to do some "JS platform" work at a tech company, I really don't want Rome to catch on. Yet Another Standard is really painful for the ecosystem, and while it's obnoxious that you need so many tools, at least we've finally settled (mostly) on good answers for each vertical: TS, ESLint, Prettier, Webpack, Babel.

Now, would it be nice if there were an underlying engine/server that all tools could use to share an AST without reparsing everything? Sure, I guess. Would it also be nice to have one tool that wraps the others, at least when you're getting started? Yes, though I think create-react-app has blazed that trail pretty well.

The thing is, all those tiny little details from Prettier and ESLint and Webpack really matter – Rome will take years to achieve the number of lints that ESLint has, and even more years for the community to agree on which ones matter. Similarly, every corner-case of prettifying will have to be considered anew, and every edge case of bundling rebuilt or reimagined.

I love to see a talented person take on something insanely ambitious, so I wish him luck and hope he proves me wrong. But if Rome succeeds, it'll be a big pain for a lot of people as they try to port everything over, and for the community as they grapple with dueling standards for how things should be done.


The ecosystem is already fragmented and dueling. When the project says it replaces those tools, it means it aims to replace their functionality, not make the tools themselves obsolete.

Rome being successful doesn't mean eliminating those tools, it's providing something valuable and giving people an option. If it's not for you then that's okay. Rome is early and is still evolving, including possible areas for extensibility.

I think a lot of people don't realize the sort of capabilities that they're missing out on by not having their tools work together, or sticking with old tooling that cannot innovate for legacy reasons (like Babel). I don't think it's harmful to the ecosystem to advocate for more consolidation, especially around tooling that not a lot of people either like to deal with, or have few maintainers in the first place.

I also think you have me (Sebastian McKenzie) confused with Sebastian Markbage from the React team.


Hey there – I do mix the two of you up, but Babel is nothing to scoff at. I've used its internals in the past, and read a good chunk of its source code, so I think I've seen your blames all over. I'm quite appreciative of your past contributions! (and his)

Reading your comment, it sounds like you expect some people might use Rome for some things (say, linting and compilation) and preexisting tools for others (say, formatting and bundling). Is that your intent?


Thanks! Yeah, it is, at least for a while, since it will take a very long time to reach the expected maturity. You can adopt as many or as few pieces as you want. The idea is that once you adopt one of the "tools", you can use the others and reuse the exact same configuration; ideally you shouldn't even need to do anything else.

i.e., for linting we also do dependency verification, so you might need to configure Rome if you put your dependencies in some weird non-standard place. Once we open up bundling, we'll already know how to resolve everything.

Each part should stand on its own. It's not as if the linter we've released is worse and its only selling point is being part of a larger suite. It legitimately has features that set it apart from the alternatives: e.g., an extreme focus on useful error messages, powerful autofixes (that operate on an AST rather than inserting strings, as ESLint does), proper caching for even better performance (ESLint doesn't offer a good solution here), etc.
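To make the string-fix vs. AST-fix distinction concrete, here is a toy sketch (not Rome's or ESLint's actual internals; the mini-AST shape and `print` function are made up for illustration): a string-based fix splices text into the source at offsets, with nothing guaranteeing the result still parses, while an AST-based fix edits a tree and re-prints it, so the output stays syntactically well-formed by construction.

```javascript
// Toy illustration of the two autofix styles (hypothetical, simplified).

const source = "if (x == 1) y();";

// String-based fix: splice replacement text in at byte offsets.
// Nothing guarantees the result is still valid JavaScript.
function stringFix(src, start, end, text) {
  return src.slice(0, start) + text + src.slice(end);
}

console.log(stringFix(source, 6, 8, "===")); // "if (x === 1) y();"

// AST-based fix: mutate a (tiny, made-up) tree, then print the tree.
// Because the printer only emits valid syntax, the fix cannot produce
// unbalanced or malformed code.
const comparison = {
  type: "BinaryExpression",
  operator: "==",
  left: { type: "Identifier", name: "x" },
  right: { type: "Literal", raw: "1" },
};

function print(node) {
  switch (node.type) {
    case "BinaryExpression":
      return `${print(node.left)} ${node.operator} ${print(node.right)}`;
    case "Identifier":
      return node.name;
    case "Literal":
      return node.raw;
  }
}

comparison.operator = "==="; // the lint autofix is a structural edit
console.log(print(comparison)); // "x === 1"
```

A real implementation would of course reuse the parser's AST and a lossless printer, but the shape of the guarantee is the same.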

Many are making the mistake of thinking that building Rome is just as time-consuming and resource-intensive as rewriting each of the tools it's meant to replace separately. It's not. Once we've validated the linter, we've also validated the compiler (it's the same thing), our ability to watch files, analyze dependencies, integrate with your editor, etc. Sharing so much fundamentally decreases the actual complexity of everything, not increases it.

I think it's also important to note that I personally have written an extremely small number of the implemented lint rules. Our API is just easier to use, and we've focused on setting up the tooling necessary to make writing them easy (albeit internally, in our repo).

Check out https://github.com/romejs/rome/issues/341, https://github.com/romefrontend/rome/issues/20, and https://github.com/romefrontend/rome/issues/159 if you're interested in seeing how progress was actually made on implementing those rules. The work was spread out over a really long time; given complete focus and proper coordination (I was not good at this and let it be organized ad hoc, which is actually how we got some amazing contributors), it could have been completed in a fraction of the time.

There's been some over-focus on particular things like minimal configuration and the lack of extensibility, but those are not hard requirements and will evolve over time, particularly as we get feedback and people demonstrate the requirements and restrictions they're under. The project understandably involves a lot of hubris, but I do believe that Rome isn't only valuable in aggregate; it will have significant advantages even if you only decide to use one piece.


Thanks for this clarification. My first reaction to reading the project page was "damn, a tool that takes doing everything to a whole new level". If it's adoptable piecemeal, there's a good chance that if it becomes popular I may actually get to work with it someday. But if it were an all-or-nothing proposition, in my current job the answer would likely be nothing, due to challenges integrating with other technologies we use and get value from.


Very interesting! Thanks for clarifying.

I've also done some work on Prettier in the past, so I'm curious about some details there too.

1. Do you plan to build a Wadler-style formatter, similar to Prettier? Will it be as opinionated as Prettier, or based on a pluggable architecture? (I'll admit that I have yet to conceive of a viable way to make the latter work well.)

2. Have you considered (or already built) an incremental parsing server for tools like the formatter? For example, when I add a statement at the bottom of the document, will it reparse the whole document, or be smart enough to notice that a change has been made and reuse cached results for most of the AST? (I'm not sure if this is feasible for JS, especially for parsers that require line/col info.)

In any case, I'll be following your progress with great interest!


Eventually one of you needs to bite the bullet and change your name!


Welcome to web development, where change has been the only constant since—checks notes—the early 90s!

My way of coping with the appearance of yet another new-and-hyped thing is not to complain and resist, nor to try to immediately learn the thing, but rather to take on the much easier task of learning the why of the thing. Equipped with this knowledge, I can grok the JS ecosystem as a whole and make better choices when it's time to choose a tool.

I do agree it's super exciting to see Sebastian McKenzie launch out with something new. I think I understand the why of this, and wish him every success!


Your comment reads to me as anxiety largely about violating conventions of popularity. Your only hesitation is an appeal to authority; there is no consideration of merit or anything technical.


I read the anxiety as far less about popularity than about past trauma from the JavaScript ecosystem and the whiplash it went through before it settled on the React/Vue/TypeScript world we have today. When you've gone through that much change in the past, you really need a good argument to be convinced to try something new.

As a developer, I haven't had any qualms with ESLint, so I wouldn't really have any reason to pick up Rome. Now if the full tool set offers lots of things, then I might consider replacing my current Vue template / create-react-app, but that's a big step.

All of this to say: no, it's not an appeal to authority. It sounds like "I'm not sure I can imagine this being worth switching to, because we have something that works and is stable now, and we haven't had a great track record with tooling churn in the JS world." It's a value-add question.


> from the Javascript ecosystem

If that's really the problem, you have to perform an honest self-assessment: are you spending more time and energy on tooling and framework drama than on building/maintaining original products?

It's an important question to ask yourself, as it directly indicates whether you are delivering or consuming business value. Developers will twist themselves into knots to falsely correlate the two, but customers and end users absolutely don't care.


As much as this sentiment rings true, it's also a fairly useless one.

Most people don't have jobs building/maintaining original products or delivering business value. Most people have jobs doing what someone else tells them; business value just happens to be the byproduct of doing what they're told. When this "someone else" finds a brand-new tool, one of two things happens. Either they ignore it in favor of existing tooling, which means you never learn it, you're left behind in a constantly changing ecosystem, and changing jobs becomes a massive pain because you're not familiar with said tool. Or they love it, you're forced to adopt it, and your life is pain as you migrate ancient projects to try and please their whims.

JS being in such constant flux means new companies making new products adopt the standards set by current trends, which means they will not hire you if your knowledge has gaps; if you're looking for a new job, you have to continually keep up. This is exhausting, and a lot of people are, IMO rightfully so, frustrated.


Shifting ground from value add to required wasted time as ordered by leadership is an odd position.

It’s the kind of nonsense that incompetent people use to justify prior bad decisions formed from insecurity, at the expense of everything else. It’s less clever than a person might think.


You just repeated what I said, yeah? My whole point is not trying Rome is focusing on exactly that. Feels like we're going in circles here. The top level post you commented on is performing that exact assessment you just asked for and criticized as an appeal to popularity.


Yep, that's exactly where I was coming from. Thanks.


Definitely share this sentiment.

Even if it somehow manages to succeed at its mission and become the de facto toolchain for all of frontend, I feel its improvements over the status quo will be short-lived, and over the long run having a monolithic toolchain dominate will prove to be a net negative for the frontend tooling ecosystem.

Having a single de facto monolithic toolchain greatly raises the barrier to entry for new competing tools, as they'd no longer be able to gain adoption by competing on the merits of doing any individual task better than their direct competition.

Having different single-purpose tools that people can pick and choose from independently of each other is a crucial part of what has enabled rapid innovation in the JS tooling ecosystem up till now, since it keeps implementation/switching costs for any individual tool as low as they can reasonably be for that specific use case.

This comes at the cost of making it harder for any individual to keep up with all the latest innovations (as critics of the JS tooling ecosystem are often quick to point out), but projects like create-react-app have been fairly successful at addressing this through sourcing collective wisdom from the community to create a collection of best-in-class tools configured to work effectively together out of the box.

I can also totally empathize: there is plenty of low-hanging fruit, in terms of efficiencies, to be gained by unifying tooling to avoid wasted work and maintenance overhead. But I'm of the opinion that the gains in efficiency aren't going to be worth the cost in potentially stifled innovation, and that those efficiencies are better achieved through shared specs and shared low-level libraries that individual tools can choose to adopt if they become compelling enough.

I'm personally still going to be rooting for single-purpose tools plus curated collections over any monolithic toolchain.


With all that said, I for one would totally welcome something new in the formatting space to challenge Prettier: something that decouples the line-length-based autoformatting from the opinionated default styling presets.

Specifically motivated by this popular issue: https://github.com/prettier/prettier/issues/5814

:)


JavaScript has kind of always had this problem, and on the whole, it’s a good problem. Nobody forces new tools to become widely adopted. It happens because they improve enough stuff to convince us that switching is worth it.

When the web moved from jQuery to React, the details developers cared about changed. Opening modals got harder, but that’s okay, because React made so many other things better.

React is good enough to force us to adopt compilers for a language that doesn’t need them. When people complain about JavaScript on Hacker News, it’s usually about the build tools – and Rome has a shot at being an answer to that.

The web is more mature now, but that doesn’t mean the time for new tools is over.


sebmarkbage and sebmck are different people.

sebmarkbage is one of the architects of React and a TC39 member. He's known for the Fiber architecture, among other things.

sebmck/kittens wrote 6to5 as a teenager in Australia, eventually landed a job doing JS infra at Facebook, and now works at Discord. He's a founder of Babel, Lerna, Yarn, and now Rome.


I removed Babel from my toolchains a while ago, so mine is even shorter: TS, ESLint, Prettier, Webpack.


Same, but I wouldn't say I'm happy with it. Everything's interdependent, and that makes it such a pain to configure and maintain:

  - Prettier has to be integrated with ESLint to avoid conflicting formatting and for good devX
  - TS config interferes with ESLint config ("this file is not part of any project" or conflicting unused var warnings)
  - Webpack has to be integrated with TS to compile down to JS
  - ESLint, TS and Webpack all need configuring to know about absolute imports
If you throw Babel into the mix, complexity just explodes. Thankfully it's feasible to have a stable config that "just works", but getting there can be quite the time sink. I wouldn't be against something that takes this pain away.
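As a concrete sketch of that duplication (the package names are the usual community plugins, e.g. eslint-config-prettier and eslint-import-resolver-alias, and the `@ -> src` alias is a hypothetical example, not a prescribed setup), a single absolute-import alias ends up declared three separate times:

```javascript
// .eslintrc.js: disable rules that conflict with Prettier, and teach
// the import plugin's resolver about the "@" alias.
module.exports = {
  extends: ["eslint:recommended", "plugin:import/recommended", "prettier"],
  settings: {
    "import/resolver": {
      alias: { map: [["@", "./src"]], extensions: [".ts", ".js"] },
    },
  },
};

// tsconfig.json: the same alias again, this time for the type checker.
// {
//   "compilerOptions": {
//     "baseUrl": ".",
//     "paths": { "@/*": ["src/*"] }
//   }
// }

// webpack.config.js: and a third time, for the bundler.
// const path = require("path");
// module.exports = {
//   resolve: { alias: { "@": path.resolve(__dirname, "src") } },
// };
```

Keeping the three copies in sync is exactly the kind of cross-tool bookkeeping a shared-infrastructure tool could eliminate.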


Exactly. ESLint, TS, Prettier, Jest, and Webpack are just standard for most projects I start.

The problem is, each little issue a developer has often requires not just installing these, but then adding a whole bunch of plugins to make them work together. So whenever I start a project, my package.json is already packed with a gazillion devDependencies, from @types/ packages to ESLint and webpack plugins.


This is the reason why I wrapped my Webpack config in its own package (same with ESLint); that way I pretty much only have a single dependency to add when starting or maintaining a project (and if a project has a very unique need, it can still modify the config generated by the package).

The package has a bunch of automated tests, so Dependabot checks for me if dependency updates are still compatible with the way I expect them to behave (and if a dependency needs to be swapped by another, I only need to do it in one place).


> standards for how things should be done

I cannot deal with these kinds of statements, and that's probably why I never use linters for my own projects. And yes, my code is good enough; a linter won't make it any better. It just blocks me time and time again and litters the codebase with: // eslint-disable-next-line

I never really understood why these tools are so popular and who really needs them. In the end it's all about the result of the code you've written: how easily it can be maintained and how well it works. No linter can help me with that.


I think the benefits of linting are similar to writing tests.

Initially the practice is detestable, especially if you've already developed some hard-fought coding chops without them. But after some time the practice itself influences your ability to write better code to start with, and successive tests become terser and more useful.

I think the same is true for linting. I found being involved in projects that enforce linting rules on commit to be insufferable, so I installed Prettier and configured it to format on save. Over time the auto-formatter did less because of the habits I was picking up.

I did have to confront the idea of being less precious and attached to my own coding-style, but frankly I still have pretty strong opinions despite having an automaton do my thinking for me in some of the projects I work on.

Of course, neither tests nor linting are a panacea, but so long as you give a hoot to start with you won't be able to avoid writing better code after using them.


> Over time the auto-formatter did less because of the habits I was picking up.

Indeed, making you more of a robot for "how things should be done".

Just one example: I like my code to be readable, and one of the things I use for that is white space. Between my functions/methods I have at least 3 blank lines. It's a lie that this is not "how things should be done", and there is no way to tell Prettier to accept it, because Prettier is meant to be opinionated. These tools have a place for some teams and junior devs, but I try to stay away from the ugly code they spit out.


Good point on teams, but I'm not convinced that linting is a great tool for juniors in the same way. Frustration is an instrument for learning, but adhering to linting rules forces that energy in the wrong direction for those still going through the basics.

On the point of being a robot: in what way are we not already "robots" by stubbornly sticking to what we think of as best? Are you not already a robot by limiting yourself to a narrow range of what you consider "readable"?

I also find it annoying to occasionally lose white space to Prettier. But I've learned my way around it: by either adding short comments or by decomposing single lines into multiple ones, which ends up improving readability anyway.


> my code is good enough

Then you're perfect or your environment is perfect. Or you work alone?


Agreed. Creating a one-tool-to-rule-them-all is unlikely to help matters much.

To me (and with the conversation restricted to applications, not websites or anything that is primarily a document for consumption), we should be going in exactly the opposite direction: reducing the number and scope of the extra tools we need to use.

We should be making bundlers unnecessary (JavaScript modules plus smart HTTP/2 servers are one way; service workers are another). We should write code that targets the JS modern browsers understand, so that Babel is unnecessary as part of our main workflow and is only run if you want old-browser compatibility. We should look at folding the key parts of React into the DOM (Firefox used to have something really not all that different from JSX, and the core functionality of React is pretty small; the DOM is a virtual DOM). Linting and source-code formatting should primarily be part of your IDE/editor workflow. Testing needs vary, and I want to be able to use the right testing framework for the project without having to bring along all the other cruft.

What I loved about the web when I first started building apps for it was that I could start with a single file in an editor and grow from there. Making a change, pressing F5, and seeing the result immediately was so liberating compared to long compile times and complex build systems. With code carefully written to fail fast, I could often see failures in the live application faster than my Java IDE would pick up a type error in Java code (anyone remember the white menu bar of doom?).

I think it's a great pity that we've taken one of the web platform's major strengths - fast iteration - and added so much tooling. We used to scoff at the fake Stack Overflow question where someone asked "how to do addition in jQuery", but the sheer quantity of tooling normally used for projects has put us in a similar position.


Honestly, I think your thinking is totally backwards.

Wanting a project to be successful because of the person who started it is a terrible reason and has had (unsurprisingly) terrible results wherever I've seen it in the OSS community.

I feel the exact opposite: things like Rome (and Deno) are exactly the kind of projects that are needed in order to sort out a horribly fractured eco-system. I don't care who built them as long as they work well to solve the problems that so clearly exist.


> Wanting a project to be successful because of the person who started it is a terrible reason and has had (unsurprisingly) terrible results wherever I've seen it in the OSS community.

I somehow misinterpreted pipenv as being an official Python project. My use case was pretty simple: I wanted a requirements.txt that only included the things I explicitly installed, with perhaps a requirements-lock.txt containing the actual list of things installed. :/

I didn’t even realize poetry existed until I talked to people who do python full time.


Yeah, that is an example that sprang to mind as soon as I read the GC.


> Yet Another Standard is really painful for the ecosystem, and while it's obnoxious that you need so many tools, at least we've finally settled (mostly) on good answers for each vertical. TS, ESlint, Prettier, Webpack, Babel.

Have we? I have 2019 projects I've worked on that don't use TypeScript, and while I don't mind TypeScript, I still have a YAGNI attitude about it. My coworkers seem to hate it despite never having used it. I think of TS as almost irrelevant if you can use JSHint, but that's just me coming from Python 3, where type hinting is valid syntax and doesn't require additional tooling to make it work.

Sadly, I understand why JS syntax doesn't expand much over the years and why you need something like Babel.

But on the other hand, if we stopped supporting legacy web clients, we could probably push modern JS to more mature stages, including (optional, of course) type hinting, as was done in Python 3.


You don't need to use any of these verticals - you can just write a .js file and serve it without any modifications, linting, etc.

But if you want to? Each vertical has a de facto community standard at this point. If you want typechecking, it's TS. If you want formatting, it's Prettier. Etc.

And of course, nobody's captured 100% of any of these verticals - there are plenty of people who use Flow or Reason for typechecking, or "StandardJS" for formatting.

But a new person setting up a JS project can just install the community standards at this point and be fine (albeit bothered at configuring things, if they don't use c-r-a).


I don't see it this way; I see Rome as the Apple of front-end tooling. You use an opinionated service to do most of the work. No brainer!

If you want more configuration, build your own, and many people will. But for the novice, for people who start multiple new projects a year, or for people who don't care about the details, Rome will be great.

For homemade projects, I use CodeKit, and I think Rome will be the same without the friendly and refined GUI.


it's weird that you think of TS as being a settled-on part of the stack.


Developer surveys show it's one of the most agreed-upon tools in the JS stack from users:

https://2019.stateofjs.com/javascript-flavors/#javascript_fl...


The prevailing consensus, especially in the JS world, is that one tool should do one thing. This is fine under the Unix philosophy, but challenges arise from the combinatorial explosion of config needs, bad errors, and overhead from crossing module boundaries. There are a number of attempts at challenging this status quo:

- esbuild (up to 100x faster than webpack, in part due to its focus on doing a single parse, but also because it ships as a Go binary)

- Deno

- Rome

As we consolidate on the jobs to be done that we expect of modern tooling, it makes sense to do all of them in a single pass with coherent tooling. It won't make sense for a large swathe of legacy setups, but once these tools are battle-tested, they would be my clear choice for greenfield projects.

recommended related reads:

- https://medium.com/@Rich_Harris/small-modules-it-s-not-quite...

- (mine) https://www.swyx.io/writing/js-third-age/


After spending way too much time debugging issues with frontend tooling, I am fully on board for this glorious monotool future. We’ve definitely gotten some progress out of the massively micro-packaged approach that Webpack/Babel/etc. use, but these days it really feels like we’re passing a complexity limit and need a new approach. The fact that CRA will refuse to start up if you have another version of Babel installed anywhere in the tree is a pretty good clue that the current approach isn’t scaling.


Yes, time spent on tooling is time not spent solving real problems. The ability to make unpopular decisions necessary to move a product forward is what separates an expert from an expert beginner.


I wrote about something related as well: Disintegrated Development Environments — How Did We Get Here?

https://amasad.me/disintegrated


Wow! I've been trying to put my thoughts about this into a blog post, but it would have ended up (or will end up) almost exactly like yours (except I didn't know about "Worse is Better"). Great to see someone with such similar ideas!



I confess I don't know where this fits in the pantheon. Is it basically a direct competitor to Rome written in Rust?


Last time I checked, it's a direct competitor / reimplementation of Babel in Rust.


Man, the performance benchmarks that esbuild publishes are pretty remarkable. I know it's common to cherry-pick favorable benchmarks, but even if these are misleading by two orders of magnitude esbuild would be best in class.


I also found them unbelievable so I did some quick tests with a couple of large projects that I'm working on, currently using Rollup. From these, I'd say their benchmarks are completely accurate.


Agree. I went as far as regressing to `watchexec` to re-run esbuild and cargo build on any changes. Esbuild is near instant compared to anything else.


I think maybe what's missing is one tool which the "one thing" it does well is put all the other tools together seamlessly and make sure they work right. My guess is that's what an IDE is supposed to do (I say "my guess" because I've always done my development in Vim).

I just know that every time I revisit any JS front-end stuff, I end up punting and just manually adding script/stylesheet includes for Vue and Bootstrap or whatever so I don't need to deal with half that crap. As an outsider it's always so daunting, and I feel like whatever I learned 6-12 months ago last time I needed to do something doesn't necessarily apply anymore.


There have been some claims to this recently:

* create-react-app serves as a very basic frontend template for a React project. It eliminates worrying about Babel, webpack, polyfills, etc

* next.js / nuxt.js solve both front-end and backend for React / Vue. Likewise they package webpack, Babel, typescript, etc, and also help with bundle splitting, routing, and server-side rendering that used to take 30 different packages and 6 think-piece blog posts. It’s good stuff (I can speak for Next, dunno about Nuxt)


> create-react-app serves as a very basic frontend template for a React project. It eliminates worrying about [...]

Until you need to do something that CRA doesn't do. Then you have to learn about Babel/webpack/whatever AND how to get CRA to play ball with it. I'm still on the fence about whether CRA and its leaky abstractions are better than a boilerplate with working config files exposed.


But, CRA comes with a button that turns it into a boilerplate with working config files.


Huge, ambitious project, and I hope it delivers. If anyone can lead this one to fruition, it's Sebastian, so this project is in good hands.

Undeniably, it's technically ambitious to build all these pieces under a single umbrella. And then, convincing people who are honestly scared to touch their Webpack config to switch to a new tool might not be easier.

But if Rome booms, it'll truly benefit the community.


> convincing people who are honestly scared to touch their Webpack config to switch to a new tool might not be easier.

If someone can offer me a convincing alternative to webpack I will switch in a heartbeat.


Same, but the whole point is to switch you away from Webpack, Eslint, Prettier, Jest... I truly hate Webpack, but I do love Jest.

Basically, Rome will most likely need something like a TypeScript moment, where it just becomes the new normal for (many) JS projects.


I'm almost convinced by snowpack, especially because it has an escape hatch to webpack. (But that escape hatch is of course yet another opaquely managed webpack config, so hence some of the remaining reservations.)


Parcel is pretty amazing


Agree 100%. Love how it requires no configuration.


There's something that scares me about Rome: its lack of plugins.

I really like Babel because of its plugins. My project, typecheck.macro [1], could not exist without Babel plugins.

How will Rome support compile time transformations like graphql.macro or typecheck.macro?

Compile-time plugins allow JavaScript to evolve super rapidly and make it possible to create frameworks that act as compilers instead of traditional frameworks.

[1] https://github.com/vedantroy/typecheck.macro


I spoke in this post about how rushing into a plugin system hurt the longevity of Babel and its ability to innovate. We aren't going to make the same mistake again. Rome will likely have a plugin system eventually, but what it would look like isn't clear.

This is the first release, and until there's some actual usage there's no real way to realistically predict what sort of things people will feel are missing. You can always supplement your project with multiple linters if you feel like something is currently a blocker.


Without plugins at first, how open are you to contributions that would normally go into a plugin and not into core?

As someone who works on libraries that don't use JSX, and therefore have to rely on compiler and linter plugins to get support (tagged template literals in my case) I worry that React's dominance will mean that it gets a place in Rome, while other systems are locked out, which only increases React's dominance.

Would you consider an internal plugin system as a first step, where plugins have to be part of the codebase, but intentionally get a restricted API surface? This would allow you to try out and refactor the plugin API over time before committing to it publicly.


Depends on what it is, I think. We discussed the idea of "expansion packs", which would enable certain functionality as a sort of "limited config" hack: https://github.com/romefrontend/rome/issues/173

In it I specifically call out the risk of locking out smaller communities that don't have the advantage of community size acting as a forcing function for support. That's the most compelling reason, to me, to have any sort of plugin or custom-rules system in the first place.

We didn't go through with that idea, though, because such rules can just be enabled by default since the patterns they're linting for are unambiguous. I guess that applies to any additional ones too, although so far the way we've approved these sorts of rules has been ad hoc. There's a strong desire though to formalize some "approval process" for lint rules and more typical project decisions.


Isn't this the reason why it is called Rome? In JavaScript history, we are past Babel (and the Greek city states) and a centralized tooling will allow for an unprecedented language economy?

Constantinople and Moscow are waiting, as well as Venice and London. A plugin system that wants to incorporate everything will risk having to maintain Byzantine diplomatic relations. On the other hand, restricting the plugin system to its core will create a local powerhouse that will utterly fail to adapt once new ventures become available.

I am waiting for the TypeScript / JavaScript split.


My guess is "all roads lead to Rome".


It's a good thing. It should get the defaults right first before rushing into plugins. Webpack had everything delegated to plugins without a single sane default for years, which only contributed to its complexity. As the biggest example: it had a several-thousand-word article on consistent caching that used three different plugins, for three or four years, before that became default behaviour in version 5.

I, for one, welcome a future of build tools without plugins.


They are not the same thing. Despite not supporting plugins, I think you can definitely expect Rome to support some form of macros, though it might look different from the existing API in Babel.


Sebastian keeps mentioning how all these different kind of tools could re-use the same infrastructure for the things they all do, but... It's still not quite clear to me what benefit that brings to me as a non-contributor? I can see how it could be beneficial if the entire ecosystem would rally around the same tools and then be able to move faster, but given that that has not yet happened... Why would I use this to lint my code? The nicer error messages?


I don’t work on Rome and haven’t looked at their code much, but if I were to guess: performance and better error messages

If you can do the same work in fewer passes, it will often be faster

So instead of every file change, webpack runs its own parser, then Babel runs its own parser, then ESLint, etc — Rome probably just runs the parser once and sends the AST to other plugins

And then if you invest in making the parsing stuff really good, that makes the linter better, the Babel-equivalent better, etc


Two immediate potential benefits:

- Only one tool to configure, instead of many

- Many tools revolve around parsing your code and generating an AST, then manipulating / processing that AST (Prettier, ESLint, Babel, TS, Webpack, ....). That's a lot of extra processing that has to be done. In theory, doing all that processing _once_, and reusing the AST, would potentially run a lot faster.
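To make that concrete, here is a toy sketch (not Rome's actual API, just the shape of the idea): parse once, then feed the very same AST to both a lint pass and a print pass, instead of each tool re-parsing the source.

```javascript
// Toy parser for expressions like "a + b". Real tools build a much
// richer AST, but the point is that parsing happens exactly once.
function parse(source) {
  const [left, op, right] = source.split(" ");
  return { type: "BinaryExpression", op, left, right };
}

// A "linter" pass that consumes the AST...
function lint(ast) {
  return ast.op === "==" ? ["prefer === over =="] : [];
}

// ...and a "printer" pass that reuses that same AST, so the
// source text is never parsed a second time.
function print(ast) {
  return `${ast.left} ${ast.op} ${ast.right}`;
}

const ast = parse("x == y");
console.log(lint(ast));  // one diagnostic
console.log(print(ast)); // "x == y"
```

In the ESLint + Prettier + Babel world each of those passes would have run its own parser over the file first.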


Also, fixes in the shared libs would be directly available to the tools built on top.


I'm not an expert in the internals of these things, but presumably Prettier, Webpack, TypeScript, Babel, Istanbul, and ESLint each build their own separate, private ASTs from scratch. Maybe a project like this could pipeline a sequence of ASTs one to the next, and skip some of that overhead?


Thanks for the answers everyone. Just wanted to point out this comment by Sebastian that also perfectly explains it: https://news.ycombinator.com/item?id=24096744

(And it would be good to put something to that extent prominently on the website, if it isn't yet.)


Why a focus on frontend only? I am using Typescript, Babel, Jest and a bundler for any app/library on the frontend/backend. If you got to do it, do it all the way! I can already see the GH issues, “sorry, this is a backend use case”. So now my tooling would end up being more complex with the addition of another tool since it will never support all my use cases. Seemed promising but this alone makes me question its utility.


TypeScript is a frontend language. "Frontend" is the category of languages we plan on supporting. It doesn't say anything about your usage of those languages. Where you run the code does not matter.


Could you expand on why you think TypeScript is a "front-end" language? I was under the impression that it just compiles to Javascript and while Javascript used to be a front-end language, it's nowadays bravely used for backend projects as well, for some reason.


I want to somehow make the point, which seems to be somewhat out of fashion this season (it was a focus a few years ago), that one of the features of Javascript is that it's universal/isomorphic. And that the average computer (and even smartphone) is very powerful, so if storage is abstracted, the distinction blurs.

To me, this means features like browser based tools, which are gradually becoming more user friendly, don't really need a "server," except perhaps as a way to coordinate and for long term storage. Eventually, I think a non technical user should be able to create content that contains open-ended dynamic formatting and distributed data based views, without a round trip to a server. If they want to dip into arbitrary Javascript, maybe just re-using a module, fine. Ultimately, the same code might be used in the browser as that in a development process. Current markdown tools and notebooks are headed in this direction.

However, trying to use something like mdx-js shows we are far from reasonable bundle sizes when taking this approach. One-tool-per-application might yield the small bundle sizes and dynamism for this goal. Wondering if Rome has this goal in mind. Thanks!


I just last week found out about https://rushstack.io/ you might like it.


I think this has a brighter future than Lerna if you are managing a monorepo. They're betting big on PNPM for the long term.

They also just recently released Heft which is the rushstack repo build/lint/dev-server stack.

It’s a promising project


Totally agree, on all counts. I'm getting back into TypeScript dev and tried Lerna... I couldn't get it working after a week of trying. It probably does work; the lack of tutorials just makes for a really steep learning curve. Then I found a passing mention of Rush and !!! Heft is great. PNPM is great (I hadn't known about that either)


Why start this from scratch? Why not build on top of TypeScript?

I get the single pass rationale but isn't it possible to integrate a bundler into the TypeScript compiler?

I'm just wondering because Microsoft has a decently sized team of paid engineers working on TS for years now - what are the odds an OSS project outperforms them at implementing their own language (which they also evolve with every release)?


The TypeScript compiler API[1] is beta, underdocumented, and not really designed to support use-cases other than working with an abstract AST. It’s OK for transpiling, but would be a bit awkward for writing a linter and not at all suitable for a formatter.

As an example of the sort of problems you might run into with the API in its current state, trying to parse and then pretty-print code can cause comments to vanish[2] under certain circumstances. This is fine for transpiling, but is a major issue when you’re trying to prettify and overwrite the original source file.

[1] https://github.com/Microsoft/TypeScript/wiki/Using-the-Compi... [2] https://github.com/microsoft/TypeScript/issues/39620


I get that - but since TS is also OSS you can fork and upstream; that still sounds like less work than writing everything from scratch.


Why?

TS's and Rome's architectural goals are completely incompatible.

Writing a transpiler is not complicated at all, especially for someone who has written Babel in the past.

On the other hand, if you familiarise yourself with both TS's gigantic codebase and with Rome's goals, you'll quickly see that modifying TS would be a much vaster undertaking than writing it from scratch.

Also if Rome were just an also-ran to replace ESLint/Webpack/Babel/etc, it would be ok to have a less-than-ideal starting point, but it's actually trying to do things differently: better error messages, recoverable parser errors, more fixable linter messages, being faster.

This is the exact case of something that should be written from scratch (or at least use something that makes it possible) rather than building on another technology with completely incompatible goals.

Another example? It took several years for Webpack to achieve tree-shaking that was as good as Rollup first versions, even with more maintainers, more sponsorship and a lot of time. Why? Because Webpack's architecture was completely different.


But you're basically trying to superset the TS compiler. If this doesn't cover TS typechecking then it fails on its promise of a single-pass compiler stack.

Are you suggesting that the TS team is so incompetent or legacy-burdened that their codebase is impossible to extend? What use case of TS would you exclude to arrive at a simpler codebase?

Writing a transpiler that ignores type checks is probably simple but also quite useless for dev workflow - if I need to run a separate type checker might as well run a separate linter and compiler, the tech is established.


> Are you suggesting that the TS team is so incompetent or legacy-burdened that their codebase is impossible to extend?

I'm not suggesting at all that they're incompetent, but you seem to be...

Typescript's goal was not extensibility in the way it would be needed for Rome to use it as its base. There is nothing wrong with that.

Also, it's not "incompetence" to write a codebase that is hard to extend or modify to do something in a completely different way. It's just a difference of priorities.

Of course it would be better to only have easily-modifiable codebases in the world, but this is not possible. Teams might have other conflicting goals.

Don't fool yourself into thinking that a good codebase has every single positive attribute in the world just because it's a good codebase, or because its authors are good coders.

Webpack was harder to modify than creating Rollup from scratch, GCC was harder to modify than it was to create LLVM from scratch, Linux is immeasurably harder to modify than Minix if you want it to do things it was not designed to do. But it doesn't mean that those codebases are worse or better.

--

> What use case of TS would you exclude to arrive at a simpler codebase?

Why would I want to exclude any use case from Typescript? I actually mentioned Rome's raison d'être is to have more things than other transpilers.

It's a matter of architecture and early choices, of how the parser/ast/pretty-printer were implemented. Not really a matter of features we can remove.


The TypeScript plugin for Babel already works this way: it strips out the type annotations to make the code understandable to plain JS interpreters, nothing more. Most people do their type checking through their editor or some other tooling. The coupling of building and checking in a single step, in the case of a dynamic language, is really just a matter of convenience for the implementers (as well as historical tradition). Unlike C, in TypeScript type information is not actually needed to output runnable code.
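As a very naive illustration of what "stripping" means here: the annotations are deleted without ever being consulted. Babel's real preset does this on a full AST; this regex hack, purely for illustration, only handles trivial variable declarations.

```javascript
// NOT Babel's real transform -- a naive sketch of type stripping.
// It removes a simple ": Type" annotation before an initializer;
// note the type information never affects the emitted code.
function stripSimpleAnnotations(tsSource) {
  return tsSource.replace(/:\s*[\w<>\[\]|]+(?=\s*=)/g, "");
}

console.log(stripSimpleAnnotations("const n: number = 1;"));
// -> "const n = 1;"
console.log(stripSimpleAnnotations("let s: string = 'a';"));
// -> "let s = 'a';"
```

Because output never depends on whether the types actually check, emit can be fast and typechecking can happen elsewhere.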


I know how it works and I work with TS on a daily basis. The problem is you can't rely on editor type checking because editors (IDEA and the TS language server) only type-check the currently open file. So as soon as your module is referenced from somewhere else, you won't get editor notifications if your changes broke that module unless you open that file. Relying on this alone is useless on large projects.

I would argue editors already do a better job at linting and lint rules are mostly file local anyway so the "single pass lint on each build" is not that useful.

Having type checking as a part of the compile process is the best way to leverage typescript (you can fork the checking process so it doesn't block output, for a faster reload experience)


It's entirely possible then to use tsc as a "global" typechecker for the purposes of your dev workflow, without making it a part of your build process (unless you're doing CI, in which case it could still be a separate, preliminary pass that doesn't have to be bound up in the actual build stage). All I'm saying is, especially for a project like Rome that has so many other irons in the fire, I think it's a fully legitimate decision to focus on builds alone and parse-but-not-typecheck TypeScript code, leaving the latter to projects like tsc that have much bigger teams and already do a great job of it.


But what's the value proposition of this solution then? If I need to run separate processes anyway, what's the benefit over running webpack with TS and a forked checker?


There are lots of use-cases for producing a working build that doesn't necessarily pass type-checks. Active development, for one. I often find myself hacking on something and actively ignoring certain TypeScript errors that I know won't cause breakage, until I'm at a point where I want to start polishing things up. During this time I may need to test out the functioning-if-unsafe iterations, and I don't want to be blocked by errors that I plan to clean up later. Another case is if you're building a clean checkout; if you already know there are no type errors, there's no need to check again. The two are simply separate, if related, concerns.


Better error messages, an error-recoverable parser, faster speed, consolidation into a single tool, additional ability to lint/pretty-print/minify/test instead of just transpiling and bundling, having zero dependencies, having a smaller codebase.

Maybe those things are not important for you, but they are for the Rome author and for many people in the community.

Also with brundolf's solution you can run the TSC typechecker (but not the transpiler/emitter) and Rome in parallel, making good use of modern multi-core processors. With Webpack+TSC you'll have to typecheck, transpile and bundle all in series. This might be very significant.
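A toy timing model of that series-versus-parallel point (the numbers are invented; only the shape matters):

```javascript
// Hypothetical per-rebuild costs in ms -- made-up numbers.
const typecheckMs = 300;          // e.g. tsc --noEmit in another process
const transpileAndBundleMs = 200; // a types-ignoring transpiler/bundler

// Webpack + TSC style: typecheck, transpile and bundle run in series.
const seriesMs = typecheckMs + transpileAndBundleMs;

// Parallel style: bundle while tsc checks in a separate process, so
// wall-clock time is only the slower of the two.
const parallelMs = Math.max(typecheckMs, transpileAndBundleMs);

console.log({ seriesMs, parallelMs }); // { seriesMs: 500, parallelMs: 300 }
```

The larger the typecheck cost grows relative to emit, the bigger the win from decoupling the two.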


Also, the ball has been in TypeScript's court for a while to provide a fast, easy-to-use typecheck-only API that receives an AST and can be used by other tools that already have parsers.

If that existed, it would allow TypeScript to be embeddable, and Rome could use it.

People have been asking for it for half a decade.


Sure, but they may well have looked at the state of it and concluded that it would need to be rewritten to get it to do what they needed—in which case doing that in a new project is actually less effort, since you don’t need to support the existing uses. (I agree it’d be interesting to hear from the creators, since this is a somewhat unusual result.)


I think the odds are pretty good. The TypeScript compiler's performance is not optimal: https://twitter.com/evanwallace/status/1275945317045657602. Microsoft's TypeScript team has a lot of priorities and other things likely take precedence over performance work.

Also it's pretty trivial to build a faster compiler if you don't have to worry about type checking. You can still have type checking with a bundler that ignores types by running the official TypeScript type checker in parallel with your bundler.


But without typechecking I still have to run a separate tool during development (which is where single pass performance matters the most to me) - so suddenly it's not a single-parse all-in-one solution.


>what are the odds an OSS project outperforms them at implementing their own language?

A lot more than you imagine. Esbuild can already transpile 100x faster than TypeScript. (Note: esbuild does not typecheck.)

If they use similar techniques to esbuild, they can easily surpass the TypeScript compiler in all functional areas.


esbuild does not type check. TypeScript type checks and then emits code. Better performance can be achieved by just stripping types and emitting code.

It would be great to improve the performance of type checking, but that’s more complicated.

Some projects are trying: the Deno maintainers have asserted that TypeScript should be rewritten in Rust for better performance, and swc is trying to do that - https://github.com/swc-project/swc/issues/571


I just don't see the use case for RomeJS and I'm not sure people will want to wait several years for this project to mature. esbuild seems to hit the sweet spot for ESNext/TS/JSX/bundling - it's 100 times faster than current tooling and works today with next to no configuration.


Is there any chance that Rome can be based on Deno in the future?

It seems to me Deno has built with browser-like environments in mind, thus the environment would be more unified.

Node.js is pretty much a different beast. If we want to have an ambitious project like Rome, I would love to have it based on an ambitious foundation.


I asked this question, as well, and the answer I got is that Deno already does these things.


I’m not great at predicting how tech will evolve but I have a feeling that this will be built in a day.


>Regular double quoted strings can have newlines.

This doesn't feel right. There are already backticks (`) for multi-line strings. Why not use them? Changing the meaning of " breaks the ability to copy-paste an RJSON file into regular source code.
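The compatibility concern holds for standard JSON at least: JSON.parse rejects an unescaped newline (like any raw control character) inside a double-quoted string, so an RJSON file using them could not be pasted into a plain .json file or a JSON.parse call.

```javascript
// An actual newline character embedded in the quoted string value:
const withRawNewline = '{"s": "line1\nline2"}';

try {
  JSON.parse(withRawNewline);
  console.log("parsed (unexpected)");
} catch (e) {
  // Standard JSON forbids unescaped control characters in strings.
  console.log("standard JSON rejects it:", e instanceof SyntaxError); // true
}
```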


Can I ask how long it is probably going to take for Rome to provide functionality to replace the tools it aims to replace, in a stable release? Rough estimate


A few questions off the top of my head when I consider changing the toolchain:

- does it handle images imports

- does it handle scss imports

- if not, does it require css-in-js (and if so which one)

- does it handle shaders imports (and other arbitrary dependencies that need custom processing to get embedded in the bundle)

- does it generate SRI hashes for stylesheets and scripts

- can it generate web workers

- can it generate node bundles or only browser bundles

- can it split chunks

- do you still get TS intellisense in VSCode

- can it be used with Rust and/or Webassembly


It's a linter so you don't need to ask any of those questions. Future usage as a bundler isn't dependent on any decision to use it as a linter.


It does matter if the only value of the linter at the moment is that it will be part of a set of tools to replace the entire toolchain.

It already took forever for the ecosystem to get JSHint, TSLint and others to converge into ESLint.


Please integrate the features of dependency cruiser [1] while you are at it. One of the most useful tools of its kind.

[1] https://www.npmjs.com/package/dependency-cruiser



That helmet is Greek. It was a placeholder. There's a pretty good GH issue[0] for creating the new one.

[0]: https://github.com/romefrontend/rome/issues/1


I was sure I’d heard of this before, but it looked different. I guess that is why.


I feel like Google could have had this with their Closure Compiler/Stylesheets/Templates/Library, but they never dedicated any resources to making the APIs more ergonomic, and they have historically been poorly documented as well. Not to mention that, for one reason or another, the Compiler doesn't do automatic dependency traversal. Because the APIs are not intuitive or well documented, they lost mindshare.

Like many great Google projects it just never got fully realized, which is a shame :(


Closure's Achilles' heel is that it's written in Java. It's well-integrated into the stack at Google, but independent JS developers are often skeptical of tools that require more than `npm install`. It has recently been cross-compiled to JavaScript, though, so it is now available via npm.


Very happy to see that this is no longer a Facebook project, which I believe it was just a few months ago. I’d bet there’s a good story behind that.

Let’s liberate React next?


Sebastian started it, and it followed him when he left:

https://romefrontend.dev/blog/2020/08/08/introducing-rome.ht...


Can someone ELI5 what kind of linting is required for transpiled code generated by the TS compiler?


Have been watching Rome for a while and can not wait to use it, especially since it's beta time.

It took forever to get eslint/prettier/webpack etc to work in sync, and if there is one tool that can do it all, I'm all for it.


I feel like new tools for JS should ideally be written in a compiled language like Rust, just the performance benefits alone are making me use SWC rather than some other tools for part of my TypeScript toolchain.


That was part of the impetus behind ReasonML IIRC, JS tooling in OCaml-ish. Not sure if the performance benefits translated though.


They absolutely did. Compilation happens in a fraction of a second, it is fun to use. Refrain if you have to use Typescript at work, it will spare you a lot of frustration.


I use ReasonML a bit, but I don't think the comparison is exactly the same as I'm saying. ReasonML has JS-like tools, such as esy in place of npm. I'm saying JS tools should be made in languages like ReasonML or Rust: fast, efficient, compiled languages.


I find this interesting. One of the things I like about languages like Go, Dart, and Rust is their robust tooling.


Strangely there's no reference to XKCD yet. Here you go: https://xkcd.com/927/


I like this idea, I’m tired of configuring tools that I use together to know about one another. I don’t need most of the notional power that arrangement supposedly can offer, and configuring these tools has only gotten more bewildering. Tired of screwing around with an endless array of plugins. Many of us build react apps, I think it’s worth having a toolchain that caters to that.

I care a lot more about dev experience and speed than “one tool to do one job”. The “job” is to handle the code I’m working with in a way that isn’t as frustrating as our current tools.


[flagged]


That’s the same question I had when I read your comment!


Ugh. I cannot stand names based on ancient things or myths. They remind me of the early days of programming when systems were named Neptune, Iapetos, or Cronus in order to make themselves sound bigger or larger than life.


Rome is the current capital of Italy... not a mythical or ancient thing.


However, it is a little bit weird to name a programming tool after a country's capital city. Are we gonna get the London linter next? Or the Washington DC web framework?


There is also Istanbul, a test coverage tool whose CLI is named nyc.


There is a web framework called Chicago Boss already.


Roma is the current capital of Italy. Rome was an ancient city.


The city was Roma in Italian and Latin. In English it is Rome in both the ancient and modern sense - if such a distinction makes sense.

https://www.merriam-webster.com/dictionary/Rome

https://www.online-latin-dictionary.com/latin-english-dictio...


And what’s wrong with the early days? Great things were accomplished back then.


I like the name. It evokes "all roads lead to Rome"

Although "Borg" might be more accurate.


I thought it was a reference to "Rome wasn't built in one day". As in, the creators know that this is one hell of an ambitious project.


"All roads lead to Rome" is the reference


It's both.



