
Rome: A Linter for JavaScript and TypeScript - acemarke
https://romefrontend.dev/blog/2020/08/08/introducing-rome.html
======
rattray
I feel so torn on this project. On the one hand, I want to root for
sebmarkbage, who has done so much for the field.

On the other, as someone who used to do some "JS platform" work at a tech
company, I really don't want Rome to catch on. Yet Another Standard is really
painful for the ecosystem, and while it's obnoxious that you need so many
tools, at least we've finally settled (mostly) on good answers for each
vertical. TS, ESlint, Prettier, Webpack, Babel.

Now, would it be nice if there were an underlying engine/server that all tools
could use to share an AST without reparsing everything? Sure, I guess. Would
it also be nice to have one tool that wraps the others, at least when you're
getting started? Yes, though I _think_ create-react-app has blazed that trail
pretty well.

The thing is, all those tiny little details from Prettier and ESLint and
Webpack really matter – Rome will take years to achieve the number of lints
that ESLint has, and even more years for the community to agree on which ones
matter. Similarly, every corner-case of prettifying will have to be considered
anew, and every edge case of bundling rebuilt or reimagined.

I love to see a talented person take on something insanely ambitious, so I
wish him luck and hope he proves me wrong. But if Rome succeeds, it'll be a
big pain for a lot of people as they try to port everything over, and for the
community as they grapple with dueling standards for how things should be
done.

~~~
sebastianmck
The ecosystem is already fragmented and dueling. When it says it replaces
those tools it means it aims to replace the functionality of those tools, not
make them obsolete.

Rome being successful doesn't mean eliminating those tools, it's providing
something valuable and giving people an option. If it's not for you then
that's okay. Rome is early and is still evolving, including possible areas for
extensibility.

I think a lot of people don't realize the sort of capabilities that they're
missing out on by not having their tools work together, or by sticking with
old tooling that cannot innovate for legacy reasons (like Babel). I don't
think it's harmful to the ecosystem to advocate for more consolidation,
especially around tooling that not a lot of people like to deal with, or that
has few maintainers in the first place.

I also think you have me (Sebastian McKenzie) confused with Sebastian Markbage
from the React team.

~~~
rattray
Hey there – I do mix the two of you up, but Babel is nothing to scoff at. I've
used its internals in the past, and read a good chunk of its source code, so I
think I've seen your blames all over. I'm quite appreciative of your past
contributions! (and his)

Reading your comment, it sounds like you expect some people might use Rome for
some things (say, linting and compilation) and preexisting tools for others
(say, formatting and bundling). Is that your intent?

~~~
sebastianmck
Thanks! Yeah, it is, at least for a while, since it will take a very long
time to reach expected maturity. You can adopt as many or as few pieces as
you want. The idea is that once you adopt one of the "tools", you can use the
others and reuse the exact same configuration, and ideally you shouldn't even
need to do anything else.

i.e. For linting we also do dependency verification. So you might need to
configure Rome if you put your dependencies in some weird non-standard place.
Once we open bundling up, we'll already know how to resolve everything.

Each part should stand on its own. It's not as if the linter we've released
is worse, and the only selling point is that it's going to be part of a larger
suite. It legitimately has features that separate it from the alternatives,
e.g. an extreme focus on useful error messages, powerful autofixes (that
operate on an AST rather than insert strings like ESLint), proper caching for
even better performance (ESLint doesn't offer a good solution here), etc.
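To illustrate that distinction, here is a purely hypothetical sketch (not real ESLint or Rome API): a string-based fixer splices replacement text into the source by character range, while an AST-based fix changes a structured node and reprints it, so the output is always syntactically consistent.

```javascript
// Hypothetical sketch of the two autofix styles. Neither function is a
// real ESLint or Rome API, and the "AST" below is a hand-made stand-in.

const source = "x == null";

// String-based fix: splice replacement text into the source by range.
function stringFix(src, start, end, text) {
  return src.slice(0, start) + text + src.slice(end);
}

// AST-based fix: mutate a structured node, then print the whole tree.
function astFix(ast) {
  ast.operator = "==="; // the fix is a data change, not a text splice
  return `${ast.left} ${ast.operator} ${ast.right}`;
}

console.log(stringFix(source, 2, 4, "===")); // x === null
console.log(astFix({ left: "x", operator: "==", right: "null" })); // x === null
```

The string approach depends on byte offsets staying valid across overlapping fixes; the AST approach side-steps that class of bug entirely.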

Many are making the mistake of thinking that building Rome is just as time
consuming and resource intensive as rewriting each of the tools it's meant to
replace separately; it's not. Once we've validated the linter we've also
validated the compiler (it's the same thing), our ability to watch files,
analyze dependencies, integrate with your editor, etc. Sharing so much
fundamentally decreases the actual complexity of everything rather than
increasing it.

I think it's also important to note that I personally have written an
extremely small amount of the implemented lint rules. Our API is just easier
to use, and we've focused on setting up the tooling necessary to make writing
them easy (albeit internally, in our repo).

Check out
[https://github.com/romejs/rome/issues/341](https://github.com/romejs/rome/issues/341),
[https://github.com/romefrontend/rome/issues/20](https://github.com/romefrontend/rome/issues/20),
and
[https://github.com/romefrontend/rome/issues/159](https://github.com/romefrontend/rome/issues/159),
if you're interested in seeing how the progress was actually made in the
implementation of those rules. The work was spread out over a really long
time, and if given complete focus and proper coordination (I was not good at
this and kind of let it be organized ad-hoc, which is actually how we got some
amazing contributors), then it could have been completed in a fraction of the
time.

There's been some overfocus on particular things like minimal configuration
and the lack of extensibility but those are not hard requirements and will
evolve over time, particularly as we get feedback and people demonstrate the
requirements and restrictions they're under. The project understandably
involves a lot of hubris, but I do believe that Rome isn't only valuable in
aggregate and will have significant advantages even if you only decide to use
one piece.

~~~
dgoldstein0
Thanks for this clarification. My first reaction to reading the project page
was "damn a tool that takes doing _everything_ to a whole new level". If it's
adoptable piecemeal, there's a good chance that if it becomes popular I may
actually get to work with it someday - but if it were an all or nothing
proposal, in my current job the answer is likely to be nothing due to
challenges integrating with other technologies we use and get value from.

------
swyx
the prevailing consensus esp in the JS world is that 1 tool should do 1 thing.
This is fine under the Unix philosophy, but challenges arise due to the
combinatorial explosion of config needs, bad errors, and overhead from
crossing module boundaries. There are a number of attempts at challenging this
status quo:

\- esbuild (100x faster than webpack, in part due to its focus on doing a
single parse (but also due to shipping a Go binary))

\- Deno

\- Rome

As we consolidate on the jobs to be done we expect out of modern tooling, it
makes sense to do all these in a single pass with coherent tooling. It will
not make sense for a large swathe of legacy setups, but once these tools are
battle tested, they would be my clear choice for greenfield projects.

recommended related reads:

\- [https://medium.com/@Rich_Harris/small-modules-it-s-not-
quite...](https://medium.com/@Rich_Harris/small-modules-it-s-not-quite-that-
simple-3ca532d65de4)

\- (mine) [https://www.swyx.io/writing/js-third-
age/](https://www.swyx.io/writing/js-third-age/)

~~~
paulhodge
After spending way too much time debugging issues with frontend tooling, I am
all on board for this glorious monotool future. We’ve definitely gotten some
progress out of the massively micropackage approach that Webpack/Babel/etc
use, but these days it really feels like we’re passing a complexity limit and
we need a new approach. The fact that CRA will refuse to start up if you have
another version of Babel installed anywhere in the tree is a pretty good clue
that the current approach isn’t scaling.

~~~
austincheney
Yes, time spent on tooling is time not spent solving real problems. The
ability to make unpopular decisions necessary to move a product forward is
what separates an expert from an expert beginner.

------
topicseed
Huge ambitious project and I hope it delivers. If anyone can lead this one to
fruition, it's Sebastian. So this project is in good hands.

Undeniably, it's technically ambitious to build all these pieces under a
single umbrella. And then, convincing people who are honestly scared to touch
their Webpack config to switch to a new tool might not be easier.

But if Rome booms, it'll truly benefit the community.

~~~
Aeolun
> convincing people who are honestly scared to touch their Webpack config to
> switch to a new tool might not be easier.

If someone can offer me a convincing alternative to webpack I will switch in a
heartbeat.

~~~
schwartzworld
Parcel is pretty amazing

~~~
wecryopen
agree 100%. love how it requires no configuration.

------
marmada
There's something that scares me about Rome: its lack of plugins.

I really like Babel because of its plugins. My project, typecheck.macro [1],
could not exist without Babel plugins.

How will Rome support compile time transformations like graphql.macro or
typecheck.macro?

Compile time plugins allow JavaScript to evolve super rapidly & make the
creation of frameworks that act as compilers instead of traditional frameworks
possible.

[1]
[https://github.com/vedantroy/typecheck.macro](https://github.com/vedantroy/typecheck.macro)

~~~
sebastianmck
I spoke in this post about how rushing into a plugin system hurt the longevity
of Babel and its ability to innovate. We aren't going to make the same
mistake again. Rome will likely eventually have a plugin system, but what that
would look like isn't clear.

This is the first release, and until there's some actual usage there's no real
way to realistically predict what sort of things people will feel are missing.
You can always supplement your project with multiple linters if you feel like
it is currently a blocker.

~~~
spankalee
Without plugins at first, how open are you to contributions that would
normally go into a plugin and not into core?

As someone who works on libraries that don't use JSX, and therefore have to
rely on compiler and linter plugins to get support (tagged template literals
in my case) I worry that React's dominance will mean that it gets a place in
Rome, while other systems are locked out, which only increases React's
dominance.

Would you consider an internal plugin system as a first step, where plugins
have to be part of the codebase, but intentionally get a restricted API
surface? This would allow you to try out and refactor the plugin API over time
before committing to it publicly.

~~~
sebastianmck
Depends on what it is, I think. We discussed the idea of "expansion packs"
which would enable certain functionality as a sort of "limited config" hack.
[https://github.com/romefrontend/rome/issues/173](https://github.com/romefrontend/rome/issues/173)

I specifically call out the risk of not allowing smaller communities to grow,
since they don't have the advantage of a community size that acts as a forcing
function for support. It's the most compelling reason to me to even have any
sort of plugin or custom rules system in the first place.

We didn't go through with that idea, though, because those rules can just be
enabled by default since the patterns they're linting for are unambiguous. I
guess that applies to any additional ones, although so far the way we've
approved these sorts of things has been ad hoc. There's a strong desire,
though, to formalize some "approval process" for lint rules and more typical
project decisions.

------
Vinnl
Sebastian keeps mentioning how all these different kinds of tools could
re-use the same infrastructure for the things they all do, but... it's still
not quite clear to me what benefit that brings to me as a non-contributor. I
can see how it could be beneficial if the entire ecosystem were to rally
around the same tools and then be able to move faster, but given that that has
not yet happened... why would I use this to lint my code? The nicer error
messages?

~~~
acemarke
Two immediate potential benefits:

\- Only one tool to configure, instead of many

\- Many tools revolve around parsing your code and generating an AST, then
manipulating / processing that AST (Prettier, ESLint, Babel, TS, Webpack,
....). That's a lot of extra processing that has to be done. In theory, doing
all that processing _once_, and reusing the AST, would potentially run a lot
faster.

~~~
k__
Also, fixes in the shared libs would be directly available to the tools built
on top.

------
iRomain
Why a focus on frontend only? I am using TypeScript, Babel, Jest and a bundler
for any app/library on the frontend/backend. If you're going to do it, do it
all the way! I can already see the GH issues: “sorry, this is a backend use
case”. So now my tooling would end up being more complex with the addition of
another tool, since it will never support all my use cases. Seemed promising,
but this alone makes me question its utility.

~~~
sebastianmck
TypeScript is a frontend language. "Frontend" is the category of languages we
plan on supporting. It doesn't say anything about your usage of those
languages. Where you run the code does not matter.

~~~
capableweb
Could you expand on why you think TypeScript is a "front-end" language? I was
under the impression that it just compiles to Javascript and while Javascript
used to be a front-end language, it's nowadays bravely used for backend
projects as well, for some reason.

------
rubber_duck
Why start this from scratch? Why not build on top of TypeScript?

I get the single pass rationale, but isn't it possible to integrate a bundler
into the TypeScript compiler?

I'm just wondering because Microsoft has a decently sized team of paid
engineers who have been working on TS for years now - what are the odds an OSS
project outperforms them at implementing their own language (which they also
evolve with every release)?

~~~
wolfgang42
The TypeScript compiler API[1] is beta, underdocumented, and not really
designed to support use-cases other than working with an abstract AST. It’s OK
for transpiling, but would be a bit awkward for writing a linter and not at
all suitable for a formatter.

As an example of the sort of problems you might run into with the API in its
current state, trying to parse and then pretty-print code can cause comments
to vanish[2] under certain circumstances. This is fine for transpiling, but is
a major issue when you’re trying to prettify and overwrite the original source
file.

[1] [https://github.com/Microsoft/TypeScript/wiki/Using-the-
Compi...](https://github.com/Microsoft/TypeScript/wiki/Using-the-Compiler-API)
[2]
[https://github.com/microsoft/TypeScript/issues/39620](https://github.com/microsoft/TypeScript/issues/39620)

~~~
rubber_duck
I get that - but since TS is also OSS you can fork and upstream; that still
sounds like less work than writing everything from scratch.

~~~
rmthw
Why?

TS's and Rome's architectural goals are completely incompatible.

Writing a transpiler is not complicated at all, especially for someone who has
written Babel in the past.

On the other hand, if you familiarise yourself with both TS's gigantic
codebase and with Rome's goals, you'll quickly see that modifying TS would be
a much vaster undertaking than writing it from scratch.

Also if Rome were just an also-ran to replace ESLint/Webpack/Babel/etc, it
would be ok to have a less-than-ideal starting point, but it's actually trying
to do things differently: better error messages, recoverable parser errors,
more fixable linter messages, being faster.

This is the exact case of something that should be written from scratch (or at
least use something that makes it possible) rather than building on some other
technology with completely incompatible goals.

Another example? It took several years for Webpack to achieve tree-shaking
that was as good as Rollup's first versions, even with more maintainers, more
sponsorship and a lot of time. Why? Because Webpack's architecture was
completely different.

~~~
rubber_duck
But you're basically trying to superset the TS compiler. If this doesn't cover
TS typechecking then it fails on its promise of a single pass compiler stack.

Are you suggesting that the TS team is so incompetent or legacy-burdened that
their codebase is impossible to extend? What use case of TS would you exclude
to arrive at a simpler codebase?

Writing a transpiler that ignores type checks is probably simple, but also
quite useless for a dev workflow - if I need to run a separate type checker I
might as well run a separate linter and compiler; the tech is established.

~~~
brundolf
The TypeScript plugin for Babel already works this way: it strips out the type
annotations to make the code understandable to plain JS interpreters, nothing
more. Most people do their type checking through their editor or some other
tooling. The coupling of building and checking in a single step, in the case
of a dynamic language, is really just a matter of convenience for the
implementers (as well as historical tradition). Unlike C, in TypeScript type
information is not actually needed to output runnable code.
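The "strip types, don't check them" idea can be shown with a deliberately toy sketch. Real tools (Babel's TypeScript preset, or the TypeScript compiler's own transpile-only mode) use a full parser; the regex below handles only a trivial parameter annotation and exists purely to show that emitting runnable JS needs no type information at all:

```javascript
// Toy illustration only: real type-stripping requires a full parser,
// and this regex covers just a simple `name: Type` annotation.
function stripTrivialAnnotation(tsSource) {
  return tsSource.replace(/(\w+)\s*:\s*\w+/g, "$1");
}

const ts = "function double(n: number) { return n * 2; }";
const js = stripTrivialAnnotation(ts);
console.log(js); // function double(n) { return n * 2; }

// The emitted code runs even though nothing ever checked the types:
console.log(eval(`(${js})`)(21)); // 42
```

This is the sense in which type-checking and code emission are separable concerns for TypeScript: the annotations are erasable syntax, not inputs to codegen.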

~~~
rubber_duck
I know how it works and I work with TS on a daily basis - the problem is you
can't rely on editor type checking, because all editors (IDEA and the TS
language server) do type checking on the currently open file. So as soon as
your module is referenced from somewhere else, you won't get editor
notifications if your changes broke that module unless you open that file -
using this alone is useless on large projects.

I would argue editors already do a better job at linting and lint rules are
mostly file local anyway so the "single pass lint on each build" is not that
useful.

Having type checking as a part of the compile process is the best way to
leverage TypeScript (you can fork the checking process so it doesn't block
output, for a faster reload experience).

~~~
brundolf
It's entirely possible then to use tsc as a "global" typechecker for the
purposes of your dev workflow, without making it a part of your build process
(unless you're doing CI, in which case it could still be a separate,
preliminary pass that doesn't have to be bound up in the actual build stage).
All I'm saying is, especially for a project like Rome that has so many other
irons in the fire, I think it's a fully legitimate decision to focus on builds
alone and parse-but-not-typecheck TypeScript code, leaving the latter to
projects like tsc that have much bigger teams and already do a great job of
it.

~~~
rubber_duck
But what's the value proposition of this solution then? If I need to run
separate processes anyway, what's the benefit over running webpack with TS and
a forked checker?

~~~
brundolf
There are lots of use-cases for producing a working build that doesn't
necessarily pass type-checks. Active development, for one. I often find myself
hacking on something and actively ignoring certain TypeScript errors that I
know won't cause breakage, until I'm at a point where I want to start
polishing things up. During this time I may need to test out the
functioning-if-unsafe iterations, and I don't want to be blocked by errors
that I plan to
clean up later. Another case is if you're building a clean checkout; if you
already know there are no type errors, there's no need to check again. The two
are simply separate, if related, concerns.

------
swimmingly
I just don't see the use case for RomeJS, and I'm not sure people will want to
wait several years for this project to mature. esbuild seems to hit the sweet
spot for ESNext/TS/JSX/bundling - it's 100 times faster than current tooling
and works today with next to no configuration.

------
namelosw
Is there any chance that Rome can be based on Deno in the future?

It seems to me Deno has been built with browser-like environments in mind, and
thus the environment would be more unified.

Node.js is pretty much a different beast. If we want to have an ambitious
project like Rome, I would love to have it based on an ambitious foundation.

~~~
ARussell
I asked this question, as well, and the answer I got is that Deno already does
these things.

------
yoav
I’m not great at predicting how tech will evolve but I have a feeling that
this will be built in a day.

------
toohotatopic
>Regular double quoted strings can have newlines.

This doesn't feel right. There are already backticks (`) for multi-line
strings. Why not use them? Changing the meaning of " breaks the ability to
copy-paste an rjson file into regular source code.

------
userSumo
Can I ask how long it is probably going to take for Rome to provide the
functionality to replace the tools it aims to replace, in a stable release?
Rough estimate?

------
wildpeaks
A few questions off the top of my head when I consider changing the toolchain:

\- does it handle image imports

\- does it handle scss imports

\- if not, does it require css-in-js (and if so which one)

\- does it handle shader imports (and other arbitrary dependencies that need
custom processing to get embedded in the bundle)

\- does it generate SRI hashes for stylesheets and scripts

\- can it generate web workers

\- can it generate node bundles or only browser bundles

\- can it split chunks

\- do you still get TS intellisense in VSCode

\- can it be used with Rust and/or Webassembly

~~~
sebastianmck
It's a linter so you don't need to ask any of those questions. Future usage as
a bundler isn't dependent on any decision to use it as a linter.

~~~
wildpeaks
It does matter if the only value of the linter at the moment is that it will
be part of a set of tools to replace the entire toolchain.

It already took forever for the ecosystem to get JSHint, TSLint and others to
converge into ESLint.

------
forty
Please integrate the features of dependency cruiser [1] while you are at it.
One of the most useful tools of its kind.

[1] [https://www.npmjs.com/package/dependency-
cruiser](https://www.npmjs.com/package/dependency-cruiser)

------
smhmd
I liked the old logo[0] better than the new one[1].

[0]:
[https://raw.githubusercontent.com/romefrontend/rome/1d583050...](https://raw.githubusercontent.com/romefrontend/rome/1d58305097f768b320150bb9bd293c7e74f87b00/assets/logo_with_text.png)

[1]:
[https://raw.githubusercontent.com/romefrontend/rome/main/ass...](https://raw.githubusercontent.com/romefrontend/rome/main/assets/JPG/logo.jpg)

~~~
tbeseda
That helmet is Greek. It was a placeholder. There's a pretty good GH issue[0]
for creating the new one.

[0]:
[https://github.com/romefrontend/rome/issues/1](https://github.com/romefrontend/rome/issues/1)

------
no_wizard
I feel like Google could have had this with their Closure
Compiler/Stylesheets/Templates/Library, but they never dedicated any resources
to making the APIs more ergonomic; they’ve also been historically poorly
documented. Not to mention that, for one reason or another, the Compiler
doesn’t do automatic dependency traversal, and because the APIs are not
intuitive or well documented, they lost mindshare.

Like many great Google projects it just never got fully realized, which is a
shame :(

~~~
bsimpson
Closure's Achilles' heel is that it's written in Java. It's well-integrated
into the stack at Google, but independent JS developers are often skeptical of
tools that require more than `npm install`. It's recently been cross-compiled
to JavaScript, so it is now available via npm.

------
tobr
Very happy to see that this is no longer a Facebook project, which I believe
it was just a few months ago. I’d bet there’s a good story behind that.

Let’s liberate React next?

~~~
bsimpson
Sebastian started it, and it followed him when he left:

[https://romefrontend.dev/blog/2020/08/08/introducing-
rome.ht...](https://romefrontend.dev/blog/2020/08/08/introducing-
rome.html#history)

------
dandare
Can someone ELI5 what kind of linting is required for the transpiled code
generated by the TS compiler?

------
ausjke
Have been watching Rome for a while and can not wait to use it, especially
since it's beta time.

It took forever to get eslint/prettier/webpack etc. to work in sync, and if
there is one tool that can do it all, I'm all for it.

------
satvikpendem
I feel like new tools for JS should ideally be written in a compiled language
like Rust; the performance benefits alone are making me use SWC rather than
some other tools for part of my TypeScript toolchain.

~~~
readarticle
That was part of the impetus behind ReasonML IIRC, JS tooling in OCaml-ish.
Not sure if the performance benefits translated though.

~~~
scns
They absolutely did. Compilation happens in a fraction of a second; it is fun
to use. Refrain if you have to use TypeScript at work; it will spare you a lot
of frustration.

------
LibertyBeta
I find this interesting. One of the things I like about languages like Go,
Dart, and Rust is their robust tooling.

------
SergeAx
Strangely there's no reference to XKCD yet. Here you go:
[https://xkcd.com/927/](https://xkcd.com/927/)

------
draw_down
I like this idea, I’m tired of configuring tools that I use together to know
about one another. I don’t need most of the notional power that arrangement
supposedly can offer, and configuring these tools has only gotten more
bewildering. Tired of screwing around with an endless array of plugins. Many
of us build react apps, I think it’s worth having a toolchain that caters to
that.

I care a lot more about dev experience and speed than “one tool to do one
job”. The “job” is to handle the code I’m working with in a way that isn’t as
frustrating as our current tools.

------
Konohamaru
Ugh. I cannot stand names based on ancient things or myths. They remind me of
the early days of programming, when systems were named Neptune, Iapetos, or
Cronus in order to make themselves sound larger than life.

~~~
elondaits
Rome is the current capital of Italy... not a mythical or ancient thing.

~~~
esperent
However, it is a little bit weird to name a programming tool after a country's
capital city. Are we gonna get the London linter next? Or the Washington DC
web framework?

~~~
CURLY3Y
There is also Istanbul, a test coverage tool, whose CLI is named nyc.

