Unlike many other languages, R has a native/built-in tabular data structure. So when your data have a tabular structure, R is by far the best glue for building pipelines between external libraries. If the data fits in RAM, it literally never has to leave the data.table object for the whole pipeline, including all the cleaning and transformations.
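To make that concrete, here's a minimal sketch of such a pipeline; the file names and columns (transactions.csv, amount, date, region) are invented for illustration:

```r
library(data.table)

dt <- fread("transactions.csv")                      # load straight into a data.table
dt <- dt[!is.na(amount)]                             # clean: drop incomplete rows
dt[, month := format(as.Date(date), "%Y-%m")]        # transform: add a column by reference
out <- dt[, .(total = sum(amount)), by = .(region, month)]  # aggregate
fwrite(out, "monthly_totals.csv")                    # hand the result to the next tool
```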
The only meaningful alternative I see is Python with maybe Polars or DuckDB.
DuckDB is great for medium data: too big for memory, but small enough to fit on local storage. It's extremely fast at loading data and supports a wide range of storage backends. It also integrates really well with R and can speed up certain queries considerably, as long as they can be translated into SQL the DuckDB engine understands.
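For instance, with the duckdb R package you can query a Parquet file straight from disk without ever loading it fully into memory (the file name and columns here are made up):

```r
library(DBI)
library(duckdb)

# In-memory DuckDB instance; the Parquet file is scanned from disk,
# so it does not need to fit in RAM
con <- dbConnect(duckdb::duckdb())
res <- dbGetQuery(con, "
  SELECT region, AVG(amount) AS avg_amount
  FROM 'transactions.parquet'
  GROUP BY region
")
dbDisconnect(con, shutdown = TRUE)
```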
We use embedded R in production the way some other companies would use Python, and I can say a better compiler would definitely help.
Even if most people use R interactively, having contributors working on the compiler has many positive spillovers for the language.
Also note that the R code running behind the scenes of your scripts (powering the functions of your favourite packages) is quite a different language, one that uses far fewer dynamic features. This is where a better compiler would always be appreciated.
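A rough sketch of the contrast (toy data and invented names): interactive code often leans on eval/parse-style dynamism that a compiler can do little with, while package code tends to be plain functions the compiler can optimize:

```r
df <- data.frame(amount = c(10, 20, 30))  # toy data, purely for illustration

# Interactive style: runtime code construction, opaque to a compiler
eval(parse(text = "summary(df$amount)"))

# Package style: plain, static code that R's bytecode compiler
# (and any future, better compiler) can actually work with
col_summary <- function(data, col) summary(data[[col]])
col_summary(df, "amount")
```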
My first computer too. I got it from my uncle without any manuals, and we spent a whole afternoon typing in the code of a game from a magazine, letter by letter. We thought we would just press the "Start" function key on the right side. To our surprise, sadly, absolutely nothing happened! A few hours later my dad came home, and it took him only a few guesses to finally type "RUN" into the console...
There are still several areas where R beats Python: tabular data crunching, data analysis (plotting, stats), and finance (econometrics, etc.), but the gap is narrowing.