RuboCop is an amazingly powerful tool for refactoring Ruby codebases. The ability to add custom linting rules and associated autocorrectors basically makes it an AST-level find-and-replace tool.
You do need comprehensive testing of course, but that should be table stakes anyway for any large Rails codebase.
RuboCop is awesome. The amount of configuration allows it to be tailored to a company's preferred rules. (RuboCop's defaults do not match the common styles I see in the Ruby on Rails community.)
I really like Rubocop, but it's quite slow and resource intensive. I was trying to set up tooling for someone doing The Odin Project recently and couldn't get Rubocop or Solargraph working in VSCode in their VM, on a pretty high end XPS.
The trick with speeding up RuboCop (and most other linters, for that matter) is to run it as a daemon. There's a rubocop-daemon gem available. Or with Solargraph, RuboCop is kept in memory so you only pay the startup cost once. With that setup you can get a lint/fix on save in under 300 ms.
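Newer RuboCop versions (1.31+) also ship a server mode built in, which is the same idea as the daemon gem; roughly (the file path is just an example):

```shell
# Keep a RuboCop process warm so each run skips the boot cost.
rubocop --start-server
rubocop --server app/models/user.rb   # fast: reuses the running process
rubocop --stop-server
```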
Good for them. I have found these warnings and others in Ruby so obnoxious that I started to look for alternatives, both on the command line and as a language. As a result, I no longer support putting warnings into a codebase in this way. I'm sure those who've experienced unwanted output from colors.js might feel similar.
On a brighter note, my ability with awk and sed is much better now, so it's not all bad!
Personally, I'm really happy that 2.7/3.0 is the biggest backwards-incompatible change Ruby has had in years, and it still wasn't really that bad; it was pretty easy to update codebases for. I'm happy that Ruby committers take avoiding breaking changes pretty seriously.
(The deprecation warnings are part of a standard practice of making what breaking changes do occur easier to handle, so I don't think I'd agree with removing them. In this case, GitLab would have had to change their code for 3.0 with or without the warnings in 2.7; the warnings helped them do so reliably. There may be ways to make them less annoying without making them less useful. I do think the breaking changes themselves should be kept as infrequent as feasible.)
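For reference, the breaking change at the center of this is keyword argument separation. A minimal sketch (the method name here is just illustrative) of the pattern the 2.7 warnings pointed at:

```ruby
# Ruby 2.7 warned when a bare hash was passed where keyword arguments
# were expected; in 3.0, positional and keyword arguments are fully
# separated. Spelling out ** is the fix that works on both versions.
def log_message(message, level: :info)
  "#{level}: #{message}"
end

opts = { level: :warn }

log_message("disk full", **opts)  # explicit **: fine on 2.7 and 3.0
```

On 3.0, `log_message("disk full", opts)` without the `**` raises `ArgumentError` instead of merely warning.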
Also in Ruby 2.7.2, these warnings that were introduced in 2.7.0 were suppressed by default after feedback from the community.
One thing to note is that because these warnings are emitted at runtime, you really need a very comprehensive test suite to find them all. Even then, logging such warnings when running in production is probably a good idea.
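A sketch of how that production logging could work, using nothing beyond the standard library: `Warning.warn` is the hook the interpreter calls for every warning it emits, so extending it lets you forward deprecations to a logger (the module and constant names here are made up):

```ruby
# Route interpreter warnings into application logging. Warning.warn is
# the hook Ruby invokes for every emitted warning (Ruby 2.5+).
module DeprecationCapture
  CAPTURED = []  # stand-in for a real logger

  def warn(message)
    CAPTURED << message if message.include?("deprecated")
    super  # keep the default stderr output as well
  end
end

Warning.extend(DeprecationCapture)
```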
Agree the Ruby 2.7 => 3.0 change has been handled quite well by the core Ruby team. The kwargs warning is one part of the larger effort to get GitLab onto Ruby 3.0. You can follow along at https://gitlab.com/groups/gitlab-org/-/epics/5149.
Agreed. One of the things I like about both Ruby and Rails is how well deprecations and changes are managed. Significant changes occur over multiple versions with really good forewarning.
> the deprecation warnings are part of a standard of making what breaking changes do occur easier to handle, I don't think I'd agree with removing them
What's wrong with an error that has the deprecation warning in it, for the version where it breaks? I'm all for being informed; I just don't agree that it has to be baked into an interpreter and shown to me every time I run it. Write a blog post, put it in the changelog or the README. I read those, and I also read the logs, where this output is incredibly irritating.
> What's wrong with an error that has the deprecation warning in, for the version that it breaks?
I don't understand what you mean? Isn't that what happened?
> Write a blog post, put it in the change log or the README
So, a) not everyone will see that, but b) that doesn't help you find the places in your code (possibly including in dependencies) that will break when you upgrade Ruby, which you need to do.
But I agree it can be annoying when there isn't anything you can do about it or you aren't ready to do anything about it. One thing that might help is being able to toggle such warnings on or off. There's a third-party gem that does allow you to do that, albeit a bit hackily.
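Worth noting: since Ruby 2.7.2 the deprecation category can also be toggled without any gem:

```ruby
# Built-in, per-category warning toggle (Ruby 2.7.2+).
Warning[:deprecated] = false  # silence deprecation warnings
Warning[:deprecated] = true   # opt back in
# The same opt-in is available on the command line as `ruby -W:deprecated`.
```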
Anyway, if you move to another language, come back after a couple years and let us know if you were more or less happy with their breaking change scenarios! :)
> > What's wrong with an error that has the deprecation warning in, for the version that it breaks?
> I don't understand what you mean? Isn't that what happened?
There are no errors in a version preceding the change, only warnings. There will be errors and possibly warnings in a version after the change. I don't want to see this stuff prior to upgrade.
> a) not everyone will see that
Why is that the problem of everyone else? They should take their job more seriously and read the change log.
> b) that doesn't help you find anywhere in your code (possibly including in dependencies) that will break when you upgrade ruby, which you need to do.
No, running it under the new version will. If there's a need to be ready ahead of time then, for example, the Ruby team could gate these warnings behind a compile-time flag. They had a compile-time flag to remove the warnings in a previous release, so it can be done.
> Anyway, if you move to another language, come back after a couple years and let us know if you were more or less happy with their breaking change scenarios! :)
I'm enjoying Crystal, for many reasons, and I'll be looking at Elixir too. Not sure I can entirely replace Ruby with them but I don't use it for any new projects now.
Deprecation is an essential part of language and SDK maintenance. If you find a programming language that hasn’t done it, you’ve found a trivial or dead language. (Or one that’s just gone ahead and removed the features without any warning.)
Where did I say anything to the contrary? By all means, deprecate, by all means talk about it. Do not, however, put those messages in the output when I run a program. I don't even want it there on install but at least that's defensible in some way.
Neither of those can tell you where you’re using the now deprecated feature.
I guess you would be happy with a runtime warning that must be ENabled with a flag and documentation of that flag in the release notes.
I fear many users wouldn’t learn about the flag because they don’t read release notes (and yes, that would be a sorry state of affairs), and thus would only learn about it when the feature is removed. In this case their program would error out, but other breaking changes could ‘happily’ keep running with changed behavior.
> Neither of those can tell you where you’re using the now deprecated feature.
Do you not run tests? Know how to grep?
> I guess you would be happy with a runtime warning that must be ENabled with a flag and documentation of that flag in the release notes.
Yes, I would, as someone who reads changelogs, READMEs and blog posts before upgrading the interpreter, and who wants the only output of running the interpreter to be a direct consequence of the code I run, not hand-holding for people who are sloppy.
If individual deprecation warnings have to be specifically enabled, people who don’t read release notes describing them won’t search their source code for possible future problems, and their tests won’t signal any issues before a feature is removed, or (possibly, for other deprecations) behavior is silently changed. Deprecation would be worth zero for such users.
How can you legitimately compare this to the colors.js thing? This is a language evolving, and warning you of something that in Ruby 3 will be a _breaking change_.
This is a good change, too, and if you're just writing small one-off Ruby scripts, why not start writing compliant Ruby?
> How can you legitimately compare this to the colors.js thing?
Are they both annoying? Do plenty of gem authors want to spam me with their thoughts every time their gem is run?
That's how.
> This is a good change, too, and if you're just writing small one-off Ruby scripts, why not start writing compliant Ruby?
The nonsense output from the interpreter isn't a one-off - it certainly isn't from gems - and if my Ruby isn't compliant, then why is the interpreter running it? The warnings are about future changes; I simply read the changelog when something is released. No need to spam me endlessly.
I made the mistake on the command line of wrapping it in an alias and dumping the output to /dev/null, missed something actually important, and wasted time! Won't make that mistake again; your monkey patch sounds like a much better idea.
STDOUT/ERR is just the wrong place for information other than what my code is outputting.