I have no strong opinion on serverless, but I’ve used Rails for 13 years now (in massive multi-million-dollar SaaS products as well as hobby projects) and it still makes me happy. I keep thinking about learning Node and React, but I just can’t be bothered because Rails lets me get things done so quickly, and writing Ruby and RSpec is a joy.
Regarding hosting, I think this is actually a great time to host Rails apps either via Heroku, via the new Digital Ocean k8s app cluster or – and this is my new favourite tool – using Hatchbox.io to deploy apps to Digital Ocean or AWS. It’s a dream!
I’m also a big fan of Rails, but I’ve experienced a lot of problems scaling it. A lot of the problems ultimately came down to the simple fact that Ruby is really, really slow. At a certain scale you end up forced to develop infrastructure on a different stack to keep up with the CPU load. I never ran into that so quickly when building similar systems in Java, Go, and C#.
I wanna say something nice about Rails too, so I’ll say I have never seen a team deliver high-quality web app features as quickly as a well-oiled Rails team. It’s something to behold.
This very much depends on the use case. If you are truly CPU-bound then yes, Ruby is a bad choice. Similarly, if you are IO-bound or otherwise need a lot of concurrency, Ruby is not great. I also think it's problematic for large code bases with hundreds of developers. However, there is a huge swath of web apps and APIs that just need to pull a bit of data from a DB or two, munge it together and send it to the client. Rails is perfectly horizontally scalable and plenty fast for this, even at scale, given Amdahl's law.
That said, there are definitely some self-inflicted wounds people run into here. Of course there are the oft-cited expert-beginner problems like N+1 queries and missing indices, but there's a more subtle issue as well: ActiveRecord itself is so convenient that it discourages writing baseline-efficient code even when you need it. Basically any time you need to process a medium-to-large amount of data, ActiveRecord acts like a sort of super boxed type (to use Java terminology), adding a huge amount of overhead. Yes, it's true that Ruby doesn't even have true primitive types that can achieve the performance of Go or Java, but oftentimes Ruby primitives are plenty fast; you just have to be willing to write a bit of extra code, and ActiveRecord as an ORM has plenty of escape hatches at various levels of granularity to facilitate this (e.g. pluck, find_by_sql, update_all, execute, etc.).
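For illustration, here's a rough sketch of what those escape hatches look like (the obsolete/hidden columns are hypothetical, purely to show the shape of each call):

    # Hypothetical columns, just to illustrate the escape hatches named above.
    Tag.pluck(:name)                                       # SELECT only the name column; no Tag objects built
    Tag.where(obsolete: true).update_all(hidden: true)     # a single UPDATE, no instantiation, skips callbacks
    Tag.find_by_sql(["SELECT * FROM tags WHERE created_at > ?", 1.week.ago])  # hand-written SQL, rows still come back as Tag objects
    ActiveRecord::Base.connection.execute("ANALYZE tags")  # raw SQL straight through the adapter

Each of these bypasses part of the usual ActiveRecord machinery in exchange for a bit more hand-written code.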
I definitely agree that when writing Rails at scale, it's extremely important to view ActiveRecord as just another tool in the toolbox, and not always effective for every use case. I've worked in places that had a dogmatic aversion to ever writing raw SQL, and to a one they always suffered for it in performance sensitive code.
At the places that did handle this well, the "high-performance code" that needs hand-tuned SQL usually turned out to be much smaller than you'd think (a few queries here and there), and ActiveRecord is still great for your simple queries or for smallish tables.
Could you go into more detail about what you see as issues with IO bound tasks?
My understanding is that MRI Ruby provides non-blocking IO operations if you wrap them in a thread, and that it is only CPU-bound tasks that are blocked by the GVL.
Is there some other issue related to that?
(JRuby provides full multi-threading, even for CPU-bound tasks, with no GVL.)
"My understanding is that MRI Ruby provides non-block IO operations if you wrap them in a thread and that it is only CPU bound tasks that are blocked by the GVL."
All IO operations in ruby are subject to the GIL (global interpreter lock).
GVL is an implementation detail rather than a feature of the language (I believe the term Global VM Lock replaced GIL in the standard library, sometime around ruby 2.0ish I think).
JRuby, for example, has no GVL including for CPU based code; everything there can run in parallel.
Even in MRI Ruby though, wrapping IO operations in a thread allows you to release the GVL when the IO operation blocks.
e.g.
    require 'net/http'

    5.times.map do
      Thread.new do
        Net::HTTP.get('example.com', '/index.html')
      end
    end.each(&:join)
This will perform those network requests in parallel rather than sequentially. This is how ActiveRecord can perform asynchronous database calls in parallel on MRI Ruby.
I got that HTTP example from [1], which has a good write-up, but it's also covered in Working with Ruby Threads by Jesse Storimer [2].
I asked the original question because in the Concurrent-Ruby Readme they discuss Ruby's mutable references and the possibility of thread safety issues because of that.[3]
Rails makes it really easy to do something ten different ways to get the same result; unfortunately, most of them aren't the most performant. In my 10 years of building Rails apps of all different sizes, and seeing some very mature apps in production, this is the most common culprit I've seen.
I currently work on a rather large Rails app for a site that most of us here use. A common pattern behind our performance pitfalls is things like this:
Tag.all.map(&:name)
versus
Tag.all.pluck(:name)
Using `#map` will instantiate a `Tag` object for every row and do all the slow(er) metaprogramming to get you the syntactic sugar that makes interacting with an ActiveRecord object a treat. It does all that, and then you only hit `#name`, completely wasting the effort.
`#pluck` will change the SQL from `select * from tags` to `select tags.name from tags` and never instantiate a `Tag` object, instead short-circuiting and directly fetching the resulting SQL rows, which come back as an array. It's something along the lines of:
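(A rough sketch of the idea, not Rails' actual implementation:)

    # Illustrative only: run the narrowed query and return raw column values
    # without ever building Tag instances.
    sql    = Tag.all.select(:name).to_sql                   # SELECT "tags"."name" FROM "tags"
    result = ActiveRecord::Base.connection.select_all(sql)  # an ActiveRecord::Result
    names  = result.rows.map(&:first)                       # plain strings, no ActiveRecord overhead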
Another case (sketched below): the first example loops over the loaded `@user.tags`, loads them if they're not already `#loaded?`, selects the ones where `type == 'LanguageTag'`, only to grab the `#id`.
The second example joins the two resulting SQL statements, calling `#to_sql` on the inner one, building one query out of two.
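A hypothetical pair consistent with that description (not the original code) might look like:

    # First version: loads every tag for the user, then filters and maps in Ruby.
    @user.tags.select { |tag| tag.type == 'LanguageTag' }.map(&:id)

    # Second version: folds the association scope into a single SQL query and plucks the ids.
    Tag.where(type: 'LanguageTag')
       .where("tags.id IN (#{@user.tags.select(:id).to_sql})")
       .pluck(:id)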
Are there times when the first example would be preferred? Yeah, plenty! If your result is already `#loaded?`, then you probably don't need to hit the database again. But for this example and the ones I'm stumbling across while getting our company up to speed on "good ways", these are the commonalities.
Until very recently, the company I work for hadn't put emphasis on real-world Ruby/Rails skills; the attitude was instead "if you can code at a high level in any language, we think you can make reasonable contributions to our codebase." This has led to hiring skilled developers who just don't know that there's a preferred way of doing things in Rails for different contexts.
I would refactor your second example into an EXISTS using Arel, because at best the IN will give the same performance, and at worst it will be significantly slower. There are also particular issues with NOT IN and NULL. This is at least true in PG.
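For what it's worth, a rough sketch of that kind of EXISTS rewrite with Arel; the `taggings` join table, its columns, and the `user` variable are all hypothetical here:

    tags     = Tag.arel_table
    taggings = Arel::Table.new(:taggings)    # hypothetical join table between users and tags

    exists_clause = taggings.project(1)
                            .where(taggings[:tag_id].eq(tags[:id]))
                            .where(taggings[:user_id].eq(user.id))
                            .exists

    Tag.where(type: 'LanguageTag').where(exists_clause).pluck(:id)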
I also deal with a lot of scale, and the issues people have here don't match my reality. I think people hit problems and, rather than looking at what is fundamentally happening with their call patterns, they jump to blaming Rails itself.
Rails does have some specific issues, but you'd have to go pretty deep to hit them, and boot times are terrible.
The general problem you describe - n + 1 queries caused by needless iterating over / instantiating Ruby objects when a simple SQL query would do - is certainly a very common newbie mistake, but it's just that: a newbie mistake.
Confusion here simply shouldn't be a problem for even a moderately seasoned developer, and if they do make such a mistake (because hey, we all make mistakes...) in performance-sensitive code they could quickly recognize it for what it is - a bug - and fix it.
If you're hiring junior developers, on the other hand? Sure! But you should know what you're getting, and your code review / mentoring process should get them straight.
I'm not sure I really understand how this is Ruby's or Rails' fault, unless your premise is "ORMs considered harmful" - in which case, ActiveRecord is far from alone here, and that's a different sort of conversation.
Your examples aren't really doing the same thing in two different ways. `map` (and `select`, `collect`, `each`, and some others) iterate over an enumerable. `pluck` and `select` (with column arguments) are ActiveRecord methods that generate SQL.
It's a good example of choosing magic/brevity over expressiveness. You don't know that Tag.all.map calls SQL because it's not something you explicitly tell it to do. That's the real issue with Ruby & Rails. The magic lets you do some powerful stuff but sometimes it's hard to tell what exactly is happening.
That's why I said they produce the same result instead of saying they do the same thing. If you read further in my example, I mention that they perform very differently under the hood.
That's a consequence of using ORMs. ORMs are a horrible performance mess. Just straight up write SQL. I'm sure Ruby has enough metaprogramming magic to not have to deal with cursors manually.
Would love to hear what you're doing that's got Ruby pegging the CPU. I've run a few rather large Rails deployments over the years and it's rare to crush the app tier first, so I wonder if there's something unique here that we could make Rails better at.
Have you used fragment caching (in "Russian Doll" style)? In my experience, that's the key to making Rails applications fast. Ruby is slow, so there are definitely situations where it's not a good choice, but in many "basic applications" caching makes that irrelevant. Language speed is irrelevant when you don't run much code :-).
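For anyone who hasn't tried it, a minimal sketch of the nested ("Russian doll") style in an ERB view; the `Post`/`Comment` models are hypothetical, and Comment would need `belongs_to :post, touch: true` so the outer fragment expires when a comment changes:

    <% cache post do %>
      <h1><%= post.title %></h1>
      <% post.comments.each do |comment| %>
        <% cache comment do %>
          <p><%= comment.body %></p>
        <% end %>
      <% end %>
    <% end %>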
Rails is great if what you’ve got is read-mostly cacheable content being served up. In an environment with high rates of incoming data, most of which can’t be reliably cached, things get more interesting: you start needing to scale horizontally a lot more and to lean on the database.
Rails handled Black Friday and Cyber Monday traffic for an ecommerce company I used to work for just fine. If you are making money, it's worth the cost of 50 low-end VPSes, autoscaled (we could have done with a lot less, too).
If we were using Java it would have taken us three times the people and four times as long to build the site, and we all would have been laid off.
> Rails handled Black Friday and Cyber Monday traffic for an ecommerce company I used to work for just fine.
That says nothing about the added cost of running inefficient services, which need additional nodes to serve the same requests and thus increase both the operational cost and the risk of providing the same service.
Really? Our frontend servers handle 50 rps, cost $20 each, and are nowhere near peak utilization. If anything ever needs scaling, it's the database. What level of traffic are you talking about?
I don’t want to second-guess technical decisions I know nothing about, but: no, Rails shouldn’t be streaming video. But 2k requests per second with mostly text content being shuffled around sounds absolutely doable with Rails. The cost benefit of easier development shouldn’t be underestimated, either.
But that being said, the primary driver for choosing tools should be what the developers know and how easy it is to find developers who know the technology. If the city you work in mostly has PHP developers, PHP is a great choice. Similarly for Java, Haskell, Lisp, etc. My point is that Rails as a tool is definitely adequate for this problem (minus streaming video...). Look at Shopify, GitHub or any other massive Rails app.
If a Rails back-end streamed video, there wouldn't literally be a loop written in Ruby shoving bytes back and forth in Ruby arrays or buffers. It would be farmed off to something appropriate. You wouldn't necessarily even want that machine doing it, whatever the middleware.
Yes, this matches my experience. I've been doing Rails full-time for about 6 years and any "slowness" has been the result of some problem, not Rails itself. 99% of the time this is an uncached n+1 query situation, or some beastly slow database operation that needs to be moved to background processing, things that would be problems in any framework or even some sort of bare-metal asm solution that nonetheless relies upon an external storage layer. =)
In a CRUD app the Rails layer should be extremely thin and the storage layer(s) should be doing nearly all of the heavy lifting.
There is a level of traffic at which even a "properly" thin Rails layer becomes the bottleneck, relative to many other frameworks.
TechEmpower benchmarks suggest it is around 2,500 requests per second in their "multiple queries" benchmark. In a more real-world scenario that might be 1,000 req/sec or less.
If one is attempting to serve more requests than this per second then yes, perhaps Rails is the bottleneck. Admittedly, Rails' large memory consumption relative to other frameworks means it can be tough (well, technically easy, but expensive) to scale horizontally at this point.
It seems like a lot of my career has been spent optimizing SQL queries in Rails apps... which often just means adding the correct indexes. A lot of Rails devs simply don't know to do that.
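For example, that kind of fix is often just a one-line migration (the table and column names here are hypothetical):

    # Hypothetical migration: many "slow query" fixes boil down to this.
    class AddIndexToTagsOnUserId < ActiveRecord::Migration[6.0]
      def change
        add_index :tags, :user_id
      end
    end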
Slowness seems to be an overblown issue when it comes to the backend. Hardware is cheap compared to developers and app servers scale horizontally by default; if a language is actually slow but otherwise provides good productivity it's almost always cheaper to throw more hardware at it than move to a "faster" language (there's a reason we don't typically write web apps in C).
Isn't plain Rails already the fast, scalable and easy-to-pick-up framework for a Rails developer? Elixir has the hype factor, but it still lacks the library ecosystem of Ruby. It's entirely possible to get to GitHub scale with just Ruby and never have to learn all the gotchas of BEAM and OTP.
I've been using Elixir in production for four years now, in a fairly complex SaaS app, and I can comfortably say I've never had to "learn all the gotchas of BEAM and OTP". IMO Elixir/Phoenix do a really good job of abstracting all of that away from you, while still making it easy enough to tap into their intricacies if needed (e.g. if you want to write a GenServer for something).
Lack of library support was a bit annoying at first. But then I realized that, back when I used to take advantage of existing libraries in Ruby/Rails, in the vast majority of situations I was just utilizing a small portion of those libraries anyway. It ended up easy enough to write my own code for those features.
I have written Rails for 8 years and Elixir for 3.
Phoenix is essentially what Rails should have been.
It has the upsides of Rails without the associated antipatterns and built-in technical debt.
I can also confirm that there isn't a need to learn BEAM gotchas (which I did learn). One of my ex-colleagues is a junior dev who has been using Elixir for 3 years now and hasn't needed those special aspects of BEAM.
As a Ruby expert, Elixir is substantially easier, learning BEAM is a joke (and very instructive) compared to learning the entirety of the Ruby object model.
I can't think of reasons to use Rails nowadays. If you are in the CRUD apps market, use PostgREST or similar products. As for the rest, sure, choose any language you want; the framework is way less relevant (arguably detrimental) at that point.
The industry has proven extensively that for non-trivial projects, Rails is damaging due to ActiveRecord.
Second this. I did rails for years and liked it a lot, but there were pain points, particularly as a project scaled.
Elixir/Phoenix solves a lot of these pain points and is a joy to work with. If you like Ruby/Rails then you already know quite a bit of Phoenix. Schedule yourself 4 hours to follow a tutorial or book.
For the language and the framework the official tutorial/documentation are pretty good.
Also Pragmatic Programmers has very good books on both, written by the creators.
I love Phoenix, but I will disagree with the OP on two points.
Yes, Elixir/Phoenix will probably manage thousands of concurrent connections better, but if you are doing personal projects or are a start-up, that condition is irrelevant for the time being. So that only leaves us with the big established applications. That does not mean Phoenix is not good; it only means you will not notice a difference from Rails for most of your projects.
The second thing, which the OP failed to mention, is that the Rails ecosystem is at least an order of magnitude larger: the gems available, job opportunities, developers available, instructional material, SOPs for many tasks and so on. That is boring, but valuable.
Programming Phoenix is good. You might want to look at something Elixir specific too, I liked Programming Elixir although I've heard good things about Elixir in Action as well.
The official online documentation is pretty good too.
Boring tech is great if your primary concern is writing business logic. Ideally, this should always be true, but of course the reality is that people's jobs often have them working on products they don't really care for. And of course, it's easy to get distracted by tech anyway. Experience also comes into play when it comes to loving "boring" tech: it took me around a decade before I was sufficiently jaded that almost all I want to do is write actual business logic. I still love refactoring, though :)
> it took me around a decade before I was sufficiently jaded that almost all I want to do is write actual business logic
It's comforting to see I'm not alone. I just wanna ship clean code and contribute value to the business, while all the young bucks on my team are more interested in rewriting our REST APIs in GraphQL (with all kinds of rationalizations). It looks like the younger you are, the more eager you are to explore new tech, and the older you are, the keener you are on just delivering value with a boring old stack.
Ha, ya, my work recently mandated moving everything to GraphQL. My team started a greenfield project so there was no need to convert anything, though the learning curve with Apollo was high. GraphQL is nice, but as the only consumers of our API, I don't see it as a win over REST. At this point, though, we're proficient enough that it doesn't get in our way.
Having learned web dev around five years ago, my main stack has been Node/Express or Firebase (serverless) with React front-ends.
However, I got tired of the lack of conventions in Express, and Firebase has quite a few gotchas and limitations, so I'm moving to good old Django. I'm delighted so far; I love not having to reinvent the wheel every time I start a project or add a feature.
I'm probably keeping React/Next.js as my primary working tools for the front-end nonetheless, but that's just me because I feel highly productive with them and I've become accustomed to the decoupled client/API architecture.
I can see the merits of SSR with templating as seen in RoR or Python, and I'm happy to have that tool in my belt. But conversely I also think it's worthwhile for pretty much everyone to learn React (or Vue, Svelte, or what have you), because they do open up different possibilities in UX engineering and app architecture, and the market is quite hot for them too.
Is there a good way to turn your Rails app into a mobile app these days? I remember seeing something from Turbolinks showing a way to make hybrid mobile apps using Turbolinks.
It’s been a while. I tried to hack something like this together - without having done any Android dev before - and I understand why it’s taking a while. It’s not as simple as I’d imagined it would be.
What's the benefit of moving from Rails to Node and React, productivity-wise? On the backend side, Rails comes with an ORM, authentication, a templating system and much more out of the box, while on Node you'll need to mix and match a bunch of 3rd-party libs or roll your own, which might be a good thing depending on your priorities but certainly hurts productivity. On the frontend side, you can always use React with a Rails backend, but developing a traditional template-based UI is usually faster than developing a React app, and you can take advantage of Rails' templating/rendering features to speed up development.
Even in mid-sized projects there's usually one or two pages with sufficiently complicated front-ends that it's worth having something like React in your toolbox, though it might be Vue or Svelte.
You don't have to make the whole app a SPA to use these tools. You can just use it in those few pages that really need it.
Meh, put React into your build, maintain the thing, make the website download and parse React - all that just for one page? Better be a damn dynamic page. I wouldn't do it even if it's a big form with some dynamic fields.
It's very easy to load React only on the page that needs it. Use dynamic ES module import, or link the standalone builds only on the pages that use it.
Alternative like Preact is very small and can be easily (pre)cached on the client side.
Why bother? I work on a very UI-heavy interactive app (it's a Gantt chart) and have therefore been doing React daily for 1.5 years. React is probably making life a little easier here, and I don't hate it, but I don't love it either. If I didn't love my team so much I would be looking for a job working on a more classic web app in Rails. And when I profess my appreciation for Rails, I could just as easily be talking about Django, Phoenix, or any other _opinionated_ backend framework. [edited for spelling]
I know some Rails shops that are switching to Stimulus Reflex for some UI-heavy things, it works like Phoenix LiveViews, you can briefly see how CodeFund uses it for their interfaces here (40 seconds in): https://www.youtube.com/watch?v=F5hA79vKE_E&feature=youtu.be...
I'm aware of Stimulus Reflex though never tried it out! :) I've been playing with LiveView a lot recently which is fantastic. I would love to switch to stimulus but we're pretty heavily bought into React at this point.