A Replacement for Strong Parameters (ryanbigg.com)
37 points by todsacerdoti on Nov 13, 2022 | 29 comments



One of my devs recently pitched an idea to me that I can't stop thinking about.

Strong Params requires you to enumerate all params on the server. But why couldn't you know what the valid params are while building your form? You could serialize the params shape in the form builder and include it in your POST payload, signed with the app secret to avoid tampering.
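
A rough sketch of the render-time half of that, assuming a made-up helper rather than anything Rails ships with:

    require "openssl"
    require "json"

    # Hypothetical helper, not an existing Rails API: sign the list of
    # permitted field names at form-render time so the server can later
    # check that the client didn't add fields to it.
    def sign_permitted_fields(fields, secret)
      payload   = JSON.generate(fields.sort)
      signature = OpenSSL::HMAC.hexdigest("SHA256", secret, payload)
      { fields: payload, signature: signature }
    end

    # In a Rails app the secret would be Rails.application.secret_key_base;
    # both values would be rendered into the form as hidden inputs.
    sign_permitted_fields(%w[name age], "app-secret")
    # => {:fields=>"[\"age\",\"name\"]", :signature=>"..."}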

It should have done this from the beginning!


There is a Ruby form library named Forme that works this way for Roda and Sequel (https://forme.jeremyevans.net/files/README_rdoc.html#label-R...). As you'd expect, this makes handling normal HTML form submissions much easier.


Just want to say that Roda and Sequel deserve better marketing; they need a dhh, because they are great!


This wouldn’t work for APIs or controllers that you make requests to via JS, but for the default use case of forms it’s an _incredible_ idea.

It probably wouldn't be too hard to do as a gem: you inject a hidden field when form_with is called, and you define a new method on params so the controller can indicate that this approach is being used.
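
Something like this, very roughly (all names here are made up, not an existing gem API):

    # The form helper would render _permitted_fields and
    # _permitted_fields_signature as hidden inputs; the controller opts in
    # with params.permit_signed instead of params.permit(...).
    module SignedParams
      def permit_signed
        raw       = self[:_permitted_fields].to_s
        signature = self[:_permitted_fields_signature].to_s
        expected  = OpenSSL::HMAC.hexdigest(
          "SHA256", Rails.application.secret_key_base, raw
        )
        unless ActiveSupport::SecurityUtils.secure_compare(signature, expected)
          raise ActionController::BadRequest, "tampered field list"
        end

        permit(*JSON.parse(raw))
      end
    end

    ActionController::Parameters.include(SignedParams)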


You're talking about pure validation, though. dry-schema and the like are more like in-memory ETL libraries for form data: they don't just describe a valid parameters shape, but also what — and how — to extract from those parameters, and an (implied) data-structure shape + typings to put the results into.

I can see how you could generate the Extract step of an ETL process (i.e. a validation specification) purely from the form HTML itself. But I'm not sure where you'd stuff the information required to do the Transform or Load steps. It isn't cleanly separable into per-component data attributes.
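
For comparison, a typical dry-schema definition covers both the valid shape and the coercions (a small example along the lines of the dry-rb docs):

    require "dry/schema"

    # Describes which keys are valid AND how to coerce them into a typed
    # result, which is the part that can't be derived from the form HTML alone.
    UserSchema = Dry::Schema.Params do
      required(:name).filled(:string)
      required(:age).filled(:integer)   # "34" is coerced to 34
      optional(:newsletter).value(:bool)
    end

    result = UserSchema.call("name" => "Ryan", "age" => "34")
    result.success? # => true
    result.to_h     # => { name: "Ryan", age: 34 }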


I'm not sure that I understood your suggestion. What if the client is curl -F name=Ryan -F age=34? How would I sign that request, and why?


It would only work for routes that only accept Rails form_for input, and it would reject requests without the signature.


Because a well-crafted request could write whatever it wanted to your DB, as you've given the client too much control.

If I know, for instance, an admin is identified by an `is_admin` flag, I could add this to the allowed params and send my flag. Bam. I’m now an admin.

I know the example is a bit simplistic, but it wouldn’t take long to comb your app for a real world example where this would be problematic.


That’s the point of the signature the above poster suggested - if you use the app secret to generate a signature of the valid parameters, then the client can’t edit this list without invalidating the signature and there is no risk of them modifying other fields. If your secret is compromised such that this becomes possible, you have way bigger problems.

I’m not super keen myself as I prefer the implementation to be much more explicit than magic, but there doesn’t seem to be an obvious security hole here.


Precisely.

Strong Params is a canonical violation of DRY. When you build your form, you fill it with fields that are allowed to be filled out and submitted. Then you have to duplicate that work in the controller for no obvious benefit.
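
Concretely, in a stock Rails app the same field list ends up written twice:

    # app/views/users/_form.html.erb -- the fields are enumerated once here:
    #   <%= form_with model: @user do |f| %>
    #     <%= f.text_field :name %>
    #     <%= f.number_field :age %>
    #   <% end %>

    # app/controllers/users_controller.rb -- and then again here:
    def user_params
      params.require(:user).permit(:name, :age)
    end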

As for "more explicit than magic" I prefer to think of this as "automatic" or "conventional" rather than magic. Much of Rails' "magic" is this exact kind of thing, allowing the right thing to automatically happen with the ability to step in or override manually when the convention isn't what you want.

I get why you'd want it to be more explicit. I think that this feature was invented to solve a security problem and the solution was to give devs more work to do. There's no obvious reason why it couldn't be automatically taken care of for the dev and it would definitely make maintenance and legacy app upgrades smoother.


These attacks work by modifying values, not adding parameters. Sounds like they would still work.

Besides, you still have to validate the signature server-side, so it's not like it's saving any work. Validation just gets split up between generating the signature, a network round-trip, and validating the signature.


> so it's not like it's saving any work

The "work" the GP is trying to save here isn't CPU cycles, but rather developer labor — the redundant labor of writing both a form view that describes form inputs, and a form-value schema validator to be called from the controller that receives the form's submitted input values.

Generating the signature, and validating the signature, would both be done transparently by middleware components of the framework, with no marginal developer labor required per form.


Django forms already do that, and because of product requirements around design, everyone I know who has used it has found it more of a pain than just doing the forms directly. You're just going to be in the HTML/CSS anyway to make it look right. Only for internal pages, where we didn't have to bother with UI design, was it worthwhile.


No idea why you're getting downvoted. It seems obvious to me too that you still have to validate server-side, so signing the form validation schema on the frontend is a waste of time.


I agree the syntax for strong_params is confusing and error-prone, but I wish the author had drawn more of a distinction between the schema of the request payload and the schema of the model. Rails assumes the model should handle its own type validations, and strong_params is sort of a bandaid to ensure the params object can’t affect internal implementation details. dry_schema looks way more powerful, but it makes it easier to forget that the primary goal is not a generalized schema, but a security boundary ensuring dangerous attributes aren’t set on the model.
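
To make the two layers concrete (ordinary Rails here, nothing exotic):

    class User < ApplicationRecord
      # Domain/type rules live on the model.
      validates :age, numericality: { only_integer: true, greater_than: 0 }
    end

    class UsersController < ApplicationController
      def create
        # Security boundary: is_admin can never arrive via the request,
        # regardless of what the model would accept.
        @user = User.new(params.require(:user).permit(:name, :age))
        # Validity of the permitted values is the model's job.
        if @user.save
          redirect_to @user
        else
          render :new, status: :unprocessable_entity
        end
      end
    end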

I think the pattern is helpful but it’s still pretty easy to mess up.


We already do something similar. Strong Parameters is a terrible API, since it splits the schema and the "authorized parameters", which are really the same thing.

Aside from that, I don't love the dry libs either; they are often very complex for their use cases, and they all have DSLs that would have been better served by plain Ruby, data structures, and objects.


To me, some of the 'dry' libraries are useful, but some feel a bit like Haskell-envy. They provide some value, but also some overhead and cognitive load, and in some cases just feel a bit odd.


I recently had a new hire try to add dry-monads to our monolith. I had to explain to him that, while I personally don't mind it, having to explain monads and having people who don't necessarily understand monads maintain the code is not a good idea, and that there are perfectly idiomatic Ruby constructs any developer could maintain that accomplish the same thing.
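
For what it's worth, one plain-Ruby stand-in for a Success/Failure pipeline might look like this (just a sketch, not what we actually shipped):

    # A small result struct plus early returns instead of monadic bind.
    Result = Struct.new(:ok, :value, :error, keyword_init: true)

    def parse_age(raw)
      age = Integer(raw, exception: false)
      return Result.new(ok: false, error: "age must be a number") unless age

      Result.new(ok: true, value: age)
    end

    result = parse_age("34")
    result.ok    # => true
    result.value # => 34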

He doesn’t work for us anymore (not because of this)


I think strong_params is a pretty weak way to do parameter parsing, and I share the author's frustration. I'm out of Rails-land now, using OpenAPI docs and generated middleware to handle all of that for me. We're able to give the docs to customers and to generate code for API clients.

dry_schema looks nice and a certain improvement over strong params, but I'd go a step further.


If you disagree with just about every Rails convention, just don’t use Rails. There are other options in the world.

Ryan is a smart guy, but it seems like he’d be a lot happier with something that isn’t Rails.


Rails is what pays the bills :) Just because I don't like _parts_ of Rails doesn't mean I will completely abandon it. This occurs in other parts of my life too. I find it really inefficient to use allen keys to tighten hex screws, so I recently bought a set of hex heads. Now I can use them with my screwdriver. Doesn't mean that I've sworn off allen keys altogether though, because sometimes they come in handy in particular circumstances.

I just use the right tool for the job, and in this place I do not think strong parameters is the right tool for the job.


To each their own. It was dickish of me to say, "it seems like he’d be a lot happier with something that isn’t Rails." For that, I'm sorry.

The dry-rb approach isn't for me. That doesn't mean it's not going to resonate with a lot of others. (Clearly, it does)

To me, the Dry Schema approach feels unnecessarily heavy-handed and duplicates work that I would expect to be done in the model. Strong Parameters helps a bit with form data safety, but I'm not looking for anything else out of it. I'm expecting my models to validate and coerce those strings into something useful, and raise errors if the data is invalid. Dry Schema feels like an unnecessary layer in the middle. ¯\_(ツ)_/¯


Glad to see you here! You helped me a lot 10 (more? Lost track) years ago on IRC, when I was getting started with Rails, as well as through "Multitenancy with Rails".

Indeed, Rails pays the bills.

And pulling out companies from the trouble they are in due to common Rails misuse is still very rewarding.

Haven't yet figured out how to help companies deep into callback hell though. Might be non-solvable.


My personal opinion is that malformed data shouldn't be reaching domain code.

I'd want a unique, strongly typed schema for each route, one that can be turned into a specialized streaming JSON parser.

Because Rails is so heavily dynamic, it descends into lawlessness with no regard for data integrity, which eventually led me to exit Ruby and Rails altogether.

With Ruby (and Rails) every single parameter is suspect and cannot be trusted. Stripe's type checker (Sorbet) does alleviate some of this stress and anxiety.


> My personal opinion is that malformed data shouldn't be reaching domain code.

The mistake a lot of people make: Rails itself isn't domain code.

In hexagonal-architecture terms, Rails is a framework for building your "web gateway", not for building your domain logic. Your domain logic should live on its own, outside Rails, as plain-old code with a well-defined API (consisting of functions and domain types.)

Rails' job — like any other hexagonal-architecture gateway's job — is to translate incoming requests into command or query objects to pass to the domain-logic API, and to translate the domain-logic API's generated events or query-result objects back out. The former is exactly an MVC controller; the latter is exactly an MVC view.

Think of it like building a (portable, multiplatform) game: your business-logic is like the game engine, assets, scripting, etc. Rails, meanwhile, is like the OS-specific glue code that makes your game run in a window and respond to mouse/keyboard input on Windows/macOS/Linux etc. This glue code takes events from each OS and brings them into your game; takes output from your game and feeds it back to OS drivers; and binds OS mechanisms like window controls to game features. You don't want any "game" logic to live inside your OS-specific glue code. You want your glue code to just be a "wrapper" that brings in your game as, essentially, a library, and then "wires it up" to the given OS.

In this view, both Strong Parameters and dry-schema are tools used "at parse time" — i.e. controller-execution time within the web gateway — to create the "strongly typed" "unique schema" for the route, that allows it to discard/reject data before putting it into the real strongly-typed objects: the business-domain objects used by your business-logic DDD context.

(And the reason that this gluing-together is done through an arbitrarily-programmable framework, rather than by just giving routes static strong types, is because the translation process itself — especially when compatibility with multiple legacy systems is involved — can be "Turing-hard", requiring a full programming language to specify what should be glued to what, and when it's valid to translate X to Y vs X to Z. Maybe some users are talking to v1.1 of an endpoint while some users are talking to v1.0, based on a per-user feature-flag or holdback-flag or whether the user is a paying customer or not.)
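
A rough illustration of that split (every name below is made up; the point is only where the boundary sits):

    # The domain module knows nothing about Rails.
    module Billing
      CreateInvoice = Struct.new(:customer_id, :amount_cents, keyword_init: true)

      def self.create_invoice(command)
        # Pure domain logic; returns a plain domain value.
        { customer_id: command.customer_id,
          amount_cents: command.amount_cents,
          status: "draft" }
      end
    end

    # The controller is just the gateway: parse, translate, hand off.
    class InvoicesController < ApplicationController
      def create
        attrs   = params.require(:invoice).permit(:customer_id, :amount_cents)
        command = Billing::CreateInvoice.new(
          customer_id:  attrs[:customer_id],
          amount_cents: attrs[:amount_cents].to_i
        )
        render json: Billing.create_invoice(command)
      end
    end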


This is the main reason why I use GraphQL on my Ruby web projects. The client flexibility is nice. Performance issues from wonky queries sent by clients can be a pain to debug; thankfully it's an internal API, so I can work directly with the client devs to create optimal queries. But the strong type guarantees and schema validation are the main reason why I prefer GraphQL over Rails controllers with strong parameters.


Do you find the type definition work to be better/easier than using model validations? I’ve found that I prefer that logic in the model instead of type def files, but hey, to each their own, of course.


I use a mix of both. GraphQL validates the schema and types of the input; model validations ensure that the actual values being saved to the database make sense from a domain perspective.
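
With graphql-ruby the split looks roughly like this (class names are illustrative, not from my actual schema):

    class Types::CreateUserInput < Types::BaseInputObject
      # Shape and types are enforced before the resolver ever runs.
      argument :name, String, required: true
      argument :age, Integer, required: true
    end

    class Mutations::CreateUser < Mutations::BaseMutation
      argument :input, Types::CreateUserInput, required: true
      field :user, Types::UserType, null: true
      field :errors, [String], null: false

      def resolve(input:)
        user = User.new(input.to_h)  # types already guaranteed by GraphQL
        if user.save                 # domain rules enforced by model validations
          { user: user, errors: [] }
        else
          { user: nil, errors: user.errors.full_messages }
        end
      end
    end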


As bad as strong parameters is, it was still, at the time, a decent improvement over its predecessor, attr_accessible.
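
For anyone who never used it: attr_accessible put one global whitelist on the model, whereas strong parameters moved the decision into the controller so different endpoints can permit different attributes:

    # Rails 3 (later the protected_attributes gem): one list for every caller.
    class User < ActiveRecord::Base
      attr_accessible :name, :age
    end

    # Rails 4+: the controller decides per endpoint.
    def user_params
      params.require(:user).permit(:name, :age)
    end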



