
React's default behavior is that when a component re-renders, _everything_ underneath it in the component tree re-renders as well. If you want a component (and its descendants) to skip re-rendering, you can implement the `shouldComponentUpdate` method. You can put any logic you want into `sCU`, but the most common implementation is to compare the contents of `this.props` and `nextProps` to see if anything actually meaningfully changed.
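To make that concrete, here's a minimal sketch of a shallow-comparison `shouldComponentUpdate`. (The component and prop names are made up, and `Component` below is a bare stand-in for `React.Component` so the snippet runs on its own without React installed.)

```javascript
// Bare stand-in for React.Component, just so this runs standalone.
class Component {
  constructor(props) {
    this.props = props;
  }
}

class UserCard extends Component {
  shouldComponentUpdate(nextProps) {
    const prevKeys = Object.keys(this.props);
    const nextKeys = Object.keys(nextProps);
    if (prevKeys.length !== nextKeys.length) return true;
    // Re-render only if some top-level prop changed by reference
    return prevKeys.some((key) => this.props[key] !== nextProps[key]);
  }
}

const user = { name: "Ada" };
const card = new UserCard({ user, role: "admin" });

console.log(card.shouldComponentUpdate({ user, role: "admin" }));
// false: every field is the same reference, so skip the re-render

console.log(card.shouldComponentUpdate({ user: { name: "Ada" }, role: "admin" }));
// true: `user` is a new object, even though its contents look identical
```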

You _can_ do a "deep equality" comparison that recurses through every nested field in both `this.props` and `nextProps`, but that's relatively expensive. The alternative is "shallow equality", which uses pointer/reference equality checks for each field in both objects. However, in order for that to be useful, you need to manage your data in an immutable fashion so that each update results in a new object/array reference, rather than directly modifying the existing objects.
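Here's a small sketch of what a shallow equality check looks like (my own helper name, but similar in spirit to the ones React uses), and why it only works if updates are immutable:

```javascript
// One level of reference checks, no recursion into nested fields.
function shallowEqual(a, b) {
  const aKeys = Object.keys(a);
  const bKeys = Object.keys(b);
  if (aKeys.length !== bKeys.length) return false;
  return aKeys.every((key) => a[key] === b[key]);
}

const props = { items: [1, 2] };

// Mutating in place keeps the same array reference, so a shallow
// check can't see the change -- this is why mutation breaks things:
const mutated = { items: props.items };
mutated.items.push(3);
console.log(shallowEqual(props, mutated)); // true -- the change is invisible

// An immutable update produces a new reference, which the check catches:
const updated = { items: [...props.items, 4] };
console.log(shallowEqual(props, updated)); // false -- change detected
```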

So, you don't _have_ to manage data immutably in React, but doing so enables performance optimizations, and also goes along with React's functional programming influences.




For anyone interested, there's a great conference talk on this topic by Lee Byron from Facebook: https://www.youtube.com/watch?v=I7IdS-PbEgI


Gotcha - so you see it mostly as just a perf optimization based around SCU?

Funny timing. I was actually implementing SCU using deep obj equality for the first time earlier today. I did understand the desire for immutability, but it didn't seem to be a game changer.


Perf optimization is a big benefit, yes, but it also fits with functional programming principles in general.

React's `setState()` definitely doesn't care if you mutate or not - you can `.push()` right into an existing array in state, and re-set it into state to queue the re-render.
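A toy illustration of that point (this is a made-up stand-in class, not React, just to show that `setState` itself queues a re-render regardless of whether you mutated first):

```javascript
// Toy stand-in for a component -- NOT real React internals.
class ToyComponent {
  constructor(state) {
    this.state = state;
    this.renderCount = 0;
  }
  setState(partial) {
    this.state = { ...this.state, ...partial };
    this.renderCount += 1; // re-render is queued unconditionally
  }
}

const c = new ToyComponent({ items: [1, 2] });
c.state.items.push(3); // direct mutation -- nothing stops you
c.setState({ items: c.state.items }); // re-setting still queues the render

console.log(c.state.items);  // [1, 2, 3]
console.log(c.renderCount);  // 1
```

It "works", but the mutated array keeps its old reference, so any shallow-equality optimization downstream (like `PureComponent`) would skip that update.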

On the Redux side of things, immutability is important for several reasons. First, pure reducer functions are more easily testable. Second, they enable time-travel debugging - without immutability, jumping back and forth in state would cause the state contents to behave unpredictably, breaking time-travel. Third, the React-Redux `connect` function relies on immutability checks against the root of the state tree to see if it _thinks_ anything has changed, and against the return values of your `mapState` functions as well. If you mutate Redux state, your connected components usually won't re-render properly, because they think nothing has changed.
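A quick sketch of what an immutable Redux-style reducer looks like (the "todos" shape and action types here are invented for illustration). Because it never mutates its input, every prior state object stays intact for time travel, and `connect`'s reference checks can see each change:

```javascript
function todosReducer(state = [], action) {
  switch (action.type) {
    case "ADD_TODO":
      // New array reference instead of state.push(...)
      return [...state, { text: action.text, done: false }];
    case "TOGGLE_TODO":
      // Copy only along the changed path; untouched todos keep their refs
      return state.map((todo, i) =>
        i === action.index ? { ...todo, done: !todo.done } : todo
      );
    default:
      return state;
  }
}

const s0 = todosReducer(undefined, { type: "@@INIT" });
const s1 = todosReducer(s0, { type: "ADD_TODO", text: "write sCU" });
const s2 = todosReducer(s1, { type: "TOGGLE_TODO", index: 0 });

console.log(s1 !== s2);   // true: each update yields a new root reference
console.log(s1[0].done);  // false: the old state object is untouched
console.log(s2[0].done);  // true
```

Being a pure function is also what makes it trivially testable, as above: feed in a state and an action, assert on the return value, no mocking required.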


Just an FYI: if you're only doing a shallow comparison, you can use React.PureComponent, which implements a shallow-comparison sCU for you.


You're comparing references to infer whether the underlying data changed? Sounds really fragile... Does the JavaScript language guarantee this, or are you depending on a specific engine implementation? For instance, will this still hold if the OS preempts the browser and restores it later?


That's an absolutely standard practice in JavaScript, and especially in the React world.

The assumption is that if two objects are different references, then their contents are also different. It's possible that you could have two different objects whose fields point to the exact same contents, but given a typical application setup that's unlikely. So, a simple `a === b` (or if you're looking at all the contents, `first.a === second.a && first.b === second.b && .....`) is just some quick pointer comparisons in the JS engine. So, at the JS level you don't care what the pointer address actually _is_, just whether the two variables point to different things in memory.
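And to answer the spec question: `===` on two objects is defined by the ECMAScript language itself to mean "same object identity", never "same contents", so this doesn't depend on any particular engine, and identity survives whatever the OS or garbage collector does with the process:

```javascript
const a = { x: 1 };
const b = a;          // second variable, same object
const c = { x: 1 };   // different object, identical contents

console.log(a === b); // true -- one object, two names
console.log(a === c); // false -- contents match, identity doesn't

// Mutation doesn't change identity, which is exactly why shallow
// checks only work when updates create new objects:
a.x = 2;
console.log(a === b); // still true
console.log(b.x);     // 2 -- both names see the mutation
```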


> their contents are also different

More precisely: may be different.


As I said, it's an "assumption" :)




