JavaScript Functional Immutable Update Performance

What are the performance implications of “immutable functional updates”?

Assumed audience: Programmers, particularly front-end JavaScript programmers who’ve encountered Redux and are curious about functional programming and performance. Note that I’m not trying to persuade anyone about functional programming in this post—just answering a question about it!

In a chat group I’m a part of online, someone asked basically this question (I’ve tightened it up a bit for the purposes of this post):

I’m a complete newbie to JavaScript and React, and something got my attention while reading a React book: they advocate a functional programming style, which means things should be immutable. In a Redux reducer, an “add” action updates the state by generating a new state like this:

return [ ...state, newElement ]

That’s okay in Haskell, Lisp, or Erlang, because those languages use linked lists, but in JavaScript I would guess this will be O(n),¹ right? That seems like overkill; why not just do this instead?

state.push(newElement); return state

You’re correct; however, in many cases it doesn’t matter. If you have 30 elements in your array (or honestly even 300), that’s fine performance-wise. Creating a new array by iterating over the old one isn’t as fast as just inserting a new element, but it’s also rarely the bottleneck, especially since the spread operator and the .concat() method only make shallow copies of the data. When your arrays get large, it does matter, of course.
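To make that concrete, here’s a sketch of what the immutable update looks like in a Redux-style reducer. The itemsReducer name and the ADD_ITEM action type are invented for this example:

// A Redux-style reducer sketch; the names here are hypothetical.
function itemsReducer(state = [], action) {
  switch (action.type) {
    case 'ADD_ITEM':
      // Shallow copy: the array itself is new, but the existing
      // elements are shared with the old array, not cloned.
      return [...state, action.item];
    default:
      return state;
  }
}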

Also worth noting: it’s not specifically the use of linked lists that makes this safe in other contexts; it’s the use of linked lists as one means of implementing persistent data structures. Elm and others also have arrays! They just have slightly different implementations and performance characteristics.

As for why you wouldn’t want to just use push: because doing so turns into “spooky action at a distance” pretty easily unless your whole team is exceedingly disciplined. I’ve spent non-trivial parts of the last two weeks looking at bugs (in the Ember app I work on) caused by people using push instead of functional-style updates, so this particular pain is very fresh for me.
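Here’s a minimal sketch of how that bites you. Anything that compares references to decide whether state changed (which is roughly how React-Redux and similar libraries detect updates) never sees a new array when you push; the variable names are made up for the example:

const previousState = ['a', 'b'];

// Mutating in place: same reference, so a simple === check sees no
// change, and previousState has been silently altered as well.
const afterPush = previousState;
afterPush.push('c');
console.log(afterPush === previousState); // true

// Copying instead: a new array, so reference-based change detection notices.
const afterSpread = [...previousState, 'd'];
console.log(afterSpread === previousState); // false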

You can also do return state.concat(newElement), which has the same semantics as the spread-operator version.
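A quick illustration of that equivalence; both produce a new array and leave the original untouched:

const state = ['a', 'b'];

const viaSpread = [...state, 'c'];
const viaConcat = state.concat('c');

console.log(viaSpread); // ['a', 'b', 'c']
console.log(viaConcat); // ['a', 'b', 'c']
console.log(state);     // ['a', 'b'], the original is unchanged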

It’s basically just a workaround for the fact that this isn’t how JS natively behaves: JS assumes mutation is the default at the language level.


Notes

  1. If that O(n) notation is unfamiliar to you, don’t worry: it’s not as complicated as it might seem. This is an example of Big-O Notation, which is just a convenient shorthand for the basic performance characteristics of an operation: O” for the order,” or growth rate, of the function. The thing inside the parentheses describes the relationship between the number of items n and that growth rate. O(n) means the growth rate is linear”: it grows with the number of items involved. If it were O(n2) it would grow like the number of items involved squared; if it were O(1) it would be constant no matter how many items were involved. ↩︎