For Tabulator the virtual DOM is essential. If you try to load 10,000 rows into a table, the browser will either slow down and become almost unusable or crash entirely.
For large data sets it keeps the table working regardless of the number of rows.
What if you just chunk your DOM writes into small batches, and then put them into a queue, which is then consumed within a requestAnimationFrame() handler? A bit of indirection, but not nearly as much as a complete virtual DOM. (I assume this would result in the table "slowly" populating, but the browser and the page should both remain responsive.)
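Something like this, roughly (a sketch only; renderRow is a placeholder for whatever builds a row element):

```javascript
// Batched DOM writes: append rows in small chunks, one chunk per
// animation frame, so the main thread is never blocked for long.
function appendRowsInBatches(tbody, rows, renderRow, batchSize = 200) {
  let index = 0;

  function flushBatch() {
    const fragment = document.createDocumentFragment();
    const end = Math.min(index + batchSize, rows.length);

    for (; index < end; index++) {
      fragment.appendChild(renderRow(rows[index])); // renderRow returns a <tr>
    }
    tbody.appendChild(fragment);

    if (index < rows.length) {
      requestAnimationFrame(flushBatch); // queue the next chunk
    }
  }

  requestAnimationFrame(flushBatch);
}
```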
You can do this, but for large data sets it would take ages and doesn't overcome the main issue.
Let's say we are working with 10,000 rows.
Yes, loading that many elements into the DOM is slow, and your approach would stop it blocking; however, what it doesn't do is stop the sheer number of elements from overloading the DOM.
Most browsers simply can't handle that many elements in the DOM. If you try to scroll a div containing that many elements, at best it will be sluggish and at worst it will simply crash the browser. On devices without much memory it is very likely that the whole DOM will freeze up and make the site unusable.
By adding and deleting elements as they become visible or hidden, it keeps the number of elements in the DOM to a minimum, while improving load time and adding only a bit of extra processing to the scroll event, which modern browsers can handle with ease. That gives the best all-round solution.
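In simplified form the general technique looks something like this, assuming fixed row heights (an illustration of the idea only, not Tabulator's actual code):

```javascript
// Only the rows intersecting the viewport are kept in the DOM, positioned
// inside a tall spacer so the scrollbar still reflects the full data set.
// container is assumed to be a fixed-height element with overflow: auto.
function createVirtualTable(container, rows, renderRow, rowHeight = 30) {
  const spacer = document.createElement("div");
  spacer.style.height = rows.length * rowHeight + "px";
  spacer.style.position = "relative";
  container.appendChild(spacer);

  function render() {
    const first = Math.floor(container.scrollTop / rowHeight);
    const visible = Math.ceil(container.clientHeight / rowHeight) + 1;

    spacer.textContent = ""; // drop the previously rendered rows
    for (let i = first; i < Math.min(first + visible, rows.length); i++) {
      const el = renderRow(rows[i]); // renderRow returns an element
      el.style.position = "absolute";
      el.style.top = i * rowHeight + "px";
      spacer.appendChild(el);
    }
  }

  container.addEventListener("scroll", render);
  render();
}
```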
Ah, I didn't realize that "culling" of non-viewport-visible divs happened within your vDOM implementation. I agree that that would need to happen either way. If the "culling" process was a separate layer, though, it could also be implemented on top of the DOM-queuing process I mentioned.
I did this myself when writing a chat client's infinite history scrollback. Chat messages that aren't on the screen still have a pure-data "event" representation of them loaded, but their DOM representation (virtual or otherwise) is entirely discarded. Rather than the physical DOM nodes for chat-message divs having fixed positions such that they're able to disappear without affecting their peers' layout, I instead just have the chat messages stacked as regular block elements under reflow, with two fixed-size "scroll pad" divs just above and below the viewport "consuming" and "releasing" the vertical height that had been taken up by message divs as they're added and removed. You can have millions of rows "loaded" (in the sense of parsed data sitting in JS memory) for instant access, with not much browser memory usage at all.
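The pad bookkeeping is roughly this shape (a simplified sketch; topPad is the div sitting above the viewport):

```javascript
// When a message div is culled above the viewport, its height is absorbed
// into the top pad so the remaining messages and the scrollbar don't move.
function cullAbove(topPad, messageEl) {
  const height = messageEl.offsetHeight;
  topPad.style.height = parseFloat(topPad.style.height || "0") + height + "px";
  messageEl.remove();
}

// Reinserting a message releases the same height back out of the pad.
function restoreAbove(topPad, messageEl) {
  topPad.after(messageEl); // reinsert just below the pad
  const height = messageEl.offsetHeight;
  topPad.style.height =
    Math.max(0, parseFloat(topPad.style.height || "0") - height) + "px";
}
```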
Though, even that memory usage was too much (this chat client is supposed to run beside memory-intensive apps like games), so in my setup, nodes that are far enough off the screen can also have their parsed "event" representation discarded and replaced with a single URL representing the serialized "history chunk" those events came from. When the scroll position comes close to them, the chunks are reloaded by re-requesting and re-parsing the chunk (which, helpfully, is usually still in the browser cache).
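The eviction side is roughly this (chunk.url and parseChunk are placeholders for however your history chunks are fetched and parsed):

```javascript
// Events far off-screen are replaced by the URL of the serialized history
// chunk they came from, then re-fetched (usually straight from the browser
// cache) when the user scrolls back near them.
function evictChunk(chunk) {
  chunk.events = null;  // drop the parsed event objects
  chunk.evicted = true; // keep only chunk.url
}

async function ensureChunkLoaded(chunk, parseChunk) {
  if (!chunk.evicted) return chunk.events;
  const response = await fetch(chunk.url); // usually served from cache
  chunk.events = parseChunk(await response.text());
  chunk.evicted = false;
  return chunk.events;
}
```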