Casey hasn't a clue. He's only written games, and he hasn't actually had to solve the issues that are being solved in other domains. If he did have that experience, he'd see his ideas don't scale. Games are, in general, vastly different from other apps (word processors, video editors, browsers, etc.) in that, for the most part, they get to choose all of their data up front. If you're making "The Last of Us 2", there is no user data. There's no "some people will use this to write a letter to grandma, yet some other people will write an 800-page book on physics with mathematical diagrams, and yet another will write a report on the market with linked live data."
Consider "games" like Minecraft, Roblox, or Dreams: those are entirely user-data-driven. Different types of games are at least as different from each other as they are from other types of applications. Put another way, they are no less diverse than other software; they just usually have a stronger focus on performance.
How are they different? You have some primitives, like a block, and meaning is given to a group of them by the users. The program only has to care about the primitives.
(Of course making it performant, not rendering/“simulating” everything and the like is exceedingly hard, but it is true that it is more of a “closed world” as opposed to some other areas of software development.)
This type of "creative game" lets users combine basic building blocks, in the same way a word processor allows users to write entire books by combining a limited set of characters. The limitations of strictly linear games like The Last of Us are not due to technological restrictions, but because it's hard to tell a cinematic story while still giving the user complete freedom.
Minecraft was created in Java with duct tape, hacks, a lot of ugliness, and perf characteristics that wouldn't fly past a pedantic programmer like Casey.
Is your argument that arenas don't scale because user-provided data is variable in size?
Although arena memory is casually described as "allocate one huge chunk of memory up front," you are not literally only allocating one block ever and praying it never runs out. If you run out, you allocate another block. The point is that you don't call malloc for every string, object, list, etc. Adhering to this largely eliminates the need for RAII. What about this doesn't scale?
Personal anecdote: I'm building an IDE, where literally all of my data is provided by the user, and arenas have worked perfectly. I don't think I have a single destructor except for dealing with things like file descriptors.