Sandboxes have significant performance costs of their own: the cost of spawning and communicating with a separate process, in addition to the extra memory use of a multiprocess architecture. They also come with high engineering overhead: creating a Chromium-like sandbox is a lot of work (which is why so few apps go to the trouble), and it's easy to mess up and accidentally introduce sandbox escapes. There's also the added attack surface of the kernel and the IPC layer itself. Finally, what seems at first to be a simple IPC interface between the image decoder and the host application becomes a lot more complex when hardware acceleration (GPU or otherwise) becomes involved. Securely providing a decoder access to hardware (say, to perform iDCT or colorspace conversion) is a nontrivial task.
This comment didn't deserve to be downvoted, as it's a legitimate argument. But when you add up all of these costs compared to just having a system that prevents memory safety issues in the first place, the Rust approach is rather appealing.
(Oh, by the way: Preventing image DoS isn't as hard as it sounds. Refusing to decode images larger in area than some limit gets you most of the way there.)
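To make that concrete, here's a minimal sketch of what such a check might look like in Rust. It reads the width and height from a PNG header before any decoding happens and rejects images over an area cap; the function name and the 64-megapixel limit are illustrative choices, not from any particular library.

```rust
// Illustrative area cap; real applications would pick a limit suited to
// their use case.
const MAX_PIXELS: u64 = 64 * 1024 * 1024;

/// Reads the width and height from a PNG's IHDR chunk and refuses to
/// proceed if the pixel area exceeds the cap, before any decode buffers
/// are allocated.
fn check_png_dimensions(data: &[u8]) -> Result<(u32, u32), &'static str> {
    // The 8-byte PNG signature and the IHDR chunk's 4-byte length and
    // 4-byte type precede the big-endian width and height fields, so the
    // dimensions live at bytes 16..24.
    if data.len() < 24 || &data[..8] != b"\x89PNG\r\n\x1a\n" {
        return Err("not a PNG");
    }
    let width = u32::from_be_bytes(data[16..20].try_into().unwrap());
    let height = u32::from_be_bytes(data[20..24].try_into().unwrap());
    // Multiply in u64 so the check itself can't overflow.
    if (width as u64) * (height as u64) > MAX_PIXELS {
        return Err("image area exceeds decode limit");
    }
    Ok((width, height))
}
```

This doesn't defeat every decompression-bomb trick on its own, but bounding the output area up front bounds the decoder's peak memory, which is the main DoS vector.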