Yeah, big time. First, it's faster, so crawlers give you better scores for page speed, which matters for ranking. Second, all of your content is rendered up front — if you dynamically load content instead, the crawler may just see a page with a "Loading" element and never view the content itself.
Google argues that its crawlers can handle JavaScript-heavy client-side code, but the data seems to show otherwise.
Perhaps the best approach is a mix: static or SSR content for the content-heavy stuff you want indexed, and SPAs for the truly dynamic experiences. This is easier said than done, but there's a good chance your marketing team is separated from "product" anyway. Marketing can continue to use WordPress or some other CMS with a static export or SSR, and product keeps the full app experience.
It's mentioned in other threads that SSR is more expensive as you scale — so you might as well make the "outside" layer of your site lightweight and static/SSR for fast client loading, then give users the full SPA once they've clicked through your landing pages.
Yes. There's a separate queue for sites that need JS rendering, and it eats much more into your crawl budget. The best way to avoid it, imo, is to use something like Rendertron, which is built and recommended by Google.
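For anyone curious, the core idea behind Rendertron-style "dynamic rendering" is just branching on the User-Agent: known crawlers get prerendered HTML, everyone else gets the normal SPA shell. A minimal sketch of that decision logic (the bot list and function names here are illustrative, not Rendertron's actual API):

```javascript
// Illustrative sketch of the dynamic-rendering pattern: detect known
// crawlers by User-Agent and serve them prerendered HTML instead of
// the SPA shell. The fragment list is a hypothetical, non-exhaustive sample.
const BOT_UA_FRAGMENTS = ['googlebot', 'bingbot', 'yandex', 'duckduckbot', 'baiduspider'];

function isRenderableBot(userAgent) {
  const ua = (userAgent || '').toLowerCase();
  return BOT_UA_FRAGMENTS.some(fragment => ua.includes(fragment));
}

// In an Express app you'd check this before serving the SPA shell and
// proxy bot requests to a headless-Chrome renderer like Rendertron.
function pickResponse(userAgent) {
  return isRenderableBot(userAgent) ? 'prerendered-html' : 'spa-shell';
}
```

In practice you'd wire this into middleware (Rendertron ships an Express middleware package that does exactly this kind of check and proxying for you), so the SPA codebase doesn't change at all.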