SSR (server-side rendering) is the traditional method. All of the page’s resources live on the server, which does the rendering work and returns finished HTML when the page is requested.
CSR (client-side rendering) is the newer method. A JS framework shifts the rendering load to the client side, i.e., the browser. It involves more requests to the server, as opposed to SSR, where everything is delivered in one go.
As a result, SSR may be faster, but that depends on connection speed, server load, page optimization and so on. CSR, on the other hand, might be a touch slower initially, but once the resources are downloaded, subsequent rendering happens faster and is not affected by server load.
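The difference can be sketched in a few lines. This is a minimal illustration, not tied to any framework; the data and the `renderServerSide`/`renderClientShell` helpers are hypothetical:

```javascript
const product = { name: "Blue Widget", price: "$9.99" };

// SSR: the server builds the complete HTML, so the very first response
// already contains the content — users and crawlers see it immediately.
function renderServerSide(item) {
  return `<html><body><h1>${item.name}</h1><p>${item.price}</p></body></html>`;
}

// CSR: the server sends an empty shell; the browser must download and
// execute app.js and fetch the data before any content appears.
function renderClientShell() {
  return `<html><body><div id="app"></div><script src="/app.js"></script></body></html>`;
}

console.log(renderServerSide(product).includes("Blue Widget")); // true
console.log(renderClientShell().includes("Blue Widget"));       // false
```

The shell in the CSR case is why initial loads feel slower: the content only exists after the extra round trips for the script and the data.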
Search engines, mainly Google and Bing, can crawl and render JS. So it’s tempting to start SEO-ing everything via JS. But making your JS content well-structured, organized and properly rendered shouldn’t be done only for the search engine bots; it should be the very basis of a JS site.
It depends on your goals. If you have a JS-heavy site, then yes, it’s a no-brainer to make sure it’s well optimized. But if JS doesn’t play a big role (or any role) on your site, there’s no real reason to start using it more in the mere hope that it will improve your rankings.
A few JavaScript SEO tips:
- Make sure the content is rendered and indexable by the time the load event fires
- Don’t expect content that depends on user events or actions (clicks, scrolls) to be indexed; crawlers generally don’t interact with the page
- The best HTML SEO practices apply to JS, too
- Every site, including a JS site, requires unique, indexable URLs
- Don’t forget the metadata: titles, meta descriptions and ALT attributes for images still count
- Make sure there are no pushState errors as they can result in duplicate content
- The href and src attributes still matter a lot
- Don’t forget robots.txt: make sure bots are allowed to crawl and that your JS files are not blocked
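To make the robots.txt tip concrete, here is a minimal sketch (the asset path is illustrative, adjust it to your own build output):

```
User-agent: *
Allow: /

# Don't block the directories that hold your JS (and CSS) assets,
# or crawlers can't render the page the way users see it.
# A common mistake to avoid would be:
# Disallow: /assets/js/
```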
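On the pushState point: duplicate content typically appears when the same view is reachable under several URL variants (trailing slash, mixed case) and pushState records them all. One way to avoid that is to normalize before pushing. The `normalizePath` helper below is a hypothetical sketch, not a standard API:

```javascript
// Hypothetical helper: collapse URL variants (uppercase letters, trailing
// slashes) into one canonical form, so pushState never records duplicates.
function normalizePath(path) {
  let p = path.toLowerCase();
  // Strip a trailing slash, but keep the root "/" itself.
  if (p.length > 1 && p.endsWith("/")) p = p.slice(0, -1);
  return p;
}

// In the browser you would then navigate with the canonical form only:
// history.pushState(null, "", normalizePath(location.pathname));

console.log(normalizePath("/Products/Widgets/")); // "/products/widgets"
```

Pairing this with a `rel="canonical"` tag on the rendered page covers the variants that still slip through.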