Render templates on the server
Of course, there is one major drawback: you basically have to code your entire website twice. Depending on your framework, you may be able to reuse your templates and some code. At first it seems like it won’t be too bad, but reusing templates and model classes is trickier than it looks.
For every difficulty there is a fix, but soon you are dying from the stings of a thousand little differences. Any mismatch between the server-rendered and client-rendered versions can confuse users and makes every bug twice as hard to track down.
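One way to limit the duplication is to share a logic-less template between the server and the client, so both sides produce the same markup from the same data. A minimal Python sketch, using `string.Template` as a stand-in for a shared template (the template and field names here are illustrative, not from any particular framework):

```python
from string import Template

# A logic-less template that could, in principle, be shared with the
# client-side renderer. The markup and fields are hypothetical examples.
PRODUCT_TEMPLATE = Template("<h1>$name</h1><p class='price'>$price</p>")

def render_product(product: dict) -> str:
    """Render the same markup the client-side code would produce."""
    return PRODUCT_TEMPLATE.substitute(
        name=product["name"], price=product["price"]
    )

print(render_product({"name": "Widget", "price": "9.99"}))
```

Even with a shared template, the surrounding model and routing code still has to exist in both places, which is exactly where the thousand little differences creep in.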
Render noscript tags on the server
The nice thing about this approach is that you don’t have to worry about making your server-generated HTML identical to your client-generated HTML. You do have to keep the content the same, or else you have taken a wrong turn into the black hat art of cloaking and might find yourself banned from Google. However, you can safely ignore the UI chrome, auxiliary links, stats, and sidebars that are nice for humans but of no use to search engines.
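The shape of such a page is simple: an empty mount point for the JavaScript app, plus a `noscript` block holding just the crawlable content. A minimal sketch, with a hypothetical helper function (the structure is illustrative, not tied to any framework):

```python
def page_with_noscript(title: str, content_html: str) -> str:
    """Wrap the crawlable content in a <noscript> tag. The JavaScript
    app renders the full UI (chrome, sidebars, etc.) for human
    visitors; crawlers see only the content."""
    return (
        "<!DOCTYPE html>"
        f"<html><head><title>{title}</title></head><body>"
        '<div id="app"></div>'  # client-side app mounts here
        f"<noscript>{content_html}</noscript>"  # same content, no chrome
        "</body></html>"
    )

print(page_with_noscript("Products", "<h1>Products</h1>"))
```

The content inside `noscript` must match what the client renders; everything else can be omitted.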
The drawbacks are much the same as for option one, just tamer, since you are duplicating some of the code rather than all of it. It will start out easy: your HTML will be simple, and only a couple of pages will need to be crawlable. Over time, the scope is sure to expand, and as the site changes you’ll still have to maintain both the server-side and client-side code.
Use PhantomJS in realtime
PhantomJS is a headless browser: it loads your page, runs the JavaScript, and hands you the resulting HTML. When a crawler request comes in, you fire off PhantomJS, let it render the page, and send the finished markup to Google. The catch is speed: rendering on demand is slow, and a surge of crawler requests can swamp your servers.
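The realtime approach boils down to two steps: recognize a crawler request, then render the page with PhantomJS on the fly. A minimal Python sketch, assuming a hypothetical `render.js` PhantomJS script that prints the final DOM (the crawler check follows Google’s `_escaped_fragment_` convention for AJAX crawling):

```python
import subprocess
from urllib.parse import urlparse, parse_qs

def is_crawler_request(url: str) -> bool:
    """Google signals an AJAX-crawling request by adding the
    _escaped_fragment_ query parameter (possibly with an empty value)."""
    query = parse_qs(urlparse(url).query, keep_blank_values=True)
    return "_escaped_fragment_" in query

def render_with_phantomjs(url: str) -> str:
    """Run PhantomJS to execute the page's JavaScript and return the
    resulting HTML. 'render.js' is a hypothetical script that loads
    the URL and prints the rendered DOM; this wiring is a sketch,
    not a drop-in solution."""
    return subprocess.run(
        ["phantomjs", "render.js", url],
        capture_output=True, text=True, check=True,
    ).stdout
```

In practice you would put the check in front of your normal request handling and only pay the PhantomJS cost for crawler traffic.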
Use PhantomJS and store snapshots
You can easily improve on the previous method using the oldest trick in the book: caching. Instead of waiting for Google to come calling before you fire off PhantomJS, you can do all the work in advance. You’ll need to open every page on your website, one by one, with PhantomJS, then follow the process described above. However, instead of sending the result straight to Google, you’ll store it. When Google asks for the page, send the one you have already pre-generated.
This takes care of the speed problem, and you can generate pages at a constant rate instead of having to absorb request surges. Of course, the gains come with added complexity, and you’ll have to store all those static pages. If you make site-wide changes, you’ll need to regenerate every page; on a large site, that could take days, even with a few dozen servers grinding away.
Use a third party service
If you want to use stored snapshots but don’t want to go through the trouble of setting up PhantomJS and a cache, check out BromBone. It takes care of the messy part for you: it renders your pages and stores the snapshots. When Google crawls your site, you just fetch the snapshot from BromBone and pass the page on to Google.
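On your end, the integration is a simple pass-through: when a crawler request arrives, fetch the snapshot from the service and return it. A minimal sketch; the endpoint URL below is a made-up placeholder, so consult the service’s own documentation for the real snapshot URL scheme:

```python
from urllib.parse import quote
from urllib.request import urlopen

# Hypothetical snapshot endpoint. The real URL scheme comes from the
# service's documentation.
SNAPSHOT_SERVICE = "https://snapshots.example.com/render?url="

def snapshot_request_url(page_url: str) -> str:
    """Build the service URL for a given page, percent-encoding the
    page URL so it survives as a single query parameter."""
    return SNAPSHOT_SERVICE + quote(page_url, safe="")

def fetch_snapshot(page_url: str) -> str:
    """Fetch the pre-rendered snapshot and return it, ready to pass
    straight on to the crawler."""
    with urlopen(snapshot_request_url(page_url)) as resp:
        return resp.read().decode("utf-8")
```

The only code you maintain is the crawler check and this pass-through; the rendering and storage live with the service.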