We use AngularJS for every project here at Madness Labs, but single page applications (SPAs) and search engines aren’t known to get along. SEO4AJAX and Laravel helped us solve this problem quickly, and here is how. But first, why isn’t a single page application SEO friendly?
How Search Engine Crawlers Work
To understand why your nice and shiny AJAX app is terrible for search engines, we first must understand how crawler bots work. When a regular user visits your site, they have a nice JavaScript-enabled browser, and as they click around, the page’s content is dynamically loaded into the hull of the web app. When a crawler bot performs the same routine, it does so without JavaScript enabled, which means no content is loaded into the page.
No content = No SEO
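To see the problem concretely, imagine fetching the raw HTML of an AngularJS app the way a non-JavaScript crawler does. A minimal sketch (the markup below is a hypothetical example shell, not our actual app):

```javascript
// The raw HTML a non-JS crawler receives from a typical AngularJS SPA:
// the routing container (ng-view) is an empty shell until client-side
// JavaScript runs and injects the page content.
const shellHtml = [
  '<!DOCTYPE html>',
  '<html ng-app="madnessApp">',
  '  <head><title>Madness Labs</title></head>',
  '  <body>',
  '    <div ng-view></div>', // content would be loaded here by Angular
  '    <script src="app.js"></script>',
  '  </body>',
  '</html>'
].join('\n');

// Pull out whatever sits inside the routing container.
const match = shellHtml.match(/<div ng-view>([\s\S]*?)<\/div>/);
const indexableContent = match ? match[1].trim() : '';

console.log(JSON.stringify(indexableContent)); // ""  -> nothing to index
```

The crawler sees an empty `<div ng-view>`, so as far as the search engine is concerned, the page has no content at all.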
Possible Solutions
You have two options when it comes to dealing with SEO in SPAs, and each one has drawbacks.
- Render Flats – Render each page on the server side as well; this ends up with code duplication, not very DRY.
- Headless Browser – Use a tool like PhantomJS to serve each page with the JavaScript already rendered; time consuming to build and run.
Enter SEO4AJAX
Luckily, there is a service that gives us what we need without a big time investment or code duplication: SEO4AJAX. It works by using PhantomJS to render all of the pages in your sitemap, then saving the “snapshots” so you can serve them to the search engines. Setting up a site takes minutes, and they have a simple, growing API. Enough chatter, let’s fix our SEO problems!
Madness Solution
As stated, we used Laravel to route to our snapshots, but this could very well be replicated in ExpressJS or any other server-side language. Basically, we made a filter (GIST BELOW): simply paste in your auth key from SEO4AJAX and it will serve the snapshot whenever the detected visitor is a crawler bot. Pretty neat, eh?
Until Next Time
I hope this helps encourage more people to pick up new age design techniques and not rule out a great SPA framework like AngularJS just because of a couple of SEO workarounds. As you have seen, it can be made simple if you know where to look. See you next time, and thanks for reading.
UPDATE 01/14/15
After speaking with the lovely people at SEO4AJAX, I became aware that my filter was not as friendly to all of the search engines as it could have been. They helped me tweak the GIST (Already Updated) to add support for escaped fragments, which most search engines use for SPAs. Be aware that you will need the following meta tag to enable this functionality.
<meta name="fragment" content="!">
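With that meta tag in place, crawlers that support the AJAX crawling scheme re-request your pages with an `_escaped_fragment_` query parameter, which is what the updated filter looks for. A sketch of how that URL mapping works (my understanding of the scheme, as described in Google's AJAX crawling documentation):

```javascript
// How the AJAX crawling scheme maps SPA URLs to crawler requests.
// With <meta name="fragment" content="!"> on a page, a crawler refetches
// it with an empty "_escaped_fragment_" parameter appended. Hash-bang
// (#!) URLs map their fragment into the query string instead.
function toEscapedFragmentUrl(url) {
  const bang = url.indexOf('#!');
  if (bang !== -1) {
    const base = url.slice(0, bang);
    const fragment = url.slice(bang + 2);
    const sep = base.indexOf('?') !== -1 ? '&' : '?';
    return base + sep + '_escaped_fragment_=' + encodeURIComponent(fragment);
  }
  // No hash-bang: the meta tag tells the crawler to append an empty value.
  const sep = url.indexOf('?') !== -1 ? '&' : '?';
  return url + sep + '_escaped_fragment_=';
}

console.log(toEscapedFragmentUrl('https://example.com/#!/about'));
// https://example.com/?_escaped_fragment_=%2Fabout
console.log(toEscapedFragmentUrl('https://example.com/pricing'));
// https://example.com/pricing?_escaped_fragment_=
```

Your server sees the `_escaped_fragment_` parameter, knows the request came from a crawler, and serves the snapshot instead of the empty app shell.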
A big thanks to SEO4AJAX for their wonderful product and their contribution to the Madness.