A relatively new trend in incorporating JS into websites is single-page applications. While a traditional website requests its resources (HTML, CSS, JS) from the server each time a page is loaded, an SPA is loaded once and then handles everything in the browser, fetching only data rather than full pages. This results in faster websites but can be a disaster for SEO.
In this post, we’ll discuss how SPAs are made, why they are so hard to optimize, and how to make sure search engines can understand them and rank them well.
What is an SPA
The technology behind SPAs is favorable to end users: they can navigate through a website without the discomfort of extra page loads and layout shifts. Given that single-page applications cache all their resources locally (after they are loaded on the initial request), users can continue browsing even under an unstable connection. Due to these benefits, the technology is here for good even though it demands extra SEO effort.
Why is it hard to optimize SPAs
Before JS started dominating web development, search engines crawled only the text-based content in HTML. As JS grew more popular, Google began working on the ability to interpret JS resources and understand the pages that rely on them. It has made significant improvements over the years, but there are still plenty of problems with how search crawlers see and access content on single-page applications.
Problems with 404 errors
With an SPA, you lose the traditional logic behind the 404 error page and many other non-200 server status codes. Since everything is rendered by the browser, the web server returns a 200 HTTP status code for every request, and search engines can't tell that some pages are not valid for indexing.
Problems with analytics tracking
Another issue that comes with an SPA is Google Analytics tracking. On traditional websites, the analytics code runs every time a user loads or reloads a page, counting each view. But when users navigate between views of a single-page application, the code runs only once, on the initial load, so individual pageviews are never triggered. Because content is loaded dynamically without new requests for full pages, GA doesn't register a hit for each view. Luckily, there are methods to track user activity on a single-page application website (we'll cover them later), but they require additional effort.
How to optimize SPAs
Now let's see what actions you can take to make your single-page application website fully visible to crawlers. Among the helpful measures, you'll find SEO basics such as a clean, polished sitemap as well as some specific techniques related to rendering.
Before dealing with the crawling and indexing problems specific to SPAs, make sure you cover the basics: create a properly formatted sitemap file and submit it to Google. It won't help with JS resources, but at least search engines will know that your pages exist and how your website is structured.
SE Ranking's Website Audit checks a website's sitemap for a number of issues.
Pre-rendering
Another go-to solution for single-page applications is pre-rendering: pages are rendered to full HTML in advance and cached on the server so they can be served to search crawlers. There are services like Prerender and BromBone that intercept requests made to a website and show different page versions to search bots and real users: the cached HTML to the former and the "normal" JS-rich content to the latter. Websites with fewer than 250 pages can use Prerender for free, while bigger ones have to pay a monthly fee that starts at $200. It's a straightforward solution: you upload the sitemap file and the service does the rest. BromBone doesn't even require a manual sitemap upload and costs $129 per month.
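To picture how such interception works under the hood, here's a minimal sketch of a Node/Express server that serves cached snapshots to search bots and the regular app shell to everyone else. The bot list, folder names, and port are assumptions for illustration, not part of any particular service's setup:

```javascript
// A minimal sketch: serve pre-rendered HTML to bots, the SPA shell to users.
const express = require('express');
const fs = require('fs');
const path = require('path');

const app = express();
const BOT_PATTERN = /googlebot|bingbot|yandex|baiduspider|twitterbot|facebookexternalhit/i;

app.use(express.static('dist', { index: false })); // JS, CSS, images

app.get('*', (req, res) => {
  const isBot = BOT_PATTERN.test(req.headers['user-agent'] || '');
  if (isBot) {
    // Hypothetical folder of snapshots produced by a pre-rendering step
    const snapshot = path.join(__dirname, 'prerendered', req.path, 'index.html');
    if (fs.existsSync(snapshot)) {
      return res.sendFile(snapshot);
    }
  }
  // Regular visitors get the normal SPA shell
  res.sendFile(path.join(__dirname, 'dist', 'index.html'));
});

app.listen(3000);
```

The commercial services mentioned above do essentially the same thing, only without you having to generate and maintain the snapshots yourself.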
There are other, more time-consuming methods for serving static HTML to crawlers. For example, you can use Headless Chrome with the Puppeteer library to render each route of your app into a static HTML file, building a hierarchical tree of pages. You'll then need to strip the bootstrap code from the rendered pages and edit your server configuration so that it serves the static HTML to search bots.
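Here's a rough idea of what the rendering step could look like with Puppeteer; the base URL, route list, and output folder are assumptions you'd replace with your own:

```javascript
// A minimal pre-rendering sketch with Puppeteer: render each route of a
// locally running SPA build and save the resulting HTML to disk.
const puppeteer = require('puppeteer');
const fs = require('fs');
const path = require('path');

const BASE_URL = 'http://localhost:8080';    // local build of the SPA
const ROUTES = ['/', '/about', '/contact'];  // routes to snapshot
const OUT_DIR = './prerendered';

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();

  for (const route of ROUTES) {
    // Wait until network activity settles so the view is fully rendered
    await page.goto(BASE_URL + route, { waitUntil: 'networkidle0' });
    const html = await page.content(); // fully rendered HTML

    const file = path.join(OUT_DIR, route === '/' ? 'index.html' : `${route}/index.html`);
    fs.mkdirSync(path.dirname(file), { recursive: true });
    fs.writeFileSync(file, html);
  }

  await browser.close();
})();
```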
Progressive enhancement with feature detection
Feature detection is among Google's major recommendations for SPAs. This technique involves progressively enhancing the experience with different code resources. Here's how it works: a simple HTML page serves as a basis that is accessible to crawlers and users, while the features layered on top of it (CSS and JS resources) are enabled or disabled according to browser support.
To implement feature detection, you'll need to write separate chunks of code checking whether each API your features depend on is supported by the browser. Fortunately, there are libraries like Modernizr that help save time.
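As an illustration, here's a minimal feature-detection sketch in plain JS; the enhancement functions are hypothetical placeholders for your own code:

```javascript
// The page works without these APIs; richer behavior is layered on only
// when the browser supports it.
if ('IntersectionObserver' in window) {
  initLazyImages(); // hypothetical enhancement: lazy-load images
} else {
  // fall back to the eagerly loaded images already present in the HTML
}

if ('serviceWorker' in navigator) {
  navigator.serviceWorker.register('/sw.js').catch(() => {
    // registration failed; the site still works without offline support
  });
}

// Modernizr exposes similar checks as boolean properties,
// for example: if (Modernizr.flexbox) { ... }
```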
Views as URLs
When users move through an SPA, they pass through separate sections of the website. Technically, an SPA contains only one page (a single index.html file), but visitors feel like they're browsing multiple pages. By default, when users move through different parts of a single-page application website, the URL changes only in its hash part (for example, http://website.com/#/about, http://website.com/#/contact). The JS instructs browsers to load certain content based on these fragment identifiers (hash changes).
To help search engines perceive different sections of a website as different pages, you need to give them distinct URLs with the help of the History API, an HTML5-standardized way of manipulating the browser history. Google's Codelabs suggest using this API instead of hash-based routing so that the fragments of content previously triggered by hash changes live at separate, crawlable URLs. With the History API, your navigation links can point to real paths instead of hashes.
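Here's a minimal sketch of what hash-free, History API-based navigation could look like; renderView is a hypothetical function that swaps the visible content:

```javascript
// Intercept clicks on internal links and update the URL without a reload.
document.addEventListener('click', (event) => {
  const link = event.target.closest('a[data-internal]');
  if (!link) return;

  event.preventDefault();
  const path = new URL(link.href).pathname; // e.g. "/about" instead of "/#/about"
  history.pushState({}, '', path);          // update the address bar
  renderView(path);                         // hypothetical: render the matching view
});

// Handle the browser's back/forward buttons
window.addEventListener('popstate', () => {
  renderView(location.pathname);
});
```

Keep in mind that the server also has to respond to each of these paths with the application shell (or a pre-rendered page), otherwise deep links will break when loaded directly.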
Views for error pages
With single-page websites, the server has nothing to do with error handling and will always return a 200 status code saying that everything is okay. But users can reach an SPA via a wrong URL, and there should be some way to handle error cases. Google recommends creating separate views for each error code (404, 500, etc.) and tweaking the JS so that it routes the browser to the respective view.
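As a rough illustration, here's how a router could send unknown URLs to a dedicated 404 view and mark it as not indexable; the route list and render functions are placeholders:

```javascript
// A minimal sketch of routing unknown URLs to an error view in an SPA.
const KNOWN_ROUTES = ['/', '/about', '/contact'];

function renderView(path) {
  if (!KNOWN_ROUTES.includes(path)) {
    renderNotFound(); // hypothetical 404 view

    // Tell crawlers not to index this "soft 404" view
    const robots = document.createElement('meta');
    robots.name = 'robots';
    robots.content = 'noindex';
    document.head.appendChild(robots);
    return;
  }
  renderPage(path); // hypothetical regular view
}
```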
Social shares and structured data
Social sharing optimization is often overlooked by websites: we've learned that missing Twitter Cards top the list of the most common issues identified by SE Ranking's Website Audit tool. No matter how insignificant it may look, implementing Twitter Cards and Facebook's Open Graph allows for rich sharing across popular social media channels, which is good for your website's search visibility. If you don't use these protocols, sharing your link may pull in a random, not necessarily relevant, image as the preview.
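In an SPA, these tags also have to be kept in sync with the current view. Here's a minimal sketch of doing that in JS, with placeholder tag values; note that most social crawlers don't execute JS, so this only pays off if the views are also pre-rendered or server-rendered:

```javascript
// Create or update an Open Graph / Twitter Card meta tag in the page head.
function setMetaTag(attr, name, content) {
  let tag = document.head.querySelector(`meta[${attr}="${name}"]`);
  if (!tag) {
    tag = document.createElement('meta');
    tag.setAttribute(attr, name);
    document.head.appendChild(tag);
  }
  tag.setAttribute('content', content);
}

// Hypothetical helper called by the router whenever the view changes
function updateSocialTags(view) {
  setMetaTag('property', 'og:title', view.title);
  setMetaTag('property', 'og:description', view.description);
  setMetaTag('property', 'og:image', view.imageUrl);
  setMetaTag('name', 'twitter:card', 'summary_large_image');
}
```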
Using structured data is also important to make different types of website content readable to crawlers. Schema.org provides vocabularies for labeling data types like videos, recipes, products, and so on. You can run Google's Rich Results Test to learn whether data types are assigned correctly and enable rich search results for your web pages.
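For example, a product view could inject its Schema.org markup as JSON-LD like this; the product data is a placeholder for illustration:

```javascript
// A minimal sketch of adding structured data (JSON-LD) from an SPA view.
const data = {
  '@context': 'https://schema.org',
  '@type': 'Product',
  name: 'Example product',
  description: 'Short product description',
  offers: { '@type': 'Offer', price: '49.00', priceCurrency: 'USD' },
};

const script = document.createElement('script');
script.type = 'application/ld+json';
script.textContent = JSON.stringify(data);
document.head.appendChild(script);
```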
Testing an SPA
Previously, Google Webmaster Tools included the Fetch as Google feature, which let you see the downloaded HTTP response and page HTML as it was fetched and rendered by the search engine. But in 2019, Google removed the tool, and now you can only access some crawling and indexing information in the URL Inspection section of Search Console. It doesn't give as informative a preview of what Google sees, but it provides basic information about crawling and indexing issues.
Google's Mobile-Friendly Test is also helpful, as it shows you the rendered HTML and identifies page resources that can't be loaded and processed by search engines. Plus, Headless Chrome is a great way to test your SPA and see how its JS will be executed: a headless browser doesn't have a full UI but provides the same environment real users get. Finally, test your SPA across various browsers using tools like BrowserStack.
Solving tracking problems
Since the traditional Google Analytics tracking code only fires on the initial load of a single-page website, you'll have to take extra steps. The trick here is to record and monitor real user interactions instead of relying on full page loads.
GA itself suggests tracking virtual pageviews: you use the set command to update the page value and then send a pageview. You can also implement plugins like Angulartics that track pageviews based on user navigation across the website. Or you can set up History Change triggers in Google Tag Manager, which also track these interactions. There are other tools that can help you with SPA tracking by gathering RUM (real user monitoring) data.
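Here's a minimal sketch of the virtual pageview approach with analytics.js; it assumes the standard tracking snippet is already on the page, and the function would be called from your router after each navigation:

```javascript
// Send a virtual pageview whenever the SPA switches views.
function trackVirtualPageview(path, title) {
  ga('set', 'page', path);   // update the page value
  ga('set', 'title', title);
  ga('send', 'pageview');    // record a pageview for the new view
}

// For example, after a History API navigation:
// trackVirtualPageview('/about', 'About us');
```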
There’s no way around basic SEO
The specific nature of SPAs aside, most general optimization advice suits this type of website. Basic SEO efforts entail:
- Security. If you haven't already, protect your website with HTTPS. Otherwise, you might get cast aside by search engines and compromise user data if your site collects any. Website security isn't something you cross off your to-do list once but a valuable aspect that needs continuous monitoring. Regularly check your SSL/TLS certificate for critical errors to make sure your website can be safely accessed.
- Content optimization. We’ve talked about the SPA-specific measures like writing unique title tags and description meta tags for each view (like you would for each page on a multi-page website). Before you do that, you need to have optimized content: tailored to the right user intents, well-organized, visually appealing, and rich in helpful information. If you haven’t collected a keyword list for the site, it will be hard to provide visitors with the content they need. You can check out our guide on keyword research to find some new insights.
- Link building. Backlinks signal to Google the level of trust other resources place in your website, so building a backlink profile is a vital part of SEO. No two backlinks are alike: each link pointing to your website has a different value, and while some can significantly boost your rankings, spammy ones can damage your search presence. Learn more about backlink quality and build up your link profile according to the best practices.
- Competitor monitoring. You've probably researched your competitors in the early stages of website development. As with most SEO and marketing tasks, though, you need to keep track of your niche all the time. Thanks to data-rich tools, you can easily monitor rivals' strategies in organic and paid search, evaluating the situation on the market, spotting fluctuations among major competitors, and getting inspiration from particular keywords or campaigns that already work for similar sites.
Single-page application websites done right
To make your SPA shine in the eyes of search engines, you need to make its content easily accessible to crawlers. While providing visitors with dynamic content loading, blazing speed, and seamless navigation, don't forget to serve a static version to search engines. On top of that, make sure you have a correct sitemap, use distinct URLs instead of fragment identifiers, and label different content types with structured data.
Do you have any experience with single-page application websites? Have you struggled with pushing your SPA to the top of the SERPs? Share your thoughts and life lessons in the comments section.