Anastasia Osypenko
May 21, 2021 | 16 min read

Historically, web developers used HTML for content, CSS for styling, and JavaScript for interactivity. It’s JS that makes it possible to add pop-up dialog boxes and expandable content to web pages. Now, over 97% of all sites use JavaScript because it allows web content to change in response to user actions.

A relatively new trend in incorporating JS into websites is single-page applications. While traditional websites load their resources (HTML, CSS, JS) by requesting each from the server every time it’s needed, SPAs require just one initial load and don’t bother the server after that, leaving all the processing to the browser. This results in faster websites but can be a disaster for SEO.

In this post, we’ll discuss how SPAs are made, why they are so hard to optimize, and how to make sure search engines can understand them and rank them well.

What is an SPA

A single-page application, or SPA, is a JavaScript-based approach to website development that doesn’t require any further page loads after the initial one. React, Angular, and Vue are the most popular JavaScript frameworks used for building SPAs. They mostly differ in their libraries and APIs but follow the same logic of fast client-side rendering. Many high-profile websites (Twitter, Pinterest, Airbnb) are built with a single-page application architecture.

An SPA eliminates most of the requests between the server and the browser, making the site much faster. But search engines are not so thrilled about this JavaScript trick. The problem is that search engines don’t get enough content: they don’t click around like real users and don’t understand that the content is added dynamically. What they’re left with is a blank page yet to be filled.

The mechanics behind SPAs

The technology behind SPAs is favorable to end users: they can easily navigate through web pages without the discomfort of extra page loads and layout shifts. Given that single-page application sites cache all their resources in local storage (after they are loaded at the initial request), users can continue browsing them even under an unstable connection. Thanks to these benefits, the technology is here to stay even though it demands extra SEO effort.

Why is it hard to optimize SPAs

Before JS started dominating web development, search engines crawled only the text-based content in HTML. As JS grew more and more popular, Google started working on the functionality to interpret JS resources and understand pages that rely on them. It has made significant improvements over the years, but there are still a lot of problems with how search crawlers see and access content on single-page applications.

There’s little information on how other search engines perceive single-page applications, but it’s safe to say that none of them are crazy about websites reliant on JavaScript. If you’re targeting search platforms beyond Google, you’re in quite a pickle. The 2017 Moz experiment showed that only Google and, surprisingly, Ask were able to crawl JavaScript content, while all other search engines remained totally blind to JS. As of today, no search engine except Google has announced a breakthrough in understanding JS and single-page application websites. At least there are some official recommendations: for example, Bing makes the same suggestion as Google and encourages server-side pre-rendering, a technique that allows Bingbot (and other crawlers) to receive static HTML as the most complete and comprehensible version of a page.

Search bots failing to understand JavaScript

Crawling issues

HTML, which is easily crawlable by search engines, doesn’t contain much information on an SPA. It merely references an external JavaScript file via the src attribute of a <script> tag. The browser runs the script from this file and the content is loaded dynamically, but the crawler might fail to perform the same operation, in which case all it sees is an empty page.

Back in 2014, Google announced that it was improving its ability to understand JS pages, while admitting that there were a lot of blockers preventing it from indexing JS-rich websites. In the Google I/O ‘18 series, Google analysts talked about two waves of indexing for JavaScript-based sites, meaning that Googlebot re-renders the content when it has the resources. Since heavy JavaScript takes much more processing power and memory from Googlebot, the cycle of crawling, rendering, and indexing isn’t instant. Fortunately, in 2019, Google said the median time for JS-based web pages to go from crawler to renderer was 5 seconds. Just as we were getting used to the two-waves-of-indexing scheme, in 2020, Google’s Martin Splitt said that there was no longer such a thing. Or rather, that it’s “more complicated.”

How Googlebot processes JavaScript

The major thing to understand here is that there’s a delay in how Google processes JavaScript on web pages, and JS content that is loaded on the client side might not be seen in full or indexed properly. Search engines can discover the page but won’t be able to tell whether the copy on that page is high-quality and corresponds to the search intent.

Problems with 404 errors

With an SPA, you also lose the traditional logic behind the 404 error page and many other non-200 server status codes. Since everything is rendered by the browser, the web server returns a 200 HTTP status code to every request, and search engines can’t tell if some pages are not valid for indexing. 

Tracking issues

Another issue that comes with an SPA is Google Analytics tracking. On traditional websites, the analytics code runs every time a user loads or reloads a page, counting each view. But when users navigate through different pages of a single-page application, the code runs only once and doesn’t register each individual pageview. The nature of dynamic content loading prevents GA from getting a server response for each pageview. Luckily, there are methods to track user activity on a single-page application website (we’ll cover them later), but they require additional effort.

How to optimize SPAs

For now, let’s see what actions you can take to make your single-page application website fully visible to crawlers. These include SEO basics, such as a clean, polished sitemap, as well as some specific techniques related to rendering.

Sitemap

Before dealing with crawling and indexing problems specific to SPAs, make sure you do the basics: create a properly formatted sitemap file and submit it to Google. It won’t help you with JS resources, but at least search engines will know that your pages exist and how your website is structured.
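If your SPA’s views already live in a routes list, generating the sitemap can be scripted. Below is a minimal Node.js sketch; the routes array and the example.com domain are placeholders for illustration, not tied to any particular framework.

```javascript
// Minimal sketch: generate sitemap.xml from an SPA's route list (Node.js).
// The routes array and domain are hypothetical placeholders.
const fs = require('fs');

const routes = ['/', '/about', '/contact'];

const urls = routes
  .map((route) => `  <url><loc>https://example.com${route}</loc></url>`)
  .join('\n');

fs.writeFileSync(
  'sitemap.xml',
  `<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
${urls}
</urlset>
`
);
```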

SE Ranking’s Website Audit checks a website’s sitemap for a number of issues:

Sitemap check in Website Audit

Server-side rendering

Server-side rendering (SSR) renders a website on the server and then sends it to the browser. This technique allows search bots to crawl all of the JavaScript-based content. While this is a lifesaver in terms of crawling and indexing, it can slow down page delivery. The catch is that SPAs rely on the opposite approach by nature: it’s client-side rendering that makes them so fast and interactive for users, and also simpler to deploy.
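As a rough illustration, a minimal SSR setup could look like the sketch below, using Express and React’s renderToString. The App component, the bundle.js path, and the port are assumptions for the sake of the example, not a prescribed implementation.

```javascript
// Minimal server-side rendering sketch with Express + React.
// <App /> and /bundle.js are hypothetical; adapt to your own project.
const express = require('express');
const React = require('react');
const { renderToString } = require('react-dom/server');
const App = require('./App');

const server = express();

server.get('*', (req, res) => {
  // The React tree is rendered to an HTML string on the server,
  // so crawlers receive populated markup instead of an empty shell.
  const markup = renderToString(React.createElement(App, { url: req.url }));

  res.send(`<!DOCTYPE html>
<html>
  <head><title>My SPA</title></head>
  <body>
    <div id="root">${markup}</div>
    <script src="/bundle.js"></script>
  </body>
</html>`);
});

server.listen(3000);
```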

Isomorphic JS

A possible rendering solution for a single-page application is isomorphic, or “universal,” JavaScript. Isomorphic JS generates web pages on the server and relieves search crawlers from having to execute and render JS files.

The “magic” of isomorphic JavaScript applications lies in their ability to run on both the server and the client side. How does it work? Users interact with such a website as if its content were rendered by the browser, when in fact they are served an HTML file generated on the server side. There are frameworks that facilitate isomorphic app development for each popular SPA framework: for example, Next.js and Gatsby for React. The former generates HTML for each request, while the latter generates a static website and stores the HTML in the cloud. Similarly, Nuxt.js for Vue renders JS into HTML on the server and sends the data to the browser.
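To give a sense of how this looks in practice, here is a sketch of a Next.js page that is rendered on the server for the first request and hydrated in the browser afterwards. The API URL, the product fields, and the file path are made up for illustration.

```javascript
// pages/products/[id].js — hypothetical Next.js page.
// getServerSideProps runs on the server, so crawlers get ready-made HTML.
export async function getServerSideProps({ params }) {
  const res = await fetch(`https://api.example.com/products/${params.id}`); // made-up API
  const product = await res.json();
  return { props: { product } };
}

// The same component also runs in the browser during client-side navigation.
export default function ProductPage({ product }) {
  return (
    <main>
      <h1>{product.name}</h1>
      <p>{product.description}</p>
    </main>
  );
}
```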

Pre-rendering

Another go-to solution for single-page applications is pre-rendering. It means loading all HTML elements and caching them on the server so they can then be served to search crawlers. Services like Prerender and BromBone intercept requests made to a website and show different page versions to search bots and real users: the cached HTML to the former and the “normal” JS-rich content to the latter. Websites with fewer than 250 pages can use Prerender for free, while bigger ones have to pay a monthly fee that starts at $200. It’s a straightforward solution: you upload the sitemap file and it does the rest. BromBone doesn’t even require a manual sitemap upload and costs $129 per month.

There are other, more time-consuming methods for serving static HTML to crawlers. For example, you can use Headless Chrome and the Puppeteer library to convert your routes into a hierarchical tree of static HTML files. You’ll then need to remove the bootstrap code and edit your server configuration so that search bots are served the static HTML.

Pre-rendering for SPAs
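As a rough sketch of the Headless Chrome approach described above, the script below visits a hypothetical list of routes on a locally running SPA and saves the rendered HTML to disk. The routes, port, and output folder are assumptions for illustration.

```javascript
// Pre-rendering sketch with Puppeteer (Headless Chrome).
// Routes, port, and output folder are hypothetical placeholders.
const fs = require('fs');
const path = require('path');
const puppeteer = require('puppeteer');

const routes = ['/', '/about', '/contact'];

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();

  for (const route of routes) {
    // Wait until network activity settles so client-rendered content is in the DOM.
    await page.goto(`http://localhost:3000${route}`, { waitUntil: 'networkidle0' });
    const html = await page.content();

    const file = path.join('dist/prerendered', route, 'index.html');
    fs.mkdirSync(path.dirname(file), { recursive: true });
    fs.writeFileSync(file, html);
  }

  await browser.close();
})();
```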

Progressive enhancement with feature detection

Feature detection is among Google’s major recommendations for SPAs. This technique involves progressively enhancing the experience with different code resources. How does it work? A simple HTML page serves as a basis that is accessible to crawlers and users, while the features on top of it (CSS and JS resources) are enabled or disabled according to browser support.

To implement feature detection, you’ll need to write separate chunks of code to check if each of the feature APIs you need is compatible with each browser. Fortunately, there are specific libraries like Modernizr that help save time. 
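A minimal sketch of what feature detection with Modernizr might look like is shown below; the helper functions are hypothetical and the exact detects available depend on how you build Modernizr.

```javascript
// Feature detection sketch: the base HTML works everywhere,
// enhancements are switched on only when the browser supports them.
// initClientSideRouter() and loadDynamicContent() are hypothetical helpers.
if (Modernizr.history) {
  // The History API is available, so pushState-based routing is safe to use.
  initClientSideRouter();
}
// Otherwise, plain <a href> links keep working with full page loads.

if (Modernizr.fetch) {
  // The Fetch API is available, so content can be loaded dynamically.
  loadDynamicContent();
}
// Otherwise, visitors rely on the server-rendered HTML already on the page.
```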

Views as URLs

When users scroll through an SPA, they pass through separate website sections. Technically, an SPA contains only one page (a single index.html file), but visitors feel like they’re browsing multiple pages. When users move through different parts of a single-page application website, the URL changes only in its hash part (for example, http://website.com/#/about, http://website.com/#/contact). The JS file instructs browsers to load certain content based on fragment identifiers (hash changes).

To help search engines perceive different sections of a website as different pages, you need to use distinct URLs with the help of the History API, a method of manipulating the browser history standardized in HTML5. Google Codelabs suggests using this API instead of hash-based routing so that search engines can see the different fragments of content triggered by view changes as separate pages. The History API allows you to change navigation links and use paths instead of hashes.
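Here is a bare-bones sketch of routing built on the History API instead of hash fragments; renderView() is a hypothetical function that swaps the visible content.

```javascript
// History API routing sketch: real paths instead of /#/ fragments.
// renderView() is a hypothetical function that swaps the visible content.
function navigate(path) {
  history.pushState({ path }, '', path); // e.g. /about instead of /#/about
  renderView(path);
}

// Re-render the correct view when the user presses Back or Forward.
window.addEventListener('popstate', (event) => {
  renderView(event.state ? event.state.path : location.pathname);
});
```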

Google analyst Martin Splitt gives the same advice: treat views as URLs by using the History API. He also suggests adding link markup with href attributes and creating unique title and description tags for each view (with “a little extra JavaScript”). Note that this markup advice applies to all links on your website: output them as an <a> tag with an href attribute instead of using an onclick action. JavaScript onclick can’t be crawled and is pretty much invisible to Google.
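Continuing the sketch above, internal links can keep a crawlable href while still being handled by the SPA, and each view can set its own title and meta description. The data-internal attribute and the setMetaForView() helper are assumptions for illustration.

```javascript
// Crawlable links: real <a href> elements instead of onclick handlers.
// The data-internal attribute and setMetaForView() are hypothetical.
document.querySelectorAll('a[data-internal]').forEach((link) => {
  link.addEventListener('click', (event) => {
    event.preventDefault();               // stay inside the SPA...
    navigate(link.getAttribute('href'));  // ...while keeping a crawlable href
  });
});

// Unique title and description per view ("a little extra JavaScript").
function setMetaForView(view) {
  document.title = view.title;
  document
    .querySelector('meta[name="description"]')
    .setAttribute('content', view.description);
}
```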

Views for error pages

With single-page websites, the server has nothing to do with error handling and will always return a 200 status code saying that everything is okay. But users can still try to access an SPA via a wrong URL, and there should be some way to deal with error responses. Google recommends creating separate views for each error code (404, 500, etc.) and tweaking the JS file so that it directs browsers to the respective view.
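One possible shape for this, continuing the earlier sketches: if the requested path doesn’t match any known view, render a dedicated 404 view and add a noindex robots meta tag so the page isn’t treated as indexable content. The routes map, renderTemplate(), and setMetaForView() are hypothetical helpers.

```javascript
// Error view sketch for a client-side router.
// routes, renderTemplate(), and setMetaForView() are hypothetical helpers.
function renderView(path) {
  const view = routes[path];

  if (!view) {
    // The server still answers 200, so show a dedicated 404 view
    // and mark it noindex to keep it out of the search index.
    const robots = document.createElement('meta');
    robots.name = 'robots';
    robots.content = 'noindex';
    document.head.appendChild(robots);

    renderTemplate('error-404');
    setMetaForView({ title: 'Page not found', description: 'This page does not exist.' });
    return;
  }

  renderTemplate(view.template);
  setMetaForView(view);
}
```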

Social shares and structured data

Social sharing optimization is often overlooked by websites: we’ve learned that missing Twitter Cards top the list of the most common issues identified by SE Ranking’s Website Audit tool. No matter how insignificant it may look, implementing Twitter Cards and Facebook’s Open Graph will allow for rich sharing across popular social media channels, which is good for a website’s search visibility. If you don’t use these protocols, sharing your link will trigger a random, not necessarily relevant, visual object to be displayed as a preview.

Using structured data is also important to make different types of website content readable to crawlers. Schema.org provides options to label data types like videos, recipes, products, and so on. You can run Google’s Rich Results Test to check whether data types are assigned correctly and enable rich search results for your web pages.
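For instance, a product view could inject its schema.org markup as JSON-LD when it renders. The product object and its fields below are made up, and the right schema type depends on your content.

```javascript
// Sketch: inject schema.org structured data as JSON-LD for the current view.
// The product object and its fields are hypothetical.
function addStructuredData(product) {
  const script = document.createElement('script');
  script.type = 'application/ld+json';
  script.textContent = JSON.stringify({
    '@context': 'https://schema.org',
    '@type': 'Product',
    name: product.name,
    description: product.description,
    offers: {
      '@type': 'Offer',
      price: product.price,
      priceCurrency: 'USD',
    },
  });
  document.head.appendChild(script);
}
```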

Testing an SPA

Previously, Google Webmaster Tools included the Fetch as Google feature, which let you see the downloaded HTTP response and the page HTML as it was fetched and rendered by the search engine. But in 2019, Google removed the tool, and now you can only access some crawling and indexing information in the URL Inspection section of Search Console. It doesn’t give an informative preview of what Google sees but provides basic information about crawling and indexing issues.

Crawling and indexing check in GSC

Google’s Mobile-Friendly Test is also helpful as it shows you the rendered HTML and identifies page resources that can’t be loaded and processed by search engines. Plus, Headless Chrome is a great way to test your SPA and see how the JS will be executed: a headless browser doesn’t have a full UI but provides the same environment that real users have. Finally, test your SPA across various browsers using tools like BrowserStack.

Solving tracking problems

Since the traditional Google Analytics tracking code doesn’t work with single-page websites, you’ll have to use additional tools. The trick here is to record and monitor real user interactions instead of pageviews.

GA itself suggests tracking virtual pageviews by using the set command to specify a new page value. You can also implement plugins like Angulartics, which track pageviews based on user navigation across the website. Or you can set up History Change triggers in Google Tag Manager, which also track user interactions. There are other tools that can help you with SPA tracking by gathering RUM (real user monitoring) data.
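As a rough sketch, a virtual pageview can be fired whenever the SPA changes views, for example inside the navigate() helper from the earlier examples. The snippet assumes the standard analytics.js (ga) or gtag.js snippet is already installed on the page.

```javascript
// Virtual pageview sketch for an SPA; assumes analytics.js or gtag.js is already loaded.
function trackView(path) {
  if (window.ga) {
    // analytics.js: update the tracker's page value, then send a pageview.
    ga('set', 'page', path);
    ga('send', 'pageview');
  }
  if (window.gtag) {
    // gtag.js: send a page_view event with the new path.
    gtag('event', 'page_view', { page_path: path });
  }
}

// Call trackView(path) on every client-side navigation (e.g. from navigate()).
```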

There’s no way around basic SEO

The specific nature of SPAs aside, most general optimization advice suits this type of website. Basic SEO efforts entail:

  • Security. If you haven’t already, protect your website with HTTPS. Otherwise, you might get cast aside by search engines and compromise user data if the site collects any. Website security isn’t something you cross off your to-do list once but a valuable aspect that needs continuous monitoring. Regularly check your SSL/TLS certificate for critical errors to make sure your website can be safely accessed:
Website security check
  • Content optimization. We’ve talked about the SPA-specific measures like writing unique title tags and description meta tags for each view (like you would for each page on a multi-page website). Before you do that, you need to have optimized content: tailored to the right user intents, well-organized, visually appealing, and rich in helpful information. If you haven’t collected a keyword list for the site, it will be hard to provide visitors with the content they need. You can check out our guide on keyword research to find some new insights.
  • Link building. Backlinks signal to Google the level of trust other resources place in your website, so building a backlink profile is a vital part of SEO. No two backlinks are alike: each link pointing to your website has a different value, and while some can significantly boost your rankings, spammy ones can damage your search presence. Learn more about backlink quality and build up your link profile according to the best practices.
  • Competitor monitoring. You’ve probably researched your competitors in the early stages of website development. Like with most SEO and marketing tasks, you need to keep track of your niche all the time. Thanks to data-rich tools, you can easily monitor rivals’ strategies in organic and paid search, evaluating the situation on the market, spotting fluctuations among major competitors, and getting inspiration from particular keywords or campaigns that already work for similar sites. 

Single-page application websites done right

To make your SPA shine in the eyes of search engines, you need to make its content easily accessible to crawlers. While providing visitors with dynamic content loading, blazing speed, and seamless navigation, don’t forget to serve a static version to search engines. On top of that, make sure to have a correct sitemap, use distinct URLs instead of fragment identifiers, and label different content types with structured data.

The single-page experience made possible by JavaScript responds to the demands of modern users who want to interact with web content as quickly as possible. To keep the UX-centered benefits of an SPA and also rank well in search, developers are switching to what Airbnb engineer Spike Brehm calls “the hard way”: balancing between the client and the server.

Do you have any experience with single-page application websites? Have you struggled with pushing your SPA to the top of the SERPs? Share your thoughts and life lessons in the comments section.

