#SEOWeek 2019: Allowing Search Engines to Read JavaScript

In our previous #SEOWeek presentation, we emphasized the need to create a strong user experience on your website. If you’ve spent time building a sleek, high-performance site to dazzle your audience, there’s a solid chance you’ve used some JavaScript elements. JavaScript is responsible for many of those dynamic, interactive features you frequently encounter while browsing the web, such as dropdown menus, carousels, and more.

While JavaScript can make your site look cool, it may also cause problems for search engine crawlers as they attempt to read and index your site. This could mean the significant drop in rankings that every site owner dreads. If you have JavaScript implemented on your own site, it’s important to determine whether it’s hurting more than helping, and how to troubleshoot accordingly.

What Are JavaScript & Web Crawlers?

Before we dive into the details, here’s a quick refresher on the roles played by JavaScript and web crawlers.

JavaScript is a computer programming language commonly used to create interactive effects within web browsers. It can be inserted anywhere within the HTML of a webpage. These effects include dynamically generated content and site functions like hover effects, slide-out menus, and other features that power ecommerce experiences.

An example of source code written in JavaScript. (Source: Wikimedia Commons)
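For a concrete (if simplified) sense of what that looks like, a snippet along these lines is the kind of JavaScript that powers a dropdown menu; the element IDs here are purely illustrative:

```javascript
// Illustrative example: toggle a dropdown menu when its button is clicked.
// The element IDs ("menu-button", "menu-items") are hypothetical.
document.addEventListener('DOMContentLoaded', () => {
  const button = document.getElementById('menu-button');
  const menu = document.getElementById('menu-items');

  button.addEventListener('click', () => {
    menu.classList.toggle('open'); // show or hide the menu via a CSS class
  });
});
```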

A web crawler, also known as a “spider” or “bot,” is a program that visits a website and reads its pages in order to develop entries for a search engine’s index. If a page isn’t indexed correctly, the search engine won’t be able to provide it for relevant queries and its organic search ranking will suffer.

JavaScript has become one of the most popular programming languages among web developers because it’s easy to use. However, if your JS isn’t executing properly, the crawler won’t be able to index that content. This could lead to potential customers not finding you, an increased bounce rate, and a negative user experience.
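As a simplified illustration of what’s at stake, imagine a page whose product listing exists only after a client-side script runs (the endpoint and element ID below are hypothetical). If that script never executes for the crawler, the content it generates is invisible to search engines:

```javascript
// Hypothetical client-side rendering: the "products" container is empty in
// the raw HTML and is only filled once this script fetches data and runs.
// If a crawler fails to execute the JavaScript, it sees an empty page and
// the product content never makes it into the index.
fetch('/api/products') // hypothetical endpoint
  .then((response) => response.json())
  .then((products) => {
    const list = document.getElementById('products');
    list.innerHTML = products
      .map((product) => `<li>${product.name}: ${product.price}</li>`)
      .join('');
  });
```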

Evaluating JavaScript on Your Site

At a time when UX is so crucial to SEO success, Google’s inability to read a website can be crippling. As websites become more JavaScript-rich, it’s important to code your site correctly and ensure bots can follow the customer journey through its pages. If your site doesn’t render properly, its ranking, performance, and visibility will be adversely affected in all digital channels.

Ask yourself these questions about your own site:

  • Do you know what resources are using JavaScript and why? (See the console snippet after this list for a quick inventory.)
  • At what point in rendering the website are these resources being executed?
  • Are all JavaScript functions currently necessary?

Google recommends rendering no more than 10 JavaScript requests for any website; beyond that, site speed declines. Those JS elements may look pretty on your site, but they can also turn it into a sluggish mess, particularly on mobile.

Most developers use JavaScript to achieve a specific visual and/or functional goal without realizing the overall impact it can have on organic visibility and speed. So make sure your web developers can answer the above questions!

Maximize Efficiency

The reason we keep asking developers to reduce rendering requests is to maximize the efficiency of the resource loading process. This directly impacts organic performance because if the JavaScript is slow, the website is slow as well. Bots won’t crawl the site effectively, resulting in lower rankings.

John Mueller, Google’s SEO guru, stated that if a website isn’t rendered within 2-3 seconds, the crawler will likely bounce off and not index the website in that wave. This puts you at a disadvantage since you can’t be sure when the next wave of indexing will occur.
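One common way to stay inside that window, offered here as a general best practice rather than something Mueller specifically prescribed, is to keep non-critical scripts from blocking the initial render. A rough sketch (the script path is a placeholder):

```javascript
// Load a non-critical script (e.g., a carousel) only after the page has
// finished its initial render, so it never delays the first paint.
// The path '/js/carousel.js' is a placeholder.
window.addEventListener('load', () => {
  const script = document.createElement('script');
  script.src = '/js/carousel.js';
  document.body.appendChild(script);
});
```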

At Google’s I/O 2018 conference, Tom Greenaway shared the chart below depicting the process. The key takeaway: If you’re loading your website in JavaScript, make it fast.

The JavaScript on your site should make indexing as fast and easy as possible. (Source)

Prerender as a JavaScript Solution

A prerender service is a popular solution for executing JavaScript while overcoming the aforementioned crawling issues.

Prerender is middleware you install on your server to check whether each request comes from a crawler. If it does, the middleware serves a static HTML version of that page; if not, the request continues on to your normal server routes. The crawler never knows you’re using a prerender service, since the response always goes through your server. This allows all content to render correctly within the browser and all on-page content to be properly crawled and indexed.
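A minimal sketch of what that can look like in practice, assuming a Node.js/Express server and the open-source prerender-node middleware (the token and port are placeholders):

```javascript
// Express app using prerender-node as dynamic-rendering middleware.
// Requests from known crawlers get a pre-rendered static HTML snapshot;
// everything else falls through to the normal routes.
const express = require('express');
const prerender = require('prerender-node');

const app = express();

// The token is a placeholder for your own Prerender.io account token.
app.use(prerender.set('prerenderToken', 'YOUR_PRERENDER_TOKEN'));

app.get('/', (req, res) => {
  // Normal (JavaScript-heavy) page served to regular visitors.
  res.sendFile(`${__dirname}/index.html`);
});

app.listen(3000);
```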

Benefits of Prerender

  • Pages are rendered about twice as fast.
  • TTFB (time to first byte) is reduced.
  • Memory usage becomes more efficient.

Prerender itself is open-source software used to help render JavaScript-heavy sites. It makes these websites more crawl-friendly, letting their content be properly indexed.
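Because the project is open source, you can also run your own Prerender server instead of relying on the hosted service. A minimal sketch using the prerender npm package, with options kept to the defaults:

```javascript
// Start a local Prerender server that uses headless Chrome to render
// JavaScript-heavy pages into static HTML for crawlers.
const prerender = require('prerender');

const server = prerender({ port: 3000 }); // 3000 is the package default
server.start();
```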

Your Next Steps

What should you do to ensure the JavaScript on your site isn’t impeding search engine crawlers? Here’s a rundown of the most important steps:

  1. Determine how heavy your website’s JavaScript resources are and what purpose they serve on the site. (Effects, content, functionality, etc.)
  2. Review your Search Console for any page rendering issues. Crawling tools such as Screaming Frog can aid in troubleshooting this, as well.
    • Search Console can show you how your website’s pages are rendering to Google bots. Screaming Frog can be configured to crawl and render JavaScript specifically.
  3. Disable JavaScript within your browser to troubleshoot what is and isn’t being properly rendered.
  4. Minimize your JavaScript requests. Combine them into a single file and prioritize execution (see the bundler sketch after these steps).
  5. Implement a prerender service that abides by Google’s Dynamic Rendering Guidelines.
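For step 4, a bundler can combine and minify your scripts so the browser (and crawler) makes a single request instead of many. A rough sketch of a webpack configuration, with placeholder paths:

```javascript
// webpack.config.js: bundle all JavaScript into one minified file to cut
// down the number of script requests. Entry and output paths are placeholders.
const path = require('path');

module.exports = {
  mode: 'production',    // production mode minifies the output by default
  entry: './src/app.js', // hypothetical entry point
  output: {
    filename: 'bundle.js',
    path: path.resolve(__dirname, 'dist'),
  },
};
```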

We Can Help!

Want more guidance? Request a free Google Visibility Crawl Audit by contacting us at info@netelixir.com. Our SEO team would be happy to review your site and discuss ways we can help you improve your organic performance.

We also invite you to download our brand-new Search Without Screens research report with data and insights about how today’s consumers are using voice search.

Stay tuned for even more updates from #SEOWeek 2019! If you missed our webinar on The Changing Landscape of SEO, you can read our recap or watch the complete session.
