A web crawler, also known as a “spider” or “bot,” is a program that visits a website and reads its pages in order to develop entries for a search engine’s index. If a page isn’t indexed correctly, the search engine won’t be able to provide it for relevant queries and its organic search ranking will suffer.
Ask yourself this question about your own site:
- At what point in rendering the page are your scripts and other resources executed?
John Mueller, a Search Advocate at Google, has stated that if a website isn’t rendered within two to three seconds, the crawler will likely bounce off and not index the website in that wave. This puts you at a disadvantage, since you can’t be sure when the next wave of indexing will occur.
Prerender is a middleware you install on your server that checks whether each incoming request comes from a crawler. If it does, the middleware serves a static HTML snapshot of that page. If not, the request continues on to your normal server routes. The crawler never knows you’re using a prerender service, since the response always goes through your server. This allows all content to render correctly in the browser and all on-page content to be properly crawled and indexed.
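The routing logic above can be sketched in a few lines. This is a minimal illustration, not the actual Prerender middleware: the user-agent patterns are a short sample (real middleware ships a much longer list), and `fetch_prerendered_html` and `render_app` are hypothetical stand-ins for the prerender cache lookup and your normal application routing.

```python
import re

# Sample crawler user-agent substrings (illustrative, not exhaustive).
CRAWLER_PATTERNS = re.compile(
    r"googlebot|bingbot|yandexbot|baiduspider|duckduckbot|twitterbot",
    re.IGNORECASE,
)

def is_crawler(user_agent: str) -> bool:
    """Return True if the request's User-Agent looks like a search crawler."""
    return bool(CRAWLER_PATTERNS.search(user_agent or ""))

def fetch_prerendered_html(path: str) -> str:
    """Placeholder: return the static HTML snapshot for this path."""
    return f"<html><!-- prerendered snapshot of {path} --></html>"

def render_app(path: str) -> str:
    """Placeholder: fall through to the normal server routes."""
    return f"<html><!-- live app response for {path} --></html>"

def handle_request(user_agent: str, path: str) -> str:
    """Serve crawlers the static snapshot; everyone else gets the live app."""
    if is_crawler(user_agent):
        return fetch_prerendered_html(path)
    return render_app(path)
```

Because the branch happens on your server, both crawlers and human visitors see responses from the same origin, which is what keeps the setup invisible to the crawler.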
Benefits of Prerender
- Pages are rendered about twice as fast.
- TTFB (time to first byte) is reduced.
- Memory usage becomes more efficient.
Your Next Steps
- Review your Search Console for any page rendering issues. Crawling tools such as Screaming Frog can aid in troubleshooting this, as well.
- Implement a prerender service abiding by Google’s Dynamic Rendering Guidelines.
We Can Help!
Want more guidance? Request a free Google Visibility Crawl Audit by contacting us at firstname.lastname@example.org. Our SEO team would be happy to review your site and discuss ways we can help you improve your organic performance.
We also invite you to download our brand-new Search Without Screens research report with data and insights about how today’s consumers are using voice search.