SEOprofiler newsletter
Welcome to the latest issue of our newsletter!

Here are the latest website promotion and Internet marketing tips for you.

We hope that you enjoy this newsletter and that it helps you to get more out of your website. Please forward this newsletter to your friends.

Best regards,
Andre Voget and Johannes Selbach, Axandra CEOs

1. How does Googlebot render your pages?

Google has published a new page that explains how Googlebot renders web pages for indexing.

Googlebot uses a web rendering service (WRS) that is based on Chrome 41 (M41). Generally, the WRS supports the same web platform features and capabilities as the Chrome version it is based on. However, there are a few exceptions and differences that you should keep in mind.

Features and APIs that are explicitly disabled or limited

1. The web rendering service and Googlebot don't support the WebSocket protocol

Googlebot and WRS only understand HTTP/1.x and FTP, with and without TLS. Use the website audit tool in SEOprofiler to check the HTTP version of your website:

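If your pages rely on WebSocket connections for essential content, those pages can render empty for Googlebot. A minimal sketch of a graceful fallback, assuming the page can also fetch its data over plain HTTP (the function name and the "http-polling" label are illustrative, not a real API):

```typescript
// Hedged sketch: detect WebSocket support and degrade to plain HTTP.
// Googlebot's WRS only speaks HTTP/1.x and FTP, so a page that strictly
// requires WebSocket never receives its content during rendering.
function chooseTransport(): "websocket" | "http-polling" {
  if (typeof (globalThis as any).WebSocket !== "undefined") {
    return "websocket"; // regular browsers keep the live connection
  }
  return "http-polling"; // crawlers and old browsers get plain HTTP requests
}

console.log(chooseTransport());
```

The point is that the fallback path should still deliver the content you want indexed, not just an error message.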

2. The web rendering service disables some interfaces and capabilities

  • IndexedDB and WebSQL interfaces are disabled.
  • Interfaces defined by the Service Worker specification are disabled.
  • WebGL interface is disabled; 3D and VR content is not currently indexed.

If you want to find out whether Googlebot's web rendering service supports a particular feature that your pages rely on, use feature detection.
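As a sketch of such feature detection (the function name is illustrative; the globals probed are standard browser interfaces), a page can check each disabled interface before using it:

```typescript
// Hedged feature-detection sketch: probe each interface that the WRS
// disables instead of assuming it is available.
function detectFeatures(): Record<string, boolean> {
  const g = globalThis as any;
  return {
    indexedDB: typeof g.indexedDB !== "undefined",         // disabled in the WRS
    webSQL: typeof g.openDatabase === "function",          // disabled in the WRS
    serviceWorker: !!g.navigator && "serviceWorker" in g.navigator,
    webGL: typeof g.WebGLRenderingContext !== "undefined", // disabled in the WRS
  };
}

// Gate only optional enhancements on these flags; render critical
// content unconditionally so Googlebot can index it.
console.log(detectFeatures());
```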

3. Googlebot and the web rendering service are stateless across page loads

Googlebot's web rendering service loads each URL of your website, following server and client redirects, just like a regular browser. However, the web rendering service does not retain state across page loads:

  • Local Storage and Session Storage data are cleared across page loads.
  • HTTP Cookies are cleared across page loads.

If a page on your website requires a cookie that was set by another page on your website, this won't work.
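A minimal sketch of a cookie-tolerant page, assuming a hypothetical "lang" cookie that another page on the site normally sets (the cookie name and the "en" default are illustrative assumptions):

```typescript
// Hedged sketch: because the WRS clears cookies between page loads, the
// page must still work when the expected cookie is absent.
function resolveLang(cookieHeader: string | undefined): string {
  const match = cookieHeader?.match(/(?:^|;\s*)lang=([^;]+)/);
  // Fall back to a default instead of redirecting to a cookie-setting
  // page, which Googlebot would never complete.
  return match ? decodeURIComponent(match[1]) : "en";
}

console.log(resolveLang("theme=dark; lang=de")); // → "de"
console.log(resolveLang(undefined));             // → "en"
```

The same defensive default applies to Local Storage and Session Storage reads.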

4. The web rendering service declines permission requests

Any feature that requires user consent is auto-declined by Googlebot. Examples include the Camera API, the Geolocation API, and the Notifications API. A full list can be found here.
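As a hedged sketch, a store-locator page could treat a declined Geolocation request as the normal case and still render indexable content (the function name, parameters, and strings below are all illustrative):

```typescript
// Hedged sketch: because the WRS auto-declines permission prompts,
// design the no-permission path to show real, indexable content.
function describeLocation(
  granted: boolean,
  coords?: { lat: number; lon: number },
): string {
  if (granted && coords) {
    return `Stores near ${coords.lat},${coords.lon}`;
  }
  // Googlebot (and users who decline the prompt) land here: show a
  // useful default instead of a blank page behind a permission dialog.
  return "Store directory for all regions";
}

console.log(describeLocation(false));                    // → "Store directory for all regions"
console.log(describeLocation(true, { lat: 50.4, lon: 7.8 }));
```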

There is an easy way to check your pages

Making sure that Googlebot can index your web pages can be a complicated and time-consuming task. Fortunately, there is an easy way to check your pages. The website audit tool in SEOprofiler checks all pages of your website and shows you what you have to change so that Google can index your web pages correctly.

When you get an okay from the website audit tool in SEOprofiler, you can be sure that Google and other search engines can index your pages. If you haven't done it yet, check your pages now:

Check your website now

2. Internet marketing news of the week

Google’s John Mueller: serving content by user-agent language is a bad idea

"On Twitter, Google’s John Mueller said that it was a bad idea to serve content based on the user-agent language setting.

Most desktop browsers have settings that allow you to check or change the language preference settings. These tell the server what language you prefer for pages and resources that it sends you (separately from the language of the browser user interface). This setting will be used in the user-agent information.

Unfortunately, John Mueller did not explain why it was not a good idea to serve content based on the user-agent language."


Google’s John Mueller: it does not matter if your website is with or without www

"On Twitter, Google’s John Mueller said that you can use your website domain name with or without www. For Google, both versions are the same.

Of course, you should focus on one version of the domain. Use redirects and/or the canonical tag to show Google the preferred version of the domain name."


Google’s John Mueller: the size of your CSS files doesn’t matter

"On Twitter, Google’s John Mueller confirmed that the size of a CSS file has no influence on the rankings of a website. [...] Google can handle very large files and the size of a CSS file is usually not a problem for Google."


NY Times: Google doesn’t want what’s best for us

"[Google] has altered our notions of privacy, tracking what we buy, what we search for online — and even our physical location at every moment of the day. Every business trying to reach mass-market consumer demand online knows that Google is the gatekeeper. [...]

Because tools like Google and Facebook have become so essential and because we have almost no choice in whether to use them, we need to consider the role they play in our lives. By giving networks like Google and Facebook control of the present, we cede our freedom to choose our future."

+++ SEARCH +++ ENGINE +++ NEWS +++ TICKER +++

  • Google does not use the geo meta tags on your web pages.
  • Bing testing Google-like search header at top of search results.
  • Google testing ad tag placement in Google Local Packs.
  • Google tests "topics" search refinements.
  • Google has updated the list of reasons for reporting inappropriate business photos.
  • Facebook: addressing cloaking so people see more authentic posts.
  • Google still ignores last-modified meta tag.
  • Google won't share the number of search quality algorithms.

3. Previous articles


SEOprofiler.com, Axandra GmbH, Nordring 21, D-56424 Staudt. Managing directors: Andre Voget, Johannes Selbach, Amtsgericht Montabaur, 6 HRB 6339.

SUBSCRIBER INFORMATION: You're receiving this newsletter because you (1) subscribed to it on our website or (2) in our software programs or (3) because you submitted your website URL to AxxaSearch.com. If you don't want to receive this newsletter click here.