Weekly SEO news: 15 March 2016
Welcome to the latest issue of the Search Engine Facts newsletter.

Table of contents:

We hope that you enjoy this newsletter and that it helps you to get more out of your website. Please pass this newsletter on to your friends.

Best regards,
Andre Voget, Johannes Selbach, Axandra CEO

1. John Mueller: how Google indexes JavaScript sites

On his Google+ page, Google's John Mueller posted an update on how Google indexes JavaScript sites and progressive web apps. Here are his recommendations:

Google and JavaScript sites

1. Don't cloak to Googlebot

  • Use "feature detection" and "progressive enhancement"  techniques to make your content available to all users.
  • Avoid redirecting to an "unsupported browser" page. Consider using a polyfill or other safe fallback where needed. 
  • The features Googlebot currently doesn't support include Service Workers, the Fetch API, Promises, and requestAnimationFrame.
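The feature-detection advice above can be sketched in plain JavaScript. This is a minimal illustration, not code from Mueller's post: it checks for the APIs Googlebot did not support at the time and falls back to baseline behavior when one is missing. The function names and the `scope` parameter are assumptions made for this example.

```javascript
// Feature detection sketch: probe for each API before relying on it.
// "scope" stands in for the browser's global object (window).
function detectFeatures(scope) {
  return {
    promises: typeof scope.Promise === 'function',
    fetch: typeof scope.fetch === 'function',
    serviceWorker: typeof scope.navigator === 'object' &&
      scope.navigator !== null && 'serviceWorker' in scope.navigator,
    requestAnimationFrame: typeof scope.requestAnimationFrame === 'function'
  };
}

// Progressive enhancement: serve the baseline content either way, and
// only run the enhanced code path when the needed features exist.
function enhanceIfSupported(features, enhance, fallback) {
  return (features.fetch && features.promises) ? enhance() : fallback();
}
```

Because the fallback path always works, a client without these APIs (such as Googlebot in 2016) still sees the content instead of an "unsupported browser" page.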

2. Use the rel=canonical tag

Use rel=canonical when serving content from multiple URLs is required. Further information about the canonical attribute can be found here.
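For example, if the same article is reachable under several URLs, each duplicate can point to the preferred version. The URLs below are placeholders:

```html
<!-- On every duplicate URL, e.g. https://www.example.com/article?ref=newsletter -->
<link rel="canonical" href="https://www.example.com/article">
```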

3. Avoid the AJAX-Crawling scheme on new sites.

Consider migrating old sites that use this scheme soon. Remember to remove "meta fragment" tags when migrating. Don't use a "meta fragment" tag if the "escaped fragment" URL doesn't serve fully rendered content.
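For reference, this is the "meta fragment" tag that the deprecated AJAX-Crawling scheme used, and that should be removed when migrating:

```html
<!-- Tells crawlers to request the "?_escaped_fragment_=" version of the page.
     Remove this tag when moving away from the AJAX-Crawling scheme. -->
<meta name="fragment" content="!">
```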

4. Avoid using "#" in URLs (outside of "#!").

Googlebot rarely indexes URLs with "#" in them. Use "normal" URLs with path/filename/query-parameters instead, and consider using the History API for navigation.
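As an illustration of the migration from "#!" URLs to "normal" ones, the helper below rewrites an escaped-fragment-style URL into a path-based URL. The URL pattern and function name are assumptions for this sketch, not part of Mueller's recommendations; in the browser, `history.pushState()` can then update such paths during navigation without a full page reload.

```javascript
// Rewrite a "#!"-style URL into a plain path-based URL.
// e.g. https://example.com/#!/products/42 -> https://example.com/products/42
function hashBangToPath(url) {
  const i = url.indexOf('#!');
  if (i === -1) return url;                       // already a normal URL
  const base = url.slice(0, i).replace(/\/$/, ''); // drop trailing slash
  const fragment = url.slice(i + 2).replace(/^\//, ''); // drop leading slash
  return base + '/' + fragment;
}
```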

5. Check your web pages

Use Search Console's Fetch and Render tool to test how Googlebot sees your pages. Note that this tool doesn't support "#!" or "#" URLs. (You can also use the website audit tool in SEOprofiler to check your pages more thoroughly.)

6. Check your robots.txt file

Ensure that all required resources (including JavaScript files/frameworks, server responses, 3rd-party APIs, etc.) aren't blocked by robots.txt. The Fetch and Render tool lists any blocked resources it discovers. If resources are blocked by a robots.txt file outside of your control (e.g., 3rd-party APIs) or are otherwise temporarily unavailable, ensure that your client-side code fails gracefully.
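A robots.txt along these lines keeps rendering resources crawlable while still blocking private areas. The paths are placeholders for this sketch:

```
# Let crawlers fetch the resources needed to render pages.
User-agent: *
Allow: /js/
Allow: /css/
Disallow: /admin/
```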

7. Do not use too many embedded resources

Limit the number of embedded resources, in particular the number of JavaScript files and server responses required to render your page. A high number of required URLs can result in timeouts, and the page may then be rendered without some of these resources (e.g., some JavaScript files might not be loaded). Use reasonable HTTP caching directives.
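"Reasonable HTTP caching directives" means response headers along these lines; the values shown are illustrative, not a recommendation from the post:

```
# Example response headers for a static JavaScript file:
Cache-Control: public, max-age=86400
ETag: "a1b2c3"
```

With such headers, repeat fetches of unchanged resources can be answered from cache, reducing the number of server responses needed to render the page.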

8. Google supports JavaScript to some extent

Google supports the use of JavaScript to provide titles, description & robots meta tags, structured data, and other meta-data. When using AMP, the AMP HTML page must be static as required by the spec, but the associated web page can be built using JS/PWA techniques. Remember to use a sitemap file with correct "lastmod" dates for signaling changes on your website.
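A sitemap entry with a correct "lastmod" date looks like this (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/article</loc>
    <!-- Update lastmod whenever the page content changes. -->
    <lastmod>2016-03-15</lastmod>
  </url>
</urlset>
```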

9. Other search engines might not support JavaScript at all

Finally, keep in mind that other search engines and web services accessing your content might not support JavaScript at all, or might support a different subset.

In general, critical web page content should not be hidden in JavaScript. Google might be able to index the JavaScript content to some extent, but you will still have difficulty with other search engines.

The tools in SEOprofiler help you to get high rankings on Google and other search engines. If you haven’t done it yet, try SEOprofiler now:

Try SEOprofiler now

Back to table of contents - Visit SEOprofiler.com

2. Search engine news and articles of the week
A new open source search engine: Common Search

"The Web is now a critical resource for humanity, of which search engines are the arbiters. They decide which websites get traffic, which companies survive, which ideas spread. [...]

To be clear, there is nothing wrong with profit-seeking. It has been a tremendous driver for innovation, and will continue to be. What is wrong is not being able to choose an alternative.

This is why we are building a new kind of search engine: open, transparent and independent."

Google’s John Mueller: a website with a Penguin penalty cannot be fixed manually by Google [video]

"We don’t do manual updates for a lot of our algorithms. We essentially wait for the algorithms to get updated, to run automatically. So there is no manual update that is possible in cases like this.

If you fixed all the issues, with the next update, then it will probably look a little bit better."

RankBrain: a study to measure its impact

"Predictably, one of the most common questions I get asked is how RankBrain will impact SEO. Truth be told, at the moment, there is not much impact at all. RankBrain will simply do a better job of matching user queries with your web pages, so you’d arguably be less dependent on having all the words from the user query on your page.

In addition, you still need to do keyword research so that you can understand how to target a page to a major topic area (and what that major topic area is). Understanding the preferred language of most users will always make sense, whether or not search engines exist."

Google: sites with link-related manual actions will lose rankings after the action is lifted

Google's Juan Felipe Rincón said at SMX:

“With link penalties, removing links will reduce PageRank, which will reduce rankings after the manual action is lifted. [...] They just lose the inflated rankings from the artificial rankings, so they will rank where they should have been ranking without them."

The terrifying connection between malware, Google Search Console and AdWords

"Security warnings in Google Search Console (GSC) can be scary. Really scary. Whether your site was flagged for being hacked, serving malware, unwanted software or worse, security warnings in GSC can cause serious problems for your organization. [...]

Many advertisers have no idea this is even possible, but it is. If your site gets flagged for malware, your AdWords account can get suspended. So just when you need AdWords the most (since organic search will be taking a hit), you won’t have air cover from paid search."

Search engine newslets

  • Google updates mobile branded search with 'people also search for' option.
  • Google: Best practices for bloggers reviewing free products they receive from companies.
  • Google on why they use DMOZ: I have absolutely no clue.
  • Why research skills matter more than ever.
3. Recommended resources


SEOprofiler helps you to get high rankings on Google and more customers. It offers all the tools that you need:

  • automatic weekly website audits
  • tools for keyword research and analysis
  • competitive intelligence tools
  • automated ranking checks
  • website analytics
  • tools for spam-free link building
  • link disinfection tool
  • social media monitoring
  • white-label reports (web-based and PDF)
  • AdWords Profiler and Ranking Profiler
  • NEW: Domain Finder tool
  • and much more.

Back to table of contents - Visit SEOprofiler.com
4. Previous articles
Back to table of contents - Visit SEOprofiler.com