Weekly SEO news: 24 January 2017
Welcome to the latest issue of our newsletter!

Here are the latest website promotion and Internet marketing tips for you.

We hope that you enjoy this newsletter and that it helps you to get more out of your website. Please forward this newsletter to your friends.

Best regards,
Andre Voget, Johannes Selbach, Axandra CEO

1. Google: how to optimize the crawl budget of your website

In Google's official webmaster blog, Google's Gary Illyes wrote about crawl budgets and how they affect your website. Prioritizing the pages that should be indexed can help you to get high rankings for your more important pages. Two factors influence the crawl budget of a website:

1. The crawl rate limit

Googlebot is designed not to degrade the experience of a site's visitors while crawling. The crawl-rate limit represents the number of simultaneous parallel connections Googlebot may use to crawl the site, as well as the time it waits between fetches.

The crawl rate is influenced by how quickly a website responds to requests. You can also limit Googlebot's crawl rate in Google's Search Console. Unfortunately, Google does not support the crawl-delay directive in robots.txt that many other bots honor.
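For bots that do honor it, a crawl-delay rule in robots.txt looks like the sketch below. The bot name and delay value are illustrative; Googlebot simply ignores this directive, so for Google you must use the Search Console setting instead.

```
# Ask a bot that supports crawl-delay to wait 10 seconds between fetches.
# Googlebot ignores this directive.
User-agent: bingbot
Crawl-delay: 10
```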

2. Crawl demand

The crawl demand represents Google's interest in a website. URLs that are more popular on the Internet tend to be crawled more often to keep them fresher in Google's index. Google also attempts to prevent URLs from becoming stale in the index.

If a website moves to a new address, the crawl demand might increase in order to reindex the content under the new URLs.

The crawl rate limit and the crawl demand define the crawl budget as the number of URLs Googlebot can and wants to crawl.

How to optimize your crawl budget

Having many low-value-add URLs can negatively affect a site's crawling and indexing. Here are some low-value-add URLs that should be excluded from crawling:

1. Pages with session IDs: If the same page can be accessed with multiple session IDs, use the rel=canonical link element on these pages to show Google the preferred version of the page. The same applies to all other duplicate-content pages on your site, for example print versions of web pages. Google will then consolidate the duplicates under the canonical URL.
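A rel=canonical element is placed in the head of each duplicate page. A minimal sketch, using a placeholder domain and URL:

```html
<!-- On every session-ID or print variant of the page, point to the clean
     preferred URL. example.com and the path are placeholders. -->
<link rel="canonical" href="https://www.example.com/product.html">
```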

2. Faceted navigation (filtering by color, size, etc.): Filtering pages by color, size and other criteria can also lead to a lot of duplicate content. Use the robots.txt file of your site to make sure that Googlebot does not crawl these duplicates.
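A sketch of robots.txt rules that keep crawlers away from filter URLs. The parameter names here (color, size) are placeholders for whatever query parameters your faceted navigation actually uses:

```
# Block crawling of URLs created by faceted navigation filters.
# Googlebot supports the * wildcard in robots.txt paths.
User-agent: *
Disallow: /*?color=
Disallow: /*?size=
```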

3. Soft 404 pages: Soft 404 pages are error pages that show a "this page was not found" error message but return the wrong HTTP status code "200 OK". These error pages should return the HTTP status code "404 Not Found" instead.
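The soft-404 pattern can be sketched as a small check: a response is suspect when the status code claims success but the body reads like an error page. The function name and the list of error phrases are illustrative assumptions, not part of any Google tool:

```python
# Hypothetical helper: flags a fetched page as a "soft 404" when the
# HTTP status is 200 but the body looks like an error page.
ERROR_PHRASES = ("page not found", "not found", "no longer exists")

def is_soft_404(status_code: int, body: str) -> bool:
    """Return True if the response claims success (200 OK) but the
    content reads like an error page."""
    if status_code != 200:
        # A real 404/410 status is the correct behavior, not a soft 404.
        return False
    text = body.lower()
    return any(phrase in text for phrase in ERROR_PHRASES)

print(is_soft_404(200, "<h1>This page was not found</h1>"))  # True
print(is_soft_404(404, "<h1>This page was not found</h1>"))  # False
```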

4. Infinite spaces: For example, if your website has a calendar with a "next month" link, Google could follow these "next month" links forever. If your website contains automatically created pages that do not really contain new content, add the rel=nofollow attribute to the links that point to them.
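For the calendar example above, the nofollow attribute goes on the auto-generated link itself. The URL format is a placeholder:

```html
<!-- Tell Googlebot not to follow the automatically generated link.
     The href value is illustrative. -->
<a href="/calendar?month=2017-02" rel="nofollow">Next month</a>
```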

5. Low-quality and spam content: Check whether there are pages on your website that offer little value to visitors. If your website has many such pages, removing them can result in better rankings.

If you do not block these page types, you will waste server resources on unimportant pages that do not have value. Excluding these pages will make sure that Google indexes the important pages of your site.

What does this mean for your web page rankings on Google?

It's likely that you do not have to worry about crawl budgets. If Google indexes your pages on the same day they are published (or a day later) then you do not have to do anything.

Google crawls websites with a few thousand pages efficiently. If you have a very big site with tens of thousands of pages, it is more important to prioritize what to crawl and to consider how many resources the server hosting the site can allocate to crawling.

Crawling is not a ranking factor. There are many factors that are used by Google's ranking algorithms. The crawling rate is not one of them. The tools in SEOprofiler help you to optimize your website for these ranking algorithms. If you haven't done it yet, create your SEOprofiler account now:

Try SEOprofiler now

2. Internet marketing news of the week

DuckDuckGo: 10 billion private searches & counting

"[Privacy focused search engine DuckDuckGo] surpassed a cumulative count of 10 billion anonymous searches served, with over 4 billion in 2016! [...]

People are actively seeking out ways to reduce their digital footprint online. For example, a Pew Research study reported '40% think that their search engine provider shouldn’t retain information about their activity.'"

Google's Gary Illyes: use nofollow when linking to bad sites

"Google's Gary Illyes said this morning on Twitter that it makes sense to use the nofollow link attribute when you link to bad sites. [...]

So when linking out to bad sites, and your gut makes you feel bad about it, use the nofollow [attribute]."

Google's John Mueller: don't build links from Google Sheets

"Someone asked Google's John Mueller if they can use Google Sheets to build links. Basically, make a public Google Sheets page, post a ton of links on it and get Google to index the page. [...]

Of course, John Mueller was like - why are you wasting your time with that?"

+++ SEARCH +++ ENGINE +++ NEWS +++ TICKER +++

  • Google: International targeting in Search Console doesn't reduce visibility in other countries.
  • Bing testing “Send Directions to Phone” in local knowledge panel.
  • Google testing price trends for local hotel listings.
  • Former Google search chief Amit Singhal joins Uber.
  • Matt Cutts resigned from Google.
  • Google local shopping ads showing in pack style layout.
  • Google has search quality raters all over the world.

3. Previous articles