Weekly SEO news: 11 October 2016
Welcome to the latest issue of our newsletter!

Here are the latest website promotion and Internet marketing tips for you.

We hope that you enjoy this newsletter and that it helps you to get more out of your website. Please forward this newsletter to your friends.

Best regards,
Andre Voget, Johannes Selbach, Axandra CEO

1. How to avoid problems with duplicate content

Duplicate content can lead to ranking problems on Google and other search engines. Although there is no duplicate content penalty, your web pages might not rank as well as they could if their content can also be found on other pages.


What is duplicate content?

Google's definition of duplicate content:

"Duplicate content generally refers to substantive blocks of content within or across domains that either completely match other content or are appreciably similar. Mostly, this is not deceptive in origin. Examples of non-malicious duplicate content could include:

  • Discussion forums that can generate both regular and stripped-down pages targeted at mobile devices
  • Store items shown or linked via multiple distinct URLs
  • Printer-only versions of web pages

Google tries hard to index and show pages with distinct information. This filtering means, for instance, that if your site has a 'regular' and 'printer' version of each article, and neither of these is blocked with a noindex meta tag, we'll choose one of them to list."

If the content of a page can be found on multiple pages, Google only displays one of these pages in the search results. Unfortunately, this might not be the page that you want searchers to see.

How to find duplicate content issues on your website

If your website contains duplicate content, the wrong pages might appear in Google's search results. For that reason, you should check your pages for duplicate content.

The easiest way to find duplicate content on your web pages is to use the Website Audit tool in SEOprofiler. The Website Audit tool checks all pages of your website and it informs you about errors that can lead to ranking problems.

Duplicate content issues are among the problems that the Website Audit tool finds. In addition, it checks your web pages for many other factors that can have a negative influence on your Google rankings.

How to fix duplicate content issues

There are several things that you can do to fix duplicate content issues. You could do nothing and hope that Google gets it right, but this is usually not recommended. It is better to do the following:

  • Use the rel=canonical attribute to inform Google about the preferred version of a page. For most duplicate content issues, this is the best solution.

  • Redirect the duplicate URLs to the original page with a 301 redirect. In that case, the alternate pages won't be displayed at all. This doesn't work for printer-friendly pages and similar variants, because you still want to display those pages.
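
For the first option, the canonical attribute is added as a link element in the HTML head of each duplicate page. A minimal sketch (the URL is a hypothetical placeholder):

```html
<!-- Hypothetical example: the printer-friendly version of an article
     declares the "regular" version as its canonical URL. -->
<head>
  <link rel="canonical" href="https://www.example.com/article.html">
</head>
```

Google then treats the canonical URL as the preferred version when it decides which page to list in the search results.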

It's not necessary to block duplicate pages in your robots.txt file and it's also not necessary to use a noindex attribute on these pages. Just use the two tips above and you'll be fine.
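
For the 301 redirect option, a minimal sketch on an Apache server (the file names are hypothetical placeholders; other servers use their own configuration syntax):

```apacheconf
# Hypothetical example (.htaccess): permanently redirect a duplicate
# URL to the original page with a 301 status code.
Redirect 301 /old-duplicate-page.html https://www.example.com/original-page.html
```

Because the redirect is permanent (301), search engines transfer the duplicate URL's signals to the original page.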

In addition to the Website Audit tool, SEOprofiler offers many more tools that help you to get better rankings on Google and other search engines. If you haven't done it yet, create your SEOprofiler account now:

Test SEOprofiler now

2. Internet marketing news of the week

Google: reconsideration requests never worked for Penguin or other algorithms

"I should point out that many, many SEOs confuse link manual actions with Penguin, and in many cases there is an overlap of sites with Penguin issues that also have manual actions. But Google said before, Penguin penalties don't get manual action notices.

Google also said time and time again that algorithms can never be reversed by submitting a reconsideration request. I asked Gary Illyes from Google this morning and he said, never ever did a reconsideration request reverse a Penguin or algorithmic penalty."


Google confirms some Google search algos look at history of page/site

"Now SEOs have long speculated that historical changes can have an impact. While the most obvious one would be changes made to content, Google has also said that sites can see Panda recovery very quickly after improving the quality on individual pages. [...]

So the question remains of what other historical signals could be taken into account. Link (on site or off site) related? Schema related? Some specific content related signals? Or about 190 other possible signals it could be."


Exploring a newly-granted Google patent around social signals

"Every now and then, a patent comes across my radar that gets me excited, and one granted recently to Google fits that bill perfectly. [...]

In short, according to this patent, what people you’re connected to recommend, like and engage with could be used to impact your rankings.

While I haven’t yet seen any direct evidence of this, at this time it makes sense, and Google has toyed with similar systems in the past. The idea of boosting a restaurant that a friend of mine likes, who lives in a city I’m visiting, would have easily perceived advantages."

+++ SEARCH +++ ENGINE +++ NEWS +++ TICKER +++

  • Online ads need to be viewable for 14 seconds to be seen.
  • Google: Penguin recoveries are taking a few more days.
  • Google: 301 redirects of all pages to the home page are seen as soft 404s.
  • AMP is one year old and growing fast, will it ultimately trump responsive design?
  • How often Google updates the various search algo ranking signals.
  • eBay acquires visual search engine Corrigon for less than $30M.
  • Only add structured data markup to individual products, not categories or lists.

3. Previous articles