Weekly SEO news: 26 January 2016
Welcome to the latest issue of the Search Engine Facts newsletter.

We hope that you enjoy this newsletter and that it helps you to get more out of your website. Please pass this newsletter on to your friends.

Best regards,
1. How to help Googlebot to index your web pages in three steps
Every website owner wants to get as many high rankings on Google as possible. You can lead Google and other search engine bots to the most important pages of your website. Make sure that irrelevant pages of your site do not clutter the search results and that your important pages get the attention they deserve.
Why should you guide search engine robots?

Important and valuable pages will be crawled every time Google visits your website if you guide the bot. Other pages, such as the 'terms and conditions' page on your website, are not important for search engine rankings. Search engine robots can ignore these pages that do not need to be ranked. Crawling time that is spent on these pages should be spent on the more important pages of your site.

There are several different ways to guide Googlebot to the right pages:

1. Tell search engines which pages should be indexed

Robots.txt is the name of a text file that is available at www.your-domain.com/robots.txt. It tells search engines which parts of your website should be accessible. You should set disallow rules for all file types, folders and pages that you do not want to be indexed. Search engine robots check the robots.txt file before indexing the pages on a website. A good robots.txt file makes sure that unimportant pages aren't indexed.

The meta noindex tag is another way to exclude individual pages from the index. In general, you do not need this tag if you have a good robots.txt file.

2. Make sure that all of your links work

Search engine bots follow the links they find on your web pages. Broken links are a sign of a low quality site. You don't want Googlebot to waste time crawling links to non-existing pages on your site. You can check the links on your site with the website audit tool in SEOprofiler.

3. Improve your website structure

A good website structure with clearly categorized page content increases the likelihood that Googlebot and other search engine bots will find the important pages of your site.
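Before you publish robots.txt rules like the ones described in step 1, you can test how crawlers will interpret them with Python's built-in robots.txt parser. The rules, the example.com domain and the page names below are made up for illustration:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules: exclude the 'terms and conditions'
# page and an internal folder, allow everything else.
rules = """User-agent: *
Disallow: /terms-and-conditions.html
Disallow: /internal/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Important pages stay crawlable:
print(parser.can_fetch("Googlebot", "https://www.example.com/products.html"))  # True

# Unimportant pages are excluded:
print(parser.can_fetch("Googlebot", "https://www.example.com/terms-and-conditions.html"))  # False
```

This mirrors what a search engine robot does: it reads the rules first and skips any URL that a matching Disallow line covers.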
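To get a feel for what a link checker like the one in step 2 does under the hood, the following sketch collects all link targets from a piece of HTML with Python's standard html.parser. The HTML snippet and page names are invented for the example; a site-wide tool automates this across all of your pages:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect every href target on a page so the targets can be checked later."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Hypothetical page content with one working and one outdated link.
page = '<a href="/products.html">Products</a> <a href="/old-page.html">Old page</a>'

extractor = LinkExtractor()
extractor.feed(page)
print(extractor.links)  # ['/products.html', '/old-page.html']
```

Each collected URL could then be requested, and any URL that returns an error would be a broken link worth fixing.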
If you do the things above, Google will index the right pages of your site. The number of indexed pages is not as important as the quality of the indexed pages. If the right web pages can be found for the right keywords, you will get many more sales.

The most important pages of your site should be available within a few mouse clicks. The website audit tool in SEOprofiler shows you how many clicks are needed to reach the pages on your site.

The tools in our web-based website promotion tool SEOprofiler help you to achieve this goal. If you haven't done it yet, try SEOprofiler now:
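The 'clicks needed to reach a page' measurement can be sketched in a few lines: treat your site as a graph of pages and links, then run a breadth-first search from the home page. The page names in the graph below are invented for the example:

```python
from collections import deque

# Hypothetical link graph: each page maps to the pages it links to.
site = {
    "home": ["products", "blog"],
    "products": ["product-a", "product-b"],
    "blog": ["post-1"],
    "product-a": [],
    "product-b": ["terms"],
    "post-1": [],
    "terms": [],
}

def click_depth(graph, start="home"):
    """Breadth-first search: minimum number of clicks to reach each page from the start page."""
    depth = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for linked in graph[page]:
            if linked not in depth:
                depth[linked] = depth[page] + 1
                queue.append(linked)
    return depth

print(click_depth(site))
```

In this made-up graph, 'terms' sits three clicks away from the home page; a page that important visitors should reach sooner would be a candidate for a more prominent link.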
2. Search engine news and articles of the week
Google: real time Penguin algorithm should launch in weeks

"Gary Illyes from Google did it again, he said on Twitter that he thinks the Penguin update that was expected in 2015, that was then delayed due to the holidays and now was expected this month should launch within weeks. [...] It has been 15 months since the last official Penguin update, which was Penguin 3.0, we did see some shuffles to that update through December 2014 but nothing since."

"In a recent Google+ hangout, Google’s John Mueller said that a website can lose its trust from Google so that it does not pass PageRank value anymore: 'With regards to PageRank, I think the main issues that we see there are really if we recognize that this is a site that it doesn’t make any sense to pass any PageRank from. Then, on a site level, we might say, okay, we’re not going to pass any PageRank from here.'"

"Google Patent Search API, Google News Search API, Google Blog Search API, Google Video Search API, Google Image Search API. We supported these APIs for a three year period (and beyond), but as all things come to an end, so has the deprecation window for these APIs. We are now announcing the turndown of the above APIs. These APIs will cease operations on February 15, 2016."

Google files patent to replace social media marketers

"The idea is quite simple: a link is shared from one user to another. When the link is clicked, Google will utilize a pop-up system that will suggest how to share the link on social media. To this end, Google understands that email is just one way of communicating, but wider distribution is required for 'sharing links and for engaging in a conversation,' the patent says."

"The hack resulted in code being injected on the site that Google identified as an attempt at cloaking. Google did not recognise it as a hack, instead flagging it as attempted webspam, and acted accordingly. Suffice to say that, obviously, Google is not omniscient – it cannot determine whether a site has been hacked or is intentionally trying to deceive."
Search engine newslets
3. Recommended resources
SEOprofiler helps you to get high rankings on Google and more customers. It offers all the tools that you need:
4. Previous articles