|Weekly SEO news: 1 September 2015|
Welcome to the latest issue of the Search Engine Facts newsletter.
You might not know it, but some of your web pages could be blocking Google. If Google cannot access all of your web pages, you're losing visitors and sales. This week's article explains five reasons why Google might not be able to access your pages.
In the news: rich snippets don't impact rankings, Google only discovers links on pages that return a 200 status code, Google's new local pack shows in the number one spot 93% of the time, and more.
Table of contents:
We hope that you enjoy this newsletter and that it helps you to get more out of your website. Please pass this newsletter on to your friends.
1. Five reasons why Google isn't indexing all of your web pages
You might not know it, but some of your web pages could be blocking Google. If Google cannot access all of your web pages, you're losing visitors and sales. Here are five reasons why Google cannot access your pages:
1. Errors in the robots.txt file of your website keep Google away
The disallow directive of the robots.txt file is an easy way to keep search engine crawlers away from individual files or whole directories. To exclude an individual file, add this to your robots.txt file:
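For example, the following directives block all crawlers from one page (the path here is only an illustration; use the path of the page you actually want to exclude):

```text
User-agent: *
Disallow: /directory/private-page.html
```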
To exclude whole directories, use this:
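For example, a trailing slash after the directory name blocks everything inside it (the directory name here is only an illustration):

```text
User-agent: *
Disallow: /private-directory/
```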
If your website has a robots.txt file, double-check it to make sure that you do not exclude directories that you want to see in Google's search results.
Note that your website visitors can still see the pages that you exclude in the robots.txt file. Check your website with the website audit tool in SEOprofiler to find out if there are any issues with the robots.txt file.
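If you want to check a robots.txt rule yourself, Python's standard library includes a robots.txt parser. The following sketch tests whether example URLs would be blocked by a simple rule set (the rules and URLs are illustrations, not taken from any real site):

```python
# Check robots.txt rules with Python's standard urllib.robotparser.
from urllib.robotparser import RobotFileParser

# Example rule set: block every crawler from the /private/ directory.
rules = """User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A URL inside the disallowed directory is blocked:
print(parser.can_fetch("Googlebot", "https://www.example.com/private/page.html"))  # False
# A URL outside of it is allowed:
print(parser.can_fetch("Googlebot", "https://www.example.com/public.html"))        # True
```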
2. Your pages use the meta robots noindex tag
The meta robots noindex tag enables you to tell search engine robots that a particular page should not be indexed. To exclude a web page from the search results, add the following code in the <head> section of a web page:
<meta name="robots" content="noindex, nofollow">
In this case, search engines won't index the page and they also won't follow the links on the page. If you want search engines to follow the links on the page, use this tag:
<meta name="robots" content="noindex, follow">
The page won't appear on Google's result page then but the links will be followed. If you want to make sure that Google indexes all pages, remove this tag.
The meta robots noindex tag only influences search engine robots. Regular visitors of your website still can see the pages. The website audit tool in SEOprofiler will also inform you about issues with the meta robots noindex tag.
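An audit tool essentially scans each page's HTML for that tag. A minimal sketch of such a check, using only Python's standard library (the function name is our own, not part of any tool):

```python
# Detect a meta robots "noindex" directive in an HTML document.
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects whether a <meta name="robots"> tag contains 'noindex'."""

    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attributes = dict(attrs)
        if attributes.get("name", "").lower() == "robots":
            content = attributes.get("content", "").lower()
            if "noindex" in content:
                self.noindex = True

def page_is_noindexed(html: str) -> bool:
    parser = RobotsMetaParser()
    parser.feed(html)
    return parser.noindex
```

Feeding it a page whose head contains `<meta name="robots" content="noindex, follow">` returns True; a page without the tag returns False.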
3. Your pages send the wrong HTTP status code
The server header status code tells browsers and search engine robots how to handle a web page. A page that can be accessed normally returns a "200 OK" status code. Redirect codes such as "301 moved permanently" and "302 found" send visitors and robots to a different URL; error codes such as "404 not found" and "503 service unavailable" tell robots that the page is not available. If your pages send the wrong status code, Google might not index them.
The website audit tool in SEOprofiler shows the different status codes that are used by your website and it also highlights pages with problematic status codes.
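The way a crawler reacts to each status code family can be summarized in a few lines. This is a rough sketch of the common interpretations (the function and its labels are our own simplification, not Google's exact behavior):

```python
# Roughly classify an HTTP status code the way a search engine crawler treats it.
def classify_status(code: int) -> str:
    if 200 <= code < 300:
        return "indexable"           # content can be crawled and indexed
    if code in (301, 308):
        return "permanent redirect"  # signals point to the new URL
    if code in (302, 307):
        return "temporary redirect"  # the original URL is usually kept
    if code == 410:
        return "gone"                # page was removed on purpose
    if 400 <= code < 500:
        return "client error"        # e.g. 404: page drops out of the index
    if 500 <= code < 600:
        return "server error"        # crawlers typically retry later
    return "unknown"

print(classify_status(200))  # indexable
print(classify_status(301))  # permanent redirect
print(classify_status(404))  # client error
```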
4. Your pages are password protected
If you password protect your pages, only visitors who know the password will be able to view the content.
Search engine robots won't be able to access the pages. Password-protected pages can also have a negative influence on the user experience, so you should test this thoroughly.
5. Your pages are difficult to parse

It might be that your web pages use technology that search engines cannot easily parse. Google can parse these pages to some extent, but you're making it unnecessarily difficult for search engines.
How to find these problems on your website
In general, you want Google to index your pages. For that reason, it is important to find potential problems on your site. The website audit tool in SEOprofiler locates these issues and also shows you how to fix them. If you haven't done it yet, try SEOprofiler now:
|2. Search engine news and articles of the week|
| Displaying rich snippets in Google search results doesn’t impact rankings
"So while you may want to keep trying to get those rich snippet results, having any of them displayed on your site’s search won’t give you any kind of ranking boost (nor will implementing structured data). But again, having rich snippets can often lead to an increase in CTR, especially if your competitors aren’t displaying it too."
| Google only discovers links on pages that return a 200 status code
"[Google] crawls a page, it returns a 200 status. But then Google has to analyze the page to determine if the page should be really a 404. So maybe at some point before the soft 404 label is associated with the page, Google may crawl the links on the page and pass link juice? But after the soft 404 label is associated with it, all of that goes away?
John Mueller said that Google will only crawl links on pages that return a 200 status code."
"Although the Partners on Google and BingAds are not necessarily killing your account, they could be having a strong enough effect on your numbers to take a second look and assess how you manage this challenging aspect of SEM.
Don’t rush into anything, thoroughly review the data and take granular decisions rather than blanket ones. If your campaigns have a capped budget and you’re hitting it, the simple act of turning off the Partners network could notably improve your KPIs but only if the data supports this."
| Google's new local pack shows in the number one spot 93% of the time
"The local pack, before it changed to the 3-pack, used to show up in the number one slot in the Web search results only 25 percent of the time. Now, with the new 3-pack, the study says it shows up in the number one position 93 percent of the time."
| Google My Business asking inactive accounts to verify local GMB information
"Google is planning to email business owners who haven’t logged into their Google My Business account for at least a year, and asking them to sign in and confirm their business information is still correct, or make any changes that are needed.
What about those businesses that don’t login? According to Google, if the accounts remain inactive, they 'run the risk of being de-verified, or in rare cases, removed from Google Maps'."
|3. Recommended resources|
SEOprofiler helps you to get high rankings on Google and more customers. It offers all the tools that you need:
|4. Previous articles|