Weekly SEO news: 20 March 2018
Here are the latest website promotion and Internet marketing tips for you.
We hope that you enjoy this newsletter and that it helps you to get more out of your website. Please forward this newsletter to your friends.
Best regards,
Over the past few years, Google has dramatically improved its ability to index web pages that require JavaScript to display their content. But can you get high rankings with JavaScript-only web pages? If Google can index your JavaScript pages, will it also rank them?
How do web pages get into Google's index?
Step 1: Google crawls web pages with Googlebot
Before a web page can be listed on Google's search result pages, it has to be discovered by Google. Google uses a web crawler called 'Googlebot' to discover new pages.
Googlebot fetches web pages, follows the links on these pages, fetches those pages, follows their links, and so on. Web crawlers such as Googlebot are simply programs that analyze links and HTML code. They do not render web pages, and JavaScript is not executed at this stage.
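The crawl stage can be pictured as plain link extraction over raw HTML, with no script execution. Here is a minimal sketch (the function name and regex are our own illustration, not how Googlebot actually works internally):

```javascript
// Sketch of a crawler's link-extraction step: it scans the raw HTML
// text for <a href="..."> links without executing any scripts.
function extractLinks(html) {
  const links = [];
  const anchorPattern = /<a\s[^>]*href=["']([^"']+)["']/gi;
  let match;
  while ((match = anchorPattern.exec(html)) !== null) {
    links.push(match[1]);
  }
  return links;
}

// A link that is only created by JavaScript is invisible at this stage,
// because the script is never executed:
const page = `
  <a href="/about">About</a>
  <script>
    const a = document.createElement('a');
    a.href = '/hidden';
    document.body.appendChild(a);
  </script>`;

console.log(extractLinks(page)); // → [ '/about' ]
```

The `/hidden` link never appears in the result, because it only exists after the script runs — which is exactly what a pure crawler never does.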
Step 2: Google indexes the pages
Googlebot's main role is to find new web pages. Google's Caffeine system is then used to index the pages. In that phase, Google tries to render the pages; JavaScript on the crawled pages is executed with a web rendering service (WRS).
Unfortunately, it is not clear to what extent Google executes the JavaScript on the found pages. If rendering the pages shows that there are new links on the page that are only available through JavaScript, these URLs will be sent to the crawler.
Step 3: Google ranks the pages
After rendering the page, Google tries to understand the content of the found web pages. Depending on the content, the quality of the content and other factors (such as the external links that point to the web pages), Google will rank the pages.
JavaScript and SEO
Googlebot does not render the content of web pages. It just discovers new pages and parses their content. Caffeine renders the content and processes the JavaScript on the pages.
Google can index and rank JavaScript-based pages to some extent. However, experience shows that there are still many problems with JavaScript pages. For that reason, you should make it as easy as possible for Google to index your pages. Making your web pages and your JavaScript search-engine friendly is important if you want to get high rankings.
Google recommends 'progressive enhancement'. That means that you should deliver your web page content in plain HTML and then use JavaScript techniques such as AJAX (Asynchronous JavaScript And XML) to improve the appearance and behavior of your pages. That is the best way to optimize your pages, because Google can see the full content of your pages in the HTML code, while users still get a good-looking website.
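A minimal sketch of this approach (the element IDs, text, and API endpoint are placeholders of our own, not a specific recommendation):

```html
<!-- Progressive enhancement sketch: the core content is plain HTML,
     so every crawler can read it even if the script below never runs. -->
<article id="product-info">
  <h1>Blue Widget</h1>
  <p>The Blue Widget is a durable, waterproof widget. Price: $19.</p>
</article>

<script>
  // Optional enhancement: fetch live stock data and add it to the page.
  // If this fails or never executes, the content above is still indexable.
  fetch('/api/stock/blue-widget')
    .then(function (response) { return response.json(); })
    .then(function (data) {
      var note = document.createElement('p');
      note.textContent = 'In stock: ' + data.count;
      document.getElementById('product-info').appendChild(note);
    })
    .catch(function () { /* the page still works without the enhancement */ });
</script>
```

The key property is that removing the `<script>` block entirely would not remove any content that you want search engines to index.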
If you use a framework such as AngularJS, use tools that prerender the pages to make sure that Google can index the full content of your web pages. If you rely on Caffeine, you cannot be sure whether the full content of your pages will be parsed. Some content of your pages might be invisible to Google if it relies on JavaScript that Caffeine does not support.
What you should do now
Google says that it can index and rank JavaScript. Unfortunately, many webmasters cannot confirm this. In addition, most other search engines and social networks do not render JavaScript at all.
For that reason, it is important to deliver your web page content in plain HTML to search engines. The easier it is for search engines to parse your web pages, the more likely it is that your content will be indexed correctly.
To make sure that your web pages can be parsed by all crawlers, check your web pages with the website audit tool in SEOprofiler. You can create your SEOprofiler account here:
Think twice before using nofollow attributes on your website
"If you have to use a nofollow tag, use it as an attribute on specific links but not at a page level. Using nofollow at a page level just hurts you more than anything. It's not a good idea."
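The difference between the two can be illustrated as follows (the URL is a placeholder):

```html
<!-- Recommended: nofollow as a rel attribute on one specific link -->
<a href="https://example.com/untrusted-page" rel="nofollow">Untrusted link</a>

<!-- Not recommended: a page-level robots meta tag, which tells
     search engines to ignore every link on the entire page -->
<meta name="robots" content="nofollow">
```

With the link-level attribute, all other links on the page keep passing signals normally; the page-level meta tag throws them all away at once.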
Official: Google confirms ranking algorithm update
"Focused updates are minor updates that deal with a single issue (page speed, HTTPS, etc.). Core updates address multiple things (such as the Panda update or the Penguin update)."
Google, Apple face EU law on business practices
"Under the proposal, operators of search engines, app stores as well as e-commerce sites such as eBay will have to specify upfront the 'most important parameters determining ranking', such as 'specific signals incorporated into algorithms' and adjustment or demotion mechanisms.
The proposal will not force companies to disclose their algorithms but just provide descriptions at a general level explaining 'how and to what extent the relevant ranking mechanism takes account of the quality of the products and services offered'."
+++ SEARCH +++ ENGINE +++ NEWS +++ TICKER +++
- Google's forced SSL search page, encrypted.google.com, is shutting down.
- Once a competitive advantage, Siri now seen as liability for Apple.
- Google claims to remove 100 bad ads per second.
- Google adding business descriptions in Google My Business & local panel.
- France to sue Google, Apple over developer contracts.
- Google adds captions to image search.
- Improve your pages with a professional SEO website audit
- How the Ranking Monitor in SEOprofiler can help your business
- Don't panic: how to react to ranking changes
- How fast can you see results from your SEO campaign?
- 6 easy optimization tips that will improve your rankings on search engines
- Google explains featured snippets
- How to get high rankings in Google's local results (4 steps)
- View all past issues...
Do you need better rankings for your website?
SEOprofiler is a web-based search marketing tool that helps you to get better Google rankings. You can create a free SEOprofiler account here. The step-by-step manual will help you to get started as quickly as possible.