Google Webmaster Tools (GWT) is the primary mechanism for Google to communicate with webmasters. It helps you identify issues with your site and can even let you know if it has been infected with malware (not something you ever want to see, but if you haven’t spotted it yourself, or had one of your users tweet at you to let you know, it’s invaluable).
Google Webmaster Tools is a set of tools for web publishers looking to optimize the positioning of their websites in Google search results. It covers the different aspects of SEO and gives website publishers access to information that is useful for search engine optimization.
This service is linked to the other Google services (e.g. Gmail, Google Analytics), so using it requires a Google Account (simply create a Gmail address if you do not already have one), then authenticate on Google Webmaster Tools.
Once on the home page of your profile, enter the URL of your website and click on “Add a Site”:
At this stage, you can choose the form under which you want your site to be indexed by Google: with or without the “www” prefix. Click on “Continue” and validate your website.
There are several methods to validate a site, that is to say, to confirm its ownership. The method recommended by Google is the insertion of an HTML validation file:
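The validation file itself is tiny: Google generates a uniquely named HTML file for your account (for example google1234567890abcdef.html, where the token is a placeholder here and will differ for your site), you upload it to the root of your website, and its only content is a single line echoing its own name:

    google-site-verification: google1234567890abcdef.html

Alternatively, ownership can be confirmed by adding a verification meta tag to the <head> of your home page:

    <meta name="google-site-verification" content="1234567890abcdef" />

Once the file or tag is in place, click “Verify” and Google will check for it.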
Once your site is validated, the dashboard provides an overview of the key status indicators of your website in terms of indexing and positioning:
Among the tools available:
The “Health” section of Google’s Webmaster Tools program is where things start to get interesting. This area provides tons of information on how well the Googlebot is able to interact with your website, so there are a few particular features you’ll want to pay attention to:
While the specific results shown here might not influence your day-to-day SEO strategies, it’s still a good idea to check these individual tools every so often to ensure that your site is still functioning normally.
Crawl errors shows you issues Googlebot had in crawling your site. This includes response codes (404s, 301s) as well as a graph of the errors over time. This is a fantastic resource for spotting broken links, as the URL shows up as a 404 error. You can see when Google first detected the error codes and download the table of errors into a spreadsheet.
Pages crawled per day is a good SEO metric to track over time. You can get some insight from the chart, but this is a metric to check in on and record every week. Ideally you want that number continuing to climb, especially if you are adding new content.
Fetch as Googlebot will return exactly what Google’s spider “sees” on the URL you submit. This is handy for spotting hacked sites as well as seeing your site the way Google does. It’s a good place to start an SEO audit.
The really neat feature that’s new this year is “Submit to Index”. Ever made a title tag change and wished Google would update its index faster to get those changes live? “Submit to Index” does just that: up to 50 times a month, you can submit a page to be updated in near real-time in Google’s index. Very handy for testing on-page changes.
The robots.txt Tester tool shows you whether your robots.txt file blocks Google web crawlers from specific URLs on your site. For example, you can use this tool to test whether the Googlebot-Image crawler can crawl the URL of an image you wish to block from Google Image Search.
You can submit a URL to the robots.txt Tester tool. The tool operates as Googlebot would to check your robots.txt file and verifies that your URL has been blocked properly.
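As an illustration, a robots.txt along the following lines (the /images/private/ path and example.com domain are placeholders) blocks Googlebot-Image from one directory while leaving the rest of the site open to all crawlers:

    # Block Google's image crawler from a specific directory (placeholder path)
    User-agent: Googlebot-Image
    Disallow: /images/private/

    # All other crawlers may access the whole site
    User-agent: *
    Disallow:

Submitting a URL such as https://www.example.com/images/private/photo.jpg to the Tester should then report it as blocked for Googlebot-Image, while remaining allowed for the standard Googlebot.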
For many webmasters, the bulk of their Webmaster Tools activity will occur in the “Traffic” section, where Google reports the most detailed information about how your search presence translates into web traffic. As a result, all of the individual features here deserve special mention: this section holds much of the information you need to optimize your SEO, from keyword analysis to the positioning of your web pages and your backlinks.
You can use the URL Parameters tool to indicate to Google the purpose of the parameters you use on your site. For example, if you are the owner of a global shopping site, you might tell Google that you use the country parameter to distinguish between pages dedicated to consumers in different countries. You can then set preferences for how Google should crawl the URLs that contain those parameters. The preferences you set can encourage Google to crawl the preferred version of your URLs or simply prevent Google from crawling duplicate content on your site.
In general, URL parameters fall into one of two categories: parameters that do not change the content of a page (such as session IDs or tracking tags) and parameters that change, sort, or narrow the content shown (such as country or sorting parameters).
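To make the shopping-site example concrete (the domain and parameter names below are illustrative placeholders), the same product listing might be reachable under several parameterized URLs:

    https://www.example.com/shoes?country=fr         (country changes the content shown)
    https://www.example.com/shoes?country=us
    https://www.example.com/shoes?sessionid=a1b2c3   (sessionid does not change the content)

Declaring in the URL Parameters tool that country changes page content while sessionid does not helps Google crawl one preferred version of each page instead of treating every variant as separate, duplicate content.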
The Search Queries area gives you both traffic and keyword information. The Search Queries section is broken down into five main indicators:
Query gives you details on which keywords your site currently ranks for. This is one of the fastest ways to determine whether your efforts to get on the map for a specific keyword are working. It’s important to remember that “rank for” means showing up in the SERPs – not necessarily actively attracting traffic (yet). This feature can help you quickly identify keywords that are relevant but need a boost from linking or further content optimization.
If you’ve ever wondered how many people are seeing your website for a specific keyword search, the Impressions metric will tell you. It gives you a good sense of how many people are seeing specific sections of your content, and it is another way to confirm the value of a keyword in addition to traffic data from the Google Keywords module and other tools.
Clicks let you know how many of the searchers seeing your site are taking action and clicking on your search result.
Your CTR, or click-through rate, is the percentage of people who see your site in the search results and click on it (for example, 30 clicks on 1,000 impressions is a CTR of 3%). If your click-through rate is low, look at whether you can improve the meta description for that page. Can the content be made more relevant to the queries driving the most impressions for that page, or can you add a stronger call to action in the title tag or meta description? You should also consider the broader search landscape: is PPC activity driving traffic away from your branded results in the search, for example?
Avg. position tells you where your site typically ranks for each keyword. Since the vast majority of traffic goes to the sites in the top two positions for a given term, it’s helpful to see how you’re performing and the impact that has on your traffic.
The link data here keeps getting updated faster and faster; when this was first launched earlier this year, the delay in discovering links was around three weeks. The internal links report is nothing fancy, but URLs are sorted by the number of internal links pointing to them, which makes it a good report for diagnosing internal link issues and for spotting pages on your site that should be getting more internal link juice.
Remove URLs from the index
Use this tool to generate a custom robots.txt file that specifies exactly how the search engine’s automated spiders should read and parse your website’s content.
A quick glance at the information found here will let you know whether or not changes should be made to your website’s Meta tags.
The Content Keywords tab, found under the Google Index section in the left hand navigation, lists the most common keywords Google found when crawling your website. By extension, these are the keyword queries that most often display search results from your website. Keep an eye out for unexpected keywords on this list as this will warrant further investigation and, very likely, corrective action.
If you make use of microdata on your website (and you really, really should), check this page to track the specific types detected on your website.
Data Highlighter is a webmaster tool for teaching Google about the pattern of structured data on your website. You simply use Data Highlighter to tag the data fields on your site with a mouse. Then Google can present your data more attractively – and in new ways – in search results and in other products such as the Google Knowledge Graph.
If your site contains event listings you can use Data Highlighter to tag data (name, location, date, and so on) for the events on your site. The next time Google crawls your site, the event data will be available for rich snippets on search results pages:
Schema.org is a joint effort, in the spirit of sitemaps.org, to improve the web by creating a structured data markup schema supported by the major search engines. On-page markup helps search engines understand the information on web pages and provide richer search results. A shared markup vocabulary makes it easier for webmasters to decide on a markup schema and get the maximum benefit from their efforts. Search engines want to make it easier for people to find relevant information on the web, and markup can also enable new tools and applications that make use of the structure. Website: https://schema.org/
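As a small sketch of what such markup can look like (the event, venue, and date below are made-up placeholders), an event listing marked up with the schema.org Event type in microdata might read:

    <!-- Event listing marked up with schema.org microdata; all values are placeholders -->
    <div itemscope itemtype="https://schema.org/Event">
      <span itemprop="name">Spring Jazz Night</span>
      <meta itemprop="startDate" content="2014-05-08T19:30">May 8, 7:30pm
      <div itemprop="location" itemscope itemtype="https://schema.org/Place">
        <span itemprop="name">Blue Note Club</span>,
        <span itemprop="address">12 Main Street, Springfield</span>
      </div>
    </div>

Data Highlighter achieves a similar result without editing your pages: you tag the same fields (name, date, location) with the mouse and Google infers the pattern for similar pages on your site.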