Google Webmaster Tools for Dummies
Google launched Google Webmaster Tools approximately three years ago. It lets webmasters check the indexing status of their sites and optimize their visibility in search results.
In this post we will discuss the basic usage of Google Webmaster Tools.
When you log into Google Webmaster Tools, you will see the following dashboard:
I will assume that you have already verified your site, either by adding a meta tag to the page header or by uploading a file that Google gives you.
You can add as many sites as you want, but be sure to verify them all.
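For reference, the meta-tag method of verification is just one extra line in your page's head. The token below is a made-up placeholder (Google generates a unique value for your site), and the attribute name has varied over time ("verify-v1" in older versions of the tool):

```html
<head>
  <title>My Site</title>
  <!-- Placeholder: paste the exact tag Google shows you, token and all -->
  <meta name="verify-v1" content="YOUR_UNIQUE_TOKEN_FROM_GOOGLE" />
</head>
```

Once the tag (or the uploaded file) is in place, click Verify in Webmaster Tools and Google will check for it.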
Now, under Site Information, click on the site URL. You will see a new dashboard:
As you can see, we get a general overview of the site. The most important thing to look at is the Web Crawl Errors section: you don't want 404 errors, or any other kind of error, on your site.
When you see an error, click on the Details link for more information. In my case, I have 260 Not Found errors, which is very bad, but normal since the site is still under construction.
Your goal is to reduce the number of errors to zero. Start by checking each URL to see why it is causing a 404 error. If you don't want that page on your site anymore:
- Click on Linked From
- Delete all the references to the 404 page
If you still need that page, you will have to debug it and find where the problem is coming from.
Another important point: check the Problem Detected On date. You may have already fixed your page, but if Google crawled it before the fix, you will still see the error until Google crawls it again (patience is the key).
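If the page has simply moved rather than disappeared, another option is a permanent redirect, so both visitors and Googlebot land on the new URL. A sketch, assuming an Apache server with hypothetical paths:

```apache
# .htaccess -- paths are invented examples; adjust to your own URLs.
# Permanently (301) redirect the dead URL to its replacement.
Redirect 301 /old-page.html http://www.example.com/new-page.html
```

A 301 tells Google the move is permanent, so the old URL should eventually drop out of the error report.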
Your next step is to click on Diagnostics -> Content Analysis. Google explains this section as follows:
"While we were crawling your site, we noticed some issues with the content of your pages. These issues won't prevent your site from appearing in Google search results, but paying attention to them can provide Google with more information and even help drive traffic to your site. For example, title and meta description text can appear in search results, and useful, descriptive text is more likely to be clicked on by users."
As Google mentions, those issues won't prevent the site from appearing in the SERPs, but it is better to fix them:
Duplicate meta descriptions: instead of one static description, each page should have its own unique description.
Long and short meta descriptions: aim for one to two sentences (~150 characters).
Duplicate title tags: try to add more information to the title, e.g. DOMAIN TITLE – 3 words describing the page.
Long and short title tags: keep titles to roughly 70–90 characters.
Non-informative title tags: a descriptive title better explains the page and improves click-through rates in the SERPs.
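Putting those recommendations together, a well-formed page head might look like this (the site name and description text are invented for illustration):

```html
<head>
  <!-- Unique, descriptive title: site name plus a few words about this page -->
  <title>Example Bakery – Gluten-Free Bread Recipes</title>
  <!-- One to two sentences, ~150 characters, different on every page -->
  <meta name="description"
        content="Five easy gluten-free bread recipes you can bake at home, with photos and step-by-step instructions." />
</head>
```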
Let's jump to the Settings section:
Pretty self-explanatory.
Geographic target: use this only if you are targeting users in a specific location (e.g. you run a restaurant in Canada and visitors from other countries won't bring you any benefit).
Preferred domain: the preferred domain is the one that you would like Google to use when indexing your site's pages.
Image search: enabling it might get you some extra traffic from Google Images.
Crawl rate: best left as it is, unless your server gets a lot of traffic and Googlebot is causing server issues.
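One related tip: the preferred-domain setting only tells Google which form you prefer; many webmasters also add a server-side 301 redirect so every visitor (and crawler) ends up on the same host. A sketch for Apache with mod_rewrite, using a placeholder domain:

```apache
# .htaccess -- redirect example.com to the preferred www.example.com
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```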
Now let's go to Sitemaps. This is a very useful tool from Google: it lets you help Google crawl your website and tells Google when you have new content.
It is also very easy to use: all you have to do is submit an XML sitemap, and that's it! Google will do the rest.
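If your CMS doesn't generate a sitemap for you, a minimal one is easy to build yourself. Here is a sketch using only the Python standard library; the URLs are placeholders, and real sitemaps can also carry optional tags like lastmod and changefreq:

```python
# A minimal sketch of generating sitemap.xml with only the Python standard
# library. The page URLs below are invented examples -- swap in your own.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(pages):
    """Return sitemap XML (as a string) listing the given page URLs."""
    ET.register_namespace("", NS)           # serialize without a prefix
    urlset = ET.Element("{%s}urlset" % NS)
    for page in pages:
        url = ET.SubElement(urlset, "{%s}url" % NS)
        ET.SubElement(url, "{%s}loc" % NS).text = page
    return '<?xml version="1.0" encoding="UTF-8"?>\n' + ET.tostring(
        urlset, encoding="unicode")

sitemap = build_sitemap([
    "http://www.example.com/",
    "http://www.example.com/about",
])
print(sitemap)
```

Save the output as sitemap.xml at the root of your domain, then submit its URL in the Sitemaps section of Webmaster Tools.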
The next menu item is: Links
Pages with external links: see which pages on your site have links pointing to them from other sites. (This is useful because you can find out who is linking to you and what they are saying about your site.)
Sitelinks: see which links on your site have been identified as candidates for appearing directly in Google search results. (I need to research how to add new sitelinks; currently they are generated automatically by Google.)
Pages with internal links: see which pages on your site have links pointing to them from elsewhere on your site.
The last option in Google Webmaster Tools is Tools.
This is an important part of Google Webmaster Tools. Google offers several options:
Generate robots.txt: this makes it easy for beginners to generate a robots.txt file. All you have to do next is upload that file to the root of your domain.
Analyze robots.txt: in this area, you can see which URLs are blocked by your robots.txt.
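For reference, a generated robots.txt is just a small text file. A typical one looks like this (the /admin/ path is an invented example):

```text
# Applies to all crawlers
User-agent: *
# Keep crawlers out of this (hypothetical) private area
Disallow: /admin/
# You can also point crawlers at your sitemap
Sitemap: http://www.example.com/sitemap.xml
```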
Enhance 404 pages: build better 404 pages; when a user types a wrong URL, you can offer them links that are close to what they typed.
"When a user clicks on a link to a page that's no longer available on your site, your server returns a 404 (Page Not Found) error. Because generic messages can be frustrating to the user, we recommend making a custom 404 page to provide more useful information about your site. This makes it easier for users to find the information they need (and makes it less likely that they'll leave your site to look elsewhere)."
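Note that serving a custom 404 page is a server-side setting, not something Webmaster Tools does for you. On Apache, for instance, it can be as simple as this (the file name is up to you):

```apache
# .htaccess -- serve a custom page for "not found" errors
ErrorDocument 404 /not-found.html
```

The Enhance 404 feature then gives you a snippet to paste into that page so it can suggest similar URLs.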
Remove URLs: you will rarely need this function, unless you have cached pages that you really don't want shown to the public.
The other options in Google Webmaster Tools are not for dummies; maybe I will write another post about those later.
For now, if you have any questions, feel free to comment; and if you don't agree with what I wrote, don't be shy to tell me about it.