Having your site correct from a technical SEO perspective is a major step towards having your site rank well for the search terms you are targeting. A site that is correctly built from a technical perspective from day one is a great achievement and will allow your ecommerce site to start from the strongest possible base.
As you read through the module you will see how important these technical aspects of your site are, and how just one of them being incorrect can have a catastrophic impact on the success of your site.
Toolkits, Resources & Downloads
Creating Your Webmaster Tools Account
If you already have a Google account then creating a webmaster tools account is very straightforward. Log in to your Google account and search Google for the term ‘Webmaster tools’.
Clicking the first result will take you through to the webmaster tools sign-up page.
You then enter the full URL of your homepage into the box and press the Add Property button. You will then be presented with a number of different authentication methods to verify your webmaster tools account.
The recommended verification method is an HTML file upload, which can be done by uploading the file to your site via FTP. There are alternative methods available if this one is not for you.
The first alternative method is placing an HTML meta tag onto the homepage of your site. The code displayed just needs to be copied and placed into the <head> section of your homepage. Once this is done, return to the verification screen and click Verify.
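The meta tag follows a standard format; a sketch of what the tag looks like in place is below, with the content value as a placeholder (Google generates the real token for your account):

```html
<head>
  <!-- Verification tag goes anywhere inside <head>; the content value
       below is a placeholder - copy the exact tag Google gives you. -->
  <meta name="google-site-verification" content="your-unique-token-here" />
  <title>Your Homepage Title</title>
</head>
```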
The next alternative method is using your Google Analytics account to verify webmaster tools. You can see how to set up Google Analytics in our Visitor & Conversion Tracking With Google Analytics module. If Google Analytics has already been set up, this is the easiest verification method available.
The final alternative method is using your Google Tag Manager account, if you have one. Again, this is a very easy way to verify if the account exists, but the Google Tag Manager system is quite advanced, so you should only use this method if you have experience with similar software.
Now that you have verified your site, you will be taken through to the webmaster tools dashboard.
Navigating The Webmaster Tools Interface
Now that you have successfully set up your webmaster tools account, you will see the dashboard screen providing you with critical information for your account. The left-hand side of the screen is the main navigation menu, containing the following sections:
Dashboard – This is effectively the webmaster tools homepage, displaying snippets from some of the most important and most used sections of webmaster tools, including messages received from webmaster tools, crawl errors, search analytics and your sitemap information.
Messages – This opens your message inbox, to which webmaster tools will send any critical notifications. These range from your account being linked to a new Google Analytics web property, to issues with your account, to manual penalties given by Google to your site (which hopefully will be a message you never receive).
Search Appearance – Search appearance breaks down into five subsections – Structured Data, Rich Cards, Data Highlighter, HTML Improvements and Accelerated Mobile Pages. The circled i icon displays a pop-up providing you with an overview of how the elements of Google’s search appearance fit together.
Structured Data – The structured data page breaks down your site’s structured data elements by both pages and items. It also displays the various data types and their sources, and provides details on both.
Rich Cards – These were introduced in 2016 in the US and became globally available at the start of 2017. Their purpose is to attract more engaged users to your site. They do this by having search results appear at the top of a page with an image and reviews of the product/company.
Data Highlighter – The data highlighter allows you to influence the key information displayed to browsers on the search results page. You can highlight information relating to any of the following: articles, book reviews, events, local businesses, movies, products, restaurants, software applications and TV episodes. It is best to highlight what you feel will bring the biggest share of browsing traffic to your site, as it can be the perfect hook to entice browsers.
HTML Improvements – The HTML improvements section displays any issues with your metadata, showing information on your meta titles and meta descriptions. This includes duplicate, missing, long, short and non-informative meta titles, and duplicate, long and short meta descriptions. (We look at this section in more detail below.)
Accelerated Mobile Pages – This section of webmaster tools reports on the AMP versions of your website’s pages, which are stripped-back mobile pages designed to load much faster on mobile devices.
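If you publish AMP versions of your pages, Google discovers them through a pair of link tags; a minimal sketch is below, with example.com and the paths as placeholder values:

```html
<!-- On the standard page, pointing to its AMP version -->
<link rel="amphtml" href="https://www.example.com/product-page/amp/">

<!-- On the AMP page, pointing back to the standard version -->
<link rel="canonical" href="https://www.example.com/product-page/">
```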
Search Traffic – The search traffic section contains the following features: Search Analytics, Links To Your Site, Internal Links, Manual Actions, International Targeting and Mobile Usability.
Search Analytics – The search analytics page is a stripped-back version of Google Analytics. It displays site information on clicks, impressions, click-through rates and positions, which can all be filtered by search type, dates, devices, countries, pages and queries. It also displays a breakdown of the selected information below the graphs.
Links To Your Site – This is a great feature which shows you the websites currently linking to you. It is a great help if you are undertaking a link removal campaign, as it shows you the links being analysed by Google. It also displays information on your most linked content, so you can see which of your content campaigns has been the most successful from a link-building perspective.
Internal Links – The internal links function shows the most linked pages internally within your site. Internal linking is an important way to pass link equity through your site to the pages you feel need it most, and this function will give you an idea of which pages are currently receiving the most benefit from internal linking.
Manual Actions – Ideally this screen will simply report that no manual webspam actions have been found. If your site does pick up a manual action penalty, then this screen will tell you why your site has been penalised and will give examples of what caused the infraction.
International Targeting – The international targeting tool checks that hreflang tags have been properly placed into your site if you have a multilingual platform and are targeting the correct language and country on the correct page.
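The hreflang tags the tool checks for take the form below; a sketch assuming a multilingual site on a placeholder example.com domain, with each page listing all of its language/country alternatives:

```html
<!-- Placed in the <head> of every language/country version of the page -->
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/uk/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/us/" />
<link rel="alternate" hreflang="de-de" href="https://www.example.com/de/" />
<!-- Fallback for browsers matching none of the above -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```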
Mobile Usability – This is a fantastic section, as it will point out major usability issues that occur on mobile and tablet devices. With mobile traffic now overtaking desktop traffic, having a site without any mobile usability issues is key, and this section will help you to achieve this.
Google Index – The Google index section contains three options: Index Status, Blocked Resources and Remove URLs.
Index Status – The index status page shows the number of URLs currently indexed by Google. This feature is a great indicator of which parts of your site browsers will actually be able to find in search engines, and a very low number can indicate major technical issues with your site.
Blocked Resources – This section tells you which resources are currently blocked from indexing, some of which can cause major issues for your site. The page lists all of the blocked resources and the number of pages affected by each; clicking into a resource will display a list of the pages it affects.
Remove URLs – This section displays URLs that you have removed from the site within the last six months. It also gives you the option to temporarily hide URLs (this should not be used as a removal alternative).
Crawl – The crawl section includes the following features: Crawl Errors, Crawl Stats, Fetch as Google, robots.txt Tester, Sitemaps and URL Parameters.
Crawl Errors – The crawl errors section shows data from the last 90 days, displaying the current status of your DNS, server connectivity and robots.txt fetch in the site errors section, and showing server errors in the URL errors section. It will also display the pages affected by the errors.
Crawl Stats – The crawl stats screen displays Googlebot’s activity over the last 90 days, showing how many pages have been crawled, kilobytes downloaded and time taken to download, in three graphs.
Fetch as Google – This tool is great as it allows you to request a crawl of your site by Google. This is very handy if you have just launched a new section of your site or have updated a lot of content, as requesting a crawl can speed up the indexing of your pages. It also gives you four fetch options – desktop, mobile: smartphone, mobile: XHTML/WML and mobile: cHTML – allowing you to see how the crawl will read your site on different devices.
Robots.txt Tester – This tool is fantastic for testing your robots.txt file to make sure any URLs you do not want crawled by search engines or other bots are blocked out. It also allows you to test whether the URLs you do want crawled are accessible to the robots.
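A robots.txt file you might paste into the tester looks like the sketch below; the paths and domain are placeholders chosen for illustration, not rules every site should copy:

```text
# Example robots.txt - paths and domain are placeholders
User-agent: *
Disallow: /checkout/
Disallow: /cart/
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```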
Sitemaps – This tool analyses any sitemaps associated with your website and compares the number of submitted URLs to the number of URLs actually indexed. This allows you to see if certain sections of your site are yet to be picked up by the Google search bot. Unless you have blocked URLs from being indexed, the aim is to have the two numbers matching as closely as possible. There is also a facility to test a sitemap before it is uploaded to your website.
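For reference, a minimal XML sitemap follows the sitemaps.org format sketched below; the URLs and date are placeholder values:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/tires/all-season/</loc>
    <lastmod>2017-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/tires/all-season/as7700/</loc>
    <lastmod>2017-01-15</lastmod>
  </url>
</urlset>
```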
URL Parameters – This is an advanced feature and it is advised that this section be left alone unless you have a deep understanding of parameters as the misuse of this section can lead to parts of your site disappearing from the search results.
Security Issues – This section will alert you if Google has noticed any security issues or threats relating to your site’s content.
Identifying And Repairing HTML Errors
HTML errors can cause major site issues, so quickly identifying and repairing them is key. Google webmaster tools is a great place to start correcting HTML errors, as its HTML improvements screen can be used as a guide to fixing some of the key elements.
To begin, go to your webmaster tools and open the HTML Improvements screen.
In our example, the report shows 196 duplicate title tags, 194 duplicate meta descriptions and 104 short meta descriptions. These will be causing major issues for the site and need to be corrected. The first errors to deal with are the title tag errors.
In this example there are only duplication errors to amend, and they are the most important errors to address as they affect more than one page of your site.
Click into the Duplicate title tag section to display which pages the errors are occurring on.
From here you will be able to see which pages contain the duplicate title tags, and you can work on altering them so that each page has its own unique title tag.
There will be occasions where the duplications shown are paginated URLs, e.g. www.demodomain.com/page1/ and www.demodomain.com/page2/, which are flagged as having duplicate title tags. To deal with these pages you can use the rel="next" and rel="prev" tags to let the search engines know that these pages are linked together and should be treated as one continuing page.
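Using the demodomain URLs above, the pagination tags would be placed in the <head> of each page in the series; a sketch:

```html
<!-- On www.demodomain.com/page1/ (first page: no rel="prev") -->
<link rel="next" href="http://www.demodomain.com/page2/" />

<!-- On www.demodomain.com/page2/ (middle pages link both ways) -->
<link rel="prev" href="http://www.demodomain.com/page1/" />
<link rel="next" href="http://www.demodomain.com/page3/" />

<!-- On the final page of the series: no rel="next" -->
```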
Missing title tags are a fundamental error which needs to be addressed. Not giving a page a title tag is a massive missed opportunity for targeting that page at its most relevant audience. A title tag should be added to every page of your website – even pages such as /checkout and /privacy-policy should have one simply stating what the page is.
Short title tags often represent missed opportunities for a page to compete for possible search traffic. Remember that a title tag should be a maximum of 60 characters, so try to shuffle the order of keywords around to bring the title tag as close to this limit as possible. Webmaster tools will not flag title tags that are only short by a few characters, so all the pages displayed in this section should have room to be expanded, allowing new potential audiences to be reached.
Long title tags are something that should be avoided, as search engines stop reading the tag after the 60 character limit, and any search terms or phrases that come after it are discarded. If you find that you are a few characters over with your title tag, including the overflow in the opening of your meta description can help it still be picked up by the search engines. This trick will not work if you are 20 characters over the limit, so as a rule try to keep your title tags under the 60 character limit.
In the example shown above, the site also has duplicate meta description issues. This can cause similar problems to duplicate title tags and can be dealt with in the same way. Unique meta descriptions give each page of your site the chance to give users further information about what they will see if they click through from a search engine, so displaying a unique, enticing message can gain your site that crucial extra percentage of the search traffic. You can also use the meta description to reinforce the search terms targeted in the title tag and to target some secondary terms.
Long meta description errors are caused when meta descriptions go beyond the 160 character limit. Search engines do not display anything past the 160 character mark, so anything beyond this is wasted. This does not cause any real issues with the search engines, but it does mean that your site will not be displaying a clear message for the page, which can put search browsers off, so it is advised to stay under the 160 character limit.
Short meta descriptions are not really errors, but rather missed opportunities to promote the pages of your site. With 160 characters to work with, most pages can easily be given an enticing meta description which ties in secondary search terms, increasing the likelihood of the page ranking well in the search engines.
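Putting the title and description guidelines together, a well-formed page head might look like the sketch below; the product, wording and brand name are made up for illustration, with the title under 60 characters and the description under 160:

```html
<head>
  <!-- Title kept under the 60-character limit -->
  <title>All Season Tires | Free UK Delivery | Demo Domain</title>

  <!-- Unique, enticing description kept under the 160-character limit,
       working in secondary terms such as "next-day delivery" -->
  <meta name="description" content="Browse our full range of all season
    tires with free next-day UK delivery and a price match guarantee on
    every leading brand.">
</head>
```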
Note: Any changes made to title tags or meta descriptions should be documented in the keyword map that was created in the Drive Targeted Traffic With Effective Keyword Mapping module previously undertaken.
Identifying And Repairing Structural Errors
Structural errors are errors with the URLs of your site. These can be created automatically by the platform you are using, or manually by using a non-logical hierarchical structure as the base for your website.
The structure of your website is key to its success as your site visitors need to be able to easily navigate the site to find your products in order to purchase. A good site structure also means that link equity can easily flow down throughout your site.
Examples of site structure
A good site structure looks like the following:
With this site structure there is a clear path from home page to product page, and each section of the structure has a clear name and purpose. Take a URL ending in /tires/all-season/as7700/: it shows the product name, as7700, found in the all-season subcategory, which sits under the tires category. The clear path structure is easy to see. Now compare this to a URL such as /tires/cat2/prod_11/: it’s obviously about tires, but we have no idea what “cat2″ or “prod_11″ are. This style of URL structuring was very common in the early days of the internet.
Here are some tips for how to properly structure your URLs:
Use dashes to separate words in the URL – A URL of “history-of-automobiles.html” is better than “historyofautomobiles.html”. This is the way search engines recommend separating words in URLs.
Avoid complex URLs that point to the same page – Search-generated URLs with too many filters can create an excess of links and also produce long, complex URLs.
Choose your prefix and stay with it – Search engines treat https://www.example.com and http://www.example.com as different URLs, so you should choose the one you want to use and 301 redirect the other to it. It is recommended to use https going forward, as it offers an extra level of security which will make visitors feel safer.
Always use relevant category, page, post and product names for each part of your URL – the example above shows the difference doing this makes.
Where possible, use static URLs over dynamic URLs – Dynamic URLs have their uses and can be essential for some sites, but they can also cause unnecessary complications which can be avoided by using static URLs wherever possible.
After you have restructured your URLs there are a few steps you must follow for your hard work to pay off:
Set up 301 redirects from the old URLs to your newly created URLs, as this will pass across any page authority previously gained by the old URL and means you will not be starting from scratch on the new URL.
Update your sitemap and do not forget to re-submit it once it is updated.
Make sure you update all internal links within your site to avoid sending visitors to 404 pages.
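The redirect steps above can be sketched as server configuration; the example below assumes an Apache server using an .htaccess file, and the paths and domain are placeholders taken from the earlier URL examples:

```apache
# .htaccess sketch (Apache; paths and domain are placeholders)

# 301 redirect an old structural URL to its restructured equivalent
Redirect 301 /tires/cat2/prod_11/ https://www.example.com/tires/all-season/as7700/

# Force the chosen https prefix site-wide with a single 301
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

On other servers (e.g. nginx or IIS) the same redirects are expressed differently, but the principle of one permanent 301 hop per old URL is the same.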
If you follow the guide above you will have no issues with your site structure and will be in a great position to achieve the rankings needed to make your site a success.
Other Key Elements In Webmaster Tools
There are many key elements within webmaster tools which can be used to monitor and improve your site’s performance. We have explained some of these in more detail below:
Structured Data – Structured data markup is a standard way to annotate your content so that search engines find it easier to read, manipulate and present to browsers. With structured data, search engines are clearer on the purpose of your content and its target market, so they can deliver it as a result to the most appropriate user. The most common structured data vocabulary is schema.org. Google uses structured data to find the best information on a topic to use as part of its knowledge graph, which means your content could feature as an authority within Google products for the topic it covers.
Structured data is also used to create rich snippets, which are featured on Google’s search results page. These can be a great help in increasing the appeal of your result in the search engines, attracting more traffic to your site.
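A common way to add structured data is a JSON-LD block using the schema.org vocabulary; the sketch below marks up a product page, with the product name, rating, price and URLs all made up for illustration:

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Product",
  "name": "AS7700 All Season Tire",
  "image": "https://www.example.com/images/as7700.jpg",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "128"
  },
  "offers": {
    "@type": "Offer",
    "price": "89.99",
    "priceCurrency": "GBP",
    "availability": "http://schema.org/InStock"
  }
}
</script>
```

Marking up the rating and price like this is what allows search engines to show stars and pricing in a rich snippet for the page.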
Other protocols, such as the Open Graph protocol for Facebook and Twitter Cards for Twitter, can also be included. Both these social platforms use their protocols to improve how people share pages, allowing you greater control of the information displayed on these platforms for your pages.
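Both protocols are implemented as meta tags in the page <head>; a sketch using the same placeholder product page as above:

```html
<!-- Open Graph tags, read by Facebook when the page is shared -->
<meta property="og:title" content="AS7700 All Season Tire" />
<meta property="og:url" content="https://www.example.com/tires/all-season/as7700/" />
<meta property="og:image" content="https://www.example.com/images/as7700.jpg" />
<meta property="og:description" content="All season tire with free UK delivery." />

<!-- Twitter Card tags, read by Twitter -->
<meta name="twitter:card" content="summary_large_image" />
<meta name="twitter:title" content="AS7700 All Season Tire" />
<meta name="twitter:image" content="https://www.example.com/images/as7700.jpg" />
```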
A template for basic code for this can be found in the toolkit section above.
Data Highlighter – The data highlighter tool has the same purpose as structured data: it is designed for you to make search engines (in this case Google) aware of the key information on your pages – information that you feel will help browsers find exactly what they are looking for in the search engines.
The data highlighter offers you the opportunity to mark up your page’s content and categorise it into the following types: articles, book reviews, events, local businesses, movies, products, restaurants, software applications and TV episodes. The initial form asks for the following details:
You first enter the URL you want to use the data highlighter on, then select the type of information you will be highlighting, and then choose whether the highlighting should be applied to other pages like this one or just this page alone. Once this is filled in, press OK.
A highlighting panel will then appear on the right of the screen. If you have previously highlighted data for the page, it will appear here.
If you have chosen to highlight this page and others like it, a tagging bar will appear at the top of the screen.
You can now begin to tag the page: simply highlight the information and a box of data types will appear.
You then find the most appropriate type of data and select it. The highlighted information will then appear under the chosen type on the right side of the screen. If a mistake is made, it can be removed by clicking the cross that appears.
Continue this for all relevant content on the page, then press Publish if this is the only page you are highlighting, or Finish if you chose the ‘pages like this’ option.
If you chose just this page, then the process is complete. If you chose the ‘pages like this’ option, a screen will appear suggesting other pages of your site like the one you have just tagged, or allowing you to create your own page set.
It may also bring up some pages that it considers not similar; if one of these pages appears, you can simply click Remove page so that the highlighter is not applied to it.
This process will continue until the tool has displayed five example pages for the highlighter; once this is done, press Finished.
This brings you to the final page, which displays the pages that the highlighter will be applied to, as well as options to delete the page set, edit the example pages, go back to webmaster tools or publish the highlighted data.
If a warning appears at this stage, it is wise to either continue tagging or delete the page set and start again.
You should now be well in the know with Google Webmaster Tools and how to use it for SEO. If you have questions regarding the article, please comment or feel free to get in touch with us at:
Facebook: Integrity Commerce