Said ‘ouch’ after the Panda and Penguin updates? If you don’t want any member of the animal kingdom hurting your site just because of sloppy SEO, read on to learn not only how to do it correctly but also how to improve your rankings along the way. There are three areas you will want to cover, and we will discuss them one by one.
1. Research

Everything begins with research. First, look at how your website is doing before taking any SEO action. If you haven’t already, add Google Analytics and Google Webmaster Tools to analyze the quality of your visits. Gather at least two weeks of data before hunting for the keywords that might power your SEO project.
Find out how many sites link to yours, which keywords those sites use in their links, and which search terms your users type to find your website. This is especially important for websites whose domain names are brands or company names rather than descriptive keywords, because it shows where your visitors are actually coming from.
Why do this? Because you want to target consumers who are genuinely interested in your products and services, and who will make a reservation, purchase, or refer your website to others.
Once you have a handful of search terms from Google Analytics and Google Webmaster Tools, check your rankings in the SERPs for those keywords on the top search engines: Google, Bing, and Yahoo. Then check global and local monthly search volumes for those keywords in the Google AdWords Keyword Tool (or any other keyword checker); this helps you decide which keywords to use in your SEO campaign. Check related keywords and their search volumes as well, and jot them down. Build as large a keyword list as you can, and map each keyword to a page on your site.
Don’t forget to:
- Take screenshots or copies of your current website rankings, traffic, and links so you have a baseline to measure improvement against.
2. Website Usability Evaluation
Even in SEO, you need as much information about your website as you can get. A usability testing service like usabilityhub.com will show you how your website comes across to a number of random visitors. You can gather feedback on your landing pages, contact pages, and the website design itself, and get heat maps and comments that help you evaluate your site. A website that ranks in the top spot for good keywords but cannot convert leads into sales is next to useless. Getting your website properly evaluated will help you move forward.
Your website may look great in Firefox or Chrome, but have you checked how it looks in IE8 or IE9? Keep in mind that as of April 2012, Internet Explorer still holds about 50% of the browser market share. Firefox has 18.65% and Chrome 17.41%, while Opera, Safari, and the other browsers fight over the remaining 14%. What does this statistic tell you? Simply that to reach your targeted users effectively, you must make sure your website is ready for them first.
3. On-page SEO
On-page SEO is all about optimizing the website itself, both to guide users through the site and to give search engines the relevant information they need to serve their users.
Titles, Meta Descriptions, and Headers
Optimizing titles, meta descriptions, and headers may sound like a cliché, but it remains essential to every on-page optimization strategy. Search engines use this data in the SERPs, and users can understand what a website or page is about just by glancing at these elements.
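As a quick sketch (the page name and wording here are made up for illustration), these three elements live in the page markup like this:

```html
<head>
  <!-- Title: shown as the clickable headline in search results; keep it unique per page -->
  <title>Whoodle Puppy Care: Food, Grooming, and Training Tips</title>
  <!-- Meta description: often shown as the snippet under the title in the SERP -->
  <meta name="description" content="Practical tips on feeding, grooming, and training your Whoodle puppy.">
</head>
<body>
  <!-- One h1 per page stating the main topic; h2s for subtopics -->
  <h1>Whoodle Puppy Care</h1>
  <h2>What to Feed Your Whoodle</h2>
</body>
```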
Optimize URL Structure
Optimizing your URL structure not only helps search engines gather the right data about your pages; it also helps bots crawl your pages in the manner you want them to. For example, if your website is about how to take good care of Whoodles, “yourwhoodledoodle.com/puppy-care/food” looks much better than “yourwhoodledoodle.com/123456/?=p28”.
Also, keeping every page within three clicks of the homepage makes the site easier for both users and bots to navigate. And don’t forget to set up 301 redirects whenever you rename your page URLs.
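If your site runs on Apache (an assumption — other servers have equivalent directives), a 301 redirect from a renamed URL can be added to your `.htaccess` file like this:

```apacheconf
# Permanently redirect the old messy URL to the new descriptive one
Redirect 301 /123456/ http://yourwhoodledoodle.com/puppy-care/food
```

The 301 status tells search engines the move is permanent, so the old URL’s ranking signals transfer to the new one.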
Adding a robots.txt file to the root directory of your site is very important, as it is the first thing search engines check before crawling the rest of your website. Placing it in the root directory tells search engine spiders which pages of your site they may or may not crawl for indexing purposes.
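A minimal robots.txt might look like this (the /admin/ path is just a hypothetical example of a section you would not want indexed):

```text
# Rules apply to all crawlers
User-agent: *
# Keep bots out of a private section (hypothetical path)
Disallow: /admin/

# Tell crawlers where the sitemap lives
Sitemap: http://yourwhoodledoodle.com/sitemap.xml
```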
Just like robots.txt, a sitemap XML file should be placed in the root directory of your website. It lists the pages of your site and how they fit into the categories you assigned, so spiders can crawl and index every page except those you disallowed in your robots.txt file. Search engines will not necessarily follow it to the letter, but it serves as a guide for better indexing. Don’t forget to submit it in your Google Webmaster Tools or Bing Webmaster Tools account as well.
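A bare-bones sitemap.xml, reusing the hypothetical Whoodle site from earlier, follows the sitemaps.org protocol:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- The homepage: updated often, highest priority -->
    <loc>http://yourwhoodledoodle.com/</loc>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <!-- A deeper content page -->
    <loc>http://yourwhoodledoodle.com/puppy-care/food</loc>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

`changefreq` and `priority` are hints, not commands — like the sitemap itself, search engines treat them as guidance.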