Monday, January 30, 2017

Basic SEO checklist before launching your new website

Take a pause. Is your website ready? Ask yourself again. Technically it might be, but are you really sure it is optimized and SEO-ready as well? If you are not sure, then this article is very important for you and your business. I have worked on more than 15 different business models and optimized more than 100 websites to improve their search engine listings. While working on so many projects, I realized that there are some very common things we need to do irrespective of the business model or type of website. So this is the list of basic and important things you should take care of before launching your new business website.


When it comes to SEO, the first thing to learn is: don't be impatient, good things take time. It is very difficult to recover from an SEO penalty; if a search engine ranks your website lower, bringing it back up is a challenge. So when launching a new website, it is always better and safer to check these common things first rather than rushing to make it live. So let's go over the pre-launch SEO checklist and make your website ready for launch.

1. Make sure you have prepared a perfect robots.txt

What is robots.txt? - Often there are a few pages or directories which we don't want any search engine to crawl or index, e.g. backup files, control panel directories, basically anything which you don't want to appear in search engine result pages.
Why should we do this? - Let search engines see only the good and optimized pages of your website when deciding its ranking. If a lot of unnecessary pages are indexed, the overall impression of your website is calculated from all of those pages, so let only the few important pages be indexed. We also want to control the crawl traffic to our website, because we don't want our web server to keep serving web crawlers pages which are not important.

How to allow full access in robots.txt file
The following block allows all crawlers (User-agent: *) to access all files and folders on your web server.
User-agent: *
Disallow:
   
How to block the full website in robots.txt file
The following lines block all crawlers from accessing your website, so nothing will appear in search engine result pages.
User-agent: *
Disallow: /
   
How to block multiple folders in robots.txt file
User-agent: *
Disallow: /folder_name1/
Disallow: /folder_name2/
   
How to block a page in robots.txt file
User-agent: *
Disallow: /filename.php
   
How to block all bots except Google, Yahoo and Bing
User-agent: *
Disallow: /

User-agent: Googlebot
Allow: /

User-agent: Slurp
Disallow: 

User-agent: Bingbot
Disallow:
   
How to add sitemap.xml in robots.txt file
It's always good practice to have an XML sitemap for your website. It's not mandatory, but it helps search engines discover your pages. Add the line below to your robots.txt.
Sitemap: http://www.dhaneshmane.com/sitemap.xml
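If you don't yet have the sitemap file itself, a minimal sitemap.xml looks like the sketch below; the date is just a placeholder for illustration, and a real sitemap lists one url block per page.
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> block per page you want search engines to discover -->
  <url>
    <loc>http://www.dhaneshmane.com/</loc>
    <lastmod>2017-01-30</lastmod>
  </url>
</urlset>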
   
Following is the URL of my robots.txt; you can have a look at it while making one for your domain.
http://www.dhaneshmane.com/robots.txt
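Putting the pieces together, a complete pre-launch robots.txt might look something like the sketch below; the folder names are only placeholders for illustration, not the contents of my actual file.
User-agent: *
Disallow: /backup/
Disallow: /admin/

Sitemap: http://www.dhaneshmane.com/sitemap.xml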
   
You should always test your robots.txt file with the robots.txt Tester provided by Google in Search Console.

2. Non-www to www redirect in .htaccess

If you have two different URLs on your website serving exactly the same content, search engines treat it as a duplicate content issue. As a best SEO practice, we should always avoid such issues.
Your website's home page can be opened with two URLs, www.domainname.com and domainname.com. Both of these URLs point to the home page of the website, so it will be treated as duplicate content. I usually set up a 301 permanent redirect from non-www to www using the .htaccess file on an Apache web server. You can copy the following code into your .htaccess file.
RewriteEngine on
RewriteCond %{HTTP_HOST} ^dhaneshmane\.com$ [NC]
RewriteRule ^(.*)$ http://www.dhaneshmane.com/$1 [L,R=301,NC]

3. Redirect index.php to root in htaccess

This is one more URL which can create a duplicate content issue with the domain root. Your domain root (e.g. www.dhaneshmane.com) always points to an index page like index.php or index.html, so this is another instance where we can have a duplicate content issue. In this case, we do a 301 permanent redirect of index.php to the website root. Just copy the line in the box below and paste it into your htaccess file.
RewriteRule ^(.*)index\.(php|html?)$ /$1 [R=301,NC,L]
The above rewrite rule redirects index.html / index.php to its root with server response code 301. The NC flag makes the match case-insensitive, so this is the safest way to achieve this.
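Once both redirect rules from sections 2 and 3 are in place, you can verify them from the command line with curl, assuming curl is available; the -I flag fetches only the response headers. Replace the domain with yours.
curl -I http://dhaneshmane.com/
# Should return: 301 Moved Permanently, Location: http://www.dhaneshmane.com/
curl -I http://www.dhaneshmane.com/index.php
# Should return: 301 Moved Permanently, Location: http://www.dhaneshmane.com/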

4. Content is god - 300 plus words per page is very important

SEO is much more difficult these days, as most of the old SEO strategies have begun to fail. Google and other search engines update their algorithms very frequently, scrapping the old techniques for promoting your website.

In order to make a successful website, you have to have a very good content writer. Make sure each page has more than 300 words. Make sure every page has its own unique identity and does not conflict with any other page. Try to write original content and do not copy from any other website; if you think this process will take time, don't worry. It's always better to make a perfect website and then launch it, rather than launching first and thinking about optimization later.

5. Optimized & unique meta data for every page

I am sure everyone knows what meta data is and how important it is for a website. One important thing to remember while preparing the meta data for your website is that every page must have a unique set of keywords and description. A small example follows the list below.
  • Make sure you have proper content for each keyword you are planning to use in the meta keywords tag.
  • While choosing keywords, don't forget long tail keywords.
  • Link keywords in the content to the respective website page if available.
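As a minimal sketch, the head of each page carries its own unique title, description and keywords; the values below are made-up placeholders, not recommendations for your site.
<head>
  <!-- Unique per page: shown as the clickable headline in search results -->
  <title>Basic SEO checklist before launching your new website</title>
  <!-- Unique per page: often used as the snippet in search results -->
  <meta name="description" content="A pre-launch SEO checklist covering robots.txt, redirects, content and meta data.">
  <!-- List only keywords that the page content actually covers -->
  <meta name="keywords" content="seo checklist, pre-launch seo, robots.txt">
</head>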

