Why Is My Website Not Ranking Higher in Search Engines?

Recently I met a client for an SEO consultation who was already doing SEO for his site with another vendor. Since we knew each other personally, he asked me for a free SEO consultation, and I obliged. His main complaint was that despite doing various off-page optimization work, the site was not ranking for his major keywords, and he was quite disappointed with his vendor's performance. I went through the site, and after a day I found the reasons why it was not ranking for those keywords. These reasons apply to the majority of website owners; as a result, their sites lag behind in the SERPs and miss out on important traffic. Some of the reasons I found, which may explain the site's poor performance, are given below.

1) The site was not properly optimized for the major keywords
The major issue with its on-page optimization was that none of the pages were optimized for the major keywords. It is always advisable to optimize one page for a particular keyword, and better not to optimize any other page for the same keyword. This helps that page appear on the Search Engine Results Page (SERP) for that particular keyword and keeps your own pages from competing against each other.
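As a rough sketch of what this looks like in practice (the keyword "organic dog food" and the site name are purely hypothetical), the target keyword should appear in the title, main heading, and meta description of exactly one page:

```html
<!-- Hypothetical page optimized for the single keyword "organic dog food" -->
<head>
  <title>Organic Dog Food - Healthy Meals for Your Pet | Example Store</title>
  <meta name="description"
        content="Shop our range of organic dog food, made from natural ingredients.">
</head>
<body>
  <h1>Organic Dog Food</h1>
  <!-- Body copy uses the keyword naturally; no other page on the site targets it -->
</body>
```

No other page on the site should target that same keyword in its title or heading.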

2) No Sitemap
The site had neither an XML sitemap nor a regular HTML sitemap, which made it difficult for search engine spiders to crawl all the pages of the website. It is always advisable to upload an updated XML sitemap to the root of the website, and also to link to the regular sitemap from every page (preferably in the footer). The sitemaps should be updated whenever you change the website structure.
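A minimal XML sitemap, uploaded to the site root as sitemap.xml, might look like this (the URLs and dates are placeholder examples):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2011-06-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/services/</loc>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```

Each page you want crawled gets its own `<url>` entry; update `<lastmod>` when the page changes.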

3) Bad Navigational Structure
The website had a poor navigational structure, with unnecessary functions, JavaScript, etc. in inappropriate places, which made it difficult for search engine spiders to crawl through the pages. It is better to keep your site navigation as simple as possible to make it spider friendly.
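One spider-friendly pattern is to build the main navigation from plain HTML links rather than script-driven menus, since spiders can follow these without executing JavaScript (the page names here are illustrative):

```html
<!-- Plain-link navigation that spiders can follow without running any scripts -->
<ul id="main-nav">
  <li><a href="/">Home</a></li>
  <li><a href="/services/">Services</a></li>
  <li><a href="/sitemap.html">Sitemap</a></li>
  <li><a href="/contact/">Contact</a></li>
</ul>
```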

4) Poorly Written Robots.txt File
The site had a poorly written robots.txt file, which made crawling even harder for the search engine spiders. Make sure your site contains a well-written robots.txt file that allows the major search engine spiders to crawl the site easily while blocking private pages from being crawled.
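A simple robots.txt along these lines (the /private/ directory and domain are just examples) lets all spiders in, keeps the private area out, and points crawlers at the XML sitemap:

```text
# Allow all spiders to crawl the site, except the private area
User-agent: *
Disallow: /private/

# Point crawlers at the XML sitemap
Sitemap: http://www.example.com/sitemap.xml
```

The file must sit at the root of the domain (e.g. example.com/robots.txt) for spiders to find it.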

5) Thin Content
The site had very little content built around its keywords, so the major keywords hardly appeared anywhere on the pages. Good, unique content based on the major keywords is essential for better SERP positions and forms the backbone of the website.

6) Weak Backlinks
The site had a weak backlink profile, which dealt its search engine rankings a big blow. Although the site had a few backlinks, they were of low quality, i.e. links from poor, irrelevant and spammy sites.

After the client paid attention to these factors, the site showed tremendous improvement in the SERPs. These points apply to all webmasters, so take care that they are not missed when doing SEO for your website.
