
SEO Expert India

How Chasing Bing and Google can Cause Fundamental SEO Mistakes

With Microsoft’s release of their new search engine Bing, some major changes by Google, and the much-hyped ascent of social media, more and more companies are trying to win traffic by reacting to adjustments in the SEO landscape. However, an SEO strategy should be centered on fundamentally sound principles, not on reactions to frequent changes in the game. In their hurry to capitalize on untapped resources or temporary benefits, many companies lose sight of important best practices – things to do and things not to do.

An effective SEO campaign should find ways to continually improve performance through optimization and link building, while simultaneously avoiding penalties from the search engines. Best practices don’t promise success, but they do ensure that your time spent link building, researching keywords, and optimizing your sites isn’t wasted. Here are some common mistakes that people make in their SEO efforts, and how to avoid them with engine-friendly, white-hat techniques.

Hanging Out With the Wrong Crowd
In their zealous race to get to the top of the search engine results pages (SERPs), many sites are a bit careless about where they get their links from. While inbound links are an important factor in Google’s algorithm, the quality of the sites the links come from can matter more than the number of links you get. If you are linked to from too many sites that have penalties, Google is likely to associate you with those sites. This is known as the “bad neighborhood” effect and will result in penalties for your site. This sort of guilt-by-association penalization means that you need to be careful about where you do your link building, and that not all links are worth having. Any site that has been penalized for link scheming, spyware, malware, or phishing is probably a site you should avoid in your link building efforts.

Not Paying Attention to Search Engine Updates
As I said above, the search engines are constantly changing their algorithms. Not keeping a close eye on SEO news could spell trouble for your website. For example, just recently Google made a major change to the way their web crawlers treat no-follow links. In the past, many sites used no-follow as a way to conserve page rank. With the recent changes, Google won’t pass “link juice” through no-followed links, but those links still count when page rank is divided among the links on a page. This means that if you have five links on a page and no-follow two of them, the other three will pass page rank as if there were five links on the page. Previously, they would have passed page rank as if there were only three links on the page. This is a pretty big deal: page rank sculpting is now a matter of cutting back on links, rather than just no-following everything you’re not concerned with ranking. Whoever manages your SEO campaign should have the time and energy to monitor for this sort of change and make adjustments on the fly, or else you might wake up one day and find that you’re not even on page one for your best keywords.
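The arithmetic above can be sketched as a toy calculation. This is a simplified model for illustration only – not Google’s actual formula – and the function name and values are invented:

```python
def juice_per_followed_link(page_rank, total_links, nofollow_links, post_change=True):
    """Toy model of how much page rank each followed link passes.

    post_change=True models the updated behavior: no-followed links still
    count toward the divisor, so their share simply evaporates instead of
    being redistributed to the followed links.
    """
    followed = total_links - nofollow_links
    if followed <= 0:
        return 0.0  # nothing left to pass page rank through
    divisor = total_links if post_change else followed
    return page_rank / divisor

# Five links on a page, two of them no-followed:
new_share = juice_per_followed_link(1.0, 5, 2, post_change=True)   # each followed link passes 1/5
old_share = juice_per_followed_link(1.0, 5, 2, post_change=False)  # each used to pass 1/3
```

Under the old behavior the three followed links split everything; under the new behavior two-fifths of the juice is simply lost.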

Making It Hard or Impossible to Get Indexed
If you don’t get indexed, you can’t rank. While best practices for indexing won’t necessarily help you rank higher, they will facilitate the ranking process by ensuring that engines index your changes more quickly. This will help you in the long run as you make numerous SEO changes over the life of your site. When you make changes, you’ll want them picked up quickly so you aren’t killing time waiting to see whether you’ve effected any real change. Two good rules to follow are:

1. Use off-page (external) CSS. This keeps your pages light on code, which helps search engines index them more quickly. Using off-page CSS also has other benefits, such as decreasing your code-to-content ratio and pushing your content closer to the top of the HTML file.

2. Mix it up; don’t let your site feel like a template. This is especially an issue on large sites. With a high number of pages getting indexed, it’s important that they have very little content in common. If they do, search engines may label some of your pages as duplicate content and not index them at all. This applies to meta content as well, such as your titles and descriptions. All of your content should have some degree of variation.
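To illustrate the first rule above, moving styles off-page means replacing inline style blocks with a linked stylesheet. A minimal sketch – the file name is hypothetical:

```html
<!-- Before: inline styles bloat every page and push content further down the file -->
<head>
  <style>
    body { font-family: Arial, sans-serif; color: #333; }
    /* ...potentially hundreds more lines... */
  </style>
</head>

<!-- After: one cacheable external file; the HTML stays light -->
<head>
  <link rel="stylesheet" href="/css/site.css">
</head>
```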
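For the second rule above, a quick audit can flag pages that share identical titles. This is a rough sketch, assuming you have already collected each page’s title; the sample URLs and titles are invented:

```python
from collections import Counter

def duplicate_titles(pages):
    """Return the set of title strings shared by more than one page."""
    counts = Counter(pages.values())
    return {title for title, n in counts.items() if n > 1}

pages = {
    "/widgets/red":  "Buy Widgets Online | Example Store",
    "/widgets/blue": "Buy Widgets Online | Example Store",  # duplicate title
    "/about":        "About Us | Example Store",
}
dupes = duplicate_titles(pages)  # flags the shared widget title
```

The same check works for meta descriptions by swapping in a mapping of URL to description.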

Targeting Redundant Keywords
Speaking of variation, it’s important that you target a variety of keywords throughout your site. The search engines don’t rank your entire site for keywords – they rank pages. This means that if you target the same keywords on all of your pages, you’ll be competing with yourself. Your link-building strategy will be diluted, and many of your pages will go to waste in terms of potential search traffic. By targeting a different set of keywords on each page, you can tap into a wider range of search terms. Don’t be bothered if you find that the only keywords some of your pages can rank for are low-traffic. Not all of your pages are going to be major conversion points or content wells. A high number of extra pages can still serve as a great source of traffic by ranking well in the long tail of obscure search terms.

Not Redirecting or Redirecting Improperly
One of the sure-fire ways to destroy your SEO campaign is to not redirect moved or non-existent pages. There are a variety of reasons why you might end up in this scenario. You could be rewriting URLs, moving domains, or have simply removed old content. If you’re ever nullifying the value of a URL by rewriting it or removing the page, it’s important to redirect the old URL to the new one. By setting up a 301 (permanent) redirect you can still capture most of the link juice and send it to your new URL. Another way to address this issue is to contact some of the webmasters who have linked to you and ask that they point their links at the new address. Google Webmaster Tools will help identify any crawl errors, including ‘not found’ (404) errors, and will even go so far as to tell you how many pages on the web link to the missing page on your site.
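On an Apache server, for instance, a 301 redirect can be set up with a couple of mod_alias directives in an .htaccess file. The paths and domain below are placeholders:

```apache
# Permanently redirect a single moved page
Redirect 301 /old-page.html https://www.example.com/new-page.html

# Permanently redirect an entire relocated directory
RedirectMatch 301 ^/old-section/(.*)$ https://www.example.com/new-section/$1
```

Other servers and platforms have equivalent mechanisms; the key point is that the redirect be a permanent (301) one, not a temporary (302) one, so the engines transfer the old page’s value.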

Building a Deep Site Architecture
Many otherwise quality sites suffer from poor site architecture. Building a site that will rank effectively takes more than getting indexed quickly and avoiding penalties. It’s important that your structure of internal links funnels the most page rank to your most important pages. A good way to approach site architecture is to build a “flat” or “shallow” site: the goal should be to reduce the number of clicks it takes to navigate the entire site to as few as possible. A flat site architecture increases the amount of page rank passed from page to page, makes the site easier to index, and improves usability for visitors.
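One way to audit flatness is to measure each page’s click depth from the home page with a breadth-first search. This is a sketch, assuming you have your internal link graph as a simple adjacency map; the site structure below is invented:

```python
from collections import deque

def click_depths(links, start="/"):
    """Breadth-first search: fewest clicks needed to reach each page from start."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:  # first visit is the shortest path
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

site = {
    "/":         ["/products", "/blog"],
    "/products": ["/products/widget"],
    "/blog":     ["/blog/post-1"],
}
depths = click_depths(site)
deepest = max(depths.values())  # 2 clicks from home -- reasonably flat
```

Pages sitting many clicks deep, or missing from the result entirely (orphans with no internal links), are the ones to pull closer to the home page.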


Check your SEO efforts for these mistakes; if you don’t find any, great! If you do, consider this your notice to start making changes and ranking better. SEO requires a watchful eye, regular maintenance, and a thoughtful strategy. At times it can be confusing – unexplained penalties or poor performance may tempt you to give up on an SEO campaign; however, by always designing for usability and keeping these mistakes in mind, a successful SEO campaign will be within your reach. - by Brian Easter