According to many SEO companies, when you conduct search engine optimisation campaigns you should forget SEO. Before you react, allow us to qualify the statement. In the early days of online marketing, search engine optimisation was one of the most sought-after strategies for driving traffic to a website. As time went by, however, many people grew lazy about conducting SEO campaigns and came to rely on software and automation tools for their content development and link building. Thus, black hat SEO strategies were born. Black hat SEO is a collection of strategies whose purpose is to manipulate the search engines into thinking a website is well optimised, when a closer look reveals that the links and content behind it are of low quality and do not provide a good user experience.

One search engine, Google, seeks to change that. In pursuit of its goal of making sure that its search engine results pages return only the most relevant and accurate results, Google has updated its algorithm on a regular basis, culminating in the Google Panda and Google Penguin algorithms, which were developed to combat low-quality content and low-quality links respectively.

This makes the SEO strategy that people came to know obsolete. In the early days of SEO, people sought to create as much content and as many links as possible; according to present-day SEO, this is no longer the case. To run an efficient SEO campaign you must forget SEO. This does not mean you have to forget the tenets of search engine optimisation; rather, it means you have to change your mindset or perspective about it.

If in the early days you considered SEO to be a process of gathering content and links while focusing mostly on the quantity of those links, you must now look at SEO as a process of gathering quality links. Regardless of quantity, as long as your links and content are of the highest quality, your site can be considered properly optimised.

Forgetting SEO means forgetting the old ways of implementing search engine optimisation. SEO is not dead; it has simply evolved into a more challenging form, one that leaves little or no room for black hat search engine optimisation techniques.

Posted by: Ben Austin

Posted on: September 27, 2012 3:47 pm

-

It would seem that Google have been updating their search engine algorithm on a monthly basis, staying true to their commitment to improve it. Of course, every SEO company knows that search engines update their algorithms regularly, and more often than not those updates can affect a website's ranking. When Google released the Panda and Penguin updates, many websites were hit so hard that the change damaged not only their rankings but also their return on investment. In the interest of transparency, the search engines always inform the public about algorithm updates through blogs or even tweets. Yet once an update is released and rankings have been hit, many SEO consultants go into panic mode and lose focus, worrying instead of looking for a solution.

Instead of fretting over something you cannot change, start creating strategies that will help you control the damage the algorithm update has done to your ranking, or at least give you a good start in recovering from it. You can use the following tips to recover if your site has dropped in the search engine rankings.

1. Check Your Webmaster Tools – both Google and Bing have their own webmaster tools, where you can check the status of your website with regard to the search engine. Webmaster tools usually send notifications whenever there are warnings or alerts concerning your website; violations and causes of suspensions can also be seen there. So if you have not yet connected your site to Google or Bing Webmaster Tools, it is time to do so.

2. Read and Understand the Algorithm Update – as mentioned above, algorithm updates are always published by the search engines for the public and other interested parties. Explanations are given on Google's and Bing's blogs; from these explanations, try to understand the goal of each update. This way you can get an idea of what might have caused your site to be de-ranked and of the best practices to follow in order to recover from the damage done by the update.

3. Review Your Analytics – analytics is one of the most powerful search engine optimisation tools, because it allows you to track the number of visitors to your website as well as their demographics. This helps you understand your audience, and in relation to an algorithm update it lets you quickly see whether your traffic has gone up or down significantly. You will also be able to see which keywords were affected, so you can create an SEO plan to recover from the algorithm update; a rough sketch of this kind of check is shown below.
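As a minimal sketch of the keyword check in point 3, the Python snippet below reads a hypothetical CSV export of keyword traffic and flags keywords whose visits dropped sharply after an update. The file name, column names and drop threshold are all assumptions for the example, not part of any analytics product.

    import csv

    # Hypothetical export: keyword_traffic.csv with columns
    # keyword, visits_before_update, visits_after_update
    DROP_THRESHOLD = 0.40  # flag keywords that lost 40% or more of their visits

    def flag_affected_keywords(path):
        affected = []
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                before = int(row["visits_before_update"])
                after = int(row["visits_after_update"])
                if before == 0:
                    continue  # no baseline to compare against
                drop = (before - after) / before
                if drop >= DROP_THRESHOLD:
                    affected.append((row["keyword"], round(drop * 100)))
        return affected

    if __name__ == "__main__":
        for keyword, pct in flag_affected_keywords("keyword_traffic.csv"):
            print(f"{keyword}: traffic down {pct}%")

A list like this makes it much easier to see which pages and keywords to prioritise in your recovery plan.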

Following these simple steps will help you recover from, rather than fret about, any algorithm changes or updates.

Well, this is not exactly an actual animal battle; rather, it is about how Google Panda compares with Google Penguin. Panda and Penguin are Google's names for two different algorithm updates. Many SEO companies and SEO consultants have been fixated on monitoring the effects of Google Panda and Google Penguin, as well as the other algorithm updates, because of the dynamism of those updates. Dynamism here refers to the ever-changing validity of one SEO strategy compared with another. To put it in perspective, an SEO strategy such as a specific link building technique may be acceptable to the search engines today but shunned or frowned upon tomorrow. Because of the various abuses and attempts to manipulate the search engines, the algorithms are constantly improved to combat them.

What is Google Panda?

So we come now to the question: what is Google Panda? We have discussed this topic in previous posts, but to refresh our memories, here is an overview. Google Panda was developed to combat abuses and manipulation by those who want to rank their sites in Google through content. The following are the content abuses, or signals, that could flag your site as being in violation of Google Panda:

1. Copied Content – Google is able to determine whether your content is copied or not; some SEO companies say that having around 40% of your content copied would trigger Google to ban or de-rank your website.

2. Thin Content – this means that a page contains fewer than around two hundred and fifty words, or the content does not give any substantial information about the topic. You must understand that the goal of every search engine is to return the most relevant results, so if a piece of content is not informative and does not give enough information about the topic at hand, it is nothing but spam content. A rough sketch of both checks follows this list.
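The sketch below is only an illustration of the two signals described above: it checks a page's word count against the 250-word figure and estimates how much of the text overlaps with another page using simple word shingles. The thresholds are the rough figures quoted in this post, not Google's actual rules, and the file names are hypothetical.

    import re

    MIN_WORDS = 250          # rough "thin content" threshold quoted above
    COPY_THRESHOLD = 0.40    # rough "copied content" figure quoted above

    def words(text):
        return re.findall(r"[a-z']+", text.lower())

    def shingles(text, n=5):
        w = words(text)
        return {tuple(w[i:i + n]) for i in range(len(w) - n + 1)}

    def is_thin(text):
        return len(words(text)) < MIN_WORDS

    def copied_fraction(text, reference):
        ours, theirs = shingles(text), shingles(reference)
        if not ours:
            return 0.0
        return len(ours & theirs) / len(ours)

    if __name__ == "__main__":
        page = open("our_page.txt").read()      # hypothetical local copy of our page text
        other = open("other_page.txt").read()   # hypothetical text from another site
        print("thin content:", is_thin(page))
        print("copied fraction: %.0f%%" % (copied_fraction(page, other) * 100))

Running a quick check like this on your own pages before publishing is far cheaper than trying to recover a ranking afterwards.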

What is Google Penguin?

Google Penguin, on the other hand, is focused more on link building strategies that are considered web spam. The following are what Google Penguin is trying to combat:

1. Cloaked Links – these are keywords that have been turned into links but designed so that the link is not apparent. In other words, the link is hidden from plain sight, which is why it is important to make your links visible; a rough detection sketch follows this list.

2. Keywords Placed Outside the Article – some websites use batches of keywords that link to specific websites but are not part of the article itself; these bunches of keywords are treated by Google as spam signals.

3. Using Thin Content for Link Building – Google has recently de-ranked and de-indexed websites that allow the posting of poor quality content for link building purposes, one of which was the popular BMR network. Google has become adept at identifying this kind of ranking manipulation.
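Regarding the first point, here is a minimal sketch that scans an HTML page for anchor tags hidden by inline styles. Real cloaking can be far more subtle (external CSS, JavaScript, off-screen positioning), so this is only a starting point, and the file name is hypothetical.

    from html.parser import HTMLParser

    HIDING_HINTS = ("display:none", "visibility:hidden", "font-size:0")

    class HiddenLinkFinder(HTMLParser):
        """Collects href values of <a> tags hidden via inline styles."""

        def __init__(self):
            super().__init__()
            self.hidden_links = []

        def handle_starttag(self, tag, attrs):
            if tag != "a":
                return
            attrs = dict(attrs)
            style = (attrs.get("style") or "").replace(" ", "").lower()
            if any(hint in style for hint in HIDING_HINTS):
                self.hidden_links.append(attrs.get("href", ""))

    if __name__ == "__main__":
        finder = HiddenLinkFinder()
        finder.feed(open("page.html").read())   # hypothetical local copy of a page
        for href in finder.hidden_links:
            print("possibly cloaked link:", href)

If a check like this turns up links you did not knowingly place, it is worth auditing how they got into your templates or guest content.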

Conclusion

Google Panda and Google Penguin are two algorithms implemented by Google that complement each other. Google has developed algorithms that monitor abuses or manipulation of the two cornerstones of SEO: content and links.

Many sites were affected when Google Panda was implemented by the search engine giant. It is no wonder that when Google announced they would incorporate yet another new algorithm, many SEO companies and SEO consultants were once again crying "the sky is falling!". This reaction stems from the fact that Google Panda damaged the rankings of around 12 percent of websites, including sites with well established rankings and brand names. Panda created a lot of problems, as well as lost revenue, for the affected websites. This is why, when Google announced another algorithm update dubbed Penguin, many SEO companies and SEO consultants watched its effects closely and took several measures to prevent damage to their rankings.

What Did the Penguin Do?

Google Penguin targeted websites that used various web spam techniques. Upon its implementation, about 3.1 percent of websites worldwide were affected, meaning their rankings went down while others were suspended. Penguin specifically targets websites using web spam methods: backlinking schemes regarded as attempts to manipulate rankings in the search engine results pages.

When Was the Update?

Google, through Matt Cutts, announced via Twitter that Google Penguin had been updated to Google Penguin 1.1. With this, Google dispelled any claims that Penguin had already gone through several updates; rather, the update was made on May 25, 2012.

Why Did SEO Companies Assume Several Penguin Updates?

The question now is: why would several SEO companies and SEO consultants have announced that there were several Penguin updates? This is because rankings kept changing throughout the week after Google Penguin was implemented. According to Google, these changes were not brought about by several updates but by the gradual rollout of Penguin's effects.

Conclusion

Google Panda is focused on targeting poor content, while Penguin targets web spam techniques. Whatever the case, SEO companies and SEO consultants must always prioritise quality over quantity and must not focus on manipulating rankings when implementing their search engine optimisation campaigns.


Posted by: Ben Austin

Posted on: February 20, 2012 11:29 am

-

When Google released the Panda algorithm, many companies marketing their business online, including SEO companies and SEO consultants, were affected. We saw many websites that were previously on page one of Google later finding themselves on page two or an even lower search engine results page. Google Panda took the search game by surprise, as many large online companies suffered from the algorithm change. Of course, a drop in search engine ranking also means a drop in revenue, as competitors may have taken the higher positions. This is why many SEO companies and SEO consultants scrambled to fix their SEO campaigns, and in doing so several truths about Google Panda were revealed, opening the door to improved SEO techniques as well as new ones. The following techniques may help your campaign restore a high ranking in Google after the Panda update:

  1. Focus on Traffic Source Diversity – although Google is one of the search engines you aim to gain traffic from, it is imperative to diversify your traffic sources. Consider gaining traffic from social media networks, blogs, forums and social bookmarking sites. This strategy is necessary so that your website still receives traffic after an algorithm change by the search engines, especially Google; see the sketch after this list.
  2. Incorporate a Social Media Campaign – Google has released its own social network, Google+, along with its own version of the "like" button, known as the +1 button. Promoting your content on social media and integrating +1 buttons into your website is beneficial, since these are considered ranking factors in Google's new algorithm. Apart from that, promoting your website via Twitter, Facebook, Google+ and other social media networks ensures that you are promoting your site to real people, targeting specific niches and capturing your desired audience.
  3. Content Must Be Fresh and of High Quality – first we must define "fresh" and "high quality". Fresh content is content that is current, so you need to post to your website regularly to keep its content fresh. High quality, on the other hand, means that your content must be unique, well written, grammatically correct, correctly spelled and, of course, informative. In short, content should be posted regularly and written for the reader rather than the search engine: informative, relevant, original and worth reading.
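Returning to the first point, here is a minimal sketch that tallies visits by traffic source and warns when a single source accounts for too large a share of total traffic. The source names, figures and threshold are made up for the example.

    # Hypothetical monthly visit counts by traffic source
    visits_by_source = {
        "google_organic": 8200,
        "bing_organic": 600,
        "facebook": 900,
        "twitter": 450,
        "forums_and_blogs": 700,
        "direct": 1150,
    }

    MAX_SHARE = 0.60  # arbitrary warning threshold for over-reliance on one source

    total = sum(visits_by_source.values())
    for source, visits in sorted(visits_by_source.items(), key=lambda kv: -kv[1]):
        share = visits / total
        flag = "  <- consider diversifying" if share > MAX_SHARE else ""
        print(f"{source:<18} {share:6.1%}{flag}")

In this made-up example Google organic search supplies over two thirds of all visits, which is exactly the kind of dependence an algorithm update can punish overnight.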

Indeed, Google has redefined the world of search. By pushing website content to be as relevant as possible, we may see the end of poor, auto-generated and unoriginal content, making the role of content developers for websites far more important.

Posted by: Ben Austin

Posted on: February 5, 2012 2:23 pm

-

It is a known fact that Google is the search engine most commonly targeted by SEO companies and SEO consultants. The main reason, of course, is that it is the search engine of choice for sixty to seventy percent (60%-70%) of internet users. Google's popularity makes ranking in this search engine a very profitable endeavour. However, ranking in Google is not an easy task, since Google pioneered the continuous improvement of the search engine algorithm. This continuous change, or updating, of the algorithm is aimed at making search results as relevant as possible to the keywords used.

With the goal of improving its search algorithm, Google has introduced several successive updates, one of which was dubbed Google Panda. When Google Panda was first released, many websites were affected, including websites owned by large branded companies. Google announced that the Panda algorithm would, as much as possible, deliver relevant results for every user search. Several months on, Google has continued to deliver updates to the Panda algorithm. We therefore have to understand that the reason sites were de-ranked is that several new ranking factors, which became site SEO vulnerabilities, were added to the Panda algorithm; identifying these will help you secure a strong ranking in Google's SERPs.

1. Lack of relevant content on the website. Since Google is focused on relevance, your content must be relevant to your site's overall theme, and fresh content should be added regularly; this tells Google that your site's relevance is high.

2. Inconsistency between the title tag and the actual content of the web page. This again determines the relevance of your site, so the title of a page must be consistent with the content that the page contains; a rough consistency check is sketched after this list.

3. Duplicate content – probably the main site SEO vulnerability is the presence of duplicate content on your website. To avoid being banned or de-ranked by Google, make sure your content is original and not copied from another website.
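For the second vulnerability, here is a minimal sketch that measures how many of the meaningful words in a page's title appear in the page body; a very low overlap suggests the title and content may be inconsistent. The stop-word list, threshold, example title and file name are arbitrary choices for illustration.

    import re

    STOP_WORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "for", "on"}
    MIN_OVERLAP = 0.5  # arbitrary: at least half the title words should appear in the body

    def keywords(text):
        return {w for w in re.findall(r"[a-z']+", text.lower()) if w not in STOP_WORDS}

    def title_matches_body(title, body):
        title_words = keywords(title)
        if not title_words:
            return True
        overlap = len(title_words & keywords(body)) / len(title_words)
        return overlap >= MIN_OVERLAP

    if __name__ == "__main__":
        title = "Affordable Wedding Photography in Manchester"   # example title
        body = open("page_body.txt").read()                      # hypothetical extracted body text
        print("title consistent with body:", title_matches_body(title, body))

A check like this is crude, but it quickly surfaces pages whose titles promise something the content never delivers.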

There are more site SEO vulnerabilities, but these are the top three that every SEO company or SEO consultant should look at, especially when trying to regain rankings lost to Google Panda.
