Posted by: SEO Positive

Posted on: January 29, 2010 12:41 pm


Search engine optimisation is composed of a myriad of strategies that, when utilised correctly, help ensure that a target website is indexed and ranked. Most of the time, however, pure search engine optimisation, where everything is done by hand, proves very tedious, may not deliver the desired results and can even turn out to be very expensive.

These are the reasons why SEO consultants and SEO companies need to utilise different tools that can help them speed up the process or pinpoint the right keywords more accurately. Such tools have existed for a long time, but unfortunately some SEO consultants and SEO companies are not aware of them, for several reasons. One is that the SEO software is not properly advertised: it may be sitting in the deep recesses of the internet, used only by the few who know its URL. Another is the fear that a tool may fraudulently exaggerate its features, making claims that sound too good to be true, which immediately repels any prospective client or SEO tool user.

While some SEO tools being advertised may take all your money and deliver very little, others are very effective, provided they are utilised properly and you are able to interpret the data they generate correctly. Most SEO tools require you to subscribe and then pay on a monthly basis. Exercise caution with this type of transaction: the tool may be effective, but if you lack the technical know-how to operate the software, it may prove too expensive, as you could spend more than a month simply climbing the learning curve.

Posted by: SEO Positive

Posted on: January 29, 2010 12:40 pm


Search engines give much weight to the quantity and quality of the links pointing to a website. One main reason for this is that links are viewed as a benchmark of a website's online reputation: each link from an external site counts as a vote for the website. To put it in perspective, if websites A to Y each place a link pointing to website Z, then website Z has twenty-five votes. However, newer SEO algorithms suggest that quality beats quantity: a website with a few links coming from pages with high PageRank will fare better than a website with hundreds of links coming from non-reputable sites.

Building links is not easy, because you may encounter several limitations that prevent your website from benefiting from the links you establish. Search engines incorporate algorithms that check whether links are natural or not. Natural links are those that emerge from an SEO company's or SEO consultant's legitimate hard work, such as submitting links to directories. The following examples help distinguish natural links from unnatural ones:

If website A gathers 200 links in one day, that is unnatural: in practice, no valid white hat SEO strategy can create 200 links in a single day. If those same 200 links were built over a period of time, say one month, they would look natural.

A website that gains all its links from pages with a PageRank of 4, or any single constant PageRank, would be considered unnatural, whereas links from a varied mix of sites, with PageRanks of 2, 3, 4, 5, 6 and so on, would be treated by search engines as natural.
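The two heuristics above, link velocity and PageRank diversity, can be sketched as a toy check. Everything here is illustrative: the function name, the thresholds and the assumption that we already know each backlink's acquisition date and the linking page's PageRank are invented for the sketch, not part of any real search engine's algorithm.

```python
from collections import Counter
from datetime import date

def looks_unnatural(links, max_per_day=50, min_distinct_ranks=2):
    """links: list of (acquired: date, pagerank: int) for new backlinks.

    Flags a link profile as suspicious when links arrive in a sudden
    burst, or when every linking page shares the same PageRank.
    """
    per_day = Counter(acquired for acquired, _ in links)
    burst = any(count > max_per_day for count in per_day.values())

    distinct_ranks = len({pagerank for _, pagerank in links})
    uniform = len(links) > 0 and distinct_ranks < min_distinct_ranks

    return burst or uniform

# 200 links gathered on the same day -> flagged as unnatural
spike = [(date(2010, 1, 29), 3)] * 200
print(looks_unnatural(spike))   # True

# 200 links spread over a month with varied PageRanks -> natural
spread = [(date(2010, 1, 1 + i % 28), 2 + i % 5) for i in range(200)]
print(looks_unnatural(spread))  # False
```

A real algorithm would weigh many more signals, but the shape is the same: compare the observed pattern against what organic link growth tends to look like.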

Posted by: SEO Positive

Posted on: January 27, 2010 10:18 am


2009 served as the staging area for 2010's trends in search engine optimisation, because the last quarter of 2009 saw several moves by the search engine companies. Microsoft's highly publicised search engine, Bing, had yet to develop an application or marketing technique that could seriously threaten Google's number one position in user share. Towards the last quarter of 2009, Bing announced its search deal with Yahoo; the merger gives Microsoft the ability to power Yahoo's search and, by implication, to benefit from Yahoo's user base. Google remained unfazed and announced ongoing work on its new search engine algorithm, Google Caffeine. As the battle intensified, Microsoft announced its entry into the world of real-time search through Twitter; Google, however, also struck a deal giving the search engine giant real-time search capability by crawling tweets from Twitter.

Early this year, the effects of real-time search are already evident, presenting SEO companies and SEO consultants with the problem of how tweets may be utilised as part of a real-time search engine optimisation campaign. The search engine companies have released several hints to SEOs, the most distinguishable being how your reputation is assessed by the search engine and how this affects your real-time SERP ranking. Establishing online reputation has long been one of Google's primary goals, and with tweets in real-time search it is no different. A tweet's reputation depends on the user who published it, that is, on the number of tweets and the number of followers he or she has. Beyond this, it depends on the tweet's relevance to the community the user belongs to, or to the type of followers he or she has. Thus, being active and keeping your tweets updated will help them rank higher in the real-time search arena.

Posted by: SEO Positive

Posted on: January 27, 2010 10:17 am


Search engine optimisation is a complex process that combines different strategies, from the broad categories of onsite and offsite SEO down to the detailed work of optimising keywords and building links. Because it is so complex, many business owners and website owners resort to hiring the skills and expertise of SEO companies and SEO consultants. One process always considered central, and a must to execute, is link building. Search engines treat links pointing to your site as "votes", which makes them a determinant in assessing your website's online reputation.

The more quality links you have, the higher the chance of your site being indexed and ranked. One of the major changes noticed by SEO companies and SEO consultants is that search engines now heavily favour a website with quality links over a website with more links that lack the desired quality. What does quality mean? It means the website linking to yours must itself have a decent online reputation; for Google this is determined through the PageRank system, where a page's rank is in turn determined by the number of "votes", or links, pointing to it. To illustrate, a website with a PageRank of 2 or above that ranks high in the SERPs is a reputable site with decent links, so a link from it is a quality link, compared with a link from a site with no PageRank and a very low SERP position.

Another determinant of link quality is relevance: the link must be relevant to your site's theme. Your website must be relevant to the site linking to you. To illustrate, a site that discusses fitness should have links from websites that also cover fitness or health topics. A website with a PageRank of six may link to your fitness site, but if the linking site is about construction materials, with neither a direct nor an indirect relationship to fitness, the link would be rendered useless.
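The two factors above, linking-page authority and topical relevance, can be combined into a toy link score. Everything here is an assumption made for illustration: the function, the topic sets and the crude keyword-overlap measure of relevance are not anything a search engine has published.

```python
def link_score(pagerank, source_topics, target_topics):
    """Toy score: authority (PageRank) gated by topical relevance.

    Relevance is the fraction of the target site's topics that the
    linking site shares; an off-topic link scores zero no matter how
    authoritative its source is.
    """
    shared = set(source_topics) & set(target_topics)
    relevance = len(shared) / len(set(target_topics))
    return pagerank * relevance

# PR 6 construction site linking to a fitness site: irrelevant, worthless
print(link_score(6, {"construction", "materials"}, {"fitness", "health"}))  # 0.0

# PR 3 health blog linking to the same fitness site: relevant, valuable
print(link_score(3, {"health", "nutrition"}, {"fitness", "health"}))        # 1.5
```

The point of the sketch is simply that relevance multiplies authority: a zero on either axis leaves the link worth nothing.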

Posted by: SEO Positive

Posted on: January 26, 2010 9:29 am


In search engine optimisation, every little error you commit can mean a lot to your ranking. Plenty of websites are optimised properly and rank high in the search engine results pages, only to drop after some time because of a simple error in the website content.

Several errors affect a website's ranking; the following are the most common mistakes committed by the SEO company, the SEO consultant, or even the website owner or designer.

1. Too Few Links – apart from content building, search engine optimisation also focuses on link building, the process of gathering links that point to your website for traffic and PageRank purposes. The fewer links you have, the lower your chances of ranking in the search engine results pages.

2. Slow Page Load – page loading speed is now one of Google's criteria for indexing and ranking websites, which is why your pages should contain only web-friendly content and not formats that slow down page loading. As much as possible, use images in PNG or JPG format.

3. Keyword Mismatch – keywords must always be relevant to your website's theme; if there is no relevance between your keywords and your content or site theme, your SEO campaign will be useless. Make sure relevance always comes into play, especially when you decide on your keywords.

4. Poor-Quality Titles – a title may just be the text shown in your browser's title bar, but it gives your website a lot of traction in the search engine results pages: titles are among the first elements crawled by the search engines, and they are where search engines base the description of your whole website or web page.

Correcting these simple errors would mean a lot to your SEO campaign, which is why you have to work hard at researching and looking into the flaws of your campaign before it's too late.
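Points 2 to 4 of the checklist lend themselves to a quick automated audit. The sketch below assumes the page's HTML has already been fetched as a string; the checks, the 70-character title threshold and the list of "web-friendly" image extensions are illustrative choices, not official guidelines.

```python
import re

def audit_page(html, keywords):
    """Toy on-page audit for the checklist above (illustrative thresholds)."""
    issues = []

    # 4. Poor-quality titles: missing, empty, or overly long <title>
    match = re.search(r"<title>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    title = match.group(1).strip() if match else ""
    if not title:
        issues.append("missing or empty <title>")
    elif len(title) > 70:
        issues.append("title longer than ~70 characters")

    # 2. Slow page load: flag image formats other than the web-friendly ones
    for src in re.findall(r'<img[^>]+src="([^"]+)"', html, re.IGNORECASE):
        if not src.lower().endswith((".png", ".jpg", ".jpeg", ".gif")):
            issues.append(f"non-web-friendly image: {src}")

    # 3. Keyword mismatch: every target keyword should appear on the page
    body = html.lower()
    for kw in keywords:
        if kw.lower() not in body:
            issues.append(f"keyword not found on page: {kw}")

    return issues

page = ('<html><head><title>Fitness Tips</title></head>'
        '<body><img src="gym.bmp">Daily fitness advice.</body></html>')
print(audit_page(page, ["fitness", "workout"]))
# ['non-web-friendly image: gym.bmp', 'keyword not found on page: workout']
```

A real audit would parse the HTML properly rather than use regular expressions, but even this crude version catches the kind of simple, ranking-costly mistakes the list describes.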

Posted by: SEO Positive

Posted on: January 22, 2010 1:49 pm


Even with the recent tussle with Chinese authorities at Google's China branch, the search engine company still posted its highest revenue during the last quarter of 2009. It should be remembered that the search engine giant has been in a running battle with China, from the censorship of its search results to the cyber attack that compromised Gmail and resulted in the theft of several email accounts of well-known Chinese activists. The last quarter of 2009 was truly a challenging time for the company, not to mention the emergence of rivals such as Bing and Yahoo, the two having merged their search efforts to take on Google.

However, even with these problems, the search engine giant still posted large revenue for the fourth quarter of 2009. The total revenue Google reported for the fourth quarter alone was $6.67 billion, a seventeen percent (17%) increase, compared with the sixteen percent (16%) increase recorded in 2008.

Google's own sites also generated increased revenues, amounting to $4.42 billion, a rise of sixteen percent (16%) compared with the fourth quarter of 2008, when they earned $3.81 billion. On the side of Google's partner sites, such as those enrolled in the AdSense program, revenue also increased, amounting to $2.04 billion, thirty-one percent of total revenue for the fourth quarter of 2009 and roughly a twenty-one percent increase on 2008's $1.69 billion.

Even though the company has been facing problems with Google China, international revenue still increased, totalling $3.52 billion, or fifty-three percent (53%) of total revenue, a significant increase compared with the fifty percent recorded during the fourth quarter of 2008.
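The percentage figures above can be sanity-checked with a little arithmetic (rounding to the nearest percent; the helper names are ours, and only the dollar figures come from the report):

```python
def growth_pct(now, then):
    """Year-on-year growth, rounded to the nearest whole percent."""
    return round((now - then) / then * 100)

def share_pct(part, total):
    """Share of total revenue, rounded to the nearest whole percent."""
    return round(part / total * 100)

total_q4_2009 = 6.67  # billions, as reported

# Google's own sites: $3.81bn -> $4.42bn
print(growth_pct(4.42, 3.81))           # 16

# Partner (AdSense) sites: $1.69bn -> $2.04bn, and share of total
print(growth_pct(2.04, 1.69))           # 21
print(share_pct(2.04, total_q4_2009))   # 31

# International revenue share of total
print(share_pct(3.52, total_q4_2009))   # 53
```

All four reported percentages check out against the dollar figures.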

What does this increase in revenue mean? For SEO companies and SEO consultants it means Google remains the top search engine in the industry, and having the target website rank on its first page still means a great deal for an SEO campaign.

Posted by: SEO Positive

Posted on: January 21, 2010 9:16 am


The story of Google China's cyber attack is far from over. The search engine company has speculated that the series of break-ins directed at its Gmail service was Chinese in origin. At first, this speculation rested only on the fact that several Gmail accounts of Chinese human rights activists were the target of the attack and were subsequently compromised. Because of this, Google initiated a probe into the matter to find out the truth behind the cyber attack on the search engine giant.

This week, computer security companies seem to agree with Google, having found evidence to support the claim that the attack was Chinese in origin. According to one computer security company, SecureWorks, after analysing the software used for the break-in, researchers were able to identify the primary program utilised during the attack and found that its module-based design appears to have originated from a Chinese technical paper shown exclusively on Chinese-language websites.

It was further determined that the malware behind the Google China attack was a Trojan horse, developed to open a back door to a computer via the internet. Computer security companies have identified the Trojan as Hydraq, a piece of malware whose main function is to destabilise computers running any Windows operating system.

The malware code was deciphered by the computer security companies through reverse engineering, a process in which an engineer takes the malware code and disassembles it in order to understand the nature, and even the origin, of the code.

Even given this evidence, computer security companies are not ruling out the possibility that the Chinese code was inserted intentionally in order to frame the Chinese government.

For more internet marketing news, information and tips on search engine optimisation, feel free to browse our SEO Blog.

Posted by: SEO Positive

Posted on: January 20, 2010 11:46 am


Google's presence in China may slowly dwindle; several problems have arisen over the company's operations in the country, most of them due to government intervention. One example is the implementation of censorship in Google's search results, filtering out information that would be detrimental to the Chinese government and Chinese society. The biggest coercive action against Google, however, was the attack on its Gmail accounts.

Last December 2009, the search engine giant's China branch was attacked by hackers aiming to compromise the Gmail accounts of some Chinese activists. The cyber attack revealed weaknesses in Google China's network infrastructure security and thus compromised the privacy of its clients. In response to these attacks, and to the pressure being exerted by the Chinese government on Google China, the search engine giant decided to stop censoring its search results, an action that might lead to the shutting down of the search engine's operations in China.

The search engine giant already seems to have taken the fight to the Chinese government by defying its request to censor search results. In addition, the company is looking seriously into the cyber attack incident, even considering its own employees as suspects. Google's investigators are not discounting the possibility of a certain degree of involvement by employees in what happened in mid-December. One of the damages done by the attack was the theft of intellectual property to which only company employees had access, giving rise to suspicions that the attack may have emanated from within Google China's walls.

Posted by: SEO Positive

Posted on: January 18, 2010 2:52 pm


It is every SEO company and SEO consultant's goal to have the target website ranked first in every search engine, because ranking first gives you, or the SEO client's website, the edge of being visited. Ultimately, though, the main goal of search engine optimisation is to give the client the much-needed traffic and revenue. Most website owners lack technical knowledge of website design, let alone of search engine optimisation, which is why so many fall prey to the traps of unscrupulous people who promise results of astronomical proportions. One of the most common indicators that a search engine optimisation company or consultant is a fraud is a promise to get your website indexed and onto page one immediately.

The promise of being indexed and ranked on the first page of the search engines is considered fraudulent because no one can truly promise, or even predict, page one rankings; the fraud is all the more evident when the SEO company or consultant outright guarantees such a ranking. Does this mean no one can attain page one? No, of course not; page one may be attained after a considerable amount of time. What is suspicious is the timeframe these fraudulent SEO companies and consultants are guaranteeing.

A fraudulent SEO consultant or company may also show you the result of their effort by getting your website onto page one. However, you should look not only at the fact that you are on page one but also at how you got there. This is where your knowledge of spam and black hat SEO techniques comes in handy. Although black hat techniques can deliver the desired result, after some time the search engine companies will detect them, the offending pages will be treated as spam websites, and the sites will merit being banned from the search engines.

A website owner would do well to first check the background and track record of the SEO company or consultant whose services he or she wants to engage, in order to avoid falling prey to scammers.

Posted by: SEO Positive

Posted on: January 15, 2010 9:04 am


Last month both Google and Bing announced deals with Twitter, the foremost microblogging website, under which both search engine giants will integrate results coming from Twitter as part of their real-time search strategies. This means Google and Bing will both index tweets from Twitter, which is of course known for microblogging in real time. The two giants approached this differently, though: while Bing immediately launched its version of a real-time search results page, Google waited, then gradually integrated real-time results. Microsoft built a separate interface apart from the standard Bing interface, while Google incorporated real-time results into its search engine results page.

What about ranking factors? How will Google and Bing index and rank tweets? According to Microsoft, it all depends on the number of followers: a person with many followers has a good chance of their tweets ranking higher, while fewer followers means lower rankings. Another criterion Bing has openly stated for ranking tweets is content: if a tweet is exactly the same as another, similar to duplicate content, it will rank lower.

Google, on the other hand, has only slightly revealed its approach to ranking tweets in real-time search results. For Google it is not just about the number of followers; numbers may matter, but what Google focuses on is the reputation of those followers. This mirrors link popularity, where a website is heavily favoured if another site with significant PageRank links to it. Having a follower with a good reputation, someone who may be an authority on the related topic, therefore gives you a better chance of ranking higher.

Another ranking factor in Google's real-time search is the presence of hashtags in tweets. Google has indicated that it treats heavy use of hashtags as a red flag, meaning such tweets may be considered spam, so you should avoid the practice as early as now.
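The hints above can be gathered into a toy ranking score. To be clear, this is pure illustration: the weights, the halving penalties and the scoring formula are invented for the sketch; neither Google nor Bing has published an actual algorithm.

```python
def tweet_score(follower_reputations, is_duplicate, hashtag_count):
    """Toy real-time ranking score (invented weights, for illustration).

    Follower count matters, but each follower is weighted by his or her
    own reputation; duplicate content and hashtag stuffing are penalised,
    echoing the hints the search engines have given.
    """
    score = sum(follower_reputations)  # reputation-weighted follower count
    if is_duplicate:
        score *= 0.5                   # duplicate tweets rank lower
    score *= 0.5 ** hashtag_count      # each hashtag is a red flag
    return score

# 100 low-reputation followers vs 50 authoritative ones
print(tweet_score([1] * 100, False, 0))   # 100
print(tweet_score([10] * 50, False, 0))   # 500

# Same authoritative author, but a hashtag-stuffed duplicate tweet
print(tweet_score([10] * 50, True, 3))    # 31.25
```

The sketch captures the contrast between the two engines' stated approaches: raw follower counts alone (Bing's hint) versus follower reputation (Google's), with spam signals dragging the score down in either case.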

For Google, the factors mentioned are not the only ones; this is just the start of a restructuring of its search engine algorithm, and Google executives hope that geo-location data for tweets will also be incorporated. The search engine algorithm is still evolving, a process that SEO companies and SEO consultants should watch closely.