
Regional Search Engines


Almost all Search Engines serve users in many different countries. They do list content from other countries, but most of what gets listed is dominated by US or UK content.

With this in mind, most popular Search Engines have started deploying regional editions that serve only a specific country. For instance, Google has an Indian edition (http://www.google.co.in) that caters to the Indian audience.

Types of Regional Search Engines

Given below are some of the types of Search Engine Regional Editions.

A Regional Interface is nothing but a translated version of the main Search Engine. Many Search Engines have interfaces in different languages such as French, German, Spanish, and Japanese. However, the only difference between these regional interfaces and the main version of the Search Engine is that the interface language is not English. In other words, if you search using the same keyword on both interfaces, the listings are exactly the same.

Regional Interfaces are aimed at an audience that does not understand English.

Human Categorization, as the name suggests, is the categorization of websites by human beings. Search Engine employees sort websites into regional listings: websites that are most relevant to a specific country are listed in that country's edition of the Search Engine. Hence, a search on a French edition would mainly list documents from France. This eliminates the problem mentioned above, where regional interfaces return the same listings as the main version. The only caveat is that the whole process is manual. Directories such as Yahoo, LookSmart, and the Open Directory make use of this process.

Domain Filtering automatically segregates websites from different countries into their respective regional editions. This segregation is done on the basis of domain names. For instance, a website from Australia would generally have a domain ending in .au. The domain filtering mechanism looks at the domains of all websites and creates a country-specific edition listing.
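As a rough illustration, here is a minimal Python sketch of how such domain-based filtering might work; the country table and function name are invented for this example and do not represent any actual Search Engine's implementation:

    from urllib.parse import urlparse

    # Hypothetical mapping from country-code top-level domains to regional editions.
    CCTLD_EDITIONS = {
        "au": "Australia",
        "in": "India",
        "fr": "France",
        "de": "Germany",
    }

    def regional_edition(url: str) -> str:
        """Assign a URL to a regional edition based on its top-level domain."""
        host = urlparse(url).hostname or ""
        tld = host.rsplit(".", 1)[-1].lower()
        # Anything not in the table (notably .com) falls back to the main
        # United States listing -- the drawback discussed below.
        return CCTLD_EDITIONS.get(tld, "United States (default)")

    print(regional_edition("http://www.example.com.au/about"))  # Australia
    print(regional_edition("http://www.example.com/about"))     # United States (default)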

Some Search Engines also have region-specific editions that contain listings from the whole of a region rather than a single country. For example, a French edition of Google may in some cases also return German or Spanish websites.

Domain Filtering has a drawback, though. The mechanism can only filter websites based on their domain names, and hence a .com domain is always treated as a United States website. This is obviously not always true: many websites from other countries also use .com domains.

Maintaining A Local And Regional Site

Domain filtering is probably the best solution for maintaining both a main site and a regional version. With domain filtering, the regional listing is far more comprehensive than with the other mechanisms explained above, and some pages, although regional, may be listed in the main listing as well.

SpamDexing And Cloaking – Time Wasters

A couple of years ago, spamming may have worked wonders for your website. However, with all the popular search engines developing increasingly sophisticated algorithms, spamming can only backfire. These algorithms can easily detect spam and will not only ignore your website but may ban it altogether.

Besides, instead of spending considerable time and effort on spamming, you can follow other proven strategies and earn a higher rank with most search engines. Spamming can also easily irritate readers. Think about it: if your homepage needlessly repeats a particular keyword, it is bound to frustrate a reader. Consequently, your site would be junk rich instead of content rich. This can have nothing but a negative impact on your business.

Search engine cloaking is a technique webmasters use to gain an advantage over other websites. It works on the idea that one page is delivered to the various search engine spiders and robots, while the real page is delivered to real people. In other words, browsers such as Netscape and MSIE are served one page, while spiders visiting the same address are served a different page.

The page the spider sees is a bare-bones HTML page optimized for the search engines. It won't look pretty, but it is configured exactly the way the search engines want it to be in order to rank high. These 'ghost pages' are never actually seen by any real person, except of course the webmasters who created them.

When real people visit a site using cloaking, the cloaking technology (which is usually based on Perl/CGI) sends them the real page, which looks good and is just a regular HTML page.

The cloaking technology can tell the difference between a human and a spider because it knows the spiders' IP addresses; no two IP addresses are the same. When an IP address visits a site that uses cloaking, the script compares that address with the IP addresses in its list of known search engine IPs. If there is a match, the script knows a search engine is visiting and sends out the bare-bones HTML page set up for nothing but high rankings.
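As a rough sketch, the heart of such a script might look like this in Python (real cloaking scripts, as noted above, were usually Perl/CGI; the IP addresses and page names here are invented for illustration):

    # Hypothetical list of known search engine spider IPs (illustrative values only).
    KNOWN_SPIDER_IPS = {"66.249.64.1", "66.249.64.2"}

    def page_for_visitor(visitor_ip: str) -> str:
        """Serve the optimized 'ghost page' to known spiders, the real page to everyone else."""
        if visitor_ip in KNOWN_SPIDER_IPS:
            return "ghost_page.html"  # bare-bones HTML tuned for rankings
        return "real_page.html"       # the normal page human visitors see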

There are two types of cloaking: User Agent cloaking and IP-based cloaking. IP-based cloaking is the better method, as IP addresses are very hard to fake, so your competition won't be able to pretend to be one of the search engines in order to steal your code.

User Agent cloaking is similar to IP cloaking: the cloaking script compares the User Agent string sent when a page is requested with its list of search engine names, and then serves the appropriate page.

The problem with User Agent cloaking is that agent names can easily be faked. All a search engine needs to do to beat cloakers is disguise its spider's name and pretend to be a normal person using Internet Explorer or Netscape; the cloaking software will then send the spider to the non-optimized page, and your search engine rankings will suffer.
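A comparable sketch of the User Agent variant makes the weakness plain; the spider names below are again illustrative:

    # Hypothetical substrings identifying search engine spiders by name.
    SPIDER_AGENTS = ("Googlebot", "Slurp", "Scooter")

    def page_for_user_agent(user_agent: str) -> str:
        """Serve the ghost page only when the User Agent matches a known spider name."""
        if any(name in user_agent for name in SPIDER_AGENTS):
            return "ghost_page.html"
        return "real_page.html"

    print(page_for_user_agent("Googlebot/2.1"))                       # ghost_page.html
    # A spider disguised with a browser string slips straight past the check:
    print(page_for_user_agent("Mozilla/4.0 (compatible; MSIE 6.0)"))  # real_page.html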

To sum up, search engine cloaking is not as effective as it used to be. The search engines are becoming increasingly aware of the different cloaking techniques used by webmasters and are gradually introducing more sophisticated technology to combat them. Search engines may also consider it unethical if not used properly.


