Search Engine Cloaking explained
Search engine cloaking is a technique webmasters use to gain an advantage over competing websites. It works on the idea that one page is delivered to the various search engine spiders and robots, while the real page is delivered to real people. In other words, browsers such as Netscape and MSIE are served one page, while spiders visiting the same address are served a different one.
The page the spider sees is a bare-bones HTML page optimized for the search engines. It won't look pretty, but it will be configured exactly the way the search engines want it to be in order to rank highly. These 'ghost pages' are never actually seen by any real person, except of course the webmasters who created them.
When real people visit a site using cloaking, the cloaking technology (which is usually based on Perl/CGI) sends them the real page, which looks good and is just a regular HTML page.
The cloaking technology can tell the difference between a human and a spider because it knows the spiders' IP addresses, and no two IP addresses are the same. When a visitor requests a page from a site using cloaking, the script compares the visitor's IP address with the IP addresses in its list of known search engine IPs. If there is a match, the script knows that a search engine is visiting and sends out the bare-bones HTML page set up for nothing but high rankings.
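The IP comparison described above can be sketched in a few lines of Python (the article says such scripts are usually Perl/CGI; this is just an illustration). The spider IP list, file names, and function name here are all hypothetical:

```python
# Minimal sketch of IP-based cloaking. The IPs below are hypothetical
# placeholders for a real list of known search engine spider addresses.
SPIDER_IPS = {"66.249.66.1", "157.55.39.1"}

def choose_page(visitor_ip):
    """Serve the optimized page to known spiders, the real page to everyone else."""
    if visitor_ip in SPIDER_IPS:
        return "optimized.html"  # bare-bones page tuned for rankings
    return "real.html"           # normal page shown to human visitors
```

In practice the script would run on every request and return the file contents rather than a name, but the decision logic is just this lookup.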
There are two types of cloaking: User Agent cloaking and IP-based cloaking. IP-based cloaking is the stronger method, as IP addresses are very hard to fake, so your competition won't be able to pretend to be one of the search engines in order to steal your code.
User Agent cloaking is similar to IP cloaking, except that the cloaking script compares the User-Agent text string sent with each page request against its list of search engine names (user agent = name) and then serves the appropriate page.
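The User-Agent check works the same way as the IP check, just matching on the request header instead. A minimal sketch, assuming a hypothetical list of spider name fragments:

```python
# Minimal sketch of User Agent cloaking. The name fragments below are
# hypothetical stand-ins for a real list of search engine agent names.
SPIDER_AGENTS = ("googlebot", "slurp", "bingbot")

def choose_page_by_agent(user_agent):
    """Serve the optimized page when the User-Agent matches a known spider."""
    ua = user_agent.lower()
    if any(name in ua for name in SPIDER_AGENTS):
        return "optimized.html"
    return "real.html"
```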
The problem with User Agent cloaking is that agent names can be easily faked. Search engines can just as easily use this against cloakers as an anti-spam measure: all they need to do is fake their own agent name and pretend to be a normal person using Internet Explorer or Netscape. The cloaking software will then serve the spider the non-optimized page, and your search engine rankings will suffer.
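The reason this spoofing is so easy is that the User-Agent is just an ordinary HTTP request header that any client can set to whatever it likes. A short illustration in Python (the URL and agent string are arbitrary examples):

```python
import urllib.request

# Any HTTP client can claim to be an old desktop browser simply by
# setting the User-Agent header on the request.
req = urllib.request.Request(
    "http://example.com/",
    headers={"User-Agent": "Mozilla/4.0 (compatible; MSIE 6.0; Windows 98)"},
)
```

A cloaking script relying on this header alone has no way to tell that the request really came from a spider.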
To sum up, search engine cloaking is not as effective as it used to be. The search engines are becoming increasingly aware of the different cloaking techniques used by webmasters and are gradually introducing more sophisticated technology to combat them. Cloaking may also be considered unethical by search engines if not used properly.