spamdexing

(redirected from Black-hat SEO)


Techniques employed by some Web marketers and site designers to fool a search engine's indexing programs. The objective is to make a website appear at or near the top of search results.

Word/Keyword Stuffing
A common method is "word stuffing" or "keyword stuffing," which embeds descriptive words in the page dozens or even hundreds of times. The words may even be invisible to the human eye, such as white text on a white background, yet search engines still read them.
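As a rough illustration, an indexing program could flag stuffing by measuring how much of a page a single word occupies. The Python sketch below assumes a plain-text page body and an arbitrary 20% density cap; it is not any search engine's actual rule.

```python
# A minimal sketch, not any engine's actual rule: flag a page whose
# single most frequent word exceeds an (assumed) 20% density cap.
import re
from collections import Counter

def top_word_density(text: str) -> float:
    """Share of the total word count taken by the most frequent word."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return Counter(words).most_common(1)[0][1] / len(words)

def looks_stuffed(text: str, cap: float = 0.20) -> bool:
    return top_word_density(text) > cap

stuffed = "cheap flights " * 40 + "Welcome to our travel agency."
normal = ("Our family-run agency has arranged walking tours of coastal "
          "Maine for over twenty years, with local guides on every trip.")
print(looks_stuffed(stuffed))  # True: 'cheap' alone is ~47% of all words
print(looks_stuffed(normal))   # False: no single word dominates
```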

Bait and Switch
Another technique combines word stuffing with "bait-and-switch," which loads the page with a popular search word such as "sex," "travel" or "antivirus," even though the word has nothing to do with the site content.
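One rough way a crawler might spot bait-and-switch is to look for popular lure words that appear in a page's title but never in its visible body. The sketch below is a simplification; the lure list and the whitespace tokenizer are assumptions for illustration.

```python
# A minimal sketch of one way a crawler might spot bait-and-switch:
# popular lure words advertised in the title but absent from the body.
POPULAR_LURES = {"sex", "travel", "antivirus", "free", "music"}

def lure_words(title: str, body: str) -> set[str]:
    """Lure words that appear in the title but never in the content."""
    in_title = set(title.lower().split())
    in_body = set(body.lower().split())
    return (POPULAR_LURES & in_title) - in_body

suspects = lure_words(
    "Free antivirus travel downloads",
    "We sell replacement vacuum cleaner parts and accessories.",
)
print(suspects)  # {'free', 'antivirus', 'travel'}, in some order
```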

Search Engines Are Savvy
Major search engines continually work to outsmart spamdexers. For example, they may automatically give a lower ranking to any page that contains a lot of repeated words. However, as soon as one method is defeated, spamdexers come up with others.
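The repeated-word counter-measure mentioned above could, for example, take the form of a scoring penalty. This sketch demotes a page's relevance score in proportion to how far its most frequent word exceeds an assumed 20% cap; the base score and the penalty curve are illustrative assumptions, not Google's actual formula.

```python
# A sketch of the counter-measure described above: scale a page's
# relevance score down when its top word is heavily repeated.
import re
from collections import Counter

def repetition_penalty(text: str, cap: float = 0.20) -> float:
    """Multiplier in (0, 1]; 1.0 means no demotion."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 1.0
    top_share = Counter(words).most_common(1)[0][1] / len(words)
    # The further the top word exceeds the cap, the harder the demotion.
    return min(1.0, cap / top_share)

def adjusted_score(base_score: float, text: str) -> float:
    return base_score * repetition_penalty(text)

print(adjusted_score(10.0, "a varied page about budget travel deals"))  # 10.0
print(adjusted_score(10.0, "travel " * 100))  # 2.0: demoted five-fold
```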

Legitimate Methods
Site designers can use ethical ways to achieve a higher ranking, such as inserting appropriate keywords in the page's meta tags. In addition, search engines themselves may offer tips on how to obtain higher rankings. A good source of information is Search Engine Watch (www.searchenginewatch.com), which covers the major search engines. See doorway page, meta tag, spam and Google bomb.
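For instance, an indexer can read a page's meta keywords with a few lines of standard-library Python. The sample page and its tag contents below are hypothetical.

```python
# A minimal sketch of reading a page's meta keywords the way an
# indexer might, using Python's stdlib html.parser.
from html.parser import HTMLParser

class MetaKeywords(HTMLParser):
    """Collect the comma-separated content of <meta name="keywords">."""
    def __init__(self):
        super().__init__()
        self.keywords: list[str] = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "keywords":
            self.keywords += [k.strip() for k in a.get("content", "").split(",")]

page = ('<html><head><meta name="keywords" '
        'content="travel, walking tours, Maine"></head></html>')

parser = MetaKeywords()
parser.feed(page)
print(parser.keywords)  # ['travel', 'walking tours', 'Maine']
```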
References in periodicals archive
Regarding these terms, the world of SEO has some curious neologisms, such as spamdexing, which refers to keyword stuffing, search engine spam, or black-hat SEO [6].
Black-hat SEO not only uses fake profiles, paid backlinks and link networks, but also attacks competitors to displace them in search results.
The update is directed at websites that violate Google's webmaster guidelines by using black-hat SEO techniques (keyword stuffing, cloaking, link-building and swapping schemes, copyright violations and deliberate creation of duplicate content).
The follow-up to Panda, the Google Penguin filter, was released on April 24, 2012, and focused on devaluing websites that Google believed participated in link schemes or black-hat SEO techniques.
Features of the service include daily scans; more than 50,000 signatures that are regularly updated; a black-hat SEO scanner to show whether there is anything on hosted websites that can get the cloud host delisted by search engines; and scanning of websites, web shops, content management systems and database servers - even if they are firewalled - as well as routers and anything else that has a public-facing IP address.