

Techniques employed by some Web marketers and site designers to fool a search engine's indexing programs. The objective is to make their website appear at or near the top of the list of search engine results.

Word/Keyword Stuffing
A common method is "word stuffing" or "keyword stuffing," which repeats descriptive words on the page dozens or even hundreds of times. The words may be invisible to the human eye, for example white text on a white background, but a search engine's crawler still reads them.

Bait and Switch
Another technique combines word stuffing with "bait-and-switch," which loads the page with a popular search word such as "sex," "travel" or "antivirus," even though the word has nothing to do with the site content.

Search Engines Are Savvy
Major search engines continually try to outsmart spamdexers. For example, they may automatically lower the ranking of any page that repeats the same words excessively. However, as soon as one method is defeated, spamdexers devise others.
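The repeated-word check described above can be sketched as a simple word-frequency heuristic. This is an illustrative example only; real ranking systems are far more sophisticated, and the 20% threshold here is an arbitrary assumption, not a documented search-engine rule.

```python
from collections import Counter

def stuffing_score(text, threshold=0.2):
    """Flag text where a single word dominates the page.

    A crude sketch of a repeated-word heuristic; the threshold
    is an illustrative assumption, not a real ranking rule.
    """
    words = [w.lower() for w in text.split() if w.isalpha()]
    if not words:
        return False, None
    word, count = Counter(words).most_common(1)[0]
    ratio = count / len(words)
    return ratio > threshold, (word, round(ratio, 2))

# A stuffed page: one phrase repeated 40 times
page = "cheap flights " * 40 + "Book your trip today with us"
flagged, top = stuffing_score(page)
# flagged is True: "cheap" accounts for almost half the words
```

A page with naturally varied wording falls well under the threshold and is not flagged, which is why spamdexers moved on to subtler tricks once simple frequency checks became common.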

Legitimate Methods
Site designers can raise their ranking ethically, for example by inserting appropriate keywords in the page's meta tags. In addition, the search engine itself may offer tips on how to obtain higher rankings. A good source of information is Search Engine Watch, which covers the major search engines. See doorway page, meta tag, spam and Google bomb.
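The meta-tag placement mentioned above can be seen by extracting a page's keyword tags with Python's standard-library HTML parser. The sample markup is a made-up illustration, not taken from any real site.

```python
from html.parser import HTMLParser

class MetaKeywordParser(HTMLParser):
    """Collect the content of <meta name="keywords"> tags,
    the legitimate keyword placement described above."""

    def __init__(self):
        super().__init__()
        self.keywords = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "keywords":
            # Split the comma-separated keyword list into clean terms
            self.keywords += [k.strip() for k in attrs.get("content", "").split(",")]

# Hypothetical page head containing a keywords meta tag
sample = '<head><meta name="keywords" content="travel, flights, hotels"></head>'
parser = MetaKeywordParser()
parser.feed(sample)
# parser.keywords is now ['travel', 'flights', 'hotels']
```

Note that because meta keywords were so widely stuffed by spamdexers, major search engines eventually gave them little or no ranking weight; the example simply shows where such keywords live in the markup.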