Put that black hat away. Shady SEO tactics can get your site severely penalized by search engines. Punishments can range from losing organic traffic for a few days to losing it permanently.
A company’s website is a business asset whose value accrues over time, and it should be treated as such. Here are five timeless “worst practices” to avoid:
1) Link buying – Attempting to make your site more authoritative by paying for links (see J.C. Penney).
A major element of most search engines’ ranking algorithms (especially Google’s) is “link popularity.” Simply put, link popularity is a measure of both the number of links pointing to a domain and the authority and trustworthiness of the sites behind them. Authoritative and trustworthy websites (Forbes, The New York Times, PBS, etc.) can pass along a significant amount of their authority and trust when they link to a company’s website. Not surprisingly, these links tend to be very difficult to get. However, there are also cases where the sheer “tonnage” of links suffices to boost rankings and organic search traffic.
Unfortunately, it’s very common for sites to try to cheat the system. Instead of creating a remarkable website, stellar services and content, unethical Web marketers try to buy their way to the top by purchasing links. There is no shortage of site owners who would link to a site for a fee. Buying links should be avoided at all costs. It is a violation of any search engine’s Terms of Service, and it can get a site banned from the index.
J.C. Penney famously got caught in 2011 for buying large amounts of links. The company was banned from Google’s index for 90 days. While losing a full fiscal quarter’s worth of profit from organic search traffic is certainly nothing to take lightly, that’s not the worst-case scenario. Plenty of sites without the brand clout of J.C. Penney have been banned for much longer periods of time for the same infraction.
If “building links” is a service an agency or vendor offers you, have them explain exactly how they build links and how their methods stay within the engines’ Terms of Service.
2) Cloaking – Serving different content to a search engine versus a human visitor.
The term “cloaking” certainly sounds dark and mysterious, but the concept is relatively simple. It means that a Web server will deliver different content based on whether the request is coming from a search engine or a Web browser.
Some “black hat” Web marketers use cloaking for very nefarious purposes, serving radically different pages to engines and to humans. Usually the cloaked version served to the search engines is very text-heavy (which engines understand well) and often targets off-topic, popular phrases just for the traffic it might pull in. The version served to humans is typically a conversion-centric page with very little content, trying to lure some fraction of visitors to pull out their credit cards and spend some money.
Not all cloaking is done with malicious intent. A few years ago, I worked with a very large comparison-shopping site, helping the international versions of its sites generate more traffic. One of the interesting things I noticed immediately was that it was cloaking its own home pages! Instead of the version that humans saw with products, photos and marketing copy, the cloaked version was a simple list of links to most of the categories for which the sites had products.
When I brought this issue to light, it turned out that the engineer responsible had done this deliberately: he thought presenting a simple list of links would make it easier for search engines to discover the content on the site. He didn’t even know that cloaking was against the rules and could get the sites banned!
This kind of oversight, while not done maliciously, could have gotten the site into a penalty situation. Any agency you hire should be checking for cloaking on all of your Web properties as part of a standard technical SEO audit.
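A quick way to spot-check for cloaking is to request the same URL twice, once with a normal browser User-Agent and once claiming to be a crawler, and compare the responses. The sketch below is a minimal illustration of that idea, not a tool any engine or audit firm publishes; the User-Agent strings and the similarity threshold are assumptions, and a low score only warrants a manual look, since some sites legitimately vary content (for mobile devices, geotargeting and so on).

```python
# Minimal cloaking spot-check (illustrative sketch): fetch a URL with a
# browser User-Agent and with a crawler User-Agent, then compare the two
# response bodies. A big difference is a red flag worth investigating by
# hand, not proof of cloaking.
import difflib
import urllib.request

# Both User-Agent strings are illustrative assumptions.
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
BOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def fetch(url, user_agent):
    """Return the response body for `url` using the given User-Agent."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8", errors="replace")

def similarity(a, b):
    """Similarity ratio in [0, 1]; 1.0 means the two pages are identical."""
    return difflib.SequenceMatcher(None, a, b).ratio()

def cloaking_suspected(url, threshold=0.9):
    """Flag `url` if the crawler and browser versions differ substantially.

    The 0.9 threshold is an arbitrary starting point; tune it per site.
    """
    return similarity(fetch(url, BOT_UA), fetch(url, BROWSER_UA)) < threshold
```

Running `cloaking_suspected` across a site’s key templates is cheap enough to include in a routine technical audit.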
3) Keyword stuffing – Cramming page content full of a keyword to make the page appear more relevant for that keyword.
Keyword stuffing is a very old-fashioned tactic that stopped working a long time ago (around 2005). The rationale was that the more times a particular keyword appeared on a page, the more “relevant” an engine would consider that page for the keyword. Text on the page would be “stuffed” with the keyword until it read terribly.
Search engines have long since moved past basing relevance calculations on simple keyword repetition. While good keyword research is still central to publishing content that performs well in engines, your writers should write for the visitors they need to inform and persuade, not for a search engine spider.
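If you want a rough, automated sanity check on existing copy, a simple keyword-density measure can flag pages that lean too hard on one word. This is a hypothetical editorial heuristic, not a metric any engine publishes; the single-word tokenization and any cutoff you apply to the result are assumptions.

```python
# Crude keyword-density heuristic (illustrative, not an engine metric):
# the fraction of words on a page equal to a given keyword. Unnaturally
# high values usually mean the copy was written for a crawler, not a reader.
import re

def keyword_density(text, keyword):
    """Fraction of words in `text` equal to `keyword` (single word, case-insensitive)."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return sum(w == keyword.lower() for w in words) / len(words)

# keyword_density("cheap widgets cheap widgets cheap", "cheap") → 0.6
```

Any threshold you pick is arbitrary; the value of a check like this is flagging outliers for a human editor, not passing or failing pages automatically.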
4) Hidden text – Using tricks to make different content visible to humans than engines (see BMW Germany).
Hidden text and keyword stuffing often travel hand in hand. Because keyword-stuffed text reads so awfully to visitors, the idea is to hide it from them while still having it as machine-readable text on the page for engines.
BMW Germany was caught using exactly this variety of hidden text in 2006: it served visitors pages with pretty pictures and very little text, while behind the pictures sat keyword-stuffed garbage for the search engines.
Content should be central to your website. It’s how you communicate value to prospective customers. There are plenty of legitimate ways to balance content with aesthetics and conversion, and hidden content isn’t worth the risk.
5) Being a seed for spam – If visitors are allowed to create profiles or leave comments on your site, and they create links to “bad neighborhoods” on the Web, your site can suffer as a result.
Social media and community involvement continue to develop as important aspects of a fully realized Web marketing campaign. But while interaction with your customers and peers is undeniably good, it can also hurt your website if implemented and managed incorrectly.
Especially on blogs, a central part of fostering an interactive and healthy ecosystem is involving your readers. Commenting and discussion should be encouraged as ways to strengthen bonds with existing customers, acquire new ones, and even address news that may be particularly challenging for your organization.
The flip side of this coin is that the more active and successful your community is, the more it can attract bad actors just looking to exploit your site’s popularity and authority by creating profiles and comments that exist only to drive links back to the bad actors’ sites.
Ensure that your blog comments are actively filtered (by software, by applying “nofollow” to outbound links, and by human curation) to remove manipulative, promotional posts and the users behind them.
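Most blog platforms can apply “nofollow” to comment links for you, and that built-in option should be your first choice. If you render comment markup yourself, the idea can be sketched in a few lines: rewrite the HTML so every link without a rel attribute gets rel="nofollow". The class name and approach below are illustrative (a toy that ignores entities, boolean attributes and self-closing tags), not production code.

```python
# Toy sketch: add rel="nofollow" to user-submitted comment links so they
# pass no link authority. Real sites should prefer their platform's
# built-in setting; this only illustrates the mechanism.
from html.parser import HTMLParser

class NofollowRewriter(HTMLParser):
    """Re-emit comment HTML, adding rel="nofollow" to <a> tags that lack a rel."""

    def __init__(self):
        super().__init__()
        self.out = []

    def handle_starttag(self, tag, attrs):
        if tag == "a" and "rel" not in dict(attrs):
            attrs = attrs + [("rel", "nofollow")]
        rendered = "".join(f' {name}="{value}"' for name, value in attrs)
        self.out.append(f"<{tag}{rendered}>")

    def handle_endtag(self, tag):
        self.out.append(f"</{tag}>")

    def handle_data(self, data):
        self.out.append(data)

def add_nofollow(comment_html):
    """Return `comment_html` with nofollow applied to every bare link."""
    rewriter = NofollowRewriter()
    rewriter.feed(comment_html)
    rewriter.close()
    return "".join(rewriter.out)
```

For example, `add_nofollow('<a href="http://spam.example">buy now</a>')` returns `'<a href="http://spam.example" rel="nofollow">buy now</a>'`, while links that already carry a rel attribute pass through unchanged.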
Search engines are particularly sensitive to sites that link out to “bad neighborhoods.” Such sites are seen as seeds for Web spam and can be penalized as a result, since engines don’t want bad sites in their indexes. Keep your community clean and you’ll reap the rewards; let your site become a seed for spam and you can suffer for it.
With the Web becoming an increasingly important marketing channel for most companies, it’s more critical than ever that you ensure your Web marketing tactics are focused on the long-term health and success of your website. The risks associated with outlawed SEO tactics are not worth the reward.