Google is currently the most popular search engine, responsible for over 60% of web searches [1]. That means the amount of traffic your website gets depends largely on your Google rank.
To improve their rank, many websites turn to Search Engine Optimization (SEO), the practice of structuring a website to increase its visibility to search engines. Keeping SEO in mind while building your website can improve your rank and the general user experience at the same time. Many of the things a search engine looks for, such as well-structured content that highlights keywords and phrases, clearly defined text links, and the use of header tags, also make it easier for users to scan a page and find what they are looking for. Good SEO also tends to overlap with the W3C Web Content Accessibility Guidelines, which means increased accessibility for people with disabilities [2].
However, not everyone is willing to put in the time and effort to improve their site through good (white-hat) SEO practices. Instead of improving the usability, content, or relevant keywords within their site, some try to increase their ranking by tricking search engines. A few of these tricks include keyword stuffing, hidden content, and sneaky redirects. They may be a quick way to boost your traffic, but they can also land your site on the Google blacklist. If your site is blacklisted, it is removed from the Google index, which means a dramatic drop in website traffic and a loss of business.
If you’re not consciously trying to cheat the system, then you probably don’t have anything to worry about. However, if you hired someone to do your SEO, you might want to check Google’s quality guidelines [3] to make sure they are playing by the rules. Avoiding the following will keep your site off the blacklist and may even improve your rank:
- Hidden text or hidden links – there are lots of ways to hide content from a user while keeping it visible to a search engine. The bottom line: if a search engine can see it, your users should be able to see it too.
- Cloaking or sneaky redirects – don’t serve a search engine different content than you’re offering your users (see the sketch after this list).
- Irrelevant keywords – one way to get extra traffic would be to target keywords from popular Google searches (Google Hot Trends lists each day’s 100 most popular searches). The top search for Nov. 19, 2008 was “tom daschle,” so adding that keyword might get you a few extra hits. But unless you have actual content relating to Tom Daschle, you risk getting blacklisted and annoying the users who came looking for information on him [4].
- Duplicate content – rather than creating multiple pages with the same content, link back to the relevant content within your site. It will improve your rank and keep you off the blacklist.
- Malicious content – sites that try to collect personal information under false pretenses or attempt to install viruses or spyware on visitors’ computers.
- Doorway pages – low-quality pages created just for search engines and optimized around a single keyword or phrase. They tend to annoy users and have led to the blacklisting of even large sites.
- No original content – this doesn’t mean everything you put on your website has to be an original thought or product. It refers to scraping content from other sites or generating random content for the sake of keywords.
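To make the cloaking item above more concrete, here is a minimal, purely illustrative sketch in Python (using the standard library’s http.server) of what server-side cloaking looks like: the handler checks the User-Agent header and serves a keyword-stuffed page to a crawler while real visitors see something else entirely. The handler name, port, and page bodies are hypothetical, and this is exactly the behavior the quality guidelines penalize, shown only so you can recognize it, not copy it.

```python
# Illustrative only: "cloaking" means serving different content to a search
# engine crawler than to real users. Google's quality guidelines penalize this.
from http.server import BaseHTTPRequestHandler, HTTPServer

class CloakingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        ua = self.headers.get("User-Agent", "")
        if "Googlebot" in ua:
            # Keyword-stuffed page shown only to the crawler
            body = b"<html><body>cheap flights cheap hotels cheap flights</body></html>"
        else:
            # Unrelated page shown to real visitors
            body = b"<html><body>Buy our completely unrelated product!</body></html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Hypothetical local port for demonstration purposes
    HTTPServer(("localhost", 8000), CloakingHandler).serve_forever()
```

The fix is simple: serve the same page to everyone, regardless of User-Agent.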
These quality guidelines are designed to improve the Google user experience and to ensure that businesses with good websites and relevant content reach the top of Google’s listings. Even if your website is blacklisted, justly or unjustly, you can change your practices and submit your site for reconsideration.
Resources:
1 Comment
I think Google’s guidelines are perfectly fair. They help keep spammy sites out of the listings, and I believe they’re all directed at making the online experience better for the average individual. I think Google’s ultimate goal is to make their algorithm judge a page the same way an everyday user would, and on-site SEO is evaluated in that spirit. Off-site SEO will always be more algorithm than human imitation, but blacklisting will likely always focus more on direct site content.