In the world of SEO, Black Hat SEO is the name we give to tactics and techniques focused on gaining better rankings in a not-so-ethical manner (the sites involved normally provide no value to the user), fooling search engines and exploiting weaknesses in their algorithms.
In Google’s case, it is Google that establishes the Webmaster Guidelines: what type of SEO is allowed and what is not. Google can always create new rules (updates), and sites that don’t respect them can be penalized and removed from its results.
These are some of the most common Black Hat SEO tactics:
Keyword stuffing
Repeating a single keyword all across our site just to rank for it.
Article spinning
Altering existing posts with automated software that replaces words with synonyms, producing dozens of “original” but low-quality copies.
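The spinning technique described above boils down to blind synonym substitution. A minimal sketch (the synonym table and sample text are invented for illustration; real spinners use large thesauri, which is exactly why their output reads so poorly):

```python
import random

# Hypothetical synonym table -- a real spinner would use a large thesaurus.
SYNONYMS = {
    "quick": ["fast", "rapid", "speedy"],
    "guide": ["tutorial", "walkthrough", "manual"],
    "improve": ["boost", "enhance", "increase"],
}

def spin(text, rng=random):
    """Replace known words with a randomly chosen synonym, word by word."""
    words = []
    for word in text.split():
        key = word.lower()
        if key in SYNONYMS:
            # Blind substitution: no grammar or context awareness,
            # which is why spun copies end up low quality.
            words.append(rng.choice(SYNONYMS[key]))
        else:
            words.append(word)
    return " ".join(words)

original = "A quick guide to improve your rankings"
# Each call yields a slightly different "original" copy of the same post.
print(spin(original))
```

Running `spin` repeatedly over the same article is how one post becomes dozens of near-duplicates.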
Hidden text
A very old trick: hiding text or keywords by giving them the same color as the background, or a very small font size.
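To make the hidden-text trick concrete, here is a small sketch that emits the two classic variants as HTML strings (the keyword text is invented for illustration):

```python
# Two classic hidden-text variants, emitted as HTML strings for illustration.
# Both are trivially detectable by modern search engines.

def hidden_by_color(keywords: str) -> str:
    """Text the same color as the background: invisible to human visitors."""
    return f'<p style="color:#fff;background-color:#fff">{keywords}</p>'

def hidden_by_size(keywords: str) -> str:
    """Text rendered at zero size: present in the HTML, unreadable on screen."""
    return f'<p style="font-size:0">{keywords}</p>'

print(hidden_by_color("cheap shoes buy cheap shoes"))
print(hidden_by_size("cheap shoes buy cheap shoes"))
```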
Spam Comments in Blogs
Posting worthless comments on blogs, whether automatically or manually, just to gain links to our site.
Cloaking
Showing one version of a page to search engines and another to users. The version shown to crawlers is over-optimized for the keywords they want to rank for.
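The tactic of serving crawlers different content than users usually hinges on checking the visitor’s User-Agent string. A minimal sketch (the bot signatures and page strings are placeholders; shown only to illustrate the mechanism, which directly violates Google’s guidelines):

```python
# Minimal sketch of user-agent cloaking: serve keyword-stuffed HTML to
# known crawlers and the normal page to everyone else.
# Illustration only -- this is exactly what gets sites penalized.

BOT_SIGNATURES = ("Googlebot", "Bingbot", "DuckDuckBot")

KEYWORD_STUFFED = "<h1>cheap shoes cheap shoes buy cheap shoes</h1>"
NORMAL_PAGE = "<h1>Welcome to our shoe store</h1>"

def render_page(user_agent: str) -> str:
    """Return over-optimized HTML for crawlers, the real page for users."""
    if any(bot in user_agent for bot in BOT_SIGNATURES):
        return KEYWORD_STUFFED
    return NORMAL_PAGE

print(render_page("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # stuffed version
print(render_page("Mozilla/5.0 (Windows NT 10.0)"))            # normal version
```

Search engines counter this by also crawling from unannounced user agents and IP ranges and comparing the two responses.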
Buy expired domains
Buying domains that have expired but still retain authority, either to build a new site on them or to redirect them wherever suits us.
Link farms
Building hundreds of low-quality blogs and sites whose only purpose is to link to other sites, improving their indexing and positioning and creating a large link pyramid.
Buying and selling links on high-PageRank sites to gain authority and improve rankings.
Many of these tactics are now outdated and are quickly detected by search engines, but they were effective in their day.
You could say that everyone who works in SEO manipulates search results, and that isn’t bad in itself. The problem is that Google wants to show users the best possible results: that is Google’s business, and ours is to improve the positioning of our clients’ sites. So it is highly advisable to use techniques that stay within Google’s guidelines. That said, let’s not forget that we should experiment and push against the algorithm, because if we don’t look for the limits, how will we ever know where they are? Just do it on test sites, so you don’t ruin anyone’s business.