Google announced last week that it was changing its algorithms as part of its battle to reduce web polluters, punishing websites that simply ‘farm content’ for the purpose of boosting their Google rankings.
“Our goal is simple: to give people the most relevant answers to their queries as quickly as possible. This requires constant tuning of our algorithms, as new content—both good and bad—comes online all the time.” (Source: http://www.iloubnan.info/technology/actualite/id/57465/titre/Google-fighting-%E2%80%9Cweb-polluters%E2%80%9D)
Google’s Battle to Reduce Web Polluters | Why change?
The most popular search engine is now hunting these ‘trash’ sites in order to reduce web polluters that produce keyword-based content solely to manipulate Google’s SEO algorithm for increased ad revenue.
These content-farm sites have overrun the Internet with pages built around keyword patterns but lacking any order or substance. They have made searching the Internet for quality, specific information increasingly difficult, and improving the user’s experience is the reason for Google’s battle to reduce web polluters.
As part of this battle, Google has changed its algorithm, and the search engine’s new rules push such articles off the first pages of results.
Initially, websites were classified only according to their code. Only the invisible, technical parts of a website were “read” by Google’s robots, which ranked it based on mathematical and structural indicators. Keywords mattered, but the subject matter and readability of the site were not taken into consideration. Many companies realized this and learned to take advantage of Google’s algorithm, producing articles solely to generate advertising revenue. They relied on freelance journalists to create more than 1,000 articles per day, with topics drawn from Google Trends, Facebook, Twitter, and the like. These articles were great for fooling Google, but their quality did not meet users’ expectations, motivating Google’s battle to reduce web polluters.
Google’s new algorithm ranks sites based on both these technical indicators and the actual content of the webpage, meaning that Google’s robots have now become capable of reading the text, identifying keywords, and then indexing the pages based on content. Therefore, if a site is simply regurgitating keyword-rich content that makes no sense, Google will be able to lower that web polluter’s rankings accordingly.
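To make the idea concrete, here is a toy sketch of the kind of signal a content-aware crawler might use: a naive keyword-density check that flags pages stuffed with a keyword far beyond natural usage. This is purely illustrative; the function names and the density threshold are hypothetical assumptions, not Google’s actual algorithm.

```python
# Toy illustration (NOT Google's actual algorithm): a naive
# keyword-density heuristic for spotting keyword-stuffed pages.
# The threshold value is a hypothetical assumption.
import re


def keyword_density(text: str, keyword: str) -> float:
    """Return the fraction of words in `text` matching `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)


def looks_stuffed(text: str, keyword: str, threshold: float = 0.2) -> bool:
    """Flag pages whose keyword density exceeds a (hypothetical) threshold."""
    return keyword_density(text, keyword) > threshold


natural = "Our guide explains how to brew coffee at home with simple tools."
stuffed = "coffee coffee best coffee cheap coffee buy coffee coffee deals"

print(looks_stuffed(natural, "coffee"))  # False
print(looks_stuffed(stuffed, "coffee"))  # True
```

In practice, a real ranking system would combine many signals (readability, link structure, duplication), but even this crude ratio separates the natural sentence from the keyword-stuffed one.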
These content-farm sites are now in the crosshairs of Google’s battle to reduce web polluters, and the new web robots should seek out and demote them, dropping them from the “top rankings”.
Under Creative Commons License: Attribution
Original Picture Source: http://blogoscoped.com/forum/123419.html
