How to maintain your position in the SERP

What is shown in the SERP instead of the site?

From various comparisons and analyses with colleagues and friends around the web, I can say that in place of the “downgraded” sites the search results usually propose rather peculiar pages, in addition to sites that genuinely “merit” their ranking:

– Directory listings containing a backlink to the fluctuating site (even though the site itself appears to be included in the directory)
– Spam sites (engine spam, sites using black-hat techniques, etc.)
– Sites with duplicate content
– Sites without content that is valid and relevant to the end user (for example, an e-commerce site that carries similar products but not the ones actually searched for)

What kinds of sites have been affected?

Unlike previous fluctuations, which involved new sites with little popularity and low TRUST, the fluctuation at the end of June struck several very authoritative sites with hundreds of backlinks and content that was valid both for SEO and for users. Moreover, there seems to be no pattern in the nature of the affected sites. Indeed, the downgraded websites included:
– E-commerce
– Showcase / Institutional
– With or without AdSense
– From industrial to adult content
– Etc.

When do these fluctuations occur?

As fate would have it, the beginning or end of these fluctuations coincides with the PageRank export/update. Much more difficult is finding the link between these two actions carried out “simultaneously” by Google. In any case, you should check your SERP positioning regularly, for instance through a Google keyword ranking API.
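As a minimal sketch of such a check, here is how you might poll your position in Python against the Google Custom Search JSON API (API_KEY and CX are hypothetical placeholders for credentials you would obtain from Google; note that a Programmable Search Engine does not reproduce the live SERP exactly, so treat the result as an approximation):

import requests

API_KEY = "YOUR_API_KEY"  # hypothetical placeholder: your Custom Search JSON API key
CX = "YOUR_ENGINE_ID"     # hypothetical placeholder: your Programmable Search Engine ID

def serp_position(keyword, domain, max_results=50):
    """Return the 1-based position of `domain` for `keyword`, or None if not found."""
    for start in range(1, max_results + 1, 10):  # the API returns at most 10 results per page
        resp = requests.get(
            "https://www.googleapis.com/customsearch/v1",
            params={"key": API_KEY, "cx": CX, "q": keyword, "start": start},
            timeout=10,
        )
        resp.raise_for_status()
        items = resp.json().get("items", [])
        for offset, item in enumerate(items):
            if domain in item.get("link", ""):
                return start + offset
        if len(items) < 10:  # no further result pages
            break
    return None

# Example: track the same keyword daily and watch for sudden drops
print(serp_position("link building", "example.com"))

Running this for the same keywords every day and logging the output is enough to spot a sudden fluctuation early.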

Why do these sites suffer such penalties?

Hard to say, but what I can say is that the idea of “cleaning” the SERP is no longer credible, given that the timescales are too long and it would not guarantee a good service to the end user. More probably, these fluctuations are due to the inclusion/activation of new algorithms that Google tests, retries and re-evaluates against the results obtained in the SERP.

But how can a website resolve this situation? My suggestion is always the same: continue promoting the site as if nothing had happened (I know it is difficult, but right now there are no immediate remedies), trying to create original content and to collect spontaneous links, even from sites with high TRUST (we will see later how to recognize and capture the trust of a site). The ideal would be to become independent from Google by building a community that is always present on the site, through a blog, a forum or other innovative services that earn users’ loyalty.
