How To Protect Search Rankings From Algorithm Changes?
Your rankings will improve. They will also get worse. Many people rush to change things the moment the algorithms shift. Sometimes the search engines roll out new algorithms aggressively and later roll them back; they cannot fight off new forms of spam, or learn how aggressive a new algorithm should be, unless they sometimes go too far with it.
If you are unsure of what just happened, then you may not want to start changing things until you figure it out. Sometimes when algorithms are rolled back or made less aggressive, many sites still do not rank well because their webmasters changed things that were helping them. Nobody is owed a good rank, and just because a ranking temporarily changes does not mean that a site has been penalized. It is far more likely that the ranking criteria shifted and the site may not match the new ranking criteria as well as it matched the old ranking criteria.
One of the greatest SEO techniques is knowing when to do nothing at all. I had one client with whom I shared profit, but for whom I did not do much work after the first few months. Why? After I built his site up, he had a strong market position. I could have kept building many links, but it would not help him reach much more of the market. It would have added nothing but cost and risk. If you are too aggressive, it adds to the risk profile without adding much on the reward side.
All SEO techniques are just a balance of risk versus reward. While you want to rank at or near the top of the search results, if you intend to build a site for long-term profits you probably do not want to use techniques that are exceptionally aggressive compared to those of the other top-ranking sites.
Common SEO Abuse Techniques
There is no such thing as a perfectly optimized page. Search engines do not want to return the most optimized page, but the page that best satisfies the searcher’s goals.
If you have a page title and H1 header that are exactly the same, and all of your internal links and all of your inbound links from other sites pointing to that page use that same text, then that looks suspicious (like attempted ranking manipulation). As a result, the search engines may de-weight that or filter that out of the search results.
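As a rough illustration only (this is a hypothetical heuristic, not any search engine's actual algorithm), you could sketch such an over-optimization check as a score that rises when the inbound anchor text profile is suspiciously uniform and the title and H1 are identical:

```python
from collections import Counter

def over_optimization_score(title, h1, anchor_texts):
    """Hypothetical signal: the fraction of anchors that exactly match
    the page title, plus a bump when the title and H1 are identical.
    A higher score suggests a suspiciously uniform footprint."""
    counts = Counter(t.strip().lower() for t in anchor_texts)
    total = sum(counts.values())
    exact = counts.get(title.strip().lower(), 0)
    score = exact / total if total else 0.0
    if title.strip().lower() == h1.strip().lower():
        score += 0.2  # identical title and H1 adds to the footprint
    return round(score, 2)
```

A page whose title, H1, and nine of ten anchors all read "Cheap Widgets" would score far higher than one with a naturally mixed link profile, which is the point of varying your anchor text.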
How do you minimize your risks and make your site more stable? It’s best to mix things up a bit and create something that markets itself. Or, try looking at things like a search engine engineer would.
There is a concept called poison words: if a page contains phrases like link exchange, add URL, or link partners, there is a good chance a search engine may place less weight on that page or its outbound links. In the past, common poison words included things like forum and guestbook. The more likely the content is to be low quality or related to spam, the more likely search engines are to de-weight it.
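A minimal sketch of the idea, with an illustrative word list (the actual phrases and weights any engine uses are not published):

```python
# Hypothetical poison-phrase list; real engines' criteria are unknown.
POISON_PHRASES = {"link exchange", "add url", "link partners", "guestbook"}

def poison_phrase_hits(page_text):
    """Return the poison phrases found in the page text, sorted.
    A page with several hits might have its outbound links de-weighted."""
    text = page_text.lower()
    return sorted(p for p in POISON_PHRASES if p in text)
```

The takeaway is not to run this exact check, but to notice that boilerplate reciprocal-linking language is an easy pattern for an engine to detect.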
Search engines may penalize common spam footprints, such as spammy sites' typical use of an H1 header, so spammers respond by using an H2 header as the highest-level heading instead. Or the engines may de-weight site-wide links near the end of the page code that point to the home page using exactly the same link text as the home page's title, so you respond by linking to the home page earlier in the page code and/or using anchor text that differs slightly from your page title and the rest of your link profile.
Keep in mind that some search relevancy algorithms are genetic algorithms that train themselves to test the relevancy of new result sets, but humans still program them. Google tends to be biased toward informational resources, while Yahoo! is more biased toward commerce. These biases can affect optimization as well.
If you think like a search engineer, the techniques that are common in SEO but uncommon on regular websites are the most likely to be de-weighted or penalized. Remember that optimizing content is about matching quality signals, but if you match too many of them too closely, it can send a negative signal.