The Google Panda update, first rolled out in February 2011, changed SEO practices for good. SEO is no longer only about doing quality keyword research, working those keywords into good content, making that content accessible to the popular search engines, and trying to get some links to it.

After the new algorithm rules were introduced, SEO experts and site owners had to start thinking about website design and user experience, creating only the highest quality content, the kind people like, comment on, and want to share, and optimizing around important user and usage metrics such as time on site/page, bounce and browse rates, and click-through rate from the search engine result pages.

These new SEO practices are good for the long term, but the problem site owners now face is what to do with the content that already exists on their sites. Here is what to find and correct on your website to avoid being hit and pushed down in the rankings by the Google Panda update:

True Duplicates

The first thing to fix on your website is duplicate content. If you have created several unique URLs for the same piece of content, remove these internal true duplicates. The more URLs that lead to a particular page on your site, the more Google will suspect you are trying to trick the algorithms, even if you had no such intention and were only trying to promote a certain article, for example.

If you syndicate your content across other domains, whether they are your own or other people's websites, Google treats it as an outside-site true duplicate, because all the different domains carrying the same content create 'noise' in the search results. There is also a lot of scraping on the Internet, and if your site authority is lower than that of the scraper sites, your content, even though it is the original, will end up lower in the search results. You will have to either improve your site authority or file a DMCA complaint.

If you own the domains where you syndicate your content, fix this by choosing one source URL and redirecting the other pages to it, or by pointing them at the source with a canonical tag. If you have distributed your articles across websites you don't own, solving the problem will take more time, as you will need to work things out with their webmasters.
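As an illustration, assuming an Apache server and a hypothetical example.com domain, a 301 redirect consolidates a duplicate URL onto the source, while a cross-domain canonical tag lets a syndicated copy credit the original:

```
# .htaccess on the duplicate domain: permanently redirect to the source URL
Redirect 301 /duplicate-article/ https://example.com/original-article/
```

```html
<!-- In the <head> of a syndicated copy: point Google at the source URL -->
<link rel="canonical" href="https://example.com/original-article/" />
```

The redirect is the stronger signal when you control both domains; the canonical tag is the fallback when a partner site will host the copy but agrees to credit the source.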

Near Duplicates

Some websites, competing for better positions in the search results for a specific geographic location or theme, use near-duplicate pages where most of the content is the same across all pages and only the strategic keywords and headers vary. Google now penalizes this, because it immediately signals low quality to the algorithms. For ecommerce websites this can be a significant problem, because they often have multiple pages for the same product in different colors and sizes.
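To get a rough sense of how close two pages are before deciding what to rewrite, you can compare word shingles. This is only an illustrative sketch, not how Google's algorithm works, and the shingle size and sample texts are assumptions:

```python
def shingles(text, k=3):
    """Return the set of k-word shingles (overlapping word windows) in text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard_similarity(a, b, k=3):
    """Jaccard similarity of the shingle sets of two texts (0.0 to 1.0)."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Two product descriptions that differ only in color: near duplicates.
red = "This classic cotton shirt in red fits true to size and washes well"
blue = "This classic cotton shirt in blue fits true to size and washes well"
print(jaccard_similarity(red, blue))  # a score between 0 and 1
```

Pages that score well above your chosen threshold are the candidates for rewriting.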

The best way to fix geo-location or theme-based duplicate content is to rewrite the existing copy so each page is fresh and original, because that is what Google loves most. The owner of an ecommerce website can create one page per product and keep the dynamically generated variant pages out of Google's index with meta noindex tags.
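For instance, a dynamically generated color or size variant page could carry a robots meta tag in its head section; this is a sketch, and 'follow' is an optional addition that lets crawlers still follow the page's links even though the page itself stays out of the index:

```html
<!-- On dynamically generated variant pages: exclude from the index -->
<meta name="robots" content="noindex, follow">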

External near duplicates, created when your partners and affiliates borrow content or product pages from your website, are also penalized by the Google Panda update. It is best to create new unique content for these pages and solve the problem before Google punishes your website.

Internal Search Results

When people search for something on the Internet, they want results that are relevant and full of useful information. That is what Google wants too: users should not have to run a second search inside a website's internal search results to find what they need. This mostly concerns the owners or SEO experts responsible for larger websites, and the solution is to block the internal search result pages from being crawled, or to noindex them.
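As an illustration, assuming internal search results live under a /search path with a ?q= query parameter (both hypothetical paths), a robots.txt rule at the site root keeps crawlers away from them:

```
# robots.txt
User-agent: *
Disallow: /search
Disallow: /*?q=
```

Note that robots.txt only stops crawling; if search result URLs are already indexed, a meta noindex tag on those pages is the more reliable way to remove them.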

Heavily Emphasized Web Design and Thin Copy

Websites with heavily structured designs that neglect the copy are also penalized by the Google Panda update: page space dominated by repetitive elements such as mega footers, dynamic content, repeated images, or excessive navigation signals low quality. To solve this, determine what your site design can live without and eliminate it. Tone down the design template and write some high quality, interesting, engaging, unique content instead, and both your users and Google will love it.

High Ad Ratio

Since the Google Panda algorithms are modeled on what quality raters said they like and dislike, and raters prefer fewer ads, Google gives a thumbs down to sites with a high ad ratio. Keep only your two or three highest performing ad placements and pull the ones that aren't working to make sure your website isn't punished by this update.

In addition, you can eliminate the lowest performing pages on your site, those with low visit rates or high bounce rates and few if any incoming links, and use tools such as PlagSpotter or CopyScape to monitor your pages and detect duplicate content.

Author Bio:
Austin Rinehart is the senior writer at PlagSpotter.com. He is married and has two lovely adult daughters. He looks for opportunities to publish on various topics such as internet trends, scientific research, and life-improvement strategies.