Google Algorithm Change This Past Week

I follow SitePro News, a super e-newsletter; you can get the feed by clicking our post title. In a recent issue, a very savvy author discussed the Google algorithm change that rolled out this past Thursday.

In the article, he mentions that Google has created a trust factor that preferentially places sites with older domains above those with new domains. He also mentions changes in the weighting of PageRank (shown on the green bar in the Google Toolbar and within the Google Sitemaps control panel) and of inbound links.

Clearly, over the next several days we will continue to see a shake-up in the index. Are all of these changes good things? Scrapping the PageRank indicator in the Google Toolbar is, I personally feel, a good thing: PageRank has been an area that can be gamed by search engine optimizers. It appears that TrustRank may be the next big factor, and it may be a better indicator of a site's real value and therefore a strong signal for good organic search placement.

Although this newsletter is not online yet on the SitePro News site, when it appears in the next several days it will be a must read. The article is titled “Google Algorithm Update Analysis” and is written by Dave Davies. You may not agree with everything he has to say, but if you have been following the various Google patent disclosures over the past year, his analysis makes sense given the technology Google has been actively patenting.

From my viewpoint, all of this just reaffirms that excellent, unique content on your website is important. If you take the time to create and build a great site, the work should not stop at launch: fresh content published on a regular basis through a blog or an e-newsletter builds authority over time.

There is no quick fix for great organic placement on search engines, but once it is achieved, you hit a tipping point and your business and market presence increase dramatically. Working deliberately to improve organic placement is crucial for any growing business.

Check to See If Someone Has Snatched Your Blog Content

A reader at my other blog, Web-World Watch, left this link, http://www.copyscape.com/, on a post about Google dinging sites for showing duplicate content.

I entered my own blog address in the tool and found sites that had snatched my content verbatim without supplying a link back or even identifying me as the author. In fact, they had passed the content off as their own, and had selected some of my highest-traffic posts!

I have notified them of copyright infringement! You should check your own content to see if you have a similar problem. If you are like me, you don’t mind if others quote you, show one or two paragraphs of a post with a link back to the full content, or contact you for approval. But to simply snatch content, provide no link back, and pass it off as their own intellectual property? Very bad form!

This is exactly the duplicate-content issue Google is targeting in one of its most recent patent disclosures: who should get the credit for duplicated content? Google is developing a way to identify the original author in cases like this. I would imagine it will revolve around the initial post date recorded by the web server plus a match against other content and the writing style on the site. Eventually I expect to see a trust certification that site owners can embed on their pages to tag their content for Google.

In the meantime, if you are scraping someone else’s blog content, please stop! It’s time to create your own. And if you aren’t scraping, check whether someone is scraping you at Copyscape.com.

Check to See If Your Content Has Been Copied

In a previous post, I noted that Google is really cracking down on duplicate content. All site owners should clean up their sites to make sure duplicate pages, such as printer-friendly versions, are blocked from spidering using the robots.txt file. This will help prevent Google from dinging your site for duplicate content.
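As a rough sketch, a robots.txt entry along these lines will keep spiders away from those pages. The paths here are just placeholders; adjust them to wherever your printer-friendly copies actually live:

    # robots.txt - block spiders from printer-friendly duplicates
    # (paths are placeholders; adjust to your own site's structure)
    User-agent: *
    Disallow: /print/
    Disallow: /printer-friendly/

The file must sit at the root of your domain for spiders to find it.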

I did get a comment from a reader pointing to a site where you can check whether someone has snatched or duplicated your content. Click my post title to visit Copyscape.com.

When I ran my own site through the tool, I found another site that had scraped several blog posts verbatim and passed the content off as its own. Hmm, that’s a copyright violation. I have notified the site! I do not mind if you mention my content or show one or two paragraphs, but you must link back to the full article on my site. To simply snatch my content and call it your intellectual property is wrong.

This is what the Google duplicate-content algorithm change is all about: identifying the legitimate owner and filtering other sites that show the same content out of the index. In some cases, Google identifies the rightful owner by post date and by authority. I believe that in the months or year to come we will even see a digital authority head tag, tied to the domain, that Google will pick up to verify the site owner.

In the meantime, watch your site for duplicate content, check to see who has scraped yours, and if you have scraped mine, please remove it or link back to my site and give me credit.

An Interesting Article on the Supplemental Index

Click our post title to read this interesting article on how to keep your blog out of Google’s Supplemental Index. The writer offers a useful tip on updating your .htaccess file to redirect all URLs to the www version of your domain. However, you can only consider doing this if you publish your blog via FTP to your own server, as many platforms allow; if your blog is hosted at Blogspot, you don’t have access to the server.
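The article’s exact rule isn’t reproduced here, but a typical .htaccess rewrite of this kind looks like the sketch below, assuming an Apache server with mod_rewrite enabled and example.com standing in for your own domain:

    # .htaccess - 301-redirect non-www requests to the www hostname
    # (example.com is a placeholder; requires Apache with mod_rewrite)
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

The permanent (301) redirect is the key: it tells Google that the www and non-www hostnames are the same page rather than duplicate content.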