To anyone who’s been writing on the Internet for any length of time, it’s no surprise that Matt Cutts constantly has to supply information about what is and isn’t considered duplicate content. Of course, if you copy and paste whole sections from other websites, you’re lighting a fuse. But there are other, less incendiary areas where legitimate online marketers and bloggers need some clarification, and the use of press releases is one of them.
Here’s a clarification to muddy the waters right off. This post deals with press releases that are used on websites to keep readers current with newsworthy events in their particular market. Here’s a link to one of my more recent efforts for Big4.com, where I am the content manager. Cutts recently posted a video about websites that seek out relevant press releases like the ones on Big4.com, and he warned against searching out keywords and then auto-generating the text. By auto-generating, I think he meant copying and pasting.
One of the points of interest in his post dealt with intent. Unless I’m reading him wrong, Cutts seems to be saying that it’s evident to Google when a content manager or writer is trying to scam the system. How? Simply by a large amount of content that rings the bells on Copyscape. Still, therein lies the problem for websites that use press release material in a relevant way. Big4.com, for instance, is a site that deals with the alumni of the Big Four firms like Accenture, PwC, Deloitte and Ernst & Young, and as such a constant flow of press release information is necessary to build followers, whether keywords and SEO are used or not.
With the large volume that passes through, some duplicate content is inevitable when you consider that these press releases are put out for consumption by the media, both print and online. From a journalism standpoint, for example, quotes can’t be altered, but good ones on hot topics in the financial arena will get used by hordes of online and other publications.
That’s why I think it all comes back to the original intent of the Google updates: they wanted to cut down on the rampant Black Hat SEO that was turning the web into a wasteland of half-intelligible babble. In other words, getting penalized for duplicate content doesn’t seem as ominous as it once did. In fact, it looks like as Google sorts out the boundaries, it will become clearer who has the Black Hats on and who is wearing the White ones.