Google’s New Algorithm Against Content Theft

As mentioned in a recent HubSpot blog post, Google is launching a new initiative against stolen web content. Users will apparently be able to report sites for stealing content, and if the claim is upheld in court, the offending website will be de-indexed by Google.

The downside is that this only happens if it can be proven in court that a website did in fact steal copyrighted material from another site. Given how rampant content theft is, it's hard to imagine cases going to court very often. Still, it feels like a step in the right direction for the search engine giant.

As I’ve blogged about in the past, duplicate content hasn’t actually been penalized up to this point the way it should be, or the way Google says it is. Luckily, there are apparently also measures in place for falsely accused users to dismiss a claim, which should help prevent legitimate writers from being erroneously de-indexed.

Ultimately, as the above article said well, “…if you’re less than scrupulous about proper source attribution for your content, now is the time to get serious. Google’s watching.”

Thoughts on this?


5 Responses to Google’s New Algorithm Against Content Theft

  1. TaeWoo March 1, 2014 at 4:56 pm #

    Wow.. mind boggling. But you know who the BIGGEST beneficiary of content scraping is? Google… they write zero content yet make billions in ads.

    I’m also a blogger and an engineer, and I’ve noticed people quote websites without giving credit back to the content writer. So I came up with a little free tool that automatically gives a link back to bloggers and website owners when their content is copied and pasted onto social media sites.

    • Brian Watkins March 1, 2014 at 8:45 pm #

      Hadn’t considered it like that, but true! I guess the saving grace for them versus the average scraper is that most websites want Google to acknowledge and share their content, and in the case of ads even pay for it. There’s meaningful reciprocity, unlike sites that fashion themselves as something like news sites while they “curate and package everybody else’s cool stuff” without ever asking for permission.

      Thanks for sharing that! I was just reading your post “4 Really Stinkin Awful Ways To Market” and really enjoyed it. Your idea about the reverse image search is another clever measuring stick for authenticity on sites.

  2. Brian Watkins August 16, 2012 at 4:25 pm #

    Another gripe of mine is when unanswered or unsolved forum threads rank over everything else. You can be researching or troubleshooting and find a thread describing the exact issue you’re having, only to discover, annoyingly, that no one’s replied, or that a bunch of people have simply thrown in their “Yeah, me too” statements. Some forums mark threads as answered, and it’d be cool if Google could somehow de-prioritize unanswered threads, since a thread without an answer isn’t relevant to my search.

  3. Marty Watkins August 16, 2012 at 9:44 am #

    As I’ve said before, I get so sick and tired of looking through reviews on something only to find the exact same thing posted on so many different sites. It’s like they copy from someone else, post it on their own site, and try to make it look like their own work. It’s also telling that most of the posts appear on or around the same dates.

    I understand that not everyone can get their hands on the latest software or hardware to do a real review, but if they feel a real need to post something, they should give credit to where it came from.

    • Brian Watkins August 16, 2012 at 4:22 pm #

      Agreed. Even in scenarios where sites have an affiliate relationship that lets them redistribute each other’s content, it’s not really useful for Google searchers to see every one of those affiliates’ duplicated results down the first page.
