
    The Moz Q&A Forum


    Noindexing Duplicate (non-unique) Content

    Intermediate & Advanced SEO
• khi5 last edited by

When "noindex" is added to a page, does this ensure Google does not count the page as part of its analysis of the unique vs. duplicate content ratio on a website? Example: I have a real estate business and I have noindex on MLS pages. However, is there a chance that even though Google does not index these pages, Google will still see those pages and think, "ah, these are duplicate MLS pages; we are going to let those pages drag down the value of the entire site and lower the ranking of even the unique pages"? I'd like to just use "noindex, follow" on those MLS pages, but would it be safer to add the pages to robots.txt as well? That should, in theory, increase the likelihood Google will not see such MLS pages as duplicate content on my website.

On another note: I had these MLS pages indexed and added "noindex, follow" 3-4 weeks ago. However, they are all still indexed, with no signs Google is deindexing them yet.
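For reference, the "noindex, follow" setup being discussed is a single meta tag in each MLS page's head; this is standard markup, shown here as a minimal illustration:

```html
<!-- On each MLS result page that should stay out of Google's index
     while still letting crawlers follow (and pass equity through) its links -->
<meta name="robots" content="noindex, follow">
```

One caveat on combining this with robots.txt: if a page is disallowed in robots.txt, Googlebot never fetches it and therefore never sees the noindex directive, so blocking crawling can actually delay or prevent deindexing.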
• AlanMosley last edited by

Canonical pages don't have to be the same; Google will merge the content to look like one page.

Good luck
• khi5 @AlanMosley last edited by

thx, Alan. I am already using rel=next/prev. However, that means all those paginated pages will still be indexed. I am adding "noindex, follow" to pages 2 to n and only leaving page 1 indexed. Canonical: I don't think that will work. Each page in the series shows different properties, which means pages 1 to n are all different.
• AlanMosley last edited by

OK, if you use follow, that will be fine, but I would look at canonical or next/previous first.
• khi5 @AlanMosley last edited by

I am trying to rank for those duplicate-looking MLS pages, since that is what users want (they don't want my guide pages with lots of unique data when they are searching "...for sale"). I will add unique data to page 1 of these MLS result pages. However, pages 2-50 will NOT change (they will stay duplicate-looking). If I have pages 1-50 indexed, the unique content on page 1 may look like a drop in the ocean to Google, and that is why I feel including "noindex, follow" on pages 2-50 may make sense.
• AlanMosley last edited by

That's correct.

You won't rank for duplicate pages, but unless most of your site is duplicate you won't be penalized.
• khi5 @AlanMosley last edited by

http://moz.com/blog/handling-duplicate-content-across-large-numbers-of-urls - that is Rand's Whiteboard Friday from a few weeks ago, and I quote from the transcript:

"So what happens, basically, is you get a page like this. I'm at BMO's Travel Gadgets. It's a great website where I can pick up all sorts of travel supplies and gear. The BMO camera 9000 is an interesting one because the camera's manufacturer requires that all websites which display the camera contain a lot of the same information. They want the manufacturer's description. They have specific photographs that they'd like you to use of the product. They might even have user reviews that come with those.

Because of this, a lot of the folks, a lot of the e-commerce sites who post this content find that they're getting trapped in duplicate content filters. Google is not identifying their content as being particularly unique. So they're sort of getting relegated to the back of the index, not ranking particularly well. They may even experience problems like Google Panda, which identifies a lot of this content and says, "Gosh, we've seen this all over the web and thousands of their pages, because they have thousands of products, are all exactly the same as thousands of other websites' other products.""
• AlanMosley last edited by

There is nothing wrong with having duplicate content. It becomes a problem when you have a site that is all, or almost all, duplicate or thin content.

Having a page that is on every other competitor's site will not harm you; you just may not rank for it.

But noindexing can cause loss of link juice, as all links pointing to non-indexed pages waste their link juice. Using "noindex, follow" will return most of this, but there is still no need to noindex.
• khi5 @AlanMosley last edited by

http://www.honoluluhi5.com/oahu-condos/ - this is an "MLS result page". That URL will soon have some statistics and it will be unique (I will include it in the index). All the paginated pages (2 to n) hardly have any unique content. It is a great layout and users love it (from an AdWords campaign: the average user spends 9 min and views 16 pages on the site), but since these are MLS listings (shared amongst thousands of Realtors), Google will see "ah, these are duplicate pages, nothing unique". That is why I plan to index page 1 (the URL listed above) but keep paginated pages like http://www.honoluluhi5.com/oahu-condos/page-2 as "noindex, follow". Also, I want to rank for this URL: http://www.honoluluhi5.com/oahu/honolulu-condos/, which is a sub-category of the first URL, and 100% of its content is exactly the same as the first URL's. So I will focus on indexing just the first page and not the paginated pages. Unfortunately, Google cannot see value in layout and design, and I can see how keeping all pages indexed could hurt my site.

Would be happy to hear your thoughts on this. I launched the site 4 months ago with more unique and quality content than 99% of the other firms I am up against, yet nothing has happened ranking-wise yet. I suspect all these MLS pages are the issue. Time will show!
• AlanMosley last edited by

If you noindex, I don't think next/previous will have any effect.

If they are different, and if the keywords are all important, why noindex?
• khi5 @Philip-DiPatrizio last edited by

Thx, Philip. I am using it already, but I thought adding "noindex, follow" to those paginated pages (on top of rel=next/prev) would increase the likelihood Google will NOT see all those MLS result pages as a bunch of duplicate content. Page 1 may look thin, but with some statistical data I will soon include, it is unique, and that uniqueness may offset the lack of indexed MLS result pages. Not sure if my reasoning is sound; would be happy to hear if you feel differently.
• Philip-DiPatrizio @khi5 last edited by

Sounds like you should actually be using rel=next and rel=prev.

More info here: http://googlewebmastercentral.blogspot.com/2011/09/pagination-with-relnext-and-relprev.html
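For readers following along, the rel=next/prev annotations from that Google post are link elements in the head of each page in the series; the URLs below reuse the /oahu-condos/ example from this thread, with an illustrative page number:

```html
<!-- In the <head> of http://www.honoluluhi5.com/oahu-condos/page-2 -->
<link rel="prev" href="http://www.honoluluhi5.com/oahu-condos/">
<link rel="next" href="http://www.honoluluhi5.com/oahu-condos/page-3">
```

The first page in the series carries only a rel="next" link and the last page only a rel="prev" link; that is how Google detects the bounds of the paginated set.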
• khi5 @AlanMosley last edited by

Hi Alan, thx for your comment. Let me give you an example, and if you have a thought, that'd be great:

1. Condos on Island: http://www.honoluluhi5.com/oahu-condos/
2. Condos in City: http://www.honoluluhi5.com/oahu/honolulu-condos/
3. Condos in Region: http://www.honoluluhi5.com/oahu/honolulu/metro-condos/

The properties on the result page for 3) are all in 2), and all properties within 2) are within 1). Furthermore, for each of those URLs, the paginated pages (2 to n) are all different, since each property is different, so using canonical tags would not be accurate. 1 + 2 + 3 are all important keywords.

Here is what I am planning: add some unique content to the first page in the series for each of those URLs and include just that first page in the index, but keep "noindex, follow" on pages 2 to n. The argument could be "your MLS result pages will look too thin and not rank", but the other way of looking at it is "with potentially 500 or more properties on each URL, a bit of stats on page 1 will not offset all the duplicate MLS data, so even though the page may look thin, only indexing page 1 is the best way forward".
• AlanMosley last edited by

Remember that if you noindex pages, any link you have on your site pointing to those pages is wasting its link juice.

This looks like a job for the canonical tag.
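As a sketch of what Alan is suggesting, assuming the city-level page were treated as a duplicate of the island-level page (whether that is appropriate for these particular pages is debated elsewhere in this thread):

```html
<!-- In the <head> of http://www.honoluluhi5.com/oahu/honolulu-condos/ -->
<link rel="canonical" href="http://www.honoluluhi5.com/oahu-condos/">
```

Google treats rel=canonical as a strong hint rather than a directive, and it is only intended for pages whose content substantially duplicates the canonical target.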
• khi5 @Philip-DiPatrizio last edited by

lol - good answer, Philip. I hear you. What makes it difficult is the lack of crystal-clear guidelines from the search engines. It is almost like they don't know themselves and each case is decided on a "what feels right" basis.
• Philip-DiPatrizio @khi5 last edited by

Good find. I've never seen this part of the help section. The recurring reason behind all of their examples seems to be "You don't need to manually remove URLs; they will drop out naturally over time."

I have never had an issue, nor have I ever heard of anyone having an issue, removing URLs with the Removal Tool. I guess if you don't feel safe doing it, you can wait for Google's crawler to catch up, although it could take over a month. If you're comfortable waiting it out, have no reason to rush it, AND feel like playing it super safe... you can disregard everything I've said 🙂

We all learn something new every day!
• khi5 @Philip-DiPatrizio last edited by

Based on Google's own guidelines, it appears to be a bad idea to use the removal tool under normal circumstances (which I believe my site falls under): https://support.google.com/webmasters/answer/1269119

It starts with: "The URL removal tool is intended for pages that urgently need to be removed—for example, if they contain confidential data that was accidentally exposed. Using the tool for other purposes may cause problems for your site."
• khi5 @Philip-DiPatrizio last edited by

thx, Philip. Most helpful. I will get on it.
• Philip-DiPatrizio @khi5 last edited by

Yes. It will remove /page-52 and EVERYTHING that exists in /oahu/honolulu/metro/waikiki-condos/. It will also remove everything that exists in /page-52/ (if anything). It trickles down as far as the folders in that directory go.

**Go to Google search and type this in:** site:honoluluhi5.com/oahu/honolulu/metro/waikiki-condos/

That will show you everything that's going to be removed from the index.
• Philip-DiPatrizio @khi5 last edited by

Yep, you got it.

You can think of it exactly like Windows folders, if that helps. If you have C:\Website\folder1 and C:\Website\folder12, "noindexing" \folder1\ would leave \folder12\ alone, because they're not the same directory.
• khi5 @Philip-DiPatrizio last edited by

For some MLS result pages I have a BUNCH of pages, and I want to remove them from the index with one click as opposed to having to include each paginated page. Example: for http://www.honoluluhi5.com/oahu/honolulu/metro/waikiki-condos/page-52, I simply include "/oahu/honolulu/metro/waikiki-condos/" and that will ALSO remove that /page-52 URL from the index - is that correct?
• khi5 @Philip-DiPatrizio last edited by

Removing directory "/oahu/waianae-makaha-condos/" will NOT remove "/oahu/waianae-makaha/maili-condos/", because the silos "waianae-makaha" and "waianae-makaha-condos" are different.

HOWEVER, removing directory "/oahu/waianae-makaha/maili-condos/" will remove "/oahu/waianae-makaha/maili-condos/page-2", because they share the same directory.

Is that correctly understood?
• Philip-DiPatrizio @khi5 last edited by

Yep. Just last week I had an entire website deindexed (on purpose; it's a staging website) by entering just / into the box and selecting "directory". By the next morning the entire website was gone from the index 🙂

It works for folders/directories too. I've used it many times.
• khi5 @khi5 last edited by

So I will remove the directory "/oahu/waianae-makaha/maili-condos/" and that will ensure removal of "/oahu/waianae-makaha/maili-condos/page-2" as well?
• khi5 @Philip-DiPatrizio last edited by

thx, Philip. So you are saying that if I use the directory option, that will ensure the paginated pages will also be taken out of the index, like this page: /oahu/waianae-makaha/maili-condos/page-2
• Philip-DiPatrizio @khi5 last edited by

I'm not 100% sure Google will understand you if you leave off the slashes. I've always added them and have never had a problem, so you want to type: /oahu/waianae-makaha-condos/

Typing that would NOT include the neighborhood URL in your example. It will only remove everything that exists in the /waianae-makaha-condos/ folder (including that main category page itself).

edit >> To remove the neighborhood URL and everything in that folder as well, type /oahu/waianae-makaha/maili-condos/ and select the option for "directory".

edit #2 >> I just want to add that you should be very careful with this. You don't want to use the directory option unless you're 100% sure there's nothing in that directory that you want to stay indexed.
• khi5 @Philip-DiPatrizio last edited by

thx. I have a URL like this for a REGION: http://www.honoluluhi5.com/oahu/waianae-makaha-condos/ and for a NEIGHBORHOOD I have this: http://www.honoluluhi5.com/oahu/waianae-makaha/maili-condos/

As you can see, the Region has the "waianae-makaha-condos" directory, whereas the Neighborhood has "waianae-makaha", without the "condos", in that part of the URL.

Question: when I go to GWT to remove, can I simply type "oahu/waianae-makaha-condos", select the directory option, and that will ALSO exclude the neighborhood URL? Or, since the region part of the neighborhood URL is different, do I have to submit it individually?
• Philip-DiPatrizio @khi5 last edited by

Yep! After you remove the URL or directory of URLs, there is a "Reinclude" button you can get to. You just need to switch your "Show:" view so it shows URLs removed. The default is to show URLs PENDING removal; once they're removed, they will disappear from that view.
• khi5 @Philip-DiPatrizio last edited by

Good one, Philip. Last BIG question: if I remove URLs in GWT, is it possible to "unremove" them without issue? I am planning to index some of these MLS pages in the future when I have more unique content on them.
• Philip-DiPatrizio last edited by

When "noindex" is added to a page, does this ensure Google does not count the page as part of its analysis of the unique vs. duplicate content ratio on a website? Yes, that will tell Google that you understand the pages don't belong in the index. They will not penalize your site for duplicate content if you're explicitly telling Google to noindex them.

Is there a chance that even though Google does not index these pages, Google will still see those pages and think "ah, these are duplicate MLS pages, we are going to let those pages drag down the value of the entire site and lower the ranking of even the unique pages"? No, there's no chance these will hurt you if they're set to noindex. That is exactly what the noindex tag is for. You're doing what Google wants you to do.

I'd like to just use "noindex, follow" on those MLS pages, but would it be safer to add the pages to robots.txt as well, to increase the likelihood Google will not see such MLS pages as duplicate content? You could add them to your robots.txt, but that won't improve your odds, because there is already no worry about being penalized for pages that aren't indexed.

On another note: I had these MLS pages indexed and 3-4 weeks ago added "noindex, follow"; however, they are still all indexed. Donna's advice is perfect here: use the Remove URLs tool. Every time I've used the tool, Google has removed the URLs from the index in 12-24 hours. I of course made sure to have a noindex tag in place first. Just make sure you enter everything AFTER the TLD (.com, .net, etc.) and nothing before it. Example: you'd want to ask Google to remove /mls/listing122 but not example.com/mls/listing122. The latter will not work properly because Google automatically adds "example.com" to it (they just don't make this very clear).
• khi5 @DonnaDuncan last edited by

thx, Donna. My question was mainly around whether Google will NOT consider the MLS pages duplicate content once I place "noindex" on them. We can all guess, but does anyone have anything concrete on this to make me understand the reality of it? Can we say with 90% certainty, "yes, if you place noindex on a duplicate content page, then Google will not consider it duplicate content, hence it will not count towards how Google views the site's duplicate vs. unique content"? This is the big question. If we are left in uncertainty, then the only way forward may be to password-protect such pages and not offer them to users without an account.

Removal in GWT: I plan to index some of these MLS pages in the future (when I get more unique content on them), and I am concerned that once they are submitted to GWT for removal, it will be tough to get such pages indexed again.
• DonnaDuncan last edited by

Hi khi5,

I think excluding those MLS listings from your site using the robots.txt file would be overkill.

As I'm sure you well know, Google does what it wants. I think tagging the pages you don't want indexed with "noindex, follow" AND adding them to the robots.txt file doesn't make the likelihood that Google will respect your wishes any higher. You might want to consider canonicalizing them, though, so links to, bookmarks of, and shares of said pages get credited to your site.

As to how long it takes for Google to deindex said pages, it can take a very long time. In my experience, "a very long time" can run 6-8 months. You do have the option, however, of using Google Webmaster Tools > Google Index > Remove URLs to ask to have them deindexed faster. Again, no guarantees that Google will do as you ask, but I've found them to be pretty responsive when I use the tool.

I'd love to hear if anyone else feels differently.
                                                                    • 1 / 1
                                                                    • First post
                                                                      Last post

                                                                    Got a burning SEO question?

                                                                    Subscribe to Moz Pro to gain full access to Q&A, answer questions, and ask your own.


                                                                    Start my free trial


                                                                    Browse Questions

                                                                    Explore more categories

                                                                    • Moz Tools

                                                                      Chat with the community about the Moz tools.

                                                                    • SEO Tactics

                                                                      Discuss the SEO process with fellow marketers

                                                                    • Community

                                                                      Discuss industry events, jobs, and news!

                                                                    • Digital Marketing

                                                                      Chat about tactics outside of SEO

                                                                    • Research & Trends

                                                                      Dive into research and trends in the search industry.

                                                                    • Support

                                                                      Connect on product support and feature requests.

                                                                    • See all categories

                                                                    Related Questions

                                                                    • chalet

                                                                      Same content, different languages. Duplicate content issue? | international SEO

Hi, if the "content" is the same but written in different languages, will Google see the articles as duplicate content? And if Google won't see it as duplicate content, what is the benefit of implementing the alternate lang (hreflang) tag? Kind regards, Jeroen

                                                                      Intermediate & Advanced SEO | | chalet
                                                                      0
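On the hreflang point raised above: alternate-language annotations are typically placed in the head of every language version, telling Google the pages are translations of one another rather than duplicates. A minimal sketch, with placeholder URLs:

```html
<!-- Placed on both the English and Dutch versions of the article
     (example.com and the paths are hypothetical): -->
<link rel="alternate" hreflang="en" href="https://example.com/en/article/">
<link rel="alternate" hreflang="nl" href="https://example.com/nl/artikel/">
```

Each version must list all versions, including itself, and the annotations must be reciprocal for Google to honor them.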
                                                                    • webmethod

                                                                      Duplicate Content / Canonical Conundrum on E-Commerce Website

Hi all, I’m looking for some expert advice on the use of canonicals to resolve duplicate content on an e-commerce site. I’ve used a generic example to explain the problem (I do not really run a candy shop).

SCENARIO: I run a candy shop website that sells candy dispensers and the candy that goes in them. I sell about 5,000 different models of candy dispensers and 10,000 different types of candy. Much of the candy fits in more than one candy dispenser, and some candy dispensers fit exactly the same types of candy as others. To make things easy for customers who need to fill up their candy dispensers, I provide a “candy finder” tool on my website which takes them through three steps:

1. Pick your candy dispenser brand (e.g. Haribo)
2. Pick your candy dispenser type (e.g. soft candy or hard candy)
3. Pick your candy dispenser model (e.g. S4000-A)

RESULT: The customer is then presented with a list of candy products that they can buy, on a URL like this: Candy-shop.com/haribo/soft-candy/S4000-A. All of these steps are presented as HTML pages with followable/indexable links.

PROBLEM: There is a duplicate content issue with the results pages, because a lot of the candy dispensers fit exactly the same candy (e.g. S4000-A, S4000-B and S4000-C). This means the content on these pages is basically the same, because the same candy products are listed. I’ll call these the “duplicate dispensers”, e.g.:

Candy-shop.com/haribo/soft-candy/S4000-A
Candy-shop.com/haribo/soft-candy/S4000-B
Candy-shop.com/haribo/soft-candy/S4000-C

The page titles/headings change based on the dispenser model, but that’s not enough for the pages to be deemed unique by Moz. I want to drive organic traffic from searches for the dispenser model + candy keywords, but with duplicate content like this I’m guessing these dispenser pages are being held back from ranking.

SOLUTIONS:

1. Write unique content for each of the duplicate dispenser pages. Manufacturers add or discontinue about 500 dispenser models each quarter, and I don’t have the resources to keep on top of this content. I would also question the real value of this content to a user when it’s pretty obvious what the products on the page are.
2. Pick one duplicate dispenser to act as the rel=canonical target and point all of its duplicates at it. This doesn’t work, as dispensers get discontinued, so I run the risk of randomly losing my canonicals, or of them changing as models become unavailable.
3. Create a single page listing all of the duplicate dispensers, and canonical all of the individual duplicate pages to that page, e.g.:

Canonical: candy-shop.com/haribo/soft-candy/S4000-Series
Duplicates (which all point to the canonical):
candy-shop.com/haribo/soft-candy/S4000-Series?model=A
candy-shop.com/haribo/soft-candy/S4000-Series?model=B
candy-shop.com/haribo/soft-candy/S4000-Series?model=C

PROPOSED SOLUTION: Option 3. Anyone agree/disagree, or have any other thoughts on how to solve this problem? Thanks for reading.
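In markup terms, the series-page approach (option 3 in the question above) would look something like this on each model page; the URLs are taken from the asker's hypothetical example:

```html
<!-- On candy-shop.com/haribo/soft-candy/S4000-Series?model=A
     (and likewise on ?model=B and ?model=C): -->
<link rel="canonical" href="http://candy-shop.com/haribo/soft-candy/S4000-Series">
```

The parameterized pages then consolidate their signals onto the one series URL, which survives individual models being discontinued.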

                                                                      Intermediate & Advanced SEO | | webmethod
                                                                      0
                                                                    • iam-sold

                                                                      Duplicate Content: Is a product feed/page rolled out across subdomains deemed duplicate content?

A company has a TLD (top-level domain, i.e. its main domain) which lists every single product: company.com/product/name.html. The company also has subdomains (tailored to a range of products) which list a chosen selection of the products from the TLD, sort of like a feed: subdomain.company.com/product/name.html. The content on the TLD and subdomain product pages is exactly the same and cannot be changed; the CSS and HTML are slightly different, but the content (text and images) is identical! My concern (and rightly so) is that Google will deem this to be duplicate content, so I'm going to have to add a rel=canonical tag into the header of all subdomain pages, pointing to the original product page on the TLD. Does this sound like the correct thing to do? Or is there a better solution? Moving on: not only are products fed onto the subdomains, there are also a handful of other domains which list the products, and again the content (text and images) is exactly the same: other.com/product/name.html. Would I be best placed to add a rel=canonical tag into the header of the product pages on those other domains, pointing to the original product page on the actual TLD? Does rel=canonical work across domains? Would the product pages with a rel=canonical tag in the header still rank? Let me know if there is a better solution all round!

                                                                      Intermediate & Advanced SEO | | iam-sold
                                                                      0
                                                                    • fablau

Last Panda: removed a lot of duplicated content but still no luck!

Hello here, my website virtualsheetmusic.com has been hit several times by Panda since its inception back in February 2011. Five weeks ago we decided to get rid of about 60,000 thin, almost-duplicate pages via noindex meta tags and canonicals (we have not physically removed those pages from our site with a 404, because our users may search for those items on our own website), so we expected this last Panda update (#25) to give us some traffic back... Instead we lost an additional 10-12% of traffic from Google, and now the site looks even more badly targeted. Let me say how disappointing this is after so much work! I must admit that we still have many pages that may look like thin and duplicate content, and we are considering removing those too (but those are actually giving us sales from Google!), but I expected to recover a little bit from this last Panda and improve our positions in the index. Instead, nothing; we have been hit again, and badly. I am pretty desperate, and I am afraid I have lost the compass here. I am particularly afraid that the removal of over 60,000 pages from the index via noindex meta tags has, for some unknown reason, been more damaging than beneficial. What do you think? Is it just a matter of time? Am I on the right path? Do we need to wait just a little bit more and keep removing (via noindex meta tags) duplicate content and improving all the rest as usual? Thank you in advance for any thoughts.

                                                                      Intermediate & Advanced SEO | | fablau
                                                                      0
                                                                    • MarloSchneider

                                                                      Does having a page that ends with ? cause duplicate content?

I am working on a site that has lots of dynamic parameters. So let's say we have www.example.com/page?parameter=1. When the page has no parameters you can still end up at www.example.com/page? Should I redirect this to www.example.com/page/ ? I'm not sure if Google ignores this, or if these pages need to be dealt with. Thanks

                                                                      Intermediate & Advanced SEO | | MarloSchneider
                                                                      0
                                                                    • AlightAnalytics

                                                                      Duplicate content for swatches

My site is showing a lot of duplicate content in SEOmoz. I have discovered it is because the site has a lot of swatches (colors for laminate) within iframes. Those iframes all have the same content except for the actual swatch image and the title of the swatch. For example, these are two of the links that are showing up with duplicate content: http://www.formica.com/en/home/dna.aspx?color=3691&std=1&prl=PRL_LAMINATE&mc=0&sp=0&ots=&fns=&grs= http://www.formica.com/en/home/dna.aspx?color=204&std=1&prl=PRL_LAMINATE&mc=0&sp=0&ots=&fns=&grs= I do want each individual swatch to show up in search results, and they currently do if you search for the exact swatch name. Is the fact that they all have duplicate content affecting my individual rankings and my domain authority? What can I do about it? I can't really afford to put unique content on each swatch page, so is there another way to get around it? Thanks!

                                                                      Intermediate & Advanced SEO | | AlightAnalytics
                                                                      0
                                                                    • grayloon

                                                                      Mobile Site - Same Content, Same subdomain, Different URL - Duplicate Content?

I'm trying to determine the best way to handle my mobile commerce site. I have a desktop version and a mobile version using a 3rd-party product called CS-Cart. Let's say I have a product page. The URLs are:

mobile: store.domain.com/index.php?dispatch=categories.catalog#products.view&product_id=857
desktop: store.domain.com/two-toned-tee.html

I've been trying to find information on how to handle mobile sites with different URLs with regard to duplicate content. However, most of the results assume that the different URL means m.domain.com rather than the same subdomain with a different address. I am leaning towards using a canonical URL, if possible, on the mobile store pages. I see quite a few people suggesting not to do this, but again, I believe that's because they assume we are just talking about m.domain.com vs www.domain.com. Any additional thoughts on this would be great!

                                                                      Intermediate & Advanced SEO | | grayloon
                                                                      0
                                                                    • erangalp

                                                                      ECommerce syndication & duplicate content

We have an eCommerce website with original software products. We want to syndicate our content to partner and affiliate websites, but are worried about the effect of duplicate content all over the web. Note that this is a relatively high-profile project, where thousands of sites will be listing hundreds of our products with the exact same name, description, tags, etc. We read the wonderful and relevant post by Kate Morris on this topic (here: http://mz.cm/nXho02) and we realize that duplicate content is never the best option. Some concrete questions we're trying to figure out: 1. Are we risking penalties of any sort? 2. We can potentially get tens of thousands of links from this concept, all with duplicate content around them, but from PR3-6 sites, some with lots of authority. What will affect our site more: the quantity of mediocre links (good) or the duplicate content around them (bad)? 3. Should we sacrifice SEO for a good business idea?

                                                                      Intermediate & Advanced SEO | | erangalp
                                                                      0
                                                                    © 2021 - 2023 SEOMoz, Inc., a Ziff Davis company. All rights reserved. Moz is a registered trademark of SEOMoz, Inc.
