To allow or disallow?

Discussion in 'Lounge (non-StatCounter related topics here!)' started by Simmos, Aug 3, 2017.

  1. Simmos

    Simmos New Member

    Joined:
    Jul 6, 2014
    Messages:
    2
    Likes Received:
    0
    I have a small site with 25 pages currently indexed. The site also contains a further 300-odd HTML pages which are set to display inside a Lightbox window when a certain related link is clicked. These pages are simple product pages, all identical except for the images and H2 text, and they are currently listed as Disallow in the robots.txt file.

    My question is: should I submit these pages for indexing, or am I correct in thinking that Google would see them as duplicate content? I would like to include them in the index and so increase my site content, but not if there is a risk of them being treated as duplicate content.

    Here is a link to one of them: http://www.stubbycoolersonline.com.au/pops/683.html

    Thanks in advance

    Simmo
     
  2. webado

    webado Moderator

    Joined:
    Apr 29, 2004
    Messages:
    28,159
    Likes Received:
    1
    Well, unless you add more distinct and useful text to each of those pages, including a proper title tag and a description meta tag, as well as a navigation menu, there's no point in letting them get indexed; they will be considered thin content and be rather useless.
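    As a minimal sketch (the title and description text here are only placeholders, not taken from your site), the head of each product page could look something like this:

        <head>
          <title>Stubby Cooler Design 683 - Stubby Coolers Online</title>
          <meta name="description" content="A short, unique description of this particular stubby cooler design.">
        </head>

    The point is that each page gets its own title and description rather than sharing boilerplate with the other 300.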

    But rather than blocking them in the robots.txt file, you could add a robots noindex meta tag. The difference is that blocking in robots.txt does not prevent URLs from being indexed; they can still appear in the index with a note that they are blocked by robots.txt. A robots noindex meta tag means they will not get indexed, and if they are already indexed they will eventually be deindexed, but all of this only works provided they are not also blocked in the robots.txt file.
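    To illustrate the difference (assuming your lightbox pages all live under /pops/, as in your example link), the robots.txt blocking you have now looks something like this:

        User-agent: *
        Disallow: /pops/

    The noindex approach removes that Disallow line from robots.txt and instead adds this tag inside the head of each lightbox page:

        <meta name="robots" content="noindex">

    Remember that the noindex tag only works if the crawler is actually allowed to fetch the page, which is why the Disallow line has to go.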
     
