Robot.txt error really starting to p*ss me off! Help Please

Discussion in 'Search Engine Optimization' started by rjd1265, Jan 31, 2012.

  1. rjd1265

    Member

    Joined:
    Jan 27, 2012
    Messages:
    94
    Likes Received:
    13
    View attachment 1124

    The image may be hard to read, but it says "robot.txt unreachable".

    I had the webmaster error stating "fatal health (or whatever): robot.txt not found".

    So I went into my site's webmaster page, created the robot.txt file allowing all pages, and then uploaded it to my hosting company.

    The error went away, but it still shows my index page and other pages on the site as unreachable?

    Did I do something wrong, or does it just take time for the bots to find the pages again?
     
  2. bretttina

    Member

    Joined:
    Apr 19, 2011
    Messages:
    213
    Likes Received:
    20
    First, did you limit the files the spiders can crawl?

    Second, did you add your XML sitemap to the txt?

    Third, try re-submitting a sitemap and then have a look.
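    (If you want to double-check what your robots.txt actually allows before uploading it, Python's built-in parser can test it locally. This is just a sketch; the rules and URLs below are placeholders, not your real site.)

    ```python
    from urllib.robotparser import RobotFileParser

    # Placeholder robots.txt rules, one string per line
    rules = [
        "User-agent: *",
        "Disallow: /scripts/",
    ]

    rp = RobotFileParser()
    rp.parse(rules)

    # Check whether a given URL is allowed for a given user-agent
    print(rp.can_fetch("*", "http://www.example.com/index.html"))    # allowed
    print(rp.can_fetch("*", "http://www.example.com/scripts/x.js"))  # blocked
    ```

    If a page you expect to be crawlable comes back blocked here, the problem is in your rules rather than on Google's end.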

    Hope this helps.

    Brett.
     
  3. rjd1265

    I did not limit anything, I don't think.
    Not sure how to add my sitemap, as Webmaster Tools does not give me the option.

    This is the screen I get (below). I clicked the top button (Allow all).
    Then in section 2 I set Allow and the user-agent.

    Is this correct?

    View attachment 1125
     
  4. bretttina

    OK, to add your sitemap and set restrictions:

    Open your preferred plain text editor (Notepad or similar; avoid word processors like Microsoft Word, which add formatting) and save the file as 'robots.txt'.
    Start with your usual robots text:

    User-agent: *
    Disallow:

    Blocking the spiders from your whole site is also one of the options. To do this, add these two lines to the file:

    User-agent: *
    Disallow: /

    If you'd like to block the spiders from certain areas of your site it might look something like this:

    User-agent: *
    Disallow: /database/
    Disallow: /scripts/

    Now to add your sitemap:

    Where you generated your sitemap, you should be able to "view or download" it. Copy the sitemap's URL and add it to your "robots.txt" on its own Sitemap: line, then simply upload the file to your root directory.
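    Put together, a minimal robots.txt that allows everything and points crawlers at the sitemap might look like this (the URL is only a placeholder; swap in your own sitemap location):

    User-agent: *
    Disallow:

    Sitemap: http://www.example.com/sitemap.xml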
     
  5. band

    Member

    Joined:
    Oct 15, 2011
    Messages:
    183
    Likes Received:
    36
    It is robots.txt, not robot.txt. Missing an "s" there!!
     
  6. bretttina

    I have put the "s" in where it says:

    Open your preferred plain text editor (Notepad or similar) and save the file as 'robots.txt'.
    Start with your usual robots text:
     
  7. rjd1265



    I want my entire site crawled, so would I just do:

    User-agent: *
    Allow: /


    And I added my sitemap to Google Webmaster Tools. Is that good enough, or do I still need to add the Sitemap line under the Allow: / line?

    Thanks for all your help. This error never came up for me in 5 years, and I'm not sure why it did now.
     
  8. bretttina

    You don't have to add your sitemap, but when the robots/spiders crawl your "robots.txt" file
    they will automatically find your sitemap, and your pages should be indexed faster.
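    For example, an allow-everything robots.txt with the sitemap referenced would look like this (again, the URL is a placeholder; use your own sitemap's address):

    User-agent: *
    Allow: /

    Sitemap: http://www.example.com/sitemap.xml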
     
