i was getting about 60 pages a week in the error log from bots looking for things like the portal page, which no longer exists, and a couple of other pages. i looked around on the net to find out what to do with the robots.txt file, as i had never had one before. here's what i found after many searches...
all of the settings were on one line... this is wrong. here's what ended up in my robots.txt file as the result of those searches, and even google's webmaster tools would not display any errors or warnings for it
Code: Select all
User-agent: * Disallow: /portal.php Disallow: /donate/ Disallow: /downloadcentre/ Disallow: /tracker.php

now, here's how this file is supposed to look and now actually works, with everything on its own line. i also added the path to the sitemap, as some bots were looking for sitemap.txt

Code: Select all
User-agent: *
Disallow: /portal.php
Disallow: /tracker.php
Disallow: /donate/
Disallow: /downloadcentre/
Sitemap: http://www.livemembersonly.com/sitemap.xml
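
if you want to double-check that the new file actually blocks what you think it does, here's a minimal sketch using python's built-in urllib.robotparser. this isn't part of the original fix, and it assumes python 3 is on hand and the file is live at the domain shown in the Sitemap line above

Code: Select all
# sanity check: fetch the live robots.txt and ask whether each path is
# allowed for a generic crawler ("*"). disallowed paths should print False.
from urllib.robotparser import RobotFileParser

# assumption: robots.txt sits at the root of the same domain as the sitemap
rp = RobotFileParser("http://www.livemembersonly.com/robots.txt")
rp.read()  # download and parse the file

for path in ("/portal.php", "/tracker.php", "/donate/", "/downloadcentre/", "/index.php"):
    print(path, rp.can_fetch("*", path))

running the same check against the old one-line version should print True for every path, because the parser never sees the Disallow entries as separate rules, which is why the bots kept crawling those pages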