How does Nibbler know what pages are inside my site?
Nibbler looks at the domain name of the address you're testing, and only includes pages within that domain. So if you tested example.com, any page under example.com would be valid, but pages hosted on other domains would not.
Nibbler can't see all of my pages!
Nibbler is a 'bot' - an automated computer program which discovers webpages by starting at the first one and following links from it to other pages. If a website's pages don't link to each other in a way that a bot can see, Nibbler will not be able to explore it fully.
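To illustrate the crawling described above, here is a minimal, hypothetical link extractor using Python's standard html.parser - it shows how a bot finds further pages by collecting the links on one page, and is not Nibbler's actual implementation:

```python
from html.parser import HTMLParser

class LinkFinder(HTMLParser):
    """Collects the href targets of <a> tags, as a crawler would."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# A page with two links a bot could follow
page = '<html><body><a href="/about">About</a> <a href="/contact">Contact</a></body></html>'
finder = LinkFinder()
finder.feed(page)
print(finder.links)  # ['/about', '/contact']
```

If links are generated purely by scripts, or hidden behind forms, nothing ends up in that list - which is exactly why a bot can miss pages.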
Most conventional websites will work fine. If a website is made entirely from Flash, we won't be able to generate an SEO audit. If a website requires you to fill in a form to enter it (e.g. to confirm your age, or to log in), then you can't run it through our website checker tool.
Some websites use special anti-bot technologies to stop spammers. These websites may also stop Nibbler from being able to test them.
My results are wrong!
When results don't match what you expect:
- Be sure to read what the test says carefully. A lot of common issues are explained in the text of the test, or in the help (click the "Help" link, at the top right of the test).
- Bear in mind Nibbler only audits the first 5 pages of the website that it finds. You can see the exact pages Nibbler found at the bottom of your report.
- The Facebook, Twitter and Google tests only work if they can find a link to the Facebook Page, Twitter account, or Google page within the 5 pages tested, and if that page/account links back to the website. This is done to ensure that the page/account belongs to the website.
- You may be testing a different web address from the one you think you are using. Note that www.example.com and example.com (without the www prefix) are different web addresses, even though to most people they look the same. Make sure you're testing the same website, and if you have both addresses, don't be surprised if the results are different.
- You may have found a bug. Please let us know and be sure to include the web address you were auditing.
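The www/non-www distinction above can be checked programmatically; a quick sketch using Python's standard urllib.parse (example.com is just a placeholder):

```python
from urllib.parse import urlparse

a = urlparse("http://www.example.com/")
b = urlparse("http://example.com/")

# The hostnames differ, so these are two distinct web addresses
print(a.netloc)  # www.example.com
print(b.netloc)  # example.com
print(a.netloc == b.netloc)  # False
```

Whether the two addresses serve the same content depends entirely on how the site is configured, which is why their Nibbler results can differ.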
I don't want my website to be tested!
You can prevent Nibbler from SEO auditing a website, and remove any existing reports for that site, by adding a robots.txt directive (if you don't know what a robots directive is, see the Wikipedia page on the robots exclusion standard).
You'll need to add the following 3 lines to your robots.txt file:
# Make nibbler ignore this site
User-agent: nibbler
Disallow: /
If you haven't already got a robots.txt, just create one in the root of your site (it should be publicly accessible via http://YOURDOMAIN.COM/robots.txt) and include the above lines.
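You can sanity-check your robots.txt with Python's standard urllib.robotparser, which applies the ordinary robots exclusion rules (note that Nibbler itself is stricter - it refuses the whole site whenever it is mentioned at all):

```python
from urllib.robotparser import RobotFileParser

# The robots.txt lines described above, supplied directly to the parser
robots_lines = [
    "# Make nibbler ignore this site",
    "User-agent: nibbler",
    "Disallow: /",
]

parser = RobotFileParser()
parser.parse(robots_lines)

# nibbler is denied everywhere; a bot not named in the file is unaffected
print(parser.can_fetch("nibbler", "http://example.com/"))       # False
print(parser.can_fetch("SomeOtherBot", "http://example.com/"))  # True
```

In practice you'd point the parser at your live http://YOURDOMAIN.COM/robots.txt, which also confirms the file is publicly accessible.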
- Nibbler will NOT obey a wildcard user-agent; you need to deny nibbler explicitly.
- No matter what you put on the 'Disallow' line, Nibbler will NOT test any part of a site which has a robots.txt file mentioning nibbler. This is in order to stop anyone manipulating their Nibbler scores.
- If the site in question has an existing report in Nibbler, it won't automatically get deleted. You will need to visit Nibbler, browse to the report and click the re-test button. Nibbler will then detect the presence of the robots.txt entry and remove the existing report.
- If the Nibbler report of a site has already been indexed by a search engine, it may take some time before the search engine updates its index to reflect the removal. The report may still appear in search engine results for a short time after it has been removed.
What user agent does Nibbler use?
Nibbler uses "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-GB; rv:1.9.1.3) Gecko/20090824 Firefox/3.5.3" for all requests.
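If you want to spot Nibbler in your own access logs, that string is what to match on. For illustration, here's how any client attaches a User-Agent header with Python's standard urllib (the string below is abbreviated; substitute the full value above):

```python
from urllib.request import Request

# Abbreviated stand-in for the full user-agent string quoted above
NIBBLER_UA = "Mozilla/5.0 (...) Gecko/20090824 Firefox/3.5.3"

req = Request("http://example.com/", headers={"User-Agent": NIBBLER_UA})

# urllib normalizes header names to "User-agent" capitalization
print(req.get_header("User-agent") == NIBBLER_UA)  # True
```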