
Removing Your Dodgy URLs from the Google Index

edit your robots.txt file to block Google from specific files and directories

Last week I was doing some SEO on a reasonably new web site that was not ranking well, and I checked to see which pages Google had already indexed. As well as the pages I was hoping for, there were a lot of other pages that should not have been there.
These index entries were mostly created by sub-directories on the site that had been left open to the search engine, and most of those were generated by plugins.

You can check your own web site by entering the following search into Google: site:domaintocheck.com.au. Obviously you need to replace domaintocheck.com.au with your domain name 🙂
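For example, the first search below lists everything Google has indexed for the domain, and the second narrows it to a single sub-directory using Google's inurl: operator. The wp-content directory is just an illustration for a WordPress site; substitute whichever directory turned up in your results:

site:domaintocheck.com.au
site:domaintocheck.com.au inurl:wp-content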

These strange pages can create problems from a rankings point of view because there is a good chance that Google will not understand what the site is all about, e.g. 100 pages about galleries versus 7 pages about pest control.
They can also give hackers information that makes it easier for them to access your web site.

If you do find there are web pages in the Google index that you would rather not see in the search results, there are three steps you can take:

1. Put a blank index.html file in the sub-directories that you do not want indexed (see the sketch after this list).
2. Use the robots.txt file in your web hosting to block search engines from specified pages and directories (an example also follows below).
3. Remove undesired pages from the Google index via your Google Webmaster Tools account.
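For step 1, the index.html file can be literally empty, and its only job is to stop the web server from showing a directory listing that Google can then index. A minimal placeholder might look like this (the empty title and body are deliberate, so the page has nothing worth indexing):

<!DOCTYPE html>
<html>
<head><title></title></head>
<body></body>
</html>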
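For step 2, robots.txt lives in the root directory of your web hosting. The sketch below tells all crawlers (User-agent: *) to stay out of two sub-directories; the directory names here are only examples, so swap in whatever showed up in your site: search:

User-agent: *
Disallow: /gallery/
Disallow: /wp-content/plugins/

Keep in mind that robots.txt only stops well-behaved crawlers from fetching those pages; URLs that are already in the index can linger until Google re-crawls the site, which is why step 3 exists.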
None of these steps are particularly complicated, but you may have to get help from your web-person.
