Getting Delisted from Google – Counter SEO?
In case you don’t know what I mean by “delisted” from Google, it essentially means removing your website, or a portion of it, from Google’s search results.
Now why would I want to do that? Doesn’t that seem kinda silly in the realm of SEO, where you want as much exposure as possible? The answer is that there may be times when you have sensitive material you want to “hide” from the public.
In my case, it was a bunch of SEO Networker content Fernando and I put together that we didn’t want released to the public. Well, unfortunately, Googlebot got a hold of it and was more than happy to list its location. Now anybody can search for it and get their hands on it.
So I went on a mission – how do you remove certain elements of your website from Google without actually removing it from your server? After all, you still want it available to a select group of people.
Three things are required for this to happen:
- Get a Google Webmaster account and verify your website
- Set up a (proper) robots.txt file and upload it to your server (your top-level folder, where your main files reside)
- Submit a delist request to Google via Google Webmaster
Signing Up for Google Webmasters
Signing up for a webmaster account is a process well explained by… well, the process itself, so I won’t bother listing out what needs to be done here. Just head over to Google Webmaster and click on “Sign in to Webmaster Tools”, then follow the instructions provided (i.e. log into your Google account).
Your Robots.txt…
Okay, this is the fun part.
Robots.txt is a file you put on your server (associated with your domain) that tells the search engine bots what they can/should and can’t/shouldn’t crawl. If that was Greek to you, let me explain it in simpler terms: search engines have to know what parts of your website they can show in the search results, and they look to your robots.txt file for permission.
You can go to robotstxt.org for more details.
The real question is… what do you put in this file so Google (and other search engines) know what NOT to crawl?
So this is what you do: first, create a file called robots.txt (all in small letters) on your desktop. Open it in an editor like Notepad, and then copy and paste the following snippet of code in there:
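Something along these lines – the two Disallow paths here are just placeholders matching the examples below, so swap in the folders you actually want hidden:

```
User-agent: *
Disallow: /example/
Disallow: /sec
```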
The first line says “what is stated in the following lines applies to every single search engine robot“. Then the ensuing lines tell them which folders you DON’T want listed.
For example, http://www.yourdomain.com/example/ would NOT be listed. And neither would any path on your site that starts with “sec”, such as http://www.yourdomain.com/seconds and http://www.yourdomain.com/secrets.
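If you want to sanity-check your rules before uploading, Python’s standard-library robotparser reads a robots.txt the same way a well-behaved crawler does. The domain and folder names below are just this article’s placeholders:

```python
from urllib import robotparser

# The same rules as in the robots.txt snippet above
# (placeholder folders – use your own)
rules = """\
User-agent: *
Disallow: /example/
Disallow: /sec
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Blocked: lives under /example/
print(rp.can_fetch("Googlebot", "http://www.yourdomain.com/example/page.html"))  # False
# Blocked: the path starts with "sec"
print(rp.can_fetch("Googlebot", "http://www.yourdomain.com/secrets/"))  # False
# Allowed: nothing in the rules matches it
print(rp.can_fetch("Googlebot", "http://www.yourdomain.com/blog/"))  # True
```

Because `User-agent: *` starts the group, the same answers come back no matter which bot name you test with.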
Then you save the robots.txt file and upload it to your domain – in the very top folder, so it ends up at http://www.yourdomain.com/robots.txt.
Telling Google to Bug Off… Nicely & Strategically of Course
Okay, the last step is a snap and it takes place right inside your Google Webmaster account. You simply log in, and on the menu to the left, click on “Tools” and then “Remove URLs”.
And all that remains now is to click the removal request button and follow the instructions on the screen. You can remove an entire website, just a directory within your website, specific files, or some outdated info still residing in Google’s database – that last option forces them to update what your webpage looks like in their index.
Voila, That’s It Folks!
Google’s response was fairly quick – they had my requested folder delisted by the next time I checked on it. Which folder got delisted…? That’s for me to know and you to… never find out.