Over the last few weeks I've been advising clients on some quite large on-site changes, for multiple reasons, including:
- Getting rid of duplicate content by disallowing chunks of pages in the robots.txt file
- Condensing small pages into one larger tabbed page, then 301'ing the old URLs.
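If you want to sanity-check a Disallow rule before (or after) rolling it out, Python's standard library ships a robots.txt parser. This is just a minimal sketch – the paths and domain below are hypothetical examples, not the client's actual rules:

```python
# Minimal sketch: checking which URLs a robots.txt Disallow rule blocks,
# using Python's built-in parser. All paths here are hypothetical.
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt blocking two chunks of duplicate pages.
rules = """User-agent: *
Disallow: /print/
Disallow: /archive/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Blocked: falls under the /print/ Disallow rule.
print(parser.can_fetch("*", "https://example.com/print/widgets"))
# Allowed: no rule matches this path.
print(parser.can_fetch("*", "https://example.com/widgets"))
```

Running a quick check like this against the live robots.txt is a cheap way to catch a rule that disallows more (or less) than you intended.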
The Disallow change was made approximately two weeks ago and I'm still waiting for Google to clear out the duplicated pages. The count is declining, just very slowly. I'm used to seeing these sorts of changes happen pretty quickly.
I think it's fair to expect a "Disallow" change in the robots.txt file to have taken full effect within 24–48 hours. Imagine if some private data got leaked and it took two weeks for it to be cleared up!
On the other hand, it's taken about nine days for Google to pick up the new tabbed pages and 301 redirects, which is a little disappointing when, by comparison, this blog post will be in Google's index within minutes of me publishing it.
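For anyone wanting to verify their 301s are actually returning a 301 (and not a 302 or a soft redirect), here's a minimal sketch of the behaviour at the HTTP level, using only Python's standard library. The URL mapping, port, and page names are hypothetical – swap in your own:

```python
# Minimal sketch: a local server issuing 301 redirects from old small-page
# URLs to a new tabbed page, plus a client check that the status really is
# 301. All URLs here are hypothetical examples.
import threading
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical mapping of old small pages to the new tabbed page.
REDIRECTS = {
    "/old-page-1": "/new-tabbed-page#tab1",
    "/old-page-2": "/new-tabbed-page#tab2",
}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in REDIRECTS:
            self.send_response(301)  # permanent redirect
            self.send_header("Location", REDIRECTS[self.path])
        else:
            self.send_response(404)
        self.end_headers()

    def log_message(self, *args):  # silence per-request logging
        pass

# Start the server on a free local port in a background thread.
server = HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# An opener that refuses to follow redirects, so we can inspect the 301.
class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, *args, **kwargs):
        return None

opener = urllib.request.build_opener(NoRedirect)
status, location = None, None
try:
    opener.open(f"http://127.0.0.1:{port}/old-page-1")
except urllib.error.HTTPError as e:
    # With redirects disabled, the 301 surfaces as an HTTPError.
    status, location = e.code, e.headers["Location"]
server.shutdown()

print(status, location)
```

A permanent 301 (rather than a temporary 302) is what tells Google to transfer the old URL's equity to the new one, so it's worth confirming the status code rather than assuming it.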
I do question whether Google's decision to prioritise new content over keeping the existing index up to date is the right one. It's frustrating for webmasters and could leave irrelevant results in the SERPs.
It also raises the question of how long Google takes to count new IBLs (inbound links) – probably too long, unless they're links on a new page. I'd be interested to know if anyone thinks a link isn't counted until it shows in Webmaster Tools, which can also take several weeks.