The world of SEO moves extremely quickly; the Googlebot, however, often does not. There are many reasons why it can be advantageous to have a site recrawled before the next scheduled crawl is due: the website may have undergone major changes, a clean-up of links may have taken place, or malware that the site was infected with may have been removed, and the site should be reassessed before a new advertising campaign launches.
Previously, SEO experts employed several tactics to encourage a recrawl ahead of schedule, though none of these was guaranteed to work. The most common tips and tricks included resubmitting an XML sitemap, using social bookmarking sites that linked into the site, or employing a ping service.
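One of those older tactics, pinging Google with a sitemap location, can be sketched in a few lines. This is a minimal illustration using Google's historical sitemap ping endpoint (`https://www.google.com/ping`); the function names and the `example.com` domain are placeholders for illustration, not part of the tool described in this article.

```python
from urllib.parse import urlencode
from urllib.request import urlopen

# Google's (historical) sitemap ping endpoint. It accepted the sitemap
# location as a single, URL-encoded "sitemap" query parameter.
GOOGLE_PING_ENDPOINT = "https://www.google.com/ping"

def build_sitemap_ping_url(sitemap_url: str) -> str:
    """Build the ping URL that tells Google where the sitemap lives."""
    return GOOGLE_PING_ENDPOINT + "?" + urlencode({"sitemap": sitemap_url})

def ping_google(sitemap_url: str) -> int:
    """Send the ping and return the HTTP status code (200 means received)."""
    with urlopen(build_sitemap_ping_url(sitemap_url)) as response:
        return response.status

if __name__ == "__main__":
    # example.com is a placeholder; substitute your own sitemap location.
    print(build_sitemap_ping_url("https://example.com/sitemap.xml"))
```

Note that a 200 response only confirmed receipt of the ping, not that a crawl would follow any time soon, which is precisely the uncertainty these older tactics suffered from.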
There is now a new and significantly faster way for SEO experts to have their sites recrawled by the Googlebot. Even better, it is a service supplied by Google itself, so it is not only reliable but also fully endorsed by the Google Masters themselves.
Within Google Webmaster Tools is a new, poorly advertised feature called “Submit URL to index”. It arrived as part of the “Fetch as Googlebot” update and, sadly, many SEO consultants do not know it exists, despite it being a hugely useful tool that can save a lot of time and effort for those trying to coax the Googlebot into recrawling their site.
The Submit URL to Index tool enables site owners and SEO experts to submit an updated or even a brand-new URL, and Google endeavours to crawl it within a day or two at the most. The benefits of being able to manually resubmit a URL in this way are considerable: quite apart from the time saved on “might work” strategies to entice the Googlebot in, the diagnostic benefits to a site owner or SEO consultant are many.
Being able to resubmit a site to be crawled not only ensures a site is accurately ranked according to its current state; it also helps SEO experts identify and deal with any issues that are negatively affecting the health of the site.
Caution is advised, however. To maintain the quick turnaround of this service, and to avoid the Submit URL to Index function being pulled or access being denied to some users, it is wise not to use the tool excessively. Ensure that pages are not hidden and that all known issues are dealt with before submitting a site for recrawling. Abuse of the system will no doubt have consequences, and no one will want to return to Googlebot baiting after enjoying the ease of use that comes with this new system.
Images: FreeDigitalPhotos.net & Jenn and Tony Bot