Regular spidering should be part of a web application's maintenance regimen. There are plenty of free and commercial tools to do it for you, and vulnerability scanners typically come with a powerful spider function. On the other hand, public search engines like Google already do most of the work for you. In particular, Google provides a nice insider view into your web application; all you need to do is register. To register, you first sign up for Google Webmaster Tools. Next, you need to show that the site is actually yours, either by adding a special file to the site or by adding a meta header with a specific code. The process usually takes only a couple of minutes.
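As an illustration, the meta-header verification method boils down to adding a single tag to your home page's `<head>`. The token below is a placeholder; Google generates a unique one for your account:

```html
<head>
  <title>My Site</title>
  <!-- Verification token is a placeholder; Google issues a site-specific value -->
  <meta name="google-site-verification" content="your-unique-token-here" />
</head>
```

The file-based method works the same way: you upload a file with a Google-supplied name to your web root, and Google fetches it to confirm you control the site.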
But why should you bother?
First of all, Google Webmaster Tools will give you more insight into how Google indexes your site. The section I look at first is "Crawl errors." It will tell you whether any pages were not found or timed out. You can also check how your robots.txt file limited the crawl.
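For reference, a minimal robots.txt that would limit the crawl might look like this (the paths are hypothetical examples; substitute whatever you actually want kept out of the index):

```
User-agent: *
Disallow: /admin/
Disallow: /tmp/
```

If pages you expected to see indexed are missing, the crawl report will show you whether a rule like one of these was the reason.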
The "HTML Suggestions" section will tell you about search-related issues in your HTML code, such as inefficient use of META tags.
A common question we get at the Internet Storm Center is how to remove a URL from Google's index. Webmaster Tools will help with that too. The feature is a bit hidden: select "Site Configuration" and then "Crawler Access," and you will see the "Remove URL" feature. This is probably the fastest way to get a URL out of Google.
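One caveat worth knowing: a removal request generally only sticks if the page itself signals that it should not be indexed, for example by returning a 404, by being blocked in robots.txt, or by carrying a noindex robots meta tag like this one:

```html
<!-- Tells compliant crawlers not to include this page in their index -->
<meta name="robots" content="noindex" />
```

Otherwise, Google may simply pick the URL up again on its next crawl.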
Finally, the "Labs" section: one feature Google is currently experimenting with is malware detection. If Google finds malware on your site, it will tell you about it. Sure, this is not the ideal way to find out about malware on your site, but better late than never.
I think Google Webmaster Tools is a "must have" for everybody running a website. While not strictly a security tool, it does help with security (and, of course, search engine optimization).