The classic example is JavaScript-rendered dynamic content, which search engines tend not to handle well. However, if you can serve them a static page that contains the text of the page minus all the rendering, then they can index the content without choking on the JavaScript. I'm not sure how important this is these days, but it certainly was a problem at one time.
That's what the noscript tag is for
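Roughly, the fallback looks like this (the script URL and the markup around it are just placeholders for illustration):

```html
<!-- Dynamic version: an empty container filled in by client-side JavaScript -->
<div id="content"></div>
<script src="render.js"></script>

<!-- Static fallback: shown when scripts are disabled, and readable by crawlers
     that don't execute JavaScript -->
<noscript>
  <p>Pre-rendered text of the page content goes here.</p>
</noscript>
```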
It's also useful to serve modified versions to search engines so that searches for content within your site return more relevant results. For example, you might insert keywords that describe the content of the page using terms that don't actually appear in it. Say your page talks about Airport, but you serve Google a copy that also includes the terms 802.11 and Wi-Fi.
That's what the meta name=keywords tag is for
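Something like this (the title and keyword list are illustrative; note that major engines largely ignore this tag for ranking these days):

```html
<head>
  <title>Setting up your Airport base station</title>
  <!-- Terms the page body never uses, supplied for search engines -->
  <meta name="keywords" content="802.11, Wi-Fi, wireless networking">
</head>
```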
Finally, there's the question of bandwidth and CPU overhead. If your site changes a lot, Google beats on your servers rather frequently. You can reduce the bandwidth hit by stripping JavaScript, CSS, images, etc. from your content before serving it to Google. This won't significantly change the searchability of the content, but it will reduce the bandwidth overhead. And, of course, if there's a static version of the content you can serve instead of a server-side-dynamic one, that saves on CPU overhead as well.
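A rough sketch of that stripping step, using only Python's standard-library HTML parser (the tag list and function names are my own choices, not anyone's production code):

```python
from html.parser import HTMLParser

# Tags whose contents should be dropped entirely
SKIP_CONTENT = {"script", "style"}
# Tags to drop, keeping any surrounding text
SKIP_TAGS = SKIP_CONTENT | {"img", "link"}

class CrawlerStripper(HTMLParser):
    """Rebuilds a document without scripts, stylesheets, or images."""

    def __init__(self):
        super().__init__()
        self.out = []
        self._skip_depth = 0  # >0 while inside a <script>/<style> block

    def handle_starttag(self, tag, attrs):
        if tag in SKIP_CONTENT:
            self._skip_depth += 1
        elif tag not in SKIP_TAGS and self._skip_depth == 0:
            attr_text = "".join(
                f" {k}" if v is None else f' {k}="{v}"' for k, v in attrs
            )
            self.out.append(f"<{tag}{attr_text}>")

    def handle_endtag(self, tag):
        if tag in SKIP_CONTENT:
            self._skip_depth = max(0, self._skip_depth - 1)
        elif tag not in SKIP_TAGS and self._skip_depth == 0:
            self.out.append(f"</{tag}>")

    def handle_data(self, data):
        if self._skip_depth == 0:
            self.out.append(data)

def strip_for_crawler(html: str) -> str:
    """Return a lighter copy of the page for search-engine requests."""
    parser = CrawlerStripper()
    parser.feed(html)
    return "".join(parser.out)
```

The text content (which is what gets indexed) survives intact; only the rendering payload goes away.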
Google spiders text, not images. It also doesn't spider the text of CSS or JavaScript files. Also, I question how effective it is to decide dynamically, based on the user-agent, to serve a static page, as opposed to merely serving everyone the dynamic page.
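For reference, the user-agent sniffing being questioned here usually amounts to a substring check along these lines (the token list and function names are hypothetical, and this naive approach is easy to spoof, which is part of why it's questionable):

```python
# Illustrative crawler tokens; real lists are longer and change over time
CRAWLER_TOKENS = ("googlebot", "bingbot", "slurp")

def is_crawler(user_agent: str) -> bool:
    """Crude check for a known search-engine user-agent string."""
    ua = user_agent.lower()
    return any(token in ua for token in CRAWLER_TOKENS)

def pick_page(user_agent: str) -> str:
    # Crawlers get the pre-rendered static file; everyone else gets
    # the dynamic version
    return "static.html" if is_crawler(user_agent) else "dynamic.php"
```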