Search engines do not care much about dynamic content - if your site changes every 24 hours, that's already great. So why not cache pages specifically for search engines to lighten the server load?
Pfff, it is a mouthful - the Yireo Search Engine Page Cache extension for Magento - but the name tells you precisely what it does. Your pages might be filled with all kinds of dynamic data: a shopping cart, a product-compare box, et cetera.
Guess what? Search engines do not really care about that dynamic functionality: they do not want to add products to the cart, and they do not want to compare products. The only thing search engines care about is reading your site's content so they can index it.
What is your Magento content composed of? CMS pages, product categories and products. If those entities do not change hourly, you might as well cache that content so it can be served quickly. This is where our extension comes in: it provides full-page caching - not for normal visitors (which would require tricks like hole-punching to keep dynamic pieces like the cart working), but only for search engines.
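The core idea is simple: inspect the User-Agent header, and only serve the cached page when the visitor looks like a crawler. Here is a minimal sketch of that decision in Python - this is not the extension's actual code, and the bot tokens listed are just illustrative examples:

```python
# Illustrative sketch: serve cached HTML only to search-engine crawlers.
# The token list below is an example, not the extension's default list.
SEARCH_ENGINE_TOKENS = ["googlebot", "bingbot", "yandexbot", "duckduckbot"]

def is_search_engine(user_agent: str) -> bool:
    """Return True if the User-Agent header matches a known crawler token."""
    ua = user_agent.lower()
    return any(token in ua for token in SEARCH_ENGINE_TOKENS)

def handle_request(user_agent: str, cached_html, render_dynamic):
    """Serve cached HTML to crawlers; render dynamically for everyone else."""
    if is_search_engine(user_agent) and cached_html is not None:
        return cached_html
    return render_dynamic()
```

Normal visitors always hit the dynamic rendering path, so carts and compare boxes keep working for them - no hole-punching needed.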
The extension is fully compatible with Magento multi-site.
Because robots that crawl your site no longer need to load everything dynamically, but are served HTML output right away, the load on your entire server is lightened. The longer the cache lifetime of this full-page cache, the more your server benefits from it. A side effect is that pages also load faster for robots, which might have a beneficial impact on your SEO rankings.
To use this extension, purchase the package, download the package file and extract it into your Magento root. Refresh the Magento cache and you're good to go.
Optionally, you can configure additional user agents through the Magento System Configuration. There you can also set the cache lifetime (default: 1 day).
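The cache lifetime simply determines how long a stored page is considered fresh. A rough sketch of that check, assuming the lifetime is expressed in seconds (the constant below is illustrative, not the extension's code):

```python
import time

# 1 day in seconds - mirrors the extension's default lifetime setting
CACHE_LIFETIME = 86400

def is_cache_fresh(cached_at: float, now: float = None) -> bool:
    """Return True if a page cached at `cached_at` is still within its lifetime."""
    if now is None:
        now = time.time()
    return (now - cached_at) < CACHE_LIFETIME
```

A longer lifetime means more requests are answered from the cache, at the cost of crawlers occasionally seeing slightly stale content.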
An additional tip is to add a Crawl-delay: 10 line to your robots.txt file, which tells robots to wait 10 seconds before their next request. This delay shouldn't be too long, but it shouldn't be too short either; setting it somewhere between 2 and 10 seconds makes sense.
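For example, a minimal robots.txt using this directive could look like:

```
User-agent: *
Crawl-delay: 10
```

Note that not every crawler honors Crawl-delay - Googlebot, for instance, ignores it - so treat it as a hint rather than a guarantee.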
The following robots are cached by default:
This list can be extended using the System Configuration.