I seem to recall that dynamic cms systems, such as PHPNuke or Mambo, are crawled slower and that static pages are best if you have a large number of articles. Is this true? Can anybody point me to a case study?
I think it depends on many factors. If you have good PR or genuinely good content, the crawler may come every day. If it matters to you, a URL like xx-24.html instead of xxx.php?id=24 can help build PR. Whether it's HTML, PHP, or ASP, what's important is that your source code is clean and simple. It's good to use CSS instead of tables for layout, though tables don't mean crawlers dislike your page. Put your CSS and JS in external files instead of repeating the code on every page. I think Mambo would be the better choice because with mod_rewrite your URLs can look like plain HTML pages, and Mambo is more professional. That's my opinion. Don't forget: your code must be easy for the crawler, and the page size shouldn't be too big, say 200-300 KB at most. Better to avoid JS menus, and keep the content near the top of your source code.
As forum-index.com has said, it's the crawlability that matters, not the file extension. Google gives some basic parameters: no session IDs (SIDs), and no more than 3 URL parameters. The things to avoid are:
- JavaScript-driven navigation on the page
- inline CSS and JS bloating the page
- poor HTML where the browser compensates for your design but the bot can't follow it
- output files that are too large
The short answer is static pages. However, managing static pages can be time consuming. The trick is to use dynamic pages that mimic the "good" qualities of static pages.
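For example, with Apache's mod_rewrite you can serve your dynamic script behind static-looking URLs. A minimal sketch (the script name article.php and the id parameter are just placeholders for your own setup):

```apache
# .htaccess — present static-looking URLs, serve them dynamically
RewriteEngine On

# A request for /article-24.html is handled internally by /article.php?id=24.
# The visitor and the crawler only ever see the .html URL.
RewriteRule ^article-([0-9]+)\.html$ article.php?id=$1 [L]
```

The crawler sees a clean extension with no query string, while you keep all the management advantages of a dynamic CMS.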