Is there really a difference between static HTML pages and dynamic pages as far as Google is concerned with backlinks or PageRank? Is a dynamic page that looks like a static HTML page treated differently? Would Google look at the server response to see if it has PHP stuff in it? I'm thinking it doesn't matter as long as the crawler can find all the pages. Or am I wrong?
The difference between static and dynamic pages becomes significant when arguments appear in the URL, i.e. /index.php?ac=1&dc=2 etc. The proposition is that there is additional overhead involved, so crawling occurs at a moderate pace to avoid loading the server unnecessarily. Google, thus far, will index content (at a slower rate) with up to 2 args (as above). It's not about whether content is dynamically generated (the bot could see this in the response headers, where PHP, for example, may be reported as the generator of the page). It's simply about load. Hence, if you have both static and dynamic content (as determined by the extension and/or the presence of args), the static content will be readily indexed by Google, while the dynamic content is indexed at a slower rate. Look at the bottom of this forum: you'll see 'Library'. Googlebot would prefer to chew on that than on this forum's dynamic content. Of course, once a site is popular and has a decent PR, the equation does change somewhat...
Hi, in my experience two vars are OK; I have dynamic pages in all the major SEs. On a new domain I build the content up static until it's got PR4 or WR3, then switch to dynamic without using mod_rewrite. If it fits, I use the var names as keywords, e.g. "airport=", and avoid things like ID or var content that could look like randomized cookies etc. I'm not sure if G is actually slower at pulling content; I think they just allow a slightly longer waiting time before the bot gets bored and wanders off. On the below domain I often see all the bots at nearly the same time, but G is very good at avoiding hitting content twice, as only very rarely do I see bots fighting to get the same page. Nevertheless, somewhere on such domains I use a site map where every possible combination is resolved, and I use custom 404s and a small on-page PHP line to send me warnings if requested content is blank. So I get e-mails when there is a problem or when I've forgotten to create content for a possible combination. That line is roughly this (a minimal sketch; the address and the $content variable are placeholders for my actual setup):
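<?php
// Sketch: if the content lookup came back empty, mail a warning
// and serve the custom 404 instead of a blank page.
if (empty($content)) {
    mail('me@example.com', 'Blank content warning',
         'Empty content for: ' . $_SERVER['REQUEST_URI']);
    header('HTTP/1.0 404 Not Found');
    include '404.html'; // custom 404 page
    exit;
}
?>

M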
It isn't hard to transform a dynamic URL into a static-looking one using a .htaccess file, and search engines prefer static URLs, so.... If you do use dynamic ones, NEVER use long var values, like an md5() hash or similar, because Google will think it is a SessionID: /getpage.php?url=hgkdfhdfh_md5_HASH_435636476
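To illustrate the rewrite idea, here's a minimal .htaccess sketch (the static-looking pattern is just a placeholder I made up, mapped onto the getpage.php example above):

RewriteEngine On
# Serve a static-looking URL from the real dynamic script
RewriteRule ^page/([a-z0-9-]+)\.html$ /getpage.php?url=$1 [L]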
What about the extensions: .htm/.html vs. .php or .asp? I believe Google prefers the .htm/.html endings. Any ideas how they rank it -- a little bit more PR for .htm over .php?
My site is almost all dynamic and I'm in the top 5 for many of my keyword phrases (kw/p's), top 10 for most of the rest, and in the top 25 for several others. Google spiders my site often and the query string arguments don't seem to hamper it at all. Of course, I do offer a full site map to make sure every "page" can be found.
Shawn is right. Search engines merely request and analyze the client-side code, which is what gets sent to the browser. ASP, PHP, JSP and other dynamic languages use server-side code to generate client-side HTML. Server-side code is not visible to a search engine; it runs on the server and generates the client-side code for a browser, a spider, or anything else that requests the file. I recently reworked a site that was using static HTML pages and we turned them into ASP pages to give the client the flexibility to add dynamic content, menus, ads, etc. We did SEO on the site as we worked on it. When we were finished, Google indexed the entire 120+ page site in less than a week, and the client has dozens of top-ten rankings for the first time, even though the content is essentially the same.
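To make that concrete, a toy example (the file name and content are made up; I'm just illustrating the request/response split):

<?php
// page.php - this runs on the server; only its output travels to the client
$title = 'Widgets';
echo '<h1>' . htmlspecialchars($title) . '</h1>';
?>

A spider requesting page.php simply receives <h1>Widgets</h1>, with no trace of the PHP that produced it.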
I feel that even though search engines don't weight HTML and other languages differently, PHP and ASP, with their robust abilities, are the way to go when creating a website for search engines. You can add content at the drop of a hat, rather than coding for hours.
Thanks for all of the replies. I agree with the opinion that the extension doesn't matter. I prefer the dynamic coding too. Long live .asp and .php! I had watched another site get many more backlinks, but I now believe that was due to higher PageRank (since a page needs enough PageRank for its link to be counted as a backlink).
Google looks at the output of dynamic pages. For example, go to any dynamic page and view source: it comes back looking just like HTML, and that is what Google and other spiders see when they crawl it. I have noticed over the past year or so that Google seems to crawl dynamic pages more frequently than static ones, mostly because they expect the content to change. But as far as overall rankings go there is no difference; then again, things like that don't make a difference anymore. The biggest factor is definitely backlinks, with some mild influence from page content, though even that is being debated.
The problem with dynamic pages isn't the content! The problem is the URLs. As I stated before, Google doesn't like URLs with random chars, but if you use URLs like index.php?page=page_description it's OK.
I have a boatload of top 10 rankings, a wagon load of top 5 rankings and a plethora of #1 rankings for my site. For many of my sponsors' kw/p's, my site generally comes up higher in the SERPs than theirs does. Search for 'skid unit', 'reverse shackle kit', 'acoustishield', 'dana 60 front axle', 'ecco lightbar', 'jacobs ignition', 'volunteer firefighter', etc. and Project Responder will be #1 or in the top 5. Based on this success, I've created an SEO-based content management system, which will be up and running in a few weeks, designed to help other site owners add relevant content and get high rankings in the SERPs. MN
I'm no expert, but I have both dynamic and static pages indexed. I also have some sites with a combination of pages (dynamic and static), and Google has no problem with them and doesn't seem to get stuck.
I have a forum, http://forums.gameguru.ru/, and it was all dynamic pages for 2 years. It has PR 4 on the home page, and Google had indexed only about 100 pages from it. 2 weeks ago I used URL rewriting to get static-looking pages, and now about 10,000 pages have been indexed by Google in just 10 days!!! I have +2000 unique visitors per day as a result. So static rules, for sure.
I'm using a YaBB forum + some reprogramming + Apache URL rewriting. I had to split it into 2 parts. The first part, which is spiderable, consists of just the messages. The other part (user profiles, post reply, send to friend, print version and some other junk that doesn't have any keywords or value) stays dynamic, and in robots.txt there are the lines Disallow: /?action and Disallow: /?board (see the snippet below). So every page of my forum has some valuable information for search engines, the other 50,000 pages are not indexed, and it takes 3-5 times less time to spider my forum. That's very important for local search engines, because they can't spider as many pages as Google does. And it seems Google likes it when there are no empty or semi-duplicate pages at all.
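The relevant robots.txt part looks roughly like this (simplified from my actual file):

User-agent: *
Disallow: /?action
Disallow: /?board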