Okay, I'll use a simple example for this, so here goes: let's say I have a form that allows the user to enter their birth date. Upon submitting, it reloads the same page, passes the date, and my PHP script determines your zodiac sign using a switch/case. At that point the page pulls information from my database (depending on your sign) and displays a whole bunch of good content about that zodiac sign. My question is this: will the search engines crawl all 365 configurations of birth date (not even counting the year) to arrive at the 12 possible outcomes, and thus index ALL OF THE CONTENT I've written about the twelve zodiac signs? Because in this case, since the page simply reloads, the URL is the same for all twelve cases: 'zodiac.php'. OR... would it be wiser for me (upon page reload) to pass the person's sign as a parameter within the URL itself, thus ensuring that Google/Yahoo indexes all twelve signs and therefore all of my content? Example: 'zodiac.php?sign=aries', 'zodiac.php?sign=taurus', etc. I know that in the second example Google would definitely index the info. The first example, though, is probably cleaner programming. Thanks in advance for the help.
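For anyone curious, here's a minimal sketch of the kind of switch-style lookup I mean. The function name is mine and the cutoff dates are approximate (only three signs shown; the rest follow the same pattern):

```php
<?php
// Sketch of a date-to-sign lookup. Cutoff dates are approximate;
// exact boundaries vary slightly by year.
function zodiacSign(int $month, int $day): string {
    // Encode month/day as a single comparable number, e.g. March 21 => 321.
    $md = $month * 100 + $day;
    switch (true) {
        case ($md >= 321 && $md <= 419): return 'aries';
        case ($md >= 420 && $md <= 520): return 'taurus';
        case ($md >= 521 && $md <= 620): return 'gemini';
        // ...the remaining nine signs follow the same pattern...
        default: return 'capricorn'; // wraps around the year boundary
    }
}
```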
How will a sitemap benefit me? In the first case, the sitemap would contain a single URL: 'page.php'. In the second case, it would contain twelve URLs: 'page.php?sign=aries', 'page.php?sign=taurus', etc. Sooooo... my question still stands.
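To make the second case concrete, the sitemap file would be a plain XML list of those twelve URLs, something like this (domain is a placeholder, only two entries shown):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/zodiac.php?sign=aries</loc></url>
  <url><loc>http://www.example.com/zodiac.php?sign=taurus</loc></url>
  <!-- ...one <url> entry per sign, twelve in total... -->
</urlset>
```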
I don't believe Google will check all 365 dates in the first option, so a sitemap would be no help there. The second option also won't get your content indexed unless you link to all the pages from a central page yourself. This is where a sitemap may help you, but you would have to create it yourself, since an automatic, crawler-based sitemap generation tool would not be able to discover those URLs.
According to the original post, the information is stored in a database. All you have to do is write a simple script to grab the data and build the XML file for the sitemap. I have done this for a website with over 30,000 pages. Trust me, it can be done easily if the data is stored in a database.
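A sketch of what such a generator could look like. The function name, URL, and sign list are illustrative; in practice the list would come from a database query against your own schema:

```php
<?php
// Sketch: build sitemap XML from a list of sign names. In a real script
// the array would come from the database (e.g. SELECT name FROM signs).
function buildSitemap(array $signs, string $base): string {
    $xml  = "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n";
    $xml .= "<urlset xmlns=\"http://www.sitemaps.org/schemas/sitemap/0.9\">\n";
    foreach ($signs as $sign) {
        $loc  = $base . '?sign=' . urlencode($sign);
        // Escape the URL for XML before writing it into the <loc> element.
        $xml .= "  <url><loc>" . htmlspecialchars($loc) . "</loc></url>\n";
    }
    return $xml . "</urlset>\n";
}

// Example usage: write the file that gets submitted to Google.
$signs = ['aries', 'taurus', 'gemini']; // ...all twelve in practice
file_put_contents('sitemap.xml',
    buildSitemap($signs, 'http://www.example.com/zodiac.php'));
```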
Silly, respectfully, I think you're wrong here. That is, unless you can explain to me what you're talking about in terms of URL listings, because a sitemap is nothing more than a bunch of URLs.

You're right in that a PHP-driven page making queries on a database will be fully indexed by Google. I have an online retail site with a single 'detail.php' page. From the master page my products are displayed, each with an individual id number. When someone clicks an item, it passes the id within the URL, so the URL for the detail page ends up looking like 'detail.php?id=1001' for one item and 'detail.php?id=1002' for the next. Google indexed ALL of these pages, even without a sitemap (this was before I knew about sitemaps). They showed up in natural searches keyed off content in the item descriptions.

In THIS case (the case in my original post), I'm not passing anything in the URL. I'm simply using a PHP script to determine the zodiac sign based on the user's form-submitted birthday. When the page is reloaded, the URL is exactly the same as when the person initially arrived on the page. Therefore, there's really nothing for Google to index from the URL.

Now, if you're telling me Google can somehow plow through my PHP script, determine all possible outcomes of someone entering a birth date through the form, and then index all twelve zodiac signs' worth of content on this one page... that's the ideal situation. But I don't trust the search engines to see all of my textual content if my script is internally determining what gets displayed on the page. Whew. Hope that makes sense to everyone.
It seems that your form is a POST form, but even if it's a GET form, Google will not submit forms. If you want Google to index those pages, you should change the form to GET and put those URLs in a sitemap, or list some links on the homepage or other pages.
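On the form side the change is just the method attribute (field names here are illustrative). With GET, the submission produces a real, bookmarkable URL like 'zodiac.php?birthdate=...', though as noted, Google still won't submit the form itself:

```html
<form action="zodiac.php" method="get">
  <input type="text" name="birthdate">
  <input type="submit" value="Go">
</form>
```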
I believe you're stuck on user entry. From what I read in your original post, all your data is already stored in the DB; it just needs input from the user to tell it what data to show. With a sitemap you don't need user input, just the links to each zodiac sign's information to get it indexed.
Yeah, that's my problem. There's only one link for the page. Since the URL never changes, I have no way of telling the search engines how to access that info. My script pulls the DB info based on internally passed <form> elements, not URL parameters. I'll probably just code it to assign a single variable for each zodiac sign, then reload the page with that variable in the URL. That way, there will be twelve combinations I can list in the sitemap, each pulling unique content. I'll have to re-code the script to read the zodiac sign from the URL instead of calculating it internally. The original way is cleaner programming, but this way is tons better for SEO purposes. Thanks for the help.
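The re-coded version might look something like this sketch (function name is mine): read the sign from the URL so each of the twelve URLs serves distinct, indexable content, with a whitelist so arbitrary user input never reaches the database query:

```php
<?php
// Resolve the sign from the query string against a fixed whitelist,
// so only the twelve known values ever reach the database lookup.
function resolveSign(array $get): string {
    $valid = ['aries', 'taurus', 'gemini', 'cancer', 'leo', 'virgo',
              'libra', 'scorpio', 'sagittarius', 'capricorn', 'aquarius', 'pisces'];
    if (isset($get['sign']) && in_array($get['sign'], $valid, true)) {
        return $get['sign'];
    }
    return 'aries'; // default when no (or an unknown) sign is passed
}

$sign = resolveSign($_GET);
// ...then query the content for $sign, ideally via a prepared statement,
// e.g. WHERE sign = :sign
```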
To be clear, I still think a sitemap is your best option, and my last statement was there to make my case for why one should be used. It allows search engines to reach content on your site that they might not normally see. You have the data; just put the links to it in a sitemap and submit it to Google. I do the same thing for my football website: the search engines might never find a player's profile, since to reach one you have to search for it. With a sitemap, Google has now indexed almost all 200+ players in the database.
I don't believe search engines will submit form controls designed to collect user input. What I would do is set it up so GET variables lead to the page, probably just like you have it, but I would use a series of dropdowns to make the visitor's life easier. From there I'd pretty much have two options: 1) Submit a sitemap that includes these URLs, or 2) Have the index page start out with links to all of them, and when the page loads, have JavaScript swap the container out with the form/dropdowns.
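A rough sketch of option 2 (element ids and markup are illustrative): crawlers without JavaScript see plain crawlable links, while browsers swap in the friendlier form:

```html
<!-- Plain links stay in the markup for crawlers; only two shown here. -->
<div id="zodiac-nav">
  <a href="zodiac.php?sign=aries">Aries</a>
  <a href="zodiac.php?sign=taurus">Taurus</a>
  <!-- ...links for all twelve signs... -->
</div>
<script>
  // Browsers with JavaScript replace the links with the dropdown form.
  document.getElementById('zodiac-nav').innerHTML =
    '<form action="zodiac.php" method="get">' +
    '<select name="sign"><option>aries</option><option>taurus</option></select>' +
    ' <input type="submit" value="Go"></form>';
</script>
```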