Nice job! FREEBS - Free Book Store 1.06 is posted. With 1.06 you can run the p3.php script multiple times, giving it an override keyword, limit, and start page each run, to populate your books database. I got my computer book store to over 4K books with just 3 keywords. The new start page parameter lets those with PHP in safe mode, and those with limited memory, populate their databases in small chunks that their server can handle. Now everyone can have a book store with a little effort. Upgrades are painless: just replace the specified php files and you're done. Thanks
p3.php has become too powerful. If you're like me, you're trying to populate the database with as many books as possible. Keep the Amazon TOS in mind when doing so; that's on you. However, this has brought up another issue: book list pages with hundreds of books. These pages have too many links and are well over 50K, even 100K in some cases. That's not good search engine food IMO. So, we have pagination. Check my FREEBS book stores for the example and let's hear some feedback. Thanks
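For anyone wiring pagination into their own pages, here's a minimal sketch of the arithmetic behind it. The function and variable names are mine, not FREEBS internals, and the table/query shown in the comment is an assumption about the schema:

```php
<?php
// Hedged sketch: given the total book count, books per page, and the
// requested ?p= page number, work out the LIMIT/OFFSET for the query
// and the total number of pages for the page links.
function page_window($total_books, $per_page, $page) {
    $pages  = max(1, (int) ceil($total_books / $per_page));
    $page   = min(max(1, $page), $pages);   // clamp out-of-range requests
    $offset = ($page - 1) * $per_page;
    return array($offset, $per_page, $pages);
}

// e.g. 4000 books, 25 per page, a visitor on ?p=3:
list($offset, $limit, $pages) = page_window(4000, 25, 3);
// the list query then becomes something like:
//   SELECT ... FROM books ORDER BY title LIMIT $offset, $limit
?>
```

Keeping each page to a fixed LIMIT is what holds the page size (and link count) down for the crawlers.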
Sounds like a very good mod. I may end up editing my list anyway to look less spammy by removing the ISBN and as-of date. Anyway, it seems I'm having a hell of a time getting Google to suck this site up. I did have "freebie" on the domain previously; maybe that turned Google off. Oh well, I'm going to make one more book store and then check back on them in 8 months =).
Yep, we're in virgin territory here where Google is concerned. We'll have to see how it plays out. But from the webmaster guide, it's pretty obvious that pages with hundreds of links, or pages over 50K, can be ignored. So this is where I'm taking FREEBS. I think we're on the right track.
Well Google indexed the index.php file within 48 hrs, but still hasn't gone deeper than that. I'm interested to see how long it takes.
I added this code to all my pages:

<?php
// Mail yourself whenever Googlebot requests a page.
if (eregi("googlebot", $_SERVER["HTTP_USER_AGENT"])) {
    $ip    = $_SERVER["REMOTE_ADDR"];
    $crawl = gethostbyaddr($ip);
    // Classify by IP prefix. Anchored with ^ and with the dot escaped so
    // "64." only matches at the start, and chained with elseif so the
    // 64.* result isn't overwritten by the else branch below.
    if (eregi("^64\.", $ip)) {
        $crawler = "Refresh GoogleBot";
    } elseif (eregi("^216\.", $ip)) {
        $crawler = "Google Deep Crawler";
    } else {
        $crawler = "Unknown Crawler";
    }
    $url = "http://" . $_SERVER["SERVER_NAME"] . $_SERVER["PHP_SELF"];
    if ($_SERVER["QUERY_STRING"] != "") {
        $url .= "?" . $_SERVER["QUERY_STRING"];
    }
    $today = date("F j, Y, g:i a");
    mail("youremail@youraddress.com",
         "Googlebot detected on " . $_SERVER["SERVER_NAME"],
         "$today\nGooglebot IP Address: $ip\nGooglebot Domain: $crawl\n" .
         "Crawler Type: $crawler\nUrl Visited: $url");
}
?>
That's nice. I have a hit counter I wrote that does something similar and adds the info to a database; I just didn't add it to FREEBS. Thanks for sharing your idea, that's cool.
I got a rewrite rule going for books. Nice .html pages. Feedback please. http://www.familyseekers.org/amazon/ <-- test folder, excluded in my robots.txt
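The actual rule isn't shown in the post, but a typical .htaccess sketch for this kind of setup looks like the following. The file name book.php and the asin parameter are assumptions for illustration; FREEBS's real script and parameter names may differ:

```apache
# Hypothetical sketch: serve "nice" .html book URLs from the PHP script.
# /amazon/book-0596005601.html  ->  /amazon/book.php?asin=0596005601
RewriteEngine On
RewriteRule ^book-([0-9A-Za-z]+)\.html$ book.php?asin=$1 [L,QSA]
```

The [L] flag stops further rule processing, and [QSA] keeps any extra query string the visitor arrived with.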
Yep, maybe the letter pages too, so instead of http://.../.php?key=f it would be http://.../keyf.html. Just an idea.
I'm working on that one. With pagination, the index URL has a page number too: http://www._your_books_.com/books/index.php?key=h&p=2. That is not going as easily; I've never done this before.
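One way to handle the two-parameter case is to capture both the letter and the page number in the rewrite. This is only a sketch of the usual approach, with URL shapes I made up rather than anything FREEBS ships:

```apache
# Hedged sketch: letter pages with pagination, e.g.
#   /books/keyh-p2.html  ->  /books/index.php?key=h&p=2
RewriteEngine On
RewriteRule ^key([a-z0-9])-p([0-9]+)\.html$ index.php?key=$1&p=$2 [L]
# and the first page of a letter without a page number:
#   /books/keyh.html     ->  /books/index.php?key=h
RewriteRule ^key([a-z0-9])\.html$ index.php?key=$1 [L]
```

The page-links in the PHP then just need to emit the keyX-pN.html form instead of the raw query string.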
mnemtsas - Could you expand on how you did this? I'm not a CSS expert. Did you do it via php includes? I'd like to get my new bookstore http://www.sagetips.com/FREEBS Release 1.07/index.php to fit in better with my site. Thanks
I've added one and increased the number of books in two others in the last 10 minutes. Deleted a couple of testing bookshops. Down to only 4 now. Can't wait to see them indexed.
April 16, 2005, 5:03 pm
Googlebot IP Address: 66.249.71.40
Googlebot Domain: crawl-66-249-71-40.googlebot.com
Crawler Type: Unknown Crawler
Url Visited: http://www.lakecs.com/computer-books/index.php