I just moved to a new software package that's supposed to be nice and SEO-friendly. It's actually something I've used on other sites with no problem, but for some reason my site, http://www.mmorpgedge.com, has nothing more than the homepage indexed, and when I try to build a sitemap the tools don't spider through. Does anyone see an obvious reason why this is happening?
Have you checked your server logs to see when Google last visited? The main thing that stops Google from indexing a site is a lack of high-quality links pointing to it. If you build more links, the site will get spidered more often.
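If you have shell access, checking the logs is quick. Here's a sketch, assuming a combined-format Apache access log (the log path and sample lines below are made up; substitute your host's real log file):

```shell
# Sample combined-format log lines, standing in for your real access log
# (e.g. /var/log/apache2/access.log on many hosts).
cat > access.log <<'EOF'
66.249.66.1 - - [10/Mar/2007:06:25:01 +0000] "GET / HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
84.12.33.7 - - [10/Mar/2007:06:26:44 +0000] "GET /games HTTP/1.1" 200 8192 "-" "Mozilla/5.0 (Windows; U)"
66.249.66.1 - - [11/Mar/2007:02:10:09 +0000] "GET /sitemap.xml HTTP/1.1" 404 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
EOF

# Show the most recent requests identifying themselves as Googlebot.
grep -i "googlebot" access.log | tail -n 5

# Count Googlebot hits per day to see how often the site gets crawled.
# Field 4 is the timestamp, e.g. [10/Mar/2007:06:25:01; we keep dd/Mon/yyyy.
grep -i "googlebot" access.log \
  | awk '{print substr($4, 2, 11)}' \
  | sort | uniq -c
```

If Googlebot hasn't shown up in weeks, that points to a links/discovery problem rather than something broken on the site itself.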
One more thing: could it be because you have close to 300 links on the main page? Someone please correct me, but I believe search bots have difficulty spidering a page containing more than 100 links.
Forget about the search engines. Having 300 links isn't going to make your users very happy...
Yeah, perhaps I should rethink the entire "game list" on the front page... However, my issue wasn't just with search engine spiders. I have tried several of the usual spiders that will compile a Google sitemap for you, and they all just stop.
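If every sitemap spider stops at the homepage, it's worth ruling out a restrictive robots.txt, and in the meantime you can hand-write a minimal sitemap so Google at least knows about your key pages. This is only a sketch following the Sitemaps 0.9 protocol; the /games path is a made-up example, so list your site's real URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.mmorpgedge.com/</loc>
    <changefreq>daily</changefreq>
  </url>
  <!-- Hypothetical page; replace with your actual URLs. -->
  <url>
    <loc>http://www.mmorpgedge.com/games</loc>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```

Save it as sitemap.xml in the site root and submit it through Google's webmaster tools.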
You need to choose whether to use www.site.com or just site.com for your links. If you use both versions, you are diluting your link juice, which will not help.
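If the site runs on Apache with mod_rewrite enabled (an assumption; adjust for your actual server), a 301 redirect will consolidate the two hostnames. A minimal .htaccess sketch forcing the www version:

```apache
# Sketch: permanently redirect mmorpgedge.com to www.mmorpgedge.com
# so all inbound links count toward a single hostname.
# Assumes Apache with mod_rewrite available.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^mmorpgedge\.com$ [NC]
RewriteRule ^(.*)$ http://www.mmorpgedge.com/$1 [R=301,L]
```

The 301 (permanent) status matters here: it tells search engines to transfer credit to the target URL rather than treating the two hosts as duplicates.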
Google's algorithm changes from time to time. From what I've noticed, Google now crawls a site more often when it has fresh content. Add more quality links, as Mad4 said, and try putting more content on your site. You can also use a tool to check when the Google spiders last visited. Here is the link for the tool to check when Googlebot last accessed your site.