I see what you mean, but still, nobody can say for certain that this is true and that this "box" really exists. What if it just takes too long for Google to index those 100 pages that I have?
That is just the dashboard that lists the sites you have added to Webmaster Tools. Click on the second link (not the forum one - the aquatropicalfish.com one); that opens the page you want. What else does it say on that page about Googlebot accessing and indexing your site? How come you have not added a robots.txt? Also, there is a link in your sig that someone told you about earlier. It points to:

Code (markup):
http://http//aquatropicalfish.wordpress.com/

Remove one of the "http://".

James
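Just to spell out that fix, with one of the duplicated "http://" removed the sig link should read:

Code (markup):
http://aquatropicalfish.wordpress.com/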
Yeah, it's very unusual that you do not get ranked with 1,200 unique pages and backlinks. You are definitely banned or have some problem with Google - there is no other way you wouldn't be indexed. I can get a page of around 200 words indexed in one day using social bookmarking. /Chris
1. Every site should have a robots.txt, above all to exclude agents from certain site sections such as /cgi-bin or admin areas. Make one and you have your peace (a sample is shown at the end of this post).

2. YES, there is a solid possibility that your site never was banned, but simply never was properly indexed, for the reasons mentioned much earlier!!!

3. Re your 150 Googlebot visits in the past 2 months: FYI, for comparison, this month up to yesterday I have 120,039 Googlebot visits on my site - for 20 days! My site is larger, but with 1,200 pages and regularly added fresh content you should be getting Googlebot visits in the tens of thousands per month.

4. To all others, re code/HTML errors - there are different errors with different impact:
- errors early at the top of a page that kill bots
- errors later in a page that kill bots from that point on, or that make bots struggle and produce errors in parsing the page content
- errors that in SOME weird combinations with other errors may cause bots to struggle and produce errors in parsing the page content
- errors that have no indexing/parsing impact on bots but may "only" affect proper/clean display in certain browsers
- errors that affect only certain browsers but may be compensated by a browser's ability to make up for YOUR errors

Why the heck should OTHER people, like the SEs or the browser industry, have to worry about compensating for a publisher's GROSS negligence and carelessness in producing clean, valid code? Compensating for OTHER people's / web publishers' errors is expensive: additional resources, additional parsing time, additional source code needed to handle the errors. Professional publishers who LOVE their job and want to BE successful always offer clean code to speed up and facilitate the life of everyone else - bots, SEs, browsers, etc.

If you want to be an airplane pilot, you have to learn a LOT, including weather etc. If you want to be a ship's captain, you have to learn about waves, stars, sat nav, GPS, sea charts, international law at sea and much more. If you want to be a web publisher - that is just another job like the above examples. You have to sit down and learn for a few thousand hours, a few years, to BE prepared, to KNOW your trade and to learn the rules. Else you fail and start whining and questioning and hoping that the fault is always in others - such as being banned, or missing BLs, or whatever.

Being a web publisher includes being a webmaster. Webmaster = web MASTER: it requires learning until you MASTER the trade with all applicable rules and techniques, code, netiquette, etc. In normal countries an apprenticeship typically is 3-4 years full time for any job before it can be done professionally. Jobs are made to earn money, to make a living - "first the sweat, then the $", as the saying goes in some languages. Here in my host country (PH) we say "no work, no money" and "no money, no honey" - hence, in short: "no work = no honey".

Whatever you do - do it right, do it all, and you'll really succeed, to the point where your work makes and is FUN!
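For illustration, a minimal robots.txt along the lines of point 1 above might look like this - /cgi-bin/ and /admin/ are only placeholders for whatever sections the site actually wants to keep crawlers out of:

Code (markup):
User-agent: *
Disallow: /cgi-bin/
Disallow: /admin/

The file has to sit in the site root, i.e. it should answer at http://www.aquatropicalfish.com/robots.txt.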
That is very strange :S - for me it only takes about 10-15 days and all my pages are indexed. Send an email to Google, that's all I can say... and good luck. By the way, I don't think this is the cause, but I just want to tell you that your home page is almost 500 KB in size - check it on rankingtoday.com. Try to optimize your images down to a smaller size. Just an opinion.
Guys, I finally got indexed recently, though not cached yet - and only the forum, not the whole website. So does that mean the domain was not banned? How did this happen? Maybe because:
1. I sent Google a reconsideration letter.
2. I submitted the forum to the Google directory.
Here are the indexed pages: http://www.google.com/search?q=site:aquatropicalfish.com/forum&hl=en
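In case it helps, one way to compare what Google has actually picked up (just using the standard site: and cache: search operators here) is to run a query for the whole domain against one restricted to the forum, and to check an individual URL's cache separately:

Code (markup):
site:aquatropicalfish.com
site:aquatropicalfish.com/forum
cache:www.aquatropicalfish.com/forum/

If the first query only returns forum URLs, the rest of the site is still not in the index; if cache: returns nothing for a URL that does show up under site:, that page is indexed but not cached yet.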
404: http://www.aquatropicalfish.com/robots.txt

I see you still have not added a robots.txt. It's pretty easy to do, and ya never know, it may help!

James
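For what it's worth, the simplest possible robots.txt - one that just lets every crawler in, which is presumably what you want while trying to get indexed - is only two lines. Upload it to the site root so the URL above stops returning a 404:

Code (markup):
User-agent: *
Disallow:

An empty Disallow line means nothing is blocked; specific Disallow rules (like the /cgi-bin/ example earlier in the thread) can always be added later.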