Hello All, I made a new directory, www.freesubmitdir.com, a few months ago. I posted its link on many websites so that Google would crawl and index it, but it is still not indexed. Can anyone tell me why it isn't getting indexed, or is there any method by which I can get my directory indexed by Google?
Yes, it's not indexed yet. Just do a little social bookmarking and forum posting. (http://72.14.235.132/search?hl=en&q=info:www.freesubmitdir.com&btnG=Search)
Yes, I have done so. Using site: it shows only one URL, and that one isn't cached. Also, when I try to check the cached snapshot from the PageRank toolbar, it shows "not indexed".
When we search using site:, only the main page is indexed. And Mr. SEOgray, you haven't defined any keywords or description in your site's META tags, so Google has no description to show under that page link. I suggest you do some digging and be patient; you will surely succeed, I mean, you will get indexed.
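As an illustration of the missing META tags (the title, description, and keyword values below are placeholders, not the site's actual copy), the head section would look something like:

```html
<head>
  <title>Free Submit Directory</title>
  <!-- Placeholder text: replace with a description of your own site -->
  <meta name="description" content="A free web directory accepting site submissions.">
  <meta name="keywords" content="free directory, submit site, web directory">
</head>
```

Google generally uses the description tag for the snippet shown under your listing, so leaving it empty is why nothing appears there.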
Hi All, I have made some changes to my website: I converted most of the HTML files to ASP. It has been more than 2 months, but I don't know why the ASP files are still not indexed by Google. Traffic comes to those ASP pages only from Yahoo, MSN, or other search engines. I am very disheartened. I used the same meta tags I was using when I created the pages in HTML. I don't know if ASP pages need a different kind of tagging or something else to be indexed by Google. Please help me: earlier, when I added an HTML page to my website, the page was indexed in no more than 2-3 days, and now when I add an ASP page, it doesn't get indexed by Google. Please help me with this. Do I need to change all the ASP pages back to HTML?
I had a look at your site. It was indexed by Google. Your site doesn't have a META description or META keywords, so add META tags, submit your site to SBM sites, wait a while, and your site will get reindexed soon.
Looking at your robots.txt, you've disallowed Google and everyone else from caching it. Here is how it reads:

# go away
User-agent: *
Disallow: /
Disallow: /admin/

(When you disallow /, you basically say "disallow everything".)
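For comparison, a corrected robots.txt, assuming /admin/ is the only area you actually want to keep crawlers out of, would be just:

```
User-agent: *
Disallow: /admin/
```

With the blanket Disallow: / line gone, every path except /admin/ becomes crawlable.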
The problem is with your robots.txt, man! To correct it, remove the line below from it:
Disallow: /
That will solve your problem. SEO Company Australia
It is indexed already. Please include your keywords for a more specific search, because SEs index your main page immediately, while your sub-pages need to wait until they're crawled by spiders.
LOL, it's funny how his website tells search engines to go away, yet he is seeking help to get his website indexed.
There are plenty of websites where people have meta tags saying noindex etc. and still come to ask why they haven't been indexed. It is mildly amusing on a Friday morning.
Your site is not truly indexed. Yes, it appears when you search site:www.yoursite.com, but when you disallow everyone from the / folder, you are saying "disallow http://www.yoursite.com/", which is everything. Google has seen it and knows there is something useful there, which is why it shows up in the results but has no cache or description. Change your robots.txt to remove the Disallow: / line and then submit it to social bookmarking sites again.
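You can verify this effect yourself before and after editing the file. A quick sketch using Python's standard urllib.robotparser (the www.yoursite.com URLs are placeholders):

```python
# Sketch: show which paths the quoted robots.txt rules block,
# using Python's built-in urllib.robotparser.
from urllib.robotparser import RobotFileParser

# The broken rules quoted above: "Disallow: /" blocks everything.
broken = RobotFileParser()
broken.parse([
    "User-agent: *",
    "Disallow: /",
    "Disallow: /admin/",
])

# After removing "Disallow: /", only /admin/ stays blocked.
fixed = RobotFileParser()
fixed.parse([
    "User-agent: *",
    "Disallow: /admin/",
])

print(broken.can_fetch("Googlebot", "http://www.yoursite.com/"))       # False
print(fixed.can_fetch("Googlebot", "http://www.yoursite.com/"))        # True
print(fixed.can_fetch("Googlebot", "http://www.yoursite.com/admin/"))  # False
```

If can_fetch returns False for your homepage, no well-behaved crawler will index anything below it.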
The first thing you must do is add relevant META tags to your site; the SE spiders consider them important for a site to look authentic. And always check the indexing of the site using the site:www.yoursite.com operator.