I think it sounds like a text sitemap, where we list all our URLs in a text file so they can be crawled and indexed easily...
Hi, no, I mean this: I have a lot of links on third-party websites, and I want the pages where my links exist to be in the Google index. If those pages are never crawled, the links are worthless.
1. Build deep links pointing to each page.
2. Use a good internal linking strategy and build links to the homepage.
3. Create sitemaps and submit them via Google Webmaster Tools.
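The sitemap step can be sketched quickly. This is a minimal example (Python, placeholder URLs) showing both formats Google accepts: a plain-text sitemap with one URL per line, and an XML sitemap following the sitemaps.org protocol:

```python
# Minimal sitemap sketch with placeholder URLs; swap in your real pages.
from xml.sax.saxutils import escape

urls = [
    "https://example.com/",
    "https://example.com/page-1",
    "https://example.com/page-2",
]

# Plain-text sitemap: a UTF-8 .txt file, one URL per line.
text_sitemap = "\n".join(urls) + "\n"

# XML sitemap per the sitemaps.org protocol.
entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in urls)
xml_sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>\n"
)

print(text_sitemap)
print(xml_sitemap)
```

Upload either file to your site's root and submit its URL in Google Webmaster Tools.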
Hi, I mean deep links from third-party websites. For example: I generate blog comments, board links, forum links, and more. Then, some weeks later, I check which of those links are in the Google index and which are not. My question: how do I get the non-indexed links into the Google index? It is not a good idea to put an HTML page with some 1,000 links on my own site. Would it be a good idea to build an RSS feed with all the links and ping it?
Hi, I have now found the right name for this kind of service: "backlink indexing service". My question: how can I run such a backlink indexing service myself?
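The do-it-yourself version of such a service is roughly what was suggested above: publish the unindexed backlinks as an RSS feed and ping an update service so crawlers fetch it. Here is a hedged sketch (Python); the backlink URLs and feed URL are placeholders, and the ping endpoint shown is Ping-O-Matic's public XML-RPC server:

```python
# Sketch of a DIY "backlink indexing" flow: build an RSS 2.0 feed of the
# backlinks, then (optionally) ping an update service. All URLs below are
# placeholders for illustration.
import xmlrpc.client
from email.utils import formatdate
from xml.sax.saxutils import escape

backlinks = [
    "https://some-blog.example/post#comment-12",
    "https://some-forum.example/thread/34",
]

def build_feed(links, feed_url="https://example.com/backlinks.xml"):
    """Return a minimal RSS 2.0 feed with one <item> per backlink."""
    items = "\n".join(
        "    <item>\n"
        f"      <title>{escape(u)}</title>\n"
        f"      <link>{escape(u)}</link>\n"
        "    </item>"
        for u in links
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<rss version="2.0">\n'
        "  <channel>\n"
        "    <title>Backlink feed</title>\n"
        f"    <link>{escape(feed_url)}</link>\n"
        "    <description>Links awaiting indexing</description>\n"
        f"    <lastBuildDate>{formatdate()}</lastBuildDate>\n"
        f"{items}\n"
        "  </channel>\n"
        "</rss>\n"
    )

def ping(feed_url, endpoint="http://rpc.pingomatic.com/"):
    """Send a weblogUpdates.extendedPing XML-RPC call (needs network)."""
    server = xmlrpc.client.ServerProxy(endpoint)
    return server.weblogUpdates.extendedPing(
        "Backlink feed", feed_url, feed_url, feed_url
    )

feed = build_feed(backlinks)
print(feed)
# ping("https://example.com/backlinks.xml")  # uncomment to ping for real
```

Host the generated feed on your own site, rotate the batch of links it contains, and ping after each update. No guarantee Google indexes the pages; this only invites a crawl.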