Very well, you have proved that you have an ability to spam webmaster forums... I apologise for underestimating you. So... how about this script?
I think you will realise that Google doesn't tell lies: compare link:www.openpress.com with link: www.openpress.com. Maybe you should try it yourself, or take advice from this thread?
Yahoo says http://www.mystockforum.com has about 500 incoming links. Now see what G says. That site has been up for about 2 years.
PRbot.com, I'm not being sarcastic, but do you know how to check Google for backlinks? Both domains show 0 results for link:www. and 74 links for link: www. Google always shows fewer backlinks than Yahoo, but even so, all I can see on Google is backlinks from forum threads... nothing more. I haven't looked on Yahoo.
The site http://www.openpress.com/ has 314 pages indexed in Google, so it is not a spidering or indexing problem, IMO. The only correct use of the Google link:URL operator is without a space after it; with a space it does not return a list of links but a list of pages that contain the term link: and/or the term URL. Even used correctly, the link:URL command returns at best a partial, random sample of the sites linking to the URL. link:www.openpress.com returns no results, while the search @www.openpress.com returns 2,320 results containing the text openpress.com, but of the ten or so I checked none were actual hyperlinks, not even the links from your own design site. IMO the problem is that you have not created links to your site but plain text representing the URL, which does no good at all. Get some real links and you should start ranking.
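The distinction above is the whole problem: a bare URL typed into a page is just text, while only an actual anchor tag passes link value. As a rough sketch (using Python's standard html.parser; the sample HTML snippets are hypothetical), you can check whether a page actually hyperlinks to your domain rather than merely mentioning it:

```python
from html.parser import HTMLParser

class LinkFinder(HTMLParser):
    """Collect the href of every real <a> hyperlink on a page."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)

def has_real_link(html, target):
    """True only if target appears inside an <a href>, not just as page text."""
    finder = LinkFinder()
    finder.feed(html)
    return any(target in href for href in finder.hrefs)

# A bare text mention of the URL is not a link:
print(has_real_link("<p>Visit www.openpress.com today!</p>", "openpress.com"))
# An actual anchor tag is:
print(has_real_link('<a href="http://www.openpress.com/">OpenPress</a>', "openpress.com"))
```

Running this on the pages that supposedly "link" to you would show quickly whether they are real links or just text.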
There are at least 3000 pages on openpress.com, so if Google has only indexed 314 that sure is not good, but I don't see how you determined that they have indexed 314 pages.
The standard search for the number of pages indexed is site:URL; in the case of site:www.openpress.com, Google returns a result count and then lists the pages it has indexed.
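As a small illustration of the operator syntax discussed in this thread (the helper name is mine, not anything Google publishes), the query URL for a site: or link: search can be built like this; note there must be no space between the operator and the domain, or Google treats them as two ordinary search terms:

```python
from urllib.parse import quote_plus

def google_query_url(operator, domain):
    """Build a Google search URL for an operator query such as site:example.com.
    The operator and domain are joined with no space in between."""
    return "https://www.google.com/search?q=" + quote_plus(f"{operator}:{domain}")

print(google_query_url("site", "www.openpress.com"))
# -> https://www.google.com/search?q=site%3Awww.openpress.com
print(google_query_url("link", "www.openpress.com"))
# -> https://www.google.com/search?q=link%3Awww.openpress.com
```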
There could be lots of answers to that question, but IMO a better question is: should they index the entire content of every site that is built? In the days when sites were built by hand there was generally unique or valuable content on each page. With sites built from a database, there are conceivably as many pages as there are keyword combinations, but after the first 100 or so pages, is there any real value in indexing the same page with different keywords inserted in various places?