First, check whether the pages have been cached in the search engines. If they have, that's a good indication the pages are search engine friendly. If not, either they're blocked from spidering via robots.txt etc., the site is brand new and a bot hasn't visited yet, or the pages really are unfriendly.
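A quick way to run that check, assuming Google (the yourdirectory.com URLs are just placeholders), is with the site: and cache: operators:

site:yourdirectory.com
cache:yourdirectory.com/dir/49.php

If site: returns pages, the directory is being crawled and indexed; cache: shows what the bot last stored for a specific page.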
You can get a rough idea of whether the pages are search engine friendly by checking if the URLs contain a question mark. For example, yourdirectory.com/dir/index.php?id=49 is not search engine friendly, while yourdirectory.com/dir/49.php is. You can use a rewrite engine to make your directory more search engine friendly, as in the sketch below.
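As a minimal sketch, assuming an Apache server with mod_rewrite enabled and assuming the script takes an id parameter (the paths and parameter name here are just illustrative), an .htaccess in the document root could map the friendly URL onto the dynamic one:

# .htaccess - serve /dir/49.php from the real dynamic script
RewriteEngine On
RewriteRule ^dir/([0-9]+)\.php$ dir/index.php?id=$1 [L]

Visitors and spiders only ever see /dir/49.php, while the script still receives id=49 internally.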
This is not entirely true. The very popular biz directory script, to give an example, uses dynamic pages and still offers direct links that are spiderable.
I think the original question was about how to determine whether a directory is search engine friendly, not whether the links are direct and spiderable rather than going through a link-out system. If that is what you are saying, you're right: the biz script uses hard links. But I still believe G has a harder time indexing sites with lots of URLs like /c=341&s=11. For example, my new site dirspace.com (about 2 months old) already has over 3,000 pages indexed, and all of my links are of the form /dir/11/341.php and so on.
Gotcha, dude. I completely agree it's worth the effort to make URLs more search engine friendly. I think I just misinterpreted the question; I was thinking the original poster was trying to figure out which directories are of benefit to your rankings...
Also check for the presence of the rel=nofollow attribute (rare), and whether you are receiving a static link or just some kind of JavaScript or PHP redirect (more frequent). You can tell by viewing the page source, as in the snippets below.
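For example (the go.php redirect script and the URLs here are hypothetical), your listing's link could look like any of these in the source:

<!-- static link: spiderable, passes link value -->
<a href="http://www.yoursite.com/">Your Site</a>

<!-- nofollow: spiderable, but engines are asked not to count it -->
<a href="http://www.yoursite.com/" rel="nofollow">Your Site</a>

<!-- redirect through a script: usually worthless for rankings -->
<a href="go.php?id=123">Your Site</a>

Only the first one gives you a clean, direct link.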