Can anyone point me in the direction of a good tool (preferably free, of course) that would allow me to check my link partners for a link back to my site? I guess I would just like to be able to run through my reciprocal links pages and tell if there is a link back from each of the sites I link to. Happy to expand or think about the specs a bit more if required, but I am sure there is something already out there. I know linksmanager does it, but I'd rather just stick with straight HTML on my pages and only automate the checking, not the adding.

Thanks,
I thought about making a free web-based system for people... but then I shied away from it because I didn't want digitalpoint.com to be somehow branded as a site that condones and helps reciprocal linkers (not that I personally have anything against them). That being said, if you have a list of the pages your links *should* be on, I probably could whip you up a script for you (I don't want to get into spidering whole sites looking for a link though... so you would need to know where the link is supposed to be). If you have the list in a text file, I could make you a PHP script that does it, if you want.

- Shawn
I made a PHP script that checks reciprocal links a while back. It's actually quite an easy script to make, which surprises people; I can make just about anything with PHP! If digitalpoint turns out to be too busy, or you'd prefer a ready-made one, give me a shout.
Go for it... Zip it up and attach it to this thread if you want anyone to be able to grab it. It wouldn't take me more than 5 or 10 minutes to do, so it's not a time thing... but if you already have one, then it's all you.

- Shawn
OK, so I found PHPCrawl over at Sourceforge and managed to get it to work. I even managed to modify it to run through a mySQL database and return pages where it finds a link. My problem is that once I confirm a link at a site I want to move on to the next site, and I can't seem to get this to work with my very limited knowledge of PHP (normally I just adapt other people's stuff). The code is below; I don't think you need to see all the other parts. I just want to move on to the next site once foo.com is found in the page content ($page_data['source']) of the site stored in $row['LinkURL']. Can anyone help? (Sorry, I don't know how to put code in any other way.)

<?php
set_time_limit(10000);

include("classes/phpcrawler.class.php");
include("classes/phpcrawlerutils.class.php");
include("dbConnect.inc");

class myCrawler extends phpcrawler
{
    function handlePageData($page_data)
    {
        if ($page_data['received'] == true)
        {
            // look for the back link in the page source
            if (preg_match("/www\.foo\.com/", $page_data['source']))
            {
                echo "found at ".$page_data['url']."<br>";
            }
        }
        flush();
    }
}

mysql_select_db("foo_reciprocal");
$result = mysql_query('SELECT * FROM links ORDER BY ID');

while ($row = mysql_fetch_array($result))
{
    $crawler = &new myCrawler();
    $crawler->setURL($row['LinkURL']);
    $crawler->addReceiveContentType("/text\/html/");
    $crawler->addNonFollowMatch("/\.(jpg|gif|png)$/i");
    $crawler->setCookieHandling(true);
    // $crawler->setTrafficLimit(1000 * 1024);
    $crawler->go();
    $report = $crawler->getReport();
}

echo "Summary:<br>";
if ($report['traffic_limit_reached'] == true) echo "Traffic-limit reached <br>";
echo "Links followed: ".$report['links_followed']."<br>";
echo "Files received: ".$report['files_received']."<br>";
echo "Bytes received: ".$report['bytes_received']."<br>";
?>
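If your PHPCrawl build supports aborting a crawl by returning a negative value from the overridden page handler (later PHPCrawl releases document this for handleDocumentInfo(); I haven't verified it for this older phpcrawler class), a sketch like the following, swapped in for the myCrawler class above, should let the while loop move straight on to the next site once the link is found. The link_found flag is my own addition, not part of PHPCrawl.

<?php
// Sketch only: a replacement for the myCrawler class above, assuming the same
// includes. It relies on the crawler treating a negative return value from
// handlePageData() as "stop crawling this site now".
class myCrawler extends phpcrawler
{
    var $link_found = false;            // set once the back link turns up

    function handlePageData($page_data)
    {
        if ($page_data['received'] == true &&
            preg_match("/www\.foo\.com/", $page_data['source']))
        {
            $this->link_found = true;
            echo "found at ".$page_data['url']."<br>";
            return -1;                  // abort this crawl; the outer while loop moves on
        }
        flush();
        return true;
    }
}
?>

Then inside the while loop, after $crawler->go(), you could check $crawler->link_found and, if you want a stored result, write it back against $row['LinkURL'] in the links table.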
Hey Shawn, this is all getting too complicated... you said you could build a tool in 10 minutes? Time for you to step in and give us something that actually works and doesn't require a degree in PHP or Perl...
Nothing too fancy...

<?php
$mydomain = "www.yourdomain.com";   // Set this to your domain

$list = file_get_contents("sites.txt");
$urls = explode("\n", $list);

echo "<B>Checking back links to $mydomain....</B><P><FONT SIZE=-1>";

foreach ($urls as $url)
{
    $url = trim($url);   // drop any trailing \r or whitespace
    if (strlen($url))
    {
        echo $url . "<B><FONT COLOR=";
        // strict comparison, in case the link happens to sit at position 0
        if (strpos(file_get_contents($url), $mydomain) !== FALSE)
        {
            echo "GREEN> Found";
        }
        else
        {
            echo "RED> Missing";
        }
        echo "</FONT></B><BR>";
    }
}
echo "</FONT>";
?>

Reads a file (in the same directory) named sites.txt, which is just a list of URLs to check, one per line.

- Shawn
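For instance, a sites.txt for the script above would just look like this (the domains here are only placeholders), one URL per line:

http://www.site1.com/links.html
http://www.site2.com/resources.html
http://www.site3.com/partners.html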
How effective is this link exchange? Can anyone tell for sure that the return justifies the required effort?

Thanks,
www.all-battery.com
Shawn,

Your "nothing too fancy" reciprocal links checker is pretty good. But I'd really like to see a few simple things added, if it's not too much trouble:

1) Is there a way to speed up its reaction to a 404 page? For example, http://www.southtexastrading.com/links.htm gives me:

Warning: file_get_contents(http://www.southtexastrading.com/links.htm): failed to open stream: HTTP request failed! HTTP/1.1 404 Not Found in /home/finalone/public_html/@linkcheckerAA/link-checker.php on line 13
RED> Missing

2) Could you make it so sites.txt can have comments in it that don't get processed? For example, if a line in sites.txt reads http://www.adomain.com, I'd like to be able to add a comment to it like:

http://www.adomain.com|emailed on 4/6/04

The comment doesn't need to be processed by the PHP script, just ignored.
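On the 404 point, one way to get there (just a sketch along the lines of Shawn's script, not his code; the 5-second timeout is an arbitrary choice) is to suppress the warning and treat a failed fetch as "Missing":

<?php
// Sketch only: same idea as the checker above, but a dead or 404 page just
// shows up as "Missing" instead of spitting out a PHP warning.
$mydomain = "www.yourdomain.com";              // set this to your domain
ini_set("default_socket_timeout", "5");        // don't wait forever on a dead host

$urls = explode("\n", file_get_contents("sites.txt"));

foreach ($urls as $url)
{
    $url = trim($url);
    if (!strlen($url)) continue;

    $page = @file_get_contents($url);          // @ hides the "failed to open stream" warning

    echo $url . "<B><FONT COLOR=";
    if ($page !== FALSE && strpos($page, $mydomain) !== FALSE)
    {
        echo "GREEN> Found";
    }
    else
    {
        echo "RED> Missing";                   // covers both "page not found" and "link not found"
    }
    echo "</FONT></B><BR>";
}
?>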
Better test, dude - it has errors. I need to be able to put the comment on the same line as the URL OR on a separate line. It turns out that the old script seems to deal with it better than the new one???
Not sure... works for me...

checker.php

<?php
$mydomain = "www.domain.com";   // Set this to your domain

$list = file_get_contents("sites.txt");
$urls = explode("\n", $list);

ini_set("default_socket_timeout", "5");

echo "<B>Checking back links to $mydomain....</B><P><FONT SIZE=-1>";

foreach ($urls as $url)
{
    $url = trim($url);
    // skip blank lines and lines starting with #
    if (strlen($url) && $url{0} != "#")
    {
        echo $url . "<B><FONT COLOR=";
        if (strpos(file_get_contents($url), $mydomain) !== FALSE)
        {
            echo "GREEN> Found";
        }
        else
        {
            echo "RED> Missing";
        }
        echo "</FONT></B><BR>";
    }
}
echo "</FONT>";
?>

sites.txt

http://www.digitalpoint.com
#comment
#blah
http://www.site2.com
http://www.site3.com

Attempts to only do the 3 sites...

- Shawn
Yep, you are right. But I decided I liked the original script better, because if you code sites.txt like this:

http://www.domain.com #-- emailed on 4/12

it results in this:

http://www.domain.com/links.htm #-- emailed on 4/12 Found

The comment is reported, and I kinda like that. Plus the comment can be on the same line as the URL! With the new script, any comments on the same line as the URL are not reported...
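If you want both behaviours, a small tweak along these lines might do it (just a sketch, not part of Shawn's checker.php; it assumes anything after a "#" on a line is a comment, which would break for URLs that contain a "#" fragment):

<?php
// Sketch only: a "#" comment on the same line is stripped before fetching the
// page, but the whole line (comment included) is still echoed in the report.
$mydomain = "www.domain.com";                  // set this to your domain
ini_set("default_socket_timeout", "5");

$urls = explode("\n", file_get_contents("sites.txt"));

foreach ($urls as $line)
{
    $line = trim($line);
    if (!strlen($line) || $line[0] == "#") continue;   // skip blanks and comment-only lines

    $parts = explode("#", $line);
    $url = trim($parts[0]);                    // URL is everything before the comment

    echo $line . "<B><FONT COLOR=";            // report the whole line, comment included
    if (strpos(@file_get_contents($url), $mydomain) !== FALSE)
    {
        echo "GREEN> Found";
    }
    else
    {
        echo "RED> Missing";
    }
    echo "</FONT></B><BR>";
}
?>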
I use a great free tool called swapware... it manages and checks your reciprocal links, and it's super easy to install too. You can download it here: http://www.bookfinder.us/swapware_demo/about.htm or see it in action on my site: http://www.sitetutor.com/linkexchange/

-Todd
Try mine: http://www.link-swapper.com

It is new, but I am getting many new users since I started it. It is free and has the functionality of all the paid ones. If something is missing, let me know and I will add it.

/BP