How to automatically update the links on all my websites from one spot

Discussion in 'Programming' started by centreurope.org, Sep 15, 2007.

  1. #1
    Hi,
    I've got several websites on which I put links to my other websites. They are all hosted on different IP addresses.
    What I would like is to avoid writing the links manually on each website, and instead keep them in one single file so that the links on all the websites update automatically.
    How can I do that?
    Thanks a lot
    david
     
    centreurope.org, Sep 15, 2007 IP
  2. aRo`

    aRo` Peon

    Messages:
    141
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #2
    You could put the file on one domain and pull it into the other domains with includes.

    A more secure solution is to fetch the file's contents with cURL or fopen() instead of a remote include.
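
    For example, a minimal cURL sketch (untested, and the URL and links.html name are just placeholders for your own):

    <?php
    // Fetch the shared links file from the central site (hypothetical URL).
    $ch = curl_init('http://www.yoursite.com/links.html');
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body as a string
    curl_setopt($ch, CURLOPT_TIMEOUT, 5);           // don't hang the page if that site is down
    $html = curl_exec($ch);
    curl_close($ch);

    if ($html !== false) {
        echo $html; // output the shared link block
    }
    ?>
    Code (php):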

    Hope this helps :)
     
    aRo`, Sep 15, 2007 IP
  3. dizzy

    dizzy Peon

    Messages:
    43
    Likes Received:
    2
    Best Answers:
    0
    Trophy Points:
    0
    #3
    Use RSS. That's what I use, and it's popular, so it's easy to find a plugin or script for your site.

    Put an RSS file on one site with the links in it.

    Then on each site, load and parse the RSS XML to display your links.
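
    With PHP you could do that with SimpleXML, something along these lines (the feed URL is just a placeholder):

    <?php
    // Load the shared RSS feed from the central site (hypothetical URL).
    $feed = simplexml_load_file('http://www.yoursite.com/links.xml');

    if ($feed !== false) {
        // In RSS 2.0, each <item> carries one link: a <title> and a <link>.
        foreach ($feed->channel->item as $item) {
            echo '<a href="' . htmlspecialchars((string) $item->link) . '">'
               . htmlspecialchars((string) $item->title) . "</a><br />\n";
        }
    }
    ?>
    Code (php):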
     
    dizzy, Sep 15, 2007 IP
  4. Spetto

    Spetto Peon

    Messages:
    5
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #4
    You could also use JavaScript and a div to make it work.
    First you create a div where you want to have the link:

    <div id="adlink" style="position: relative;">
    <!-- link.js fills in the html that goes here -->
    </div>

    And the js file could have something like this:

    function commonlink() {
        // The link markup lives in this one js file, so changing it here
        // updates every site that loads the file. getElementById works in
        // IE5+ and all modern browsers, so no browser sniffing is needed.
        var data = '<a href="http://www.location.com">Click here</a>';
        document.getElementById("adlink").innerHTML = data;
    }

    You could execute commonlink from an onload event, and you pull the js file in with:

    <script type="text/javascript" src="http://www.website.com/link.js"></script>

    The interesting part is that the js file sits on one of your own sites, so you only have to change that one file for the link to update on all the other sites that call it.
     
    Spetto, Sep 15, 2007 IP
  5. Psych0

    Psych0 Banned

    Messages:
    99
    Likes Received:
    4
    Best Answers:
    0
    Trophy Points:
    0
    #5
    Nah, use this code.
    And give me an itrader if you like my idea.

    1. Make a file named "link.php".
    2. Edit that file and write any link code in it,
    for example "<a href="http://xxxxxxxxxx.com">your link</a>".
    3. Save that file.
    4. Open the page on which you want that link to be shown.
    5. Write this code wherever you want that link to appear:
     <?php include "/path_to/link.php"; ?> 
    Code (markup):
    6. Save that page with a .php extension: if it is "filename.html", rename it to "filename.php".

    Renaming the page makes no changes to its html coding.
    After that you just have to write this one line wherever you want the link to show up on any page:
     <?php include "/path_to/link.php"; ?> 
    Code (markup):
     
    Psych0, Sep 16, 2007 IP
  6. craigedmonds

    craigedmonds Notable Member

    Messages:
    705
    Likes Received:
    134
    Best Answers:
    0
    Trophy Points:
    235
    #6
    For me the best way is to create a file at sitea.com/links.html (or .aspx, .asp, .php, or any other extension you want).

    That file will contain the html code with your links in it.

    Then on sites b-z, install a grabber script which grabs the content of the page from sitea.com/links.html.

    This solution works great for me because I use php and asp driven sites, so a grabber script fits right in. However, rss is the best way, because you can pull rss into html pages or any other non-dynamic page and the spiders will eat the links up (which is the goal, right?). I have had problems with spiders and javascript in the past, so I don't recommend it.

    There are grabber scripts out there for php and asp. I have a couple of examples, so pm me for them if you like.
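
    A bare-bones php grabber could look like this (sitea.com is just the example name from above, and allow_url_fopen has to be enabled):

    <?php
    // Grab the shared links page from site A over HTTP (hypothetical URL).
    // file_get_contents() on a URL requires allow_url_fopen to be enabled.
    $html = @file_get_contents('http://sitea.com/links.html');

    if ($html !== false) {
        echo $html; // drop the shared link block into this page
    } else {
        echo '<!-- links unavailable -->'; // fail quietly if site A is down
    }
    ?>
    Code (php):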
     
    craigedmonds, Sep 16, 2007 IP
  7. AstarothSolutions

    AstarothSolutions Peon

    Messages:
    2,680
    Likes Received:
    77
    Best Answers:
    0
    Trophy Points:
    0
    #7
    All of the above are fine (though Psych0 sounds to be suggesting having the link file on every server you use), and you could also use a database that allows remote connections.

    You could just use a standard XML file rather than one following the RSS schema, but then you would need to write the script yourself rather than find one (not that parsing XML is difficult).
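
    For the remote database option, a rough mysqli sketch (the host, credentials and links table are all made up for the example):

    <?php
    // Connect to the central links database (hypothetical host and credentials).
    $db = new mysqli('db.sitea.com', 'user', 'password', 'links_db');

    if (!$db->connect_error) {
        // Assumed schema: one row per link with url and title columns.
        if ($result = $db->query('SELECT url, title FROM links')) {
            while ($row = $result->fetch_assoc()) {
                echo '<a href="' . htmlspecialchars($row['url']) . '">'
                   . htmlspecialchars($row['title']) . "</a><br />\n";
            }
        }
        $db->close();
    }
    ?>
    Code (php):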
     
    AstarothSolutions, Sep 17, 2007 IP
  8. Psych0

    Psych0 Banned

    Messages:
    99
    Likes Received:
    4
    Best Answers:
    0
    Trophy Points:
    0
    #8
    No no, I am not saying to have link.php on every server you use.
    Just one file with all the link code works for all the pages.
    Though I have never tested it across different servers.
    Wait, let me test it and I'll give you the result.
    It works fine on the same server, but give me a few hours to test it on a different server...
     
    Psych0, Sep 18, 2007 IP
  9. Synchronium

    Synchronium Active Member

    Messages:
    463
    Likes Received:
    15
    Best Answers:
    0
    Trophy Points:
    58
    #9
    
    <?php
    	$links = array(
    		array(
    			'url' => 'url',
    			'title' => 'title',
    			'description' => 'description'
    		),
    		array(
    			'url' => 'url',
    			'title' => 'title',
    			'description' => 'description'
    		),
    		array(
    			'url' => 'url',
    			'title' => 'title',
    			'description' => 'description'
    		),
    		array(
    			'url' => 'url',
    			'title' => 'title',
    			'description' => 'description'
    		)
    	);
    ?>
    
    Code (php):

    Save that as links.php, then for your main page:


    <?php

    	require_once( 'links.php' );

    	// Output one anchor tag per entry in the shared array.
    	foreach ( $links as $link ) {

    		echo '<a href="' . $link['url'] . '" title="' . $link['title'] . '">' . $link['title'] . "</a>\r\n";

    	}

    ?>
    
    Code (php):
    My site uses something similar, but it does stuff with definition lists, which is why it also has a description. But you get the idea.
     
    Synchronium, Sep 18, 2007 IP