Hello, I need some PHP code to show the last 5 pages viewed. Any help please? Note: I use the Smarty template engine.
You could get that information with JavaScript, or you could write a PHP application that stores page visits in a database; then you can do whatever you like with that information. Remember to use timestamps! Regards, Glen
It is done. Here is the script if someone wants it:

<?php
function curPageURL() {
    $pageURL = 'http';
    if ($_SERVER["HTTPS"] == "on") { $pageURL .= "s"; }
    $pageURL .= "://";
    if ($_SERVER["SERVER_PORT"] != "80") {
        $pageURL .= $_SERVER["SERVER_NAME"].":".$_SERVER["SERVER_PORT"].$_SERVER["REQUEST_URI"];
    } else {
        $pageURL .= $_SERVER["SERVER_NAME"].$_SERVER["REQUEST_URI"];
    }
    return $pageURL;
}

$CurrentPage = curPageURL();
$_session['pages'] = $CurrentPage;
$_SESSION['pages'][] = $CurrentPage;
if ( Count ( $_SESSION['pages'] ) > 10 ) Array_Shift ( $_SESSION['pages'] );

//Knock the oldest page off when array count gets to 11:
if (count($_SESSION["pagehistory"]) > 10) {
    array_shift($_SESSION["pagehistory"]);
}

//Print the list of pages:
if ($_SESSION["pagehistory"]) {
    echo "<h2>Page History</h2>";
    echo "<ul>";
    foreach ($_SESSION["pagehistory"] as $page) {
        echo "<a href='$page' class='link'><li>$page</li></a>";
    }
    echo "</ul>";
}

//Add the current page to the recent list:
$_SESSION["pagehistory"][] = $_SERVER["HTTP_REFERER"];
?>
@paul174 If I run your script, it gives the following errors:

1. Notice: Undefined index: HTTPS in C:\wamp\www\summa\1.php on line 3
2. Notice: Undefined index: pagehistory in C:\wamp\www\summa\1.php on line 18
3. Notice: Undefined index: pagehistory in C:\wamp\www\summa\1.php on line 22
4. Notice: Undefined index: HTTP_REFERER in C:\wamp\www\summa\1.php on line 31

What should I do?
That would be because the script above is horribly coded. Try this modified version:

<?php
session_start(); // required before reading or writing $_SESSION

function curPageURL() {
    $pageURL = (!empty($_SERVER['HTTPS']) && $_SERVER['HTTPS'] !== 'off' || $_SERVER['SERVER_PORT'] == 443) ? 'https://' : 'http://';
    $pageURL .= ($_SERVER['SERVER_PORT'] != "80")
        ? $_SERVER['SERVER_NAME'].':'.$_SERVER['SERVER_PORT'].$_SERVER['REQUEST_URI']
        : $_SERVER['SERVER_NAME'].$_SERVER['REQUEST_URI'];
    return $pageURL;
}

$currentPage = curPageURL();
// $_SESSION['pages'] = $currentPage;
$_SESSION['pages'][] = $currentPage;

if (count($_SESSION['pages']) > 10) {
    array_shift($_SESSION['pages']);

    if (isset($_SESSION['pagehistory']) && count($_SESSION['pagehistory']) > 10) {
        array_shift($_SESSION['pagehistory']);

        echo '<h2>Page History</h2>
        <ul>';
        foreach ($_SESSION['pagehistory'] as $page) {
            echo '<li><a href="'.$page.'" class="link">'.$page.'</a></li>';
        }
        echo '</ul>';
    }
}

$_SESSION['pagehistory'][] = (!empty($_SERVER['HTTP_REFERER'])) ? $_SERVER['HTTP_REFERER'] : '';
// var_dump($_SESSION); // enable this to show the $_SESSION arrays built above
?>
I'd simplify it down to remove the unnecessary string additions, extra variables for nothing, improper/bloated use of classes, etc.:

<?php
session_start(); // $_SESSION is unusable without this

function curPageURL() {
    return 'http'
        . ((!empty($_SERVER['HTTPS']) && $_SERVER['HTTPS'] !== 'off' || $_SERVER['SERVER_PORT'] == 443) ? 's' : '')
        . '://' . $_SERVER['SERVER_NAME']
        . ($_SERVER['SERVER_PORT'] == 80 ? '' : ':' . $_SERVER['SERVER_PORT'])
        . $_SERVER['REQUEST_URI'];
}

$_SESSION['pages'][] = curPageURL();

if (count($_SESSION['pages']) > 10) {
    array_shift($_SESSION['pages']);

    if (isset($_SESSION['pagehistory']) && count($_SESSION['pagehistory']) > 10) {
        array_shift($_SESSION['pagehistory']);

        echo '
        <h2>Page History</h2>
        <ul class="links">';
        foreach ($_SESSION['pagehistory'] as $page) echo '
            <li><a href="', $page, '">', $page, '</a></li>';
        echo '
        </ul>';
    }
}

$_SESSION['pagehistory'][] = empty($_SERVER['HTTP_REFERER']) ? '' : $_SERVER['HTTP_REFERER'];
?>

... admittedly you'd need to target ".links a" instead of "a.link" -- remember, if every child tag of a parent element is getting the same class, none of them should have classes.
Sorry, I am resurrecting an old thread here. The question I have is: can the above code push the cookies past their limit on a busy site (with hundreds of requests being sent at any given moment)?
Don't see why not, although if you're that concerned about utilization, it would not be too hard to code this in JS and let the client handle the processing.
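A minimal sketch of that client-side approach, assuming `sessionStorage` is available in the target browsers; the function and key names (`pushPage`, `recordVisit`, `pageHistory`) are illustrative, not from this thread:

```javascript
// Keep the last 5 visited URLs in the browser's sessionStorage,
// so the server never stores or processes the history at all.

const PAGE_HISTORY_KEY = 'pageHistory'; // hypothetical storage key
const HISTORY_LIMIT = 5;

// Pure helper: append a URL and drop the oldest entries past the limit.
function pushPage(history, url, limit) {
  const next = history.concat(url);
  return next.length > limit ? next.slice(next.length - limit) : next;
}

// Browser glue: read, update, and write back the stored history.
function recordVisit(url) {
  const stored = JSON.parse(sessionStorage.getItem(PAGE_HISTORY_KEY) || '[]');
  const updated = pushPage(stored, url, HISTORY_LIMIT);
  sessionStorage.setItem(PAGE_HISTORY_KEY, JSON.stringify(updated));
  return updated;
}

// Only runs where sessionStorage exists (i.e. in a browser).
if (typeof sessionStorage !== 'undefined') {
  recordVisit(window.location.href);
}
```

Keeping the trimming logic in a pure function (`pushPage`) mirrors the `array_shift` cap used in the PHP versions above, and per-tab `sessionStorage` roughly parallels a PHP session without any cookie growth beyond the data stored client-side.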