I was thinking about how to do a file_get_contents that sends no referrer... like typing the address into the browser URL bar directly... I'm willing to pay for a working script that can fetch or parse a URL just LIKE accessing it directly in the browser...
If I'm correct, file_get_contents never sends a referrer; it does send a user agent. Anyway, you'll want to look at streams:

PHP:
$opts = array(
    'http' => array(
        'method' => "GET",
        'header' => "Accept-language: en\r\n" .
                    "Cookie: foo=bar\r\n" .
                    "Referer: http://www.example.com\r\n"
    )
);
$context = stream_context_create($opts);
$file = file_get_contents('http://www.example.com/', false, $context);

You could try an empty referer in this example.
@premiumscripts thanks bro, but it doesn't help... I get a 503 Service Temporarily Unavailable error when I use file_get_contents or even cURL, but I don't get that error when I input the URL directly in the browser... Can anyone help me with this? I'm willing to pay... ^_^
A 503 means the functions are working and you're getting a response from the remote server denying you. If you ever try to cURL Google results, the same thing happens, because they can detect the user agent sent by file_get_contents and cURL and block those user agents. It's also possible that the IP address of the server you're accessing from has been blacklisted for such practices. You would probably need to go through a non-transparent proxy that has the option of sending a different user-agent string.
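For what it's worth, a minimal sketch of that idea with cURL would look something like this; the proxy address and target URL are placeholders, not a recommendation of any particular proxy:

PHP:
<?php
// Hypothetical target URL and proxy -- replace with your own.
$url       = 'http://www.example.com/';
$proxy     = '127.0.0.1:8080';
$useragent = 'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.1) Gecko/20061204 Firefox/2.0.0.1';

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);   // return the body instead of printing it
curl_setopt($ch, CURLOPT_USERAGENT, $useragent);  // send a browser-like user agent
curl_setopt($ch, CURLOPT_PROXY, $proxy);          // route the request through the proxy
$html = curl_exec($ch);

if ($html === false) {
    echo 'cURL error: ' . curl_error($ch);
}
curl_close($ch);
?>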
Thanks kblessinggr... only HTTP, not HTTPS. I do know cURL, but it's the same as with a simple file_get_contents... But is it possible to change the user agent and hide the IP when using cURL or parsing a remote URL?
Hiding the IP... definitely not without going through a proxy. Setting the user agent with cURL is pretty easy:

PHP:
$useragent = "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.1) Gecko/20061204 Firefox/2.0.0.1";
$ch = curl_init();
// set user agent
curl_setopt($ch, CURLOPT_USERAGENT, $useragent);

Of course, I would probably pull the user agent of whatever browser is invoking the script; that makes it more dynamic.
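If you go that route, a rough sketch would be to reuse whatever user agent the visiting browser sent (assuming the script is actually being hit by a browser); the target URL here is just a placeholder:

PHP:
$ch = curl_init('http://www.example.com/');  // hypothetical target URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);

// Reuse the visiting browser's user agent if there is one, otherwise fall back to a fixed string.
$useragent = isset($_SERVER['HTTP_USER_AGENT'])
    ? $_SERVER['HTTP_USER_AGENT']
    : 'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.1) Gecko/20061204 Firefox/2.0.0.1';
curl_setopt($ch, CURLOPT_USERAGENT, $useragent);

$html = curl_exec($ch);
curl_close($ch);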
Well, you can add the user agent to the stream context for file_get_contents as well. And you can use a proxy with it too; look it up.
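Something along these lines, for example; this is only a sketch, the proxy address is a placeholder, and request_fulluri is usually needed when going through an HTTP proxy:

PHP:
$opts = array(
    'http' => array(
        'method'          => 'GET',
        'user_agent'      => 'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.1) Gecko/20061204 Firefox/2.0.0.1',
        'proxy'           => 'tcp://127.0.0.1:8080',  // hypothetical proxy, replace with a real one
        'request_fulluri' => true,                    // most HTTP proxies expect the full URI in the request line
    )
);
$context = stream_context_create($opts);
$file = file_get_contents('http://www.example.com/', false, $context);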
PHP.net's example (I'm feeling slightly generous). Basically the user agent (USER_AGENT) would go into the header string.
There are a few ways to hide your referring IP/referrer, but then you're getting into blackhat stuff. The real issue isn't your coding; it's most likely that the other website has disabled cURL and file_get_contents to protect themselves from leechers like you.
Um, other websites cannot "disable" cURL and file_get_contents from your system. If your request looks exactly as it would if you typed it in a browser, it should work, unless of course your IP is banned.
But not remotely. Your hosting provider could disallow it, but then you couldn't connect to any site using file_get_contents or fopen. For example, if you're on HostGator and I'm on SoftLayer, HostGator disabling allow_url_fopen will not prevent me from accessing your site with fopen, but you wouldn't be able to access mine (or any site) with that method because your host disabled it.

There are three common reasons why these commands wouldn't work just like the browser in a remote situation:
1) The default user agent has been blocked.
2) Your server's IP block has been blocked.
3) The remote host has proxy detection.
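If you want to rule out the local side first, a quick sketch like this (the target URL is a placeholder) shows whether allow_url_fopen is even enabled on your host, and falls back to cURL when it isn't:

PHP:
<?php
$url = 'http://www.example.com/';  // hypothetical target URL

if (ini_get('allow_url_fopen')) {
    // The http:// wrapper is available, so file_get_contents can fetch remote URLs.
    $html = file_get_contents($url);
} elseif (function_exists('curl_init')) {
    // allow_url_fopen is off; fall back to the cURL extension if it's installed.
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $html = curl_exec($ch);
    curl_close($ch);
} else {
    $html = false;  // neither method is available on this host
}

echo ($html === false) ? 'request failed or both methods unavailable' : 'got a response';
?>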