file-get-contents

Why doesn't file_get_contents work?

随声附和 · submitted on 2019-12-17 06:45:11
Question: Why does file_get_contents() not work for me? In the test file below, every example I've found uses this function, but it never executes. Is this a problem with the web hosting service? Could someone test this code on their server to see whether the geocoding array is actually printed as a string? Ultimately I want to assign the output to a variable, but this test file produces no output at all.... <html> <head> <title>Test
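Since the question is truncated before its code, here is a minimal diagnostic sketch (the geocoding URL is a placeholder, not the asker's): on many shared hosts `allow_url_fopen` is disabled in php.ini, in which case `file_get_contents()` over HTTP silently returns `false`.

```php
<?php
// Diagnostic sketch; the URL below is a placeholder, not the original one.
$url = 'http://maps.invalid/geocode?address=test';

// Common shared-hosting culprit: URL fopen wrappers are switched off.
if (!ini_get('allow_url_fopen')) {
    die("allow_url_fopen is disabled in php.ini; use cURL instead.\n");
}

$response = @file_get_contents($url);
if ($response === false) {
    // error_get_last() exposes the warning that @ suppressed.
    $err = error_get_last();
    echo 'Request failed: ' . ($err['message'] ?? 'unknown error') . "\n";
} else {
    // Print the decoded geocoding array, if the response is JSON.
    var_dump(json_decode($response, true));
}
```

If this prints a "failed to open stream" message rather than nothing at all, the problem is the request itself (DNS, firewall, blocked outbound traffic) rather than the function being unavailable.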

file_get_contents - failed to open stream: HTTP request failed! HTTP/1.1 404 Not Found

拥有回忆 · submitted on 2019-12-17 06:14:17
Question: I'm having some odd problems with file_get_contents() after moving my site to a new domain. I had to set up a new domain and IP address (using Plesk) to get a new SSL certificate working. Now a file_get_contents() call to a script on the same domain gives me: failed to open stream: HTTP request failed! HTTP/1.1 404 Not Found. If I call the same URL with file_get_contents() from another server it works fine, and if I fetch www.google.com from the failing server that also works, so it only
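A diagnostic sketch for this symptom (the URL is a placeholder for the same-domain script): fetch with `ignore_errors` so the headers come back even on a 404, then inspect the status line. A 404 from the server itself but a 200 from elsewhere usually means the server resolves its own domain to the wrong vhost after the move.

```php
<?php
// Extract the numeric status code from the first response-header line.
function http_status_code(array $headers) {
    if (preg_match('#^HTTP/\S+\s+(\d{3})#', $headers[0] ?? '', $m)) {
        return (int) $m[1];
    }
    return 0; // no response at all (connection never completed)
}

// ignore_errors makes file_get_contents return the body on 4xx/5xx
// instead of failing with a warning.
$context = stream_context_create(['http' => ['ignore_errors' => true]]);
$body = @file_get_contents('https://www.example.com/script.php', false, $context);

// $http_response_header is populated by the call above.
echo http_status_code($http_response_header ?? []), "\n";
```

If the status differs between machines, check `/etc/hosts` and the Plesk IP-to-domain mapping on the web server: the old IP may still be cached or pinned locally.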

How to use cURL instead of file_get_contents?

一曲冷凌霜 · submitted on 2019-12-17 04:28:46
Question: I use the file_get_contents() function to fetch and show external links on a specific page. Everything works in my local environment, but my server doesn't support file_get_contents(), so I tried cURL with the code below: function file_get_contents_curl($url) { $ch = curl_init(); curl_setopt($ch, CURLOPT_HEADER, 0); curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); curl_setopt($ch, CURLOPT_URL, $url); $data = curl_exec($ch); curl_close($ch); return $data; } echo file_get_contents_curl('http:
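The wrapper in the question is a reasonable start; a slightly fleshed-out sketch (timeout and redirect handling are my additions, not part of the original) also reports why a request failed instead of silently returning false:

```php
<?php
// cURL drop-in for file_get_contents() over HTTP. Returns the body as a
// string, or false on failure (matching file_get_contents' behaviour).
function file_get_contents_curl($url) {
    $ch = curl_init($url);
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,  // return the body instead of printing it
        CURLOPT_FOLLOWLOCATION => true,  // follow redirects, as browsers do
        CURLOPT_TIMEOUT        => 10,    // don't hang forever on a dead host
        CURLOPT_HEADER         => false, // body only, no response headers
    ]);
    $data = curl_exec($ch);
    if ($data === false) {
        // Log the reason (DNS failure, timeout, SSL problem, ...).
        error_log('cURL error: ' . curl_error($ch));
    }
    curl_close($ch);
    return $data;
}
```

Usage is identical to the original: `echo file_get_contents_curl('http://example.com/');`.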

PHP Parallel curl requests

隐身守侯 · submitted on 2019-12-17 03:25:53
Question: I am building a simple app that reads JSON data from 15 different URLs, and I need to do this server-side. I am using file_get_contents($url) and wrote this simple script: $websites = array( $url1, $url2, $url3, ... $url15 ); foreach ($websites as $website) { $data[] = file_get_contents($website); } It proved to be very slow, because each request waits for the previous one to finish before starting. Answer 1: If you mean multi-curl then,
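The answer is cut off, but the technique it names is PHP's `curl_multi` API; a sketch of it (URL list and timeout are placeholders) issues all requests concurrently rather than one after another:

```php
<?php
// Fetch a list of URLs in parallel with curl_multi.
// Returns an array of response bodies keyed like the input array.
function fetch_parallel(array $urls) {
    $mh = curl_multi_init();
    $handles = [];
    foreach ($urls as $i => $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 10);
        curl_multi_add_handle($mh, $ch);
        $handles[$i] = $ch;
    }

    // Drive all transfers until every handle has finished.
    do {
        $status = curl_multi_exec($mh, $active);
        if ($active) {
            curl_multi_select($mh); // wait for socket activity instead of busy-looping
        }
    } while ($active && $status === CURLM_OK);

    $results = [];
    foreach ($handles as $i => $ch) {
        $results[$i] = curl_multi_getcontent($ch); // body, or '' on failure
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $results;
}
```

With 15 URLs the total wall time is roughly that of the slowest single request, instead of the sum of all 15.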

Why I'm getting 500 error when using file_get_contents(), but works in a browser?

不打扰是莪最后的温柔 · submitted on 2019-12-17 02:37:48
Question: $html = file_get_contents("https://www.[URL].com"); echo $html; produces this in the error log: PHP Warning: file_get_contents(https://www.[URL].com) [function.file-get-contents]: failed to open stream: HTTP request failed! HTTP/1.1 500 Internal Server Error in /Applications/MAMP/htdocs/test.php on line 13. However, the site works fine in a browser. I tried cURL as well; I get no errors in the log file, but $html now echoes: Server Error in '/' Application. Object reference not
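When a page works in a browser but returns a 500 to PHP, a common cause is the server reacting to the missing browser headers. A sketch (the URL and header values are placeholders): send a User-Agent via a stream context, and set `ignore_errors` so the error body itself can be inspected instead of just the warning.

```php
<?php
// Send browser-like headers; ignore_errors returns the body even on a
// 500 so the server's actual error page can be examined.
$context = stream_context_create([
    'http' => [
        'header' => "User-Agent: Mozilla/5.0 (compatible; PHP)\r\n" .
                    "Accept: text/html\r\n",
        'ignore_errors' => true,
    ],
]);

$html = @file_get_contents('https://www.example.com/', false, $context);
echo $html === false ? "request failed\n" : $html;
```

The "Object reference not set" text the asker sees from cURL is an ASP.NET error page, which supports this theory: the remote application is crashing on the header-less request, not on PHP's side.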

PHP json_decode brings back null

时光怂恿深爱的人放手 · submitted on 2019-12-14 01:06:48
Question: I am trying to get this to work but can't see where I'm going wrong. Can anyone assist? <?php $jsonurl = 'http://www.foxsports.com.au/internal-syndication/json/livescoreboard'; $json = file_get_contents($jsonurl,0,null,null); $json_output = var_dump(json_decode($json,true)); echo $json_output ?> Answer 1: Hint: the initial JSON (stored in the $json variable) does NOT validate. Code (fixed): <?php $jsonurl='http://www.foxsports.com.au/internal-syndication/json/livescoreboard'; $json = file_get
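Two things are worth making explicit here. First, `var_dump()` prints and returns null, so `echo $json_output` echoes nothing. Second, when `json_decode()` returns null, `json_last_error_msg()` says why. A self-contained sketch (the sample strings are mine, not the feed's):

```php
<?php
// Decode JSON, or explain why decoding failed.
function decode_or_explain($json) {
    $data = json_decode($json, true);
    if ($data === null && json_last_error() !== JSON_ERROR_NONE) {
        return 'JSON error: ' . json_last_error_msg();
    }
    return $data;
}

var_dump(decode_or_explain('{"score": [4532, 1]}')); // valid JSON decodes to an array
var_dump(decode_or_explain("{'score': 1}"));         // single quotes are invalid JSON
```

The second call returns a "Syntax error" message, which is the kind of problem the answer hints at: the feed's payload is not strictly valid JSON, so `json_decode()` yields null.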

Perl equivalent of PHP's file_get_contents()?

混江龙づ霸主 · submitted on 2019-12-14 00:48:46
Question: The following PHP code does exactly what I want. The problem is that I need to re-create it in Perl, and I've been playing with Perl's open() and sysopen() functions without success. Does anyone have any help, or know of any links that might help? Thanks. $URL = "http://example.com/api.php?arguments=values"; echo file_get_contents($URL); Answer 1: You can use LWP: use LWP::Simple; $contents = get $URL or die; print $contents; Source: https://stackoverflow.com/questions/3413151/perl

file_get_contents returns an empty string on a shared host

心已入冬 · submitted on 2019-12-13 20:04:01
Question: I'm trying to fetch a URL's contents with file_get_contents(). It works on my localhost server, but on the shared host the function returns an empty string, with no errors. My code follows: $uri = 'http://my_url.com:81/datasnap/rest/TServerMethods/getLoginCliente/galf/123/'; $result = file_get_contents($uri); var_dump($result); and the result is: string(0) "" instead of: {result: [4532,1]} I tested changing the URL to google.com and it works perfectly. Does anyone know why this happens? Answer 1: Check if
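An empty string with no warning, on a URL with a non-standard port (`:81`), often means the shared host's firewall silently drops outbound traffic on that port. A diagnostic sketch (the helper name and timeout are mine): cURL's error message distinguishes a connection timeout (firewall) from an HTTP-level problem.

```php
<?php
// Fetch a URL via cURL and return either the body or the error message,
// so a silent failure becomes visible.
function probe_url($uri, $timeout = 5) {
    $ch = curl_init($uri);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
    $body = curl_exec($ch);
    $err  = curl_error($ch);
    curl_close($ch);
    return $body === false ? 'cURL error: ' . $err : $body;
}

// Usage (hostname as in the question):
// echo probe_url('http://my_url.com:81/datasnap/rest/TServerMethods/getLoginCliente/galf/123/');
```

"Connection timed out" here points at a blocked port; in that case ask the host whether outbound traffic on port 81 (and `allow_url_fopen`) is enabled, or move the service to port 80/443.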

file_get_contents on another port

杀马特。学长 韩版系。学妹 · submitted on 2019-12-13 15:22:28
Question: I need to contact REST services on ports other than 80, but file_get_contents() returns an error: failed to open stream: Connection refused $url = "http://nexusdigital.agency:81/API/...."; $result = file_get_contents($url, false); How can I configure reading on other ports? Answer 1: Use cURL: <?php $curl = curl_init('http://nexusdigital.agency/API/....'); curl_setopt($curl, CURLOPT_PORT, 81); curl_setopt($curl, CURLOPT_RETURNTRANSFER, true); $result = curl_exec($curl); ?> Answer 2:
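Note that "Connection refused" is a TCP-level error: nothing is accepting connections on that port from the web server's point of view, so switching from file_get_contents() to cURL alone will not fix it. A quick probe sketch (helper name and timeout are mine) separates that case from an HTTP-level problem:

```php
<?php
// Return true if a TCP connection to host:port can be opened.
function port_open($host, $port, $timeout = 3) {
    $fp = @fsockopen($host, $port, $errno, $errstr, $timeout);
    if ($fp === false) {
        return false; // refused, filtered, or unresolvable
    }
    fclose($fp);
    return true;
}

// Usage (host as in the question):
// var_dump(port_open('nexusdigital.agency', 81));
```

If this returns false from the web server but true from your own machine, the port is being filtered between the two; if it is false everywhere, the service simply isn't listening on 81.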

Can't get remote filename to file_get_contents() and then store file

蓝咒 · submitted on 2019-12-13 12:08:17
Question: I want to download a remote file and store it in a directory on my server under its original name. I tried file_get_contents($url). The problem is that the filename isn't part of $url, which looks like: www.domain.com?download=1726. That URL serves, e.g., myfile.exe, so I want to use file_put_contents('mydir/myfile.exe'). How can I retrieve the filename? I tried get_headers() before downloading, but I only get the file size, modification date and other information; the filename is
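For download URLs like this, the original filename is normally carried in the Content-Disposition response header, which get_headers() does return when the server sends it. A sketch (the helper and regex are mine; servers vary in how they quote the value):

```php
<?php
// Extract the filename from a Content-Disposition header line, e.g.
//   Content-Disposition: attachment; filename="myfile.exe"
// Returns null if no such header is present.
function filename_from_headers(array $headers) {
    foreach ($headers as $h) {
        if (preg_match('/content-disposition:.*filename="?([^";]+)"?/i', $h, $m)) {
            return trim($m[1]);
        }
    }
    return null;
}

// Typical usage (network; URL shape as in the question):
// $url     = 'http://www.domain.com?download=1726';
// $name    = filename_from_headers(get_headers($url)) ?? 'download.bin';
// file_put_contents('mydir/' . basename($name), file_get_contents($url));
```

`basename()` on the extracted name guards against a malicious header smuggling a path like `../../evil.php` into the write location.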