file-get-contents

php file_get_contents returns null when allow_url_fopen is on

喜夏-厌秋 submitted on 2019-12-19 08:04:41

Question: I get the warning message: "file_get_contents failed to open stream: permission denied". I have set allow_url_fopen to On in the php.ini file. My PHP file is on my Apache server and it is trying to access a URL (that returns JSON) from a Tomcat server on the same machine. The code in the PHP file looks like: $srcURL = 'http://samemachine:8080/returnjson/'; $results = file_get_contents($srcURL); I have also tried curl, and it returns nothing and doesn't hit the Tomcat server either: function curl($url
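A defensive version of the fetch is sketched below. The Tomcat URL is the hypothetical one from the question, and the SELinux note is an assumption worth verifying: on SELinux systems, "permission denied" under Apache often comes from the `httpd_can_network_connect` boolean rather than from `allow_url_fopen`.

```php
<?php
// Sketch of a defensive fetch; the URL is the question's hypothetical one.
function fetch_json(string $url): ?array {
    if (!ini_get('allow_url_fopen')) {
        return null; // file_get_contents() cannot open URLs without this
    }
    $body = @file_get_contents($url);   // @ suppresses the warning; we check below
    if ($body === false) {
        $err = error_get_last();
        // On SELinux hosts (an assumption to verify), "permission denied" under
        // Apache often means the network boolean is off:
        //   setsebool -P httpd_can_network_connect 1
        error_log('fetch failed: ' . ($err['message'] ?? 'unknown'));
        return null;
    }
    $data = json_decode($body, true);
    return is_array($data) ? $data : null;
}

// Usage (network call, commented out here):
// $results = fetch_json('http://samemachine:8080/returnjson/');
```

Because `fetch_json()` returns `null` on any failure, the caller can distinguish "request failed" from "endpoint returned empty JSON" instead of silently getting a warning.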

file_get_contents script works with some websites but not others

邮差的信 submitted on 2019-12-18 13:39:26

Question: I'm looking to build a PHP script that parses HTML for particular tags. I've been using this code block, adapted from a tutorial: <?php $data = file_get_contents('http://www.google.com'); $regex = '/<title>(.+?)</'; preg_match($regex,$data,$match); var_dump($match); echo $match[1]; ?> The script works with some websites (like Google, above), but when I try it with other websites (say, FreshDirect), I get this error: "Warning: file_get_contents(http://www.freshdirect.com) [function
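A likely cause, worth testing: some sites refuse requests that lack a browser-like User-Agent header, which makes `file_get_contents()` fail for them while it works for google.com. A stream context lets you send one. The sketch below also completes the question's regex, which is missing its closing `title>`; the context header value is illustrative.

```php
<?php
// Sending a User-Agent via a stream context (header value is illustrative):
$context = stream_context_create([
    'http' => [
        'header'  => "User-Agent: Mozilla/5.0 (compatible; MyScraper/1.0)\r\n",
        'timeout' => 10,
    ],
]);
// Live fetch, commented out here:
// $data = file_get_contents('http://www.freshdirect.com', false, $context);

// The question's regex never closes the tag; a complete version, shown
// against stand-in HTML:
$data = '<html><head><title>Example Page</title></head></html>';
if (preg_match('/<title>(.+?)<\/title>/is', $data, $match)) {
    echo $match[1], "\n";   // prints "Example Page"
}
```

For anything beyond a single tag, `DOMDocument` is more robust than regexes against real-world HTML.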

file_get_contents not working with utf8

帅比萌擦擦* submitted on 2019-12-18 09:34:56

Question: I'm trying to get Thai characters from a website. I've tried: $rawChapter = file_get_contents("URL"); $rawChapter = mb_convert_encoding($rawChapter, 'UTF-8', mb_detect_encoding($rawChapter, 'UTF-8, ISO-8859-1', true)); When I do this, the characters come back like: ¡ÅѺ˹éÒáá¾ÃФÑÁÀÕÃìÀÒÉÒä·Â©ºÑº But if I take the source of the page I'm trying to load and save it into my own .htm file on my localhost as a UTF-8 file, then it loads the Thai characters correctly. Only when I try to load it
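One plausible diagnosis: `mb_detect_encoding()` with only `'UTF-8, ISO-8859-1'` in its list can never report a Thai codepage, so Thai bytes get treated as Latin-1 and come out as the mojibake shown. If the page is actually served as TIS-620 / Windows-874 (an assumption; check the response's Content-Type header or the page's `<meta charset>`), converting from that encoding explicitly works:

```php
<?php
// TIS-620 bytes for the Thai word "กลับ" (illustrative sample data):
$raw  = "\xA1\xC5\xD1\xBA";

// Name the source encoding explicitly instead of detecting it:
$utf8 = mb_convert_encoding($raw, 'UTF-8', 'TIS-620');
echo $utf8, "\n";   // prints "กลับ"
```

If the encoding really is unknown, add the Thai codepage to the detection list, e.g. `mb_detect_encoding($rawChapter, 'UTF-8, TIS-620, ISO-8859-1', true)`, so it is at least a candidate.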

file_get_contents no caching?

耗尽温柔 submitted on 2019-12-18 09:19:03

Question: I'm using file_get_contents() to load a dynamic image from an external website. The problem is that the image has been updated on the remote website, but my script is still displaying the old image. I assume the server caches the image somewhere, but how can I force the server to clear the cache and use the updated image when getting the file with file_get_contents? On my local machine, I had to press CTRL+F5 to force a refresh of the image. I also tried adding no-cache headers to my script, but it
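Note that `file_get_contents()` itself does not cache, so the stale image is most likely served by the remote server or an intermediary proxy. Two common workarounds are sketched below (the image URL in the usage line is hypothetical):

```php
<?php
// 1. Cache-bust by appending a unique query-string parameter, which most
//    caches treat as a distinct resource:
function cache_bust(string $url): string {
    $sep = (strpos($url, '?') === false) ? '?' : '&';
    return $url . $sep . 'nocache=' . time();
}

// 2. Ask intermediaries not to serve a cached copy via request headers:
$context = stream_context_create([
    'http' => ['header' => "Cache-Control: no-cache\r\nPragma: no-cache\r\n"],
]);

// Usage (network call, commented out here):
// $img = file_get_contents(cache_bust('http://example.com/dynamic.png'), false, $context);
```

The query-string trick is the more reliable of the two, since not every cache honors the request headers.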

Handling delays when retrieving files from remote server in PHP

淺唱寂寞╮ submitted on 2019-12-18 06:54:21

Question: I am working with PHP to access files and photos from remote servers. I mostly use the file_get_contents() and copy() functions. Sometimes accessing a small text file or photo is almost instant, but other times it seems to get "stuck" for a minute on the same file. And sometimes it actually causes my script to hang, and even when I stop the script, Apache remains locked up for several minutes. I'm quite willing to accept that internet connections can be flaky. My concern is
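To keep a flaky remote host from tying up Apache, the fetch should be bounded on both the connect and read phases. A sketch (timeout values are illustrative; note that `default_socket_timeout` governs connecting for URL wrappers, while the `http` context's `timeout` bounds reads once connected):

```php
<?php
// Fetch with explicit connect and read timeouts, returning null on failure.
function fetch_with_timeout(string $url, int $seconds): ?string {
    ini_set('default_socket_timeout', (string)$seconds);             // connect phase
    $ctx  = stream_context_create(['http' => ['timeout' => $seconds]]); // read phase
    $data = @file_get_contents($url, false, $ctx);
    return ($data === false) ? null : $data;
}

// Usage (network call, commented out here):
// $photo = fetch_with_timeout('http://example.com/photo.jpg', 5);
// if ($photo === null) { /* log it and move on; don't hang the request */ }
```

With a bounded timeout, the worst case is a few lost seconds per file rather than a wedged Apache worker.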

php - file_get_contents - Downloading files with spaces in the filename not working

天涯浪子 submitted on 2019-12-18 04:53:27

Question: I am trying to download files using the file_get_contents() function. However, if the location of the file is http://www.example.com/some name.jpg , the function fails to download it. But if the URL is given as http://www.example.com/some%20name.jpg , the same file downloads. I tried rawurlencode(), but this converts all the characters in the URL and the download fails again. Can someone please suggest a solution? Answer 1: I think this will work for you: function file_url($url){ $parts =
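The truncated answer appears to split the URL and encode only the path. A sketch of that approach: `rawurlencode()` on the whole URL breaks it because it also encodes `://` and `/`, so encode each path segment individually instead.

```php
<?php
// Percent-encode only the path segments of a URL, leaving scheme, host,
// and separators intact (a sketch of the approach the truncated answer
// appears to take; the helper name matches its snippet).
function file_url(string $url): string {
    $parts = parse_url($url);
    $path  = implode('/', array_map('rawurlencode', explode('/', $parts['path'] ?? '')));
    $port  = isset($parts['port']) ? ':' . $parts['port'] : '';
    return $parts['scheme'] . '://' . $parts['host'] . $port . $path;
}

echo file_url('http://www.example.com/some name.jpg'), "\n";
// prints "http://www.example.com/some%20name.jpg"
```

If only spaces ever appear, a plain `str_replace(' ', '%20', $url)` also works; the segment-wise version additionally handles other characters that need escaping.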

file_get_contents or curl in php?

泄露秘密 submitted on 2019-12-18 04:48:06

Question: Which of file_get_contents or curl should be used in PHP to make an HTTP request? If file_get_contents will do the job, is there any need to use curl? Using curl seems to need more lines, e.g.: curl: $ch = curl_init('http://www.website.com/myfile.php'); curl_setopt($ch, CURLOPT_POST, true); curl_setopt($ch, CURLOPT_POSTFIELDS, $content); curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); $output = curl_exec($ch); curl_close($ch); file_get_contents: $output = file_get_contents('http://www
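For comparison, `file_get_contents()` can also issue the POST from the curl example via a stream context, so curl is not strictly required for simple requests. A sketch (URL and payload are the question's illustrative ones); curl earns its extra lines when you need fine-grained control: connect timeouts, response status codes, proxies, cookies, or parallel requests via curl_multi.

```php
<?php
// POST with file_get_contents() through a stream context.
function http_post(string $url, array $fields): ?string {
    $context = stream_context_create([
        'http' => [
            'method'  => 'POST',
            'header'  => "Content-Type: application/x-www-form-urlencoded\r\n",
            'content' => http_build_query($fields),
        ],
    ]);
    $output = @file_get_contents($url, false, $context);
    return ($output === false) ? null : $output;
}

// Usage (network call, commented out here):
// $output = http_post('http://www.website.com/myfile.php', ['key' => 'value']);
```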

How to resolve the error “[ErrorException] file_get_contents(/var/www/laravel/.env): failed to open stream: No such file or directory”?

流过昼夜 submitted on 2019-12-18 03:04:35

Question: I'm using Ubuntu 14.04 on my machine. I installed Composer and then Laravel in the document root, i.e. /var/www. I also gave -R 777 permission to the laravel folder in /var/www. Then I went to the directory using cd /var/www/laravel and ran php artisan, and I saw all the available commands listed. Then I typed php artisan key:generate and got the error [ErrorException] file_get_contents(/var/www/laravel/.env): failed to open stream: No such file or
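The error message points at the cause: Laravel's .env file is missing, not unreadable, so it is not a permissions problem. `php artisan key:generate` writes the key into .env with file_get_contents/file_put_contents, and a fresh install ships only .env.example. The usual fix is to copy it first; the sketch below demonstrates the copy in a scratch directory (on the real box you would run the same `cp` inside /var/www/laravel):

```shell
# Stand-in for /var/www/laravel so the copy can be shown safely:
APP_DIR=$(mktemp -d)
printf 'APP_KEY=\n' > "$APP_DIR/.env.example"

# The actual fix -- create .env from the shipped example:
cp "$APP_DIR/.env.example" "$APP_DIR/.env"
ls "$APP_DIR/.env"

# Then, in the real install:
#   cd /var/www/laravel && cp .env.example .env && php artisan key:generate
```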

file_get_contents from url that is only accessible after log-in to website

怎甘沉沦 submitted on 2019-12-17 16:11:05

Question: I would like to make a PHP script that can capture a page from a website, think file_get_contents($url). However, this website requires that you fill in a username/password log-in form before you can access any page. I imagine that once logged in, the website sends your browser an authentication cookie, and with every subsequent browser request the session info is passed back to the website to authenticate access. I want to know how I can simulate this behavior of the browser with a PHP
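The cookie dance the question describes can be reproduced with curl's cookie jar. A sketch, assuming the site uses a plain form POST (all URLs and field names below are hypothetical; inspect the real login form for its action URL and input names). The first request POSTs credentials and stores the session cookie in a jar file; the second replays that cookie to reach the protected page.

```php
<?php
// Log in, then fetch a protected page using the stored session cookie.
function fetch_behind_login(string $loginUrl, array $credentials, string $pageUrl): ?string {
    $jar = tempnam(sys_get_temp_dir(), 'cookies');

    // Step 1: POST the login form; Set-Cookie responses land in the jar.
    $ch = curl_init($loginUrl);
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => http_build_query($credentials),
        CURLOPT_COOKIEJAR      => $jar,   // write cookies here on close
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_FOLLOWLOCATION => true,
    ]);
    curl_exec($ch);
    curl_close($ch);

    // Step 2: request the protected page, sending the stored cookies back.
    $ch = curl_init($pageUrl);
    curl_setopt_array($ch, [
        CURLOPT_COOKIEFILE     => $jar,   // read cookies from here
        CURLOPT_RETURNTRANSFER => true,
    ]);
    $page = curl_exec($ch);
    curl_close($ch);
    unlink($jar);

    return ($page === false) ? null : $page;
}

// Usage (hypothetical site; network call, commented out here):
// $html = fetch_behind_login('https://example.com/login',
//                            ['username' => 'me', 'password' => 'secret'],
//                            'https://example.com/members/page');
```

Sites that add CSRF tokens to the form need an extra first GET to scrape the token before the POST; the jar mechanism stays the same.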