file-get-contents

Error handling & workarounds when allow_url_fopen and allow_url_include are disabled on the server [duplicate]

守給你的承諾、 submitted on 2019-12-11 05:36:45
Question: This question already has answers here: How to scrape websites when cURL and allow_url_fopen is disabled (4 answers). Closed 6 years ago. I've got several functions in my PHP app that rely on calls to file_get_contents(), file_put_contents(), and getimagesize(). The problem is that when allow_url_fopen and allow_url_include are disabled in php.ini, I'm getting errors on these critical functions. Warning: getimagesize() [function.getimagesize]: URL file-access is disabled in the server
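A common workaround, sketched below under the assumption that raw sockets are still allowed on the host: allow_url_fopen only restricts the URL-aware fopen wrappers, so a hand-rolled HTTP GET over fsockopen() can stand in for file_get_contents() on plain-HTTP URLs, and getimagesizefromstring() can replace getimagesize() on the fetched body. The host and path below are placeholders, not values from the question.

```php
<?php
// Minimal HTTP/1.0 GET over a raw socket; fsockopen() is not governed by
// allow_url_fopen, so it keeps working when the URL wrappers are disabled.
function http_get_via_socket(string $host, string $path, int $port = 80, int $timeout = 10): ?string
{
    $fp = fsockopen($host, $port, $errno, $errstr, $timeout);
    if ($fp === false) {
        return null; // connection failed
    }

    fwrite($fp, "GET {$path} HTTP/1.0\r\nHost: {$host}\r\nConnection: close\r\n\r\n");

    $response = '';
    while (!feof($fp)) {
        $response .= fread($fp, 8192);
    }
    fclose($fp);

    // Split headers from body and return only the body.
    $parts = explode("\r\n\r\n", $response, 2);
    return $parts[1] ?? null;
}

// Placeholder usage: fetch an image and inspect it without touching a URL wrapper.
$body = http_get_via_socket('example.com', '/picture.jpg');
if ($body !== null) {
    $info = getimagesizefromstring($body); // width/height/type from the raw bytes
    file_put_contents(__DIR__ . '/picture.jpg', $body);
}
```

This does not handle HTTPS or redirects; if the host allows the cURL extension, that is usually the cleaner route.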

file_get_contents() suddenly not working

久未见 submitted on 2019-12-11 04:49:43
Question: This is all of my code: <html> <body> <form> Playlist to Scrape: <input type="text" name="url" placeholder="Playlist URL"> <input type="submit"> </form> <?php if(isset($_GET['url'])){ $source = file_get_contents($_GET['url']); $regex = '/<a href="(.*?)" class="gothere pl-button" title="/'; preg_match_all($regex,$source,$output); echo "<textarea cols=100 rows=50>"; $fullUrl = array(); foreach($output[1] as $url){ array_push($fullUrl,"http://soundcloud.com".$url); } $final = implode(";",
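A debugging sketch for this kind of sudden failure (assumptions: allow_url_fopen is on and the playlist URL itself is valid). Many sites start rejecting PHP's default user agent or answer with a redirect or error page, so sending a browser-like User-Agent and inspecting the status line usually narrows down the cause before touching the regex.

```php
<?php
// Check what the remote server actually returns instead of assuming HTML came back.
$url = $_GET['url'] ?? '';

$context = stream_context_create([
    'http' => [
        'method'        => 'GET',
        'header'        => "User-Agent: Mozilla/5.0 (compatible; PlaylistScraper/1.0)\r\n",
        'ignore_errors' => true, // still return the body on 4xx/5xx responses
    ],
]);

$source = file_get_contents($url, false, $context);
if ($source === false) {
    die('Request failed entirely (DNS, timeout, or blocked outbound traffic).');
}

// $http_response_header is populated by the HTTP wrapper after the call.
echo $http_response_header[0] . PHP_EOL; // e.g. "HTTP/1.1 200 OK" or a redirect/error
echo strlen($source) . " bytes received\n";
```

If the status line is a redirect to HTTPS, fetching the https:// form of the URL directly is often the whole fix.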

Getting Zillow API Data

空扰寡人 submitted on 2019-12-11 03:37:58
Question: I can't access any Zillow information even though I feel that I am using their API correctly. Any help? $zillow_id = '<MY ZPID>'; $search = "2114 Bigelow Ave"; $citystate = "Seattle, WA"; $address = urlencode($search); $citystatezip = urlencode($citystate); $url = "http://www.zillow.com/webservice/GetSearchResults.htm?zws-id=$zillow_id&address=$address&citystatezip=$citystatezip"; $result = file_get_contents($url); //$data = simplexml_load_string($result); print_r($result); Edit 1: Error When
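A sketch of the next debugging step, assuming $result already holds whatever the GetSearchResults endpoint returned and that allow_url_fopen is enabled. Zillow answers with XML even for errors, so parsing it and reading the status it reports is more informative than print_r() on the raw string; the message/text and message/code element names follow Zillow's documented response layout and should be adjusted if the real payload differs.

```php
<?php
// Parse the raw XML and surface either libxml errors or the API's own status.
libxml_use_internal_errors(true);

$data = simplexml_load_string($result);
if ($data === false) {
    foreach (libxml_get_errors() as $error) {
        echo 'XML parse error: ' . trim($error->message) . PHP_EOL;
    }
    exit;
}

echo 'API message: ' . (string) $data->message->text . PHP_EOL;
echo 'API code: '    . (string) $data->message->code . PHP_EOL;
```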

failed to open stream: Connection timed out error - Themoviedb

送分小仙女□ submitted on 2019-12-11 03:33:52
Question: I need some help with the themoviedb API. I constantly get a connection timed out error. Below is my code; I'm trying to just output the raw JSON response so I can work my way from there. $header_opt = array( 'http'=>array( 'method'=>"GET", 'header'=>"Accept: application/json\r\n" . "Content-Type: application/json\r\n" ) ); $headers = stream_context_create($header_opt); $rawjson = file_get_contents('http://api.themoviedb.org/3/movie/tt0076759?api_key=myapikey', false, $headers); $cleansjon =
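A hedged sketch of two things worth trying (assumptions: a valid API key replaces the placeholder and outbound HTTPS is allowed from the server). The context in the question has no timeout, so a stalled connection hangs until PHP's default limit, and the API is also reachable over HTTPS; setting both often resolves the timeout.

```php
<?php
// Add an explicit timeout and request the HTTPS endpoint.
$apiKey = 'myapikey'; // placeholder, as in the question

$context = stream_context_create([
    'http' => [
        'method'  => 'GET',
        'header'  => "Accept: application/json\r\n",
        'timeout' => 15, // seconds before file_get_contents() gives up
    ],
]);

$rawJson = file_get_contents(
    'https://api.themoviedb.org/3/movie/tt0076759?api_key=' . urlencode($apiKey),
    false,
    $context
);

if ($rawJson === false) {
    die('Request failed; check outbound HTTPS/firewall rules and the API key.');
}

print_r(json_decode($rawJson, true));
```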

file_get_contents failed to open stream: HTTP request failed! HTTP/1.1 500 Internal Server Error

走远了吗. submitted on 2019-12-11 01:59:14
Question: I'm retrieving and saving images from an external source to my webserver by their URLs. Unfortunately, sometimes I get the following error: failed to open stream: HTTP request failed! HTTP/1.1 500 Internal Server Error. When I visit the given URL in my browser the image is shown and valid. The URL: http://****.com/00/s/NTgwWDcyNg==/z/7LEAAMXQVT9TCcxx/$_85.JPG The code which I use: $opts = array('http'=>array('header' => "User-Agent:Mozilla/5.0 (Windows NT 6.2) AppleWebKit/537.1 (KHTML, like Gecko)
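A sketch of one way to make the failure visible and recoverable, assuming the remote host either rejects bare requests or throws intermittent 500s (the URL below is a placeholder). Setting ignore_errors returns the body and status line instead of a warning, and a short retry covers transient server errors.

```php
<?php
// Fetch an image with browser-like headers, inspect the status, retry once.
$url = 'http://example.com/path/to/image.jpg'; // placeholder URL

$context = stream_context_create([
    'http' => [
        'method'        => 'GET',
        'header'        => "User-Agent: Mozilla/5.0 (Windows NT 6.2) AppleWebKit/537.1\r\n" .
                           "Accept: image/*,*/*;q=0.8\r\n",
        'ignore_errors' => true, // return body + status even on 4xx/5xx
    ],
]);

$image  = file_get_contents($url, false, $context);
$status = $http_response_header[0] ?? 'no response';

if ($image === false || strpos($status, ' 200') === false) {
    sleep(2); // transient 500s from image servers often clear on retry
    $image  = file_get_contents($url, false, $context);
    $status = $http_response_header[0] ?? 'no response';
}

if ($image !== false && strpos($status, ' 200') !== false) {
    file_put_contents(__DIR__ . '/saved.jpg', $image);
} else {
    error_log("Image fetch failed for {$url}: {$status}");
}
```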

'file_get_contents' The contents are encrypted?

拜拜、爱过 submitted on 2019-12-11 00:47:42
Question: The problem is that when I use file_get_contents to get the source (HTML) from this site, the result I receive is NOT plain HTML code. The code I used: $source = file_get_contents("http://mp3.zing.vn/bai-hat/Dance-With-My-Father-Luther-Vandross/ZWZ9D6FD.html"); echo $source; // OR print_r($source); The source I received: ��}{�#Ǒ��-��!E��=��Mv�5�B���R�����h��E�HV7YE�������a�X��p{��[�:�!{��;,v��u��Or��̬��Y��M��ʌ̌�����������F��ޖ����ػ��S� #�~��H�7k�����ʎȦ2���M?�ު&D�����t���$u�O��N���>%(Y����I��Vb�[
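That output is almost certainly not encrypted but gzip-compressed, which is the usual cause of a binary-looking result from file_get_contents. A sketch of two fixes, assuming the server either honors an Accept-Encoding request header or returns a standard gzip stream:

```php
<?php
// Fix 1: ask the server not to compress the response.
$context = stream_context_create([
    'http' => ['header' => "Accept-Encoding: identity\r\n"],
]);
$source = file_get_contents(
    'http://mp3.zing.vn/bai-hat/Dance-With-My-Father-Luther-Vandross/ZWZ9D6FD.html',
    false,
    $context
);

// Fix 2: if the body still starts with the gzip magic bytes, decompress it
// (gzdecode() is available from PHP 5.4).
if ($source !== false && substr($source, 0, 2) === "\x1f\x8b") {
    $source = gzdecode($source);
}

echo $source;
```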

PHP readfile or file_get_contents in a loop

醉酒当歌 submitted on 2019-12-11 00:22:28
Question: There may be other ways to do this, but I'm looking for a fairly easy set-up because it's basically a one-time process. I have 50 state directories with a handful of txt files in each one. I want to step into each directory, "read" each file (and then do a MySQL insert of each file). I've tried a few variations, but each time I try to loop through and use readfile or file_get_contents it breaks after the first file and I get failed to open stream: errors on the remainder of the list. I've
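A sketch of the loop, with the usual cause of this symptom in mind: relative paths that stop resolving after the first directory. Building absolute paths with glob() sidesteps that. The base directory, database credentials, and table layout below are placeholders, not details from the question.

```php
<?php
// Walk every state directory, read each .txt file, and insert its contents.
$baseDir = __DIR__ . '/states'; // placeholder location of the 50 directories

$pdo  = new PDO('mysql:host=localhost;dbname=test;charset=utf8mb4', 'user', 'pass'); // placeholder DSN
$stmt = $pdo->prepare('INSERT INTO state_files (path, contents) VALUES (?, ?)');

foreach (glob($baseDir . '/*', GLOB_ONLYDIR) as $stateDir) {
    foreach (glob($stateDir . '/*.txt') as $file) {   // absolute paths, so no chdir() needed
        $contents = file_get_contents($file);
        if ($contents === false) {
            echo "Could not read {$file}\n";
            continue;
        }
        $stmt->execute([$file, $contents]);
    }
}
```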

Parse Web Page content using PHP

 ̄綄美尐妖づ submitted on 2019-12-10 23:39:13
Question: I think it's a simple question, but I've done what I know and it still doesn't work. I want to get the output from this link: http://api.microsofttranslator.com/V2/Ajax.svc/Translate?text=siapa+rektor+ipb&appId=58C40548A812ED699C35664525D8A8104D3006D2&from=id&to=en You can paste it into the browser and look at it; there is some text output. I've tried some PHP functions like file_get_contents and cURL. I'm not using AJAX or JavaScript because I'm not an expert with them. Lastly, I'm working with XAMPP. Answer 1:
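A hedged sketch using cURL, since XAMPP ships with the extension although it may need to be enabled in php.ini; whether the appId in the URL is still accepted by the service is outside the sketch's control. Reporting curl_error() shows whether the request itself fails or the response just needs cleaning up.

```php
<?php
// Plain cURL GET against the translator endpoint with basic error reporting.
$url = 'http://api.microsofttranslator.com/V2/Ajax.svc/Translate'
     . '?text=' . urlencode('siapa rektor ipb')
     . '&appId=58C40548A812ED699C35664525D8A8104D3006D2'
     . '&from=id&to=en';

$ch = curl_init($url);
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_FOLLOWLOCATION => true,
    CURLOPT_TIMEOUT        => 15,
]);

$response = curl_exec($ch);
if ($response === false) {
    echo 'cURL error: ' . curl_error($ch);
} else {
    // The Ajax endpoint tends to prefix the quoted result with a UTF-8 BOM;
    // trim those bytes and the surrounding quotes if they appear.
    echo trim($response, "\xEF\xBB\xBF\" \r\n");
}
curl_close($ch);
```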

Parse a string instead of a file for a Laravel Blade template

痞子三分冷 submitted on 2019-12-10 20:59:18
Question: I need to cache a remote Blade template generated by a CMS to keep an application's public interface up to date. Ideally I would be able to use file_get_contents and a cache to check for updates to this once a week. Is there any way to get Laravel to use the contents of a variable instead of a file as a Blade template? Answer 1: I could not find a way to get Laravel to parse a string as a Blade template, so I developed this workaround, which stores the remote template as a local file.
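A sketch of what such a workaround can look like, with hypothetical names throughout: the helper function, the views/remote/ location, and the one-week staleness rule are illustrative choices, not code from the answer.

```php
<?php
// Cache a CMS-hosted Blade template as a local view file, refreshed weekly.
use Illuminate\Support\Facades\File;

function remoteBladeView(string $url): string
{
    $path = resource_path('views/remote/cms-template.blade.php');

    // Re-fetch when the local copy is missing or older than a week.
    $isStale = !File::exists($path) || File::lastModified($path) < strtotime('-1 week');
    if ($isStale) {
        $contents = file_get_contents($url);
        if ($contents !== false) {
            if (!File::isDirectory(dirname($path))) {
                File::makeDirectory(dirname($path), 0755, true);
            }
            File::put($path, $contents);
        }
    }

    return 'remote.cms-template'; // dot notation for resources/views/remote/cms-template.blade.php
}

// Usage from a controller: return view(remoteBladeView($cmsUrl), $data);
```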

explode() doesn't separate a string by spaces?

我们两清 submitted on 2019-12-10 18:09:13
Question: I am attempting to open a text file and explode the list of names within it into an array, but when I var_dump the new array I get this: array(1) { [0]=> string(61) "name name name name " } The entire contents of the list go into one single key field in the array. This is the code I'm using: $ingame_list = file_get_contents('../../../../home/folder/folder/list.txt'); $newarray = explode(" ", $ingame_list); var_dump($newarray); How can I get each name to be in its own key field within the new
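A sketch of the usual fix, assuming the names in list.txt are separated by something other than a plain ASCII space (newlines, tabs, or non-breaking spaces), which would explain why explode(" ") leaves everything in one element:

```php
<?php
// Read the list, normalize odd whitespace, then split on any whitespace run.
$ingame_list = file_get_contents('../../../../home/folder/folder/list.txt');
if ($ingame_list === false) {
    die('Could not read list.txt');
}

// Turn UTF-8 non-breaking spaces into ordinary spaces, then split and drop empties.
$ingame_list = str_replace("\xC2\xA0", ' ', $ingame_list);
$names = preg_split('/\s+/', trim($ingame_list), -1, PREG_SPLIT_NO_EMPTY);

var_dump($names);
```

If the split still misbehaves, dumping bin2hex(substr($ingame_list, 0, 40)) shows exactly which byte sits between the names.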