Is there a limit on PHP file_get_contents? [closed]

Asked by 非 Y 不嫁゛ on 2019-12-08 15:40:07

Question


I am trying to read a large file (10M) using PHP's file_get_contents():

$file = 'http://www.remoteserver.com/test.txt';
$data = file_get_contents( $file );
var_dump ( $data );

It dumps back

string(32720)

followed by only part of the file's contents. Is there a limit somewhere on file_get_contents()? I tried ini_set('memory_limit', '512M'), but that did not work.

EDIT: I forgot to mention that it's a remote file.

PROBLEM RESOLVED: I was out of HDD space. Fixed that and now everything works.


Answer 1:


Assuming the contents of the file you want to load are logically separated by line breaks (i.e., it is not a binary file), you might be better off reading it line by line.

$fp = fopen($path_to_file, "r");
$fileLines = array();
while (!feof($fp)) {
  $line = fgets($fp);      // reads one line per call, including the trailing newline
  if ($line !== false) {   // fgets() returns false at EOF or on error
    array_push($fileLines, $line);
  }
}
fclose($fp);
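
Note that fopen() also accepts HTTP URLs when allow_url_fopen is enabled in php.ini, so the same loop can read the remote file from the question. A minimal sketch, reusing the question's URL (only the fopen() call changes):

// Assumes allow_url_fopen = On in php.ini; the read loop above is unchanged.
$fp = fopen('http://www.remoteserver.com/test.txt', 'r');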

You could always implode() the array (with your choice of line-break character) back into a single string if you really need the file in one "chunk".
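
For instance, a minimal sketch: since fgets() keeps each line's trailing newline, joining with an empty glue string reassembles the contents exactly as they were read.

// fgets() preserves each line's trailing newline, so an empty
// glue string reconstructs the file contents as read.
$data = implode('', $fileLines);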

Reference -

  • fgets(): https://www.php.net/manual/en/function.fgets.php
  • implode(): https://www.php.net/manual/en/function.implode.php


Source: https://stackoverflow.com/questions/12628801/is-there-a-limit-on-php-file-get-contents
