Sending large data from a server to another


Question


I am using cURL to send large amounts of data between servers, using POST. Is this OK, or is there a better/standard way to send large serialized data with cURL?

The problem is the post_max_size setting in PHP; I have to change it (the default is 2MB). I haven't run into any problems with this yet, but once the system is online it is possible that more than 50MB of data will be sent each time!
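
Roughly what that call looks like on the sending side (the URL, field name and sample payload below are placeholders, not code from the question):

    <?php
    // Placeholder payload standing in for the real serialized data.
    $payload = serialize(range(1, 100000));

    $ch = curl_init('https://receiver.example.com/import.php'); // placeholder URL
    curl_setopt($ch, CURLOPT_POST, true);
    // Sent as an ordinary form field, so the whole body counts against
    // post_max_size on the receiving server.
    curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query(['data' => $payload]));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);

    $response = curl_exec($ch);
    if ($response === false) {
        echo 'cURL error: ' . curl_error($ch);
    }
    curl_close($ch);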

Any ideas? Thank you.

EDIT:

I am sending DATA, not FILES: data that, once received, should be processed by the second server and saved to a database or file (or used to trigger some action), and the receiver might need to send a response after processing it.

I would just like to know: will I face any other problems besides post_max_size? (Forget about cURL and PHP timeouts.) And is there any way to make the server ignore post_max_size, maybe by using PUSH or PUT? Does post_max_size affect PUSH or PUT, and how would I use that via cURL? So many questions!
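
One commonly suggested workaround is to send the payload as a raw request body (POST or PUT) and read php://input on the receiving server instead of $_POST. Whether post_max_size also cuts off the raw input stream depends on the PHP version and SAPI, so treat this as a sketch to test, not a guarantee; the destination path is a placeholder:

    <?php
    // Receiving side: copy the raw request body (POST or PUT) to disk in
    // chunks instead of pulling it all into memory at once.
    // post_max_size (and any web-server body limits) are configured in
    // php.ini / .htaccess / the server config; they cannot be raised
    // with ini_set() at runtime.
    $in  = fopen('php://input', 'rb');
    $out = fopen('/tmp/incoming.payload', 'wb'); // placeholder destination

    while (!feof($in)) {
        fwrite($out, fread($in, 8192)); // 8 KB at a time
    }
    fclose($in);
    fclose($out);

    // ... unserialize / process the stored payload here, then reply.
    echo 'OK';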


Answer 1:


Using cURL is perfectly fine.

Personally, I would prefer not to do it through the web server (e.g. Apache), as there are too many potential points of failure along the way, e.g. PHP timeouts, web server timeouts, memory limits, no write privileges, being limited to the web root, etc.

I would prefer to do it through mechanisms designed for file transfers:

  • FTP
  • scp (secure copy over SSH)
  • Dropbox (there are APIs)
  • Amazon S3 (simple API with PHP library)
  • etc.
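
For example, the FTP route can reuse the same cURL extension you already have; a minimal sketch, where the host, credentials and paths are placeholders:

    <?php
    // Hypothetical FTP upload with cURL; host, credentials and paths
    // below are placeholders.
    $localFile = '/tmp/payload.ser';
    file_put_contents($localFile, serialize(range(1, 100000))); // stand-in data

    $fp = fopen($localFile, 'rb');
    $ch = curl_init('ftp://ftp.example.com/inbox/payload.ser');
    curl_setopt($ch, CURLOPT_USERPWD, 'user:secret');
    curl_setopt($ch, CURLOPT_UPLOAD, true);
    curl_setopt($ch, CURLOPT_INFILE, $fp);
    curl_setopt($ch, CURLOPT_INFILESIZE, filesize($localFile));

    if (curl_exec($ch) === false) {
        echo 'Upload failed: ' . curl_error($ch);
    }
    curl_close($ch);
    fclose($fp);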



Answer 2:


That approach is fine.

Two more ideas for you:

  1. Use FTP (you can upload the large serialized files to an FTP server that is reachable from both of your servers).
  2. Use MySQL (you can store the large serialized content on a MySQL server); see the sketch below.
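
A minimal sketch of idea 2, assuming a staging table both servers can reach; the DSN, credentials and table name are placeholders:

    <?php
    // Hypothetical sketch: stage the serialized payload in a shared MySQL
    // table. The table would need a LONGBLOB/LONGTEXT `payload` column,
    // and MySQL's max_allowed_packet must be large enough for one insert.
    $pdo = new PDO('mysql:host=db.example.com;dbname=transfer', 'user', 'secret');
    $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

    $payload = serialize(range(1, 100000)); // stand-in for the real data

    $stmt = $pdo->prepare('INSERT INTO staged_payloads (payload) VALUES (:payload)');
    $stmt->bindValue(':payload', $payload, PDO::PARAM_LOB);
    $stmt->execute();

    // The second server reads the row, unserializes and processes the
    // payload, and can write its result back the same way.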


Source: https://stackoverflow.com/questions/11613522/sending-large-data-from-a-server-to-another
