curl-multi

Asynchronous/parallel HTTP requests using PHP curl_multi

99封情书 submitted on 2019-12-06 14:11:29
Question: I recently looked into the possibility of making multiple requests with cURL. I may not be understanding it fully, so I am just hoping to clarify some concepts. It's definitely a good option if you are fetching content from multiple sources; that way, you can start processing the results from faster servers while still waiting for slower ones. Does it still make sense to use it if you are requesting multiple pages from the same server? Would the server still serve multiple pages at the same time…
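
The question is conceptual, so here is a minimal sketch of parallel fetches against a single host with curl_multi, for concreteness. The URLs are placeholders, and the per-host connection cap via CURLMOPT_MAX_HOST_CONNECTIONS assumes PHP 7.0.7+ with libcurl 7.30+:

<?php
// Fetch several pages from one host in parallel, capping simultaneous
// connections to that host (placeholder URLs, hypothetical cap of 2).
$urls = array('http://example.com/a', 'http://example.com/b', 'http://example.com/c');

$mh = curl_multi_init();
curl_multi_setopt($mh, CURLMOPT_MAX_HOST_CONNECTIONS, 2);

$handles = array();
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh);          // wait for socket activity instead of spinning
} while ($running > 0);

foreach ($handles as $ch) {
    echo curl_multi_getcontent($ch), "\n";
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);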

Simulating a cookie-enabled browser in PHP

岁酱吖の submitted on 2019-12-06 04:11:36
Question: How can I open a web page and receive its cookies using PHP? The motivation: I am trying to use feed43 to create an RSS feed from the non-RSS-enabled HighLearn website (a remote-learning website). I found the web page that contains the feed contents I need to parse; however, it requires logging in first. Luckily, logging in can be done via a GET request, so it's as easy as fopen()ing "http://highlearn.website/login_page.asp?userID=foo&password=bar", for example. But I still need to get the cookies generated when I logged in, pass the cookies to the real client (using setcookie() maybe?) and then…
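
A minimal sketch of the usual cURL approach: let a cookie jar capture the Set-Cookie headers from the login request and replay them on later requests. The login URL is the question's placeholder, and feed_page.asp is a hypothetical second page:

<?php
$jar = tempnam(sys_get_temp_dir(), 'cookies');

$ch = curl_init('http://highlearn.website/login_page.asp?userID=foo&password=bar');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_COOKIEJAR, $jar);   // persist cookies received on login
curl_setopt($ch, CURLOPT_COOKIEFILE, $jar);  // enable the cookie engine for later requests
curl_exec($ch);

// Reuse the same handle (and its cookies) for the authenticated page.
curl_setopt($ch, CURLOPT_URL, 'http://highlearn.website/feed_page.asp');
$feedHtml = curl_exec($ch);
curl_close($ch);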

cURL using multiple proxies in a chain

主宰稳场 submitted on 2019-12-06 01:14:11
Question: Is it possible to chain multiple proxies in a single request using cURL? For example: start cURL -> proxy1 -> proxy2 -> destination address. Can this be achieved using cURL? Answer 1: A proxy is by definition a middle man: software running and doing work between the client and the server. The client asks the proxy, which in turn asks the server. "Chained" proxies would then imply that a first proxy would ask a second proxy, but since the client asks the first proxy to do its request, it cannot be the job of the client to ask the second proxy; it must be the first proxy's task. Alas: sure, if you have a…
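
For what cURL itself can express: a transfer normally gets exactly one proxy via CURLOPT_PROXY, and newer builds (PHP 7.3+ with libcurl 7.52+) can additionally insert one SOCKS "pre-proxy" hop via CURLOPT_PRE_PROXY, giving a two-hop chain at most. A sketch with placeholder proxy addresses:

<?php
$ch = curl_init('http://destination.example/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_PROXY, 'http://proxy2.example:8080');           // the proxy that reaches the destination
if (defined('CURLOPT_PRE_PROXY')) {
    curl_setopt($ch, CURLOPT_PRE_PROXY, 'socks5://proxy1.example:1080'); // optional first hop
}
$response = curl_exec($ch);
curl_close($ch);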

PHP multi cURL performance worse than sequential file_get_contents

随声附和 submitted on 2019-12-05 19:05:18
Question: I am writing an interface in which I must launch 4 HTTP requests to get some information. I implemented the interface in 2 ways: using sequential file_get_contents, and using multi cURL. I benchmarked the 2 versions with JMeter. The result shows that multi cURL is much better than sequential file_get_contents when there's only 1 thread in JMeter making requests, but much worse with 100 threads. The question is: what could cause the poor performance of multi cURL? My multi cURL code is as below: $curl_handle_arr = array(); $master = curl_multi_init(); foreach ($call_url_arr as $key => $url)…
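
The snippet is cut off above, so here is a hedged completion of it: $curl_handle_arr, $master, and $call_url_arr are the question's own names, and the drive loop is one common pattern. A frequent cause of multi cURL degrading under load is spinning on curl_multi_exec() without curl_multi_select(), which burns CPU that 100 concurrent JMeter threads then fight over:

<?php
$curl_handle_arr = array();
$master = curl_multi_init();

foreach ($call_url_arr as $key => $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($master, $ch);
    $curl_handle_arr[$key] = $ch;
}

do {
    curl_multi_exec($master, $running);
    curl_multi_select($master);      // sleep until a socket is ready; avoids a busy-wait
} while ($running > 0);

$results = array();
foreach ($curl_handle_arr as $key => $ch) {
    $results[$key] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($master, $ch);
    curl_close($ch);
}
curl_multi_close($master);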

Get all the URLs using multi curl

馋奶兔 submitted on 2019-12-05 04:56:29
Question: I'm working on an app that gets all the URLs from an array of sites and displays them in array form or as JSON. I can do it using a for loop; the problem is the execution time: when I tried 10 URLs it gave me an error saying the maximum execution time was exceeded. Upon searching, I found multi cURL, and I also found this: Fast PHP CURL Multiple Requests: Retrieve the content of multiple URLs using CURL. I tried to add my code but it didn't work because I don't know how to use the function. Hope you can help me. Thanks. This is my sample code: <?php $urls=array( 'http://site1.com/', 'http://site2.com/', 'http://site3.com…
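
A sketch wiring the question's $urls array into curl_multi so the fetches overlap instead of running one by one (extracting links from each fetched page is left out; this only retrieves the HTML):

<?php
$urls = array(
    'http://site1.com/',
    'http://site2.com/',
    'http://site3.com/',
);

$mh = curl_multi_init();
$handles = array();
foreach ($urls as $i => $url) {
    $handles[$i] = curl_init($url);
    curl_setopt($handles[$i], CURLOPT_RETURNTRANSFER, true);
    curl_setopt($handles[$i], CURLOPT_FOLLOWLOCATION, true);
    curl_multi_add_handle($mh, $handles[$i]);
}

do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh);
} while ($running > 0);

$pages = array();
foreach ($handles as $i => $ch) {
    $pages[$urls[$i]] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);

echo json_encode(array_keys($pages));   // or process $pages into the desired array/JSON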

What's the fastest way to scrape a lot of pages in php?

不问归期 submitted on 2019-12-03 20:38:15
Question: I have a data aggregator that relies on scraping several sites and indexing their information in a way that is searchable by the user. I need to be able to scrape a vast number of pages daily, and I have run into problems using simple cURL requests, which are fairly slow when executed in rapid sequence for a long time (the scraper runs 24/7, basically). Running a multi cURL request in a simple while loop is fairly slow. I sped it up by doing individual cURL requests in a background process, which works faster, but sooner or later the slower requests start piling up, which ends up crashing…
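
One standard answer to the pile-up problem is a bounded "rolling" window: never more than a fixed number of transfers in flight, refill a slot as soon as a transfer finishes, and give every request a hard timeout so slow ones cannot accumulate. A sketch with a hypothetical $queue of URLs:

<?php
$queue  = array('http://example.com/1', 'http://example.com/2' /* ... */);
$window = 10;                                        // max transfers in flight

$mh  = curl_multi_init();
$add = function ($url) use ($mh) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);           // hard cap so slow requests can't pile up
    curl_multi_add_handle($mh, $ch);
};

foreach (array_splice($queue, 0, $window) as $url) {
    $add($url);
}

do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh);
    while ($info = curl_multi_info_read($mh)) {      // a transfer finished
        $html = curl_multi_getcontent($info['handle']);
        // ... index $html here ...
        curl_multi_remove_handle($mh, $info['handle']);
        curl_close($info['handle']);
        if ($queue) {
            $add(array_shift($queue));               // refill the freed slot
            curl_multi_exec($mh, $running);          // kick off the new transfer
        }
    }
} while ($running > 0 || $queue);

curl_multi_close($mh);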

AWS S3 batch upload from localhost php error

陌路散爱 submitted on 2019-12-03 20:36:04
Question: I am trying to batch/bulk upload from localhost (XAMPP) to my S3 bucket. It seems to work for about 6 items, then I get an error message. The cURL error says "Failed sending network data" (from http://curl.haxx.se/libcurl/c/libcurl-errors.html): Fatal error: Uncaught exception 'cURL_Multi_Exception' with message 'cURL resource: Resource id #34; cURL error: SSL_write() returned SYSCALL, errno = 0 (cURL error code 55). See http://curl.haxx.se/libcurl/c/libcurl-errors.html for an explanation of error codes.' in D:\xampp\htdocs\path\to\my\files\sdk-1.5.14\lib\requestcore\requestcore.class.php on line…
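
cURL error 55 is usually a transient network/SSL write failure, so one pragmatic mitigation (not the SDK's own retry mechanism, just a generic sketch) is to wrap each single-object upload in a retry with backoff. $uploadOne below is a hypothetical callable around whatever upload call the SDK provides:

<?php
function uploadWithRetry(callable $uploadOne, $maxAttempts = 3)
{
    for ($attempt = 1; $attempt <= $maxAttempts; $attempt++) {
        try {
            return $uploadOne();                     // success: return the result
        } catch (Exception $e) {
            if ($attempt === $maxAttempts) {
                throw $e;                            // give up after the last attempt
            }
            sleep($attempt);                         // simple linear backoff: 1s, 2s, ...
        }
    }
}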