curl-multi

PHP cURL multi_exec delay between requests

亡梦爱人 submitted on 2019-12-01 21:27:08
If I run a standard curl_multi_exec function (example below), all the cURL handles are requested at once. I would like to put a delay of 100ms between each request; is there a way to do that? (I found nothing on Google or in a Stack Overflow search.) I've tried usleep() before curl_multi_exec(), which slows down the script but does not postpone each request.

    // array of curl handles & results
    $curlies = array();
    $result = array();
    $mh = curl_multi_init();
    // set up the curl requests
    for ($id = 0; $id <= 10; $id += 1) {
        $curlies[$id] = curl_init();
        curl_setopt($curlies[$id], CURLOPT_URL, "http://google.com");
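One approach, sketched below rather than taken from the question: add handles to the multi handle one at a time inside the loop, sleeping between additions, instead of adding them all up front. The function name is mine, and note that this only staggers when each handle is handed to the multi handle; libcurl may still batch some connection work internally.

```php
<?php
// Sketch: stagger the start of each request by ~100ms instead of
// adding every handle to the multi handle up front.
function multi_exec_staggered(array $urls, int $gapUs = 100000): array
{
    $mh = curl_multi_init();
    $handles = [];
    $running = 0;

    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_multi_add_handle($mh, $ch);
        $handles[$url] = $ch;
        // Drive the transfers already added, then wait before
        // starting the next one, so requests begin ~100ms apart.
        curl_multi_exec($mh, $running);
        usleep($gapUs);
    }

    // Finish whatever is still in flight.
    do {
        curl_multi_exec($mh, $running);
        curl_multi_select($mh, 0.1);
    } while ($running > 0);

    $results = [];
    foreach ($handles as $url => $ch) {
        $results[$url] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $results;
}
```

With a 100ms gap and 10 URLs, the last request starts roughly a second after the first, while earlier transfers keep progressing during the gaps.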

How to reliably reproduce curl_multi timeout while testing public proxies

末鹿安然 submitted on 2019-12-01 15:33:41
Relevant information: issue 3602 on GitHub. I'm working on a project that gathers and tests public/free proxies, and noticed that when I use the curl_multi interface for testing these proxies, I sometimes get many 28 (timeout) errors. This never happens if I test every proxy alone. The problem is that this issue is not reliably reproducible and does not always show up; it could be something in curl or something else. Unfortunately, I'm not a deep network debugger and I don't know how to debug this issue on a deeper level; however, I wrote 2 C test programs (one of them is originally
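The question's test programs are in C, but the same per-handle timeout options exist in PHP's curl binding. A sketch of one diagnostic step when many error-28 results appear: give each proxy test separate connect and total timeouts, so a dead proxy (connect never completes) can be told apart from a slow transfer. The function name, proxy address, and URL below are placeholders.

```php
<?php
// Sketch: test one proxy with separate connect and total timeouts.
// Returns the curl error number (28 = timeout, 0 = success).
function check_proxy(string $proxy, string $url): int
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_PROXY, $proxy);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5); // fail fast on dead proxies
    curl_setopt($ch, CURLOPT_TIMEOUT, 15);       // overall cap per transfer
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_exec($ch);
    $errno = curl_errno($ch); // 28 is CURLE_OPERATION_TIMEDOUT
    curl_close($ch);
    return $errno;
}

// e.g. check_proxy('203.0.113.1:8080', 'http://example.com/');
```

If a proxy times out when tested in a batch but not alone, comparing where the time goes (connect vs. transfer, via curl_getinfo timing fields) is a reasonable next step before blaming curl itself.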

Are there any alternatives to shell_exec and proc_open in PHP?

强颜欢笑 submitted on 2019-12-01 12:32:51
It seems like I can't use shell_exec or proc_open on my shared server. The message I get when I try to use it is:

    Warning: shell_exec() has been disabled for security reasons in /home/georgee/public_html/admin/email.php on line 4

Are there any alternatives to these functions?

Gordon: I assume you want to use this for async processing, for instance sending emails in a separate process (hence the error in email.php). If so, please check whether cURL is enabled. You can trigger your scripts through an HTTP request without waiting for the response. Further reading: Asynchronous/parallel HTTP requests
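The fire-and-forget idea from the answer can be sketched without cURL at all, using a raw socket: send the HTTP request to your own server and close the connection without reading the response. The function name, host, and path here are illustrative.

```php
<?php
// Sketch: trigger a script over HTTP without waiting for the response.
function trigger_async(string $host, string $path): void
{
    $fp = fsockopen($host, 80, $errno, $errstr, 2); // 2s connect timeout
    if ($fp === false) {
        return; // could not connect; log $errno / $errstr in real code
    }
    $req  = "GET {$path} HTTP/1.1\r\n";
    $req .= "Host: {$host}\r\n";
    $req .= "Connection: Close\r\n\r\n";
    fwrite($fp, $req);
    fclose($fp); // do not wait for the response
}

// e.g. trigger_async("example.com", "/admin/send_emails.php");
```

The calling script returns almost immediately; the triggered script runs to completion on the server side as long as it calls ignore_user_abort(true), since the client disconnects before the response is written.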

Problem with CURL (Multi)

谁都会走 submitted on 2019-12-01 01:29:55
I'm having a problem with curl_multi_*. I want to create a class/function that receives, let's say, 1000 URLs and processes them 5 at a time, so that when a URL finishes downloading, the now-available slot is allocated to a new URL that hasn't been processed yet. I've seen some implementations of curl_multi, but none of them lets me do what I want. I believe the solution lies somewhere in the usage of curl_multi_select, but the documentation isn't very clear and the user notes don't help much. Can anyone please provide me with some examples of how I can implement such a feature?
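A sketch of the "rolling window" pattern being asked for, with curl_multi_info_read used to detect finished transfers and refill the window. This is my own illustration, not code from the thread; the function name is invented.

```php
<?php
// Sketch: keep at most $window transfers active; when one finishes,
// pull the next URL from the queue and add it to the multi handle.
function rolling_curl(array $urls, int $window = 5): array
{
    $results = [];
    $mh = curl_multi_init();
    $queue = $urls;
    $active = 0;

    $add = function () use (&$queue, $mh) {
        $ch = curl_init(array_shift($queue));
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 10);
        curl_multi_add_handle($mh, $ch);
    };

    // Prime the window with the first $window URLs.
    for ($i = 0; $i < $window && $queue; $i++) {
        $add();
    }

    do {
        curl_multi_exec($mh, $active);
        // Collect finished handles and refill the window.
        while ($info = curl_multi_info_read($mh)) {
            $ch = $info['handle'];
            $results[curl_getinfo($ch, CURLINFO_EFFECTIVE_URL)] =
                curl_multi_getcontent($ch);
            curl_multi_remove_handle($mh, $ch);
            curl_close($ch);
            if ($queue) {
                $add();
                $active++; // keep the loop alive for the new transfer
            }
        }
        // Block until at least one transfer has activity (or 1s passes).
        curl_multi_select($mh, 1.0);
    } while ($active > 0 || $queue);

    curl_multi_close($mh);
    return $results;
}
```

curl_multi_select here answers the question's suspicion: it is the call that blocks efficiently until some transfer has data, so the loop doesn't spin at 100% CPU while the 5 slots are busy.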

How can I use cURL to open multiple URLs simultaneously with PHP?

和自甴很熟 submitted on 2019-11-30 02:22:12
Here is my current code:

    $SQL = mysql_query("SELECT url FROM urls") or die(mysql_error()); // query the urls table
    while ($resultSet = mysql_fetch_array($SQL)) {
        $ch = curl_init($resultSet['url']);   // load the url
        curl_setopt($ch, CURLOPT_TIMEOUT, 2); // no need to wait for it to load; execute it and go
        curl_exec($ch);                       // execute
        curl_close($ch);                      // close it off
    }

I'm relatively new to cURL. By relatively new, I mean this is my first time using cURL. Currently it loads one for two seconds, then loads the next one for
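The loop above is sequential: each curl_exec blocks for up to its 2-second timeout before the next URL starts. A sketch of the concurrent version using curl_multi follows; the function name is mine, and the URL array stands in for the mysql result set.

```php
<?php
// Sketch: fetch all URLs concurrently with curl_multi instead of
// one blocking curl_exec per URL.
function fetch_all(array $urls): array
{
    $mh = curl_multi_init();
    $handles = [];
    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 10);
        curl_multi_add_handle($mh, $ch);
        $handles[$url] = $ch;
    }

    // Drive all transfers until none are still running.
    $running = 0;
    do {
        curl_multi_exec($mh, $running);
        curl_multi_select($mh, 1.0);
    } while ($running > 0);

    $results = [];
    foreach ($handles as $url => $ch) {
        $results[$url] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $results;
}
```

The total wall time then approaches that of the slowest single request rather than the sum of all of them. (As an aside, the mysql_* functions in the question are long deprecated; mysqli or PDO would be used today.)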

How to fix curl: (35) Cannot communicate securely with peer: no common encryption algorithm(s)

孤人 submitted on 2019-11-29 22:22:28
I am trying to access and download some .torrent files from https://torrage.com using PHP curl, but nothing happens; curl_error($ch) gives the error from the title. Here is the code:

    $ch = curl_init('https://torrage.com/torrent/640FE84C613C17F663551D218689A64E8AEBEABE.torrent');
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
    curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0');
    curl_setopt($ch, CURLOPT_HEADER, 1);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_VERBOSE, true);
    $data = curl_exec($ch);
    $error = curl_error($ch);
    curl_close($ch);
    echo $error;

this
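Error 35 generally means the client and server could not agree on a TLS version or cipher suite, often because the local SSL library (NSS or OpenSSL) behind curl is outdated. A sketch of one thing worth trying before rebuilding curl: pin a newer TLS version explicitly. The function name is mine, and this is a diagnostic step, not a guaranteed fix.

```php
<?php
// Sketch: on curl error 35 ("no common encryption algorithm"),
// try forcing TLS 1.2; if the handshake still fails, the underlying
// SSL library likely needs updating rather than the PHP code.
function fetch_with_tls12(string $url)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_SSLVERSION, CURL_SSLVERSION_TLSv1_2);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $data = curl_exec($ch);
    if ($data === false) {
        // Inspect curl_errno($ch) and curl_error($ch) here.
        $data = null;
    }
    curl_close($ch);
    return $data;
}
```

CURL_SSLVERSION_TLSv1_2 requires PHP 5.5+ built against libcurl 7.34+; on older stacks the real remedy is updating curl and its SSL backend.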

Asynchronous HTTP requests in PHP

≡放荡痞女 submitted on 2019-11-27 14:41:59
Is there any sane way to make an HTTP request asynchronously in PHP without throwing out the response? I.e., something similar to AJAX: the PHP script initiates the request, does its own thing, and later, when the response is received, a callback function/method or another script handles the response. One approach has crossed my mind: spawning a new PHP process with another script for each request; the second script does the request, waits for the response, and then parses the data and does whatever it should, while the original script goes on spawning new processes. I have doubts, though,
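Within a single PHP process, the closest approximation of the AJAX-style pattern described above is curl_multi with a completion callback: start the requests, keep doing other work, and invoke the callback as each response arrives. A sketch, with an invented function name:

```php
<?php
// Sketch: poll curl_multi and invoke a user callback per response,
// letting the script do other work between polls.
function async_requests(array $urls, callable $onResponse): void
{
    $mh = curl_multi_init();
    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_multi_add_handle($mh, $ch);
    }

    $running = 0;
    do {
        curl_multi_exec($mh, $running);

        // ... the script can do its own thing here between polls ...

        // Hand each completed response to the callback.
        while ($info = curl_multi_info_read($mh)) {
            $ch = $info['handle'];
            $onResponse(curl_getinfo($ch, CURLINFO_EFFECTIVE_URL),
                        curl_multi_getcontent($ch));
            curl_multi_remove_handle($mh, $ch);
            curl_close($ch);
        }
        curl_multi_select($mh, 0.5);
    } while ($running > 0);

    curl_multi_close($mh);
}
```

This is cooperative rather than truly concurrent (the callback runs in the same process whenever the loop polls), so it suits I/O-bound batches; for CPU-heavy post-processing, the spawn-a-worker idea in the question remains the better fit.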