curl-multi

cURL with multi interface for many connections with proxy

佐手、 submitted on 2020-01-17 10:19:37
Question: I need to check many proxies from a list against one website. I decided to use libcurl to do this. I used this example and modified it according to my needs. Here is my code:

#include <cstdio>
#include <cstring>
#include <fstream>
#include <string>
#include <iostream>
#include <curl/curl.h>
/* somewhat unix-specific */
#include <sys/time.h>
#include <unistd.h>

using namespace std;

CURL *handles[100];

CURL *createProxyHandle(string proxyData) {
    CURL *handle = curl_easy_init();
    curl_slist *
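The excerpt ends before the proxy handles are configured. For reference, a minimal sketch of the same pattern in PHP (the proxies.txt file, its host:port line format, and the target URL are assumptions) gives each proxy its own easy handle, all pointed at the same site, and drives them with one multi handle:

// Sketch: check every proxy in a list against a single target URL.
$proxies = file('proxies.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
$target  = 'https://example.com/';

$mh = curl_multi_init();
$handles = array();

foreach ($proxies as $proxy) {
    $ch = curl_init($target);
    curl_setopt($ch, CURLOPT_PROXY, $proxy);          // e.g. "127.0.0.1:8080"
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);
    curl_multi_add_handle($mh, $ch);
    $handles[$proxy] = $ch;
}

do {
    $status = curl_multi_exec($mh, $active);
    if ($active) {
        curl_multi_select($mh);
    }
} while ($active && $status == CURLM_OK);

foreach ($handles as $proxy => $ch) {
    // HTTP code 0 means the transfer (and therefore the proxy) failed.
    echo $proxy . ' => HTTP ' . curl_getinfo($ch, CURLINFO_HTTP_CODE) . "\n";
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);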

Link-Checking with Multi-Curl

[亡魂溺海] submitted on 2020-01-14 03:19:21
Question: I'm building a link-checker function that checks whether a link returns code 200/301/302. I want to check about 1000 links, so I used the multi-cURL functionality, and I do get all the headers, status codes, and the URL to which a URL redirected. The problem is that multi-cURL executes in parallel, adding all the URLs via curl_multi_add_handle, and returns the results it gets and ignores the rest. I know from the header which result I got back, but I don't know which URL produced it. Is there an identifier which URL
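The question is cut off here. A common way to tie each result back to the URL that produced it (a sketch, not necessarily the accepted answer) is to stash the original URL on the handle itself with CURLOPT_PRIVATE and read it back with curl_getinfo() once curl_multi_info_read() reports the transfer as done:

$mh = curl_multi_init();
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_NOBODY, true);       // status code is enough for a link check
    curl_setopt($ch, CURLOPT_PRIVATE, $url);      // remember which URL this handle belongs to
    curl_multi_add_handle($mh, $ch);
}

do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh);
} while ($running > 0);

while ($info = curl_multi_info_read($mh)) {
    $ch   = $info['handle'];
    $url  = curl_getinfo($ch, CURLINFO_PRIVATE);  // the URL that produced this result
    $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    echo "$url => $code\n";
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}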

PHP curl_multi_getcontent partial body received

懵懂的女人 submitted on 2020-01-05 17:23:42
Question: I'm struggling with this problem. The body I am fetching is not a big one, 3100 characters. The Apache logs on the server say the content length was 3100. However, the string returned by curl_multi_getcontent() was cut to 1290 characters. Usually curl_multi_getcontent() works fine, but sometimes we get this weird behaviour. Any ideas? Answer 1: This one kicked my butt. It seems to be a bug in PHP 5's (multi?) curl system. I encountered this bug while using the rolling-curl multi-curl lib, but the underlying
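The answer is truncated here. One defensive workaround (a sketch, not the fix the answer goes on to describe) is to compare the length of what curl_multi_getcontent() returned with the number of bytes cURL itself reports having downloaded, and to retry that single URL with a plain easy handle on mismatch:

$body     = curl_multi_getcontent($ch);
$expected = curl_getinfo($ch, CURLINFO_SIZE_DOWNLOAD);   // bytes cURL says it received

if ($expected > 0 && strlen($body) < $expected) {
    // Partial body: fall back to a one-off easy handle for this URL.
    $retry = curl_init(curl_getinfo($ch, CURLINFO_EFFECTIVE_URL));
    curl_setopt($retry, CURLOPT_RETURNTRANSFER, true);
    $body = curl_exec($retry);
    curl_close($retry);
}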

PHP cURL multi_exec delay between requests

蓝咒 submitted on 2019-12-31 00:02:10
Question: If I run a standard curl_multi_exec() function (example below), all cURL handles are requested at once. I would like to put a delay of 100 ms between each request; is there a way to do that? (Nothing found via Google & Stack Overflow search.) I've tried usleep() before curl_multi_exec(), which slows down the script but does not postpone each request.

// array of curl handles & results
$curlies = array();
$result = array();
$mh = curl_multi_init();

// setup curl requests
for ($id = 0; $id <= 10;
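The excerpt stops before any answer. One way to get the 100 ms spacing (a sketch, assuming the easy handles have already been prepared in $handles) is to keep the multi handle running but feed it one new handle every 100 ms instead of adding them all up front:

$mh     = curl_multi_init();
$queue  = $handles;                  // easy handles not yet added to the batch
$active = 0;

while ($queue || $active) {
    if ($queue) {
        curl_multi_add_handle($mh, array_shift($queue));
        usleep(100000);              // 100 ms between request starts
    } elseif ($active) {
        curl_multi_select($mh, 1.0); // nothing left to add; wait for activity
    }
    do {
        $status = curl_multi_exec($mh, $active);
    } while ($status == CURLM_CALL_MULTI_PERFORM);
}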

Are there any alternatives to shell_exec and proc_open in PHP?

醉酒当歌 submitted on 2019-12-30 11:33:09
Question: It seems I can't use shell_exec or proc_open on my shared server. The message I get when I try to use them is: Warning: shell_exec() has been disabled for security reasons in /home/georgee/public_html/admin/email.php on line 4. Are there any alternatives to these functions? Answer 1: I assume you want to use this for async processing, for instance sending emails in a separate process (hence the error in email.php). If so, please check whether cURL is enabled. You can trigger your scripts through an
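The answer breaks off here; the pattern it appears to be leading towards (a sketch, with the worker URL as a placeholder) is a fire-and-forget HTTP request to one of your own scripts, using a very short timeout so the calling page does not wait for the work to finish:

// Kick off a worker script over HTTP instead of spawning a process.
$ch = curl_init('https://example.com/admin/send_emails.php');  // hypothetical worker URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT_MS, 200);   // give up quickly; the worker keeps running server-side
curl_setopt($ch, CURLOPT_NOSIGNAL, true);    // needed for sub-second timeouts on some builds
curl_exec($ch);
curl_close($ch);

For this to work, the worker script would typically call ignore_user_abort(true) at the top so it keeps running after the caller disconnects.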

curl_multi_getcontent returns empty string

久未见 submitted on 2019-12-25 08:02:15
Question: This question is very similar to "PHP curl_multi_getcontent returns null", but I could not find a solution there. If I try to echo the result of the function, which should contain the request response, I get an empty string ("") instead. Surely I am doing something wrong in my code, but I can't put my finger on it. Can anyone help?

$id = "stuff";
$password = "mcmuffin";
$data = json_decode(file_get_contents('php://input'), true);
$ch = array();

// build the individual requests, but do not
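The code is cut off before the multi loop, so this is only a guess, but the two most common causes of an empty curl_multi_getcontent() result are a missing CURLOPT_RETURNTRANSFER and reading the content before the transfers have finished. A sketch of the shape that avoids both (with $requests as a placeholder for the URLs being built above):

$mh = curl_multi_init();
foreach ($requests as $key => $url) {
    $ch[$key] = curl_init($url);
    curl_setopt($ch[$key], CURLOPT_RETURNTRANSFER, true);  // without this, getcontent() has nothing to return
    curl_multi_add_handle($mh, $ch[$key]);
}

// Run the transfers to completion before reading anything.
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh);
} while ($running > 0);

foreach ($ch as $key => $handle) {
    $responses[$key] = curl_multi_getcontent($handle);     // full body is available only now
    curl_multi_remove_handle($mh, $handle);
    curl_close($handle);
}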

How can I send GET data to multiple URLs at the same time using cURL?

亡梦爱人 submitted on 2019-12-23 20:33:19
Question: My apologies, I've actually asked this question multiple times but never quite understood the answers. Here is my current code:

while ($resultSet = mysql_fetch_array($SQL)) {
    $ch = curl_init($resultSet['url'] . $fullcurl); // load the URLs and send GET data
    curl_setopt($ch, CURLOPT_TIMEOUT, 2);           // only load it for two seconds (long enough to send the data)
    curl_exec($ch);                                 // execute the cURL request
    curl_close($ch);                                // close it off
} // end while loop

What I'm doing here is taking URLs from a MySQL
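The excerpt ends before the answers; the usual suggestion for this (a sketch, reusing the $SQL and $fullcurl variables from the question) is to register every handle with curl_multi so all the GET requests run concurrently instead of one after another:

$mh      = curl_multi_init();
$handles = array();

while ($resultSet = mysql_fetch_array($SQL)) {
    $ch = curl_init($resultSet['url'] . $fullcurl);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 2);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

// All requests now run in parallel.
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh);
} while ($running > 0);

foreach ($handles as $ch) {
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);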

Is curl_multi_exec() a blocking call?

戏子无情 submitted on 2019-12-23 09:56:27
Question: I was just curious whether the curl_multi_exec() call in PHP is a blocking or non-blocking call. Answer 1: Short answer: curl_multi_exec() is non-blocking. Longer answer: curl_multi_exec() is non-blocking, but blocking behaviour can be obtained by combining it with curl_multi_select(), which blocks until there is activity on any of the curl_multi connections. Edit: I am currently working on a crawler; this is an outline of a piece of code I used.

do {
    $mrc = curl_multi_exec($mh, $active);
    if ($to_db_queue->count() > 0) {
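For context, the standard shape of that loop (a sketch, leaving out the answerer's crawler-specific queue handling) pairs the non-blocking curl_multi_exec() with curl_multi_select() so the script waits instead of spinning:

do {
    // Non-blocking: does as much work as it can right now, then returns.
    $mrc = curl_multi_exec($mh, $active);
} while ($mrc == CURLM_CALL_MULTI_PERFORM);

while ($active && $mrc == CURLM_OK) {
    // Blocking: wait (up to 1 s) until any transfer in the batch has activity.
    if (curl_multi_select($mh, 1.0) == -1) {
        usleep(100);   // select failure; avoid a tight spin
    }
    do {
        $mrc = curl_multi_exec($mh, $active);
    } while ($mrc == CURLM_CALL_MULTI_PERFORM);
}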

curl_multi_exec: some images downloaded are missing some data / stream incomplete

∥☆過路亽.° submitted on 2019-12-22 11:46:26
Question: I have implemented a PHP function which checks and downloads a lot of images (> 1,000), passed to it as an array, using the PHP curl_multi_init() method. After reworking this a few times already, because I was getting things like 0-byte files, etc., I now have a solution which downloads all images, BUT every other downloaded image file is incomplete. It looks to me as if I am calling file_put_contents() "too early", meaning before some of the images' data has been received completely using
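A sketch of one way to avoid writing too early (an assumption, since the rest of the question and its answers are not shown): only call file_put_contents() for a handle after curl_multi_info_read() has reported that transfer as finished without error. This assumes each handle was added with CURLOPT_RETURNTRANSFER set; the images/ directory and basename() naming are placeholders.

do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh);

    // Only handles reported here have actually finished their transfer.
    while ($info = curl_multi_info_read($mh)) {
        $ch = $info['handle'];
        if ($info['result'] === CURLE_OK) {
            $url  = curl_getinfo($ch, CURLINFO_EFFECTIVE_URL);
            $body = curl_multi_getcontent($ch);            // complete at this point
            file_put_contents('images/' . basename($url), $body);
        }
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
} while ($running > 0);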

AWS S3 batch upload from localhost php error

我与影子孤独终老i submitted on 2019-12-21 05:35:12
Question: I am trying to batch/bulk upload from localhost (XAMPP) to my S3 bucket. It seems to work for about 6 items, then I get an error message. The cURL error says "Failed sending network data." (see http://curl.haxx.se/libcurl/c/libcurl-errors.html):

Fatal error: Uncaught exception 'cURL_Multi_Exception' with message 'cURL resource: Resource id #34; cURL error: SSL_write() returned SYSCALL, errno = 0 (cURL error code 55). See http://curl.haxx.se/libcurl/c/libcurl-errors.html for an explanation of error