Node JS API request in loop

Submitted by 旧城冷巷雨未停 on 2020-01-03 03:59:20

Question


I'm trying my damnedest to avoid callback hell in my Node.js code, but I need to make a large number of API requests and insert the results into my database.

My issue here (of course) is that my for-loop runs and increments i before I finish my request and database insertion.

for (var i = 0; i <= 1; i++) {
    var apiRequest = data[i];
    apicall(apiRequest);
}


function apicall(urlApi){
    request((urlApi), function(error, response, body){
        if(error){
            console.log("error");
        } else if(!error && response.statusCode == 200){
            var myobj = JSON.parse(body);
            dbInsert(myobj);
        }
    });
}

function dbInsert(obj) {
    //insert into database
}

If someone else comes across this question, I can truly recommend this blog post, which I found after reading the response by joshvermaire:

http://www.sebastianseilund.com/nodejs-async-in-practice


Answer 1:


There are a number of ways to approach this type of problem. Firstly, if you can run all the API calls in parallel (all in flight at the same time) and it doesn't matter what order they are inserted in your database, then you can get a result a lot faster by doing that (vs. serializing them in order).

In all the options below, you would use this code:

const rp = require('request-promise');

function apicall(urlApi){
    return rp({url: urlApi, json: true}).then(function(obj){
        return dbInsert(obj);
    });
}

function dbInsert(obj) {
    //insert into database
    // return a promise that resolves when the database insertion is done
}
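If the database driver exposes only a callback API, `dbInsert` can wrap it in a promise. A minimal sketch, where `fakeDbInsert` is a hypothetical stand-in for the real driver call:

```javascript
// Hypothetical callback-style driver call (stand-in for a real database insert).
function fakeDbInsert(obj, cb) {
    setImmediate(() => cb(null, { inserted: obj }));
}

// dbInsert returns a promise that resolves when the insertion is done.
function dbInsert(obj) {
    return new Promise((resolve, reject) => {
        fakeDbInsert(obj, (err, result) => {
            if (err) reject(err);
            else resolve(result);
        });
    });
}
```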

Parallel Using ES6 Standard Promises

let promises = [];
for (let i = 0; i < data.length; i++) {
    promises.push(apicall(data[i]));
}

Promise.all(promises).then(() => {
    // all done here
}).catch(err => {
    // error here
});
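On newer Node versions, the same parallel pattern reads naturally with async/await. A sketch with a stubbed `apicall` standing in for the real request-plus-insert:

```javascript
// Stub that stands in for the real apicall(); it resolves after a short delay.
function apicall(urlApi) {
    return new Promise(resolve => setImmediate(() => resolve(urlApi)));
}

async function runAll(data) {
    // Start every call immediately, then wait for all of them to settle.
    const results = await Promise.all(data.map(apicall));
    return results;
}
```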

Parallel using Bluebird Promise Library

With the Bluebird Promise library, you can use Promise.map() to iterate your array, and you can pass it the concurrency option to control how many async calls are in flight at the same time. This can keep you from overwhelming either the database or the target API host, and can help control peak memory usage.

Promise.map(data, apicall, {concurrency: 10}).then(() => {
    // all done here
}).catch(err => {
    // error here
});
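If you don't want to pull in Bluebird, a small hand-rolled limiter can approximate the concurrency option. This is a simplified sketch of the idea, not Bluebird's actual implementation:

```javascript
// Run worker over items with at most `limit` calls in flight at once.
// Simplified stand-in for Bluebird's Promise.map(..., {concurrency: limit}).
function mapWithConcurrency(items, worker, limit) {
    const results = new Array(items.length);
    let next = 0;
    function runOne() {
        if (next >= items.length) return Promise.resolve();
        const i = next++;
        return Promise.resolve(worker(items[i])).then(result => {
            results[i] = result;
            return runOne(); // pick up the next item when this one finishes
        });
    }
    const lanes = [];
    for (let n = 0; n < Math.min(limit, items.length); n++) {
        lanes.push(runOne());
    }
    return Promise.all(lanes).then(() => results);
}
```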

In Series using Standard ES6 Promises

If you have to serialize them for some reason such as inserting into the database in order, then you can do that like this. The .reduce() pattern shown below is a classic way to serialize promise operations on an array using standard ES6:

data.reduce((p, item) => {
    return p.then(() => {
        return apicall(item);
    });
}, Promise.resolve()).then(() => {
    // all done here
}).catch(err => {
    // error here
});
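With async/await, the same serialization can be written as a plain for-of loop, which many find easier to read than the .reduce() chain. A sketch with a stubbed `apicall` that records call order:

```javascript
// Stub standing in for the real apicall(); records the order of calls.
const order = [];
function apicall(item) {
    return new Promise(resolve => setImmediate(() => {
        order.push(item);
        resolve(item);
    }));
}

async function runInSeries(data) {
    const results = [];
    for (const item of data) {
        // Each await finishes before the next call starts.
        results.push(await apicall(item));
    }
    return results;
}
```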

In Series Using Bluebird Promises

Bluebird has a Promise.mapSeries() that iterates an array in series, calling a promise-returning function on each item, which is a little simpler than doing it manually.

Promise.mapSeries(data, apicall).then(() => {
    // all done here
}).catch(err => {
    // error here
});



Answer 2:


I'd recommend using something like async.each. Then you could do:

async.each(data, function(apiRequest, cb) {
    apicall(apiRequest, cb);
}, function(err) {
   // do something after all api requests have been made
});

function apicall(urlApi, cb){
    request((urlApi), function(error, response, body){
        if(error){
            console.log("error");
            cb(error);
        } else if(response.statusCode == 200){
            var myobj = JSON.parse(body);
            dbInsert(myobj, cb);
        } else {
            // Always call cb, or async.each will never finish.
            cb(new Error("Unexpected status code: " + response.statusCode));
        }
    });
}

function dbInsert(obj, cb) {
    doDBInsert(obj, cb);
}

When the dbInsert method completes, make sure the cb callback is called. If you need to do this in a series, look at async.eachSeries.
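To see why calling cb on every code path matters, here is a minimal hand-rolled sketch of what async.eachSeries does internally: it only advances to the next item after the current item's callback fires, so a missing callback stalls the whole iteration. This is a simplification, not the library's actual code:

```javascript
// Minimal hand-rolled version of async.eachSeries: call iteratee on each
// item in order, advancing only after its callback fires.
function eachSeries(items, iteratee, done) {
    let i = 0;
    function next(err) {
        // Stop on the first error, or when every item has been processed.
        if (err || i >= items.length) return done(err);
        iteratee(items[i++], next);
    }
    next();
}
```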



Source: https://stackoverflow.com/questions/46798670/node-js-api-request-in-loop
