In the following code I am trying to make multiple (around 10) HTTP requests and RSS parses in one go. I am using the standard forEach construct on an array of URLs. Using a copy of the URL list as a queue to track arrivals keeps it simple (all changes are commented):
var q = feedsToFetch.slice(); // copy of the url list; urls are removed as they arrive (to track progress)
feedsToFetch.forEach(function (feedUri)
{
    feed(feedUri, function (err, feedArticles)
    {
        if (err)
        {
            console.error('failed to fetch ' + feedUri, err); // leave the url in q so it can be retried
            return; // (a throw here would crash the process and stall the other callbacks)
        }
        articles = articles.concat(feedArticles); // articles is declared in the surrounding scope
        q.splice(q.indexOf(feedUri), 1); // remove this url from the queue
        if (!q.length) done();           // once every url has been removed, fire the dependent code
    });
});
function done()
{
    // Code I want to run once all feedUris have been visited
}
In the end, this is not much "dirtier" than promises, and it gives you the chance to reload unfinished URLs (a counter alone won't tell you which one(s) failed). For this simple parallel-download task, implementing Promises will actually add more code to your project than a simple queue, and Promise.all() is not the most intuitive place to stumble across. Once you get into sub-sub-queries, or want better error handling than a trainwreck, I strongly recommend using Promises, but you don't need a rocket launcher to kill a squirrel...
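For comparison, here is roughly what the same fan-out looks like with Promise.all(), assuming `feed` has been wrapped in a promise-returning helper (`fetchFeed` below is a stand-in stub, not real network code):

```javascript
// Stand-in for a promisified feed(uri, cb): resolves with that feed's articles.
function fetchFeed(uri) {
    return new Promise(function (resolve, reject) {
        // real code would do the HTTP request + RSS parse here
        setImmediate(function () { resolve([uri + '#article1', uri + '#article2']); });
    });
}

var feedsToFetch = ['http://a.example/rss', 'http://b.example/rss'];

Promise.all(feedsToFetch.map(fetchFeed))
    .then(function (perFeed) {
        // perFeed is an array of per-url article arrays, in the original order
        var articles = [].concat.apply([], perFeed);
        console.log(articles.length); // 4
    })
    .catch(function (err) {
        // a single rejection short-circuits the whole batch -- the "trainwreck"
        console.error(err);
    });
```

Note the trade-off: Promise.all() hands you the results in order for free, but one failure rejects the entire batch and tells you nothing about which requests did succeed, which is exactly what the queue version preserves.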