I have a few hundred thousand URLs that I need to call. These are calls to an application server, which will process them and write a status code to a table. I do not need to process the responses.
You can also use the async methods of the .NET WebClient class. If you just need to send a GET request to each of your URLs, System.Net.WebClient will do the job. Below is a dummy example against example.com:
    $urllist = 1..97                      # stand-in for your real URL list
    $batchSize = 20
    $results = [System.Collections.ArrayList]::new()
    $i = 1
    foreach ($url in $urllist) {
        # Start the download without blocking; collect the returned Task for later inspection
        $w = [System.Net.WebClient]::new().DownloadStringTaskAsync("http://www.example.com?q=$i")
        $results.Add($w) | Out-Null
        if ($i % $batchSize -eq 0 -or $i -eq $urllist.Count) {
            # Wait for every task in the current batch to complete
            while ($false -in $results.IsCompleted) { Start-Sleep -Milliseconds 300 }
            Write-Host " ........ Batch completed ......... $i" -ForegroundColor Green
            foreach ($r in $results) {
                New-Object PSObject -Property @{url = $r.AsyncState.AbsoluteUri; jobstatus = $r.Status; success = !$r.IsFaulted}
                # if you need the response text, use $r.Result
            }
            $results.Clear()
        }
        $i += 1
    }
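For your scenario, you would feed the loop from your actual URL list instead of the dummy 1..97 range. Here is a minimal sketch of the same batching pattern, assuming the URLs sit one per line in urls.txt (a hypothetical file name; swap in whatever source you really have). It also disposes each WebClient once its batch has finished, which the dummy example above skips, and only reports failures since the response bodies are not needed:

    $urllist   = Get-Content -Path 'urls.txt'   # hypothetical input file, one URL per line
    $batchSize = 50
    $tasks     = [System.Collections.ArrayList]::new()
    $clients   = [System.Collections.ArrayList]::new()
    $i = 1
    foreach ($url in $urllist) {
        $client = [System.Net.WebClient]::new()
        $clients.Add($client) | Out-Null
        $tasks.Add($client.DownloadStringTaskAsync($url)) | Out-Null
        if ($i % $batchSize -eq 0 -or $i -eq $urllist.Count) {
            # Wait for the current batch, then log any failed requests
            while ($false -in $tasks.IsCompleted) { Start-Sleep -Milliseconds 300 }
            foreach ($t in $tasks) {
                if ($t.IsFaulted) { Write-Warning "Request failed: $($t.AsyncState.AbsoluteUri)" }
            }
            # Dispose the clients used in this batch before starting the next one
            $clients | ForEach-Object { $_.Dispose() }
            $tasks.Clear()
            $clients.Clear()
        }
        $i += 1
    }

With a few hundred thousand URLs, keeping the batch size modest bounds the number of in-flight tasks and open connections at any one time.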