Question
I have some JavaScript that fires off about 100 calls to a PHP script. The PHP script uses a lot of memory and takes a few seconds to complete, then returns a JSON response of pass or fail.
I don't want the ajax calls to be asynchronous, because the server would grind to a halt running 100 instances of the script at once. So I tried synchronous calls instead; the only problem is that they freeze the web page while the script is called one request at a time.
How can I fire off the ajax calls one at a time without freezing the page I'm on?
var a = [];
a[0] = 'test';
a[1] = 'hello';
a[2] = 'another';

$(document).ready(function () {
    $.each(a, function (k, v) {
        $.ajax({
            url: '/this-script-takes-a-few-seconds-to-complete.php',
            async: false,
            data: { foo: v },
            success: function (data) {
                console.log(data);
            }
        });
    });
});
Answer 1:
You can put it in a function that performs the next call, and invoke that function from your success handler, like this:
$(function () {
    var i = 0;
    function nextCall() {
        if (i == a.length) return; // the last call was the last item in the array
        $.ajax({
            url: '/this-script-takes-a-few-seconds-to-complete.php',
            data: { foo: a[i++] },
            success: function (data) {
                console.log(data);
                nextCall();
            }
        });
    }
    nextCall();
});
Though... the better approach would be to not make 100 calls at all. Unless this is a test suite of some sort, I'd revisit the whole approach.
Answer 2:
If you don't want your calls to be asynchronous, then the browser can't update the UI and the page will freeze. Why can't you make them async? Instead of firing all 100 at once, just wait for one to complete and then start the next one.
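A minimal sketch of that idea, assuming jQuery 1.8+ so the jqXHR promises returned by $.ajax can be chained with .then(); the endpoint and the array a are the ones from the question:

$(function () {
    // Chain the requests: each one starts only after the previous one has finished.
    a.reduce(function (previous, v) {
        return previous.then(function () {
            return $.ajax({
                url: '/this-script-takes-a-few-seconds-to-complete.php',
                data: { foo: v }
            }).done(function (data) {
                console.log(data);
            });
        });
    }, $.Deferred().resolve().promise());
});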
Answer 3:
You need to refactor this thing. 100 ajax requests is insane, especially with all of them being synchronous.
I suggest you restructure your app so you can gather all of that data and send it in one asynchronous go, or in a couple of asynchronous requests, not 100.
If you really need 100, build some sort of queue and process them asynchronously, in order.
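A rough sketch of such a queue, reusing the endpoint and array a from the question (processQueue is just an illustrative name):

function processQueue(queue) {
    if (!queue.length) return; // nothing left to send
    $.ajax({
        url: '/this-script-takes-a-few-seconds-to-complete.php',
        data: { foo: queue[0] },
        success: function (data) {
            console.log(data);
        },
        complete: function () {
            processQueue(queue.slice(1)); // move on even if a request failed
        }
    });
}

$(document).ready(function () {
    processQueue(a.slice()); // work on a copy so the original array is untouched
});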
Answer 4:
Well the problem with that is that Ajax... stands for "Asynchronous JavaScript and XML"...
Trying to do Ajax in sync defeats the purpose of using Ajax in the first place.
Try using a thread pool and running it async.
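Browser JavaScript has no real thread pool, but the closest equivalent is to cap how many requests are in flight at once. A sketch under that interpretation, reusing the question's endpoint and array (the limit of 3 and the name runPool are arbitrary):

function runPool(items, limit) {
    var index = 0;

    function startNext() {
        if (index >= items.length) return; // list is drained
        $.ajax({
            url: '/this-script-takes-a-few-seconds-to-complete.php',
            data: { foo: items[index++] },
            success: function (data) {
                console.log(data);
            },
            complete: startNext // each finished request starts the next one
        });
    }

    // Kick off up to `limit` requests; completions keep the pool full.
    for (var i = 0; i < limit && i < items.length; i++) {
        startNext();
    }
}

runPool(a, 3);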
Answer 5:
Edit - Correct Answer
Most browsers will limit the number of simultaneous HTTP requests to a single server, presumably to prevent exactly the situation you are trying to prevent (server overload). I'm not sure what the actual figures are for modern browsers, though I assume it's more than the old "2" I observed in Firefox in 2005.
Some resources:
- In Firefox, the default seems to be 15: check about:config and look for max-connections-per-server
- IE seems to vary by connection type and version: http://msdn.microsoft.com/en-us/library/cc304129(VS.85).aspx
- Chrome seems to be 6: http://www.google.com/support/forum/p/Chrome/thread?tid=2218bcd015cdc0cc&hl=en
Old Answer
This script will call a synchronous $.ajax every 2 seconds, each time shifting the next item off of the array a. The UI will still lock while each request is actually executing, but it gets a chance to update itself between calls. This, I think, is a pretty good compromise between completely synchronous and completely asynchronous.
Note that this script is destructive: a will be empty when it's done. If you need the contents of a, you may want to create a copy first.
a = [
    'test',
    'hello',
    'another'
];

$(function () {
    var send = function () {
        $.ajax({
            url: '/this-script-takes-a-few-seconds-to-complete.php',
            async: false,
            data: { foo: a.shift() },
            success: function (data) {
                console.log(data);
            }
        });
        if (!a.length) {
            clearInterval(timer);
        }
    };
    var timer = setInterval(send, 2000);
});
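If the contents of a are still needed afterwards, one option (my suggestion, not part of the original answer) is to drain a copy instead:

var queue = a.slice(); // shallow copy; `a` itself stays intact
// ...then use queue.shift() and queue.length inside send() instead of a.shift() and a.length.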
Answer 6:
I had a similar issue where I needed to call synchronously and IE was freezing. In short, I came up with this approach:
- Instead of using a synchronous call, I used async:true.
- Recursively call the next ajax request from the complete: handler.
This let me keep each call asynchronous while still only firing it after the previous one had finished.
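A minimal sketch of that pattern, reusing the endpoint and array from the question (sendNext is just an illustrative name):

var i = 0;

function sendNext() {
    if (i >= a.length) return; // stop after the last item
    $.ajax({
        url: '/this-script-takes-a-few-seconds-to-complete.php',
        async: true, // the individual call stays asynchronous
        data: { foo: a[i++] },
        success: function (data) {
            console.log(data);
        },
        complete: function () {
            sendNext(); // the next request starts only when this one finishes
        }
    });
}

sendNext();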
Source: https://stackoverflow.com/questions/4098826/how-can-i-call-ajax-synchronously-without-my-web-page-freezing