Question
It seems the following works without throwing an error:
var p = new Promise(function (resolve, reject) {
  window.setTimeout(function () {
    reject('ko');
  }, 1000);
});
p.then(function (value) { console.log(value); })
 .catch(function () { console.log('catched'); });
// → 'catched'
But this throws an error:
var p = new Promise(function (resolve, reject) {
  window.setTimeout(function () {
    p.catch(function () { console.log('catched'); });
    reject('ko');
  }, 1000);
});
p.then(function (value) { console.log(value); });
// → 'catched'
// Uncaught (in promise) ko
Any wild guesses as to why?
Answer 1:
The .catch must be chained directly onto the .then chain. Even if you write it like this, attaching .catch to p separately, it will still report an uncaught rejection:
var p = new Promise(function (resolve, reject) {
  window.setTimeout(function () {
    // p.catch(function () { console.log('catched'); });
    console.log(p);
    reject('ko');
  }, 1000);
});
p.then(function (value) {
  console.log(value);
});
p.catch(function () {
  console.log('catched');
});
The reason is that p.then(...) returns a new promise. If the .catch is not chained onto that returned promise, the rejection propagating through the .then branch has no handler, and that unhandled branch is what produces the "Uncaught (in promise) ko" warning. A .catch attached directly to p only handles p itself, not the promise returned by .then.
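To make the branching concrete, here is a minimal sketch (not part of the original answer): each call to .then or .catch on p starts its own chain, so every chain that can reject needs its own rejection handler. Either chain the .catch onto the .then branch, or pass a rejection handler as the second argument to .then:

var p = new Promise(function (resolve, reject) {
  window.setTimeout(function () {
    reject('ko');
  }, 1000);
});

// Option 1: keep a single chain, so the rejection of the promise
// returned by .then() is handled by the chained .catch.
p.then(function (value) {
  console.log(value);               // skipped, p rejects
}).catch(function (reason) {
  console.log('catched', reason);   // → 'catched ko'
});

// Option 2: give the .then branch its own rejection handler.
p.then(
  function (value) { console.log(value); },
  function (reason) { console.log('rejected', reason); } // → 'rejected ko'
);

Either way, no branch is left with an unhandled rejection, so no "Uncaught (in promise)" warning is reported.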
Source: https://stackoverflow.com/questions/35751143/promise-catch-behavior