Suppose I have the following functions:
async function test() {
  return new Promise<string>((reject, resolve) => {
    resolve("test OK");
  });
}
async function test2() {
  return new Promise<string>((reject, resolve) => {
    reject("test error");
  });
}
async function f() {
  try {
    await test();
  } catch (err) {
    console.log("error test 1");
  }
}
async function f2() {
  try {
    await test2();
  } catch (err) {
    console.log("error test 2");
  }
}
f();  // This prints 'error test 1'
f2(); // This prints nothing
My assumption (obviously incorrect) is that calling resolve in the promise returned by test() would result in f() executing without issue, and conversely, that calling reject in test2() would result in the catch block of f2() executing. However, the opposite happens. What is my fundamental misunderstanding as to why calling resolve() results in an error being caught?
Answer:
The problem is that resolve and reject are reversed. The arguments to the Promise constructor's executor are positional, not matched by name: the first parameter is always the function that fulfills the promise, and the second is always the one that rejects it, regardless of what you name them. So in test(), the parameter named resolve actually sits in the reject position; calling it rejects the promise with "test OK", which is why f() catches an error and prints 'error test 1'. Conversely, in test2(), the parameter named reject sits in the resolve position, so calling it fulfills the promise and the catch block of f2() never runs. Swapping the parameter names fixes both functions:
return new Promise<string>((resolve, reject) => {
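With the parameters in the conventional order, both functions behave the way the question expected; a corrected sketch using the original strings:

async function test() {
  return new Promise<string>((resolve, reject) => {
    resolve("test OK"); // first parameter: fulfills the promise
  });
}
async function test2() {
  return new Promise<string>((resolve, reject) => {
    reject("test error"); // second parameter: rejects the promise
  });
}
// Now f() completes without logging anything, and f2() prints 'error test 2'.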
See the Promise constructor MDN docs for reference: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise/Promise#creating_a_new_promise
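To confirm that only position matters, here is a minimal standalone sketch; the parameter names a and b are deliberately arbitrary:

new Promise<string>((a, b) => {
  a("done"); // fulfills the promise: `a` occupies the resolve position
}).then(
  (value) => console.log("fulfilled:", value), // prints 'fulfilled: done'
  (err) => console.log("rejected:", err)       // never runs
);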