I am building a Cloud Function with Node.js. At first I split the logic across three files, each exporting a different function, but running all of them just for authentication and a device check took almost 30-45 seconds. I then combined everything into a single function, which now has a cognitive complexity of 185. It really did become faster and only takes 10-14 seconds.
const functions = require("firebase-functions");
const admin = require("firebase-admin");
const level = require("../../library/level");
const hamsu = require("../../library/useful");

admin.initializeApp(); // the Admin SDK must be initialized before admin.firestore() is used
const db = admin.firestore();
const medium = level.medium;

exports.Authentication = functions.runWith(medium).https.onCall(async (data, context) => {
  // Reject calls that do not come from an App Check verified app
  if (context.app == undefined) {
    throw new functions.https.HttpsError(
        "failed-precondition",
        "The function must be called from an App Check verified app.");
  }
  // Reject unauthenticated callers
  if (!context.auth) {
    throw new functions.https.HttpsError(
        "failed-precondition",
        "Unauthorized User is trying to access the App.");
  }
  let code;
  const token = data.token;
});
So, my question is: does a cognitive complexity of 185 really risk overloading the server, or not? Thank you for any tips and tricks. Sorry, I am a newbie.
CodePudding user response:
185 is very high for cognitive complexity, and would be inadvisable in a situation where:
- Other people need to understand and maintain your code
- You are likely to forget how the function works over time
- The function contains code that is, or needs to be, duplicated elsewhere
Best practice, generally speaking, is to keep cognitive complexity low and to have as little code repetition as possible.
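For example, the App Check and authentication guards in your snippet are exactly the kind of code that tends to be copy-pasted into every callable function. A minimal sketch of pulling them into a shared helper (the assertVerifiedCaller name is just an illustration, not an existing API, and it assumes the same functions and medium values defined in your snippet) could look like this:

// Shared guard for callable functions (hypothetical helper, not part of firebase-functions)
function assertVerifiedCaller(context) {
  if (context.app == undefined) {
    throw new functions.https.HttpsError(
        "failed-precondition",
        "The function must be called from an App Check verified app.");
  }
  if (!context.auth) {
    throw new functions.https.HttpsError(
        "failed-precondition",
        "Unauthorized User is trying to access the App.");
  }
}

exports.Authentication = functions.runWith(medium).https.onCall(async (data, context) => {
  assertVerifiedCaller(context); // one call instead of two repeated guard blocks
  const token = data.token;
  // ... rest of the logic
});

Each helper extracted this way removes a few branches from the big function, which is how the complexity score comes down without splitting the code back into separate deployed functions.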
An aside: I hope you enjoy the refactoring process! Many IDEs have built-in refactoring tools that can automate parts of it.