How do I manage concurrent connections with postgres and node-postgres?


I have read a lot of posts here about the "too many connections" issue.

I am struggling to loop over a modest amount of CSV data (10,000 rows) and upload it to ElephantSQL (PostgreSQL).

I am on ElephantSQL's free plan. While I could upgrade to get more concurrent connections, the underlying issue is that I am not sure how to manage connections in the first place.

Here is my code:

First, I create the individual URLs that are passed to axios in extractToRaw, which writes to the raw_data table in PostgreSQL:

readCSV(async (list) => {
    // Array.prototype.map is synchronous, so there is nothing to await here
    const apiURLList = list.map((item) => {
        return `apiDomain=${domain}&api_key=${apiKey}`;
    });
    for (const url of apiURLList) {
        await extractToRaw(url);
    }
});

Then:

const extractToRaw = async (url) => {
    let records = [];

    try {
        // await the response directly rather than mixing await with .then()
        const data = await axios({
            method: "get",
            url: url,
            params: {
                //things here
            },
        });
        const contactRecord = data.data.data;
        const emailData = data.data.data.emails;
        const metaData = data.data.meta;

        if (metaData.results === 0) {
            try {
                console.log(`no emails for ${contactRecord.domain}`);
                upload_no_email(contactRecord.domain);
            } catch (err) {
                console.log("name: ", err.name, "message: ", err.message);
            }
        } else {
            for (const record of emailData) {
                console.log(`Writing ${record.value} record...`);
                records.push({
                    firstname: record.first_name,
                    lastname: record.last_name,
                    position: record.position,
                    seniority: record.seniority,
                    email: record.value,
                    website: record.value.split("@")[1],
                    confidence: record.confidence,
                });
                console.log(records);
            }
        }
        // upload to table (note: this call is not awaited)
        uploadToRaw(records);
    } catch (err) {
        console.log(err);
    }
};

Finally, the upload to PostgreSQL:

const uploadToRaw = (records) => {
    console.log(`uploading from ${records[0].website}`);
    for (const record of records) {
        const valuesArr = [
            record.firstname,
            record.lastname,
            record.position,
            record.seniority,
            record.email,
            record.website,
            record.confidence,
        ];
        pool.query(
            `INSERT INTO raw_data(firstname, lastname, position, seniority, email, website, confidence)
             VALUES($1, $2, $3, $4, $5, $6, $7)`,
            valuesArr
        );
    }
};

Without fail, I will get a "too many connections" error.

Am I using pool.query wrong?

UPDATE: added the node-postgres initialisation script

const { Pool, Client } = require("pg");
const connectionString =
    "string here";

const pool = new Pool({
    connectionString,
});

const client = new Client({
    connectionString,
});

CodePudding user response:

The call to create a new Pool takes a parameter called max, which sets the maximum number of connections the pool will open.

Care must be taken that this value aligns with the number of connections available on the DB server.
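Here is a minimal sketch of what that might look like, reusing the question's pool setup; the max of 5 and the idleTimeoutMillis value are assumptions (ElephantSQL's free plan only allows a handful of concurrent connections, so check your plan's actual limit). Awaiting each pool.query call, which the question's code does not do, also keeps the pool's internal queue from piling up while the loop runs.

const { Pool } = require("pg");

const pool = new Pool({
    connectionString: "string here",
    max: 5, // assumed limit -- keep this at or below what the server allows
    idleTimeoutMillis: 30000, // close idle clients after 30 seconds
});

// Awaiting each query means at most one INSERT from this function is in
// flight at a time, and the pool never opens more than `max` connections.
const uploadToRaw = async (records) => {
    for (const record of records) {
        await pool.query(
            `INSERT INTO raw_data(firstname, lastname, position, seniority, email, website, confidence)
             VALUES($1, $2, $3, $4, $5, $6, $7)`,
            [
                record.firstname,
                record.lastname,
                record.position,
                record.seniority,
                record.email,
                record.website,
                record.confidence,
            ]
        );
    }
};

Because every query goes through this one pool, the server will never see more than max connections from this process, no matter how many rows the CSV contains.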
