My project is a web back end that uses the Go HTTP framework gin-gonic/gin. My example repository:
PostgreSQL database
I am using a remotely hosted database (Render) and the jmoiron/sqlx module to connect to PostgreSQL. With a local database it is exactly the same story, the same errors. But unlike a local database, the hosting provides metrics in my dashboard: I can observe CPU usage, RAM usage, and the number of connections at the current moment.
Connection
// db is a *sqlx.DB, which wraps the standard *sql.DB
db, err := sqlx.Open("postgres", fmt.Sprintf("host=%s port=%s user=%s password=%s dbname=%s sslmode=%s",
postgresConfig.Host, postgresConfig.Port, postgresConfig.Username, postgresConfig.Password, postgresConfig.DBName, postgresConfig.SSLMode))
if err != nil {
panic(err)
}
I call sqlx.Open() and then pass the pointer to all the functions that need the database. As expected, the connection succeeds. But I then expected the metric to show a single connection, with all SQL queries executed over that one connection.
SQL query
// this is how the database is used
func (r *AuthPostgres) CheckSession(identifier string) (*models.Account, error) {
var account models.Account
if err := r.db.Get(&account, `select login, password, credits from "Account"
where login=$1`,
identifier); err != nil {
return nil, err
}
return &account, nil
}
In practice it does not behave as expected: when my application executes SQL queries, the number of connections grows. This shouldn't happen.
Screenshots of the connection-count metric:
What I already tried
I have a project that is slightly larger than this example, but in this repository I have isolated exactly what one function looks like and how it connects to the database. I have already tried tweaking the following:
- Adding or removing pointers (*, &)
- A different database (localhost -> remote)
- Changing max_connections in the postgres config file (localhost)
- Setting a time-out for the connection
- Setting max open/idle connection limits
None of these worked for me. I would like to get a single connection and run SQL queries many times per minute.
Logs
Errors I received from the logs while testing in different ways. I have highlighted the most important ones. They were repeated in different orders and combinations.
- FATAL: sorry, too many clients already
- pq: out of memory
- fatal error: out of memory allocated heap arena metadata
CodePudding user response:
sql.DB
is a poor name for that struct. It is not a single connection; it is a connection pool.
That is what you need in the majority of real-life applications. A connection pool enables concurrency and simplifies code by eliminating the need to manage connections explicitly.
You still have control over the pool's size and lifetime with the DB.SetMaxOpenConns and DB.SetMaxIdleConns methods.
CodePudding user response:
The answer: use a context and open the connection pool with sqlx.ConnectContext() instead of sqlx.Open():
ctxTimeout, ctxCancel := context.WithTimeout(context.Background(), time.Second*3)
defer ctxCancel()
db, err := sqlx.ConnectContext(ctxTimeout, "postgres", fmt.Sprintf("host=%s port=%s user=%s password=%s dbname=%s sslmode=%s",
postgresConfig.Host, postgresConfig.Port, postgresConfig.Username, postgresConfig.Password, postgresConfig.DBName, postgresConfig.SSLMode))
if err != nil {
panic(err)
}
To save resources and control idle connections in your database pool:
db.SetMaxIdleConns(5)
db.SetConnMaxIdleTime(10 * time.Second)
db.SetMaxOpenConns(95)