SQLAlchemy doesn't correctly create in-memory database

While building an API with FastAPI and SQLAlchemy, I'm seeing strange behaviour when the SQLite database is in-memory that doesn't occur when it is stored as a file.

Model:

from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy import Column, Integer, String

Base = declarative_base()

class Thing(Base):
    __tablename__ = "thing"
    id = Column(Integer, primary_key=True, autoincrement=True)
    name = Column(String)

I create two global engine objects, one backed by a database file and the other by an in-memory database:

from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

args = dict(echo=True, connect_args={"check_same_thread": False})
engine1 = create_engine("sqlite:///db.sqlite", **args)
engine2 = create_engine("sqlite:///:memory:", **args)
Session1 = sessionmaker(bind=engine1)
Session2 = sessionmaker(bind=engine2)

I create my FastAPI app and a route that adds an object to the database:

from fastapi import FastAPI

app = FastAPI()

@app.get("/")
def foo(x: int):
    with {1: Session1, 2: Session2}[x]() as session:
        session.add(Thing(name="foo"))
        session.commit()

My main block simulates requests and checks that everything works:

from fastapi.testclient import TestClient

if __name__ == "__main__":
    Base.metadata.create_all(engine1)
    Base.metadata.create_all(engine2)
    client = TestClient(app)
    assert client.get("/1").status_code == 200
    assert client.get("/2").status_code == 200

The thing table is created and committed on engine1, and likewise on engine2. The first request successfully inserts "foo" into engine1's file-backed database, but the second request raises sqlite3.OperationalError: no such table: thing.

Why do the two engines behave differently? Why does the in-memory database claim the table doesn't exist, even though the SQLAlchemy logs show the CREATE TABLE statement ran successfully and was committed?

CodePudding user response:

The docs explain this under "Using a Memory Database in Multiple Threads": https://docs.sqlalchemy.org/en/14/dialects/sqlite.html#using-a-memory-database-in-multiple-threads

To use a :memory: database in a multithreaded scenario, the same connection object must be shared among threads, since the database exists only within the scope of that connection. The StaticPool implementation will maintain a single connection globally, and the check_same_thread flag can be passed to Pysqlite as False.
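The reason this bites in your example is that FastAPI runs plain def endpoints in a worker threadpool, so the handler executes on a different thread than the main block where create_all() ran. Here is a minimal sketch of the mechanism outside FastAPI, reusing the Base from your model; it assumes SQLAlchemy 1.4, where the pysqlite dialect defaults to SingletonThreadPool for :memory: URLs, i.e. one private connection (and hence one private database) per thread:

import threading

from sqlalchemy import create_engine, inspect

mem_engine = create_engine("sqlite:///:memory:", connect_args={"check_same_thread": False})
Base.metadata.create_all(mem_engine)  # runs on this thread's pooled connection

print(inspect(mem_engine).has_table("thing"))  # True: same thread, same connection

def check_from_worker():
    # A new thread gets a new connection from the pool, and a fresh pysqlite
    # connection to :memory: is a brand-new, empty database.
    print(inspect(mem_engine).has_table("thing"))  # False: "no such table"

t = threading.Thread(target=check_from_worker)
t.start()
t.join()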

The same page shows how to get the intended behavior, so in your case:

from sqlalchemy.pool import StaticPool

# StaticPool maintains a single connection shared by all threads, so every
# session sees the same in-memory database.
args = dict(
    echo=True,
    connect_args={"check_same_thread": False},
    poolclass=StaticPool,
)
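Strictly, only the in-memory engine needs StaticPool; the file-backed engine works either way, so sharing args between the two is harmless. As a quick check, under the same assumptions as the sketch above, recreating the in-memory engine with these args makes the worker-thread lookup succeed:

mem_engine = create_engine("sqlite:///:memory:", **args)
Base.metadata.create_all(mem_engine)

def check_from_worker():
    # StaticPool hands every thread the same single connection, so the worker
    # sees the table created on the main thread.
    print(inspect(mem_engine).has_table("thing"))  # True

t = threading.Thread(target=check_from_worker)
t.start()
t.join()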