For loop to create a dataframe using pandas read_sql in python


I have a SQLite database named Europe.db. I want to import several tables from it, filter each by company, and save the results as separate pandas data frames. The current code works, but I'm sure it could be simplified with a for loop.

Current code:

company = "THEP.PA"
database = "sqlite:///Europe.db"

sqlite_table = f"SELECT * FROM balance_sheet WHERE symbol='{company}'"
bs_df = pd.read_sql(sqlite_table, database)

sqlite_table = f"SELECT * FROM cashflow_statement WHERE symbol='{company}'"
cf_df = pd.read_sql(sqlite_table, database)

sqlite_table = f"SELECT * FROM income_statement WHERE symbol='{company}'"
is_df = pd.read_sql(sqlite_table, database)

sqlite_table = f"SELECT * FROM key_executives WHERE company='{company}'"
key_executives_df = pd.read_sql(sqlite_table, database)

sqlite_table = f"SELECT * FROM key_metrics WHERE symbol='{company}'"
metrics_df = pd.read_sql(sqlite_table, database)

CodePudding user response:

A dict is great for this:

company = "THEP.PA"
database = "sqlite:///Europe.db"

tables = {
    'balance_sheet': None,
    'cashflow_statement': None,
    'income_statement': None,
    'key_executives': None,  # note: the question filters this table on 'company' rather than 'symbol'
    'key_metrics': None,
}

# run one query per table and store the resulting dataframe as the dict value
for table_name in tables:
    sqlite_table = f"SELECT * FROM {table_name} WHERE symbol='{company}'"
    tables[table_name] = pd.read_sql(sqlite_table, database)

Now, to get at one of those dataframes, use, e.g., tables['balance_sheet'] where you would previously have used bs_df.
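
If you prefer a more compact form, the same loop can be written as a dict comprehension. This is only a sketch, assuming the table names from the question and the company/database variables defined above:

table_names = ['balance_sheet', 'cashflow_statement', 'income_statement',
               'key_executives', 'key_metrics']

# build the table-name -> dataframe mapping in one expression
tables = {
    name: pd.read_sql(f"SELECT * FROM {name} WHERE symbol='{company}'", database)
    for name in table_names
}

bs_df = tables['balance_sheet']  # e.g. pull out the balance sheet dataframe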

CodePudding user response:

A dict would be the better solution, but another way could be a list:

table_names = [
    "balance_sheet",
    "cashflow_statement",
    "income_statement",
    "key_executives",
    "key_metrics",
]
tables = []

# query each table in order and collect the dataframes in a list
for table_name in table_names:
    sqlite_table = f"SELECT * FROM {table_name} WHERE symbol='{company}'"
    tables.append(pd.read_sql(sqlite_table, database))
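
With the list approach the dataframes come back in the same order as table_names, so you access them by position rather than by name. A small usage sketch under that assumption:

bs_df = tables[0]  # balance_sheet, the first entry in table_names
cf_df = tables[1]  # cashflow_statement

# or pair them back up with their names for dict-style access
tables_by_name = dict(zip(table_names, tables))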
    