Query database tables iteratively in R without a for loop, using a lambda or vectorized function, for a Shiny app


I am connecting to a SQL Server database through an ODBC connection in R. I have two potential methods to get the data and am trying to determine which would be more efficient. The data feeds a Shiny dashboard, so it needs to be pulled while the app is loading rather than queried on the fly as the user interacts with the app.

Method 1 is to use over 20 stored procedures to query all of the needed data and store the results for use. Method 2 is to query each of the tables individually.

Here is the method I used to query one of the stored procedures:

library(DBI)        # dbGetQuery()
library(lubridate)  # today(), %m-% years()

get_proc_data <- function(proc_name, url, start_date, end_date){
  dbGetQuery(con, paste0(
    "EXEC dbo.", proc_name, " ",
    "@URL = N'", url, "', ",
    "@Startdate = '", start_date, "', ",
    "@enddate = '", end_date, "' "
  ))
}
data <- get_proc_data(proc_name, url, today(), today() %m-% years(5))

However, each of the stored procedures has a slightly different setup for the parameters, so I would have to define each of them separately.
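
Spelled out, that would mean one small wrapper per proc along these lines (the proc names and parameter lists below are placeholders, not the real procs):

# placeholder procs: each one takes a slightly different set of parameters,
# so each needs its own wrapper around dbGetQuery()
get_page_views <- function(url, start_date, end_date) {
  dbGetQuery(con, paste0(
    "EXEC dbo.usp_PageViews @URL = N'", url,
    "', @Startdate = '", start_date, "', @enddate = '", end_date, "'"
  ))
}

get_sessions <- function(url, end_date) {
  dbGetQuery(con, paste0(
    "EXEC dbo.usp_Sessions @URL = N'", url, "', @enddate = '", end_date, "'"
  ))
}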

I have started to implement Method 2, but have run into issues with iteratively querying each table.

library(dplyr)    # select(), pull(), glimpse(), %>%
library(dbplyr)   # tbl() against a DBI connection
library(stringr)  # str_glue()

# use dplyr to create a data frame of table names
db_tables <- dbGetQuery(con, "SELECT * FROM [database_name].INFORMATION_SCHEMA.TABLES;") %>% select(TABLE_NAME)
# use pull to turn it into a character vector
table_list <- pull(db_tables, TABLE_NAME)

# get a quick look at the first few rows
tbl(con, "[TableName]") %>% head() %>% glimpse()

# iterate through all table names, get the first few rows, and export to .csv
for (table in table_list){
    write.csv(
        tbl(con, table) %>% head(), str_glue("{getwd()}/00_exports/tables/{table}.csv")
    )
}

selected_tables <- db_tables %>% filter(TABLE_NAME %in% c("TableName1", "TableName2"))

Ultimately this method was just to test how long it would take to iterate through the ~60 tables and perform the required function. I have tried putting this into a function instead but have not been able to get it to iterate through while also pulling the name of the table.

Pro/Con for Method 1: The stored procs currently power a metrics plug-in, written in C, that displays metrics on the webpage; this is for internal use to monitor website performance. However, not all of the stored procedures are visible to me, and the client needs me to extend their current metrics. I also do not have a DBA at my disposal to help on the SQL Server side, and the person who wrote the procs is unavailable. The procs also use different logic from one another, so joining the results of two different procs gives drastically different values. For example, depending on the proc, a date will either list that day's total page views or carry a value already aggregated at the weekly or monthly scale and repeated for every date, so joining and grouping produces drastically wrong page-view totals.

Pro/Con for Method 2: I am familiar with dplyr and would be able to join the tables together to pull the data I need. However, I am not as familiar with SQL and there is no Entity-Relationship Diagram (ERD) of any sort to refer to. Otherwise, I would build each query individually.

Either way, I am trying to come up with a named function, anonymous (lambda) function, or vectorized method for iterating. It would be great to name each variable and assign them appropriately so that I can perform the data wrangling with dplyr.

Any help would be appreciated; I am overwhelmed with which direction to go. I researched the R equivalent of a Python list comprehension but have not been able to get a function in R to behave similarly.

> db_table_head_to_csv <- function(table) {
    write.csv(
      tbl(con, table) %>% head(), str_glue("{getwd()}/00_exports/bibliometrics_tables/{table}.csv")
    )
  }
> 
> bibliometrics_tables %>% db_table_head_to_csv()
 Error in UseMethod("as.sql") : 
no applicable method for 'as.sql' applied to an object of class "data.frame"

CodePudding user response:

Consider storing all table data in a named list (the counterpart to a Python dictionary) using lapply (the counterpart to Python's list/dict comprehension). If you use its sibling, sapply, with simplify = FALSE, the character vector you pass in becomes the names of the list elements:

# RETURN VECTOR OF TABLE NAMES 
db_tables <- dbGetQuery(
    con, "SELECT [TABLE_NAME] FROM [database_name].INFORMATION_SCHEMA.TABLES"
)$TABLE_NAME

# RETURN NAMED LIST OF DATA FRAMES FOR EACH DB TABLE
df_list <- sapply(db_tables, function(t) dbReadTable(con, t), simplify = FALSE)
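
If you prefer the lapply counterpart mentioned above (the closest thing to a literal list comprehension), the only difference is that the element names must be assigned explicitly:

# lapply equivalent: names have to be set by hand
df_list <- lapply(db_tables, function(t) dbReadTable(con, t))
names(df_list) <- db_tables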

You can extend the anonymous (lambda) function with multiple steps such as write.csv, or use a separately defined function. Just be sure to return a data frame on the last line. The version below uses the native pipe, |>, available since base R 4.1.0:

db_table_head_to_csv <- function(table) {
    head_df <- dbReadTable(con, table) |> head()
    write.csv(
        head_df, 
        file.path(
            "00_exports", "bibliometrics_tables", paste0(table, ".csv")
        )
    )

    return(head_df)
}

df_list <- sapply(db_tables, db_table_head_to_csv, simplify = FALSE)

You lose no data frame functionality when the objects are stored in a list, and you can extract any element by name with $ or [[:

# EXTRACT SPECIFIC ELEMENT
head(df_list$table_1)
tail(df_list[["table_2"]])
summary(df_list$`table_3`)
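
And because every element is an ordinary data frame, the dplyr wrangling you describe can run directly on the list; a small sketch, using purely hypothetical table and column names:

library(dplyr)

# hypothetical table/column names: join two of the stored tables and aggregate
page_metrics <- df_list[["PageViews"]] |>
    inner_join(df_list[["Pages"]], by = "PageID") |>
    group_by(PageName) |>
    summarise(total_views = sum(Views), .groups = "drop")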