I have a folder on my computer with over 50 CSV files, each with a different name and a different number of variables. I need to upload them to a SQL Server, and rather than uploading them one by one I would like to batch upload them all at once while keeping their original names. I've tried creating a list with csv <- dir(pattern = "csv")
and then using the rio::export_list
function, but I haven't figured out how to use it with DBI::dbWriteTable
. I would be very grateful if someone who has dealt with this in the past could help me.
CodePudding user response:
Like @r2evans I do this using DBI, rather than dbplyr (I often use the two packages together).
Below is a cut down version of the custom function I use. You can find the full version here.
copy_r_to_sql <- function(db_connection, db, schema, sql_table_name, r_table_name) {
  suppressMessages( # mutes translation message
    DBI::dbWriteTable(
      db_connection,
      DBI::Id( # fully qualified destination: database.schema.table
        catalog = db,
        schema = schema,
        table = sql_table_name
      ),
      r_table_name # the R data frame to upload
    )
  )
}
For uploading multiple tables, either loop through them or use lapply.
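For the asker's case, a minimal sketch of such a loop, assuming an existing DBI connection named con and CSV files in the working directory (both names are placeholders, not from the answer above). Each file name, minus its .csv extension, becomes the table name, which keeps the original names as requested:

  library(DBI)

  csv_files <- dir(pattern = "\\.csv$") # anchor the pattern so only .csv files match

  for (f in csv_files) {
    df <- read.csv(f)
    table_name <- tools::file_path_sans_ext(f) # strip ".csv" to get the table name
    DBI::dbWriteTable(con, table_name, df, overwrite = TRUE)
  }

Since the files have different numbers of variables, dbWriteTable creates a separate table per file with whatever columns that file has; overwrite = TRUE replaces a table if it already exists, so drop it if you need to append instead.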