I have a table ("my_table") located on a SQL database server.
This table has 1000 rows. I am trying to select 100 rows at a time (making sure that no row is selected twice and that every row is selected), and then append all of these mini tables into a single table.
For example:
- result_1: rows 1-100
- result_2: rows 101-200
- etc.
I tried to do this with the following code:
library(dplyr)
library(DBI)
con <- dbConnect(RSQLite::SQLite(), ":memory:")
sequence = seq(from = 1, to = 1000, by = 100)
the_list = list()
for (i in 1:10) {
  for (j in 1:sequence) {
    result_i = DBI::dbGetQuery(con, "select * from my_table ORDER BY ID limit 100 OFFSET J;")
    the_list[[i]] = result_i
  }
}
final = do.call(rbind.data.frame, the_list)
I thought I could do this with a loop, but I don't think that SQL is recognizing my loop index.
Can someone show me how to fix this?
Thank you!
CodePudding user response:
This should work...
library(dplyr)
library(DBI)
library(data.table)
con <- dbConnect(RSQLite::SQLite(), ":memory:")
# offsets 0, 100, 200, ..., 900, so every row is fetched exactly once
sequence <- seq(from = 0, to = 900, by = 100)
results_list <- lapply(sequence, function(x) {
  # interpolate the current offset into the query string
  qstr <- sprintf('SELECT * FROM my_table ORDER BY ID LIMIT 100 OFFSET %s', x)
  DBI::dbGetQuery(con, qstr)
})
results_df <- rbindlist(results_list) |> as.data.frame()
print(results_df)
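As an alternative to building the query string with sprintf, DBI can also bind the offset as a query parameter. A minimal sketch of that variant, reusing the same con and sequence of offsets from above:

# Same pagination, but with the offset passed as a bound parameter
# rather than interpolated into the SQL string.
results_list <- lapply(sequence, function(x) {
  DBI::dbGetQuery(con,
                  'SELECT * FROM my_table ORDER BY ID LIMIT 100 OFFSET ?',
                  params = list(x))
})

Binding the value also keeps the query safe if the offset ever comes from user input.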
CodePudding user response:
Use fetch as shown below.
library(RSQLite)
# Create test database in memory with one 6 row table.
m <- dbDriver("SQLite")
con <- dbConnect(m, dbname = ":memory:")
dbWriteTable(con, 'BOD', BOD)
res <- dbSendQuery(con, "select * from BOD")
# fetch 4 rows at a time
L <- NULL; i <- 0
repeat {
  dat <- fetch(res, n = 4)                          # grab the next 4 rows
  if (nrow(dat)) L[[i <- i + 1]] <- dat else break  # stop once no rows remain
}
dbClearResult(res)
dbDisconnect(con)
library(dplyr)
Data <- bind_rows(L)
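Current DBI versions spell fetch as dbFetch and add dbHasCompleted, which removes the need for the nrow() check. A minimal sketch of the same chunked read written that way, run before the dbDisconnect(con) call:

# Equivalent chunked read with the newer dbFetch/dbHasCompleted API
res <- dbSendQuery(con, "select * from BOD")
L <- list()
while (!dbHasCompleted(res)) {
  L[[length(L) + 1]] <- dbFetch(res, n = 4)  # next chunk of up to 4 rows
}
dbClearResult(res)
Data <- dplyr::bind_rows(L)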