My R function is consuming too much memory. Can you help me optimize it?


I'm new to R and having trouble optimizing a function.

My function is supposed to:

  1. create a directory specified in the function
  2. download the zip file from the link inside the function and extract it to the directory
  3. move extracted files to the main directory if files are extracted under a new subfolder
  4. delete the subfolder

It works, but it consumes a lot of memory and takes 30 minutes to do such a simple job on a 2.7 MB zip file.

Thank you in advance!

create_dir <- function(directory) {
  path <- file.path(getwd(), directory)
  # Create the target directory if it does not exist yet
  if (!dir.exists(path)) {
    dir.create(path)
  }
  link <-
    "https://d396qusza40orc.cloudfront.net/rprog/data/specdata.zip"
  # Download the archive to a temporary file and extract it
  temp <- tempfile()
  download.file(link, temp, mode = "wb")
  unzip(temp, exdir = path)
  unlink(temp)
  # Move any files that were extracted into a subfolder up to the main directory
  existing_loc <- list.files(path, recursive = TRUE)
  for (loc in existing_loc) {
    if (grepl("/", loc)) {
      file.copy(file.path(path, loc), path)
      file.remove(file.path(path, loc))
    }
  }
  # Delete the now-empty subfolders; note the fix here: unlink one
  # directory per iteration, not the whole rm_dirs vector every time
  dirs <- list.dirs(path)
  rm_dirs <- dirs[dirs != path]
  for (dir in rm_dirs) {
    unlink(dir, recursive = TRUE)
  }
}
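As an aside, steps 3 and 4 (moving files out of subfolders and deleting them) can be avoided entirely: `unzip()` has a `junkpaths` argument that discards the archive's internal folder structure and extracts every file directly into `exdir`. A minimal sketch of the same function under that assumption (the URL and directory handling are taken from the question):

```r
create_dir <- function(directory) {
  path <- file.path(getwd(), directory)
  if (!dir.exists(path)) {
    dir.create(path)
  }
  link <- "https://d396qusza40orc.cloudfront.net/rprog/data/specdata.zip"
  temp <- tempfile(fileext = ".zip")
  download.file(link, temp, mode = "wb")
  # junkpaths = TRUE drops any subfolder paths inside the archive,
  # so every file lands directly in `path` -- no moving or cleanup needed
  unzip(temp, exdir = path, junkpaths = TRUE)
  unlink(temp)
}
```

This halves the number of file operations, since each extracted file is written once instead of being extracted, copied, and removed.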
create_dir("testDirectory")

CodePudding user response:

Thanks, I found the problem. My working directory was on OneDrive, which synced every extraction, move, and deletion of the 332 files the function processes. The antivirus also ran alongside OneDrive, using 70% of the CPU and freezing my PC for 30 minutes.
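One way to catch this early (a hypothetical check, not part of the original answer) is to warn when the working directory sits inside a synced folder before running heavy file operations:

```r
# Hypothetical guard: warn if the working directory appears to live under
# OneDrive, where per-file syncing can slow heavy extraction/move/delete work
check_sync_dir <- function(wd = getwd()) {
  if (grepl("OneDrive", wd, ignore.case = TRUE)) {
    warning("Working directory is inside OneDrive; ",
            "consider setwd() to a local folder, e.g. tempdir()")
    return(TRUE)
  }
  FALSE
}
```

Running the extraction in a local directory (such as `tempdir()`) and moving the finished result into the synced folder in one step avoids triggering a sync per file.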
