How to copy a full local directory with its files to another local directory using Terraform?

I have a local directory with files. How can I copy that directory, with everything it contains, to another local directory using Terraform? And I need that to happen every time I apply the module, not only the first time.

I tried using provisioner "local-exec", but it isn't suitable for me because I'll run this on both Linux and Windows and I don't want to change the command and interpreter every time.

CodePudding user response:

Terraform is not designed for managing local files; it is primarily intended for working with remote APIs over the network, so that the results can persist in the remote system between runs.

However, the hashicorp/local provider does have some resource types which try to treat the local file system as if it were a remote API. Its documentation contains a caution about the likely limitations of doing that:

Terraform primarily deals with remote resources which are able to outlive a single Terraform run, and so local resources can sometimes violate its assumptions. The resources here are best used with care, since depending on local state can make it hard to apply the same Terraform configuration on many different local systems where the local resources may not be universally available. See specific notes in each resource for more information.

If this tradeoff is acceptable to you then it should be possible to implement a Terraform module which declares the effect you're describing, using the local_file resource type to declare the destination files. The provider does not have a data source for reading the contents of a directory, however, so the set of source files will need to already be fixed on disk before Terraform runs, making it feasible to enumerate them with the fileset function. (Functions are evaluated as part of initial configuration evaluation, before Terraform takes any other actions, so fileset cannot react to changes made to the filesystem while Terraform is running.)
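
For illustration, fileset returns the matching paths relative to its first argument, which is what makes them reusable as suffixes under both directories. A quick check in terraform console, with a hypothetical source directory containing a.txt and sub/b.txt, would show:

> fileset("./source", "**")
toset([
  "a.txt",
  "sub/b.txt",
])

With that in mind, a module along the following lines should work: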

variable "source_dir" {
  type = string
}

variable "destination_dir" {
  type = string
}

locals {
  # fileset returns the set of file paths under source_dir, expressed
  # relative to source_dir, so each path can be reused under destination_dir.
  source_files = fileset(var.source_dir, "**")
}

resource "local_file" "dest" {
  for_each = local.source_files

  # Write each destination file with the same relative path and the
  # same content as the corresponding source file.
  filename       = "${var.destination_dir}/${each.value}"
  content_base64 = filebase64("${var.source_dir}/${each.value}")
}

This uses local_file to declare that each destination file should exist, with the same relative path and content as the corresponding source file. Because the content is tracked in the Terraform state, a later terraform apply after a source file has changed will plan to recreate the corresponding destination file, so the copying happens on every apply where something differs, not only the first time.
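
As a usage sketch (the module path and directory names here are hypothetical), a caller might wire it up like this:

module "copy_dir" {
  source          = "./modules/copy-dir" # hypothetical path to the module above
  source_dir      = "${path.module}/source"
  destination_dir = "${path.module}/destination"
}

Relative paths are best anchored with path.module or path.root so the module behaves the same regardless of the directory Terraform is run from.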

An important caveat here is that the filebase64 function reads the full content of the given file into memory, and Terraform then copies that value to the provider's content_base64 argument as part of the request to the provider plugin. That means this technique is only feasible if you know that all of the files you will be copying are relatively small. If you have any particularly large files then you are likely to hit either RAM limits while loading the file into memory, or provider plugin protocol message size limits while sending the content to the hashicorp/local provider.

If there is any way to solve this part of your problem with software outside of Terraform, such as a traditional configuration management tool, then I would recommend considering that approach instead. Although this might work, it is not what Terraform is designed for.
