I am using Elasticsearch. We create a day-wise index, and a huge amount of data is ingested every minute. I want to export a few fields from the index created each day to Google Cloud Storage. I am able to achieve this with the output file as JSON, as shown below:
input {
  elasticsearch {
    hosts => "localhost:9200"
    index => "test"
    query => '
    {
      "_source": ["field1", "field2"],
      "query": {
        "match_all": {}
      }
    }
    '
  }
}
filter {
  mutate {
    rename => {
      "field1" => "test1"
      "field2" => "test2"
    }
  }
}
output {
  google_cloud_storage {
    codec => csv {
      include_headers => true
      columns => [ "test1", "test2" ]
    }
    bucket => "bucketName"
    json_key_file => "creds.json"
    temp_directory => "/tmp"
    log_file_prefix => "logstash_gcs"
    max_file_size_kbytes => 1024
    date_pattern => "%Y-%m-%dT%H:00"
    flush_interval_secs => 600
    gzip => false
    uploader_interval_secs => 600
    include_uuid => true
    include_hostname => true
  }
}
However, how can I export it as a CSV file and send it to Google Cloud Storage?
CodePudding user response:
You should be able to change output_format to plain, but that setting is going to be deprecated. You should remove output_format and use the codec setting instead, which supports a csv output format:
google_cloud_storage {
  ...
  codec => csv {
    include_headers => true
    columns => [ "field1", "field2" ]
  }
}
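If you want to sanity-check the CSV formatting before pointing the pipeline at GCS, one option (an assumption about your setup, not part of the original answer) is to try the same codec with the standard file output first:
file {
  # hypothetical local path, just to inspect the generated CSV
  path => "/tmp/export_test.csv"
  codec => csv {
    include_headers => true
    columns => [ "field1", "field2" ]
  }
}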
If you want to rename your fields, you can add a filter section and mutate/rename the fields however you like. Make sure to also change the columns setting in your csv codec output:
filter {
  mutate {
    rename => {
      "field1" => "renamed1"
      "field2" => "renamed2"
    }
  }
}
output {
...
}
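For completeness, here is a minimal sketch of how the whole pipeline could fit together. It reuses the hosts, index, bucket, and credential values from the question; the renamed1/renamed2 names are just the illustrative placeholders from above, so adjust them to your real field names:
input {
  elasticsearch {
    hosts => "localhost:9200"
    index => "test"
    query => '
    {
      "_source": ["field1", "field2"],
      "query": { "match_all": {} }
    }
    '
  }
}
filter {
  mutate {
    rename => {
      "field1" => "renamed1"
      "field2" => "renamed2"
    }
  }
}
output {
  google_cloud_storage {
    bucket => "bucketName"
    json_key_file => "creds.json"
    temp_directory => "/tmp"
    # codec replaces the deprecated output_format setting
    codec => csv {
      include_headers => true
      columns => [ "renamed1", "renamed2" ]   # must match the renamed field names
    }
  }
}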