I have the following .conf file for Logstash:
input {
  file {
    path => "C:/elastic/logstash-8.3.2/config/*.csv"
    start_position => "beginning"
    sincedb_path => "NULL"
  }
}
filter {
  csv {
    separator => ";"
    columns => ["name","deposit","month"]
  }
  mutate {
    convert => {
      "deposit" => "integer"
    }
  }
}
output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "payment_test"
  }
  stdout {}
}
I get input from 10 .csv files with names like in-0.csv, in-1.csv, and so on. I want the index names in Elasticsearch to be payment_test-0, payment_test-1, and so on for the corresponding .csv input files (the data from in-0.csv would go into the index payment_test-0, etc.). How can I achieve this?
CodePudding user response:
I would simply do it with the dissect filter instead of grok:
filter {
  ... your other filters
  dissect {
    mapping => {
      "[log][file][path]" => "%{?ignore_path}/in-%{file_no}.csv"
    }
  }
}
output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "payment_test-%{file_no}"
  }
  stdout {}
}
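With a path like C:/elastic/logstash-8.3.2/config/in-3.csv, the %{?ignore_path} part consumes the directory portion without storing it, file_no becomes "3", and the event is indexed into payment_test-3. If you want to try the mapping in isolation before wiring it into your pipeline, here is a minimal test sketch; the generator input and the hard-coded sample path are test scaffolding only, not part of the real pipeline:

input {
  # generator emits synthetic events; one dummy event is enough to exercise the filter
  generator {
    count => 1
    message => "test"
  }
}
filter {
  # simulate the field the file input would set (sample path is hypothetical)
  mutate {
    add_field => { "[log][file][path]" => "C:/elastic/logstash-8.3.2/config/in-3.csv" }
  }
  dissect {
    # %{?ignore_path} matches but does not store the directory part;
    # file_no captures the text between "in-" and ".csv"
    mapping => {
      "[log][file][path]" => "%{?ignore_path}/in-%{file_no}.csv"
    }
  }
}
output {
  # should print an event containing "file_no" => "3"
  stdout { codec => rubydebug }
}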
CodePudding user response:
You can create a new field as shown below and use it in the index name:
input {
  file {
    path => "C:/elastic/logstash-8.3.2/config/*.csv"
    start_position => "beginning"
    sincedb_path => "NUL"   # on Windows, use NUL (not "NULL") to disable sincedb tracking
  }
}
filter {
  csv {
    separator => ";"
    columns => ["name","deposit","month"]
  }
  mutate {
    convert => {
      "deposit" => "integer"
    }
  }
  # On Logstash 8.x the file input stores the path in [log][file][path]
  # (ECS compatibility is on by default), not in a top-level "path" field.
  grok {
    match => { "[log][file][path]" => "%{GREEDYDATA}/%{GREEDYDATA:file_name}\.csv" }
  }
  # Skip the 3-character "in-" prefix and capture the single digit that follows.
  grok {
    match => { "file_name" => "^.{3}(?<file_no>.)" }
  }
}
output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "payment_test-%{file_no}"
  }
  stdout {}
}
I have used file_name as the field name here, but you can use whatever field your file name actually comes in.
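As a side note, the two grok filters above could be collapsed into one, which would also handle suffixes longer than a single digit (in-10.csv, in-11.csv, ...). A sketch, assuming the same [log][file][path] field as above:

filter {
  # capture the digits between "in-" and ".csv" in one pass,
  # e.g. in-12.csv => file_no = "12"
  grok {
    match => { "[log][file][path]" => "in-(?<file_no>\d+)\.csv$" }
  }
}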