Collecting logs from different remote servers using just Logstash


Is it possible to send logs from different remote machines to Elasticsearch using just Logstash (no Filebeat)? If so, do I define the same index in the conf.d files on all the machines? I want all the logs to end up in the same index.

Would I use logs-%{+YYYY.MM.dd} as the index in all the config files to have them written into the same index?

input {
  file {
    path => "/home/ubuntu/logs/data.log"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logs-%{+YYYY.MM.dd}"
  }
}

CodePudding user response:

What you're doing is fine and it will work. The one thing I would change is to simply write to a data stream instead, so you don't have to care about the index name or ILM matters (rollover, retention, etc.), like this:

input {
  file {
    path => "/home/ubuntu/logs/data.log"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    data_stream => "true"
    data_stream_type => "logs"
    data_stream_dataset => "ubuntu"
    data_stream_namespace => "prod"
  }
}

The data stream name will be logs-ubuntu-prod; you can change the latter two parts (dataset and namespace) to your liking.

Make sure to properly set up your data stream first, with an adequate Index Lifecycle Management policy, though.
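If you do want a custom policy (recent Elasticsearch versions already ship built-in templates and a default ILM policy for logs-*-* data streams), a minimal sketch in Kibana Dev Tools could look like the following. The policy name, template name, index pattern, and rollover/retention values are all illustrative, not required names:

# Hypothetical ILM policy: roll over in the hot phase, delete after 30 days
PUT _ilm/policy/logs-ubuntu-prod-policy
{
  "policy": {
    "phases": {
      "hot": {
        "actions": {
          "rollover": { "max_age": "7d", "max_primary_shard_size": "50gb" }
        }
      },
      "delete": {
        "min_age": "30d",
        "actions": { "delete": {} }
      }
    }
  }
}

# Hypothetical index template that makes matching names data streams
# and attaches the policy above
PUT _index_template/logs-ubuntu-prod-template
{
  "index_patterns": ["logs-ubuntu-*"],
  "data_stream": {},
  "priority": 200,
  "template": {
    "settings": { "index.lifecycle.name": "logs-ubuntu-prod-policy" }
  }
}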

On a different note, it's a waste of resources to install Logstash on all your remote machines, since it's meant to work as a centralized streaming engine. You should definitely either use Filebeat or, even better, the Elastic Agent, which is fully manageable through Fleet in Kibana. It's worth a look.
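For reference, a minimal Filebeat setup shipping the same file straight to Elasticsearch could look roughly like this (a filebeat.yml sketch; the input id, path, and host are just placeholders taken from the example above):

# filebeat.yml - minimal sketch, values are illustrative
filebeat.inputs:
  - type: filestream
    id: ubuntu-data-log
    paths:
      - /home/ubuntu/logs/data.log

output.elasticsearch:
  hosts: ["http://localhost:9200"]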
