I know I can do something like this:
input {
  jdbc {
    type => "a"
  }
  jdbc {
    type => "b"
  }
}
output {
  if [type] == "a" {
    ...
  }
  if [type] == "b" {
    ...
  }
}
But as the official documentation of the jdbc input plugin says:
If you try to set a type on an event that already has one (for example when you send an event from a shipper to an indexer) then a new input will not override the existing type. A type set at the shipper stays with that event for its life even when sent to another Logstash server.
As far as my own attempts go, this is true. If there is already a type field in the database rows, the type set here does not take effect. This makes it impossible for me to use type to distinguish the data in the filter and output sections. Is there another way to handle multiple jdbc inputs and route them to separate outputs? Do I have to put them in different pipelines?
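For reference, the separate-pipelines approach mentioned above is configured in pipelines.yml (available since Logstash 6.0). A minimal sketch, assuming two config files at illustrative paths:

```yaml
# pipelines.yml — each pipeline runs its own jdbc input in isolation,
# so no type/tags routing is needed at all
- pipeline.id: jdbc-a
  path.config: "/etc/logstash/conf.d/jdbc_a.conf"
- pipeline.id: jdbc-b
  path.config: "/etc/logstash/conf.d/jdbc_b.conf"
```

Each config file then contains only its own input, filter, and output blocks, at the cost of running two pipelines.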
I've tried to use tags as well. Unfortunately, there is a tags field in my database too! When I added tags => ["TAG_NAME"] to the jdbc input config, the "TAG_NAME" string was appended to the existing tags array instead of replacing it. So tags cannot be used either.
CodePudding user response:
You could try creating a new field with a different value in each jdbc plugin configuration:
input {
  jdbc {
    add_field => { "jdbc_plugin" => "a" }
  }
  jdbc {
    add_field => { "jdbc_plugin" => "b" }
  }
}
output {
  if [jdbc_plugin] == "a" {
    ...
  }
  else if [jdbc_plugin] == "b" {
    ...
  }
}
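Since jdbc_plugin is only a routing helper (the field name here is arbitrary), you can also branch on it in the filter section and drop it with a mutate filter before the events are written out, so it never reaches your destination. A sketch:

```
filter {
  if [jdbc_plugin] == "a" {
    # transformations specific to input "a"
  }
  else if [jdbc_plugin] == "b" {
    # transformations specific to input "b"
  }
  # remove the helper field once routing decisions are made
  mutate { remove_field => ["jdbc_plugin"] }
}
```

Note that if you remove the field in the filter section, the output conditionals above can no longer see it, so do the removal only after all branching that depends on it, or keep the removal in the output path instead.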