I’m looking for the best/newest way to send Spring Boot application logs directly into an Elasticsearch server, without using Filebeat or Logstash. How can I do that? Is there a simple, modern way in Spring Boot, or a well-regarded library, to achieve this? What I need is to send logs directly from Spring Boot to Elasticsearch, with no middle service such as Logstash. A third-party library added to pom.xml is fine, as long as it handles this completely — I want the Spring application itself to do the shipping. I have checked some similar questions on SO, but some of those libraries are deprecated now and others have not been updated for a long time. I would like to know about a current library or approach. Basically, whatever is written to the console should be sent to Elasticsearch. Can anyone help me?
CodePudding user response:
You can use the LOGGER that Spring boot provides:
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
private static final Logger LOGGER = LogManager.getLogger(YourClass.class);
LOGGER.info("PRINT THIS OUT");
Note that these messages only go to the application's console/log output; on its own this does not send anything to Elasticsearch. (The "Messages" file mentioned here belongs to AWS Elastic Beanstalk, which is a different product from Elasticsearch.)
The Log4j API classes used above are available transitively through the spring-boot-starter-web dependency (via spring-boot-starter-logging and log4j-to-slf4j).
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
</dependency>
CodePudding user response:
You could include the logback-elasticsearch-appender library in your pom.xml and use the bundled com.internetitem.logback.elasticsearch.ElasticsearchAppender class as an appender in your logback.xml configuration file:
<dependency>
    <groupId>com.internetitem</groupId>
    <artifactId>logback-elasticsearch-appender</artifactId>
    <version>1.6</version>
</dependency>
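A minimal logback.xml sketch for wiring up that appender could look like the following. The url and index properties are taken from the library's documentation; the host, port, and index name here are placeholders you would replace with your own:

```xml
<configuration>
    <appender name="ELASTIC" class="com.internetitem.logback.elasticsearch.ElasticsearchAppender">
        <!-- Bulk API endpoint of your Elasticsearch server (placeholder host/port) -->
        <url>http://localhost:9200/_bulk</url>
        <!-- Index to write log documents into; %date produces one index per day -->
        <index>logs-%date{yyyy-MM-dd}</index>
    </appender>
    <root level="INFO">
        <appender-ref ref="ELASTIC"/>
    </root>
</configuration>
```

The library buffers events and sends them asynchronously in bulk, so logging calls do not block on HTTP round-trips.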
Is that one of the libraries you didn't want to use because it hasn't been updated recently? If so, you could write a custom appender and point to it in the logback.xml file. A simple implementation could look like this:
package com.example.demo;

import java.time.Instant;
import java.time.ZoneId;
import java.time.format.DateTimeFormatter;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.logging.Level;
import java.util.logging.Logger;

import org.springframework.web.reactive.function.client.WebClient;

import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.core.AppenderBase;
import reactor.core.publisher.Mono;

public class ElasticSearchAppender extends AppenderBase<ILoggingEvent> {

    private static final String ELASTIC_SEARCH_API_HOST = "http://localhost:9200";
    private static final String ELASTIC_SEARCH_INDEX_NAME = "dummy-index";
    private static final WebClient webClient = WebClient.create(ELASTIC_SEARCH_API_HOST);

    // Use java.util.logging here (not SLF4J/Logback) so a failure to ship a
    // log event cannot recursively re-enter this appender.
    private static final Logger LOGGER = Logger.getLogger(ElasticSearchAppender.class.getName());

    public static final DateTimeFormatter ISO_8601_FORMAT =
            DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSSXXX")
                             .withZone(ZoneId.systemDefault());

    @Override
    protected void append(ILoggingEvent eventObject) {
        Map<String, Object> loggingEvent = new LinkedHashMap<>();
        loggingEvent.put("@timestamp",
                ISO_8601_FORMAT.format(Instant.ofEpochMilli(eventObject.getTimeStamp())));
        // getFormattedMessage() resolves {} placeholders; getMessage() would
        // return the raw message pattern instead
        loggingEvent.put("message", eventObject.getFormattedMessage());
        // Add additional fields here if needed, e.g. level, logger name, MDC

        webClient.post()
                .uri("/{logIndex}/_doc", ELASTIC_SEARCH_INDEX_NAME)
                .bodyValue(loggingEvent)
                .retrieve()
                .bodyToMono(Void.class)
                .onErrorResume(exception -> {
                    LOGGER.log(Level.SEVERE, "Unable to send log to elastic", exception);
                    return Mono.empty();
                })
                .subscribe();
    }
}
logback.xml:
<configuration>
    <appender name="ELASTIC" class="com.example.demo.ElasticSearchAppender" />
    <root level="INFO">
        <appender-ref ref="ELASTIC"/>
    </root>
</configuration>
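To see what such an appender actually posts, the document body can be built and inspected with the JDK alone, without WebClient or a running Elasticsearch. This is a hypothetical stand-alone sketch that mirrors the field layout above; the timestamp is pinned to UTC so the output is deterministic:

```java
import java.time.Instant;
import java.time.ZoneId;
import java.time.format.DateTimeFormatter;
import java.util.LinkedHashMap;
import java.util.Map;

public class LogDocumentDemo {

    // Same pattern as the appender; pinned to UTC here for a deterministic demo
    static final DateTimeFormatter ISO_8601_FORMAT =
            DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSSXXX").withZone(ZoneId.of("UTC"));

    // Builds the map that would be serialized as the _doc request body
    static Map<String, Object> toDocument(long timestampMillis, String message) {
        Map<String, Object> doc = new LinkedHashMap<>();
        doc.put("@timestamp", ISO_8601_FORMAT.format(Instant.ofEpochMilli(timestampMillis)));
        doc.put("message", message);
        return doc;
    }

    public static void main(String[] args) {
        Map<String, Object> doc = toDocument(0L, "PRINT THIS OUT");
        System.out.println(doc); // {@timestamp=1970-01-01T00:00:00.000Z, message=PRINT THIS OUT}
    }
}
```

In the real appender this map is serialized to JSON by WebClient's bodyValue(...) call before being indexed.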