I have this Camel Rest Route:
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.main.Main;
import org.apache.camel.model.rest.RestBindingMode;
import spark.Spark;

public class MainCamel {
    public static void main(final String[] args) throws Exception {
        final Main camelMain = new Main();
        camelMain.configure().addRoutesBuilder(new RouteBuilder() {
            @Override
            public void configure() throws Exception {
                this.getContext().getRegistry().bind("healthcheck", CheckUtil.class);
                this.restConfiguration()
                        .bindingMode(RestBindingMode.auto)
                        .component("netty-http")
                        .host("localhost")
                        .port(11010);
                this.rest("/healthcheck")
                        .get()
                        .description("Healthcheck for docker")
                        .outType(Integer.class)
                        .to("bean:healthcheck?method=healthCheck");
            }
        });
        // spark
        Spark.port(11011);
        Spark.get("/hello", (req, res) -> "Hello World");
        System.out.println("ready");
        camelMain.run(args);
    }

    public static class CheckUtil {
        public Integer healthCheck() {
            return 0;
        }
    }
}
I also created a second REST server with Spark.
The Camel route does NOT work if the code is executed in a Docker container.
Exception: org.apache.http.NoHttpResponseException: localhost:11010 failed to respond
The Spark server works fine.
However, when executing the code directly in IntelliJ, both REST servers work. Of course, both ports are exposed in the container.
CodePudding user response:
You are binding the Netty HTTP server to localhost, which means it will not be able to serve requests that originate from outside the container. Change .host("localhost") to .host("0.0.0.0") so that the server listens on all available network interfaces.
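The change can be sketched against the restConfiguration from the question (component and port taken verbatim from the question; only the host value differs):

```java
// Corrected REST configuration: bind to all interfaces instead of loopback.
// Everything except the host value is unchanged from the question's route.
this.restConfiguration()
        .bindingMode(RestBindingMode.auto)
        .component("netty-http")
        .host("0.0.0.0")   // listen on all interfaces so Docker's port mapping can reach it
        .port(11010);
```

Inside a container, "localhost" resolves to the container's own loopback interface, while traffic forwarded by Docker's published ports arrives on the container's external network interface, so a server bound only to loopback never sees those requests. This also explains why the Spark server worked: Spark binds to 0.0.0.0 unless you override it with Spark.ipAddress(...).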