List Kubernetes jobs from an application inside the cluster


We're using the official Kubernetes Java client (https://github.com/kubernetes-client/java) to get a list of the jobs running in a cluster:


      V1JobList jobList = batchV1Api.listJobForAllNamespaces(null, null, null,
          null, null, null, null, null,
          null, null);

      for (V1Job job : jobList.getItems()) {
        System.out.println(job.getMetadata().getName());
      }
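For reference, a self-contained version of that snippet might look something like the sketch below. The imports, the Config.defaultClient() call (which typically resolves $KUBECONFIG or $HOME/.kube/config), and the class name are assumptions filled in around the original fragment:

    import io.kubernetes.client.openapi.ApiClient;
    import io.kubernetes.client.openapi.Configuration;
    import io.kubernetes.client.openapi.apis.BatchV1Api;
    import io.kubernetes.client.openapi.models.V1Job;
    import io.kubernetes.client.openapi.models.V1JobList;
    import io.kubernetes.client.util.Config;

    public class ListJobsLocal {
      public static void main(String[] args) throws Exception {
        // Load the kubeconfig the same way kubectl does ($KUBECONFIG or ~/.kube/config).
        ApiClient client = Config.defaultClient();
        Configuration.setDefaultApiClient(client);

        BatchV1Api batchV1Api = new BatchV1Api();
        V1JobList jobList = batchV1Api.listJobForAllNamespaces(null, null, null,
            null, null, null, null, null,
            null, null);

        for (V1Job job : jobList.getItems()) {
          System.out.println(job.getMetadata().getName());
        }
      }
    }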

This works perfectly on a local machine. Now we want to do the same from a Java application running inside the cluster.

I have packed the above code into a Docker image and executed it in GKE. The only difference is that I had to copy in the kubeconfig file used locally:


FROM adoptopenjdk/openjdk11:latest
WORKDIR /app
COPY target/official-kubernetes-client-example-1.0-SNAPSHOT-spring-boot.jar ./official-kubernetes-client-example-1.0-SNAPSHOT.jar
# Ship the local kubeconfig with the image; HOME=/app below makes it resolvable as $HOME/.kube/config
COPY config ./.kube/config
ENV HOME=/app
CMD java $JAVA_OPTIONS -jar official-kubernetes-client-example-1.0-SNAPSHOT.jar
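The HOME=/app line matters because the client resolves the kubeconfig relative to the home directory, i.e. /app/.kube/config in this image. A minimal sketch of loading that copied file explicitly is shown below; the class name and the use of Config.fromConfig are assumptions for illustration, not taken from the original project:

    import io.kubernetes.client.openapi.ApiClient;
    import io.kubernetes.client.openapi.Configuration;
    import io.kubernetes.client.util.Config;

    public class KubeconfigSetup {
      public static void configureFromHome() throws java.io.IOException {
        // Load the kubeconfig that the Dockerfile copied to $HOME/.kube/config
        // (HOME=/app in the image, so this resolves to /app/.kube/config).
        ApiClient client = Config.fromConfig(System.getenv("HOME") + "/.kube/config");
        Configuration.setDefaultApiClient(client);
      }
    }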

However, I just get the following error:


2022-09-13 10:47:14.645 EDT io.kubernetes.client.openapi.ApiException:
    at io.kubernetes.client.openapi.ApiClient.handleResponse(ApiClient.java:973)
    at io.kubernetes.client.openapi.ApiClient.execute(ApiClient.java:885)
    at io.kubernetes.client.openapi.apis.BatchV1Api.listJobForAllNamespacesWithHttpInfo(BatchV1Api.java:3418)
    at io.kubernetes.client.openapi.apis.BatchV1Api.listJobForAllNamespaces(BatchV1Api.java:3311)
    at official.kubernetes.client.Main.main(Main.java:33)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.base/java.lang.reflect.Method.invoke(Method.java:566)
    at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:49)
    at org.springframework.boot.loader.Launcher.launch(Launcher.java:108)
    at org.springframework.boot.loader.Launcher.launch(Launcher.java:58)
    at org.springframework.boot.loader.JarLauncher.main(JarLauncher.java:65)

Is there a way for an application running within a Kubernetes cluster to contact the control plane in order to do things like listing the jobs that are running and creating new jobs?

CodePudding user response:

Alright, I realized that pods running inside a Kubernetes cluster cannot call the API by default, because they run under the default service account, which has no permissions bound to it. The steps I took to solve this were:

  1. Create a Kubernetes service account (see https://jamesdefabia.github.io/docs/user-guide/kubectl/kubectl_create_serviceaccount/), e.g. kubectl create serviceaccount kubernetes-service-account

  2. Add the cluster-admin role to the service account, and make sure the pod actually runs under it (set serviceAccountName: kubernetes-service-account in the pod spec, otherwise it keeps using default):

      kubectl create clusterrolebinding kubernetes-service-account --clusterrole=cluster-admin --serviceaccount=default:kubernetes-service-account

  3. It is necessary to build the API client from the in-cluster configuration (the pod's mounted service-account credentials) and register it as the default client:

      ApiClient client = ClientBuilder.cluster().build();
      Configuration.setDefaultApiClient(client);
  4. Then you can do something like this:

      BatchV1Api batchV1Api = new BatchV1Api();
      V1JobList jobList = batchV1Api.listJobForAllNamespaces(null, null, null,
          null, null, null, null, null,
          null, null);

      logger.info("listing the jobs");
      for (V1Job job : jobList.getItems()) {
        logger.info(job.getMetadata().getName());
      }
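Putting steps 3 and 4 together, a complete in-cluster version might look roughly like this; the class name, the error handling, and the use of System.out instead of a logger are illustrative choices, not taken from the original code:

    import io.kubernetes.client.openapi.ApiClient;
    import io.kubernetes.client.openapi.ApiException;
    import io.kubernetes.client.openapi.Configuration;
    import io.kubernetes.client.openapi.apis.BatchV1Api;
    import io.kubernetes.client.openapi.models.V1Job;
    import io.kubernetes.client.openapi.models.V1JobList;
    import io.kubernetes.client.util.ClientBuilder;

    public class ListJobsInCluster {
      public static void main(String[] args) throws Exception {
        // Build a client from the pod's mounted service-account token and CA certificate.
        ApiClient client = ClientBuilder.cluster().build();
        Configuration.setDefaultApiClient(client);

        BatchV1Api batchV1Api = new BatchV1Api();
        try {
          V1JobList jobList = batchV1Api.listJobForAllNamespaces(null, null, null,
              null, null, null, null, null,
              null, null);
          for (V1Job job : jobList.getItems()) {
            System.out.println(job.getMetadata().getName());
          }
        } catch (ApiException e) {
          // The response body usually contains the API server's explanation (e.g. RBAC denials).
          System.err.println(e.getResponseBody());
          throw e;
        }
      }
    }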