I know the question sounds vague, but I've been trying to understand and solve this issue for some time now, i.e., without simply auto-restarting the pod regularly (the application is running in K8s).
After the (Kotlin, Spring Boot) application has been running for a few days, I suddenly get a lot of these two errors:
java.lang.OutOfMemoryError: unable to create native thread: possibly out of memory or process/resource limits reached
Failed to start thread - pthread_create failed (EAGAIN) for attributes: stacksize: 256k, guardsize: 0k, detached.
The weird part is that neither the memory seems full, nor does the max number of threads or file descriptors seem to be reached. Monitoring screenshots:
- K8s pod resources: https://i.imgur.com/BqKT9om.png
- JVM (Actuator) Metrics: https://i.imgur.com/Jpdv2F4.png
Additional information:
- /etc/security/limits.conf is empty (only #-commented lines)
- cat /proc/sys/kernel/threads-max gives 2060488
- java -XX:+PrintFlagsFinal -version | grep -i thread also does not look suspicious: https://gist.github.com/Dobiasd/01623e9066889c8fd059bd387a15fafa
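For reference, the thread numbers from the Actuator graph can be cross-checked from inside the JVM with the standard ThreadMXBean; a minimal sketch (plain Java, independent of Spring):
import java.lang.management.ManagementFactory;
import java.lang.management.ThreadMXBean;

public class ThreadStats {
    public static void main(String[] args) {
        ThreadMXBean threads = ManagementFactory.getThreadMXBean();
        // Live/peak counts should roughly match the jvm.threads.* Actuator metrics
        System.out.println("live threads:    " + threads.getThreadCount());
        System.out.println("daemon threads:  " + threads.getDaemonThreadCount());
        System.out.println("peak threads:    " + threads.getPeakThreadCount());
        System.out.println("threads started: " + threads.getTotalStartedThreadCount());
    }
}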
Any ideas on what could be the cause and/or how to fix it?
CodePudding user response:
While cat /proc/sys/kernel/threads-max
shows that roughly 2M threads are allowed system-wide, there may also be user-level limits. On my WSL2 Linux I have:
$ cat /proc/sys/kernel/threads-max
127953
$ ulimit -u
63976
In practice this simple test program manages to create 31k threads:
public class Test {
    public static void main(String[] args) {
        for (int i = 0; i < 100000; i++) {
            try {
                // Each thread just sleeps so it stays alive while more are created
                new Thread(() -> {
                    try {
                        Thread.sleep(60000);
                    } catch (InterruptedException e) {
                        throw new RuntimeException(e);
                    }
                }).start();
            } catch (Throwable e) {
                // Print how many threads were created before creation failed
                System.out.println(i);
                throw e;
            }
        }
    }
}
That's fewer than the ulimit -u result would suggest, likely because the limit applies to all of the user's processes and threads (not just this JVM), and each thread also needs native memory for its stack.
To start with, you should check ulimit -u inside the container; if the image has no shell, the same limit can be read from /proc (see the sketch below).
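A minimal sketch of that, assuming a Linux container and Java 11+ (filtering on "Max processes" is just an example of what to look for):
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class ProcLimits {
    public static void main(String[] args) throws IOException {
        // /proc/self/limits lists the soft and hard limits of the current process;
        // the "Max processes" row is what ulimit -u reports.
        for (String line : Files.readAllLines(Path.of("/proc/self/limits"))) {
            if (line.startsWith("Limit") || line.startsWith("Max processes")) {
                System.out.println(line);
            }
        }
    }
}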
CodePudding user response:
I'm not sure if I'm reading your graphs properly, but in https://i.imgur.com/Jpdv2F4.png the "File Descriptors" graph shows "Max: 1.0 Mil, Current: 1.0 Mil", so it seems like you are hitting the limit on the maximum number of open files.
It's worth checking the limit on the command line via ulimit -n
. You can also check the per-process limits with cat /proc/$(pgrep java)/limits
.
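You can also watch the same counters from inside the JVM via the JDK's com.sun.management.UnixOperatingSystemMXBean (available on Linux builds); a minimal sketch:
import java.lang.management.ManagementFactory;
import java.lang.management.OperatingSystemMXBean;

import com.sun.management.UnixOperatingSystemMXBean;

public class FdStats {
    public static void main(String[] args) {
        OperatingSystemMXBean os = ManagementFactory.getOperatingSystemMXBean();
        if (os instanceof UnixOperatingSystemMXBean) {
            UnixOperatingSystemMXBean unixOs = (UnixOperatingSystemMXBean) os;
            // Compare the current number of open file descriptors against the limit
            System.out.println("open fds: " + unixOs.getOpenFileDescriptorCount());
            System.out.println("max fds:  " + unixOs.getMaxFileDescriptorCount());
        }
    }
}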