I have a JAR file that I run with a systemd unit file. The run command in the unit file is the following:
ExecStart=/usr/bin/java -Xms200m -Xmx465m --enable-preview -jar myapp-1.0.0.jar
My application always logs the maximum amount of RAM it is allowed to use. It logs this using the following code:
public static final int ONE_MEGABYTE_IN_BYTES = 1048576;
Runtime runtime = Runtime.getRuntime();
long maximumJVMHeapAllocation = Math.round(runtime.maxMemory() / (double) ONE_MEGABYTE_IN_BYTES);
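For context, a minimal self-contained version of that logging might look like this (the class name HeapLogger is made up for illustration):

public class HeapLogger {
    private static final int ONE_MEGABYTE_IN_BYTES = 1048576;

    public static void main(String[] args) {
        Runtime runtime = Runtime.getRuntime();
        // maxMemory() returns the maximum number of bytes the JVM will attempt to use for the heap
        long maxHeapMb = Math.round(runtime.maxMemory() / (double) ONE_MEGABYTE_IN_BYTES);
        System.out.println("Maximum JVM heap: " + maxHeapMb + " MB");
    }
}

With -Xmx465m on a JVM using the Parallel or Serial collector, this prints roughly 450 MB, which is exactly the discrepancy described below.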
For some reason the value of runtime.maxMemory() is always about 15 MB less than the value of the -Xmx argument. So if the JAR file is run with -Xmx465m, the application only gets about 450 MB of usable heap.
My question is: what is the remaining 15 MB used for? Is it used for stack memory?
EDIT: To avoid confusion: the total amount of RAM available to the server is 1 GB. 512 MB of that is used by the operating system (Amazon Linux 2).
CodePudding user response:
Apparently, your JVM runs with the Parallel or Serial GC, where the Young Generation of the heap consists of Eden and two Survivor spaces (see Generations). The JVM always keeps one of the Survivor spaces empty: an application can never use both simultaneously. Therefore, Runtime.maxMemory() excludes the size of one Survivor space, which appears to be about 15 MB in your case.
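You can see this breakdown yourself through the standard java.lang.management API. The sketch below (the class name HeapPools is made up; exact pool names vary by collector and JVM version) prints the maximum size of each heap memory pool; under the Parallel GC you should see pools such as PS Eden Space, PS Survivor Space, and PS Old Gen:

import java.lang.management.ManagementFactory;
import java.lang.management.MemoryPoolMXBean;
import java.lang.management.MemoryType;

public class HeapPools {
    public static void main(String[] args) {
        // List every heap pool the JVM reports, with its maximum size in MB.
        for (MemoryPoolMXBean pool : ManagementFactory.getMemoryPoolMXBeans()) {
            if (pool.getType() == MemoryType.HEAP) {
                long max = pool.getUsage().getMax();
                if (max > 0) {
                    System.out.printf("%-20s max = %d MB%n", pool.getName(), max / 1048576);
                }
            }
        }
        System.out.printf("Runtime.maxMemory()  = %d MB%n", Runtime.getRuntime().maxMemory() / 1048576);
    }
}

Note that only one Survivor pool is reported even though the heap contains two Survivor spaces; Runtime.maxMemory() counts just one of them, which is where the missing ~15 MB goes.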
Other GC algorithms behave differently. For example, if you run the JVM with -XX:+UseG1GC, the maximum available memory as shown by Runtime.maxMemory() will be equal to -Xmx (more precisely, it will be 466 MB because of heap region alignment).
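As a quick check, here is a sketch of your unit file's ExecStart with that flag added (everything else is unchanged from your command):

ExecStart=/usr/bin/java -Xms200m -Xmx465m -XX:+UseG1GC --enable-preview -jar myapp-1.0.0.jar

With G1 enabled, the value your application logs should match the -Xmx setting, modulo the region alignment mentioned above.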