hadoop-0.18.3 Could not create the Java virtual machine

Installed hadoop on a VM, and needed to set the Java heap size lower than the default of 1000 MB (-Xmx1000m) to get it to work. I set the HADOOP_HEAPSIZE variable in conf/hadoop-env.sh to a lower value, but hadoop continued to spit out this error:

# hadoop -help
Could not create the Java virtual machine.
Exception in thread "main" java.lang.NoClassDefFoundError: Could_not_reserve_enough_space_for_object_heap
Caused by: java.lang.ClassNotFoundException: Could_not_reserve_enough_space_for_object_heap
        at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:252)
        at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:320)
Could not find the main class: Could_not_reserve_enough_space_for_object_heap.  Program will exit.

Didn't matter what I set HADOOP_HEAPSIZE to, the problem persisted. I never did find the answer online, so I figured I'd do the world a favor today and make a note about how to fix it. Maybe I'll save someone else the 2 hours it took me to figure this out!
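For anyone following along, the setting in question in conf/hadoop-env.sh looks roughly like this (the 256 here is just an example value for a small VM, not a recommendation):

# conf/hadoop-env.sh
# The maximum amount of heap to use, in MB (the stock default is 1000).
export HADOOP_HEAPSIZE=256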

THE SOLUTION:

Change the bin/hadoop file. Here’s what to change:

--- hadoop.orig 2009-04-22 22:07:58.000000000 -0700
+++ hadoop      2009-04-22 21:57:00.000000000 -0700
@@ -233,7 +233,7 @@
 # setup 'java.library.path' for native-hadoop code if necessary
 JAVA_LIBRARY_PATH=''
 if [ -d "${HADOOP_HOME}/build/native" -o -d "${HADOOP_HOME}/lib/native" ]; then
-  JAVA_PLATFORM=`CLASSPATH=${CLASSPATH} ${JAVA} org.apache.hadoop.util.PlatformName | sed -e "s/ /_/g"`
+  JAVA_PLATFORM=`CLASSPATH=${CLASSPATH} ${JAVA} ${JAVA_HEAP_MAX} org.apache.hadoop.util.PlatformName | sed -e "s/ /_/g"`
   if [ -d "$HADOOP_HOME/build/native" ]; then
     JAVA_LIBRARY_PATH=${HADOOP_HOME}/build/native/${JAVA_PLATFORM}/lib
   fi

JAVA_HEAP_MAX needs to be passed when executing java to compute the JAVA_PLATFORM variable. On my VM I can't just call java with its default heap, that needs too much memory; I have to pass something like -Xmx128m any time I invoke the java executable. Without the heap flag, that helper invocation dies, its output (with the spaces turned into underscores by the sed) gets captured into JAVA_PLATFORM anyway, and that bogus string later lands on the final java command line, which is how the trace above ends up treating Could_not_reserve_enough_space_for_object_heap as a class name.
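To make the failure mode concrete, here is a stripped-down sketch of what the unpatched line does on a memory-starved box (the paths and the captured text are illustrative, not exact):

#!/bin/sh
# Sketch only: stand-ins for the variables bin/hadoop sets up earlier.
JAVA=${JAVA_HOME}/bin/java
CLASSPATH=${HADOOP_HOME}/hadoop-0.18.3-core.jar

# Unpatched: no ${JAVA_HEAP_MAX}, so this helper JVM tries to grab the
# default heap and dies on a small VM. The backticks capture its stdout
# anyway, so JAVA_PLATFORM ends up holding error text (spaces replaced by
# underscores via the sed) instead of a platform string like Linux-i386-32.
JAVA_PLATFORM=`CLASSPATH=${CLASSPATH} ${JAVA} org.apache.hadoop.util.PlatformName | sed -e "s/ /_/g"`
echo "JAVA_PLATFORM=${JAVA_PLATFORM}"
# e.g. JAVA_PLATFORM=Could_not_reserve_enough_space_for_object_heap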

This is a Virtuozzo (aka Parallels, aka OpenVZ) VM, by the way. I'll send a note to Apache, maybe they already know!? The solution is so simple!

137 thoughts on “hadoop-0.18.3 Could not create the Java virtual machine”

  1. The developers already know. I downloaded version 0.20.0 and checked it:

    JAVA_PLATFORM=`CLASSPATH=${CLASSPATH} ${JAVA} -Xmx32m org.apache.hadoop.util.PlatformName | sed -e "s/ /_/g"`

    Here they are just hard-coding it to -Xmx32m, which is better. If I have my HADOOP_HEAPSIZE cranked way up on a big server, I don't want to use that value just to compute JAVA_PLATFORM!
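    Roughly, the shape that gets you both (just a sketch; the exec line is paraphrased from bin/hadoop and may differ slightly between versions) is to cap only the probe and leave the daemon's heap alone:

    # Probe the platform with a tiny fixed heap; 32 MB is plenty for PlatformName.
    JAVA_PLATFORM=`CLASSPATH=${CLASSPATH} ${JAVA} -Xmx32m org.apache.hadoop.util.PlatformName | sed -e "s/ /_/g"`

    # The real daemon still gets whatever HADOOP_HEAPSIZE translated into, e.g. -Xmx2000m.
    exec "$JAVA" $JAVA_HEAP_MAX $HADOOP_OPTS -classpath "$CLASSPATH" $CLASS "$@"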
