$HADOOP_HOME is deprecated, Hadoop
Asked by Mazy, 2013-06-05T17:50:50


I tried to install Hadoop on a single-node cluster (my own laptop, Ubuntu 12.04). I followed this tutorial and checked it line by line, twice: http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/

Everything seems correct. I set up core-site.xml, mapred-site.xml, and hdfs-site.xml.

When I run the following command as the hduser user:

hduser@maziyar-Lenovo-IdeaPad-U300s:~$ /usr/local/hadoop/usr/sbin/start-all.sh

I get the following errors:

Warning: $HADOOP_HOME is deprecated.

starting namenode, logging to /usr/local/hadoop/usr/libexec/../logs/hadoop-hduser-namenode-maziyar-Lenovo-IdeaPad-U300s.out
cat: /usr/local/hadoop/usr/libexec/../etc/hadoop/slaves: No such file or directory
cat: /usr/local/hadoop/usr/libexec/../etc/hadoop/masters: No such file or directory
starting jobtracker, logging to /usr/local/hadoop/usr/libexec/../logs/hadoop-hduser-jobtracker-maziyar-Lenovo-IdeaPad-U300s.out
cat: /usr/local/hadoop/usr/libexec/../etc/hadoop/slaves: No such file or directory
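
As an aside, I assume that on a single-node setup those masters and slaves files would just contain localhost, so one way to create them (a sketch, using the paths the errors above resolve to; I have not verified this is where the RPM layout expects them) would be:

echo "localhost" > /usr/local/hadoop/usr/etc/hadoop/masters
echo "localhost" > /usr/local/hadoop/usr/etc/hadoop/slaves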

I added export HADOOP_HOME_WARN_SUPPRESS="TRUE" to hadoop-env.sh, but I still get the same error.

In /home/hduser/.bashrc, which I guess is where my error is coming from, I have:

# Set Hadoop-related environment variables
export HADOOP_HOME=/usr/local/hadoop

# Set JAVA_HOME (we will also configure JAVA_HOME directly for Hadoop later on)
export JAVA_HOME=/usr/lib/jvm/jdk-7u10-linuxi586/usr/java/jdk1.7.0_10

# Some convenient aliases and functions for running Hadoop-related commands
unalias fs &> /dev/null
alias fs="hadoop fs"
unalias hls &> /dev/null
alias hls="fs -ls"

# If you have LZO compression enabled in your Hadoop cluster and
# compress job outputs with LZOP (not covered in this tutorial):
# Conveniently inspect an LZOP compressed file from the command
# line; run via:
#
# $ lzohead /hdfs/path/to/lzop/compressed/file.lzo
#
# Requires installed 'lzop' command.
#
lzohead () {
    hadoop fs -cat $1 | lzop -dc | head -1000 | less
}

# Add Hadoop bin/ directory to PATH
export PATH=$PATH:$HADOOP_HOME/usr/sbin

I added /usr/sbin as the bin directory because start-all.sh and the other commands are there.
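
As a sanity check (my own, not from the tutorial) that the scripts are actually picked up from PATH:

echo $PATH
which start-all.sh    # should print /usr/local/hadoop/usr/sbin/start-all.sh if the PATH line took effect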

I also tried HADOOP_PREFIX instead of HADOOP_HOME in the .bashrc file, but I still get the same error.

I have these folders in my Hadoop directory:

maziyar@maziyar-Lenovo-IdeaPad-U300s:/usr/local/hadoop$ ls -lha
total 20K
drwxr-xr-x  5 hduser hadoop 4.0K May 30 15:25 .
drwxr-xr-x 12 root   root   4.0K May 30 15:25 ..
drwxr-xr-x  4 hduser hadoop 4.0K May 30 15:25 etc
drwxr-xr-x 12 hduser hadoop 4.0K Jun  4 21:29 usr
drwxr-xr-x  4 hduser hadoop 4.0K May 30 15:25 var

I downloaded the latest version of Apache Hadoop last week: hadoop-1.1.2-1.i386.rpm.

Copyright notice: content by "Mazy", reproduced under the CC BY-SA 4.0 license with a link to the original source and this disclaimer.
Link to original article:https://stackoverflow.com/questions/16936745/hadoop-home-is-deprecated-hadoop

Answers
VikasG 2013-06-13T06:19:23

I tried setting export HADOOP_HOME_WARN_SUPPRESS="TRUE" in my conf/hadoop-env.sh file and the warning vanished, although I am still unsure why the warning appeared in the first place.
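
For example, a sketch of what I did (the conf/ path is the default tarball layout and may differ for an RPM install like yours):

echo 'export HADOOP_HOME_WARN_SUPPRESS="TRUE"' >> /usr/local/hadoop/conf/hadoop-env.sh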


Agraj 2013-10-11T09:33:31

Replacing HADOOP_HOME with HADOOP_PREFIX in ~/.bashrc solves this for me.

Did you log out of the current session after making this change and try again? The changes you make to your bash profile come into effect when you log in to the shell again.
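
A sketch of the change, assuming the paths from your question:

# in ~/.bashrc, replace the HADOOP_HOME lines with:
export HADOOP_PREFIX=/usr/local/hadoop
export PATH=$PATH:$HADOOP_PREFIX/usr/sbin

# then either log out and back in, or reload in the current shell:
source ~/.bashrc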


Alex Bitek 2013-07-01T12:01:55

Your bash session may still have the HADOOP_HOME variable defined. Try echo $HADOOP_HOME and see if you get any value.

If you get a value for HADOOP_HOME, the shell is picking it up from some config file. Check those files (~/.bashrc, ~/.profile, /etc/profile, /etc/bash.bashrc, etc.) and remove the exported HADOOP_HOME variable.

Open a new session after you have set the HADOOP_PREFIX environment variable instead of HADOOP_HOME in ~/.bashrc. Once you are sure $HADOOP_HOME is not exported in any config file, you should not see that warning message anymore.
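
For example, something like this should show whether the variable is set and which startup file exports it (add any other files your shell reads):

echo $HADOOP_HOME
grep -n HADOOP_HOME ~/.bashrc ~/.profile /etc/profile /etc/bash.bashrc 2>/dev/null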


Adnan Khan 2016-03-05T03:53:44

The deprecation warning means that the feature you are using still works but is discouraged and will stop being supported soon, as the website notes.

Beyond that, my guess is that you have installed Hadoop with OpenJDK. What I did is, instead of installing OpenJDK, I installed the Oracle JDK. Maybe you should try doing that.

Let me know if this helped.

Regards.
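
For example, to check which JDK is actually in use (the runtime line distinguishes "OpenJDK Runtime Environment" from Oracle's "Java(TM) SE Runtime Environment"):

java -version
readlink -f "$(which java)"    # shows the real path of the java binary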

