I installed Hadoop on a Linux cluster. When I try to start the servers with the command bin/start-all.sh, the Hadoop daemons fail to start and I get the following errors:
mkdir: cannot create directory `/var/log/hadoop/spuri2': Permission denied
chown: cannot access `/var/log/hadoop/spuri2': No such file or directory
/home/spuri2/spring_2012/Hadoop/hadoop/hadoop-1.0.2/bin/hadoop-daemon.sh: line 136: /var/run/hadoop/hadoop-spuri2-namenode.pid: Permission denied
head: cannot open `/var/log/hadoop/spuri2/hadoop-spuri2-namenode-gpu02.cluster.out' for reading: No such file or directory
localhost: /home/spuri2/.bashrc: line 10: /act/Modules/3.2.6/init/bash: No such file or directory
localhost: mkdir: cannot create directory `/var/log/hadoop/spuri2': Permission denied
localhost: chown: cannot access `/var/log/hadoop/spuri2': No such file or directory
I have configured the log directory parameter in conf/hadoop-env.sh to point to /tmp, and I have set "hadoop.tmp.dir" in core-site.xml to /tmp/. I do not have access to /var/log, yet the Hadoop daemons are still trying to write to /var/log and failing.
I would like to know why this is happening.
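One possible explanation, sketched below: hadoop.tmp.dir in core-site.xml only controls HDFS/MapReduce scratch space, while the log and pid locations come from shell variables read by hadoop-daemon.sh. A minimal hedged example of overriding them in conf/hadoop-env.sh (the /tmp paths are illustrative placeholders for any directory your user can write to):

```shell
# conf/hadoop-env.sh -- point log and pid files at user-writable directories.
# These shell variables, not hadoop.tmp.dir, decide where hadoop-daemon.sh
# writes its *.out logs and *.pid files.
export HADOOP_LOG_DIR=/tmp/hadoop-logs   # replaces the unwritable /var/log/hadoop/... path
export HADOOP_PID_DIR=/tmp/hadoop-pids   # replaces /var/run/hadoop for the *.pid files
```

If a site-wide hadoop-env.sh on the cluster already sets HADOOP_LOG_DIR to /var/log/hadoop, that would explain why changing core-site.xml had no effect on the log path.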
I tried that too, but nothing changed. – 2012-08-04 02:21:00
Where does your .bashrc file set the $HADOOP_HOME variable? – 2012-08-04 02:49:02
There is no entry for $HADOOP_HOME in my .bashrc file; I cannot edit that file because I do not have permission. What I did instead was set the HADOOP_HOME environment variable with the export command, but that did not work either. – 2012-08-04 03:08:37
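For reference, a variable exported at the prompt only lasts for that shell session. A sketch of setting it manually, using the install path that appears in the error output above:

```shell
# Set HADOOP_HOME for the current session (path taken from the error log above)
export HADOOP_HOME=/home/spuri2/spring_2012/Hadoop/hadoop/hadoop-1.0.2
export PATH="$HADOOP_HOME/bin:$PATH"

# Verify the variable is visible to child processes such as start-all.sh
echo "$HADOOP_HOME"
```

One caveat: start-all.sh launches the daemons over ssh (even to localhost), so a variable exported only in the interactive shell may not reach them unless it is set in a startup file that non-interactive shells source.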