1

I installed Hadoop on a Linux cluster. When I try to start the servers with the command $ bin/start-all.sh, the Hadoop daemons do not start and I get the following errors:

mkdir: cannot create directory `/var/log/hadoop/spuri2': Permission denied 
chown: cannot access `/var/log/hadoop/spuri2': No such file or directory 
/home/spuri2/spring_2012/Hadoop/hadoop/hadoop-1.0.2/bin/hadoop-daemon.sh: line 136: /var/run/hadoop/hadoop-spuri2-namenode.pid: Permission denied 
head: cannot open `/var/log/hadoop/spuri2/hadoop-spuri2-namenode-gpu02.cluster.out' for reading: No such file or directory 
localhost: /home/spuri2/.bashrc: line 10: /act/Modules/3.2.6/init/bash: No such file or directory 
localhost: mkdir: cannot create directory `/var/log/hadoop/spuri2': Permission denied 
localhost: chown: cannot access `/var/log/hadoop/spuri2': No such file or directory 

I have set the log directory parameter in conf/hadoop-env.sh to the /tmp directory, and I have set "hadoop.tmp.dir" in core-site.xml to /tmp/. Since I cannot access the /var/log directory, the Hadoop daemons should not be touching it at all, yet they are still trying to write to /var/log and failing.
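Concretely, the hadoop-env.sh change looks roughly like this (a sketch: HADOOP_LOG_DIR and HADOOP_PID_DIR are the standard Hadoop 1.x variables controlling the log and PID locations that appear in the errors above; the /tmp paths are placeholders, not my exact values):

# conf/hadoop-env.sh -- redirect logs and PID files away from /var 
export HADOOP_LOG_DIR=/tmp/hadoop-${USER}/logs   # instead of /var/log/hadoop 
export HADOOP_PID_DIR=/tmp/hadoop-${USER}/pids   # instead of /var/run/hadoop 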

I would like to know why this is happening.

Answers

1

You have to set this directory in the core-site.xml file, not just in hadoop-env.sh:

<?xml version="1.0"?> 
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?> 

<!-- Put site-specific property overrides in this file. --> 

<configuration> 
<property> 
    <name>hadoop.tmp.dir</name> 
    <value>/Directory_hadoop_user_have_permission/temp/${user.name}</value> 
    <description>A base for other temporary directories.</description> 
</property> 

<property> 
    <name>fs.default.name</name> 
    <value>hdfs://localhost:54310</value> 
    <description>The name of the default file system. A URI whose 
    scheme and authority determine the FileSystem implementation. The 
    uri's scheme determines the config property (fs.SCHEME.impl) naming 
    the FileSystem implementation class. The uri's authority is used to 
    determine the host, port, etc. for a filesystem.</description> 
</property> 

</configuration> 
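Note that after changing hadoop.tmp.dir you normally have to re-format the namenode before restarting, since the HDFS metadata now lives in a new location (a standard Hadoop 1.x step, sketched below):

$ bin/hadoop namenode -format   # re-initializes HDFS metadata under the new hadoop.tmp.dir 
$ bin/start-all.sh              # then start the daemons again 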
+1

I tried that, too, but nothing changed. – 2012-08-04 02:21:00

+0

What is the value of the $HADOOP_HOME variable in your .bashrc file? – 2012-08-04 02:49:02

+0

There is no entry for $HADOOP_HOME in my .bashrc file; I cannot edit that file because I don't have permission. What I did instead was set the HADOOP_HOME environment variable with the export command (roughly as shown below). However, that didn't work either. – 2012-08-04 03:08:37
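The export in question was along these lines (a sketch; the install path is inferred from the error output above):

$ export HADOOP_HOME=/home/spuri2/spring_2012/Hadoop/hadoop/hadoop-1.0.2 
$ export PATH=$HADOOP_HOME/bin:$PATH 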

0

To keep it short: I ran into this problem because there were multiple Hadoop installations on the university cluster. A Hadoop installation done as root had messed up my local Hadoop installation.

The reason the Hadoop daemons did not start is that they could not write to certain files that required superuser permissions, and I was running Hadoop as a regular user. The problem arose because our university's system administrator had installed Hadoop as root, so when I started my local installation, the root installation's configuration files took precedence over my local Hadoop configuration files. It took a long time to figure this out, but after the root Hadoop installation was uninstalled, the problem was solved.
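If you cannot get the root installation removed, one workaround (assuming Hadoop 1.x, whose start scripts accept a --config flag via hadoop-config.sh) is to point the daemons explicitly at your local configuration directory:

# force the daemons to read the local conf directory instead of the root install's 
$ export HADOOP_CONF_DIR=/home/spuri2/spring_2012/Hadoop/hadoop/hadoop-1.0.2/conf 
$ bin/start-all.sh --config $HADOOP_CONF_DIR 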