Launching a Hadoop MapReduce job via Python without PuTTY/SSH

2015-12-08

I currently run Hadoop MapReduce jobs by logging in over SSH with PuTTY, which requires entering a hostname/IP address, login name, and password to get an SSH command-line window. Once in the SSH console, I issue the appropriate MR command, for example:

hadoop jar /usr/lib/hadoop-0.20-mapreduce/contrib/streaming/hadoop-streaming-2.0.0-mr1-cdh4.0.1.jar -file /nfs_home/appers/user1/mapper.py -file /nfs_home/appers/user1/reducer.py -mapper '/usr/lib/python_2.7.3/bin/python mapper.py' -reducer '/usr/lib/python_2.7.3/bin/python reducer.py' -input /ccexp/data/test_xml/0901282-510179094535002-oozie-oozi-W/extractOut//*.xml -output /user/ccexptest/output/user1/MRoutput

What I would like to do is streamline this clumsy process with Python, so that I can launch a MapReduce job from a Python script and avoid having to log in to SSH through PuTTY.

Can this be done, and if so, could someone show me how?

Answer

I solved this with the following script:

import paramiko 

# Define connection info 
host_ip = 'xx.xx.xx.xx' 
user = 'xxxxxxxx' 
pw = 'xxxxxxxx' 

# Paths 
input_loc = '/nfs_home/appers/extracts/*/*.xml' 
output_loc = '/user/lcmsprod/output/cnielsen/' 
python_path = "/usr/lib/python_2.7.3/bin/python" 
hdfs_home = '/nfs_home/appers/cnielsen/' 
output_log = r'C:\Users\cnielsen\Desktop\MR_Test\MRtest011316_0.txt' 

# File names 
xml_lookup_file = 'product_lookups.xml' 
mapper = 'Mapper.py' 
reducer = 'Reducer.py' 
helper_script = 'Process.py' 
product_name = 'test1' 
output_ref = 'test65' 

# ---------------------------------------------------- 

def buildMRcommand(product_name): 
    # Assemble the streaming-jar invocation as a list of tokens, then
    # join them into the single string that exec_command expects.
    mr_command_list = ['hadoop', 'jar', '/share/hadoop/tools/lib/hadoop-streaming.jar', 
                       '-files', hdfs_home + xml_lookup_file, 
                       '-file', hdfs_home + mapper, 
                       '-file', hdfs_home + reducer, 
                       # The stray apostrophes re-create the shell quoting around
                       # "python Mapper.py test1" once the list is joined.
                       '-mapper', "'" + python_path, mapper, product_name + "'", 
                       '-file', hdfs_home + helper_script, 
                       '-reducer', "'" + python_path, reducer + "'", 
                       '-input', input_loc, 
                       '-output', output_loc + output_ref] 

    MR_command = ' '.join(mr_command_list) 
    print MR_command 
    return MR_command 
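Because buildMRcommand places the surrounding apostrophes by hand, a misplaced space would silently break the `-mapper` argument. As a hedged alternative sketch (not part of the original answer), the standard library can do the quoting mechanically: `shlex.quote` on Python 3, or `pipes.quote` on the Python 2 this script was written for.

```python
import shlex  # Python 3; on Python 2 use pipes.quote instead


def quote_mapper_arg(python_path, script, product_name):
    """Build the -mapper value with mechanical shell quoting rather than
    hand-placed apostrophes (illustrative helper, not from the answer)."""
    inner = "%s %s %s" % (python_path, script, product_name)
    return shlex.quote(inner)


# The command string contains spaces, so quote() wraps it in single quotes.
print(quote_mapper_arg("/usr/lib/python_2.7.3/bin/python", "Mapper.py", "test1"))
# → '/usr/lib/python_2.7.3/bin/python Mapper.py test1'
```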

# ---------------------------------------------------- 

def unbuffered_lines(f): 
    """Yield lines from a paramiko channel file as they arrive,
    instead of waiting for the remote command to finish."""
    line_buf = "" 
    while not f.channel.exit_status_ready(): 
        line_buf += f.read(1) 
        if line_buf.endswith('\n'): 
            yield line_buf 
            line_buf = '' 

# ---------------------------------------------------- 

# Connect (AutoAddPolicy accepts unknown host keys without prompting)
client = paramiko.SSHClient() 
client.set_missing_host_key_policy(paramiko.AutoAddPolicy()) 
client.connect(host_ip, username=user, password=pw) 

# Build Commands 
list_dir = "ls "+hdfs_home+" -l" 
getmerge = "hadoop fs -getmerge "+output_loc+output_ref+" "+hdfs_home+"test_011216_0.txt" 

# Run one command at a time; uncomment the line you need
stdin, stdout, stderr = client.exec_command(list_dir) 
##stdin, stdout, stderr = client.exec_command(buildMRcommand(product_name)) 
##stdin, stdout, stderr = client.exec_command(getmerge) 

print "Executing command..." 
writer = open(output_log, 'w') 

for l in unbuffered_lines(stderr): 
    e = '[stderr] ' + l 
    print '[stderr] ' + l.strip('\n') 
    writer.write(e) 

for line in stdout: 
    r = '[stdout] ' + line 
    print '[stdout] ' + line.strip('\n') 
    writer.write(r) 

client.close() 
writer.close()
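One thing the script above never does is check whether the remote command actually succeeded. paramiko exposes the remote exit code through the channel behind the `stdout` handle that `exec_command` returns; a minimal hedged addition (the helper name is mine, not from the answer):

```python
def check_exit_status(stdout):
    """Block until the remote command finishes and return its exit code.

    `stdout` is the file-like object returned by SSHClient.exec_command();
    recv_exit_status() waits for the command to terminate.
    """
    status = stdout.channel.recv_exit_status()
    if status != 0:
        raise RuntimeError("remote command failed with exit status %d" % status)
    return status
```

Calling `check_exit_status(stdout)` after the output loops, and before `client.close()`, would turn a failed hadoop invocation into a visible Python error instead of a silent empty output directory.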