2013-04-26

Storing a URL in a bash variable causes curl to fail

In a bash script, I store the URL produced by a previous command in a bash variable, $DESTINATION_URL. I want to use this variable to run a curl command.

The curl command fails if I use the $DESTINATION_URL variable.

If I run the same curl command with the URL written out literally, it works fine. It looks as if the & is what causes the problem, but I don't understand why.

Example:

[email protected]:~$ echo $DESTINATION_URL 
http://hadoop-fullslot1:50075/webhdfs/v1/user/ha/s3distcp.jar?op=CREATE&user.name=hdfs&namenoderpcaddress=hadoop-meta1:8020&overwrite=true 


[email protected]:~$ curl -v -s -i -X PUT -T $SOURCE "$DESTINATION_URL" 
* About to connect() to hadoop-fullslot1 port 50075 (#0) 
* Trying 10.1.3.39... connected 
HTTP/1.1bhdfs/v1/user/ha/s3distcp.jar?op=CREATE&user.name=hdfs&namenoderpcaddress=hadoop-meta1:8020&overwrite=true 
> User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3 
> Host: hadoop-fullslot1:50075 
> Accept: */* 
> Content-Length: 1907377 
> Expect: 100-continue 
> 
* Empty reply from server 
* Connection #0 to host hadoop-fullslot1 left intact 
* Closing connection #0 


[email protected]:~$ curl -v -s -i -X PUT -T $SOURCE "http://hadoop-fullslot1:50075/webhdfs/v1/user/ha/s3distcp.jar?op=CREATE&user.name=hdfs&namenoderpcaddress=hadoop-meta1:8020&overwrite=true" 
* About to connect() to hadoop-fullslot1 port 50075 (#0) 
* Trying 10.1.3.39... connected 
> PUT /webhdfs/v1/user/ha/s3distcp.jar?op=CREATE&user.name=hdfs&namenoderpcaddress=hadoop-meta1:8020&overwrite=true HTTP/1.1 
> User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3 
> Host: hadoop-fullslot1:50075 
> Accept: */* 
> Content-Length: 1907377 
> Expect: 100-continue 
> 
< HTTP/1.1 100 Continue 
HTTP/1.1 100 Continue 

* We are completely uploaded and fine 
< HTTP/1.1 201 Created 
HTTP/1.1 201 Created 
< Cache-Control: no-cache 
Cache-Control: no-cache 
< Expires: Fri, 26 Apr 2013 09:01:38 GMT 
Expires: Fri, 26 Apr 2013 09:01:38 GMT 
< Date: Fri, 26 Apr 2013 09:01:38 GMT 
Date: Fri, 26 Apr 2013 09:01:38 GMT 
< Pragma: no-cache 
Pragma: no-cache 
< Expires: Fri, 26 Apr 2013 09:01:38 GMT 
Expires: Fri, 26 Apr 2013 09:01:38 GMT 
< Date: Fri, 26 Apr 2013 09:01:38 GMT 
Date: Fri, 26 Apr 2013 09:01:38 GMT 
< Pragma: no-cache 
Pragma: no-cache 
< Location: webhdfs://hadoop-meta1:50070/user/ha/s3distcp.jar 
Location: webhdfs://hadoop-meta1:50070/user/ha/s3distcp.jar 
< Content-Type: application/octet-stream 
Content-Type: application/octet-stream 
< Content-Length: 0 
Content-Length: 0 
< Server: Jetty(6.1.26.cloudera.2) 
Server: Jetty(6.1.26.cloudera.2) 

< 
* Connection #0 to host hadoop-fullslot1 left intact 
* Closing connection #0 
[email protected]:~$ 

Answers

3

Your variable contains more than just the URL (some garbage). My guess would be a CR byte or something similar: notice how the "HTTP/1.1" gets printed first, even though it should appear to the right of the URL in the request line...
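That diagnosis can be checked and fixed directly in bash. A minimal sketch, assuming the stray byte is a trailing carriage return (the variable's value below is a hypothetical reproduction, not taken from the question):

```shell
#!/bin/bash
# Hypothetical value: a URL captured from another command's output with a
# stray carriage return at the end (an assumption for illustration; the
# real variable may carry different garbage).
DESTINATION_URL=$'http://hadoop-fullslot1:50075/webhdfs/v1/user/ha/s3distcp.jar?op=CREATE\r'

# Remove every carriage return before handing the value to curl.
CLEAN_URL=${DESTINATION_URL//$'\r'/}

# The cleaned value is one byte shorter than the dirty one.
echo "dirty: ${#DESTINATION_URL} bytes, clean: ${#CLEAN_URL} bytes"
```

With the cleaned value, `curl ... "$CLEAN_URL"` would send the intended request line instead of one with a CR embedded in it.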

+0

To check whether that is the case, try something like `echo "|${DESTINATION_URL}|"` so you can see any surrounding whitespace. – 2013-04-26 11:38:53
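The delimiter trick above can be taken one step further: piping the value through `od -c` prints every byte, so a hidden CR shows up explicitly as `\r` instead of silently rewriting the terminal line. A sketch using a shortened stand-in value (an assumption, not the question's real URL):

```shell
#!/bin/bash
# Stand-in value with a hidden carriage return (hypothetical, for illustration).
DESTINATION_URL=$'http://example/path?op=CREATE\r'

# With a CR present, the closing | jumps back and overwrites the start of the line:
echo "|${DESTINATION_URL}|"

# od -c renders each byte, so the CR appears explicitly as \r:
printf '%s' "$DESTINATION_URL" | od -c
```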

+0

There are no special characters in the URL, because the curl command works fine when I put the URL directly into the command instead of using the variable. – 2013-04-26 13:40:42

+0

We get "HTTP/1.1 100 Continue" because we try to send the data before the redirect. - [See more](http://hadoop.apache.org/docs/r1.0.4/webhdfs.html#CREATE) – 2013-04-26 13:51:39

-1

Use single quotes ' instead of double quotes ".

+0

I have to use double quotes, because I want to pass a variable as an argument to the curl command. – 2013-04-26 13:55:32
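The comment above is right, and it is also why the single-quote answer cannot help here: double quotes still protect `&` and `?` from the shell while allowing the variable to expand, whereas single quotes would suppress the expansion entirely. A small sketch (the URL is a shortened stand-in):

```shell
#!/bin/bash
URL='http://example/path?op=CREATE&user.name=hdfs'

echo '$URL'   # single quotes: prints the literal text $URL, no expansion
echo "$URL"   # double quotes: expands the variable; & and ? stay quoted from the shell
```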