In a bash script, I store the URL produced by a previous command in a bash variable, $DESTINATION_URL, and I want to use that variable in a curl command. The curl command fails when I use $DESTINATION_URL, but the same curl command works fine if I put the URL in literally. It seems the & characters are causing the problem, but I don't understand why (see the quoting sketch after the transcript below). Example:
[email protected]:~$ echo $DESTINATION_URL
http://hadoop-fullslot1:50075/webhdfs/v1/user/ha/s3distcp.jar?op=CREATE&user.name=hdfs&namenoderpcaddress=hadoop-meta1:8020&overwrite=true
[email protected]:~$ curl -v -s -i -X PUT -T $SOURCE "$DESTINATION_URL"
* About to connect() to hadoop-fullslot1 port 50075 (#0)
* Trying 10.1.3.39... connected
HTTP/1.1bhdfs/v1/user/ha/s3distcp.jar?op=CREATE&user.name=hdfs&namenoderpcaddress=hadoop-meta1:8020&overwrite=true
> User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3
> Host: hadoop-fullslot1:50075
> Accept: */*
> Content-Length: 1907377
> Expect: 100-continue
>
* Empty reply from server
* Connection #0 to host hadoop-fullslot1 left intact
* Closing connection #0
[email protected]:~$ curl -v -s -i -X PUT -T $SOURCE "http://hadoop-fullslot1:50075/webhdfs/v1/user/ha/s3distcp.jar?op=CREATE&user.name=hdfs&namenoderpcaddress=hadoop-meta1:8020&overwrite=true"
* About to connect() to hadoop-fullslot1 port 50075 (#0)
* Trying 10.1.3.39... connected
> PUT /webhdfs/v1/user/ha/s3distcp.jar?op=CREATE&user.name=hdfs&namenoderpcaddress=hadoop-meta1:8020&overwrite=true HTTP/1.1
> User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3
> Host: hadoop-fullslot1:50075
> Accept: */*
> Content-Length: 1907377
> Expect: 100-continue
>
< HTTP/1.1 100 Continue
HTTP/1.1 100 Continue
* We are completely uploaded and fine
< HTTP/1.1 201 Created
HTTP/1.1 201 Created
< Cache-Control: no-cache
Cache-Control: no-cache
< Expires: Fri, 26 Apr 2013 09:01:38 GMT
Expires: Fri, 26 Apr 2013 09:01:38 GMT
< Date: Fri, 26 Apr 2013 09:01:38 GMT
Date: Fri, 26 Apr 2013 09:01:38 GMT
< Pragma: no-cache
Pragma: no-cache
< Expires: Fri, 26 Apr 2013 09:01:38 GMT
Expires: Fri, 26 Apr 2013 09:01:38 GMT
< Date: Fri, 26 Apr 2013 09:01:38 GMT
Date: Fri, 26 Apr 2013 09:01:38 GMT
< Pragma: no-cache
Pragma: no-cache
< Location: webhdfs://hadoop-meta1:50070/user/ha/s3distcp.jar
Location: webhdfs://hadoop-meta1:50070/user/ha/s3distcp.jar
< Content-Type: application/octet-stream
Content-Type: application/octet-stream
< Content-Length: 0
Content-Length: 0
< Server: Jetty(6.1.26.cloudera.2)
Server: Jetty(6.1.26.cloudera.2)
<
* Connection #0 to host hadoop-fullslot1 left intact
* Closing connection #0
[email protected]:~$
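For context on the & suspicion, here is a minimal sketch (not taken from the question) of how bash treats & with and without quoting; the short URL below is only a stand-in:

# Typed literally without quotes, the first & ends the command and runs it in the
# background, so curl only sees the URL up to "op=CREATE" and bash then tries to
# run "user.name=hdfs" as a separate command:
#   curl http://host:50075/webhdfs/v1/f?op=CREATE&user.name=hdfs
# Quoted -- as a literal or as a variable expansion -- the whole URL reaches curl
# as one argument:
URL='http://host:50075/webhdfs/v1/f?op=CREATE&user.name=hdfs'
curl -v "$URL"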
To check whether that is the case, try something like `echo "|${DESTINATION_URL}|"` so that you can see any whitespace. – 2013-04-26 11:38:53
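A minimal sketch of that check, using the variable name from the question (od is part of coreutils):

echo "|${DESTINATION_URL}|"                     # stray spaces or newlines show up between the | markers
printf '%s' "$DESTINATION_URL" | od -c | tail   # a trailing carriage return is visible as \r here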
There are no special characters in the URL as such, because the curl command works fine when the URL is put directly into the command (instead of using the variable). – 2013-04-26 13:40:42
We get "HTTP/1.1 100 Continue" because we try to send the data before the redirect. – [See more](http://hadoop.apache.org/docs/r1.0.4/webhdfs.html#CREATE) – 2013-04-26 13:51:39
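For reference, a minimal sketch of the two-step CREATE described in the linked WebHDFS docs, using the host names and path from the question; extracting the Location header with grep/awk is an assumption, not something shown in the question:

# Step 1: PUT to the namenode with no body; it replies with a redirect whose
# Location header points at a datanode.
DESTINATION_URL=$(curl -s -i -X PUT \
  "http://hadoop-meta1:50070/webhdfs/v1/user/ha/s3distcp.jar?op=CREATE&user.name=hdfs&overwrite=true" \
  | grep -i '^Location:' | awk '{print $2}' | tr -d '\r')
# tr -d '\r' strips the carriage return that terminates HTTP header lines,
# which would otherwise end up stored inside the variable.

# Step 2: PUT the file data ($SOURCE is the local file) to the datanode URL.
curl -v -s -i -X PUT -T "$SOURCE" "$DESTINATION_URL"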