Upload entire Bitbucket repo to S3 using Bitbucket Pipelines

I'm using Bitbucket Pipelines. I want it to push the entire contents of my repo (which is very small) to S3. I don't want to zip it, push to S3, and then unzip it; I just want it to take the existing file/folder structure in my Bitbucket repo and push that to S3.
How should the YAML file and the .py file be set up to do this?
Here is my current YAML file:
    image: python:3.5.1

    pipelines:
      branches:
        master:
          - step:
              script:
                # - apt-get update # required to install zip
                # - apt-get install -y zip # required if you want to zip repository objects
                - pip install boto3==1.3.0 # required for s3_upload.py
                # the first argument is the name of the existing S3 bucket to upload the artefact to
                # the second argument is the artefact to be uploaded
                # the third argument is the bucket key
                # html files
                - python s3_upload.py my-bucket-name html/index_template.html html/index_template.html # run the deployment script
                # Example command line parameters. Replace with your values
                #- python s3_upload.py bb-s3-upload SampleApp_Linux.zip SampleApp_Linux # run the deployment script
And here is my current Python:
    from __future__ import print_function
    import os
    import sys
    import argparse
    import boto3
    from botocore.exceptions import ClientError

    def upload_to_s3(bucket, artefact, bucket_key):
        """
        Uploads an artefact to Amazon S3
        """
        try:
            client = boto3.client('s3')
        except ClientError as err:
            print("Failed to create boto3 client.\n" + str(err))
            return False
        try:
            client.put_object(
                Body=open(artefact, 'rb'),
                Bucket=bucket,
                Key=bucket_key
            )
        except ClientError as err:
            print("Failed to upload artefact to S3.\n" + str(err))
            return False
        except IOError as err:
            print("Failed to access artefact in this directory.\n" + str(err))
            return False
        return True

    def main():
        parser = argparse.ArgumentParser()
        parser.add_argument("bucket", help="Name of the existing S3 bucket")
        parser.add_argument("artefact", help="Name of the artefact to be uploaded to S3")
        parser.add_argument("bucket_key", help="Name of the S3 Bucket key")
        args = parser.parse_args()

        if not upload_to_s3(args.bucket, args.artefact, args.bucket_key):
            sys.exit(1)

    if __name__ == "__main__":
        main()
This requires me to list every single file in the repo as a separate command in the YAML file. I just want it to grab everything and upload it to S3.
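One way to avoid listing each file would be to walk the repo and upload every file it finds, reusing each file's relative path as its S3 key. This is a minimal sketch, not the original script: the helper names `iter_repo_files` and `upload_dir` are my own, and boto3 is imported lazily inside `upload_dir` so the path-mapping helper can be used (and tested) without boto3 installed.

```python
import os

def iter_repo_files(root):
    """Yield (local_path, s3_key) pairs for every file under root.

    Keys mirror the repo's folder structure, using forward slashes
    regardless of the local OS path separator.
    """
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            local_path = os.path.join(dirpath, name)
            key = os.path.relpath(local_path, root).replace(os.sep, "/")
            yield local_path, key

def upload_dir(bucket, root="."):
    """Upload every file under root to the bucket, preserving structure."""
    import boto3  # lazy import: iter_repo_files has no boto3 dependency
    client = boto3.client("s3")
    for local_path, key in iter_repo_files(root):
        client.upload_file(local_path, bucket, key)
```

Called as `upload_dir("my-bucket-name")` from the repo root, this would replace the per-file `s3_upload.py` invocations with a single command.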
What specifically is the problem? – jbird
@jbird See the edit – scottndecker
@jbird He's asking the basic question of how to recursively send multiple files to S3 using the sample that AWS Labs provides for Bitbucket Pipelines. @scottndecker I have the same question. In Bamboo I run a shell script to handle it:

    #!/bin/bash
    export AWS_ACCESS_KEY_ID=${bamboo.awsAccessKeyId}
    export AWS_SECRET_ACCESS_KEY=${bamboo.awsSecretAccessKeyPassword}
    export AWS_DEFAULT_REGION=us-east-1
    aws s3 sync dist/library s3://yourbuckethere/ --delete
    aws s3 sync dist/library s3://yourbuckethere/

No luck in Pipelines yet –
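The `aws s3 sync` approach from that Bamboo script can also be run inside a Bitbucket Pipelines step. This is a sketch, not a tested config: it assumes `AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY`, and `AWS_DEFAULT_REGION` are configured as repository variables in Bitbucket, and `your-bucket-name` is a placeholder.

```yaml
image: python:3.5.1

pipelines:
  branches:
    master:
      - step:
          script:
            - pip install awscli
            # AWS credentials and region are assumed to come from
            # Bitbucket repository variables, not hard-coded here
            - aws s3 sync . s3://your-bucket-name/ --delete
```

`aws s3 sync .` uploads the whole working directory recursively, which removes the need for a per-file upload script entirely.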