pipeline-aws-plugin

A Jenkins pipeline plugin for integrating AWS services

This Jenkins plugin provides AWS API interaction, supporting operations on S3, CloudFront, CloudFormation, and other AWS services. Authorization is handled through the withAWS step, which supports multiple credential types and role switching, simplifying the integration of Jenkins with AWS. It automates a wide range of AWS tasks and improves CI/CD efficiency. The plugin provides a rich set of steps, such as s3Upload, cfInvalidate, and cfnUpdate, for integrating AWS operations into Jenkins pipelines. With these predefined steps, users can manage AWS resources and automate deployments without writing complex AWS SDK code.

Features

This plugin adds Jenkins pipeline steps to interact with the AWS API.

See the changelog for release information.

Primary/Agent setups

This plugin is not optimized for setups with a primary node and multiple agents. Only steps that touch the workspace are executed on the agents; everything else is executed on the primary.

For the best experience, make sure that the primary and the agents have the same IAM permissions and networking capabilities.

Retrieve credentials from node

By default, credentials lookup is done on the primary node for all steps. To enable credentials lookup on the current node, enable Retrieve credentials from node in the Jenkins global configuration. This setting applies globally and restricts all access to the primary node's credentials.

Usage / Steps

withAWS

The withAWS step provides authorization for the nested steps. You can provide region and profile information, or let Jenkins assume a role in the same or another AWS account. You can mix all parameters in one withAWS block.

Set region information (note that region and endpointUrl are mutually exclusive):

withAWS(region:'eu-west-1') {
    // do something
}

Use a provided endpointUrl (endpointUrl is optional; however, region and endpointUrl are mutually exclusive):

withAWS(endpointUrl:'https://minio.mycompany.com',credentials:'nameOfSystemCredentials',federatedUserId:"${submitter}@${releaseVersion}") {
    // do something
}

Use Jenkins UsernamePassword credentials information (Username: AccessKeyId, Password: SecretAccessKey):

withAWS(credentials:'IDofSystemCredentials') {
    // do something
}

Use Jenkins AWS credentials information (AWS Access Key: AccessKeyId, AWS Secret Key: SecretAccessKey):

withAWS(credentials:'IDofAwsCredentials') {
    // do something
}

Use profile information from ~/.aws/config:

withAWS(profile:'myProfile') {
    // do something
}

Assume role information (roleAccount is optional and defaults to the current account; externalId, roleSessionName, and policy are optional; duration is optional and represents the maximum amount of time in seconds the session may persist, defaulting to 3600):

withAWS(role:'admin', roleAccount:'123456789012', externalId: 'my-external-id', policy: '{"Version":"2012-10-17","Statement":[{"Sid":"Stmt1","Effect":"Deny","Action":"s3:DeleteObject","Resource":"*"}]}', duration: 3600, roleSessionName: 'my-custom-session-name') {
    // do something
}

Assume federated user ID information (federatedUserId is optional; if specified, it generates a set of temporary credentials and pushes the federated user ID into CloudTrail for auditing; duration is optional and represents the maximum amount of time in seconds the session may persist, defaulting to 3600):

withAWS(region:'eu-central-1',credentials:'nameOfSystemCredentials',federatedUserId:"${submitter}@${releaseVersion}", duration: 3600) {
    // do something
}

Authenticate with a SAML assertion (fetched from your company IdP) by assuming a role:

withAWS(role: 'myRole', roleAccount: '123456789', principalArn: 'arn:aws:iam::123456789:saml-provider/test', samlAssertion: 'base64SAML', region:'eu-west-1') {
  // do something
}

Authenticate by retrieving credentials from the node in scope:

node('myNode') { // Credentials will be fetched from this node
  withAWS(role: 'myRole', roleAccount: '123456789', region:'eu-west-1', useNode: true) {
    // do something
  }
}

When you use Jenkins Declarative Pipelines, you can also use withAWS in an options block:

options {
    withAWS(profile:'myProfile')
}
stages {
    ...
}
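
For illustration, a complete minimal Declarative Pipeline using withAWS in the options block might look like this (the profile, stage, bucket, and file names are placeholders):

pipeline {
    agent any
    options {
        // All steps in all stages run with this AWS profile in scope
        withAWS(profile:'myProfile')
    }
    stages {
        stage('Deploy') {
            steps {
                s3Upload(file:'file.txt', bucket:'my-bucket', path:'file.txt')
            }
        }
    }
}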

awsIdentity

Print current AWS identity information to the log.

The step returns an object with the following fields:

  • account - The AWS account ID number of the account that owns or contains the calling entity
  • user - The unique identifier of the calling entity
  • arn - The AWS ARN associated with the calling entity
def identity = awsIdentity()
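
The returned fields can then be used in later steps, for example:

def identity = awsIdentity()
// Print the caller's ARN and account ID to the build log
echo "Running as ${identity.arn} in account ${identity.account}"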

cfInvalidate

Invalidate the given paths in a CloudFront distribution; set waitForCompletion to true to block until the invalidation completes.

cfInvalidate(distribution:'someDistributionId', paths:['/*'])
cfInvalidate(distribution:'someDistributionId', paths:['/*'], waitForCompletion: true)

S3 Steps

All s3* steps take optional pathStyleAccessEnabled and payloadSigningEnabled boolean parameters.

s3Upload(pathStyleAccessEnabled: true, payloadSigningEnabled: true, file:'file.txt', bucket:'my-bucket', path:'path/to/target/file.txt')
s3Copy(pathStyleAccessEnabled: true, fromBucket:'my-bucket', fromPath:'path/to/source/file.txt', toBucket:'other-bucket', toPath:'path/to/destination/file.txt')
s3Delete(pathStyleAccessEnabled: true, bucket:'my-bucket', path:'path/to/source/file.txt')
s3Download(pathStyleAccessEnabled: true, file:'file.txt', bucket:'my-bucket', path:'path/to/source/file.txt', force:true)
exists = s3DoesObjectExist(pathStyleAccessEnabled: true, bucket:'my-bucket', path:'path/to/source/file.txt')
files = s3FindFiles(pathStyleAccessEnabled: true, bucket:'my-bucket')

s3Upload

Upload a file/folder from the workspace (or a String) to an S3 bucket. If the file parameter denotes a directory, the complete directory including all subfolders will be uploaded.

s3Upload(file:'file.txt', bucket:'my-bucket', path:'path/to/target/file.txt')
s3Upload(file:'someFolder', bucket:'my-bucket', path:'path/to/targetFolder/')

Another way to use it is with include/exclude patterns, which are applied within the specified subdirectory (workingDir). Each option accepts a comma-separated list of patterns.

s3Upload(bucket:"my-bucket", path:'path/to/targetFolder/', includePathPattern:'**/*', workingDir:'dist', excludePathPattern:'**/*.svg,**/*.jpg')

Custom user metadata can be added to uploaded files:

s3Upload(bucket:"my-bucket", path:'path/to/targetFolder/', includePathPattern:'**/*.svg', workingDir:'dist', metadatas:['Key:SomeValue','Another:Value'])

A specific Cache-Control header can be set on uploaded files:

s3Upload(bucket:"my-bucket", path:'path/to/targetFolder/', includePathPattern:'**/*.svg', workingDir:'dist', cacheControl:'public,max-age=31536000')

A specific content encoding can be set on uploaded files:

s3Upload(file:'file.txt', bucket:'my-bucket', contentEncoding: 'gzip')

A specific content type (and content disposition) can be set on uploaded files:

s3Upload(bucket:"my-bucket", path:'path/to/targetFolder/', includePathPattern:'**/*.ttf', workingDir:'dist', contentType:'application/x-font-ttf', contentDisposition:'attachment')

Canned ACLs can be added to upload requests.

s3Upload(file:'file.txt', bucket:'my-bucket', path:'path/to/target/file.txt', acl:'PublicRead')
s3Upload(file:'someFolder', bucket:'my-bucket', path:'path/to/targetFolder/', acl:'BucketOwnerFullControl')

A Server Side Encryption Algorithm can be added to upload requests.

s3Upload(file:'file.txt', bucket:'my-bucket', path:'path/to/target/file.txt', sseAlgorithm:'AES256')

A KMS alias or KMS id can be used to encrypt the uploaded file or directory at rest.

s3Upload(file: 'foo.txt', bucket: 'my-bucket', path: 'path/to/target/file.txt', kmsId: 'alias/foo')
s3Upload(file: 'foo.txt', bucket: 'my-bucket', path: 'path/to/target/file.txt', kmsId: '8e1d420d-bf94-4a15-a07a-8ad965abb30f')
s3Upload(file: 'bar-dir', bucket: 'my-bucket', path: 'path/to/target', kmsId: 'alias/bar')

A redirect location can be added to uploaded files.

s3Upload(file: 'file.txt', bucket: 'my-bucket', redirectLocation: '/redirect')

Create an S3 object whose content is the provided text argument:

s3Upload(path: 'file.txt', bucket: 'my-bucket', text: 'Some Text Content')
s3Upload(path: 'path/to/targetFolder/file.txt', bucket: 'my-bucket', text: 'Some Text Content')

Tags can be added to uploaded files.

s3Upload(file: 'file.txt', bucket: 'my-bucket', tags: '[tag1:value1, tag2:value2]')

def tags=[:]
tags["tag1"]="value1"
tags["tag2"]="value2"

s3Upload(file: 'file.txt', bucket: 'my-bucket', tags: tags.toString())

Log messages can be made less verbose. Disable verbose logging when the output becomes excessive, at the cost of losing visibility into which files have been uploaded to S3.

s3Upload(path: 'source/path/', bucket: 'my-bucket', verbose: false)

s3Download

Download a file/folder from S3 to the local workspace. Set the optional parameter force to true to overwrite an existing file in the workspace. If the path ends with a /, the complete virtual directory will be downloaded.

s3Download(file:'file.txt', bucket:'my-bucket', path:'path/to/source/file.txt', force:true)
s3Download(file:'targetFolder/', bucket:'my-bucket', path:'path/to/sourceFolder/', force:true)

s3Copy

Copy a file between S3 buckets.

s3Copy(fromBucket:'my-bucket', fromPath:'path/to/source/file.txt', toBucket:'other-bucket', toPath:'path/to/destination/file.txt')

s3Delete

Delete a file/folder from S3. If the path ends in a "/", then the path will be interpreted to be a folder, and all of its contents will be removed.

s3Delete(bucket:'my-bucket', path:'path/to/source/file.txt')
s3Delete(bucket:'my-bucket', path:'path/to/sourceFolder/')

s3DoesObjectExist

Check if an object exists in an S3 bucket.

exists = s3DoesObjectExist(bucket:'my-bucket', path:'path/to/source/file.txt')
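
Because the step returns a boolean, it can gate later logic. A minimal sketch (the bucket and key names are illustrative):

if (s3DoesObjectExist(bucket:'my-bucket', path:'releases/app.zip')) {
    // Skip the upload when the object is already in place
    echo 'Object already exists, skipping upload.'
} else {
    s3Upload(file:'app.zip', bucket:'my-bucket', path:'releases/app.zip')
}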

s3FindFiles

This provides a way to query the files/folders in the S3 bucket, analogous to the findFiles step provided by "pipeline-utility-steps-plugin". If specified, the path limits the scope of the operation to that folder only. The glob parameter tells s3FindFiles what to look for. This can be a file name, a full path to a file, or a standard glob ("*", "*.ext", "path/**/file.ext", etc.).

If you do not specify path, then it will default to the root of the bucket. The path is assumed to be a folder; you do not need to end it with a "/", but it is okay if you do. The path property of the results will be relative to this value.

This works by enumerating every file/folder in the S3 bucket under path and then performing glob matching. When possible, you should use path to limit the search space for efficiency purposes.

If you do not specify glob, then it will default to "*".

By default, this will return both files and folders. To only return files, set the onlyFiles parameter to true.

files = s3FindFiles(bucket:'my-bucket')
files = s3FindFiles(bucket:'my-bucket', glob:'path/to/targetFolder/file.ext')
files = s3FindFiles(bucket:'my-bucket', path:'path/to/targetFolder/', glob:'file.ext')
files = s3FindFiles(bucket:'my-bucket', path:'path/to/targetFolder/', glob:'*.ext')
files = s3FindFiles(bucket:'my-bucket', path:'path/', glob:'**/file.ext')

s3FindFiles returns an array of FileWrapper objects identical to those returned by findFiles.

Each FileWrapper object has the following properties:

  • name: the filename portion of the path (for "path/to/my/file.ext", this would be "file.ext")
  • path: the full path of the file, relative to the path specified (for path="path/to/", this property of the file "path/to/my/file.ext" would be "my/file.ext")
  • directory: true if this is a directory; false otherwise
  • length: the length of the file (this is always "0" for directories)
  • lastModified: the last modification timestamp, in milliseconds since the Unix epoch (this is always "0" for directories)

When used in a string context, a FileWrapper object returns the value of its path property.
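
For example, the result can be iterated to act on each match (the bucket, path, and glob are illustrative):

def files = s3FindFiles(bucket:'my-bucket', path:'releases/', glob:'**/*.zip', onlyFiles:true)
for (f in files) {
    // path is relative to 'releases/'; length and lastModified come from the object metadata
    echo "${f.path} (${f.length} bytes, last modified ${f.lastModified})"
}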

s3PresignURL

Presigns the given bucket/key and returns a URL. Defaults to a duration of 1 minute and the GET HTTP method.

def url = s3PresignURL(bucket: 'mybucket', key: 'mykey')

The duration can be overridden:

def url = s3PresignURL(bucket: 'mybucket', key: 'mykey', durationInSeconds: 300) //5 minutes

The method can also be overridden:

def url = s3PresignURL(bucket: 'mybucket', key: 'mykey', httpMethod: 'POST')

cfnValidate

Validates the given CloudFormation template.

def response = cfnValidate(file:'template.yaml')
echo "template description: ${response.description}"

cfnUpdate

Create or update the given CloudFormation stack using the given template from the workspace. You can specify an optional list of parameters, either as key/value pairs or a map. You can also specify a list of parameters (keepParams) that will keep their previous values on stack updates.

Using timeoutInMinutes you can specify the amount of time that can pass before the stack status becomes CREATE_FAILED and the stack gets rolled back. Due to limitations in the AWS API, this only applies to stack creation.

If you have many parameters, you can specify a paramsFile containing them. The format is either a standard JSON file, as used by the AWS CLI, or a YAML file for the cfn-params command-line utility.
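
As a minimal sketch (the parameter key, value, and file names are illustrative), such a JSON params file can be generated in the pipeline and passed via paramsFile:

// Write a CLI-style parameters file into the workspace
writeFile file: 'params.json', text: '''[
    {"ParameterKey": "InstanceType", "ParameterValue": "t2.nano"}
]'''
cfnUpdate(stack:'my-stack', file:'template.yaml', paramsFile:'params.json')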

Additionally, you can specify a list of tags that are set on the stack and all resources created by CloudFormation.

The step returns the outputs of the stack as a map. It also contains special values prefixed with jenkins, such as jenkinsStackUpdateStatus, which contains "true"/"false" depending on whether the stack was modified.
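
A typical invocation combining these options might look like the following (the stack name, template, parameter values, and tags are illustrative):

def outputs = cfnUpdate(stack:'my-stack', file:'template.yaml', params:['InstanceType=t2.nano'], keepParams:['Version'], tags:['TeamName=DevOps'], timeoutInMinutes:10)
// The returned map contains the stack outputs plus the jenkins-prefixed status values
echo "Stack was modified: ${outputs.jenkinsStackUpdateStatus}"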
