software.amazon.awscdk.services.codepipeline.actions.package-info
/**
* AWS CodePipeline Actions
*
* ---
 *
* This package contains Actions that can be used in a CodePipeline.
*
*
* import software.amazon.awscdk.services.codepipeline.*;
* import software.amazon.awscdk.services.codepipeline.actions.*;
*
*
*
 * ## Sources
*
*
 * ### AWS CodeCommit
*
* To use a CodeCommit Repository in a CodePipeline:
*
*
* Repository repo = Repository.Builder.create(this, "Repo")
* .repositoryName("MyRepo")
* .build();
*
* Pipeline pipeline = Pipeline.Builder.create(this, "MyPipeline")
* .pipelineName("MyPipeline")
* .build();
* Artifact sourceOutput = new Artifact();
* CodeCommitSourceAction sourceAction = CodeCommitSourceAction.Builder.create()
* .actionName("CodeCommit")
* .repository(repo)
* .output(sourceOutput)
* .build();
* pipeline.addStage(StageOptions.builder()
* .stageName("Source")
* .actions(List.of(sourceAction))
* .build());
*
*
 * If you want to use an existing role for the CloudWatch Events rule that triggers on commits,
 * you can specify it in the eventRole property:
*
*
* Repository repo;
* IRole eventRole = Role.fromRoleArn(this, "Event-role", "roleArn");
* CodeCommitSourceAction sourceAction = CodeCommitSourceAction.Builder.create()
* .actionName("CodeCommit")
* .repository(repo)
* .output(new Artifact())
* .eventRole(eventRole)
* .build();
*
*
 * If you want to clone the entire CodeCommit repository (only available for CodeBuild actions),
 * you can set the codeBuildCloneOutput property to true:
*
*
* PipelineProject project;
* Repository repo;
*
* Artifact sourceOutput = new Artifact();
* CodeCommitSourceAction sourceAction = CodeCommitSourceAction.Builder.create()
* .actionName("CodeCommit")
* .repository(repo)
* .output(sourceOutput)
* .codeBuildCloneOutput(true)
* .build();
*
* CodeBuildAction buildAction = CodeBuildAction.Builder.create()
* .actionName("CodeBuild")
* .project(project)
* .input(sourceOutput) // The build action must use the CodeCommitSourceAction output as input.
* .outputs(List.of(new Artifact()))
* .build();
*
*
* The CodeCommit source action emits variables:
*
*
* PipelineProject project;
* Repository repo;
*
* Artifact sourceOutput = new Artifact();
* CodeCommitSourceAction sourceAction = CodeCommitSourceAction.Builder.create()
* .actionName("CodeCommit")
* .repository(repo)
* .output(sourceOutput)
* .variablesNamespace("MyNamespace")
* .build();
*
 * // later:
* CodeBuildAction.Builder.create()
* .actionName("CodeBuild")
* .project(project)
* .input(sourceOutput)
* .environmentVariables(Map.of(
* "COMMIT_ID", BuildEnvironmentVariable.builder()
* .value(sourceAction.getVariables().getCommitId())
* .build()))
* .build();
*
*
*
 * ### GitHub
*
* If you want to use a GitHub repository as the source, you must create:
*
*
 * - A GitHub Access Token, with scopes repo and admin:repo_hook.
 * - A Secrets Manager Secret with the value of the GitHub Access Token.
 *   Pick whatever name you want (for example my-github-token).
 *   This token can be stored either as Plaintext or as a Secret key/value.
 *   If you stored the token as Plaintext,
 *   set SecretValue.secretsManager('my-github-token') as the value of oauthToken.
 *   If you stored it as a Secret key/value,
 *   you must set SecretValue.secretsManager('my-github-token', { jsonField : 'my-github-token' })
 *   as the value of oauthToken.
*
*
* To use GitHub as the source of a CodePipeline:
*
*
* // Read the secret from Secrets Manager
* Pipeline pipeline = new Pipeline(this, "MyPipeline");
* Artifact sourceOutput = new Artifact();
* GitHubSourceAction sourceAction = GitHubSourceAction.Builder.create()
* .actionName("GitHub_Source")
* .owner("awslabs")
* .repo("aws-cdk")
* .oauthToken(SecretValue.secretsManager("my-github-token"))
* .output(sourceOutput)
* .branch("develop")
* .build();
* pipeline.addStage(StageOptions.builder()
* .stageName("Source")
* .actions(List.of(sourceAction))
* .build());
*
*
* The GitHub source action emits variables:
*
*
* Artifact sourceOutput;
* PipelineProject project;
*
*
* GitHubSourceAction sourceAction = GitHubSourceAction.Builder.create()
* .actionName("Github_Source")
* .output(sourceOutput)
* .owner("my-owner")
* .repo("my-repo")
* .oauthToken(SecretValue.secretsManager("my-github-token"))
* .variablesNamespace("MyNamespace")
* .build();
*
 * // later:
* CodeBuildAction.Builder.create()
* .actionName("CodeBuild")
* .project(project)
* .input(sourceOutput)
* .environmentVariables(Map.of(
* "COMMIT_URL", BuildEnvironmentVariable.builder()
* .value(sourceAction.getVariables().getCommitUrl())
* .build()))
* .build();
*
*
*
 * ### BitBucket
*
* CodePipeline can use a BitBucket Git repository as a source:
*
* Note: you have to manually connect CodePipeline through the AWS Console with your BitBucket account.
* This is a one-time operation for a given AWS account in a given region.
* The simplest way to do that is to either start creating a new CodePipeline,
* or edit an existing one, while being logged in to BitBucket.
* Choose BitBucket as the source,
* and grant CodePipeline permissions to your BitBucket account.
 * Copy & paste the Connection ARN that you get in the console,
 * or use the codestar-connections list-connections AWS CLI operation to find it.
* After that, you can safely abort creating or editing the pipeline -
* the connection has already been created.
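 * As a sketch of that CLI lookup (it assumes the AWS CLI is configured with credentials for the account and region in question; the output columns chosen here are just one convenient projection):

```shell
# List existing CodeStar Connections and show name, ARN and status;
# restrict the listing to Bitbucket-provider connections.
aws codestar-connections list-connections \
    --provider-type-filter Bitbucket \
    --query "Connections[].{Name:ConnectionName,Arn:ConnectionArn,Status:ConnectionStatus}" \
    --output table
```

 * A connection whose Status is PENDING has not finished the console handshake yet and cannot be used by the action.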
*
*
* Artifact sourceOutput = new Artifact();
* CodeStarConnectionsSourceAction sourceAction = CodeStarConnectionsSourceAction.Builder.create()
* .actionName("BitBucket_Source")
* .owner("aws")
* .repo("aws-cdk")
* .output(sourceOutput)
* .connectionArn("arn:aws:codestar-connections:us-east-1:123456789012:connection/12345678-abcd-12ab-34cdef5678gh")
* .build();
*
*
 * You can also use the CodeStarConnectionsSourceAction to connect to GitHub, in the same way
 * (you just have to select GitHub as the source when creating the connection in the console).
 *
 * Similarly to GitHubSourceAction, CodeStarConnectionsSourceAction also emits the variables:
*
*
* Project project;
*
*
* Artifact sourceOutput = new Artifact();
* CodeStarConnectionsSourceAction sourceAction = CodeStarConnectionsSourceAction.Builder.create()
* .actionName("BitBucket_Source")
* .owner("aws")
* .repo("aws-cdk")
* .output(sourceOutput)
* .connectionArn("arn:aws:codestar-connections:us-east-1:123456789012:connection/12345678-abcd-12ab-34cdef5678gh")
* .variablesNamespace("SomeSpace")
* .build();
*
 * // later:
* CodeBuildAction.Builder.create()
* .actionName("CodeBuild")
* .project(project)
* .input(sourceOutput)
* .environmentVariables(Map.of(
* "COMMIT_ID", BuildEnvironmentVariable.builder()
* .value(sourceAction.getVariables().getCommitId())
* .build()))
* .build();
*
*
*
 * ### AWS S3 Source
*
* To use an S3 Bucket as a source in CodePipeline:
*
*
* Bucket sourceBucket = Bucket.Builder.create(this, "MyBucket")
* .versioned(true)
* .build();
*
* Pipeline pipeline = new Pipeline(this, "MyPipeline");
* Artifact sourceOutput = new Artifact();
* S3SourceAction sourceAction = S3SourceAction.Builder.create()
* .actionName("S3Source")
* .bucket(sourceBucket)
* .bucketKey("path/to/file.zip")
* .output(sourceOutput)
* .build();
* pipeline.addStage(StageOptions.builder()
* .stageName("Source")
* .actions(List.of(sourceAction))
* .build());
*
*
* The region of the action will be determined by the region the bucket itself is in.
* When using a newly created bucket,
* that region will be taken from the stack the bucket belongs to;
* for an imported bucket,
* you can specify the region explicitly:
*
*
* IBucket sourceBucket = Bucket.fromBucketAttributes(this, "SourceBucket", BucketAttributes.builder()
* .bucketName("my-bucket")
* .region("ap-southeast-1")
* .build());
*
*
* By default, the Pipeline will poll the Bucket to detect changes.
 * You can change that behavior to use CloudWatch Events by setting the trigger
 * property to S3Trigger.EVENTS (it's S3Trigger.POLL by default).
* If you do that, make sure the source Bucket is part of an AWS CloudTrail Trail -
* otherwise, the CloudWatch Events will not be emitted,
* and your Pipeline will not react to changes in the Bucket.
* You can do it through the CDK:
*
*
* import software.amazon.awscdk.services.cloudtrail.*;
*
* Bucket sourceBucket;
*
* Artifact sourceOutput = new Artifact();
* String key = "some/key.zip";
* Trail trail = new Trail(this, "CloudTrail");
* trail.addS3EventSelector(List.of(S3EventSelector.builder()
* .bucket(sourceBucket)
* .objectPrefix(key)
* .build()), AddEventSelectorOptions.builder()
* .readWriteType(ReadWriteType.WRITE_ONLY)
* .build());
* S3SourceAction sourceAction = S3SourceAction.Builder.create()
* .actionName("S3Source")
* .bucketKey(key)
* .bucket(sourceBucket)
* .output(sourceOutput)
* .trigger(S3Trigger.EVENTS)
* .build();
*
*
* The S3 source action emits variables:
*
*
* Bucket sourceBucket;
*
* // later:
* PipelineProject project;
* String key = "some/key.zip";
* Artifact sourceOutput = new Artifact();
* S3SourceAction sourceAction = S3SourceAction.Builder.create()
* .actionName("S3Source")
* .bucketKey(key)
* .bucket(sourceBucket)
* .output(sourceOutput)
* .variablesNamespace("MyNamespace")
* .build();
* CodeBuildAction.Builder.create()
* .actionName("CodeBuild")
* .project(project)
* .input(sourceOutput)
* .environmentVariables(Map.of(
* "VERSION_ID", BuildEnvironmentVariable.builder()
* .value(sourceAction.getVariables().getVersionId())
* .build()))
* .build();
*
*
*
 * ### AWS ECR
*
* To use an ECR Repository as a source in a Pipeline:
*
*
* import software.amazon.awscdk.services.ecr.*;
*
* Repository ecrRepository;
*
* Pipeline pipeline = new Pipeline(this, "MyPipeline");
* Artifact sourceOutput = new Artifact();
* EcrSourceAction sourceAction = EcrSourceAction.Builder.create()
* .actionName("ECR")
* .repository(ecrRepository)
* .imageTag("some-tag") // optional, default: 'latest'
* .output(sourceOutput)
* .build();
* pipeline.addStage(StageOptions.builder()
* .stageName("Source")
* .actions(List.of(sourceAction))
* .build());
*
*
* The ECR source action emits variables:
*
*
* import software.amazon.awscdk.services.ecr.*;
* Repository ecrRepository;
*
* // later:
* PipelineProject project;
*
*
* Artifact sourceOutput = new Artifact();
* EcrSourceAction sourceAction = EcrSourceAction.Builder.create()
* .actionName("Source")
* .output(sourceOutput)
* .repository(ecrRepository)
* .variablesNamespace("MyNamespace")
* .build();
* CodeBuildAction.Builder.create()
* .actionName("CodeBuild")
* .project(project)
* .input(sourceOutput)
* .environmentVariables(Map.of(
* "IMAGE_URI", BuildEnvironmentVariable.builder()
* .value(sourceAction.getVariables().getImageUri())
* .build()))
* .build();
*
*
*
 * ## Build & test
*
*
 * ### AWS CodeBuild
*
* Example of a CodeBuild Project used in a Pipeline, alongside CodeCommit:
*
*
* Repository repository = Repository.Builder.create(this, "MyRepository")
* .repositoryName("MyRepository")
* .build();
* PipelineProject project = new PipelineProject(this, "MyProject");
*
* Artifact sourceOutput = new Artifact();
* CodeCommitSourceAction sourceAction = CodeCommitSourceAction.Builder.create()
* .actionName("CodeCommit")
* .repository(repository)
* .output(sourceOutput)
* .build();
* CodeBuildAction buildAction = CodeBuildAction.Builder.create()
* .actionName("CodeBuild")
* .project(project)
* .input(sourceOutput)
* .outputs(List.of(new Artifact())) // optional
* .executeBatchBuild(true) // optional, defaults to false
* .combineBatchBuildArtifacts(true)
* .build();
*
* Pipeline.Builder.create(this, "MyPipeline")
* .stages(List.of(StageProps.builder()
* .stageName("Source")
* .actions(List.of(sourceAction))
* .build(), StageProps.builder()
* .stageName("Build")
* .actions(List.of(buildAction))
* .build()))
* .build();
*
*
 * The default category of the CodeBuild Action is Build;
 * if you want a Test Action instead,
 * override the type property:
*
*
* PipelineProject project;
*
* Artifact sourceOutput = new Artifact();
* CodeBuildAction testAction = CodeBuildAction.Builder.create()
* .actionName("IntegrationTest")
* .project(project)
* .input(sourceOutput)
* .type(CodeBuildActionType.TEST)
* .build();
*
*
*
 * #### Multiple inputs and outputs
*
 * When you want to have multiple inputs and/or outputs for a Project used in a
 * Pipeline, instead of using the secondarySources and secondaryArtifacts
 * properties of the Project class, you need to use the extraInputs and
 * outputs properties of the CodeBuild CodePipeline Actions. Example:
*
*
* Repository repository1;
* Repository repository2;
*
* PipelineProject project;
*
* Artifact sourceOutput1 = new Artifact();
* CodeCommitSourceAction sourceAction1 = CodeCommitSourceAction.Builder.create()
* .actionName("Source1")
* .repository(repository1)
* .output(sourceOutput1)
* .build();
* Artifact sourceOutput2 = new Artifact("source2");
* CodeCommitSourceAction sourceAction2 = CodeCommitSourceAction.Builder.create()
* .actionName("Source2")
* .repository(repository2)
* .output(sourceOutput2)
* .build();
* CodeBuildAction buildAction = CodeBuildAction.Builder.create()
* .actionName("Build")
* .project(project)
* .input(sourceOutput1)
* .extraInputs(List.of(sourceOutput2))
* .outputs(List.of(
* new Artifact("artifact1"), // for better buildspec readability - see below
* new Artifact("artifact2")))
* .build();
*
*
 * Note: when a CodeBuild Action in a Pipeline has more than one output, it
 * only uses the secondary-artifacts field of the buildspec, never the
 * primary output specification directly under artifacts. Because of that, it
 * pays to explicitly name all output artifacts of that Action, like we did
 * above, so that you know what name to use in the buildspec.
*
* Example buildspec for the above project:
*
*
* PipelineProject project = PipelineProject.Builder.create(this, "MyProject")
* .buildSpec(BuildSpec.fromObject(Map.of(
* "version", "0.2",
* "phases", Map.of(
* "build", Map.of(
* "commands", List.of())),
* "artifacts", Map.of(
* "secondary-artifacts", Map.of(
* "artifact1", Map.of(),
* "artifact2", Map.of())))))
* .build();
*
*
*
 * #### Variables
*
* The CodeBuild action emits variables.
* Unlike many other actions, the variables are not static,
* but dynamic, defined in the buildspec,
* in the 'exported-variables' subsection of the 'env' section.
* Example:
*
*
* // later:
* PipelineProject project;
* Artifact sourceOutput = new Artifact();
* CodeBuildAction buildAction = CodeBuildAction.Builder.create()
* .actionName("Build1")
* .input(sourceOutput)
* .project(PipelineProject.Builder.create(this, "Project")
* .buildSpec(BuildSpec.fromObject(Map.of(
* "version", "0.2",
* "env", Map.of(
* "exported-variables", List.of("MY_VAR")),
* "phases", Map.of(
* "build", Map.of(
* "commands", "export MY_VAR=\"some value\"")))))
* .build())
* .variablesNamespace("MyNamespace")
* .build();
* CodeBuildAction.Builder.create()
* .actionName("CodeBuild")
* .project(project)
* .input(sourceOutput)
* .environmentVariables(Map.of(
* "MyVar", BuildEnvironmentVariable.builder()
* .value(buildAction.variable("MY_VAR"))
* .build()))
* .build();
*
*
*
 * ### Jenkins
*
 * In order to use Jenkins Actions in the Pipeline,
 * you first need to create a JenkinsProvider:
*
*
* JenkinsProvider jenkinsProvider = JenkinsProvider.Builder.create(this, "JenkinsProvider")
* .providerName("MyJenkinsProvider")
* .serverUrl("http://my-jenkins.com:8080")
* .version("2")
* .build();
*
*
* If you've registered a Jenkins provider in a different CDK app,
* or outside the CDK (in the CodePipeline AWS Console, for example),
* you can import it:
*
*
* IJenkinsProvider jenkinsProvider = JenkinsProvider.fromJenkinsProviderAttributes(this, "JenkinsProvider", JenkinsProviderAttributes.builder()
* .providerName("MyJenkinsProvider")
* .serverUrl("http://my-jenkins.com:8080")
* .version("2")
* .build());
*
*
 * Note that a Jenkins provider
 * (identified by the tuple of provider name, category (build/test), and version)
 * must always be registered in the given account, in the given AWS region,
 * before it can be used in CodePipeline.
*
 * With a JenkinsProvider,
 * we can create a Jenkins Action:
*
*
* JenkinsProvider jenkinsProvider;
*
* JenkinsAction buildAction = JenkinsAction.Builder.create()
* .actionName("JenkinsBuild")
* .jenkinsProvider(jenkinsProvider)
* .projectName("MyProject")
* .type(JenkinsActionType.BUILD)
* .build();
*
*
*
 * ## Deploy
*
*
 * ### AWS CloudFormation
*
 * This module contains Actions that allow you to deploy to CloudFormation from AWS CodePipeline.
*
* For example, the following code fragment defines a pipeline that automatically deploys a CloudFormation template
* directly from a CodeCommit repository, with a manual approval step in between to confirm the changes:
*
*
* // Source stage: read from repository
* Repository repo = Repository.Builder.create(stack, "TemplateRepo")
* .repositoryName("template-repo")
* .build();
* Artifact sourceOutput = new Artifact("SourceArtifact");
* CodeCommitSourceAction source = CodeCommitSourceAction.Builder.create()
* .actionName("Source")
* .repository(repo)
* .output(sourceOutput)
* .trigger(CodeCommitTrigger.POLL)
* .build();
 * StageProps sourceStage = StageProps.builder()
 *         .stageName("Source")
 *         .actions(List.of(source))
 *         .build();
 *
 * // Deployment stage: create and deploy changeset with manual approval
 * String stackName = "OurStack";
 * String changeSetName = "StagedChangeSet";
 *
 * StageProps prodStage = StageProps.builder()
 *         .stageName("Deploy")
 *         .actions(List.of(
 *                 CloudFormationCreateReplaceChangeSetAction.Builder.create()
 *                         .actionName("PrepareChanges")
 *                         .stackName(stackName)
 *                         .changeSetName(changeSetName)
 *                         .adminPermissions(true)
 *                         .templatePath(sourceOutput.atPath("template.yaml"))
 *                         .runOrder(1)
 *                         .build(),
 *                 ManualApprovalAction.Builder.create()
 *                         .actionName("ApproveChanges")
 *                         .runOrder(2)
 *                         .build(),
 *                 CloudFormationExecuteChangeSetAction.Builder.create()
 *                         .actionName("ExecuteChanges")
 *                         .stackName(stackName)
 *                         .changeSetName(changeSetName)
 *                         .runOrder(3)
 *                         .build()))
 *         .build();
*
* Pipeline.Builder.create(stack, "Pipeline")
* .stages(List.of(sourceStage, prodStage))
* .build();
*
*
* See the AWS documentation
* for more details about using CloudFormation in CodePipeline.
*
*
 * #### Actions for updating individual CloudFormation Stacks
*
* This package contains the following CloudFormation actions:
*
*
 * - CloudFormationCreateUpdateStackAction - Deploy a CloudFormation template directly from the pipeline. The indicated stack is created,
 *   or updated if it already exists. If the stack is in a failure state, deployment will fail
 *   (unless replaceOnFailure is set to true, in which case it will be destroyed and recreated).
* - CloudFormationDeleteStackAction - Delete the stack with the given name.
* - CloudFormationCreateReplaceChangeSetAction - Prepare a change set to be applied later. You will typically use change sets if you want
* to manually verify the changes that are being staged, or if you want to separate the people (or system) preparing the
* changes from the people (or system) applying the changes.
* - CloudFormationExecuteChangeSetAction - Execute a change set prepared previously.
*
*
*
 * #### Actions for deploying CloudFormation StackSets to multiple accounts
*
* You can use CloudFormation StackSets to deploy the same CloudFormation template to multiple
* accounts in a managed way. If you use AWS Organizations, StackSets can be deployed to
* all accounts in a particular Organizational Unit (OU), and even automatically to new
* accounts as soon as they are added to a particular OU. For more information, see
* the Working with StackSets
* section of the CloudFormation developer guide.
*
* The actions available for updating StackSets are:
*
*
 * - CloudFormationDeployStackSetAction - Create or update a CloudFormation StackSet directly from the pipeline, optionally
 *   immediately creating and updating Stack Instances as well.
 * - CloudFormationDeployStackInstancesAction - Update outdated Stack Instances using the current version of the StackSet.
*
*
* Here's an example of using both of these actions:
*
*
* Pipeline pipeline;
* Artifact sourceOutput;
*
*
* pipeline.addStage(StageOptions.builder()
* .stageName("DeployStackSets")
* .actions(List.of(
* // First, update the StackSet itself with the newest template
* CloudFormationDeployStackSetAction.Builder.create()
* .actionName("UpdateStackSet")
* .runOrder(1)
* .stackSetName("MyStackSet")
* .template(StackSetTemplate.fromArtifactPath(sourceOutput.atPath("template.yaml")))
*
* // Change this to 'StackSetDeploymentModel.organizations()' if you want to deploy to OUs
* .deploymentModel(StackSetDeploymentModel.selfManaged())
* // This deploys to a set of accounts
* .stackInstances(StackInstances.inAccounts(List.of("111111111111"), List.of("us-east-1", "eu-west-1")))
* .build(),
*
* // Afterwards, update/create additional instances in other accounts
* CloudFormationDeployStackInstancesAction.Builder.create()
* .actionName("AddMoreInstances")
* .runOrder(2)
* .stackSetName("MyStackSet")
* .stackInstances(StackInstances.inAccounts(List.of("222222222222", "333333333333"), List.of("us-east-1", "eu-west-1")))
* .build()))
* .build());
*
*
*
 * #### Lambda deployed through CodePipeline
*
* If you want to deploy your Lambda through CodePipeline,
* and you don't use assets (for example, because your CDK code and Lambda code are separate),
 * you can use a special Lambda Code class, CfnParametersCode.
* Note that your Lambda must be in a different Stack than your Pipeline.
* The Lambda itself will be deployed, alongside the entire Stack it belongs to,
* using a CloudFormation CodePipeline Action. Example:
*
*
* Stack lambdaStack = new Stack(app, "LambdaStack");
* CfnParametersCode lambdaCode = Code.fromCfnParameters();
* Function.Builder.create(lambdaStack, "Lambda")
* .code(lambdaCode)
* .handler("index.handler")
* .runtime(Runtime.NODEJS_14_X)
* .build();
* // other resources that your Lambda needs, added to the lambdaStack...
*
* Stack pipelineStack = new Stack(app, "PipelineStack");
* Pipeline pipeline = new Pipeline(pipelineStack, "Pipeline");
*
* // add the source code repository containing this code to your Pipeline,
* // and the source code of the Lambda Function, if they're separate
* Artifact cdkSourceOutput = new Artifact();
* CodeCommitSourceAction cdkSourceAction = CodeCommitSourceAction.Builder.create()
* .repository(Repository.Builder.create(pipelineStack, "CdkCodeRepo")
* .repositoryName("CdkCodeRepo")
* .build())
* .actionName("CdkCode_Source")
* .output(cdkSourceOutput)
* .build();
* Artifact lambdaSourceOutput = new Artifact();
* CodeCommitSourceAction lambdaSourceAction = CodeCommitSourceAction.Builder.create()
* .repository(Repository.Builder.create(pipelineStack, "LambdaCodeRepo")
* .repositoryName("LambdaCodeRepo")
* .build())
* .actionName("LambdaCode_Source")
* .output(lambdaSourceOutput)
* .build();
* pipeline.addStage(StageOptions.builder()
* .stageName("Source")
* .actions(List.of(cdkSourceAction, lambdaSourceAction))
* .build());
*
* // synthesize the Lambda CDK template, using CodeBuild
* // the below values are just examples, assuming your CDK code is in TypeScript/JavaScript -
* // adjust the build environment and/or commands accordingly
* Project cdkBuildProject = Project.Builder.create(pipelineStack, "CdkBuildProject")
* .environment(BuildEnvironment.builder()
* .buildImage(LinuxBuildImage.UBUNTU_14_04_NODEJS_10_1_0)
* .build())
* .buildSpec(BuildSpec.fromObject(Map.of(
* "version", "0.2",
* "phases", Map.of(
* "install", Map.of(
* "commands", "npm install"),
* "build", Map.of(
* "commands", List.of("npm run build", "npm run cdk synth LambdaStack -- -o ."))),
* "artifacts", Map.of(
* "files", "LambdaStack.template.yaml"))))
* .build();
* Artifact cdkBuildOutput = new Artifact();
* CodeBuildAction cdkBuildAction = CodeBuildAction.Builder.create()
* .actionName("CDK_Build")
* .project(cdkBuildProject)
* .input(cdkSourceOutput)
* .outputs(List.of(cdkBuildOutput))
* .build();
*
* // build your Lambda code, using CodeBuild
* // again, this example assumes your Lambda is written in TypeScript/JavaScript -
* // make sure to adjust the build environment and/or commands if they don't match your specific situation
* Project lambdaBuildProject = Project.Builder.create(pipelineStack, "LambdaBuildProject")
* .environment(BuildEnvironment.builder()
* .buildImage(LinuxBuildImage.UBUNTU_14_04_NODEJS_10_1_0)
* .build())
* .buildSpec(BuildSpec.fromObject(Map.of(
* "version", "0.2",
* "phases", Map.of(
* "install", Map.of(
* "commands", "npm install"),
* "build", Map.of(
* "commands", "npm run build")),
* "artifacts", Map.of(
* "files", List.of("index.js", "node_modules/**/*")))))
* .build();
* Artifact lambdaBuildOutput = new Artifact();
* CodeBuildAction lambdaBuildAction = CodeBuildAction.Builder.create()
* .actionName("Lambda_Build")
* .project(lambdaBuildProject)
* .input(lambdaSourceOutput)
* .outputs(List.of(lambdaBuildOutput))
* .build();
*
* pipeline.addStage(StageOptions.builder()
* .stageName("Build")
* .actions(List.of(cdkBuildAction, lambdaBuildAction))
* .build());
*
* // finally, deploy your Lambda Stack
* pipeline.addStage(StageOptions.builder()
* .stageName("Deploy")
* .actions(List.of(
* CloudFormationCreateUpdateStackAction.Builder.create()
* .actionName("Lambda_CFN_Deploy")
* .templatePath(cdkBuildOutput.atPath("LambdaStack.template.yaml"))
* .stackName("LambdaStackDeployedName")
* .adminPermissions(true)
* .parameterOverrides(lambdaCode.assign(lambdaBuildOutput.getS3Location()))
* .extraInputs(List.of(lambdaBuildOutput))
* .build()))
* .build());
*
*
*
 * #### Cross-account actions
*
 * If you want to update stacks in a different account,
 * pass the account property when creating the action:
*
*
* Artifact sourceOutput = new Artifact();
* CloudFormationCreateUpdateStackAction.Builder.create()
* .actionName("CloudFormationCreateUpdate")
* .stackName("MyStackName")
* .adminPermissions(true)
* .templatePath(sourceOutput.atPath("template.yaml"))
* .account("123456789012")
* .build();
*
*
 * This will create a new stack, called &lt;PipelineStackName&gt;-support-123456789012, in your App,
* that will contain the role that the pipeline will assume in account 123456789012 before executing this action.
* This support stack will automatically be deployed before the stack containing the pipeline.
*
* You can also pass a role explicitly when creating the action -
 * in that case, the account property is ignored,
 * and the action will operate in the same account the role belongs to:
*
*
* import software.amazon.awscdk.core.PhysicalName;
*
* // in stack for account 123456789012...
* Stack otherAccountStack;
*
* Role actionRole = Role.Builder.create(otherAccountStack, "ActionRole")
* .assumedBy(new AccountPrincipal("123456789012"))
* // the role has to have a physical name set
* .roleName(PhysicalName.GENERATE_IF_NEEDED)
* .build();
*
* // in the pipeline stack...
* Artifact sourceOutput = new Artifact();
* CloudFormationCreateUpdateStackAction.Builder.create()
* .actionName("CloudFormationCreateUpdate")
* .stackName("MyStackName")
* .adminPermissions(true)
* .templatePath(sourceOutput.atPath("template.yaml"))
* .role(actionRole)
* .build();
*
*
*
 * ### AWS CodeDeploy
*
*
 * #### Server deployments
*
* To use CodeDeploy for EC2/on-premise deployments in a Pipeline:
*
*
* ServerDeploymentGroup deploymentGroup;
* Pipeline pipeline = Pipeline.Builder.create(this, "MyPipeline")
* .pipelineName("MyPipeline")
* .build();
*
* // add the source and build Stages to the Pipeline...
* Artifact buildOutput = new Artifact();
* CodeDeployServerDeployAction deployAction = CodeDeployServerDeployAction.Builder.create()
* .actionName("CodeDeploy")
* .input(buildOutput)
* .deploymentGroup(deploymentGroup)
* .build();
* pipeline.addStage(StageOptions.builder()
* .stageName("Deploy")
* .actions(List.of(deployAction))
* .build());
*
*
*
 * #### Lambda deployments
*
* To use CodeDeploy for blue-green Lambda deployments in a Pipeline:
*
*
* CfnParametersCode lambdaCode = Code.fromCfnParameters();
* Function func = Function.Builder.create(this, "Lambda")
* .code(lambdaCode)
* .handler("index.handler")
* .runtime(Runtime.NODEJS_14_X)
* .build();
* // used to make sure each CDK synthesis produces a different Version
* Version version = func.getCurrentVersion();
* Alias alias = Alias.Builder.create(this, "LambdaAlias")
* .aliasName("Prod")
* .version(version)
* .build();
*
* LambdaDeploymentGroup.Builder.create(this, "DeploymentGroup")
* .alias(alias)
* .deploymentConfig(LambdaDeploymentConfig.LINEAR_10PERCENT_EVERY_1MINUTE)
* .build();
*
*
* Then, you need to create your Pipeline Stack,
* where you will define your Pipeline,
 * and deploy the lambdaStack using a CloudFormation CodePipeline Action
* (see above for a complete example).
*
*
 * ### ECS
*
* CodePipeline can deploy an ECS service.
* The deploy Action receives one input Artifact which contains the image definition file:
*
*
* import software.amazon.awscdk.services.ecs.*;
*
* FargateService service;
*
* Pipeline pipeline = new Pipeline(this, "MyPipeline");
* Artifact buildOutput = new Artifact();
* IStage deployStage = pipeline.addStage(StageOptions.builder()
* .stageName("Deploy")
* .actions(List.of(
* EcsDeployAction.Builder.create()
* .actionName("DeployAction")
* .service(service)
* // if your file is called imagedefinitions.json,
* // use the `input` property,
* // and leave out the `imageFile` property
* .input(buildOutput)
* // if your file name is _not_ imagedefinitions.json,
* // use the `imageFile` property,
* // and leave out the `input` property
* .imageFile(buildOutput.atPath("imageDef.json"))
* .deploymentTimeout(Duration.minutes(60))
* .build()))
* .build());
*
*
*
 * #### Deploying ECS applications to existing services
*
* CodePipeline can deploy to an existing ECS service which uses the
* ECS service ARN format that contains the Cluster name.
* This also works if the service is in a different account and/or region than the pipeline:
*
*
* import software.amazon.awscdk.services.ecs.*;
*
*
* IBaseService service = BaseService.fromServiceArnWithCluster(this, "EcsService", "arn:aws:ecs:us-east-1:123456789012:service/myClusterName/myServiceName");
* Pipeline pipeline = new Pipeline(this, "MyPipeline");
* Artifact buildOutput = new Artifact();
* // add source and build stages to the pipeline as usual...
* IStage deployStage = pipeline.addStage(StageOptions.builder()
* .stageName("Deploy")
* .actions(List.of(
* EcsDeployAction.Builder.create()
* .actionName("DeployAction")
* .service(service)
* .input(buildOutput)
* .build()))
* .build());
*
*
 * When deploying across accounts, especially in a CDK Pipelines self-mutating pipeline,
 * it is recommended to provide the role property to the EcsDeployAction.
 * The Role will need to have permissions assigned to it for ECS deployment.
 * See the CodePipeline documentation for the permissions needed.
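 * A minimal sketch of passing such a role, assuming it was created ahead of time in the target account (the ARN and role name below are hypothetical, and the role's attached policies must grant those documented ECS deployment permissions):

```java
// Import a pre-created role from the target account (hypothetical ARN);
// its policies must already include the required ECS deploy permissions.
IRole deployRole = Role.fromRoleArn(this, "EcsDeployRole",
        "arn:aws:iam::123456789012:role/MyEcsDeployRole");

EcsDeployAction deployAction = EcsDeployAction.Builder.create()
        .actionName("DeployAction")
        .service(service)
        .input(buildOutput)
        .role(deployRole) // explicit role instead of an auto-generated one
        .build();
```

 * Because a role is passed explicitly, the pipeline will assume this role for the action rather than creating a support role itself.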
*
*
 * #### Deploying ECS applications stored in a separate source code repository
*
* The idiomatic CDK way of deploying an ECS application is to have your Dockerfiles and your CDK code in the same source code repository,
* leveraging Docker Assets,
* and use the CDK Pipelines module.
*
 * However, if you want to deploy a Docker application whose source code is kept in a separate version control repository from the CDK code,
 * you can use the TagParameterContainerImage class from the ECS module.
* Here's an example:
*
*
* /**
* * These are the construction properties for {@link EcsAppStack}.
* * They extend the standard Stack properties,
* * but also require providing the ContainerImage that the service will use.
* * That Image will be provided from the Stack containing the CodePipeline.
* */
* public class EcsAppStackProps extends StackProps {
* private ContainerImage image;
* public ContainerImage getImage() {
* return this.image;
* }
* public EcsAppStackProps image(ContainerImage image) {
* this.image = image;
* return this;
* }
* }
*
* /**
* * This is the Stack containing a simple ECS Service that uses the provided ContainerImage.
* */
* public class EcsAppStack extends Stack {
* public EcsAppStack(Construct scope, String id, EcsAppStackProps props) {
* super(scope, id, props);
*
* TaskDefinition taskDefinition = TaskDefinition.Builder.create(this, "TaskDefinition")
* .compatibility(Compatibility.FARGATE)
* .cpu("1024")
* .memoryMiB("2048")
* .build();
* taskDefinition.addContainer("AppContainer", ContainerDefinitionOptions.builder()
* .image(props.getImage())
* .build());
* FargateService.Builder.create(this, "EcsService")
* .taskDefinition(taskDefinition)
* .cluster(Cluster.Builder.create(this, "Cluster")
* .vpc(Vpc.Builder.create(this, "Vpc")
* .maxAzs(1)
* .build())
* .build())
* .build();
* }
* }
*
* /**
* * This is the Stack containing the CodePipeline definition that deploys an ECS Service.
* */
* public class PipelineStack extends Stack {
* public final TagParameterContainerImage tagParameterContainerImage;
*
* public PipelineStack(Construct scope, String id) {
* this(scope, id, null);
* }
*
* public PipelineStack(Construct scope, String id, StackProps props) {
* super(scope, id, props);
*
* /* ********** ECS part **************** */
*
* // this is the ECR repository where the built Docker image will be pushed
* Repository appEcrRepo = new Repository(this, "EcsDeployRepository");
* // the build that creates the Docker image, and pushes it to the ECR repo
* PipelineProject appCodeDockerBuild = PipelineProject.Builder.create(this, "AppCodeDockerImageBuildAndPushProject")
* .environment(BuildEnvironment.builder()
* // we need to run Docker
* .privileged(true)
* .build())
* .buildSpec(BuildSpec.fromObject(Map.of(
* "version", "0.2",
* "phases", Map.of(
* "build", Map.of(
* "commands", List.of("$(aws ecr get-login --region $AWS_DEFAULT_REGION --no-include-email)", "docker build -t $REPOSITORY_URI:$CODEBUILD_RESOLVED_SOURCE_VERSION .")),
* "post_build", Map.of(
* "commands", List.of("docker push $REPOSITORY_URI:$CODEBUILD_RESOLVED_SOURCE_VERSION", "export imageTag=$CODEBUILD_RESOLVED_SOURCE_VERSION"))),
* "env", Map.of(
* // save the imageTag environment variable as a CodePipeline Variable
* "exported-variables", List.of("imageTag")))))
* .environmentVariables(Map.of(
* "REPOSITORY_URI", BuildEnvironmentVariable.builder()
* .value(appEcrRepo.getRepositoryUri())
* .build()))
* .build();
* // needed for `docker push`
* appEcrRepo.grantPullPush(appCodeDockerBuild);
* // create the ContainerImage used for the ECS application Stack
* this.tagParameterContainerImage = new TagParameterContainerImage(appEcrRepo);
*
* PipelineProject cdkCodeBuild = PipelineProject.Builder.create(this, "CdkCodeBuildProject")
* .buildSpec(BuildSpec.fromObject(Map.of(
* "version", "0.2",
* "phases", Map.of(
* "install", Map.of(
* "commands", List.of("npm install")),
* "build", Map.of(
* "commands", List.of("npx cdk synth --verbose"))),
* "artifacts", Map.of(
* // store the entire Cloud Assembly as the output artifact
* "base-directory", "cdk.out",
* "files", "**/*"))))
* .build();
*
* /* ********** Pipeline part **************** */
*
* Artifact appCodeSourceOutput = new Artifact();
* Artifact cdkCodeSourceOutput = new Artifact();
* Artifact cdkCodeBuildOutput = new Artifact();
* CodeBuildAction appCodeBuildAction = CodeBuildAction.Builder.create()
* .actionName("AppCodeDockerImageBuildAndPush")
* .project(appCodeDockerBuild)
* .input(appCodeSourceOutput)
* .build();
* Pipeline.Builder.create(this, "CodePipelineDeployingEcsApplication")
* .artifactBucket(Bucket.Builder.create(this, "ArtifactBucket")
* .removalPolicy(RemovalPolicy.DESTROY)
* .build())
* .stages(List.of(StageProps.builder()
* .stageName("Source")
* .actions(List.of(
* // this is the Action that takes the source of your application code
* CodeCommitSourceAction.Builder.create()
* .actionName("AppCodeSource")
* .repository(Repository.Builder.create(this, "AppCodeSourceRepository").repositoryName("AppCodeSourceRepository").build())
* .output(appCodeSourceOutput)
* .build(),
* // this is the Action that takes the source of your CDK code
* // (which would probably include this Pipeline code as well)
* CodeCommitSourceAction.Builder.create()
* .actionName("CdkCodeSource")
* .repository(Repository.Builder.create(this, "CdkCodeSourceRepository").repositoryName("CdkCodeSourceRepository").build())
* .output(cdkCodeSourceOutput)
* .build()))
* .build(), StageProps.builder()
* .stageName("Build")
* .actions(List.of(appCodeBuildAction,
* CodeBuildAction.Builder.create()
* .actionName("CdkCodeBuildAndSynth")
* .project(cdkCodeBuild)
* .input(cdkCodeSourceOutput)
* .outputs(List.of(cdkCodeBuildOutput))
* .build()))
* .build(), StageProps.builder()
* .stageName("Deploy")
* .actions(List.of(
* CloudFormationCreateUpdateStackAction.Builder.create()
* .actionName("CFN_Deploy")
* .stackName("SampleEcsStackDeployedFromCodePipeline")
* // this name has to be the same name as used below in the CDK code for the application Stack
* .templatePath(cdkCodeBuildOutput.atPath("EcsStackDeployedInPipeline.template.json"))
* .adminPermissions(true)
* .parameterOverrides(Map.of(
* // read the tag pushed to the ECR repository from the CodePipeline Variable saved by the application build step,
* // and pass it as the CloudFormation Parameter for the tag
* this.tagParameterContainerImage.getTagParameterName(), appCodeBuildAction.variable("imageTag")))
* .build()))
* .build()))
* .build();
* }
* }
*
* App app = new App();
*
* // the CodePipeline Stack needs to be created first
* PipelineStack pipelineStack = new PipelineStack(app, "aws-cdk-pipeline-ecs-separate-sources");
* // we supply the image to the ECS application Stack from the CodePipeline Stack
* new EcsAppStack(app, "EcsStackDeployedInPipeline", new EcsAppStackProps()
* .image(pipelineStack.getTagParameterContainerImage())
* );
*
*
*
AWS S3 Deployment
*
* To use an S3 Bucket as a deployment target in CodePipeline:
*
*
* Artifact sourceOutput = new Artifact();
* Bucket targetBucket = new Bucket(this, "MyBucket");
*
* Pipeline pipeline = new Pipeline(this, "MyPipeline");
* S3DeployAction deployAction = S3DeployAction.Builder.create()
* .actionName("S3Deploy")
* .bucket(targetBucket)
* .input(sourceOutput)
* .build();
* IStage deployStage = pipeline.addStage(StageOptions.builder()
* .stageName("Deploy")
* .actions(List.of(deployAction))
* .build());
*
*
*
Invalidating the CloudFront cache when deploying to S3
*
* There is currently no native support in CodePipeline for invalidating a CloudFront cache after deployment.
* One workaround is to add another build step after the deploy step,
* and use the AWS CLI to invalidate the cache:
*
*
* // Create a Cloudfront Web Distribution
* import software.amazon.awscdk.services.cloudfront.*;
* Distribution distribution;
*
*
* // Create the build project that will invalidate the cache
* PipelineProject invalidateBuildProject = PipelineProject.Builder.create(this, "InvalidateProject")
* .buildSpec(BuildSpec.fromObject(Map.of(
* "version", "0.2",
* "phases", Map.of(
* "build", Map.of(
* "commands", List.of("aws cloudfront create-invalidation --distribution-id ${CLOUDFRONT_ID} --paths \"/*\""))))))
* .environmentVariables(Map.of(
* "CLOUDFRONT_ID", BuildEnvironmentVariable.builder().value(distribution.getDistributionId()).build()))
* .build();
*
* // Add Cloudfront invalidation permissions to the project
* String distributionArn = String.format("arn:aws:cloudfront::%s:distribution/%s", this.account, distribution.getDistributionId());
* invalidateBuildProject.addToRolePolicy(PolicyStatement.Builder.create()
* .resources(List.of(distributionArn))
* .actions(List.of("cloudfront:CreateInvalidation"))
* .build());
*
* // Create the pipeline (here only the S3 deploy and Invalidate cache build)
* Bucket deployBucket = new Bucket(this, "DeployBucket");
* Artifact deployInput = new Artifact();
* Pipeline.Builder.create(this, "Pipeline")
* .stages(List.of(StageProps.builder()
* .stageName("Deploy")
* .actions(List.of(
* S3DeployAction.Builder.create()
* .actionName("S3Deploy")
* .bucket(deployBucket)
* .input(deployInput)
* .runOrder(1)
* .build(),
* CodeBuildAction.Builder.create()
* .actionName("InvalidateCache")
* .project(invalidateBuildProject)
* .input(deployInput)
* .runOrder(2)
* .build()))
* .build()))
* .build();
*
*
*
Alexa Skill
*
* You can deploy to Alexa using CodePipeline with the following Action:
*
*
* // Read the secrets from Secrets Manager
* SecretValue clientId = SecretValue.secretsManager("AlexaClientId");
* SecretValue clientSecret = SecretValue.secretsManager("AlexaClientSecret");
* SecretValue refreshToken = SecretValue.secretsManager("AlexaRefreshToken");
*
* // Add deploy action
* Artifact sourceOutput = new Artifact();
* AlexaSkillDeployAction.Builder.create()
* .actionName("DeploySkill")
* .runOrder(1)
* .input(sourceOutput)
* .clientId(clientId.toString())
* .clientSecret(clientSecret)
* .refreshToken(refreshToken)
* .skillId("amzn1.ask.skill.12345678-1234-1234-1234-123456789012")
* .build();
*
*
* If you need manifest overrides, you can specify them as parameterOverridesArtifact in the action:
*
*
* // Deploy some CFN change set and store output
* Artifact executeOutput = new Artifact("CloudFormation");
* CloudFormationExecuteChangeSetAction executeChangeSetAction = CloudFormationExecuteChangeSetAction.Builder.create()
* .actionName("ExecuteChangesTest")
* .runOrder(2)
* .stackName("MyStack")
* .changeSetName("MyChangeSet")
* .outputFileName("overrides.json")
* .output(executeOutput)
* .build();
*
* // Provide CFN output as manifest overrides
* SecretValue clientId = SecretValue.secretsManager("AlexaClientId");
* SecretValue clientSecret = SecretValue.secretsManager("AlexaClientSecret");
* SecretValue refreshToken = SecretValue.secretsManager("AlexaRefreshToken");
* Artifact sourceOutput = new Artifact();
* AlexaSkillDeployAction.Builder.create()
* .actionName("DeploySkill")
* .runOrder(1)
* .input(sourceOutput)
* .parameterOverridesArtifact(executeOutput)
* .clientId(clientId.toString())
* .clientSecret(clientSecret)
* .refreshToken(refreshToken)
* .skillId("amzn1.ask.skill.12345678-1234-1234-1234-123456789012")
* .build();
*
*
*
AWS Service Catalog
*
* You can deploy a CloudFormation template to an existing Service Catalog product with the following Action:
*
*
* Artifact cdkBuildOutput = new Artifact();
* ServiceCatalogDeployActionBeta1 serviceCatalogDeployAction = ServiceCatalogDeployActionBeta1.Builder.create()
* .actionName("ServiceCatalogDeploy")
* .templatePath(cdkBuildOutput.atPath("Sample.template.json"))
* .productVersionName("Version - " + java.time.Instant.now())
* .productVersionDescription("This is a version from the pipeline with a new description.")
* .productId("prod-XXXXXXXX")
* .build();
*
*
*
Approve & invoke
*
*
Manual approval Action
*
* This package contains an Action that stops the Pipeline until someone manually clicks the approve button:
*
*
* import software.amazon.awscdk.services.sns.*;
*
*
* Pipeline pipeline = new Pipeline(this, "MyPipeline");
* IStage approveStage = pipeline.addStage(StageOptions.builder().stageName("Approve").build());
* ManualApprovalAction manualApprovalAction = ManualApprovalAction.Builder.create()
* .actionName("Approve")
* .notificationTopic(new Topic(this, "Topic")) // optional
* .notifyEmails(List.of("some_email@example.com")) // optional
* .additionalInformation("additional info")
* .build();
* approveStage.addAction(manualApprovalAction);
*
*
* If the notificationTopic has not been provided, but notifyEmails were,
* a new SNS Topic will be created
* (and is accessible through the notificationTopic property of the Action).
*
* If you want to grant a principal permissions to approve the changes,
* you can invoke the method grantManualApproval, passing it an IGrantable:
*
*
* Pipeline pipeline = new Pipeline(this, "MyPipeline");
* IStage approveStage = pipeline.addStage(StageOptions.builder().stageName("Approve").build());
* ManualApprovalAction manualApprovalAction = ManualApprovalAction.Builder.create()
* .actionName("Approve")
* .build();
* approveStage.addAction(manualApprovalAction);
*
* IRole role = Role.fromRoleArn(this, "Admin", Arn.format(ArnComponents.builder().service("iam").resource("role").resourceName("Admin").build(), this));
* manualApprovalAction.grantManualApproval(role);
*
*
*
AWS Lambda
*
* This module contains an Action that allows you to invoke a Lambda function in a Pipeline:
*
*
* Function fn;
*
* Pipeline pipeline = new Pipeline(this, "MyPipeline");
* LambdaInvokeAction lambdaAction = LambdaInvokeAction.Builder.create()
* .actionName("Lambda")
* .lambda(fn)
* .build();
* pipeline.addStage(StageOptions.builder()
* .stageName("Lambda")
* .actions(List.of(lambdaAction))
* .build());
*
*
* The Lambda Action can have up to 5 inputs,
* and up to 5 outputs:
*
*
* Function fn;
*
* Artifact sourceOutput = new Artifact();
* Artifact buildOutput = new Artifact();
* LambdaInvokeAction lambdaAction = LambdaInvokeAction.Builder.create()
* .actionName("Lambda")
* .inputs(List.of(sourceOutput, buildOutput))
* .outputs(List.of(
* new Artifact("Out1"),
* new Artifact("Out2")))
* .lambda(fn)
* .build();
*
*
* The Lambda Action supports custom user parameters that the pipeline
* will pass to the Lambda function:
*
*
* Function fn;
*
*
* Pipeline pipeline = new Pipeline(this, "MyPipeline");
* LambdaInvokeAction lambdaAction = LambdaInvokeAction.Builder.create()
* .actionName("Lambda")
* .lambda(fn)
* .userParameters(Map.of(
* "foo", "bar",
* "baz", "qux"))
* // OR
* .userParametersString("my-parameter-string")
* .build();
*
*
* The Lambda invoke action emits variables.
* Unlike with many other actions, the variables are not static,
* but dynamic: they are defined by the function calling the PutJobSuccessResult
* API with the outputVariables property filled with a map of variables.
* Example:
*
*
* PipelineProject project;
* LambdaInvokeAction lambdaInvokeAction = LambdaInvokeAction.Builder.create()
* .actionName("Lambda")
* .lambda(Function.Builder.create(this, "Func")
* .runtime(Runtime.NODEJS_14_X)
* .handler("index.handler")
* .code(Code.fromInline("\n const AWS = require('aws-sdk');\n\n exports.handler = async function(event, context) {\n const codepipeline = new AWS.CodePipeline();\n await codepipeline.putJobSuccessResult({\n jobId: event['CodePipeline.job'].id,\n outputVariables: {\n MY_VAR: \"some value\",\n },\n }).promise();\n }\n "))
* .build())
* .variablesNamespace("MyNamespace")
* .build();
* Artifact sourceOutput = new Artifact();
* CodeBuildAction.Builder.create()
* .actionName("CodeBuild")
* .project(project)
* .input(sourceOutput)
* .environmentVariables(Map.of(
* "MyVar", BuildEnvironmentVariable.builder()
* .value(lambdaInvokeAction.variable("MY_VAR"))
* .build()))
* .build();
*
*
* See the AWS documentation
* on how to write a Lambda function invoked from CodePipeline.
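* The inline function above is JavaScript; a comparable handler written in
* Java might look like the following sketch. It assumes aws-lambda-java-core
* and the AWS SDK for Java v2 codepipeline module are on the classpath, and
* the class name is illustrative:

```java
import java.util.Map;
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import software.amazon.awssdk.services.codepipeline.CodePipelineClient;
import software.amazon.awssdk.services.codepipeline.model.PutJobSuccessResultRequest;

public class PipelineInvokedHandler implements RequestHandler<Map<String, Object>, Void> {
    private final CodePipelineClient codepipeline = CodePipelineClient.create();

    @Override
    @SuppressWarnings("unchecked")
    public Void handleRequest(Map<String, Object> event, Context context) {
        // CodePipeline passes the job under the "CodePipeline.job" key;
        // the job ID must be reported back, or the action hangs until it times out
        Map<String, Object> job = (Map<String, Object>) event.get("CodePipeline.job");
        String jobId = (String) job.get("id");

        codepipeline.putJobSuccessResult(PutJobSuccessResultRequest.builder()
                .jobId(jobId)
                // these become the action's variables, readable by later actions
                .outputVariables(Map.of("MY_VAR", "some value"))
                .build());
        return null;
    }
}
```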
*
*
AWS Step Functions
*
* This module contains an Action that allows you to invoke a Step Function in a Pipeline:
*
*
* import software.amazon.awscdk.services.stepfunctions.*;
*
* Pipeline pipeline = new Pipeline(this, "MyPipeline");
* Pass startState = new Pass(this, "StartState");
* StateMachine simpleStateMachine = StateMachine.Builder.create(this, "SimpleStateMachine")
* .definition(startState)
* .build();
* StepFunctionInvokeAction stepFunctionAction = StepFunctionInvokeAction.Builder.create()
* .actionName("Invoke")
* .stateMachine(simpleStateMachine)
* .stateMachineInput(StateMachineInput.literal(Map.of("IsHelloWorldExample", true)))
* .build();
* pipeline.addStage(StageOptions.builder()
* .stageName("StepFunctions")
* .actions(List.of(stepFunctionAction))
* .build());
*
*
* The StateMachineInput can be created with one of two static factory methods:
* literal, which takes an arbitrary map as its only argument, or filePath:
*
*
* import software.amazon.awscdk.services.stepfunctions.*;
*
*
* Pipeline pipeline = new Pipeline(this, "MyPipeline");
* Artifact inputArtifact = new Artifact();
* Pass startState = new Pass(this, "StartState");
* StateMachine simpleStateMachine = StateMachine.Builder.create(this, "SimpleStateMachine")
* .definition(startState)
* .build();
* StepFunctionInvokeAction stepFunctionAction = StepFunctionInvokeAction.Builder.create()
* .actionName("Invoke")
* .stateMachine(simpleStateMachine)
* .stateMachineInput(StateMachineInput.filePath(inputArtifact.atPath("assets/input.json")))
* .build();
* pipeline.addStage(StageOptions.builder()
* .stageName("StepFunctions")
* .actions(List.of(stepFunctionAction))
* .build());
*
*
* See the AWS documentation
* for the Action structure reference.
*/
@software.amazon.jsii.Stability(software.amazon.jsii.Stability.Level.Stable)
package software.amazon.awscdk.services.codepipeline.actions;