This tutorial walks you through the procedure to create and configure the pipeline step for executing a Terraform script.

The representation below shows the overall workflow of a Terraform script on the Opsera portal.

...

The steps below walk through the configuration of a Terraform Script pipeline:

  1. On a pipeline, add a step with the ‘Terraform’ tool type

...

Scenario-1: Static AWS credentials in the Terraform script [GITLAB], as shown below

In the example below, the Terraform script is available at ‘python-example > Terraform > Terraform > subnet.tf’.

...
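For orientation, a minimal sketch of what a subnet.tf like this might contain is shown below. The variable name vpc_id, the data source, and the output name are assumptions for illustration, not the exact contents of the repository script.

# Sketch (assumed layout): fetch all subnet IDs for a given VPC.
variable "vpc_id" {
  description = "VPC whose subnet IDs should be fetched (passed as a Terraform script argument)"
  type        = string
}

# Look up every subnet attached to the supplied VPC.
data "aws_subnets" "selected" {
  filter {
    name   = "vpc-id"
    values = [var.vpc_id]
  }
}

# Expose the subnet IDs so they appear in the Terraform output.
output "subnet_ids" {
  value = data.aws_subnets.selected.ids
}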

4. On the pipeline’s Terraform script step, click the ‘Configuration’ icon to enter the step-level configuration that invokes the Terraform script run from the GITLAB repository.

...

5. In the configuration window, select/enter the details in all the fields

...


...

Note: To configure a GitHub or GitLab account for use in a pipeline, go to Tool Registry in the left nav of the portal, click New Tool, choose GITHUB or GITLAB from the drop-down, and add your credentials. If you have 2FA enabled, select the checkbox in the configuration tab.

Option 1: Execute Terraform using IAM roles. Use the toggle button to choose “IAM Roles”.

...

Note: Terraform-specific IAM roles are pulled into the pipeline stage from the Tool Registry. You need to provide a secret key and access key that have access to the specific IAM roles.
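For illustration, one common way a Terraform script consumes such a role is through an assume_role block on the AWS provider. The region and role ARN below are hypothetical placeholders, not values from this tutorial.

# Sketch only: the provider assumes an IAM role instead of using
# long-lived credentials. The region and role ARN are placeholders.
provider "aws" {
  region = "us-east-1"

  assume_role {
    role_arn = "arn:aws:iam::123456789012:role/terraform-execution-role"
  }
}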

Option 2: Users can also use the secret key and access key of an AWS account to execute the Terraform script if IAM roles are not defined.

Provide the secret key and access key names from your Terraform script in the input fields, as shown in the image below.

...
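As a rough sketch, the script side of this setup typically declares the key names as variables that the pipeline fills in at run time. The names access_key and secret_key below are assumptions; they must match whatever your script actually declares.

# Sketch: variable names declared in the script; the pipeline's input
# fields supply the values at run time. The names are illustrative.
variable "access_key" {
  type      = string
  sensitive = true
}

variable "secret_key" {
  type      = string
  sensitive = true
}

provider "aws" {
  region     = "us-east-1"
  access_key = var.access_key
  secret_key = var.secret_key
}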


6. Save the state file in your own S3 bucket: choose the S3 bucket name and region, and click Save (see the backend sketch after the sub-steps below)

...

6.1 In the Tool Registry, choose AWS as the new tool type, provide the required details, and click S3 Buckets

...

6.2 Under S3 Buckets, click New AWS S3 Bucket and add the S3 bucket that will store your state file during Terraform execution

...
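For reference, a typical S3 backend block that makes Terraform write its state file to the registered bucket looks like the sketch below. The bucket name, key path, and region are placeholders, not values from this tutorial.

# Sketch: store Terraform state in the S3 bucket registered above.
# Bucket name, key path, and region are hypothetical placeholders.
terraform {
  backend "s3" {
    bucket = "my-terraform-state-bucket"
    key    = "pipelines/terraform/terraform.tfstate"
    region = "us-east-1"
  }
}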

7. Once all the configurations are set, click the ‘Save’ button to save the pipeline step

...

8. Click ‘Start Pipeline’ to start the execution of the pipeline

...

9. The pipeline will start and show a blue bar with a predefined message as an indication

...

10. Once the script has executed successfully, a green tick will appear on the step

...

11. Script success can also be tracked in the Summary tab > Activity Logs

...

12. The output of the script is given below, where the Terraform script fetches the subnet IDs of the VPC passed as the ‘Terraform script argument’.

...
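For orientation only, a successful run of a script shaped like the earlier sketch would end with Terraform output resembling the following; the subnet IDs shown are hypothetical placeholders, not real results from this pipeline.

# Illustrative only - the IDs below are placeholders.
Outputs:

subnet_ids = [
  "subnet-0aaaa1111bbbb2222c",
  "subnet-0dddd3333eeee4444f",
]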

Scenario-2: Dynamic AWS credentials in the Terraform script [GITHUB], as shown below

In the example below, the Terraform script is available at ‘python-example > Terraform > Terraform > subnet.tf’.

...
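As a rough sketch of the “dynamic credentials” variant, the provider block typically carries no hardcoded keys, so credentials are resolved at run time rather than committed to the repository. The region below is a placeholder, and the layout is an assumption for illustration.

provider "aws" {
  region = "us-east-1"
  # No access_key / secret_key here: the provider picks up credentials
  # dynamically at run time (environment variables such as
  # AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY, shared config, or an
  # attached instance profile). The rest of the script can match the
  # Scenario-1 sketch.
}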

13. On the pipeline’s Terraform script step, click the ‘Configuration’ icon to enter the step-level configuration that invokes the Terraform script run from the GITHUB repository

...

14. In the configuration window, select/enter the details in all the fields

...

15. Once all the configurations are set, click the ‘Save’ button to save the pipeline step

...

16. Click ‘Start Pipeline’ to start the execution of the pipeline

...

17. The pipeline will start and show a blue bar with a predefined message as an indication

...

18. Once the script has executed successfully, a green tick will appear on the step

...

19. Script success can also be tracked in the Summary tab > Activity Logs

...

20. The output of the script is given below, where the Terraform script fetches the subnet IDs of the VPC passed as the ‘Terraform script argument’.

...