Terraform Script - Workflow
This tutorial walks you through creating and configuring the pipeline steps required to execute a Terraform script.
The diagram below shows the overall workflow of a Terraform script on the Opsera portal.
The following steps walk through the configuration of a Terraform pipeline:
1. On a pipeline, add a step with the ‘Terraform’ tool type.
2. Once the step tool and name are entered, click the ‘Save’ button.
3. Identify the Terraform script available in the Git repository.
Scenario 1: Static AWS credentials in the Terraform script [GitLab], as shown below
In the example below, the Terraform script is available at ‘python-example > Terraform > Terraform > subnet.tf’.
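For illustration, a script with static AWS credentials has the keys hard-coded directly in the provider block. A minimal sketch (the region and key values here are placeholders, not real credentials):

```hcl
# Illustrative provider block with static (hard-coded) AWS credentials.
# All values below are placeholders for demonstration only.
provider "aws" {
  region     = "us-east-1"
  access_key = "AKIAxxxxxxxxxxxxxxxx"
  secret_key = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
}
```

Committing real credentials to a repository is not recommended; the dynamic-credentials approach in Scenario 2 avoids this.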
4. On the pipeline's Terraform script step, click the ‘Configuration’ icon to enter the step-level configuration that invokes the Terraform script run from the GitLab repository.
5. In the configuration window, select or enter details in all the fields.
Note: To configure a GitHub or GitLab account for use in a pipeline, go to Tool Registry in the left nav of the portal, click ‘New Tool’, choose GitHub or GitLab from the drop-down, and add your credentials. If you have 2FA enabled, select the check box in the configuration tab.
Option 1: Execute Terraform using IAM roles. Use the toggle button to choose ‘IAM Roles’.
Note: Terraform-specific IAM roles are pulled into the pipeline stage from the Tool Registry. You need to provide a secret key and access key that have access to the specific IAM roles.
Option 2: If IAM roles are not defined, you can instead use the secret key and access key of the AWS account to execute Terraform.
Provide the secret key and access key names from your Terraform script in the input fields, as shown in the image below.
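If the script declares its credentials as input variables, the names you enter in these fields should match the variable names in the script. A hypothetical sketch (variable names assumed for illustration):

```hcl
# Hypothetical variable declarations; the pipeline supplies the values
# at run time, so no credentials are committed to the repository.
variable "access_key" {
  type      = string
  sensitive = true
}

variable "secret_key" {
  type      = string
  sensitive = true
}

provider "aws" {
  region     = "us-east-1"  # example region
  access_key = var.access_key
  secret_key = var.secret_key
}
```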
6. Save the state file in your own S3 bucket: choose the S3 bucket name and region, and click ‘Save’.
6.1 In the Tool Registry, choose AWS as the new tool type, provide the required details, and click ‘S3 Buckets’.
6.2 Under S3 Buckets, click ‘New AWS S3 Bucket’ and add the S3 bucket that will store your state file during Terraform execution.
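Storing state in S3 corresponds to Terraform's S3 backend. A minimal sketch of the equivalent configuration (bucket name, key, and region are example values):

```hcl
# Illustrative S3 backend configuration for remote state storage.
terraform {
  backend "s3" {
    bucket = "my-terraform-state"          # example bucket name
    key    = "pipelines/subnet.tfstate"    # example state file path
    region = "us-east-1"                   # example region
  }
}
```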
7. Once all the configurations are set, click the ‘Save’ button to save the pipeline step.
8. Click ‘Start Pipeline’ to start the execution of the pipeline.
9. The pipeline will start and display a blue bar with a predefined message as an indication.
10. Once the script has executed successfully, a green tick appears on the step.
11. The script's success can also be tracked under the Summary tab > Activity Logs.
12. The output of the script is shown below; the Terraform script fetches the subnet IDs of the VPC passed as the ‘Terraform script argument’.
Output: The Terraform script initialized and 3 subnet IDs were fetched.
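A script that fetches the subnet IDs of a VPC passed in as an argument might look like the following sketch, using the AWS provider's aws_subnets data source and a vpc_id variable (names assumed for illustration):

```hcl
# Hypothetical sketch: fetch all subnet IDs for the VPC whose ID is
# passed as a script argument, e.g. -var="vpc_id=vpc-0abc123".
variable "vpc_id" {
  type = string
}

data "aws_subnets" "selected" {
  filter {
    name   = "vpc-id"
    values = [var.vpc_id]
  }
}

output "subnet_ids" {
  value = data.aws_subnets.selected.ids
}
```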
Scenario 2: Dynamic AWS credentials in the Terraform script [GitHub], as shown below
In the example below, the Terraform script is available at ‘python-example > Terraform > Terraform > subnet.tf’.
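With dynamic credentials, the provider block omits the keys entirely; Terraform resolves credentials at run time from environment variables (AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY) or an assumed IAM role. A sketch (the region and role ARN are example values):

```hcl
# No credentials hard-coded: Terraform picks them up at run time from
# environment variables or the execution environment's IAM role.
provider "aws" {
  region = "us-east-1"  # example region

  # Optionally assume a role instead of using static keys:
  assume_role {
    role_arn = "arn:aws:iam::123456789012:role/terraform-pipeline"  # example ARN
  }
}
```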
13. On the pipeline's Terraform script step, click the ‘Configuration’ icon to enter the step-level configuration that invokes the Terraform script run from the GitHub repository.
14. In the configuration window, select or enter details in all the fields.
15. Once all the configurations are set, click the ‘Save’ button to save the pipeline step.
16. Click ‘Start Pipeline’ to start the execution of the pipeline.
17. The pipeline will start and display a blue bar with a predefined message as an indication.
18. Once the script has executed successfully, a green tick appears on the step.
19. The script's success can also be tracked under the Summary tab > Activity Logs.
20. The output of the script is shown below; the Terraform script fetches the subnet IDs of the VPC passed as the ‘Terraform script argument’.
Output: The Terraform script initialized and 3 subnet IDs were fetched.
Sample Terraform scripts from GitLab and GitHub and their workflows have been demonstrated above; any other Terraform script can be invoked in the same way.