...
Provide the secret key and the access key name from your Terraform script in the input fields, as shown in the image below.
...
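For reference, credentials like these usually reach the script as input variables. The sketch below only illustrates that pattern, assuming the variable names aws_access_key and aws_secret_key and the us-east-1 region; it is not the contents of the example script.

```hcl
# Illustrative only: the variable names and region here are assumptions.
variable "aws_access_key" {
  type      = string
  sensitive = true
}

variable "aws_secret_key" {
  type      = string
  sensitive = true
}

provider "aws" {
  region     = "us-east-1"            # assumed region
  access_key = var.aws_access_key     # value supplied through the pipeline input field
  secret_key = var.aws_secret_key     # value supplied through the pipeline input field
}
```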
6. Save the state file in your own S3 bucket - choose the S3 bucket name and region, and click Save
...
6.1 Click Tool Registry and choose AWS as the new Tool Type, provide the required details, and click S3 Buckets
...
6.2 Under S3 Buckets, click New AWS S3 Bucket and add the S3 bucket that will store your state file in S3 during Terraform execution
...
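For context, storing the state file in S3 corresponds to Terraform's S3 backend. If you were wiring this up directly in Terraform instead of through the Tool Registry, the configuration would look roughly like the sketch below; the bucket name, key, and region are placeholders, not values from this guide.

```hcl
# Rough equivalent of the S3 state storage configured above;
# bucket, key, and region are placeholders.
terraform {
  backend "s3" {
    bucket = "my-terraform-state"
    key    = "pipelines/terraform.tfstate"
    region = "us-east-1"
  }
}
```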
7. Once all the configurations are set, click the ‘Save’ button to save the pipeline step
...
8. Click ‘Start Pipeline’ to start the execution of the pipeline
...
9. The pipeline will start and show a blue bar with a predefined message as an indication
...
10. Once the script is executed successfully, a green tick will appear on the step
...
11. The script's success can also be tracked in the Summary tab > Activity logs
...
12. The output of the script is given below, where the Terraform script fetches the Subnet IDs of the VPC given as the ‘Terraform script argument’.
...
In the example below, the Terraform script is available at ‘python-example > Terraform > Terraform > subnet.tf’
...
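As a rough sketch of what such a script could contain (the variable name vpc_id and the exact data source usage below are assumptions, not the actual contents of subnet.tf), the VPC ID passed as the script argument can be used to look up the subnet IDs and expose them as an output:

```hcl
# Hypothetical sketch of subnet.tf: fetch the IDs of all subnets in the
# VPC whose ID is passed in as the 'Terraform script argument'.
variable "vpc_id" {
  type        = string
  description = "ID of the VPC whose subnet IDs should be fetched"
}

data "aws_subnets" "selected" {
  filter {
    name   = "vpc-id"
    values = [var.vpc_id]
  }
}

output "subnet_ids" {
  value = data.aws_subnets.selected.ids
}
```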
13. On the pipeline's Terraform script step, click the ‘Configuration’ icon to enter the step-level configuration for invoking a Terraform script run from a GitHub repository
...
14. In the configuration window, select or enter the details in all the fields
...
15. Once all the configurations are set, click the ‘Save’ button to save the pipeline step
...
16. Click ‘Start Pipeline’ to start the execution of the pipeline
...
17. The pipeline will start and show a blue bar with a predefined message as an indication
...
18. Once the script is executed successfully, a green tick will appear on the step
...
19. The script's success can also be tracked in the Summary tab > Activity logs
...
20. The output of the script is given below, where the Terraform script fetches the Subnet IDs of the VPC given as the ‘Terraform script argument’.
...