Project Name | Stars | Downloads | Repos Using This | Most Recent Commit | Total Releases | Latest Release | Open Issues | License | Language | Description
---|---|---|---|---|---|---|---|---|---|---
Terraform | 39,569 | 469 | 1,307 | 21 hours ago | 665 | November 30, 2023 | 1,910 | other | Go | Terraform enables you to safely and predictably create, change, and improve infrastructure. It is a source-available tool that codifies APIs into declarative configuration files that can be shared among team members, treated as code, edited, reviewed, and versioned.
Terraformer | 11,097 | 1 | | 4 days ago | 20 | November 15, 2021 | 61 | apache-2.0 | Go | CLI tool to generate Terraform files from existing infrastructure (reverse Terraform). Infrastructure to Code.
Infracost | 9,757 | | | a day ago | 148 | October 27, 2023 | 169 | apache-2.0 | Go | Cloud cost estimates for Terraform in pull requests. Love your cloud bill!
Tfsec | 6,373 | 18 | | 3 days ago | 411 | September 11, 2023 | 15 | mit | Go | Security scanner for your Terraform code.
Checkov | 6,151 | 6 | | a day ago | 3,174 | December 05, 2023 | 115 | apache-2.0 | Python | Prevent cloud misconfigurations and find vulnerabilities during build time in infrastructure as code, container images and open-source packages with Checkov by Bridgecrew.
Awesome Terraform | 4,835 | | | a month ago | | | 1 | cc0-1.0 | | Curated list of resources on HashiCorp's Terraform.
Terraform CDK | 4,601 | 211 | | 21 hours ago | 833 | December 04, 2023 | 293 | mpl-2.0 | TypeScript | Define infrastructure resources using programming constructs and provision them using HashiCorp Terraform.
Terrascan | 4,304 | 1 | | 5 days ago | 67 | November 30, 2023 | 232 | apache-2.0 | Go | Detect compliance and security violations across Infrastructure as Code to mitigate risk before provisioning cloud-native infrastructure.
Driftctl | 2,342 | | | a month ago | 188 | October 25, 2023 | 142 | apache-2.0 | Go | Detect, track and alert on infrastructure drift.
Digger | 2,247 | | | 21 hours ago | 8 | December 01, 2023 | 146 | apache-2.0 | Go | Digger is an open-source IaC orchestration tool that lets you run IaC in your existing CI pipeline.
A CLI tool that generates `tf`/`json` and `tfstate` files based on existing infrastructure (reverse Terraform).

Capabilities:

- Generate `tf`/`json` + `tfstate` files from existing infrastructure for all supported objects by resource.
- Connect between resources with `terraform_remote_state` (local and bucket).
- Save `tf`/`json` files using a custom folder tree pattern.

Terraformer uses Terraform providers and is designed to easily support newly added resources. To upgrade resources with new fields, all you need to do is upgrade the relevant Terraform providers.
```
Import current state to Terraform configuration from a provider

Usage:
   import [provider] [flags]
   import [provider] [command]

Available Commands:
  list        List supported resources for a provider

Flags:
  -b, --bucket string         gs://terraform-state
  -c, --connect               (default true)
  -C, --compact               (default false)
  -x, --excludes strings      firewalls,networks
  -f, --filter strings        compute_firewall=id1:id2:id4
  -h, --help                  help for google
  -O, --output string         output format hcl or json (default "hcl")
  -o, --path-output string    (default "generated")
  -p, --path-pattern string   {output}/{provider}/ (default "{output}/{provider}/{service}/")
      --projects strings
  -z, --regions strings       europe-west1, (default [global])
  -r, --resources strings     firewall,networks or * for all services
  -s, --state string          local or bucket (default "local")
  -v, --verbose               verbose mode
  -n, --retry-number          number of retries to perform if refresh fails
  -m, --retry-sleep-ms        time in ms to sleep between retries

Use " import [provider] [command] --help" for more information about a command.
```
The tool requires read-only permissions to list service resources.

You can use the `--resources` parameter to specify which services to import resources from. To import resources from all services, use `--resources="*"`. To exclude certain services, combine it with the `--excludes` parameter, e.g. `--resources="*" --excludes="iam"`.
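Taken together, `--resources` and `--excludes` amount to simple set logic: start from all supported services (or just the ones listed), then subtract the excluded ones. A minimal Python sketch of that selection, using a made-up service list for illustration (this is not Terraformer's actual code):

```python
def select_services(supported, resources, excludes=()):
    """Mimic the --resources/--excludes selection: '*' means every
    supported service; excluded services are then removed."""
    chosen = set(supported) if resources == "*" else set(resources.split(","))
    return sorted(chosen - set(excludes))

# Hypothetical list of supported services, for illustration only.
supported = ["iam", "s3", "sg", "vpc"]
print(select_services(supported, "*", excludes=["iam"]))  # ['s3', 'sg', 'vpc']
print(select_services(supported, "s3,vpc"))               # ['s3', 'vpc']
```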
Filters are a way to choose which resources `terraformer` imports. It's possible to filter resources by their identifiers or attributes. Multiple filtering values are separated by `:`. If an identifier itself contains this symbol, the value should be wrapped in `'`, e.g. `--filter=resource=id1:'project:dataset_id'`. Identifier-based filters are applied before Terraformer tries to refresh the remote state.

Use `Type` when you need to filter only one of several types of resources. Multiple filters can be combined when importing different resource types. An example would be importing all AWS security groups from a specific AWS VPC:

```
terraformer import aws -r sg,vpc --filter Type=sg;Name=vpc_id;Value=VPC_ID --filter Type=vpc;Name=id;Value=VPC_ID
```

Notice how the `Name` is different for `sg` than it is for `vpc`.
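The `Type=...;Name=...;Value=...` grammar above can be sketched as a tiny parser. This is an illustrative reimplementation of the syntax as described here, not Terraformer's actual code:

```python
import re

def parse_filter(spec):
    """Split a --filter spec like 'Type=sg;Name=vpc_id;Value=id1:id2'
    into a dict. Value is split on ':' into individual filtering
    values; single quotes protect identifiers that contain ':'."""
    parts = dict(field.split("=", 1) for field in spec.split(";"))
    if "Value" in parts:
        # Match either a single-quoted chunk or a run without ':'.
        tokens = re.findall(r"'[^']*'|[^:]+", parts["Value"])
        parts["Value"] = [t.strip("'") for t in tokens]
    return parts

print(parse_filter("Type=sg;Name=vpc_id;Value=VPC_ID"))
print(parse_filter("Value=id1:'project:dataset_id'"))
```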
For Terraform >= 0.13, you can use `replace-provider` to migrate state from previous versions.

Example usage:

```
terraform state replace-provider -auto-approve "registry.terraform.io/-/aws" "hashicorp/aws"
```
Filtering is based on Terraform resource ID patterns. To find valid ID patterns for your resource, check the import section of the Terraform documentation.

Example usage:

```
terraformer import aws --resources=vpc,subnet --filter=vpc=myvpcid --regions=eu-west-1
```

This will import only the VPC with ID `myvpcid`. This form of filter helps when you need to select resources by their identifiers.

It is also possible to filter by a specific field name only, e.g. when you want to retrieve only resources with a specific tag key.

Example usage:

```
terraformer import aws --resources=s3 --filter="Name=tags.Abc" --regions=eu-west-1
```

This will import only the S3 resources that have the tag `Abc`. This form of filter helps when the field values are not important from a filtering perspective.

It is also possible to filter by a field that contains a dot.

Example usage:

```
terraformer import aws --resources=s3 --filter="Name=tags.Abc.def" --regions=eu-west-1
```

This will import only the S3 resources that have the tag `Abc.def`.
The `plan` command generates a planfile that contains all the resources set to be imported. By modifying the planfile before running the `import` command, you can rename or filter the resources you'd like to import.

The remaining subcommands and parameters are identical to the `import` command.

```
$ terraformer plan google --resources=networks,firewall --projects=my-project --regions=europe-west1-d
(snip)
Saving planfile to generated/google/my-project/terraformer/plan.json
```

After reviewing/customizing the planfile, begin the import by running `import plan`:

```
$ terraformer import plan generated/google/my-project/terraformer/plan.json
```
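Because the planfile is plain JSON, it can also be customized programmatically rather than by hand. The sketch below assumes a hypothetical layout in which the plan carries an `importedResource` list of objects with a `ResourceName` field; inspect your own generated `plan.json` for the actual structure before relying on this:

```python
import json

def prefix_resource_names(path, prefix):
    """Load a planfile, prefix every resource name, and write it back.
    The 'importedResource'/'ResourceName' keys are assumptions made
    for illustration; check your generated plan.json for the real layout."""
    with open(path) as fh:
        plan = json.load(fh)
    for resource in plan.get("importedResource", []):
        resource["ResourceName"] = prefix + "_" + resource["ResourceName"]
    with open(path, "w") as fh:
        json.dump(plan, fh, indent=2)
```

After editing, run `terraformer import plan` on the modified file as shown above.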
Terraformer by default separates each resource into its own file, placed in the corresponding service directory. The default path for resource files is `{output}/{provider}/{service}/{resource}.tf` and can vary for each provider.

It's possible to adjust the generated structure by:

- using the `--compact` parameter to group resource files within a single service into one `resources.tf` file, or
- using the `--path-pattern` parameter and passing e.g. `--path-pattern {output}/{provider}/` to generate resources for all services in one directory.

The `--compact` and `--path-pattern` parameters can be combined.
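The `{placeholder}` tokens in `--path-pattern` behave like simple template substitution. An illustrative sketch of the expansion (not Terraformer's implementation):

```python
def expand_pattern(pattern, **values):
    """Replace {output}/{provider}/{service}-style placeholders;
    unknown placeholders are left untouched."""
    for key, value in values.items():
        pattern = pattern.replace("{" + key + "}", value)
    return pattern

# Default pattern: one directory per service.
print(expand_pattern("{output}/{provider}/{service}/",
                     output="generated", provider="google",
                     service="networks"))  # generated/google/networks/

# --path-pattern {output}/{provider}/ puts all services in one directory.
print(expand_pattern("{output}/{provider}/",
                     output="generated", provider="google"))  # generated/google/
```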
Both Terraformer and a Terraform provider plugin need to be installed.

From source, build all providers:

```
git clone <terraformer repo> && cd terraformer/
go mod download
go build -v
```

or build with a single provider:

```
go run build/main.go {google,aws,azure,kubernetes,etc}
```

From releases, on Linux:

```
export PROVIDER={all,google,aws,kubernetes}
curl -LO "https://github.com/GoogleCloudPlatform/terraformer/releases/download/$(curl -s https://api.github.com/repos/GoogleCloudPlatform/terraformer/releases/latest | grep tag_name | cut -d '"' -f 4)/terraformer-${PROVIDER}-linux-amd64"
chmod +x terraformer-${PROVIDER}-linux-amd64
sudo mv terraformer-${PROVIDER}-linux-amd64 /usr/local/bin/terraformer
```

or on macOS:

```
export PROVIDER={all,google,aws,kubernetes}
curl -LO "https://github.com/GoogleCloudPlatform/terraformer/releases/download/$(curl -s https://api.github.com/repos/GoogleCloudPlatform/terraformer/releases/latest | grep tag_name | cut -d '"' -f 4)/terraformer-${PROVIDER}-darwin-amd64"
chmod +x terraformer-${PROVIDER}-darwin-amd64
sudo mv terraformer-${PROVIDER}-darwin-amd64 /usr/local/bin/terraformer
```

From a package manager:

- Homebrew: `brew install terraformer`
- MacPorts: `sudo port install terraformer`
- Chocolatey: `choco install terraformer`

Create a working folder and initialize the Terraform provider plugin. This folder will be where you run Terraformer commands.
Run `terraform init` against a `versions.tf` file to install the plugins required for your platform. For example, if you need plugins for the google provider, `versions.tf` should contain:

```
terraform {
  required_providers {
    google = {
      source = "hashicorp/google"
    }
  }
  required_version = ">= 0.13"
}
```
Or, copy your Terraform provider's plugin(s) from the list below to the folder `~/.terraform.d/plugins/`, as appropriate.
Links to download Terraform provider plugins:
Information on provider plugins: https://www.terraform.io/docs/configuration/providers.html
If you have improvements or fixes, we would love to have your contributions. Please read CONTRIBUTING.md for more information on the process we would like contributors to follow.
Terraformer was built so you can easily add new providers of any kind.
Process for generating `tf`/`json` + `tfstate` files:

1. Generate `tf`/`json` files.
2. Generate `tfstate` files.

All mapping of resources is done by providers and Terraform. Upgrades are needed only for providers.
For GCP compute resources, use the generated code from `providers/gcp/gcp_compute_code_generator`.

To regenerate the code:

```
go run providers/gcp/gcp_compute_code_generator/*.go
```
Terraforming gets all attributes from cloud APIs and creates HCL and tfstate files with templating. Each attribute in the API needs to be mapped to an attribute in Terraform, and files generated from templates can end up with invalid syntax. Whenever a provider adds new attributes, the Terraforming code needs to be updated.

Terraformer instead uses Terraform provider files for mapping attributes, HashiCorp's HCL library, and Terraform code.

Compare S3 support in Terraforming with Terraform's official S3 support: Terraforming lacks full coverage for resources. As an example, you can see that 70% of S3 options are not supported.