
Setting Up Cloud Integrations

This document outlines how to set up cloud integration for accounts on multiple cloud providers, or for multiple accounts on the same cloud provider. Multi-cloud is an enterprise feature. This configuration can be used independently of, or in addition to, other cloud integration configurations provided by Kubecost. Once configured, Kubecost will display cloud assets for all configured accounts and perform reconciliation for all federated clusters whose respective accounts are configured.

Step #1 Set up Cloud Cost and Usage Reporting

For each cloud account that you would like to configure, make sure it is exporting cost data to its respective service so that Kubecost can gain access to it.


Azure: Set up cost data export following this guide.


GCP: Set up BigQuery billing data exports with this guide.


AWS: Follow steps #1-3 to set up and configure a CUR in our guide.

Step #2 Create Cloud Integration Secret

The secret should contain a file called cloud-integration.json with the following format:

	{
		"azure": [],
		"gcp": [],
		"aws": []
	}

This configuration supports multiple accounts per cloud provider: simply add any number of cloud configuration objects to their respective arrays in the JSON file. The structure and required values of the configuration objects for each cloud provider are described below. Once you have filled in the configuration objects, use the command:

kubectl create secret generic <SECRET_NAME> --from-file=cloud-integration.json -n kubecost

Once the secret is created, set .Values.kubecostProductConfigs.cloudIntegrationSecret to <SECRET_NAME> and upgrade Kubecost via Helm.


Azure

The values needed to provide access to the Azure Storage Account where cost data is being exported can be found in the Azure portal, in the Storage account where the cost data is being exported:

  • azureSubscriptionID is the ID of the subscription that the exported files are being generated for.
  • azureStorageAccount is the name of the Storage account where the exported CSV is being stored.
  • azureStorageAccessKey can be found by selecting the "Access Keys" option from the navigation sidebar and then selecting "Show Keys". Either of the two keys will work.
  • azureStorageContainer is the name that you chose for the exported cost report when you set it up. This is the name of the container where the CSV cost reports are saved in your Storage account.

Set these values into the following object and add it to the Azure array:

	"azureSubscriptionID": "<SUBSCRIPTION_ID>",
	"azureStorageAccount": "<STORAGE_ACCOUNT_NAME>",
	"azureStorageAccessKey": "<STORE_ACCESS_KEY>",
	"azureStorageContainer": <REPORT_CONTAINER_NAME>


GCP

If you don't already have a GCP service key for any of the projects you would like to configure, you can run the following commands to generate and export one. Make sure your gcloud project is the one where your external costs are incurred.

export PROJECT_ID=$(gcloud config get-value project)
gcloud iam service-accounts create compute-viewer-kubecost --display-name "Compute Read Only Account Created For Kubecost" --format json
gcloud projects add-iam-policy-binding $PROJECT_ID --member serviceAccount:compute-viewer-kubecost@$PROJECT_ID.iam.gserviceaccount.com --role roles/compute.viewer
gcloud projects add-iam-policy-binding $PROJECT_ID --member serviceAccount:compute-viewer-kubecost@$PROJECT_ID.iam.gserviceaccount.com --role roles/bigquery.user
gcloud projects add-iam-policy-binding $PROJECT_ID --member serviceAccount:compute-viewer-kubecost@$PROJECT_ID.iam.gserviceaccount.com --role roles/bigquery.dataViewer
gcloud projects add-iam-policy-binding $PROJECT_ID --member serviceAccount:compute-viewer-kubecost@$PROJECT_ID.iam.gserviceaccount.com --role roles/bigquery.jobUser
gcloud iam service-accounts keys create ./compute-viewer-kubecost-key.json --iam-account compute-viewer-kubecost@$PROJECT_ID.iam.gserviceaccount.com
You can then get your service account key to paste into the UI (be careful with this!):

cat compute-viewer-kubecost-key.json

  • key is the GCP service key created above. This value should be left as JSON when inserted into the configuration object.
  • projectID is the GCP project ID, which should match the project ID in the GCP service key.
  • billingDataDataset requires a BigQuery dataset prefix (e.g. billing_data) in addition to the BigQuery table name. A full example is billing_data.gcp_billing_export_v1_018AIF_74KD1D_534A2.
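The consistency rules above (projectID matching the service key, and the dataset.table format) can be checked before deploying. A hedged Python sketch with made-up values:

```python
def check_gcp_config(key_json: dict, project_id: str, dataset: str) -> None:
    """Sanity-check a GCP integration object before adding it to the array."""
    # projectID should match the project_id embedded in the service key.
    assert key_json.get("project_id") == project_id, \
        "projectID does not match the service key's project_id"
    # billingDataDataset must be '<dataset_prefix>.<table_name>'.
    assert "." in dataset, \
        "expected '<dataset>.<table>', e.g. billing_data.gcp_billing_export_v1_..."

# Hypothetical service key and values for illustration only
key = {"type": "service_account", "project_id": "my-project"}
check_gcp_config(key, "my-project", "billing_data.gcp_billing_export_v1_018AIF_74KD1D_534A2")
```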

Set these values into the following object and add it to the GCP array:

	"key": <KEY_JSON>
	"projectID": "<PROJECT_ID>",
	"billingDataDataset": "<BILLING_DATA_DATASET>",


AWS

For each AWS account that you would like to configure, create an Access Key for the Kubecost user which has access to the CUR. Navigate to Access Management > Users, find the Kubecost user, and select Security Credentials > Create Access Key. Note the Access key ID and Secret access key.

Gather each of these values from the AWS console for each account you would like to configure:

  • serviceKeyName is the ID of the Access Key created in the previous step.
  • serviceKeySecret is the Secret of the Access Key created in the previous step.
  • athenaBucketName is an S3 bucket to store Athena query results that you have created and that Kubecost has permission to access. The name of the bucket should match s3://aws-athena-query-results-*, so the IAM roles defined above will automatically allow access to it. The bucket can have a Canned ACL of Private or other permissions as you see fit.
  • athenaRegion is the AWS region Athena is running in.
  • athenaDatabase is the name of the database created by the Athena setup. The Athena database name is available as the value (Physical ID) of AWSCURDatabase in the CloudFormation stack created above (in Step 2: Setting up the Athena of the AWS guide above).
  • athenaTable is the name of the table created by the Athena setup. The table name is typically the database name with the leading athenacurcfn_ removed (but it is not available as a CloudFormation stack resource).
  • projectID is the AWS account ID where the Athena CUR is, e.g. "530337586277".
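Two of the conventions above can be expressed mechanically: deriving the table name from the database name, and checking the results-bucket naming pattern. A Python sketch with hypothetical names:

```python
from fnmatch import fnmatch

def athena_table_from_database(database: str) -> str:
    """The table name is typically the database name minus the leading athenacurcfn_."""
    prefix = "athenacurcfn_"
    return database[len(prefix):] if database.startswith(prefix) else database

def bucket_matches_convention(bucket: str) -> bool:
    """The IAM roles in the guide allow buckets named s3://aws-athena-query-results-*."""
    return fnmatch(bucket, "s3://aws-athena-query-results-*")

# Hypothetical names for illustration
print(athena_table_from_database("athenacurcfn_my_cur_report"))  # my_cur_report
print(bucket_matches_convention("s3://aws-athena-query-results-123456789012"))  # True
```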

Set these values into the following object and add it to the AWS array:

	"serviceKeyName": "<ACCESS_KEY_ID>",
    "athenaBucketName": "<ATHENA_BUCKET_NAME>",
    "athenaRegion": "<ATHENA_REGION>",
    "athenaDatabase": "<ATHENA_DATABASE>",
    "athenaTable": "<ATHENA_TABLE>",
    "projectID": "<ATHENA_PROJECT_ID>"
