

GCS Create Bucket


Introduction

The GCS Create Bucket plugin of Devtron enables automated creation of Google Cloud Storage (GCS) buckets directly within CI/CD workflows. By integrating the GCS Create Bucket plugin, teams can simplify cloud storage provisioning and efficiently manage and store application logs, deployment artifacts, backup data, and other critical application assets in centralized cloud storage.

Prerequisites

Before integrating the GCS Create Bucket plugin, ensure you have a Google Cloud Platform (GCP) account and a GCP project with the appropriate permissions.
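
The ServiceAccountCred input described later in this document expects the GCP service account key JSON encoded in base64. Below is a minimal sketch of producing that value, assuming the key has been downloaded to key.json (a hypothetical path):

```python
import base64

# Read the downloaded service-account key (key.json is a hypothetical path)
with open("key.json", "rb") as f:
    key_json = f.read()

# Base64-encode the key for the ServiceAccountCred input
print(base64.b64encode(key_json).decode("ascii"))
```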


Steps

  1. Navigate to the Jobs section, click Create, and choose Job.

  2. In the 'Create job' window, enter Job Name and choose a target project.

  3. Click Create Job.

  4. In the 'Configurations' tab, fill in the required fields under the 'Source code' section and click Save.

  5. In Workflow Editor, click + Job Pipeline.

  6. Give a name to the workflow and click Create Workflow.

  7. Click Add job pipeline to this workflow.

  8. Fill in the required fields in the 'Basic configuration' tab.

  9. Go to the 'Tasks to be executed' tab.

  10. Under 'Tasks', click the + Add task button.

  11. Select the GCS Create Bucket plugin.

  12. Enter the following with appropriate values.


User Inputs

Task Name

Enter the name of your task.

e.g., GCS Create Bucket

Description

Add a brief explanation of the task and the reason for choosing the plugin. Include information for someone else to understand the purpose of the task.

e.g., A Plugin to create GCS Bucket

Input Variables

| Variable | Format | Description | Sample Value |
| --- | --- | --- | --- |
| BucketName | STRING | Name of the GCS bucket to be created | my-app-logs-bucket |
| StorageClass | STRING | Storage class for the bucket (STANDARD, NEARLINE, COLDLINE, ARCHIVE) | archive |
| Project | STRING | GCP project ID where the bucket will be created | gcp-68493 |
| EnableBucketPrefix | STRING | Enable a prefix for bucket naming (true/false) | true |
| ServiceAccountCred | STRING | Base64-encoded GCP service account credentials | eyJ0eXBlIjoic2VydmljZV9hY2 |
| LocationType | STRING | Type of location (region/dual-region/multi-region) | region |
| Location | STRING | Geographic location where the bucket will be created | us-central1 |
| EnableAutoClass | BOOL | Automatically optimizes storage costs by moving objects between storage classes based on how frequently they are accessed. Defaults to false. | true |
| UniformAccess | STRING | Enable uniform bucket-level access to create the bucket with bucket-level permissions instead of Access Control Lists (true/false) | true |
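
For reference, the following is a minimal sketch of an equivalent bucket creation using the google-cloud-storage Python client, showing roughly how the main inputs above map onto GCS settings. It is illustrative only, reuses the sample values from the table, and is not the plugin's actual implementation; EnableBucketPrefix and LocationType are plugin-specific options and are not shown.

```python
import base64
import json

from google.cloud import storage
from google.oauth2 import service_account

# ServiceAccountCred: replace the placeholder with the base64-encoded key JSON
key_info = json.loads(base64.b64decode("<ServiceAccountCred value>"))
credentials = service_account.Credentials.from_service_account_info(key_info)

# Project: the GCP project that will own the bucket
client = storage.Client(project="gcp-68493", credentials=credentials)

# BucketName, StorageClass, UniformAccess, EnableAutoClass
bucket = client.bucket("my-app-logs-bucket")
bucket.storage_class = "ARCHIVE"  # the client library expects the uppercase form
bucket.iam_configuration.uniform_bucket_level_access_enabled = True  # UniformAccess
bucket.autoclass_enabled = True  # EnableAutoClass (needs a recent google-cloud-storage release)

# Location: the region/dual-region/multi-region where the bucket is created
client.create_bucket(bucket, location="us-central1")
```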

Trigger/Skip Condition

Here you can set conditions to execute or skip the task. You can select Set trigger conditions for the execution of a task or Set skip conditions to skip the task.

Output Variables

| Variable | Format | Description |
| --- | --- | --- |
| BucketName | STRING | The name of the bucket created. |
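
A task running after this one can use the BucketName output, for example to push build logs into the newly created bucket. A minimal sketch, assuming the value is made available to that task as an environment variable named BUCKET_NAME (a hypothetical name) and that GCP credentials are already configured in the task environment:

```python
import os

from google.cloud import storage

# BUCKET_NAME is a hypothetical environment variable carrying the BucketName output
bucket = storage.Client().bucket(os.environ["BUCKET_NAME"])

# Upload a local build log into the created bucket
bucket.blob("logs/build.log").upload_from_filename("build.log")
```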

Pass/Failure Condition

Here you can define when a task should be marked as passed or failed. You can select Set pass conditions to define success criteria or Set failure conditions to specify failure scenarios.

Click Update Pipeline.
