Manage your Kubernetes deployments

This guide demonstrates how to bring your Kubernetes deployment management experience into Port. You will learn how to:

  • Ingest Kubernetes cluster, deployment, and pod data into Port's software catalog using Port's Kubernetes integration.
  • Set up self-service actions to manage Kubernetes deployments and pods (restart deployment and delete pod).

Common use cases

  • Monitor the status and health of all Kubernetes deployments and pods across clusters from a single interface.
  • Provide self-service capabilities for developers to restart deployments and manage pods.

Prerequisites

This guide assumes the following:

  • You have a Port account, and Port's Kubernetes integration is installed so that cluster, deployment, and pod data is ingested into your software catalog.
  • You have a GitHub repository in which you can create workflows and add secrets.
  • You have access to a Kubernetes cluster (the examples in this guide use GKE).

Dedicated Workflows Repository

We recommend creating a dedicated repository for the workflows that are used by Port actions.

Set up self-service actions

We will create self-service actions to manage your Kubernetes deployments and pods directly from Port using GitHub Actions. We will implement workflows to:

  1. Restart a Kubernetes deployment.
  2. Delete a Kubernetes pod.

To implement these use cases, follow the steps below:

Add GitHub secrets

In your GitHub repository, go to Settings > Secrets and add the following secrets:

  • PORT_CLIENT_ID - Your Port client ID.
  • PORT_CLIENT_SECRET - Your Port client secret.
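
You can also add these secrets from the command line with the GitHub CLI instead of the web UI. A minimal sketch, assuming you are authenticated with gh, run it from the repository root, and replace the placeholder values with your own Port credentials:

# Store the Port credentials as repository secrets for GitHub Actions.
gh secret set PORT_CLIENT_ID --body "<your-port-client-id>"
gh secret set PORT_CLIENT_SECRET --body "<your-port-client-secret>"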

Configure Kubernetes authentication

Choose one of the following authentication methods based on your cluster setup:

This approach uses Google Cloud's authentication for GKE clusters.

  1. Create a GCP service account in the Google Cloud Console:

    • Go to IAM & Admin → Service Accounts.
    • Click Create Service Account.
    • Name it github-actions and add a description.
    • Grant the following roles:
      • Kubernetes Engine Cluster Viewer (roles/container.clusterViewer).
      • Kubernetes Engine Admin (roles/container.admin).
  2. Create a service account key:

    • In the service account details, go to the Keys tab.
    • Click Add Key → Create new key.
    • Choose JSON format and download the key file.
  3. Add to GitHub secrets:

    • GCP_SERVICE_ACCOUNT_KEY - The service account key JSON (minified to a single line).
    • GCP_CLUSTER_LOCATION - The location of your cluster.
Minifying JSON for GitHub Secrets

GitHub masks each line of a multi-line secret in workflow logs, which can cause aggressive log sanitization. To avoid this, minify your service account JSON into a single line before storing it as a GitHub secret. You can use an online tool or the following command (pbcopy copies the result to the clipboard on macOS):

jq -c '.' your-service-account-key.json | pbcopy
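
If you prefer to script the GCP setup rather than use the console, the same service account, role bindings, and key can be created with the gcloud CLI. A minimal sketch, assuming <PROJECT_ID> is replaced with your GCP project ID:

# Create the service account used by the GitHub workflows.
gcloud iam service-accounts create github-actions --project=<PROJECT_ID> --display-name="github-actions"

# Grant the two roles listed above.
gcloud projects add-iam-policy-binding <PROJECT_ID> \
  --member="serviceAccount:github-actions@<PROJECT_ID>.iam.gserviceaccount.com" \
  --role="roles/container.clusterViewer"
gcloud projects add-iam-policy-binding <PROJECT_ID> \
  --member="serviceAccount:github-actions@<PROJECT_ID>.iam.gserviceaccount.com" \
  --role="roles/container.admin"

# Create a JSON key and copy it, minified, for the GCP_SERVICE_ACCOUNT_KEY secret.
gcloud iam service-accounts keys create key.json \
  --iam-account="github-actions@<PROJECT_ID>.iam.gserviceaccount.com"
jq -c '.' key.json | pbcopy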

Restart a Kubernetes deployment

Add GitHub workflow

Create the file .github/workflows/restart-k8s-deployment.yaml in your repository.

Restart GKE Deployment GitHub workflow (Click to expand)
name: Restart GKE Deployment

on:
  workflow_dispatch:
    inputs:
      port_context:
        required: true
        description: 'Action and general context (blueprint, entity, run id, etc...)'
        type: string

jobs:
  restart-deployment:
    runs-on: ubuntu-latest
    steps:
      - uses: 'actions/checkout@v4'

      - name: Inform Port of workflow start
        uses: port-labs/port-github-action@v1
        with:
          clientId: ${{ secrets.PORT_CLIENT_ID }}
          clientSecret: ${{ secrets.PORT_CLIENT_SECRET }}
          baseUrl: https://api.getport.io
          operation: PATCH_RUN
          runId: ${{ fromJson(inputs.port_context).runId }}
          logMessage: Configuring GCP credentials to restart GKE deployment ${{ fromJson(inputs.port_context).entity.title }}

      - id: 'auth'
        uses: 'google-github-actions/auth@v2'
        with:
          credentials_json: '${{ secrets.GCP_SERVICE_ACCOUNT_KEY }}'

      - id: 'get-credentials'
        uses: 'google-github-actions/get-gke-credentials@v2'
        with:
          cluster_name: ${{ fromJson(inputs.port_context).entity.properties.Cluster }}
          location: '${{ secrets.GCP_CLUSTER_LOCATION }}'

      - name: Restart Kubernetes deployment
        run: |
          kubectl rollout restart deployment/${{ fromJson(inputs.port_context).entity.identifier }} -n ${{ fromJson(inputs.port_context).entity.relations.Namespace }}

      - name: Wait for deployment rollout
        run: |
          kubectl rollout status deployment/${{ fromJson(inputs.port_context).entity.identifier }} -n ${{ fromJson(inputs.port_context).entity.relations.Namespace }} --timeout=300s

      - name: Inform Port about deployment restart success
        if: success()
        uses: port-labs/port-github-action@v1
        with:
          clientId: ${{ secrets.PORT_CLIENT_ID }}
          clientSecret: ${{ secrets.PORT_CLIENT_SECRET }}
          baseUrl: https://api.getport.io
          operation: PATCH_RUN
          runId: ${{ fromJson(inputs.port_context).runId }}
          status: 'SUCCESS'
          logMessage: ✅ GKE deployment ${{ fromJson(inputs.port_context).entity.title }} restarted successfully
          summary: GKE deployment restart completed successfully

      - name: Inform Port about deployment restart failure
        if: failure()
        uses: port-labs/port-github-action@v1
        with:
          clientId: ${{ secrets.PORT_CLIENT_ID }}
          clientSecret: ${{ secrets.PORT_CLIENT_SECRET }}
          baseUrl: https://api.getport.io
          operation: PATCH_RUN
          runId: ${{ fromJson(inputs.port_context).runId }}
          status: 'FAILURE'
          logMessage: ❌ Failed to restart GKE deployment ${{ fromJson(inputs.port_context).entity.title }}
          summary: GKE deployment restart failed

Create Port action

  1. Go to the Self-service page of your portal.

  2. Click on the + New Action button.

  3. Click on the {...} Edit JSON button.

  4. Copy and paste the following JSON configuration into the editor.

    Restart Kubernetes deployment action (Click to expand)
    Modification Required

    Make sure to replace <GITHUB-ORG> and <GITHUB-REPO> with your GitHub organization and repository names respectively.

    {
      "identifier": "restart_k8s_deployment",
      "title": "Restart Kubernetes Deployment",
      "icon": "Cluster",
      "description": "Restart a Kubernetes deployment to trigger a rolling update",
      "trigger": {
        "type": "self-service",
        "operation": "DAY-2",
        "userInputs": {
          "properties": {},
          "required": []
        },
        "blueprintIdentifier": "workload"
      },
      "invocationMethod": {
        "type": "GITHUB",
        "org": "<GITHUB-ORG>",
        "repo": "<GITHUB-REPO>",
        "workflow": "restart-k8s-deployment.yaml",
        "workflowInputs": {
          "port_context": {
            "entity": "{{ .entity }}",
            "runId": "{{ .run.id }}"
          }
        },
        "reportWorkflowStatus": true
      },
      "requiredApproval": false
    }
  5. Click Save.

Now you should see the Restart Kubernetes Deployment action on the Self-service page. 🎉
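
Before triggering the action from Port, you can verify that the service account has the permissions it needs by running the same commands the workflow executes against your cluster. A minimal sketch, with <CLUSTER_NAME>, <LOCATION>, <DEPLOYMENT>, and <NAMESPACE> as placeholders for your own values:

# Authenticate kubectl against the GKE cluster, then restart the deployment and wait for the rollout.
gcloud container clusters get-credentials <CLUSTER_NAME> --location=<LOCATION>
kubectl rollout restart deployment/<DEPLOYMENT> -n <NAMESPACE>
kubectl rollout status deployment/<DEPLOYMENT> -n <NAMESPACE> --timeout=300s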

Delete a Kubernetes pod

Add GitHub workflow

Create the file .github/workflows/delete-k8s-pod.yaml in your repository.

Delete GKE Pod GitHub workflow (Click to expand)
name: Delete GKE Pod

on:
  workflow_dispatch:
    inputs:
      port_context:
        required: true
        description: 'Action and general context (blueprint, entity, run id, etc...)'
        type: string

jobs:
  delete-pod:
    runs-on: ubuntu-latest
    steps:
      - uses: 'actions/checkout@v4'

      - name: Inform Port of workflow start
        uses: port-labs/port-github-action@v1
        with:
          clientId: ${{ secrets.PORT_CLIENT_ID }}
          clientSecret: ${{ secrets.PORT_CLIENT_SECRET }}
          baseUrl: https://api.getport.io
          operation: PATCH_RUN
          runId: ${{ fromJson(inputs.port_context).runId }}
          logMessage: Configuring GCP credentials to delete GKE pod ${{ fromJson(inputs.port_context).entity.title }}

      - id: 'auth'
        uses: 'google-github-actions/auth@v2'
        with:
          credentials_json: '${{ secrets.GCP_SERVICE_ACCOUNT_KEY }}'

      - name: Set up Cloud SDK
        uses: google-github-actions/setup-gcloud@v2

      - id: 'get-credentials'
        uses: 'google-github-actions/get-gke-credentials@v2'
        with:
          cluster_name: ${{ fromJson(inputs.port_context).entity.properties.Cluster }}
          location: '${{ secrets.GCP_CLUSTER_LOCATION }}'

      - name: Delete Kubernetes pod
        run: |
          kubectl delete pod ${{ fromJson(inputs.port_context).entity.identifier }} -n ${{ fromJson(inputs.port_context).entity.properties.namespace }}

      - name: Inform Port about pod deletion success
        if: success()
        uses: port-labs/port-github-action@v1
        with:
          clientId: ${{ secrets.PORT_CLIENT_ID }}
          clientSecret: ${{ secrets.PORT_CLIENT_SECRET }}
          baseUrl: https://api.getport.io
          operation: PATCH_RUN
          runId: ${{ fromJson(inputs.port_context).runId }}
          status: 'SUCCESS'
          logMessage: ✅ GKE pod ${{ fromJson(inputs.port_context).entity.title }} deleted successfully
          summary: GKE pod deletion completed successfully

      - name: Inform Port about pod deletion failure
        if: failure()
        uses: port-labs/port-github-action@v1
        with:
          clientId: ${{ secrets.PORT_CLIENT_ID }}
          clientSecret: ${{ secrets.PORT_CLIENT_SECRET }}
          baseUrl: https://api.getport.io
          operation: PATCH_RUN
          runId: ${{ fromJson(inputs.port_context).runId }}
          status: 'FAILURE'
          logMessage: ❌ Failed to delete GKE pod ${{ fromJson(inputs.port_context).entity.title }}
          summary: GKE pod deletion failed

Create Port action

  1. Go to the Self-service page of your portal.

  2. Click on the + New Action button.

  3. Click on the {...} Edit JSON button.

  4. Copy and paste the following JSON configuration into the editor.

    Delete Kubernetes pod action (Click to expand)
    Modification Required

    Make sure to replace <GITHUB-ORG> and <GITHUB-REPO> with your GitHub organization and repository names respectively.

    {
      "identifier": "delete_k8s_pod",
      "title": "Delete Kubernetes Pod",
      "icon": "Cluster",
      "description": "Delete a Kubernetes pod (will be recreated by the deployment)",
      "trigger": {
        "type": "self-service",
        "operation": "DELETE",
        "userInputs": {
          "properties": {},
          "required": []
        },
        "blueprintIdentifier": "pod"
      },
      "invocationMethod": {
        "type": "GITHUB",
        "org": "<GITHUB-ORG>",
        "repo": "<GITHUB-REPO>",
        "workflow": "delete-k8s-pod.yaml",
        "workflowInputs": {
          "port_context": {
            "entity": "{{ .entity }}",
            "runId": "{{ .run.id }}"
          }
        },
        "reportWorkflowStatus": true
      },
      "requiredApproval": false
    }
  5. Click Save.

Now you should see the Delete Kubernetes Pod action on the Self-service page. 🎉
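
Because the pod is managed by a deployment, deleting it simply triggers a replacement. After running the action, you can watch the new pod come up with kubectl, assuming <NAMESPACE> is the pod's namespace:

# Watch pods in the namespace; the deleted pod should be replaced within a few moments.
kubectl get pods -n <NAMESPACE> -w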