
Streamline Your Business with Containerized Applications on Google Kubernetes Engine (GKE)

August 26, 2022

Introduction: Google Kubernetes Engine (GKE) is a managed Kubernetes service by Google Cloud that simplifies the deployment and management of containerized applications. This blog will guide you through the process of hosting containers on GKE using the "Hello App" as an example.

Objectives:

• Create a GKE cluster

• Deploy a sample app to GKE

• Expose the sample app to the internet

• Deploy a new version of the sample app

• Clean up resources after completion

Costs: Before starting, be aware that following this guide may incur costs for the use of Google Cloud resources, including GKE cluster instances, storage, and network egress. Refer to the Google Cloud Pricing documentation for more details on pricing, or reach out to us at cssdm@quadrasystems.net for an estimate.

Before You Begin:

1. Create a Google Cloud project or use an existing one.

2. Enable the necessary APIs, such as the Kubernetes Engine API, Cloud Build API, and Artifact Registry API.

3. Set up the Cloud SDK and authenticate with your Google Cloud account (sample commands for this and the previous step follow this list).

4. Install Git and Docker on your local machine (optional).
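Steps 2 and 3 above can also be completed from the command line. A minimal sketch, where PROJECT_ID is a placeholder for your project ID:

gcloud auth login

gcloud config set project PROJECT_ID

gcloud services enable container.googleapis.com cloudbuild.googleapis.com artifactregistry.googleapis.com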

Create a Repository:

1. Create a new repository or use an existing one to store your source code and Docker configuration files.
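Note that the push step later in this guide targets an Artifact Registry Docker repository named hello-repo. If that repository does not exist yet, a command along these lines creates it (REGION is a placeholder for your chosen region, such as us-west1):

gcloud artifacts repositories create hello-repo --repository-format=docker --location=REGION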

Building the Container Image:

GKE accepts Docker images as the application deployment format. Before deploying hello-app to GKE, you must package the hello-app source code as a Docker image.

To build a Docker image, you need the source code and a Dockerfile. A Dockerfile contains the instructions for how the image is built.
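As a rough sketch, assuming the hello-app source code and its Dockerfile are in your current directory, the image can be built with a local tag such as hello-app:v1:

docker build -t hello-app:v1 .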

Running Your Container Locally (optional):

1. Run the container locally to test and verify its functionality.
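For example, since hello-app listens on port 8080, the locally built image could be run and checked along these lines:

docker run --rm -p 8080:8080 hello-app:v1

curl http://localhost:8080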

Pushing the Docker Image to Artifact Registry:

1. Configure Docker to authenticate with Artifact Registry.

gcloud auth configure-docker REGION-docker.pkg.dev

2. Tag the container image with the Artifact Registry repository's location.
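For example, assuming the image was built locally as hello-app:v1 and that PROJECT_ID holds your project ID and REGION is your chosen region:

docker tag hello-app:v1 REGION-docker.pkg.dev/${PROJECT_ID}/hello-repo/hello-app:v1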

3. Push the container image to Artifact Registry.

docker push REGION-docker.pkg.dev/${PROJECT_ID}/hello-repo/hello-app:v1

Creating a GKE Cluster:

Use the Cloud Console to create a GKE cluster.

1. Go to the Google Kubernetes Engine page in the Google Cloud console.

2. Click + Create.

3. For GKE Autopilot, click Configure.

4. In the Name field, enter the name hello-cluster.

5. Select a Compute Engine region from the Region drop-down list, such as us-west1.

6. Click Create.

7. Wait for the cluster to be created. When the cluster is ready, a checkmark appears next to the cluster name.
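If you prefer the command line, an equivalent Autopilot cluster can be created and your kubectl credentials fetched with commands along these lines (using us-west1 as the example region):

gcloud container clusters create-auto hello-cluster --region us-west1

gcloud container clusters get-credentials hello-cluster --region us-west1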

Deploying the Sample App to GKE:

Deploy the sample app to the GKE cluster using the Kubernetes deployment manifest.

1. Go to the Workloads page in the Google Cloud console.

2. Click + Deploy.

3. In the Specify container section, select Existing container image.

4. In the Image path field, click Select.

5. In the Select container image pane, select the hello-app image you pushed to Artifact Registry and click Select.

6. In the Container section, click Done, then click Continue.

7. In the Configuration section, under Labels, enter app for Key and hello-app for Value.

8. Under Configuration YAML, click View YAML. This opens a YAML configuration file representing the two Kubernetes API resources about to be deployed into your cluster: one Deployment, and one HorizontalPodAutoscaler for that Deployment.

9. Click Close, then click Deploy.

10. When the Deployment Pods are ready, the Deployment details page opens.

11. Under Managed pods, note the three running Pods for the hello-app Deployment.
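A rough command-line equivalent of what the console steps above create is sketched below; the console may apply different replica and autoscaling defaults, so treat this as an approximation:

kubectl create deployment hello-app --image=REGION-docker.pkg.dev/${PROJECT_ID}/hello-repo/hello-app:v1

kubectl scale deployment hello-app --replicas=3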

Exposing the Sample App to the Internet:

Create a Kubernetes service to expose the sample app to the internet.

1. Go to the Workloads page in the Google Cloud console.

2. Click hello-app.

3. From the Deployment details page, click Actions > Expose.

4. In the Expose dialog, set the Target port to 8080. This is the port the hello-app container listens on.

5. From the Service type drop-down list, select Load balancer.

6. Click Expose to create a Kubernetes Service for hello-app.

7. When the Load Balancer is ready, the Service details page opens.

8. Scroll down to the External endpoints field and copy the IP address.
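The same Service can also be created from the command line. A sketch, assuming the Deployment is named hello-app and using the Service name hello-app-service referenced later in the clean-up section:

kubectl expose deployment hello-app --name=hello-app-service --type=LoadBalancer --port=80 --target-port=8080

kubectl get service hello-app-service

The EXTERNAL-IP column shows the address once the load balancer is provisioned.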

Deploying a New Version of the Sample App:

You will upgrade hello-app to a new version by building and deploying a new Docker image to your GKE cluster.

GKE's rolling update feature lets you update your Deployments without downtime. During a rolling update, your GKE cluster incrementally replaces the existing hello-app Pods with Pods containing the Docker image for the new version. During the update, your load balancer service routes traffic only into available Pods.

Return to Cloud Shell, where you cloned the hello-app source code and Dockerfile, and update the hello() function in the main.go file to report the new version 2.0.0.
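After editing main.go, the v2 image can be built with the Artifact Registry tag directly and pushed, mirroring the v1 steps (same REGION and PROJECT_ID placeholders as before):

docker build -t REGION-docker.pkg.dev/${PROJECT_ID}/hello-repo/hello-app:v2 .

docker push REGION-docker.pkg.dev/${PROJECT_ID}/hello-repo/hello-app:v2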

Now you're ready to update your hello-app Kubernetes Deployment to use a new Docker image.

1. Go to the Workloads page in the Google Cloud console.

2. Click hello-app.

3. On the Deployment details page, click Actions > Rolling update.

4. In the Rolling update dialog, set the Image of hello-app field to REGION-docker.pkg.dev/PROJECT_ID/hello-repo/hello-app:v2.

5. Click Update.

6. On the Deployment details page, inspect the Active Revisions section. You should now see two Revisions, 1 and 2. Revision 1 corresponds to the initial Deployment you created earlier. Revision 2 is the rolling update you just started.

7. After a few moments, refresh the page. Under Managed pods, all of the replicas of hello-app now correspond to Revision 2.

8. In a separate tab, navigate again to the Service IP address you copied. The Version should be 2.0.0.
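For reference, the same rolling update can be triggered from the command line. A sketch, assuming the container inside the Deployment is also named hello-app, and with EXTERNAL_IP standing in for the Service IP address you copied earlier:

kubectl set image deployment/hello-app hello-app=REGION-docker.pkg.dev/${PROJECT_ID}/hello-repo/hello-app:v2

kubectl rollout status deployment/hello-app

curl http://EXTERNAL_IP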

Clean Up:

1. Delete the GKE services to avoid incurring unnecessary costs.

kubectl delete service hello-app-service

2. Delete the cluster and the container images.

gcloud container clusters delete hello-cluster --region REGION

gcloud artifacts docker images delete \
   REGION-docker.pkg.dev/${PROJECT_ID}/hello-repo/hello-app:v1 \
   --delete-tags --quiet

gcloud artifacts docker images delete \
   REGION-docker.pkg.dev/${PROJECT_ID}/hello-repo/hello-app:v2 \
   --delete-tags --quiet

Conclusion: In this blog, you learned how to host containers on Google Kubernetes Engine (GKE) using the "Hello App" example. Along the way, you created a GKE cluster, deployed the sample app, exposed it to the internet, rolled out a new version, and cleaned up the resources afterward.

GKE provides a reliable and scalable platform for hosting containerized applications, empowering you to build and manage robust systems with ease. Refer to the official Google Cloud documentation for more in-depth guidance and to explore GKE's advanced features, or reach out to us at cssdm@quadrasystems.net for hands-on assistance wherever needed.
