Serverless on Google Cloud Platform with Cloud Run and GKE Autopilot – Club Cloud Stories #4

The latest news from around the cloud: Club Cloud Stories #4 is here! Luca Cavallin & Jacco Kulman – joined by special guest Antoni Tzavelas (Google Cloud Course Creator and DevOps enthusiast) – are going to discuss Cloud Run and GKE Autopilot!



Luca Cavallin and Matt Watson gave a talk in September about Cloud Run and GKE Autopilot, two Google Cloud services that make it easier to run containers. In this episode of Club Cloud Stories, we discuss the highlights of these products.

Cloud Run

Cloud Run lets you develop and deploy highly scalable containerized applications on a fully managed serverless platform. With it:

  • You can use any language!
  • You get the benefits of containers: they require fewer system resources than VMs, they are easy to scale, and applications run the same regardless of where they are deployed.
  • You get the benefits of serverless too: the platform is fully managed, autoscaling and autohealing come built in, and services scale to zero.

Cloud Run is a perfect choice for applications such as web services, (light) data processing systems, and scheduled tasks (e.g. document generation). Furthermore, it supports Google’s Global Load Balancer, HTTP/2, gRPC, WebSockets, and Eventarc. Cloud Run is also secure: workloads are sandboxed using gVisor, and the platform provides integrated logging and monitoring and supports Secret Manager, Binary Authorization, and customer-managed encryption keys.

Cloud Run can also be a good way to save on costs! With it, you pay only when handling requests, and committed use discounts are available.
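Getting a containerized application live on Cloud Run comes down to a single command once you have a container image. A minimal sketch, where the service name, image path, and region are placeholders you would replace with your own:

```shell
# Deploy a container image to Cloud Run.
# "my-service", the image path, and the region are example values.
gcloud run deploy my-service \
  --image=gcr.io/my-project/my-app:latest \
  --region=europe-west4 \
  --allow-unauthenticated   # make the service publicly reachable
```

Cloud Run builds a new revision from the image, provisions HTTPS, and starts routing traffic; with no requests coming in, the service scales down to zero instances and you pay nothing.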

GKE Autopilot

With GKE Autopilot, Google manages the cluster’s underlying infrastructure, so you can forget about configuration management on GKE!

GKE Autopilot provides a fully monitored, hands-off experience, with automatic node scaling and built-in redundancy.
It is especially useful for lift-and-shift migrations, web applications, and scheduled and batch workloads (e.g. CI/CD pipelines). It also enforces security through container isolation, a ban on privileged pods, and hardened underlying nodes with no SSH access.

GKE Autopilot pricing is based on the resources your pods request (CPU, memory, and ephemeral storage), plus a flat per-cluster fee (as with "regular" GKE).
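Creating an Autopilot cluster takes one command: there are no node pools or machine types to choose, because Google provisions nodes to match your pods. A minimal sketch, with the cluster name and region as example values:

```shell
# Create a GKE Autopilot cluster; Google manages nodes, scaling, and upgrades.
# "my-autopilot-cluster" and the region are example values.
gcloud container clusters create-auto my-autopilot-cluster \
  --region=europe-west4

# Fetch credentials so kubectl can talk to the new cluster.
gcloud container clusters get-credentials my-autopilot-cluster \
  --region=europe-west4
```

From there you deploy workloads with plain `kubectl`, and billing follows the resource requests declared in your pod specs.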

Previous episodes

Club Cloud Stories #3

Club Cloud Stories #2

Club Cloud Stories #1

Luca is a Software Engineer and Trainer with full-stack experience ranging from distributed systems to cross-platform apps. He is currently interested in building modern, serverless solutions on Google Cloud using Golang, Rust, and React, leveraging SRE and Agile practices. Luca holds three Google Cloud certifications, is part of the Google Developer Experts community, and co-organizes the Google Cloud User Group together with Google.
Jacco is a Cloud Consultant. As an experienced development team lead, he has coded for the banking, hospitality, and media industries. He is a big fan of serverless architectures. In his free time he reads science fiction, contributes to open source projects, and enjoys being a lifelong learner.