
Deploy your side-projects at scale for [basically] free – Google Cloud Run, Hacker News


I have built hundreds of side projects over the years, and finding a place to manage and deploy them all has always been tricky. From the early days of a GoDaddy hosting package with random PHP files in folders, through having a persistent DigitalOcean droplet running, and even running a bare-minimum Kubernetes cluster on Google Kubernetes Engine (GKE) – I've never been satisfied with the outcome.

I have a few requirements when it comes to deploying random side projects:

  1. Fully-managed – I don't want to have to worry about servers anymore – I want serverless all the things.
  2. Cheap – These projects aren't making me money, so I need to keep costs down.
  3. Language agnostic – One day I may be playing with something in Node, the next Python, the next Go.
  4. Scalable – In the unlikely event something does take off, I don't want to have to worry about it falling over.

Thankfully, I've found a solution that I am happy with – Google Cloud Run.

What is Google Cloud Run?

Google is terrible at marketing…

    Cloud Run is a fully managed compute platform that automatically scales your stateless containers. Cloud Run is serverless: it abstracts away all infrastructure management, so you can focus on what matters most — building great applications.

    What this means is that you can give it a Docker container (technically, any OCI-compatible container image) and it will deploy, run, and scale it.


Why is it good?

    If we evaluate this by my requirements:


    Fully-managed

    After using Cloud Run for over a year now, I have never had to touch a server, VM, cluster, or anything else. This is truly a deploy-and-forget service.

    Because it is fully managed, there are a few requirements to make your application compatible. The only real one you have to worry about is ensuring your application runs an HTTP server listening on the port set in the PORT environment variable, which is present at runtime in your container.


    Cheap

    The beauty of Cloud Run is that it is only 'running' your container when it gets traffic. The pricing model is set up so that you only pay for the CPU, memory, and network bandwidth used while your app is getting requests.

    It achieves this by deploying your app on demand when traffic hits the domain name they give you; the instance hangs around for a while (an undetermined period) after traffic stops, and then the app is torn down. The other way to look at this is autoscaling – when there is no traffic, it scales to 0.

    There is a cost in the form of time – if your app takes time to 'set up' when it starts, you will be making your users wait as Cloud Run scales your application from 0 to 1 instances. From my own use, I've found this to be negligible though.

    Due to this pricing model, I've never paid more than a few cents – yes, CENTS – a month for all the side projects I currently have deployed. This is a factor of the little traffic they get, so you may need to do the maths for yours – the pricing page is here.
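To see why the bill stays this small, here is a back-of-the-envelope sketch. The unit prices and free-tier allowances below are assumptions for illustration only – check the pricing page for the real figures:

```javascript
// Back-of-the-envelope Cloud Run cost model. All prices and free-tier
// allowances here are illustrative assumptions, not official figures.
const price = {
  perVcpuSecond: 0.000024,   // $ per vCPU-second (assumed)
  perGibSecond: 0.0000025,   // $ per GiB-second of memory (assumed)
  freeVcpuSeconds: 180000,   // assumed monthly free tier
  freeGibSeconds: 360000,    // assumed monthly free tier
};

// A hypothetical quiet side project: 10,000 requests a month, each
// handled in ~100 ms by a 1 vCPU / 256 MiB instance.
const requests = 10000;
const secondsPerRequest = 0.1;
const vcpuSeconds = requests * secondsPerRequest * 1;    // 1,000 vCPU-s
const gibSeconds = requests * secondsPerRequest * 0.25;  // 250 GiB-s

// Only usage above the free tier is billed.
const monthlyCost =
  Math.max(0, vcpuSeconds - price.freeVcpuSeconds) * price.perVcpuSecond +
  Math.max(0, gibSeconds - price.freeGibSeconds) * price.perGibSecond;

console.log(monthlyCost); // 0 – nowhere near the free tier
```

Because billing only counts time spent actually serving requests, the idle 99.9% of the month costs nothing.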



    Language Agnostic

    As Cloud Run takes any container image and deploys it, you can use any language you want. Be it Node, Go, Java, PHP, or something entirely obscure – as long as it speaks HTTP and listens on the port defined in the PORT environment variable, Cloud Run doesn't care what you do inside the container.


    Scalable

    I am yet to have a side project go 'viral', but I am confident that, should such an event occur, Cloud Run will handle it. The service will create more and more instances of your application, up to the limit you define (at the time of writing, the cap is 1,000 instances).

    As long as you have architected your application to be stateless – storing data in something like a database (e.g. Cloud SQL) or object storage (e.g. Cloud Storage) – then you are good to go. Do consider whether any services you depend on can handle the traffic should this scenario occur, though.
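The 'stateless' caveat is worth a concrete illustration. Each Cloud Run instance is a separate copy of your process, so anything kept in a module-level variable is per-instance and vanishes on scale-to-zero. A hypothetical sketch (the names here are made up):

```javascript
// Each Cloud Run instance is its own process; simulate two of them.
function createInstance() {
  let hits = 0; // in-memory state, private to this instance
  return {
    handleRequest() {
      hits += 1; // never shared with other instances, lost on teardown
      return { hits };
    },
  };
}

// Two instances autoscaled up behind the same URL:
const instanceA = createInstance();
const instanceB = createInstance();

instanceA.handleRequest(); // { hits: 1 }
instanceB.handleRequest(); // { hits: 1 } – not 2; each counted alone
```

Anything you want to survive – counters, sessions, uploads – belongs in an external store such as Cloud SQL or Cloud Storage instead.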

    Simple Node.js Example

    Enough chat, let's write some code.

    If we take the most basic example of a Node.js Express app serving some JSON – this could be the API server for your statically deployed React frontend, for example.


    The Application

    You have your usual Node application code – something like:


```javascript
const express = require('express');
const app = express();
const port = process.env.PORT || 3000;

app.get('/', (req, res) => {
  res.json({ message: 'Hello World' });
});

app.listen(port, () => {
  console.log(`Example app listening on port ${port}!`);
});
```

    This is the bare minimum to run an HTTP server in Node – in this case, when you hit the root path, it returns a JSON payload with a "Hello World" message. The only 'special' bit is line 3, where we grab the port number to tell Express to listen on from the environment variable if it exists – this will be provided by the Cloud Run service at execution time.

    Next we need a Dockerfile to create our Docker image from:

```dockerfile
FROM node:12
WORKDIR /usr/src/app
COPY package*.json ./
RUN npm install
COPY . .
CMD ["node", "server.js"]
```

    This is a barebones Dockerfile which uses the node:12 base image, copies our package.json file in, installs our dependencies, and then runs our server.js file when the container starts.
    With that set, it is now time to build & deploy.

    Building & Pushing our Image

    Google Cloud Run requires our images to be in the Google Container Registry of our project for easiest access. As such, you will need to have this enabled on your Google Cloud project – you can find out how to do this here. You will also need the gcloud CLI installed on your machine and logged in to your account – then run `gcloud auth configure-docker` to set up gcloud to work with Docker.

    Now to build, tag, and push our image to the repository. Your container tag will need to be in the following format: gcr.io/[gcp-project]/[app-name]:latest

    Where [gcp-project] is your Google Cloud project name, e.g. my-project, and [app-name] is the name of your container, e.g. cloud-run-demo.

    You then build:

    docker build -t gcr.io/[gcp-project]/[app-name]:latest .

    and then push:

    docker push gcr.io/[gcp-project]/[app-name]:latest

    If everything worked without errors, you can now move onto deploying via Cloud Run.

    Deploying on Cloud Run

    You can do this via the gcloud CLI (`gcloud beta run ...` commands) or, my preferred way, the Google Cloud Console.

    Start by going to the Cloud Run section of your Google Cloud Console:
