Kubernetes Deployments Using Canary

Sandeep Kumar Patel
Sep 9, 2021


Canary Deployment

A canary deployment, or canary release, is a deployment pattern that allows you to roll out new code/features to a subset of users as an initial test.

Implementing Canary Releases

The initial steps for implementing a canary deployment are: create two clones of the production environment, put a load balancer in front that initially sends all traffic to one version, and deploy the new functionality to the other version. When you release the new software version, you shift some percentage of your user base, say 10%, to the new version while keeping the remaining 90% on the old version. If that 10% reports no errors, you gradually roll the new version out to more users until everyone is using it. If the 10% does hit problems, you can roll it right back, and 90% of your users will never even have seen the problem.
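With plain Kubernetes (no service mesh or ingress-level traffic splitting), one rough way to approximate that 90/10 split is by replica counts: two Deployments label their pods identically, so a single Service in front of them spreads traffic across all of the pods. The names below are hypothetical and only meant as a sketch of the idea:

# Both Deployments label their pods app=my-app, so one Service selects pods from both.
kubectl scale deployment/my-app-v1 --replicas=9   # stable version keeps ~90% of the pods, and therefore ~90% of the traffic
kubectl scale deployment/my-app-v2 --replicas=1   # canary version receives roughly 10%

Scaling my-app-v2 back to zero replicas returns all traffic to the stable version.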

Note that infrastructure changes and configuration changes should always be tested with canaries because of their sensitivity.

Why Canary Deployment?

Canary deployment benefits include zero downtime, easy rollout, and quick rollback, plus the added safety of a gradual rollout process. It also has some drawbacks: the expense of maintaining multiple server instances, and the difficult decision of whether or not to clone the database for the canary.

Typically, software development teams implement blue/green deployment when they’re sure the new version will work properly and want a simple, fast strategy to deploy it. Conversely, canary deployment is most useful when the development team isn’t as sure about the new version and they don’t mind a slower rollout if it means they’ll be able to catch the bugs.

Where Did the Canary Deployment Concept Come From?

You might be wondering why a little yellow bird is used to indicate a test release of a new feature. To answer that, we’ll have to go back to the coal mining days of the 1920s. Miners brought caged canaries into the coal mines because if there was a high level of toxic gases (typically carbon monoxide), the canary would die, alerting the miners to evacuate the tunnel immediately.

In a similar vein, when you release a feature to a small subset of users, those users can act as the canary, providing an early warning if something goes wrong so that you can roll back to the previous, stable version of the application.

Prerequisites

The canary deployment steps below rely on the following assumptions:

  • An initial service and its corresponding deployment should already exist in your cluster.
  • The name of each deployment should contain its version.
  • The service should have a metadata label that shows what the “production” version is.

These requirements ensure that each canary deployment finishes in a state that lets the next one run in the same way. You can verify them with the quick checks below.
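Assuming the cluster already follows these conventions (the service name and the versioned deployment name shown in the comments are hypothetical), a couple of quick checks might look like this:

kubectl get deployments --show-labels                                         # deployment names should include a version, e.g. nginx-deployment-1.0
kubectl get service nginx-service -o jsonpath='{.metadata.labels.version}'    # prints the version currently labeled as "production"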

Creating a Deployment

Nginx Deployment (nginx-deployment.yaml):
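The original post showed this manifest only as an image, so the following is a minimal sketch of what nginx-deployment.yaml might contain. The labels, replica count, and image tag are illustrative assumptions; the prerequisites suggest putting the version in the Deployment name (e.g. nginx-deployment-1.0), but the name is kept as nginx-deployment here to match the kubectl commands used later in the article:

apiVersion: apps/v1
kind: Deployment
metadata:
  name: nginx-deployment
  labels:
    app: nginx
    version: "1.0"
spec:
  replicas: 3
  selector:
    matchLabels:
      app: nginx
  template:
    metadata:
      labels:
        app: nginx
        version: "1.0"
    spec:
      containers:
      - name: nginx
        image: nginx:1.21        # illustrative image tag
        ports:
        - containerPort: 80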

Service (service.yaml):
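Likewise a sketch, not the original manifest: the Service carries a metadata label recording which version is currently treated as production (per the prerequisites), and its selector matches only app: nginx so that pods from a future canary Deployment would also receive traffic:

apiVersion: v1
kind: Service
metadata:
  name: nginx-service
  labels:
    version: "1.0"               # the version currently considered "production"
spec:
  selector:
    app: nginx                   # selects the pods created by the Deployment above
  ports:
  - port: 80
    targetPort: 80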

Create the Deployment and Service by running the following commands:

kubectl apply -f nginx-deployment.yaml
kubectl apply -f service.yaml
kubectl get deployments
kubectl get pod

A canary deployment is a deployment strategy that releases an application or service incrementally to a subset of users. All infrastructure in a target environment is updated in small phases (e.g., 2%, 25%, 75%, 100%). Because of this fine-grained control, a canary release is the least risky of the deployment strategies.

kubectl get deployment nginx-deployment
kubectl describe deployment nginx-deployment

Pros:

Canary deployments allow organizations to test in production with real users and use cases and compare different service versions side by side. It’s cheaper than a blue-green deployment because it does not require two production environments. And finally, it is fast and safe to trigger a rollback to a previous version of an application.
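To illustrate that last point about rollback, here are two sketches reusing the hypothetical names from earlier; which one applies depends on how the canary was set up:

kubectl scale deployment/my-app-v2 --replicas=0      # two-Deployment canary: remove the canary pods and all traffic returns to the stable version
kubectl rollout undo deployment/nginx-deployment     # single Deployment updated in place: revert to the previous revision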

Cons:

Drawbacks to canary deployments involve testing in production and the implementation work required. Scripting a canary release can be complex: manual verification or testing can take time, and the monitoring and instrumentation needed for testing in production may require additional research.

Proportional scaling

RollingUpdate Deployments support running multiple versions of an application at the same time. When you or an autoscaler scales a RollingUpdate Deployment that is in the middle of a rollout (either in progress or paused), the Deployment controller balances the additional replicas in the existing active ReplicaSets (ReplicaSets with Pods) in order to mitigate risk. This is called proportional scaling.
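A quick way to observe proportional scaling, assuming the nginx-deployment sketched earlier (the new image tag is illustrative):

kubectl set image deployment/nginx-deployment nginx=nginx:1.22   # start a rolling update
kubectl scale deployment/nginx-deployment --replicas=15          # scale while the rollout is still in progress
kubectl get rs                                                    # the additional replicas are spread across the old and new ReplicaSets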

kubectl get deploy
kubectl get rs

For More Information


Sandeep Kumar Patel

Passionate about AI and ML, I see research as purposeful curiosity. Eager for feedback: email patelsandeep88@gmail.com.