7 best practices: Building applications for containers and Kubernetes

Let’s examine key considerations for building new applications specifically for containers and Kubernetes, according to cloud-native experts

Don’t let the growing popularity of containers and Kubernetes dupe you into thinking that you should use them to run any and every type of application. You need to distinguish between “can” and “should.”

One basic example of this distinction is the difference between building an app specifically to be run in containers and operated with Kubernetes (some would refer to this as cloud-native development) and using these containers and orchestration for existing monolithic apps.

We’ll cover the latter scenario in an upcoming post. Today, we’re focused on some of the key considerations for building new applications specifically for containers and Kubernetes – in part because the so-called “greenfield” approach might be the better starting point for teams just beginning with containers and orchestration.

“Containers [and orchestration] are a technical vehicle for building, deploying, and running cloud-native applications,” says Rani Osnat, VP of strategy at Aqua Security. “I typically recommend to those starting their journey with containers to use a new, simple greenfield application as their test case.”

[ Want to learn about building and deploying Kubernetes Operators? Get the free eBook: O’Reilly: Kubernetes Operators: Automating the Container Orchestration Platform. ]

How to develop apps for containers and Kubernetes

We asked Osnat and other cloud-native experts to share their top tips for developing apps specifically to be run in containers using Kubernetes. Let’s dive into seven of their best recommendations.

1. Think and build modern

If you’re building a new home today, you’ve got different styles and approaches than you would have, say, 50 years ago. The same is true with software: You’ve got new tools and approaches at your disposal.

“If you’re building an app, build it in a modern way!” says Miles Ward, CTO at SADA. Ward points to microservices and the 12-factor methodology as chief examples of modern application development.

Ward notes that while microservices and containers can work well together, the pairing isn’t strictly required: given the right conditions, other architectures can work, too. “Microservices are also mentioned often in conjunction with Kubernetes; however, it is absolutely not a hard requirement,” Ward says. “A monolithic approach can also work, provided it can scale horizontally either as a single horizontal deployment or as multiple deployments with different endpoints to the same codebase.”

The same is true of twelve-factor: “Twelve-factor is a useful starting point, but its tenets aren’t necessarily law,” Ward says.
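
One twelve-factor tenet that does translate directly to Kubernetes is keeping configuration in the environment rather than baked into the image. As a minimal sketch (the app name, keys, and values below are placeholders, not from any specific project), a ConfigMap can hold the settings and the pod can load them as environment variables:

```yaml
# Hypothetical example: externalized configuration, injected at deploy time.
apiVersion: v1
kind: ConfigMap
metadata:
  name: orders-config
data:
  LOG_LEVEL: "info"
  CACHE_TTL_SECONDS: "300"
---
apiVersion: v1
kind: Pod
metadata:
  name: orders
spec:
  containers:
    - name: orders
      image: registry.example.com/orders:1.0.0   # placeholder image
      envFrom:
        - configMapRef:
            name: orders-config   # every key becomes an environment variable
```

Because the image itself stays configuration-free, the same build can move unchanged from test to production.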

But if you’re building an application from scratch – as Osnat advises teams do when they are getting started with containers and orchestration – give strong consideration to the microservices approach.

“To maximize the benefits of using containers, architect your app as a microservices app, in a way that will allow it to function even when individual containers are being refreshed,” Osnat advises. “It should also be structured so container images represent units that can be released independently, allowing for an efficient CI/CD implementation.”
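
What “functioning while individual containers are being refreshed” looks like in practice is largely a matter of replicas, rolling updates, and health probes. Here’s a minimal, hedged sketch of a Deployment along those lines – the service name, image, port, and probe path are all placeholders:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: orders          # hypothetical microservice
spec:
  replicas: 3           # more than one copy, so a restart never takes the service down
  selector:
    matchLabels:
      app: orders
  strategy:
    type: RollingUpdate
    rollingUpdate:
      maxUnavailable: 0   # keep full capacity while new pods roll in
      maxSurge: 1
  template:
    metadata:
      labels:
        app: orders
    spec:
      containers:
        - name: orders
          image: registry.example.com/orders:1.0.0   # released independently of other services
          ports:
            - containerPort: 8080
          readinessProbe:            # traffic only goes to pods that report ready
            httpGet:
              path: /healthz
              port: 8080
            initialDelaySeconds: 5
            periodSeconds: 10
```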

“Modern” development can be defined in various ways. If you’re building an application for containers and Kubernetes, it essentially means making choices that suit these packaging and deployment technologies. Here are two more examples:

  • Define container images as logical units that can scale independently: “For instance, it usually makes sense to implement databases, logging, monitoring, load-balancing, and user session components as their own containers or groups of containers,” Osnat says.
  • Consider cloud-native APIs: “Kubernetes has powerful API extension mechanisms,” says Vaibhav Kamra, VP of engineering and co-founder at Kasten. “By integrating with those, you can immediately take advantage of existing tools in the ecosystem like command-line utilities and authentication.”
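
To make that second point concrete, here is a minimal sketch of one such extension mechanism, a CustomResourceDefinition. The “Backup” resource and its group are hypothetical, but once a CRD like this is applied, standard tooling such as kubectl and the cluster’s authentication work with the new type automatically:

```yaml
apiVersion: apiextensions.k8s.io/v1
kind: CustomResourceDefinition
metadata:
  name: backups.example.com          # must be <plural>.<group>
spec:
  group: example.com
  scope: Namespaced
  names:
    kind: Backup
    plural: backups
    singular: backup
  versions:
    - name: v1
      served: true
      storage: true
      schema:
        openAPIV3Schema:
          type: object
          properties:
            spec:
              type: object
              properties:
                schedule:
                  type: string       # e.g. a cron expression for the backup
```

After that, `kubectl get backups` behaves like any built-in resource, which is exactly the “existing tools in the ecosystem” benefit Kamra describes.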

“Modern” is a good thing from a software development perspective, too.

“The great thing about most modern languages and frameworks is that they are overwhelmingly container-friendly,” says Ravi Lachhman, DevOps advocate at Harness. “Going back even a few years, runtimes like Java had a difficult time respecting container boundaries, and the dreaded out-of-memory killer would strike seemingly arbitrarily, from the operator’s perspective. Today, because of the popularity of containers and orchestrators, especially Kubernetes, the languages and frameworks have evolved to live in the new paradigm.”
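
How that plays out with a modern JVM, for instance: the container’s memory limit caps the process, and a flag such as -XX:MaxRAMPercentage (available in current JDKs) sizes the heap relative to that limit rather than to the host. A hedged sketch, with placeholder names and illustrative numbers:

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: java-service       # hypothetical Java workload
spec:
  containers:
    - name: app
      image: registry.example.com/java-service:1.0.0   # placeholder image
      resources:
        requests:
          memory: "512Mi"
          cpu: "250m"
        limits:
          memory: "512Mi"    # the boundary the runtime is expected to respect
      env:
        - name: JAVA_TOOL_OPTIONS
          value: "-XX:MaxRAMPercentage=75.0"   # heap sized against the container limit
```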

[ As Kubernetes adoption grows, what can IT leaders expect? Read also: 5 Kubernetes trends to watch in 2020. ]

2. CI/CD and automation are your friends

Automation is a critical characteristic of container orchestration; it should be a critical characteristic of virtually all aspects of building an application to be run in containers on Kubernetes. Otherwise, the operational burden can be overwhelming.

“Build applications and services with automation as minimum table stakes,” recommends Chander Damodaran, chief architect at Brillio. “With the proliferation of services and components, this can become an unmanageable issue.”

A well-conceived CI/CD pipeline is an increasingly popular approach to baking automation into as many phases of your development and deployment processes as possible. Check out our recent primer for IT leaders: How to build a CI/CD pipeline.
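
As one illustration of what “minimum table stakes” can look like, here is a bare-bones pipeline sketch using GitHub Actions (any CI system works just as well); the test command, image name, and registry are placeholders:

```yaml
name: build-and-test
on: [push]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run tests
        run: make test                     # placeholder for your test command
      - name: Build container image
        run: docker build -t registry.example.com/myapp:${{ github.sha }} .
      - name: Push image                   # assumes registry credentials are already configured
        run: docker push registry.example.com/myapp:${{ github.sha }}
```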

Another way to think about the value of automation: It will make your mistakes – which are almost inevitable, especially early on – easier to bounce back from.

“Using any new platform requires a lot of trial and error, and the ease of using Kubernetes does not excuse you from taking necessary precautions,” says Lachhman from Harness. “Having a robust continuous delivery pipeline in place can ensure that confidence-building standards like testing, security, and change management strategies are followed to ensure your applications are running effectively.”

[ Why does Kubernetes matter to IT leaders? Learn more about Red Hat's point of view. ]

3. Keep container images as light as possible

Another key principle when developing an application for containers and Kubernetes: Keep your container images as small as possible for performance, security, and other reasons.

“Only include what you absolutely need. Often images contain packages which are not needed to run the contained application,” says Ken Mugrage, principal technologist in the office of the CTO at ThoughtWorks. Make sure to remove all other packages – including shell utilities – that are not required by the application. This not only makes the images smaller but reduces the attack surface for security issues, he says.

This is a good example of how building a containerized application might require a shift in traditional practices for some development teams.

“Developers need to rethink how they develop applications. For example, create a smaller container and base image,” says Nilesh Deo, director of product marketing at CloudBolt. “The smaller the image, the faster it can load, and as a result, the application will be faster.”
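
A multi-stage build is one common way to act on this advice. The sketch below assumes a Go service (the paths and names are hypothetical): the compiler and build tools stay in the first stage, and only the finished binary lands in a minimal, shell-free base image:

```dockerfile
# Build stage: full toolchain, never shipped
FROM golang:1.22 AS build
WORKDIR /src
COPY . .
RUN CGO_ENABLED=0 go build -o /out/app ./cmd/app   # hypothetical entry point

# Runtime stage: no shell, no package manager, just the binary
FROM gcr.io/distroless/static
COPY --from=build /out/app /app
USER nonroot
ENTRYPOINT ["/app"]
```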

Let’s examine four other best practices: