Go serverless in managing your production container fleet using AWS Fargate
Shubham Gupta
September 1, 2020

The Problem: “Works on My Machine”

As a developer, when you start writing code for your application, you tend to make certain configurations on your machine and install dependencies and software. Some of these configurations could be specific to your operating system.

What if other developers on your team try to run the same code base, which you developed, on their machines, but struggle to do so? You would turn around and say, hey, that worked on my machine! As it turns out, running the application requires a lot of things in addition to the code itself, including configuration and dependency management. This problem can hinder the development process for the entire team, time and again.

And then you might run into similar issues when deploying the application to a production environment, because you must take care of installing all those dependencies (with their exact versions) on the new machine as well.

What are containers?

Containers have emerged as the industry standard solution to address the problem we just discussed.

Historically, UNIX-based operating systems have used the term jail. A jail (what we now know as a container) is an environment where a process is isolated from the rest of the operating system and is explicitly provided with all its dependencies. Yet constructing containers by hand can be burdensome and prone to mistakes, which kept them out of reach for many. There was a need to standardize the approach to containers so that they become more predictable, configurable, and scalable.

Enter Docker!

This is exactly where Docker came to the rescue.

Docker builds on existing container technology in the operating system (such as Linux namespaces and cgroups) to create secure and reliable containers for you at a very low computational cost. You don’t have to worry about security, updates, or keeping up with new container features; Docker does all that for you. This saves you a lot of time and money and brings peace of mind.

Just as an architect starts with a blueprint to construct a building, a Docker image is the blueprint of a Docker container. In a Dockerfile you specify the dependencies, their versions, the directory structure of your code, everything in one file, and building that file produces the image. When a Docker image is executed by the Docker engine, the result is a Docker container, and you can create any number of containers from a single image.
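As a rough illustration, here is a minimal sketch (using the Docker SDK for Python) of building an image from a Dockerfile and spinning up a couple of containers from it. The image name and directory layout are assumptions for illustration only, not something prescribed by this article.

```python
# A minimal sketch using the Docker SDK for Python (docker-py).
# It assumes the current directory contains a Dockerfile for a hypothetical "myapp" service.
import docker

client = docker.from_env()  # connect to the local Docker engine

# Build an image from the Dockerfile: this is the "blueprint" step.
image, build_logs = client.images.build(path=".", tag="myapp:1.0")

# Run two independent containers from the same image.
for i in range(2):
    container = client.containers.run("myapp:1.0", detach=True, name=f"myapp-{i}")
    print(container.name, container.status)
```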

You can think of Docker containers just like physical shipping containers: a box that holds all the code and dependencies required to run a piece of software. And just as a physical container is easy to transport by train or ship, Docker containers are easy to build, run, and ship.

Where to keep the Docker Images?

Docker repositories can be thought of like Git repositories. You may choose to keep your code only on your computer, or you can push it to GitHub, GitLab, Bitbucket, or another host. Likewise, Docker images can be pushed to a Docker image registry. The most common one is Docker Hub. Since we are talking about Amazon’s Fargate, there is also a registry service provided by Amazon itself: the Elastic Container Registry, aka ECR. But Fargate lets you pull Docker images from any registry.
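To make the registry idea concrete, here is a hedged sketch of tagging the image built above and pushing it to ECR, using boto3 together with the Docker SDK for Python. It assumes an ECR repository named myapp already exists and that AWS credentials are configured; the region is a placeholder.

```python
# A hedged sketch of pushing a local image to ECR; repository name and region are placeholders.
import base64

import boto3
import docker

region = "us-east-1"
ecr = boto3.client("ecr", region_name=region)
docker_client = docker.from_env()

# Get a temporary login token for the private registry.
auth = ecr.get_authorization_token()["authorizationData"][0]
username, password = base64.b64decode(auth["authorizationToken"]).decode().split(":")
registry = auth["proxyEndpoint"].replace("https://", "")

docker_client.login(username=username, password=password, registry=registry)

# Tag the local image with the ECR repository URI and push it.
repo_uri = f"{registry}/myapp"
image = docker_client.images.get("myapp:1.0")
image.tag(repo_uri, tag="1.0")
docker_client.images.push(repo_uri, tag="1.0")
```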

Creating, storing, and deploying Docker containers is super easy. But managing 100 containers? Oh yes, that is going to be painful. Since Docker containers and microservice architecture go hand in hand, scaling your application horizontally becomes much easier: you predict that 1,000 more customers will arrive in the next 15 minutes and your current containers can’t handle those requests, so you just spin up more containers. But once those requests are served, do you still need those containers? They eat up compute power and money while sitting idle, so it makes sense to shut them down. Do you see the problem? You can’t manage all those containers manually. This is where container-orchestration tools come in handy.

Cluster Management

Container-orchestration tools are like commanders of your container fleet. They do all the heavy lifting for you and take away pain points such as scheduling, placement, task management (health checks), and control-plane updates. One such commander is Amazon’s Elastic Container Service (ECS). But cluster management is still only half the equation: you still have to manage the EC2 instances and the surrounding infrastructure, which involves tasks such as patching and updating the OS and agents, and scaling the instance fleet for optimal utilization.

We want to manage only the containers, without worrying about the EC2 instances (the infrastructure).

What is Fargate?

Fargate is a serverless technology that can be thought of as containers on demand. It simplifies your DevOps because, with Fargate, you just write the application logic that runs in your containers without worrying about infrastructure: Amazon handles provisioning, scaling, and management for you. Fargate scales up and down seamlessly, and the best part is that you pay only for what you use. It is also deeply integrated with the AWS ecosystem, including VPC networking, Elastic Load Balancing, IAM permissions, CloudWatch, and more.

To configure Fargate to run your containers, there is a single file to manage: the task definition. This file contains information about, among other things –

  1. The Docker image to use for each container
  2. The CPU and memory to allocate to the task
  3. The launch type (Fargate or EC2) and the networking mode
  4. The IAM roles used to run the task and to pull the image
  5. Logging configuration and port mappings
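As a rough sketch, this is how a minimal Fargate task definition might be registered with boto3. The image URI, IAM role, log group, and port are placeholder values, not ones taken from this article.

```python
# A hedged sketch of registering a Fargate task definition; all ARNs and names are placeholders.
import boto3

ecs = boto3.client("ecs", region_name="us-east-1")

ecs.register_task_definition(
    family="myapp",
    requiresCompatibilities=["FARGATE"],
    networkMode="awsvpc",   # required for Fargate tasks
    cpu="512",              # 0.5 vCPU
    memory="1024",          # 1 GB
    executionRoleArn="arn:aws:iam::123456789012:role/ecsTaskExecutionRole",
    containerDefinitions=[
        {
            "name": "myapp",
            "image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/myapp:1.0",
            "essential": True,
            "portMappings": [{"containerPort": 8080, "protocol": "tcp"}],
            "logConfiguration": {
                "logDriver": "awslogs",
                "options": {
                    "awslogs-group": "/ecs/myapp",
                    "awslogs-region": "us-east-1",
                    "awslogs-stream-prefix": "myapp",
                },
            },
        }
    ],
)
```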

Another convenient thing about Fargate is that its task definition is exactly the same as the one ECS uses for the EC2 launch type, so you can switch between the EC2 launch type and Fargate with just the flip of a variable (as shown in the sketch below). Fargate is not a panacea though; there are scenarios where it may not be an ideal fit. Let’s delve into when to use it.
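Here is a hedged sketch of what that flip of a variable looks like when launching a task with boto3: the launchType parameter is the switch. The cluster name, subnet, and security group are placeholders.

```python
# A hedged sketch of running the task definition registered above; identifiers are placeholders.
import boto3

ecs = boto3.client("ecs", region_name="us-east-1")

launch_type = "FARGATE"   # the single parameter you flip when choosing between Fargate and EC2

ecs.run_task(
    cluster="myapp-cluster",
    taskDefinition="myapp",
    launchType=launch_type,
    count=1,
    networkConfiguration={   # required with the awsvpc network mode
        "awsvpcConfiguration": {
            "subnets": ["subnet-0123456789abcdef0"],
            "securityGroups": ["sg-0123456789abcdef0"],
            "assignPublicIp": "ENABLED",   # lets the task pull its image without a NAT gateway
        }
    },
)
```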


When to use Fargate?

Fargate makes a lot of sense if –

  1. You want to start building a new application and have little or no visibility into its usage patterns
  2. You want to get rid of the operational overhead of managing infrastructure
  3. Your application memory and compute requirements fit in the range of offerings provided by Fargate (maximum 4 vCPU and 30GB memory per task)
  4. You want a cost-effective solution

That being said, Fargate is not a solution to everything. In your production environment, you can run a hybrid of both: some workloads on Fargate and others on ECS’s EC2 launch type. But if you ask me, I would initially start with Fargate to get development going quickly without having to manage the infrastructure, and then, when I want more control over the underlying instances, switch to ECS’s EC2 launch type. Also, if you need command-line access to the hosts running your containers, Fargate is not an option: it does not expose the underlying instances, so the traditional EC2 launch type is suggested there.

In the next section, we shall look at a real-world application where we used Fargate to achieve a serverless containerized solution.


Case Study – Creating a cloud-based managed solution for providing API Testing-as-a-service

The Problem Statement

The ask was to create a managed platform for performing automated testing of an application’s API endpoints. It was also expected that users should be able to schedule the execution of their API test suites.

The application is intended to be used through an intuitive web portal.

The Solution

The testing framework was Dockerized so that it could be used for testing the APIs. A Fargate cluster was created along with a task definition. Each test run launches a task whose container runs the testing framework’s Docker image. After the API suite has executed, the results are pushed to an S3 bucket and the task is stopped.
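To sketch how such a run could be triggered, here is a hedged boto3 example. The cluster, task definition, container, bucket, and suite names are hypothetical placeholders rather than the actual names used in the project.

```python
# A hedged sketch of launching one API suite as a Fargate task; all names are placeholders.
import boto3

ecs = boto3.client("ecs", region_name="us-east-1")

response = ecs.run_task(
    cluster="api-testing-cluster",
    taskDefinition="api-test-runner",
    launchType="FARGATE",
    count=1,
    overrides={
        "containerOverrides": [
            {
                "name": "api-test-runner",
                "environment": [{"name": "SUITE_ID", "value": "suite-42"}],
            }
        ]
    },
    networkConfiguration={
        "awsvpcConfiguration": {
            "subnets": ["subnet-0123456789abcdef0"],
            "assignPublicIp": "ENABLED",
        }
    },
)
task_arn = response["tasks"][0]["taskArn"]

# Inside the container, once the suite finishes, the results file is uploaded to S3, e.g.
# boto3.client("s3").upload_file("results.json", "api-test-results", "suite-42/results.json"),
# and the process exits, which stops the Fargate task and the billing for it.
```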

Fargate is used in this study because it suits the use case perfectly. After the execution of a suite, the application stops the corresponding Fargate task, which makes the solution very cost-efficient: Fargate charges only for the compute and memory resources actually consumed. With alternatives such as dedicated EC2 instances (and their attached EBS volumes), the cost would have been higher. Beyond the cost aspect, Fargate being a managed service is simply easier to use.
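To see why the pay-per-use model helps here, consider a back-of-the-envelope comparison. The prices below are placeholders purely for illustration; check the current AWS pricing pages for real numbers.

```python
# Placeholder prices for illustration only, not actual AWS rates.
price_per_vcpu_hour = 0.04
price_per_gb_hour = 0.004

# Suppose each API suite runs as a 0.5 vCPU / 1 GB Fargate task for 10 minutes, 50 runs per day.
runs_per_day, minutes_per_run = 50, 10
task_hours = runs_per_day * minutes_per_run / 60

fargate_daily_cost = task_hours * (0.5 * price_per_vcpu_hour + 1 * price_per_gb_hour)

# An always-on EC2 instance sized for the same peak load bills for all 24 hours,
# whether or not any suite is running.
ec2_hourly_price = 0.05   # placeholder instance price
ec2_daily_cost = 24 * ec2_hourly_price

print(f"Fargate: ${fargate_daily_cost:.2f}/day vs always-on EC2: ${ec2_daily_cost:.2f}/day")
```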


Conclusion

With Fargate, off you go building the next big business solution. Don’t worry about the infrastructure needed to handle a million requests or about maintaining the servers under your containers. Just focus on building what matters.