Navigating DockerHub rate limits: the AWS ECR Pipeline solution

Harold (Tre) King
19th December, 2023
6 min read

The problem

In the dynamic world of DevOps, efficiency is key. Rapid iteration, quickly repeating the cycle of building, testing, and deploying, is a fundamental part of this ecosystem. However, our customers' DevOps teams were facing a significant hurdle during this essential process within Amazon Web Services (AWS). The culprit? DockerHub's rate limiting.
Rate limiting is a control that caps how frequently a user can interact with a service, in this case DockerHub. While it maintains quality of service and protects the platform from abusive behaviour, it was proving to be a roadblock for DevOps teams. The frequency of their operations was being curtailed, costing them time. This was a serious setback, as it hindered their ability to quickly build, test, and deploy applications, a crucial aspect of their work.

Our solution

Recognising the need for a more efficient process, our team sought a solution. The objective was clear: to maintain rapid iteration cycles without falling foul of DockerHub's rate limiting.
Our solution was found within AWS itself. We decided to leverage native AWS services to bypass the rate-limiting issue. The strategy was to pull an image from DockerHub and push a copy to an AWS Elastic Container Registry (ECR) repository we control.
AWS ECR is a fully managed Docker container registry that makes it easy for developers to store, manage, and deploy Docker container images. By utilising ECR, we could create a buffer between operations and DockerHub's rate limiting, thus maintaining efficiency.
This approach resolved the rate-limiting issue and gave us greater control over our resources. With our own ECR repository, container images could be managed more effectively, ensuring a smoother and more efficient DevOps process.
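In practice, "a repository we control" simply means downstream builds reference an ECR URI instead of docker.io. As a small illustrative sketch (the repository name and region below are placeholder assumptions, not values from the pipeline), the URI can be looked up with boto3:

    import boto3

    # Placeholder repository name and region, purely for illustration.
    ecr = boto3.client("ecr", region_name="us-east-1")
    repo = ecr.describe_repositories(repositoryNames=["amazonlinux"])["repositories"][0]

    # Builds now reference this URI instead of docker.io, so pulls never touch DockerHub.
    print(repo["repositoryUri"])  # e.g. <account-id>.dkr.ecr.us-east-1.amazonaws.com/amazonlinux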

How does it work?

When DockerHub introduced rate limiting, it significantly challenged rapid iteration processes. In response, we devised a solution that seamlessly combines AWS ECR, CodePipeline, and CodeBuild.
Here's how the process works:
  1. We pull an image from DockerHub.
  2. The image is then re-tagged with the ECR repository URI, giving it a reference that identifies it and allows it to be pulled from our registry in the future.
  3. Once tagged, we push the image to our AWS ECR repository.
  4. In the ECR repository, the image is scanned for vulnerabilities, ensuring the security of our operations.
This innovative process allows teams to rapidly iterate and build secure, containerised solutions without experiencing downtime due to DockerHub's rate limiting.
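To make those four steps concrete, here is a minimal Python sketch of the same flow using boto3 and the Docker CLI. In the actual solution the equivalent commands run inside CodeBuild; the image name, repository name, and region below are illustrative assumptions rather than values from the pipeline.

    import base64
    import subprocess

    import boto3

    # Illustrative placeholders -- not values from the original pipeline.
    SOURCE_IMAGE = "amazonlinux:2023"   # public DockerHub image
    ECR_REPOSITORY = "amazonlinux"      # ECR repository the pipeline pushes to
    REGION = "us-east-1"

    ecr = boto3.client("ecr", region_name=REGION)

    # Authenticate the Docker CLI against the private ECR registry.
    auth = ecr.get_authorization_token()["authorizationData"][0]
    username, password = base64.b64decode(auth["authorizationToken"]).decode().split(":", 1)
    registry = auth["proxyEndpoint"].removeprefix("https://")
    subprocess.run(
        ["docker", "login", "--username", username, "--password-stdin", registry],
        input=password.encode(),
        check=True,
    )

    # Step 1: pull the public image from DockerHub.
    subprocess.run(["docker", "pull", SOURCE_IMAGE], check=True)

    # Step 2: tag it with the ECR repository URI so it can be identified and pulled later.
    target = f"{registry}/{ECR_REPOSITORY}:latest"
    subprocess.run(["docker", "tag", SOURCE_IMAGE, target], check=True)

    # Step 3: push the tagged image to ECR.
    subprocess.run(["docker", "push", target], check=True)

    # Step 4: with scan-on-push enabled, ECR scans the image automatically on arrival.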

Customer experience improvement

Our customers have reaped significant benefits from this solution. These include:
  • Rapid build/test iterations against public DockerHub images, no longer throttled by rate limits.
  • Reliable, highly-available, and regional AWS-native container image build pipelines via Infrastructure as Code (IaC).
  • Reusable IaC templates for continuous adaptation using a flexible design pattern.
  • Enhanced DevSecOps shift-left capability by leveraging the native ECR 'scan-on-push' feature.

Solution details

Our solution sets up a fully functional Docker image ECR pipeline. The pipeline pulls the Amazon Linux base image from DockerHub, pushes it to ECR, and scans each newly pushed image, so every image we use has been checked for known vulnerabilities and stays up to date.
The resources created as part of this solution include:
  • An ECR Repository with Docker image scanning enabled by default. This allows us to constantly monitor and maintain the security of our images.
  • A CodeBuild Project, which provides a versatile environment for building and testing our applications.
  • An S3 Bucket for CodePipeline artefacts, ensuring a secure and reliable storage solution for our pipeline's output.
  • IAM Roles for related services, allowing us to manage permissions and securely access the necessary AWS services involved in the pipeline process.
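These resources are provisioned by the CloudFormation template itself. Purely as a hedged illustration of the ECR side of that configuration (the repository name, region, and tag are placeholder assumptions), the scan-on-push setting and a post-push findings check map to the following API calls:

    import boto3

    ecr = boto3.client("ecr", region_name="us-east-1")  # placeholder region

    # The repository is created with image scanning enabled by default; at the API
    # level that corresponds to scanOnPush=True.
    ecr.create_repository(
        repositoryName="amazonlinux",  # placeholder repository name
        imageScanningConfiguration={"scanOnPush": True},
    )

    # After each push the scan runs automatically; the findings can be read back,
    # for example to fail a build on high-severity vulnerabilities.
    scan = ecr.describe_image_scan_findings(
        repositoryName="amazonlinux",
        imageId={"imageTag": "latest"},
    )
    print(scan["imageScanFindings"].get("findingSeverityCounts", {}))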

Deployment instructions

Note: create a Personal Access Token for your GitHub account for the AWS webhook.
This application is deployed using AWS CloudFormation.
CloudFormation parameters (required):
  • GitHubRepo
  • GitHubBranch
  • GitHubToken (do not commit this value)
  • GitHubUser
  • RepositoryName (ECR repository name to be created)
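The stack can be launched however you normally deploy CloudFormation, whether through the console, the CLI, or an SDK. As a rough sketch using boto3, assuming a local copy of the template saved as ecr-pipeline.yaml and placeholder parameter values (the GitHub token is read from an environment variable rather than committed):

    import os

    import boto3

    cfn = boto3.client("cloudformation", region_name="us-east-1")  # placeholder region

    with open("ecr-pipeline.yaml") as f:  # assumed local filename for the template
        template_body = f.read()

    cfn.create_stack(
        StackName="dockerhub-ecr-pipeline",  # placeholder stack name
        TemplateBody=template_body,
        Parameters=[
            {"ParameterKey": "GitHubRepo", "ParameterValue": "my-repo"},
            {"ParameterKey": "GitHubBranch", "ParameterValue": "main"},
            # Keep the token out of source control; here it is read from the environment.
            {"ParameterKey": "GitHubToken", "ParameterValue": os.environ["GITHUB_TOKEN"]},
            {"ParameterKey": "GitHubUser", "ParameterValue": "my-github-user"},
            {"ParameterKey": "RepositoryName", "ParameterValue": "amazonlinux"},
        ],
        Capabilities=["CAPABILITY_IAM"],  # the template creates IAM roles
    )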

Don't let technical hurdles slow you down. Contact us today and let's discuss how our AWS solutions can help you maintain your rapid iteration cycles and improve your DevOps efficiency.


Written by
Harold (Tre) King
Senior DevOps Consultant
Tre is passionately focused on AWS, security, and automation. He excels in solving problems and building innovative solutions, always driven by a commitment to excellence and efficiency.
DevOps