Integrating AWS and Docker Cloud 1.0 ~ A Multicloud Application Delivery Service

Docker’s acquisition of Tutum, a cross-cloud container management service, has really paid off. Two weeks back, Docker announced “Docker Cloud 1.0” – a new service by Docker that implements all the features previously offered by Tutum, plus integration with the Docker Hub Registry service and the common Docker ID credentials. It is basically a SaaS platform that allows you to build, deploy and manage Docker containers in a variety of clouds. Docker Cloud is where developers and IT ops meet to build, ship and run any application, anywhere. Docker Cloud enables you to:

– Deploy and scale any application to your cloud in seconds, with just a few clicks
– Continuously deliver your code with integrated and automated build, test, and deployment workflows
– Get visibility into all your containers across your entire infrastructure
– Access the service programmatically via a RESTful API or a developer-friendly CLI tool

Docker Cloud provides a single toolset for working with containers on multiple clouds. Docker Cloud currently offers:

– an HTTP REST API and
– a WebSocket Stream API

Docker Cloud REST API:

The Docker Cloud REST API is reachable at the following endpoint: https://cloud.docker.com/
All requests should be sent to this endpoint using Basic authentication, with your Docker ID as the username and your API key as the password, as shown below:

GET /api/app/v1/service/ HTTP/1.1
Host: cloud.docker.com
Authorization: Basic dXNlcm5hbWU6YXBpa2V5
Accept: application/json
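
If you prefer to hit this endpoint programmatically, here is a minimal sketch in Python using the requests library. The username/apikey values are placeholders, and the “objects” key in the response body is an assumption to verify against the API docs:

import requests

USERNAME = "username"   # your Docker ID (placeholder)
APIKEY = "apikey"       # API key from your Docker Cloud account settings (placeholder)

# Same request as the raw HTTP example above: Basic auth with the API key as password
resp = requests.get(
    "https://cloud.docker.com/api/app/v1/service/",
    auth=(USERNAME, APIKEY),
    headers={"Accept": "application/json"},
)
resp.raise_for_status()

# Assumption: list endpoints return paginated results under an "objects" key
for service in resp.json().get("objects", []):
    print(service.get("name"), "->", service.get("state"))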

Docker Cloud WebSocket Stream API:

The Docker Cloud Stream API is reachable at the following endpoint: wss://ws.cloud.docker.com/
The Stream API uses the same authentication mechanism as the REST API, as shown:

GET /api/audit/v1/events HTTP/1.1
Host: ws.cloud.docker.com
Authorization: Basic dXNlcm5hbWU6YXBpa2V5
Connection: Upgrade
Upgrade: websocket
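
Here is a minimal sketch of consuming the audit-event stream from Python with the websocket-client package. The Basic credentials are built exactly as in the HTTP example above; the username/apikey values are placeholders:

import base64
import websocket  # pip install websocket-client

USERNAME = "username"  # your Docker ID (placeholder)
APIKEY = "apikey"      # your API key (placeholder)

# Build the same Basic credentials used by the REST API
token = base64.b64encode(f"{USERNAME}:{APIKEY}".encode()).decode()

ws = websocket.create_connection(
    "wss://ws.cloud.docker.com/api/audit/v1/events",
    header=[f"Authorization: Basic {token}"],
)
try:
    while True:
        print(ws.recv())  # each frame should be a JSON-encoded audit event
finally:
    ws.close()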

Please refer to https://docs.docker.com/apidocs/docker-cloud/#actions to read more about the Docker Cloud APIs, API roles and authentication details.

My first experience with Docker Cloud was full of excitement. As I work in an Enterprise Solutions Group, I firmly believe that enterprises rarely deal with a single cloud at a time, and delivering applications across multiple clouds has been the domain of several specialized tools. Docker Cloud is compelling precisely because it enables moving applications between clouds – and believe me, it’s a matter of a few clicks.

This article talks about how to get started with Docker Cloud. I will show how to link Docker Cloud to Amazon Web Services and deploy an application there; Docker Cloud also supports Bring Your Own Node (“BYON”). This is a step-by-step guide to ease your understanding and deployment.

  1. Log in to https://cloud.docker.com

Image-1

2. Once you log in to the Docker Cloud window, you are welcomed with 5 major steps:

  • Linking to your Cloud Provider
  • Deploying a Node
  • Building a Service
  • Creating a Stack (stackfiles.io, from Tutum)
  • Managing Repositories.

Image-2

3. Let’s follow each section one by one. The first section helps you link to your favorite cloud provider.

Image-3

As I already have an Amazon AWS account, I am going to choose AWS and click on Credentials. This option registers your AWS account credentials in your Docker Cloud account so that you can deploy node clusters and nodes using Docker Cloud’s dashboard, API or CLI. Under this section, we will also see that AWS security credentials are required so that Docker Cloud can interact with AWS on your behalf to create and manage your nodes (EC2 instances).

Image-4

4. Click on “Add Credentials” and a new window will open asking for your AWS credentials.

Image-5

To get an Access Key ID, one has to go back to the AWS account and create a service user in AWS IAM.

Let’s create a new service user called dockercloud-user in AWS IAM. To access the IAM panel in AWS, go to https://console.aws.amazon.com/iam/#users

Image-6

If you go back to the Docker Cloud window and try to supply the Access Key ID for Amazon, it will still throw a warning about wrong credentials. There is still one step left to get it working. Before Docker Cloud can use the new user you just created, you need to give it privileges so that it can provision EC2 resources on your behalf. Go to the AWS IAM panel and click Policies > Create Policy: https://console.aws.amazon.com/iam/#policies as shown below:

Image-7

Image-8

Click on “Create Your Own Policy”.

Image-10

I want to limit Docker Cloud to a specific EC2 region, hence I am going to use the following policy instead, changing the example us-west-2 (US West) to my desired region:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": [
        "ec2:*",
        "iam:ListInstanceProfiles"
      ],
      "Effect": "Allow",
      "Resource": "*",
      "Condition": {
        "StringEquals": {
          "ec2:Region": "us-west-2"
        }
      }
    }
  ]
}

Click on Validate Policy. Once clicked, it displays the validation details.

Image-11

Click on Create Policy.

Image-13

Click on Users.

Image-14

Time to attach this policy to the dockercloud-user.

Image-17

Image-18

Once you create the new dockercloud-user service user, have its credentials, and set the custom policy that allows Docker Cloud to use it, go back to Docker Cloud to add the service user’s credentials.

Image-19
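
For completeness, the whole IAM setup above (service user, access key, custom policy, attachment) can also be scripted instead of clicked through. Here is a minimal sketch with boto3; the policy name dockercloud-policy is a hypothetical label, and the policy document is the one shown earlier:

import json
import boto3

iam = boto3.client("iam")

policy_document = {
    "Version": "2012-10-17",
    "Statement": [{
        "Action": ["ec2:*", "iam:ListInstanceProfiles"],
        "Effect": "Allow",
        "Resource": "*",
        "Condition": {"StringEquals": {"ec2:Region": "us-west-2"}},
    }],
}

# Create the service user and an access key for Docker Cloud to use
iam.create_user(UserName="dockercloud-user")
key = iam.create_access_key(UserName="dockercloud-user")["AccessKey"]
print("Access Key ID:", key["AccessKeyId"])
print("Secret Access Key:", key["SecretAccessKey"])

# Create the custom policy and attach it to the user
policy = iam.create_policy(
    PolicyName="dockercloud-policy",  # hypothetical name
    PolicyDocument=json.dumps(policy_document),
)
iam.attach_user_policy(
    UserName="dockercloud-user",
    PolicyArn=policy["Policy"]["Arn"],
)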

We are ready to deploy our first node.

Image-20

Let’s create a node in zone us-west-2 as per the policy: t1.micro, 15 GB disk, 1 CPU, 1 GB RAM, as shown:

Image-21
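
The same node deployment can be driven through the REST API instead of the dashboard. A hedged sketch follows; I have not verified the exact nodecluster endpoint or the region/node-type resource URI formats, so treat them as illustrative assumptions and check the API docs linked earlier:

import requests

AUTH = ("username", "apikey")  # your Docker ID and API key (placeholders)

# Assumption: node clusters are created via /api/infra/v1/nodecluster/,
# with region and node_type passed as resource URIs
payload = {
    "name": "my-first-node",
    "region": "/api/infra/v1/region/aws/us-west-2/",      # illustrative URI
    "node_type": "/api/infra/v1/nodetype/aws/t1.micro/",  # illustrative URI
    "target_num_nodes": 1,
}
resp = requests.post(
    "https://cloud.docker.com/api/infra/v1/nodecluster/",
    json=payload,
    auth=AUTH,
)
resp.raise_for_status()
print(resp.json())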

Deployment of the node might take around 5-10 minutes.

Image-22

As I have opted for the Amazon Free Tier, I need to restrict my node to us-west-2. I can alter the policy as shown:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": [
        "ec2:*",
        "iam:ListInstanceProfiles"
      ],
      "Effect": "Allow",
      "Resource": "*",
      "Condition": {
        "StringEquals": {
          "ec2:Region": "us-west-2"
        }
      }
    }
  ]
}

If you go to your AWS page, you will find that a new instance is being initialized:

Image-Special

Once the instance gets deployed, you will see the screen as shown below:

Image_Sp2

The Container Host gets deployed as shown below:

Image-24 Image-25

Our first node is complete. Let’s move to the Services section:

Image-A

I was interested in deploying my own repositories, so I chose My Repositories. Once selected, it automatically fetches all of my container images from Docker Hub.

Image-B

I want to test-drive the Nginx image and set up an Nginx server on the cloud (as shown below):

Image-C

This completes deploying a node with an Nginx container.
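
As with nodes, the Nginx service could also be created through the REST API rather than the UI. A sketch under the assumption that the service endpoint accepts an image name and port list as below; the field names follow my recollection of the Tutum-era API, so verify them against the API docs linked earlier:

import requests

AUTH = ("username", "apikey")  # placeholders

# Assumption: services are created via /api/app/v1/service/ with these fields
payload = {
    "name": "webserver",
    "image": "nginx:latest",
    "container_ports": [
        {"protocol": "tcp", "inner_port": 80, "published": True}
    ],
}
resp = requests.post(
    "https://cloud.docker.com/api/app/v1/service/",
    json=payload,
    auth=AUTH,
)
resp.raise_for_status()
service = resp.json()

# Assumption: the response includes a resource_uri, and a created service
# is started with a POST to <resource_uri>start/
requests.post(
    "https://cloud.docker.com" + service["resource_uri"] + "start/",
    auth=AUTH,
).raise_for_status()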

The next section is quite interesting – creating a stack.

A stack is a collection of services that make up an application in a specific environment. A stack file is a file in YAML format that defines one or more services, similar to a docker-compose.yml file but with a few extensions. The default name for this file is docker-cloud.yml.

To learn more about stacks and stack files, you can refer to https://docs.docker.com/docker-cloud/feature-reference/stacks/
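
As a small illustration, a docker-cloud.yml for a two-service stack might look like the following. This is a minimal sketch: the dockercloud/hello-world and dockercloud/haproxy images and the target_num_containers extension are assumptions drawn from typical stackfiles.io examples:

web:
  image: dockercloud/hello-world   # example image (assumption)
  target_num_containers: 2         # Docker Cloud extension: run two containers
lb:
  image: dockercloud/haproxy       # example load-balancer image (assumption)
  links:
    - web
  ports:
    - "80:80"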

Let’s see how to create a stack. Click on the next section, and the Create a Stack window gets displayed:

Image-E

The stackfiles.io site contains a number of pre-built stack files which you can import into this page. I picked up “quickstart python” for this example.

Image-F

As shown above, stackfiles.io brings an enormous opportunity for Docker Cloud to get applications quickly built and running.

Image-G

Multicloud application delivery is a booming market, and I believe Docker picked the right time to get into it. Integrating Docker Cloud with cloud service providers like Microsoft Azure, AWS, Packet, Digital Ocean, SoftLayer etc. is sure to gain momentum and make application migration easy.
