

Docker’s acquisition of Tutum, a cross-cloud container management service, has really paid off. Two weeks back, Docker announced “Docker Cloud 1.0” – a new service by Docker that implements all the features previously offered by Tutum, plus integration with the Docker Hub Registry service and the common Docker ID credentials. It is essentially a SaaS platform that allows you to build, deploy, and manage Docker containers across a variety of clouds. Docker Cloud is where developers and IT ops meet to build, ship, and run any application, anywhere. Docker Cloud enables you to:

– Deploy and scale any application to your cloud in seconds, with just a few clicks
– Continuously deliver your code with integrated and automated build, test, and deployment workflows
– Get visibility across all your containers across your entire infrastructure
– Access the service programmatically via a RESTful API or a developer friendly CLI tool

Docker Cloud provides a single toolset for working with containers on multiple clouds. Docker Cloud currently offers:

– a REST API
– a Websocket Stream API

Docker Cloud REST API:

The Docker Cloud REST API is reachable at the following hostname: https://cloud.docker.com/
All requests should be sent to this endpoint using Basic authentication, with your API key as the password, as shown below:

GET /api/app/v1/service/ HTTP/1.1
Host: cloud.docker.com
Authorization: Basic dXNlcm5hbWU6YXBpa2V5
Accept: application/json
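The Authorization header above is just the Base64 encoding of `username:apikey`. A minimal Python sketch of building it (the credentials here are the literal placeholders from the example, not real ones):

```python
import base64

# Placeholder credentials from the example above -- substitute your own.
username = "username"
api_key = "apikey"

# Docker Cloud uses HTTP Basic auth with the API key as the password:
# Base64-encode "username:apikey" and send it in the Authorization header.
token = base64.b64encode(f"{username}:{api_key}".encode("ascii")).decode("ascii")
print(f"Authorization: Basic {token}")
# -> Authorization: Basic dXNlcm5hbWU6YXBpa2V5
```

A real request would send this header with a GET to https://cloud.docker.com/api/app/v1/service/ along with Accept: application/json.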

Docker Websocket Stream API:

The Docker Cloud Stream API is reachable at the following hostname: wss://ws.cloud.docker.com/
The Stream API uses the same authentication mechanism as the REST API, as shown below:

GET /api/audit/v1/events HTTP/1.1
Host: ws.cloud.docker.com
Authorization: Basic dXNlcm5hbWU6YXBpa2V5
Connection: Upgrade
Upgrade: websocket
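For illustration, the raw upgrade request above can be composed in Python like this (a sketch only – a real WebSocket client would also add Sec-WebSocket-Key and related handshake headers; the credentials are placeholders):

```python
import base64

username, api_key = "username", "apikey"  # placeholders, not real credentials
token = base64.b64encode(f"{username}:{api_key}".encode("ascii")).decode("ascii")

# The same request shown above, as the raw bytes a client would send
# after opening a TLS connection to ws.cloud.docker.com.
request = (
    "GET /api/audit/v1/events HTTP/1.1\r\n"
    "Host: ws.cloud.docker.com\r\n"
    f"Authorization: Basic {token}\r\n"
    "Connection: Upgrade\r\n"
    "Upgrade: websocket\r\n"
    "\r\n"
)
print(request)
```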

Please refer to https://docs.docker.com/apidocs/docker-cloud/#actions to read more about Docker Cloud APIs, API roles, and authentication details.

My first experience with Docker Cloud was full of excitement. As I work in an Enterprise Solutions Group, I firmly believe that enterprises rarely deal with only a single cloud at a time, and delivering applications across multiple clouds has been the domain of several specialized tools. Docker Cloud enables moving an application between clouds, and believe me, it’s a matter of a few clicks.

This article talks about how to get started with Docker Cloud. I will show how to link Docker Cloud to Amazon Web Services and Bring Your Own Node (“BYON”), and deploy an application. This is a step-by-step guide to ease your understanding and deployment.

  1. Log in to https://cloud.docker.com


2. Once you log in to the Docker Cloud window, you are welcomed with 5 major steps:

  • Linking to your Cloud Provider
  • Deploying a Node
  • Building a Service
  • Creating a Stack (stackfiles.io from Tutum)
  • Managing Repositories.

3. Let’s follow each section one by one. The first section helps you link to your favorite cloud provider.


As I already have an Amazon AWS account, I am going to choose AWS and click on Credentials. This option registers your AWS account credentials in your Docker Cloud account so you can deploy node clusters and nodes using Docker Cloud’s dashboard, API, or CLI. Under this section, we will also see that AWS Security Credentials are required so that Docker Cloud can interact with AWS on your behalf to create and manage your nodes (EC2 instances).


4. Click on “Add Credentials” and a new window will open asking for your AWS credentials.


To get the Access Key ID, go back to your AWS account and create a user and service in AWS IAM.

Let’s create a new service user called dockercloud-user in AWS IAM. To access the IAM panel in AWS go to https://console.aws.amazon.com/iam/#users


If you go back to the Docker Cloud window and try to supply the Access Key ID for Amazon, it will still throw a warning about wrong credentials. There is still one step left to get it working. Before Docker Cloud can use the new user you just created, you need to give it privileges so it can provision EC2 resources on your behalf. Go to the AWS IAM panel and click Policies > Create Policy: https://console.aws.amazon.com/iam/#policies as shown below:

Click on “Create Your Own Policy”

I want to limit Docker Cloud to a specific EC2 region, hence I am going to use the following policy instead, changing out the example us-west-2 (US West) for my desired region:


{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": [
        "ec2:*",
        "iam:ListInstanceProfiles"
      ],
      "Effect": "Allow",
      "Resource": "*",
      "Condition": {
        "StringEquals": {
          "ec2:Region": "us-west-2"
        }
      }
    }
  ]
}
Click on Validate Policy. Once clicked, it displays the policy details.
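Before pasting the policy into AWS, you can also sanity-check locally that the JSON is well formed – a quick Python sketch (AWS’s own Validate Policy remains the authoritative check):

```python
import json

# The region-restricted policy from above, as a string.
policy = """
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": ["ec2:*", "iam:ListInstanceProfiles"],
      "Effect": "Allow",
      "Resource": "*",
      "Condition": {"StringEquals": {"ec2:Region": "us-west-2"}}
    }
  ]
}
"""

parsed = json.loads(policy)  # raises json.JSONDecodeError if malformed
print(parsed["Statement"][0]["Condition"]["StringEquals"]["ec2:Region"])
# -> us-west-2
```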

Click on Create Policy.

Click on Users.

Time to attach this policy to the dockercloud-user.

Once you create the new dockercloud-user service user, have its credentials, and set the custom policy that allows Docker Cloud to use it, go back to Docker Cloud to add the service user’s credentials.

We are ready to deploy our first node.

Let’s create a node in region us-west-2 as per the policy: t1.micro, 15 GB disk, 1 CPU, 1 GB RAM, as shown:

Deployment of the node might take around 5-10 minutes.

As I have opted for the Amazon Free Tier, I need to restrict my node to us-west-2. I can alter the policy as shown:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": [
        "ec2:*",
        "iam:ListInstanceProfiles"
      ],
      "Effect": "Allow",
      "Resource": "*",
      "Condition": {
        "StringEquals": {
          "ec2:Region": "us-west-2"
        }
      }
    }
  ]
}

If you go to your AWS page, you will find that a new instance is being initialized:

Once the instance gets deployed, you will see the screen as shown below:

The Container Host gets deployed as shown below:


Our first node is deployed. Let’s move to the Services section:

I was interested in deploying my own repositories, so I chose My Repositories. When selected, it automatically fetches all my container images from Docker Hub.

I want to test drive the Nginx image and set up an Nginx server on the cloud (as shown below):

This completes deploying a node with an Nginx container.

Next section is quite interesting – Creating a stack.

A stack is a collection of services that make up an application in a specific environment. A stack file is a YAML file that defines one or more services, similar to a docker-compose.yml file but with a few extensions. The default name for this file is docker-cloud.yml.

To learn more about Stackfiles.io, you can refer https://docs.docker.com/docker-cloud/feature-reference/stacks/
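As a minimal sketch, a docker-cloud.yml for a single-service stack might look like the following (the image name is a hypothetical placeholder, and I am assuming the common stack-file keys here):

```yaml
web:
  image: username/quickstart-python   # hypothetical Docker Hub image
  autorestart: always                 # restart the container if it stops
  target_num_containers: 2            # scale the service to two containers
  ports:
    - "80:80"
```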

Let’s see how to create a stack. Click on the next section, and the Create a Stack window gets displayed:

The stackfiles.io site contains a number of pre-built stack files which you can import into this page. I picked “quickstart python” for this example.

As shown above, stackfiles.io brings an enormous opportunity for Docker Cloud to get applications quickly built and running.

Multicloud application delivery is a booming market, and I believe Docker picked the right time to get into it. Integrating Docker Cloud with cloud service providers like Microsoft Azure, AWS, Packet, Digital Ocean, SoftLayer, etc. is sure to gain momentum and make application migration easy.


Ajeet Raina

My name is Ajeet Singh Raina and I am the author of this blogging site. I am a Docker Captain, ARM Innovator & Docker Bangalore Community Leader. I bagged 2 special awards last year (2019): firstly, “The Tip of Captain’s Hat Award” at DockerCon 2019, San Francisco, and secondly, the “2019 Docker Community Award”. I run the Collabnix Community Slack with over 5300+ members. I have built popular GitHub repositories like DockerLabs, KubeLabs, Kubetools, RedisPlanet, Terraform etc. with the support of the Collabnix Community. Currently working as Developer Relations Manager at Redis Labs, where I help customers and community members adopt Redis. With over 12,000+ followers on LinkedIn & close to 5100+ Twitter followers, I like sharing Docker and Kubernetes related content. You can follow me on Twitter (@ajeetsraina) & GitHub (@ajeetraina).

