Continuous Delivery Using Docker Training Course
Continuous Delivery Using Docker is a two-day practical workshop, especially relevant to developers who want to foster a DevOps culture at their workplace. Delegates are taken from the basics of containers through to clustered infrastructures supporting continuous deployment, horizontal scalability and zero downtime.
Continuous Delivery Using Docker – Course Overview
Delegates build their own solutions on AWS during the Continuous Delivery Using Docker training course. They learn the principles underpinning containers before building and monitoring their own continuous delivery pipeline. Expert tuition and guidance are provided at every step, so delegates can set up and deploy to their own remotely hosted container cluster. Some experience with a Unix shell is helpful but not essential, and no specific language experience is required.
Continuous Delivery Using Docker – Course Content
- An introduction to Docker and the Docker CLI.
- Build, run and publish Docker images.
- Deploy containers to a remote server.
- Set up a continuous delivery pipeline, in which delegates:
- Automate build and deployment as part of a continuous integration process using TeamCity;
- Establish health checks for services using Nagios;
- Implement a service discovery solution using tools such as Consul and Nginx.
- Set up a cluster of Docker daemons using Docker Swarm.
- Explore techniques and strategies for achieving zero downtime.
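The build, run and publish steps above can be sketched with the Docker CLI. This is a minimal illustration, not course material: the image name, tag and ports below are hypothetical placeholders.

```shell
# Build an image from the Dockerfile in the current directory
# (the tag "example/web-app" is a hypothetical placeholder).
docker build -t example/web-app:1.0 .

# Run the image locally, mapping container port 8080 to the host.
docker run -d -p 8080:8080 --name web-app example/web-app:1.0

# Publish the image to a registry so a remote server can pull it.
docker push example/web-app:1.0

# On the remote server (e.g. an AWS-hosted machine), pull and run it.
docker pull example/web-app:1.0
docker run -d -p 80:8080 example/web-app:1.0
```

In the course, the build and push steps are automated by the continuous integration server rather than run by hand.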
Continuous Delivery Using Docker – Prerequisites
- Bring their own laptops.
- Be able to log in over SSH to machines created in AWS.
- Install Git beforehand (delegates will clone sample code from our GitHub repository).
- Be proficient with their text editor of choice.
What is Docker?
Docker is a tool designed to make it easier to create, deploy, and run applications by using containers. A developer packages everything an application needs, such as libraries and other dependencies, into a container and ships it as a single unit. Thanks to the container, the developer can be confident that the application will run on any other machine, regardless of customised settings on that machine that differ from the one used for writing and testing the code.
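That packaging idea is expressed in a Dockerfile, which declares the application's runtime and dependencies so the resulting image behaves the same on any machine with Docker installed. The sketch below assumes a hypothetical Python application; the file names and base image are illustrative, not prescribed by the course.

```dockerfile
# Start from an official base image (a hypothetical choice of runtime).
FROM python:3.12-slim

WORKDIR /app

# Package the dependencies together with the application itself.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

# The container starts the app the same way on every host.
CMD ["python", "app.py"]
```

Because the image bundles the runtime and libraries, the host machine's own configuration no longer affects how the application runs.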