

How to Use Docker
By Ron Gidron

Docker is on a roll. In the last few years, this container management service has become immensely popular in development, especially given the great fit with agile-based projects and continuous delivery. In this article, I want to take a brief look at how you can use Docker to accelerate and streamline the software development lifecycle (SDLC) process.

First, however, a brief introduction. The core idea of Docker is that developers ship applications inside 'software containers' which can then be deployed and run anywhere. Let's imagine you develop an application on your laptop, where it works perfectly. Then you push it into a test or production environment; you've chosen the right stack, the right language and the right version. But it doesn't work. Why? Because it's not the same environment.
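As a sketch of what "packaged inside a container" means in practice, an application might be described by a Dockerfile like the one below (the base image, file names and port are illustrative, not taken from the article):

```dockerfile
# Hypothetical Dockerfile for a small Python web app
FROM python:3.11-slim

WORKDIR /app

# Install the exact dependency versions the app was developed against,
# so the container behaves the same on a laptop, in test and in production
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

EXPOSE 8000
CMD ["python", "app.py"]
```

Because the library versions are pinned inside the image, the "works on my laptop, breaks in production" mismatch described above cannot occur: test and production run the same image.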

Maybe you used a new version of a library, but the ops guy tells you that you can't use this library because all the other applications running on the server will break. So there's a lot of back and forth between the ops team and your developers. It delays projects, costs money and is frustrating for everyone involved.

When you develop with Docker, everything is packaged inside a container, or inside several containers that talk to each other. Docker completely isolates your piece of software from any external dependencies; a container is self-sufficient. You simply push the container to another environment; its contents and how it was developed are transparent to the ops team. So how is a 'software container' like this any different from a standalone computer? Or a virtual machine, for that matter?
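A minimal sketch of "several containers that talk to each other" is a Docker Compose file (the service names, images and credentials here are hypothetical examples, not from the article):

```yaml
# docker-compose.yml - a web container and a database container
# that reach each other by service name over Compose's default network
services:
  web:
    build: .            # built from the Dockerfile in this directory
    ports:
      - "8000:8000"
    environment:
      DATABASE_URL: postgres://app:secret@db:5432/app
    depends_on:
      - db
  db:
    image: postgres:16
    environment:
      POSTGRES_USER: app
      POSTGRES_PASSWORD: secret
      POSTGRES_DB: app
```

The whole two-container system starts with a single `docker compose up`, which is what makes the package easy to hand over to ops unchanged.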

Containers are lightweight because they don't need the extra load of a hypervisor, but instead run directly on the host machine's kernel. This means you can run more containers on a given hardware configuration than if you were using virtual machines. You can even run Docker containers on host machines that are themselves virtual machines. All of this makes containers - and Docker - ideal for continuous integration and continuous delivery workflows.
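In a continuous-delivery pipeline, that typically reduces to a short, repeatable sequence of Docker commands; this is a sketch in which the registry, image name, tag and test runner are all illustrative assumptions:

```shell
# Build an immutable, versioned image from the committed code
docker build -t registry.example.com/myapp:1.4.2 .

# Run the test suite inside the freshly built image,
# so tests exercise exactly what will be deployed
docker run --rm registry.example.com/myapp:1.4.2 python -m pytest

# If tests pass, publish the image so every later stage
# (staging, production) runs the exact same bits
docker push registry.example.com/myapp:1.4.2
```

Because the artifact that was tested is the artifact that ships, each pipeline stage adds confidence rather than introducing new variables.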

Docker itself uses a client-server architecture. The client talks to the daemon, which does the heavy lifting of building, running and distributing the containers. The Docker client and daemon can run on the same system, or you can connect the client to a remote daemon. The Docker client and daemon communicate using a REST API, over UNIX sockets or a network interface.
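The client-server split is visible from the command line; in this sketch the remote host name, port and API version are assumptions for illustration:

```shell
# The docker CLI is only a client; the daemon does the heavy lifting.
# Point the client at a remote daemon instead of the local one:
docker -H tcp://build-host.example.com:2376 ps

# Or speak to the local daemon's REST API directly over its UNIX socket;
# 'docker ps' is roughly this call under the hood (API version may vary):
curl --unix-socket /var/run/docker.sock http://localhost/v1.41/containers/json
```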

Orchestrating the Management and Deployment of Docker Containers
Despite all its advantages, Docker still requires a platform to be managed: someone has to build, run, assign and stop containers, as well as handle many additional administrative tasks. Docker itself also needs to be aware of the other environments it interacts with, including the infrastructure layer and the traditional software services not currently running within the container landscape. Time developers spend on tasks like these is overhead that can result in missed deadlines, while time ops spend on them creates pipeline bottlenecks right before production deployments, which can in turn lead to production failures.

There is a solution that enables you to orchestrate all the moving parts - people, process and technology - involved in managing and deploying Docker containers: release automation. Automic's built-in container blueprint provisioning, together with its Docker action pack, allows you to blueprint entire Docker systems, build visual workflows, and automate container builds, maintenance, provisioning, configuration and most administration tasks. This not only increases productivity among developers and administrators, it also lowers the risk of errors. One example might be automatically ensuring that the underlying infrastructure has enough capacity to support the projected container workloads, or that certain container versions are always rolled out with matching external service package versions.

The Docker environment blueprint provisioning capability and action pack from Automic combine an integrated application packaging system, smart deployment models, and out-of-the-box actions for common deployment tasks with robust workflow design and high-volume execution capability. The Automic Docker package allows users to build, provision, configure and manage Docker containers as part of an automated application deployment process.

Ultimately, this accelerates deployments to Docker containers, ensures the quality of container deployments and minimizes management overhead to help both development and operations grow the business.


More Stories By Automic Blog

Automic, a leader in business automation, helps enterprises drive competitive advantage by automating their IT factory - from on-premise to the Cloud, Big Data and the Internet of Things.

With offices across North America, Europe and Asia-Pacific, Automic powers over 2,600 customers including Bosch, PSA, BT, Carphone Warehouse, Deutsche Post, Societe Generale, TUI and Swisscom. The company is privately held by EQT. More information can be found at www.automic.com.
