Docker Containerization and Compose

Unleashing the Power of Docker and Docker Compose for Seamless Application Management

In today's tech-driven environment, effective software deployment and management are critical. Containerization has emerged as a game-changing technology, reducing the time it takes to package and deliver applications. Docker, together with tools like Docker Compose, has taken the lead in transforming the way we write and deliver software. In this blog article, we'll look at containerization and how Docker and Docker Compose work together to ease application deployment and orchestration.

Introduction

Docker transforms software development by providing an efficient, scalable, and portable containerization platform, complemented by tools such as Docker Compose for managing multi-container applications.

What is Docker?

Docker is an open-source technology that allows developers to automate application deployment in lightweight, portable containers. These containers bundle a program and all of its dependencies into a single package, guaranteeing consistent behavior across several contexts. Docker containers are independent of the host system and may operate on any system that supports Docker, whether it's a developer's laptop, a test server, or a production cluster.
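
If Docker is installed, you can verify that everything works by running the small test image Docker provides. This is just a standard sanity check, not part of the example that follows:

docker run hello-world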

Containerization

Containerization is the process of combining a program and its dependencies into a single unit known as a container. Containers are isolated, lightweight, and portable, allowing programs to execute reliably across several environments.

Understanding Containerization

Containerization is a lightweight form of virtualization that allows you to package an application and its dependencies into a single unit known as a container. Containers are isolated from the host system and ensure consistency across different environments, making it easier to develop and deploy applications.

Containerization, as provided by Docker, is transforming the way we design, distribute, and execute software. Here's why it's so revolutionary:

  • Isolation: Containers encapsulate a program and its dependencies, so it behaves consistently regardless of the underlying infrastructure.

  • Portability: Docker containers can be moved across diverse environments, making local development and deployment easier.

  • Efficiency: Because containers share the host OS kernel, resource overhead is low and startup times are fast.

  • Version Control: Docker images are versioned, which makes it simple to track changes and roll back to earlier versions if necessary.

Key Technology

The most well-known containerization platform is Docker. Docker containers provide everything required to execute a program, including code, runtime, system libraries, and configuration.
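
For instance, you can run Python code without having Python installed on the host, because the public python:3.8 image ships the runtime and its libraries. A quick illustration (the --rm flag removes the container once the command finishes):

docker run --rm python:3.8 python -c "print('Hello from inside a container')"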

Example 1: Containerizing a web application using Docker

Suppose a web application requires a specific Python version and a PostgreSQL database. With Docker, you build a container image that bundles the app and its dependencies, ensuring it behaves the same on any machine, regardless of the host environment.

Step 1: Create the Web Application

We'll build a small Python web application using the Flask framework that connects to a PostgreSQL database and displays a list of items on a web page. The project has the following layout:

mywebapp/
   ├── app.py
   ├── requirements.txt
   └── Dockerfile
  • app.py - Web application Python code.

  • requirements.txt - Python dependencies for the application.

  • Dockerfile - A set of instructions for creating a Docker image.

Step 2: Create the Python app (app.py)

from flask import Flask
import psycopg2

app = Flask(__name__)             

# Connect to PostgreSQL database
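# Note: the hostname "db" assumes PostgreSQL is reachable under that name,
# e.g. a container on the same Docker network or a Compose service named "db"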
conn = psycopg2.connect(
    database="mydb",
    user="myuser",
    password="mypassword",
    host="db",
    port="5432"
)

@app.route('/')
def index():
    cursor = conn.cursor()
    cursor.execute("SELECT * FROM items")
    items = cursor.fetchall()
    return '<h1>Items:</h1>' + '<br>'.join([item[1] for item in items])

if __name__ == '__main__':
    app.run(host='0.0.0.0')

Step 3: Create the requirements.txt file

List the Python packages your application depends on in the requirements.txt file. In our case, it contains Flask and the psycopg2-binary PostgreSQL driver:

Flask==2.1.1
psycopg2-binary==2.9.1

Step 4: Create a Dockerfile

A Dockerfile specifies how to build a Docker image for your application. Here's a basic Dockerfile for our web application:

# Use the official Python image as the base image
FROM python:3.8

# Set the working directory in the container
WORKDIR /app

# Copy the application code and requirements file into the container
COPY app.py .
COPY requirements.txt .

# Install Python dependencies
RUN pip install -r requirements.txt

# Expose port 5000 for the Flask app
EXPOSE 5000

# Define the command to run when the container starts
CMD ["python", "app.py"]

Step 5: Create a Docker Image

To build the Docker image, go to the directory containing your Dockerfile and execute the following command:

docker build -t mywebapp .

Docker will create an image called 'mywebapp' with the current directory ('.') as the build context.
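
To confirm the image exists locally, you can list it (the exact output columns may vary slightly between Docker versions):

docker images mywebapp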

Step 6: Start the Docker Container

Once the image has been built, you can run a container from it:

docker run -d -p 5000:5000 --name mywebapp-container mywebapp

This command launches a container named 'mywebapp-container' from the 'mywebapp' image, mapping port 5000 in the container to port 5000 on the host system ('-p 5000:5000').
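
You can check that the container is up and inspect its output with standard Docker commands; if PostgreSQL is not reachable yet, the logs will show a connection error:

# List running containers matching the name
docker ps --filter name=mywebapp-container

# View the container's logs
docker logs mywebapp-container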

Your web application is now running in a Docker container with its own Python environment. Note that this Dockerfile only containerizes the Flask app; PostgreSQL must run separately and be reachable under the hostname 'db' (the Docker Compose example below handles this for you). Once the database is available, you can reach the app at 'http://localhost:5000' in a web browser.
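
If you want to try this example without Docker Compose, one option is to run PostgreSQL in its own container and attach both containers to a shared user-defined network so the hostname 'db' resolves. This is a minimal sketch; the network name 'mynet' is just an illustrative choice, and the app still expects an 'items' table to exist in the database:

# Create a user-defined bridge network so containers can reach each other by name
docker network create mynet

# Start PostgreSQL with the credentials the app expects, named "db" so the hostname resolves
docker run -d --name db --network mynet -e POSTGRES_USER=myuser -e POSTGRES_PASSWORD=mypassword -e POSTGRES_DB=mydb postgres:latest

# Re-run the web app on the same network (remove the earlier container first with: docker rm -f mywebapp-container)
docker run -d -p 5000:5000 --name mywebapp-container --network mynet mywebapp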

Docker Compose

Docker Compose is a Docker utility that streamlines the management of multi-container applications. You describe all of your application stack's services, networks, and volumes in a single YAML file, and Compose creates and starts them together.

💡
Docker Compose is especially beneficial in development and testing settings where complicated applications with several components must be swiftly spun up.

Example 2: Orchestrating a web application with a React front-end, a Node.js back-end API, and a PostgreSQL database

Assume you're working on a web application that includes a React front-end, a Node.js back-end API, and a PostgreSQL database. You may specify these components in a 'docker-compose.yml' file with Docker Compose. When you execute 'docker-compose up,' it creates and manages containers for each component of your application, ensuring that they can interact with one another.

Step 1: Create a directory

mywebapp/
   ├── frontend/
   │     ├── Dockerfile
   │     └── (React frontend files)
   ├── backend/
   │     ├── Dockerfile
   │     └── (Node.js backend files)
   ├── docker-compose.yml
   └── database/
         └── (PostgreSQL data files)
  • The React front-end files are located in the 'frontend' directory.

  • The Node.js backend files are located in the 'backend' directory.

  • PostgreSQL's data files will be stored in the 'database' directory.

  • Each 'Dockerfile' provides instructions for building the Docker image for its respective service (front-end and back-end).

Step 2: Create Dockerfiles

Dockerfile for the React frontend

# Use the official Node.js image as the base image
FROM node:14

# Set the working directory in the container
WORKDIR /app/frontend

# Copy the frontend files into the container
COPY ./frontend .

# Install dependencies and build the React app
RUN npm install
RUN npm run build

# Install a lightweight static file server to serve the production build
RUN npm install -g serve

# Expose port 80 for the React app
EXPOSE 80

# Command to serve the built React app on port 80
CMD ["serve", "-s", "build", "-l", "80"]

Dockerfile for the Node.js backend

# Use the official Node.js image as the base image
FROM node:14

# Set the working directory in the container
WORKDIR /app/backend

# Copy the backend files into the container
COPY ./backend .

# Install dependencies for the backend
RUN npm install

# Expose port 3000 for the Node.js app
EXPOSE 3000

# Command to start the Node.js app
CMD ["npm", "start"]

Step 3: Create the docker-compose.yml file

To specify the services and their configurations, create a docker-compose.yml file at the root of your project:

version: '3'
services:
  frontend:
    build:
      context: .
      dockerfile: frontend/Dockerfile
    ports:
      - "80:80"

  backend:
    build:
      context: .
      dockerfile: backend/Dockerfile
    ports:
      - "3000:3000"
    depends_on:
      - database

  database:
    image: postgres:latest
    environment:
      POSTGRES_USER: myuser
      POSTGRES_PASSWORD: mypassword
      POSTGRES_DB: mydb
    volumes:
      # Persist PostgreSQL data in the project's database/ directory
      - ./database:/var/lib/postgresql/data

  • The file defines three services: frontend, backend, and database.

  • The Dockerfile for each service is specified in the build section.

  • External access is granted through port 80 for the React app and port 3000 for the Node.js app.

  • The depends_on option ensures that the backend container starts only after the database container has been started (this controls start order, not whether the database is ready to accept connections).

Step 4: Execute Docker Compose

Navigate to the directory where your docker-compose.yml file is located and run the following command:

docker-compose up

Docker Compose will build the Docker images for your front-end and back-end services, create containers from them, and start them. In addition, it will start a PostgreSQL container for your database.

Your React front-end, Node.js back-end, and PostgreSQL database are now operating in different containers, and Docker Compose guarantees that they can interact as configured in the docker-compose.yml file.

Docker Compose simplifies managing complex multi-container systems: by describing the services and their dependencies in a single configuration file, you can spin the whole stack, including the PostgreSQL database, up and down during development and testing.

Your React front-end can be found at http://localhost, and your Node.js back-end can be found at http://localhost:3000.
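
A few standard Docker Compose commands you'll likely use while iterating on the stack:

# Run the stack in the background
docker-compose up -d

# Check the status of the services
docker-compose ps

# Follow the logs of a single service (e.g. the backend)
docker-compose logs -f backend

# Stop and remove the containers and the default network
docker-compose down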

Summary
Containerization is a technology for packaging, distributing, and running applications in isolated environments, with Docker being a key platform. Docker Compose is a Docker tool for managing multi-container applications, simplifying the setup process by defining essential services and dependencies in a single configuration file.

Stay tuned for the upcoming articles in the series, where we'll discuss more interesting topics related to Docker. Subscribe to our channel to ensure you don't miss any part of this enlightening journey!

Thank you for reading our blog. Our top priority is your success and satisfaction. We are ready to assist with any questions or additional help.

Warm regards,

Kamilla Preeti Samuel,

Content Editor

ByteScrum Technologies Private Limited! 🙏
