Getting Started with Docker: Containerizing Your Applications

Categories: Development

Introduction

In the world of modern software development, Docker has emerged as a pivotal tool for containerization, enabling developers to package applications along with their dependencies into a standardized unit called a container. This ensures consistent environments from development to production, eliminating the classic "works on my machine" problem. This guide introduces Docker, explains the benefits of containerization, and provides step-by-step instructions for building and running your first containerized application.

Table of Contents

  1. Understanding Containers and Docker
  2. Prerequisites
  3. Installing Docker
  4. Creating a Simple Application
  5. Writing a Dockerfile
  6. Building the Docker Image
  7. Running the Docker Container
  8. Managing Docker Containers and Images
  9. Pushing Images to Docker Hub (Optional)
  10. Conclusion
  11. Additional Resources

1. Understanding Containers and Docker

Containers are lightweight, standalone, and executable packages of software that include everything needed to run an application: code, runtime, system tools, libraries, and settings. They share the host system's kernel but are isolated from other containers, providing a consistent environment across different stages of development and deployment.

Docker is an open-source platform that automates the deployment, scaling, and management of applications inside containers. It provides an ecosystem for container management, including tools for building, distributing, and running containers.

Benefits of Using Docker:

  • Consistency: Ensures the application runs the same in development, testing, and production environments.
  • Isolation: Containers run in isolation, preventing conflicts between applications and dependencies.
  • Efficiency: Containers are lightweight and use fewer resources than virtual machines.
  • Scalability: Simplifies scaling applications horizontally by adding more containers.
  • Portability: Containers can run on any system that supports Docker, regardless of underlying hardware or operating system.

2. Prerequisites

  • A computer running Windows 10/11 (64-bit), macOS, or a modern Linux distribution.
  • Administrative privileges to install software.
  • Basic knowledge of command-line interfaces.
  • An internet connection to download Docker and application dependencies.

3. Installing Docker

Docker provides different installation packages for various operating systems. Follow the instructions for your operating system below.

For Windows Users:

  1. Check System Requirements:
    • Windows 10 64-bit: Pro, Enterprise, or Education (Build 18362 or higher), or Windows 11.
    • Enable virtualization in your BIOS settings.
  2. Download Docker Desktop:
    • Download Docker Desktop for Windows from https://www.docker.com/products/docker-desktop/.
  3. Install Docker Desktop:
    • Run the downloaded installer (Docker Desktop Installer.exe).
    • Follow the installation wizard prompts:
      • Accept the license agreement.
      • Choose "Install required components for WSL 2" (recommended).
    • Click "Finish" when installation is complete.
  4. Start Docker Desktop:
    • Launch Docker Desktop from the Start menu.
    • Wait for Docker to start; you should see the Docker icon in the system tray.
  5. Verify Installation:
    • Open Command Prompt or PowerShell.
    • Run the command:
      docker --version

      You should see the Docker version displayed.

For macOS Users:

  1. Check System Requirements:
    • macOS 10.15 (Catalina) or newer.
    • At least 4GB of RAM.
  2. Download Docker Desktop:
    • Download Docker Desktop for Mac from https://www.docker.com/products/docker-desktop/.
  3. Install Docker Desktop:
    • Double-click the downloaded .dmg file.
    • Drag and drop the Docker icon into the Applications folder.
  4. Start Docker Desktop:
    • Launch Docker from the Applications folder.
    • You may be prompted to authorize Docker with your system password.
  5. Verify Installation:
    • Open Terminal.
    • Run the command:
      docker --version

      You should see the Docker version displayed.

For Linux Users:

Docker installation commands vary by distribution. Below are instructions for Ubuntu; for other distributions, refer to the official Docker documentation.

  1. Uninstall Old Versions (if any):
    sudo apt-get remove docker docker-engine docker.io containerd runc
  2. Update the Package Index:
    sudo apt-get update
  3. Install Required Packages:
    sudo apt-get install apt-transport-https ca-certificates curl gnupg-agent software-properties-common
  4. Add Docker’s Official GPG Key:
    curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -
  5. Set Up the Stable Repository:
    sudo add-apt-repository \
      "deb [arch=amd64] https://download.docker.com/linux/ubuntu \
      $(lsb_release -cs) \
      stable"
  6. Install Docker Engine:
    sudo apt-get update
    sudo apt-get install docker-ce docker-ce-cli containerd.io
  7. Verify Installation:
    sudo docker --version

    You should see the Docker version displayed.

  8. Optional - Manage Docker as a Non-Root User:
    sudo usermod -aG docker $USER

    Log out and log back in for the changes to take effect.
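
Regardless of operating system, a quick end-to-end check is to run Docker's official hello-world test image, which pulls a tiny image and starts a container that prints a confirmation message (on Linux, prefix the command with sudo unless you completed the optional non-root step above):

  docker run --rm hello-world

If you see "Hello from Docker!" in the output, your installation is working.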

4. Creating a Simple Application

Let's create a simple Node.js application that we'll containerize using Docker.

  1. Set Up the Project Directory:
    mkdir docker-tutorial
    cd docker-tutorial
  2. Initialize a Node.js Project:
    npm init -y

    This creates a package.json file with default settings.

  3. Create the Application File:
    touch app.js

    Open app.js in your preferred code editor and add the following code:

    const http = require('http');

    const hostname = '0.0.0.0';
    const port = 3000;

    const server = http.createServer((req, res) => {
      res.statusCode = 200;
      res.setHeader('Content-Type', 'text/plain');
      res.end('Hello, Docker!');
    });

    server.listen(port, hostname, () => {
      console.log(`Server running at http://${hostname}:${port}/`);
    });

    This simple Node.js application starts a web server that responds with "Hello, Docker!"

  4. Install Dependencies:
    npm install

    Since we're using built-in Node.js modules, there are no dependencies, but it's good practice to run this command.
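
Before containerizing the application, you can optionally sanity-check it on your own machine (this assumes Node.js is installed locally):

  node app.js

Then, from another terminal, request the page:

  curl http://localhost:3000/

You should see "Hello, Docker!" in the response. Stop the local server with Ctrl+C before moving on.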

5. Writing a Dockerfile

A Dockerfile is a text document that contains instructions for Docker to build an image.

  1. Create a Dockerfile:
    touch Dockerfile
  2. Edit the Dockerfile:

    Add the following content to the Dockerfile:

    FROM node:14-alpine

    # Create app directory
    WORKDIR /usr/src/app

    # Copy package.json and package-lock.json (if available)
    COPY package*.json ./

    # Install app dependencies
    RUN npm install

    # Bundle app source
    COPY . .

    # Expose port
    EXPOSE 3000

    # Define the command to run your app
    CMD [ "node", "app.js" ]

    Explanation:

    • FROM node:14-alpine: Uses the Node.js 14 Alpine Linux image as the base.
    • WORKDIR /usr/src/app: Sets the working directory inside the container.
    • COPY package*.json ./: Copies package.json and package-lock.json to the working directory.
    • RUN npm install: Installs dependencies.
    • COPY . .: Copies the rest of the application code to the working directory.
    • EXPOSE 3000: Documents that the container listens on port 3000.
    • CMD [ "node", "app.js" ]: Specifies the command to run when the container starts.
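
Optionally, you can also add a .dockerignore file next to the Dockerfile so that local artifacts such as node_modules are not copied into the image by the COPY . . instruction. A minimal example for this project:

  node_modules
  npm-debug.log
  .git

This keeps the build context small and ensures dependencies are installed fresh inside the container by RUN npm install.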


6. Building the Docker Image

Now that we have our application and Dockerfile, we can build a Docker image.

  1. Build the Image:
    docker build -t my-node-app .

    Explanation:

    • docker build: Command to build a Docker image.
    • -t my-node-app: Tags the image with the name my-node-app.
    • .: Specifies the build context (current directory).

    You should see Docker executing the steps defined in the Dockerfile.

  2. List Docker Images:
    docker images

    You should see my-node-app listed among the images.
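
If you're curious how each Dockerfile instruction became an image layer, you can also inspect the image's history (the exact output depends on your base image version):

  docker history my-node-app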

7. Running the Docker Container

With the image built, you can now run a container based on it.

  1. Run the Container:
    docker run -p 3000:3000 my-node-app

    Explanation:

    • docker run: Command to run a container.
    • -p 3000:3000: Maps port 3000 of the host to port 3000 of the container.
    • my-node-app: The image to run.

    You should see the output:

    Server running at http://0.0.0.0:3000/

  2. Test the Application:
    • Open a web browser or use curl to navigate to http://localhost:3000/.
    • You should see "Hello, Docker!" displayed.
  3. Stop the Container:

    Press Ctrl+C in the terminal where the container is running to stop it.

    Alternatively, you can run:

    docker ps

    to list running containers, and then stop the container using:

    docker stop [container_id]

  4. Run the Container in Detached Mode:
    docker run -d -p 3000:3000 my-node-app

    The -d flag runs the container in detached mode (in the background).
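
When running in detached mode, it can be convenient to give the container a name so you don't have to look up its ID later. A short sketch (the name my-node-container is just an illustration; stop any container already using port 3000 first):

  docker run -d --name my-node-container -p 3000:3000 my-node-app
  docker logs -f my-node-container
  docker stop my-node-container

docker logs -f streams the container's output, and docker stop shuts it down.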

8. Managing Docker Containers and Images

Listing Running Containers:

docker ps

Listing All Containers (including stopped ones):

docker ps -a

Stopping a Running Container:

docker stop [container_id]

Removing a Container:

docker rm [container_id]

Removing an Image:

docker rmi [image_id]

Pruning Unused Data:

docker system prune

This command removes all stopped containers, unused networks, dangling images, and build cache.
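
Putting these commands together, a typical cleanup after finishing this tutorial might look like the following (substitute the IDs reported by docker ps -a and docker images on your machine):

  docker ps -a
  docker stop [container_id]
  docker rm [container_id]
  docker rmi my-node-app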

9. Pushing Images to Docker Hub (Optional)

You can push your Docker images to Docker Hub to share them with others or deploy them to production environments.

  1. Create a Docker Hub Account:
    • Sign up for a free account at https://hub.docker.com/ if you don't already have one.
  2. Log in to Docker Hub from the Command Line:
    docker login

    Enter your Docker Hub username and password when prompted.

  3. Tag Your Image:
    docker tag my-node-app your_dockerhub_username/my-node-app:latest
  4. Push the Image:
    docker push your_dockerhub_username/my-node-app:latest

    Your image is now available on Docker Hub.
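
Once the image is on Docker Hub, any machine with Docker installed can pull and run it (assuming the repository is public, which is the default; your_dockerhub_username is a placeholder for your account name):

  docker pull your_dockerhub_username/my-node-app:latest
  docker run -d -p 3000:3000 your_dockerhub_username/my-node-app:latest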

10. Conclusion

Congratulations! You've successfully installed Docker, created a simple Node.js application, written a Dockerfile, built a Docker image, and run your application inside a Docker container. This foundational knowledge opens the door to exploring more advanced Docker features, such as docker-compose for multi-container applications, Docker networking, and integrating Docker into your development and deployment workflows.

Docker streamlines the process of application deployment by ensuring consistency across different environments and simplifying dependency management. As you continue to develop your skills, consider containerizing more complex applications and exploring Docker's extensive ecosystem.

11. Additional Resources

  • Docker Documentation: https://docs.docker.com/
  • Docker Hub: https://hub.docker.com/
  • Docker Get Started Guide: https://docs.docker.com/get-started/

Next Steps

To further enhance your Docker skills, consider exploring the following topics:

  • Docker Compose: Define and run multi-container applications with docker-compose.yml (see the sketch after this list).
  • Docker Networking: Understand how containers communicate with each other.
  • Docker Volumes: Manage data persistence for containers.
  • Orchestration with Kubernetes: Learn how to manage containerized applications at scale.
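
For example, a minimal docker-compose.yml for the app from this guide might look like the sketch below; running docker-compose up in the project directory builds the image and starts the container:

  version: "3.8"
  services:
    web:
      build: .
      ports:
        - "3000:3000"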

Happy Containerizing!