Multi-Service Web Application Using Docker and Nginx

Aryan Agarwal


Introduction

The Multi-Service Docker Application with NGINX is a modular and containerized system designed to demonstrate the integration of machine learning, data analytics, and system orchestration within a microservices architecture. This project leverages Docker to encapsulate multiple independent services—each responsible for a distinct functionality such as sentiment analysis, dataset analysis, logging, and frontend interaction—ensuring scalability, maintainability, and ease of deployment.

The frontend, developed using Streamlit, provides an interactive interface where users can perform two primary operations: predicting the sentiment of textual data and analyzing datasets through correlation visualizations. The Machine Learning Service handles sentiment predictions, while the Analysis Service processes uploaded datasets to generate visual insights. A dedicated Logging Service, powered by SQLite3, records system activities and service communications, offering transparency and traceability across all operations. To efficiently manage and route traffic among these services, NGINX is employed as a reverse proxy and load balancer, ensuring optimal performance and seamless user experience.

Overall, this project exemplifies the practical implementation of microservice architecture, inter-service communication, and containerized deployment using Docker and Docker Compose, demonstrating how multiple technologies can cohesively operate to build a robust, distributed machine learning application.


Objectives

Procedure 1: Development of a Single Dockerized Sentiment Analysis Model

The primary objective of this phase was to design and containerize a standalone machine learning service capable of performing sentiment analysis on textual data. Using the VADER (Valence Aware Dictionary and sEntiment Reasoner) lexicon, the system aimed to classify input text into positive, negative, or neutral sentiment categories with high interpretability. The integration of Python and Docker ensured that the application could be executed in an isolated and reproducible environment, independent of the host system’s configuration. This stage established the foundational understanding of Docker-based containerization and its applicability in deploying lightweight natural language processing (NLP) models.
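The lexicon approach can be sketched in a few lines. The snippet below is a minimal, self-contained illustration using a small hypothetical word list rather than the full VADER dictionary; in the actual service, NLTK's `SentimentIntensityAnalyzer().polarity_scores(text)` supplies the `compound` score to which the same +/-0.05 thresholds are applied.

```python
# Simplified illustration of lexicon-based sentiment scoring in the
# spirit of VADER. The tiny LEXICON below is a hypothetical stand-in
# for the full VADER dictionary used by the real service.

LEXICON = {"good": 1.9, "great": 3.1, "love": 3.2,
           "bad": -2.5, "terrible": -3.0, "hate": -2.7}

def classify(text: str) -> str:
    """Average word scores and map to a label, mirroring VADER's
    conventional compound-score thresholds of +/-0.05."""
    words = text.lower().split()
    scores = [LEXICON.get(w, 0.0) for w in words]
    compound = sum(scores) / max(len(scores), 1)
    if compound >= 0.05:
        return "Positive"
    if compound <= -0.05:
        return "Negative"
    return "Neutral"

print(classify("this project is great"))  # Positive
```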

Procedure 2: Modularization into Multi-Service Architecture using Docker Compose

The second phase focused on extending the single-container design into a microservices architecture, where each service performed a distinct and encapsulated function. The objectives included developing separate containers for:

  • a Streamlit-based frontend for user interaction,

  • a machine learning service using the VADER lexicon for text sentiment prediction,

  • a dataset analysis service utilizing data visualization and statistical correlation techniques, and

  • a logging service for recording inter-service communications and operational events using SQLite3.

These services were orchestrated using Docker Compose, enabling seamless networking, scalability, and inter-container communication. Flask and FastAPI were employed to expose RESTful APIs for inter-service communication, achieving modularity, maintainability, and concurrent service execution.

To further improve scalability, reliability, and network security, NGINX was integrated as a reverse proxy and load balancer. Acting as an intermediary, NGINX routes external user requests to the internal containerized services while concealing internal port details. This setup provides port security, efficient traffic distribution, and fault tolerance through load balancing across service replicas, demonstrating modern distributed computing principles and real-world deployment practices.
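A Compose file for this layout can be sketched roughly as follows. Service names, build contexts, and ports mirror the container table later in this report; the backup replicas for the ML and analysis services follow the same pattern as `backup-frontend` and are elided, and environment variables and named networks are omitted for brevity.

```yaml
version: "3.8"

services:
  frontend:
    build: ./Frontend_App
    expose:
      - "8501"

  backup-frontend:          # redundant replica for NGINX failover
    build: ./Frontend_App
    expose:
      - "8501"

  ml-service:
    build: ./ML_App
    expose:
      - "5000"

  analysis-service:
    build: ./Analysis_App
    expose:
      - "5050"

  logging-service:
    build: ./Logs_App
    expose:
      - "6000"

  nginx:
    image: nginx:latest
    ports:
      - "8080:80"           # single external entry point
    depends_on:
      - frontend
      - ml-service
      - analysis-service
      - logging-service
```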

Procedure 3: Final Presentation and Documentation

The objective of DA3 is to present the finalized product during the Docker Showdown event, showcasing its capabilities, design, and performance. This phase also includes the preparation and submission of comprehensive project documentation, covering system architecture, implementation details, usage guidelines, and future enhancement possibilities.


Download Links:

  • Docker Compose images (Docker Hub): https://hub.docker.com/repository/docker/aryanagarwal276/multi-service-app-with-nginx/general

  • Source code (GitHub): https://github.com/AryanAgarwal1251/Multi-Service-Docker-Application-with-Nginx

Containers Used and Links

The Docker Compose file defines eight containers, each serving a unique role:


| Container Name | Directory / Build Context | Purpose |
| --- | --- | --- |
| frontend | ./Frontend_App | Streamlit-based user interface for interacting with all backend services. |
| backup-frontend | ./Frontend_App | Redundant frontend instance for load balancing and failover (managed by NGINX). |
| ml-service | ./ML_App | Machine learning API for sentiment prediction using the VADER lexicon. |
| backup-ml-service | ./ML_App | Secondary ML service replica for load balancing. |
| analysis-service | ./Analysis_App | Performs dataset analysis (e.g., correlation heatmaps). |
| backup-analysis-service | ./Analysis_App | Backup analysis instance for load balancing. |
| logging-service | ./Logs_App | Logs all service communications using Flask + SQLite3. |
| nginx | Official nginx:latest image | Acts as reverse proxy and load balancer for all the above services. |



Software Used Apart from Docker


| Software / Library | Purpose / Role in the Project |
| --- | --- |
| Python 3.9 | Primary programming language used for developing backend logic in all services. |
| Streamlit | Framework used to create the frontend interface for user interaction (text input, file upload, log display). |
| Flask | Lightweight web framework used to build the REST APIs of the ML, Analysis, and Logging services. |
| FastAPI | High-performance web framework for building asynchronous APIs (used alongside Flask where async endpoints are needed). |
| VADER Lexicon (NLTK) | Pre-trained sentiment analysis model used by the ML service to predict text sentiment (positive/negative/neutral). |
| Matplotlib | Used in the Analysis service to generate visual data plots (e.g., correlation heatmaps). |
| Seaborn | High-level visualization library used with Matplotlib for producing clean, informative statistical plots. |
| Pandas | Used for data manipulation and analysis in the Analysis service, especially for handling uploaded CSV files. |
| SQLite3 | Lightweight relational database used in the Logging service to persist log entries (timestamp, service name, message). |
| Requests | Used in multiple services to send HTTP requests between containers (e.g., for logging and inter-service communication). |
| NGINX | Acts as a reverse proxy and load balancer between external clients and internal containerized services, improving performance and security. |
| Docker | Used to containerize all microservices, ensuring platform independence and consistent deployment environments. |
| Docker Compose | Used to orchestrate all containers, defining dependencies, networking, ports, and load-balanced replicas in a single configuration file. |


Overall Architecture:

  1. Procedure 1

Single Container:
A standalone Docker container housing the entire application (sentiment analysis using VADER Lexicon).

Software and Technologies Used:


| Component | Purpose |
| --- | --- |
| Python 3.9 | Primary programming language used for implementing sentiment analysis. |
| VADER Lexicon (NLTK) | Pre-trained sentiment analysis model used for polarity detection (positive/negative/neutral). |
| Docker | Used to package the Python application and its dependencies into a single portable container. |

Input / Output


| Type | Description |
| --- | --- |
| Input | User-provided text data (via CLI or script). |
| Output | Sentiment classification result (Positive, Negative, or Neutral), printed to the console or returned as JSON. |


  2. Procedure 2

Microservice Containers (Docker Architecture)

  1. NGINX Reverse Proxy Container

    • Acts as a secure gateway and load balancer.

    • Routes all external traffic from port 8080 to internal service ports:

      • 8501 — Streamlit Frontend

      • 5000 — ML (VADER) Service

      • 5050 — Dataset Analysis

      • 6000 — Logging Service

    • Ensures centralized routing, security, and controlled external access.
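The routing scheme above can be expressed in an NGINX configuration along the following lines. This is a hedged sketch: the location paths and upstream names are assumptions for illustration, while the service hostnames and ports come from the list above.

```nginx
events {}

http {
    # Load-balance across the primary and backup frontend replicas.
    upstream frontend_pool {
        server frontend:8501;
        server backup-frontend:8501;
    }

    upstream ml_pool {
        server ml-service:5000;
        server backup-ml-service:5000;
    }

    server {
        listen 80;  # published externally as 8080 via the Compose port mapping

        location / {
            proxy_pass http://frontend_pool;
        }

        location /predict/ {
            proxy_pass http://ml_pool/;
        }

        location /analyze/ {
            proxy_pass http://analysis-service:5050/;
        }

        location /logs/ {
            proxy_pass http://logging-service:6000/;
        }
    }
}
```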

  2. Streamlit Frontend Container

    • Provides an interactive UI for:

      • Entering text for sentiment analysis

      • Uploading CSV datasets

      • Viewing visual analytics and sentiment results

    • Communicates with backend services via REST API through NGINX.

  3. ML (VADER) Container

    • Backend microservice implemented using Flask.

    • Performs sentiment analysis using NLTK’s VADER Lexicon.

    • Returns prediction scores (positive, neutral, negative) to the frontend.

  4. Dataset Analysis Container

    • API microservice for processing CSV datasets:

      • Cleans and analyzes data with Pandas

      • Generates statistical insights and correlation heatmaps using Matplotlib & Seaborn

    • Results are sent back to Streamlit for visualization.

  5. Logging Container

    • Built with Flask + SQLite3 database.

    • Stores logs of:

      • User text inputs

      • API interactions

      • Sentiment results and analysis events

    • Supports viewing logs for monitoring and debugging.
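The core of such a logging store is small. The sketch below shows only the SQLite side (the table name and column layout are assumptions based on the fields described above); in the project, functions like these would sit behind Flask routes.

```python
import sqlite3
from datetime import datetime, timezone

def init_db(path=":memory:"):
    """Create the log table if needed and return a connection."""
    conn = sqlite3.connect(path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS logs (
               id INTEGER PRIMARY KEY AUTOINCREMENT,
               timestamp TEXT NOT NULL,
               service   TEXT NOT NULL,
               message   TEXT NOT NULL)"""
    )
    return conn

def add_log(conn, service, message):
    """Insert one log entry stamped with the current UTC time."""
    conn.execute(
        "INSERT INTO logs (timestamp, service, message) VALUES (?, ?, ?)",
        (datetime.now(timezone.utc).isoformat(), service, message),
    )
    conn.commit()

def recent_logs(conn, limit=10):
    """Return the newest entries as (timestamp, service, message) rows."""
    cur = conn.execute(
        "SELECT timestamp, service, message FROM logs "
        "ORDER BY id DESC LIMIT ?", (limit,))
    return cur.fetchall()

conn = init_db()
add_log(conn, "ml-service", "prediction served")
print(recent_logs(conn)[0][1])  # ml-service
```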


| Software / Library | Purpose |
| --- | --- |
| Python | Core development language for all services |
| Streamlit | User interface and visualization frontend |
| Flask | Microservices for the ML, Analysis, and Logging APIs |
| FastAPI | Efficient REST communication between services |
| SQLite3 | Log storage in the Logging container |
| Pandas | Data analysis and preprocessing |
| Matplotlib & Seaborn | Visual analytics and heatmap generation |
| VADER (NLTK) | Sentiment analysis model |
| Docker & Docker Compose | Containerization and orchestration |
| NGINX | Reverse proxy and request routing |


  3. Final Project

Step 1: Inter-Container Communication

Once the backend and frontend containers were ready, they were configured to communicate over Docker's internal network. The backend service was exposed on port 8000, while the Streamlit frontend ran on port 8501.

Environment variables were used for configuration, ensuring security and flexibility.
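Reading service endpoints from environment variables keeps the code free of hard-coded addresses. A minimal sketch follows; the variable name `ML_SERVICE_URL` and its default are assumptions for illustration, with Compose expected to set the real value per environment.

```python
import os

# Hypothetical variable name; Docker Compose would set it per environment.
ML_SERVICE_URL = os.environ.get("ML_SERVICE_URL",
                                "http://ml-service:5000")

def predict_endpoint(path="/predict"):
    """Build the full URL the frontend would call for predictions."""
    return ML_SERVICE_URL.rstrip("/") + path

print(predict_endpoint())
```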




Step 2: Using Docker Compose for Orchestration

Docker Compose made it much easier to manage the different parts of my project. With just one configuration file, I could bring up all the microservices (the Streamlit frontend, ML sentiment analysis, dataset analysis, logging, and the NGINX reverse proxy) together. Compose created a shared network so the services could communicate securely, handled their start-up order, and ensured everything was wired correctly. This meant I could start or stop the entire system with a single command, which made deployment and updates much simpler and kept the environment consistent throughout development.


Final Project Screenshot:

VIII. Outcomes of the Project

  • Successfully developed a multi-container Dockerized application integrating multiple microservices (Frontend, ML, Analysis, and Logging) orchestrated using Docker Compose.

  • Implemented a VADER-based sentiment analysis model capable of analyzing textual input and classifying sentiments as Positive, Negative, or Neutral.

  • Built a dataset analysis module that processes uploaded CSV datasets, visualizes correlations, and generates heatmaps for exploratory data insights.

  • Designed a centralized logging service to record activities and interactions across containers for improved monitoring and debugging.

  • Configured NGINX as a reverse proxy and load balancer, providing a secure and efficient gateway for routing requests between the user and internal services.

  • Created a Streamlit-based frontend that allows users to interact seamlessly with all backend services through a unified and user-friendly web interface.

  • Achieved service scalability, modularity, and maintainability, enabling independent updates or debugging of each service without affecting others.


IX. Conclusion

The project demonstrates the power and flexibility of containerized microservice architectures in building scalable and modular machine learning applications. By combining Flask, FastAPI, Streamlit, and NGINX within a Docker ecosystem, the application effectively showcases how different services can communicate and operate cohesively under one framework. The implementation of sentiment analysis and dataset visualization services provides practical, real-world applications of AI, while the integration of a logging mechanism ensures transparency and system reliability. Overall, the project emphasizes modern DevOps practices and establishes a strong foundation for deploying and managing machine learning applications in distributed environments.

X. Acknowledgements

I would like to express my heartfelt gratitude to VIT Chennai for providing the infrastructure and academic environment that made this project possible. I am especially thankful to my professor, Dr. Subbulakshmi T, whose insightful guidance, encouragement, and expertise significantly contributed to the successful completion of this work. As a 5th-semester student, I undertook this entire project under the subject of Cloud Computing, and I am grateful for the opportunity to integrate theoretical knowledge with practical implementation.

