Multi-Service Web Application Using Docker and Nginx
Aryan Agarwal
Introduction
The Multi-Service Docker Application with NGINX is a modular and containerized system designed to demonstrate the integration of machine learning, data analytics, and system orchestration within a microservices architecture. This project leverages Docker to encapsulate multiple independent services—each responsible for a distinct functionality such as sentiment analysis, dataset analysis, logging, and frontend interaction—ensuring scalability, maintainability, and ease of deployment.
The frontend, developed using Streamlit, provides an interactive interface where users can perform two primary operations: predicting the sentiment of textual data and analyzing datasets through correlation visualizations. The Machine Learning Service handles sentiment predictions, while the Analysis Service processes uploaded datasets to generate visual insights. A dedicated Logging Service, powered by SQLite3, records system activities and service communications, offering transparency and traceability across all operations. To efficiently manage and route traffic among these services, NGINX is employed as a reverse proxy and load balancer, ensuring optimal performance and seamless user experience.
Overall, this project exemplifies the practical implementation of microservice architecture, inter-service communication, and containerized deployment using Docker and Docker Compose, demonstrating how multiple technologies can cohesively operate to build a robust, distributed machine learning application.
Objectives
Procedure 1: Development of a Single Dockerized Sentiment Analysis Model
The primary objective of this phase was to design and containerize a standalone machine learning service capable of performing sentiment analysis on textual data. Using the VADER (Valence Aware Dictionary and sEntiment Reasoner) lexicon, the system aimed to classify input text into positive, negative, or neutral sentiment categories with high interpretability. The integration of Python and Docker ensured that the application could be executed in an isolated and reproducible environment, independent of the host system’s configuration. This stage established the foundational understanding of Docker-based containerization and its applicability in deploying lightweight natural language processing (NLP) models.
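A containerized service of this kind is typically described by a short Dockerfile. The sketch below is a minimal illustration, not the project's actual file: the base image, file names (`requirements.txt`, `app.py`), and layout are assumptions.

```dockerfile
# Minimal sketch of a Dockerfile for the standalone VADER service.
FROM python:3.11-slim
WORKDIR /app

# Install Python dependencies first so this layer is cached between builds.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Fetch the VADER lexicon at build time so the container runs offline.
RUN python -m nltk.downloader vader_lexicon

COPY . .
CMD ["python", "app.py"]
```

Building and running this image (`docker build -t sentiment-app . && docker run sentiment-app`) gives the isolated, reproducible environment described above.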
Procedure 2: Modularization into Multi-Service Architecture using Docker Compose
The second phase focused on extending the single-container design into a microservices architecture, where each service performed a distinct and encapsulated function. The objectives included developing separate containers for:
a Streamlit-based frontend for user interaction,
a machine learning service using the VADER lexicon for text sentiment prediction,
a dataset analysis service utilizing data visualization and statistical correlation techniques, and
a logging service for recording inter-service communications and operational events using SQLite3.
These services were orchestrated using Docker Compose, enabling seamless networking, scalability, and inter-container communication. Additionally, Flask and FastAPI frameworks were employed to establish RESTful APIs for inter-service communication, thus achieving modularity, maintainability, and concurrent service execution. To further boost the system’s scalability, reliability, and network security, NGINX was integrated as a reverse proxy and load balancer. NGINX was configured to act as an intermediary, efficiently routing external user requests to internal containerized services while concealing internal port details. This setup achieved crucial port security, efficient traffic distribution, and fault tolerance through load balancing across service replicas, successfully demonstrating the application of modern distributed computing principles and real-world deployment best practices.
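The orchestration described above can be sketched in a single Compose file. Service and directory names here are illustrative (the real repository may differ); the internal ports match those listed later in this report, and only NGINX is published to the host.

```yaml
# docker-compose.yml (sketch) - only the NGINX gateway is exposed externally.
services:
  nginx:
    image: nginx:alpine
    ports:
      - "8080:8080"            # single public entry point
    volumes:
      - ./nginx.conf:/etc/nginx/nginx.conf:ro
    depends_on: [frontend, ml, analysis, logging]

  frontend:
    build: ./frontend          # Streamlit UI, listens on 8501 internally

  ml:
    build: ./ml_service        # Flask + VADER, listens on 5000 internally

  analysis:
    build: ./analysis          # dataset analysis API, listens on 5050 internally

  logging:
    build: ./logging           # Flask + SQLite3, listens on 6000 internally
```

Because none of the backend services declare `ports`, they are reachable only on the Compose network, which is what conceals the internal port details from outside users.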
Procedure 3: Final Presentation and Documentation
The objective of DA3 is to present the finalized product during the Docker Showdown event, showcasing its capabilities, design, and performance. This phase also includes the preparation and submission of comprehensive project documentation, covering system architecture, implementation details, usage guidelines, and future enhancement possibilities.
Download Links:
Docker Hub (Compose images): https://hub.docker.com/repository/docker/aryanagarwal276/multi-service-app-with-nginx/general
GitHub: https://github.com/AryanAgarwal1251/Multi-Service-Docker-Application-with-Nginx
Containers Used and Links
The Docker Compose file defines 7 containers, each serving a unique role:
Software Used Apart from Docker
Overall Architecture:
Procedure 1
Single Container:
A standalone Docker container housing the entire application (sentiment analysis using VADER Lexicon).
Software and Technologies Used:
Input / Output
Procedure 2
Microservice Containers (Docker Architecture)
NGINX Reverse Proxy Container
Acts as a secure gateway and load balancer.
Routes all external traffic from port 8080 to internal service ports:
8501 — Streamlit Frontend
5000 — ML (VADER) Service
5050 — Dataset Analysis
6000 — Logging Service
Ensures centralized routing, security, and controlled external access.
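The routing described above is expressed in the NGINX configuration. The following is a hedged sketch, not the project's actual file: the upstream host names (`frontend`, `ml`, `analysis`, `logging`) and URL prefixes are assumptions, while the ports mirror the mapping listed above.

```nginx
events {}

http {
    server {
        listen 8080;                      # the only externally visible port

        location / {                      # Streamlit frontend
            proxy_pass http://frontend:8501;
        }
        location /predict/ {              # ML (VADER) service
            proxy_pass http://ml:5000/;
        }
        location /analyze/ {              # dataset analysis service
            proxy_pass http://analysis:5050/;
        }
        location /logs/ {                 # logging service
            proxy_pass http://logging:6000/;
        }
    }
}
```

A production Streamlit deployment would additionally need WebSocket upgrade headers behind the proxy; they are omitted here to keep the sketch focused on the routing itself.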
Streamlit Frontend Container
Provides an interactive UI for:
Entering text for sentiment analysis
Uploading CSV datasets
Viewing visual analytics and sentiment results
Communicates with backend services via REST API through NGINX.
ML (VADER) Container
Backend microservice implemented using Flask.
Performs sentiment analysis using NLTK’s VADER Lexicon.
Returns prediction scores (positive, neutral, negative) to the frontend.
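NLTK's `SentimentIntensityAnalyzer.polarity_scores` returns a dictionary with `neg`, `neu`, `pos`, and a `compound` score in [-1, 1]; the conventional VADER thresholds label compound scores at or above 0.05 as positive and at or below -0.05 as negative. A minimal sketch of that post-processing step (the score dictionaries here are illustrative; the real service obtains them from the analyzer):

```python
def label_from_scores(scores: dict) -> str:
    """Map a VADER score dict to a sentiment label using the
    conventional +/-0.05 compound-score thresholds."""
    compound = scores["compound"]
    if compound >= 0.05:
        return "Positive"
    if compound <= -0.05:
        return "Negative"
    return "Neutral"

# Illustrative dicts shaped like SentimentIntensityAnalyzer.polarity_scores output
print(label_from_scores({"neg": 0.0, "neu": 0.3, "pos": 0.7, "compound": 0.81}))   # Positive
print(label_from_scores({"neg": 0.6, "neu": 0.4, "pos": 0.0, "compound": -0.57}))  # Negative
```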
Dataset Analysis Container
API microservice for processing CSV datasets:
Cleans and analyzes data with Pandas
Generates statistical insights and correlation heatmaps using Matplotlib & Seaborn
Results are sent back to Streamlit for visualization.
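The core of the analysis step is a Pandas correlation matrix. A small sketch, with an inline toy frame standing in for an uploaded CSV (the column names are illustrative, not from the project):

```python
import pandas as pd

# Toy stand-in for an uploaded CSV dataset.
df = pd.DataFrame({
    "hours_studied": [1, 2, 3, 4, 5],
    "exam_score":    [52, 58, 61, 70, 75],
})

corr = df.corr()            # Pearson correlation matrix between numeric columns
print(corr.round(2))

# In the service, a matrix like this would feed seaborn.heatmap(corr) and the
# rendered figure would be returned to the Streamlit frontend for display.
```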
Logging Container
Built with Flask + SQLite3 database.
Stores logs of:
User text inputs
API interactions
Sentiment results and analysis events
Supports viewing logs for monitoring and debugging.
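A minimal sketch of the logging service's storage layer using Python's built-in `sqlite3` module. The table and column names are assumptions, and an in-memory database stands in for the file the real service would keep on a Docker volume.

```python
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect(":memory:")  # the real service would use a file on a volume
conn.execute("""
    CREATE TABLE IF NOT EXISTS logs (
        id        INTEGER PRIMARY KEY AUTOINCREMENT,
        timestamp TEXT NOT NULL,
        source    TEXT NOT NULL,   -- which container produced the event
        event     TEXT NOT NULL
    )
""")

def record(source: str, event: str) -> None:
    """Insert one log row with a UTC timestamp."""
    conn.execute(
        "INSERT INTO logs (timestamp, source, event) VALUES (?, ?, ?)",
        (datetime.now(timezone.utc).isoformat(), source, event),
    )
    conn.commit()

record("ml-service", "sentiment request -> Positive")
record("analysis-service", "correlation heatmap generated")

rows = conn.execute("SELECT source, event FROM logs ORDER BY id").fetchall()
print(rows)
```

In the actual service, a Flask endpoint would wrap `record()` so other containers can POST their events over the internal network.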
Final Project
Step 1: Inter-Container Communication
Once both containers were ready, they were configured to communicate via Docker’s internal network. The backend service was exposed on port 8000, while the Streamlit frontend ran on port 8501.
Environment variables were used for configuration, ensuring security and flexibility.
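Reading configuration from environment variables usually looks like the sketch below; the variable names and defaults are illustrative, not the project's actual settings. The defaults use Compose service names, which resolve over Docker's internal network.

```python
import os

# Service endpoints come from the environment, with in-network defaults as a
# fallback; overriding them in docker-compose.yml needs no code change.
ML_SERVICE_URL = os.environ.get("ML_SERVICE_URL", "http://ml:5000")
LOG_SERVICE_URL = os.environ.get("LOG_SERVICE_URL", "http://logging:6000")

print(ML_SERVICE_URL, LOG_SERVICE_URL)
```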
Step 2: Using Docker Compose for Orchestration
Docker Compose made it much easier to manage the different parts of my project. With just one configuration file, I could set up all the microservices—like the Streamlit frontend, ML sentiment analysis, dataset analysis, logging, and the NGINX reverse proxy—to run together smoothly. Docker Compose created a shared network so the services could talk to each other securely, handled the order they started up, and made sure everything was connected properly. This meant I could start or stop the entire system with a single command, making deployment and updates much simpler. Overall, Docker Compose helped keep things consistent and easy to manage, saving time and reducing headaches during development.
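The single-command workflow mentioned above looks like this in practice (run from the directory containing `docker-compose.yml`; the service name passed to `logs` follows the naming used in this report and may differ in the actual repository):

```shell
docker compose up --build -d   # build images and start every service in the background
docker compose ps              # verify all containers are running
docker compose logs -f nginx   # follow the reverse proxy's logs
docker compose down            # stop and remove the entire stack
```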
Final Project Screenshot:
VIII. Outcomes of the Project
Successfully developed a multi-container Dockerized application integrating multiple microservices (Frontend, ML, Analysis, and Logging) orchestrated using Docker Compose.
Implemented a VADER-based sentiment analysis model capable of analyzing textual input and classifying sentiments as Positive, Negative, or Neutral.
Built a dataset analysis module that processes uploaded CSV datasets, visualizes correlations, and generates heatmaps for exploratory data insights.
Designed a centralized logging service to record activities and interactions across containers for improved monitoring and debugging.
Configured NGINX as a reverse proxy and load balancer, providing a secure and efficient gateway for routing requests between the user and internal services.
Created a Streamlit-based frontend that allows users to interact seamlessly with all backend services through a unified and user-friendly web interface.
Achieved service scalability, modularity, and maintainability, enabling independent updates or debugging of each service without affecting others.
IX. Conclusion
The project demonstrates the power and flexibility of containerized microservice architectures in building scalable and modular machine learning applications. By combining Flask, FastAPI, Streamlit, and NGINX within a Docker ecosystem, the application effectively showcases how different services can communicate and operate cohesively under one framework. The implementation of sentiment analysis and dataset visualization services provides practical, real-world applications of AI, while the integration of a logging mechanism ensures transparency and system reliability. Overall, the project emphasizes modern DevOps practices and establishes a strong foundation for deploying and managing machine learning applications in distributed environments.
X. Acknowledgements
I would like to express my heartfelt gratitude to VIT Chennai for providing the infrastructure and academic environment that made this project possible. I am especially thankful to my professor, Dr. Subbulakshmi T, whose insightful guidance, encouragement, and expertise significantly contributed to the successful completion of this work. As a 5th-semester student, I undertook this entire project under the subject of Cloud Computing, and I am grateful for the opportunity to integrate theoretical knowledge with practical implementation.
XI. References
Spoken Tutorial, IIT Bombay. "Docker." Available at: https://spoken-tutorial.org/tutorial-search/?search_foss=Docker&search_language=English
Docker Inc. "Get Started." Docker Documentation. Available at: https://docs.docker.com/get-started/
NGINX. "NGINX Reverse Proxy." NGINX Documentation. Available at: https://docs.nginx.com/nginx/admin-guide/web-server/reverse-proxy/
DigitalOcean. "How To Configure Nginx as a Reverse Proxy on Ubuntu 22.04." Available at: https://www.digitalocean.com/community/tutorials/how-to-configure-nginx-as-a-reverse-proxy-on-ubuntu-22-04