GPCS (General Purpose Chat System) API is an AI-powered chat system that allows users to interact with a chatbot. The chatbot can answer questions, provide information, and make recommendations, and is powered by Google's Gemini Pro generative AI model.
This API is built with a cloud-driven approach in mind: it uses AWS S3 for file storage, AWS Polly for text-to-speech, GCP Vertex AI (multimodal: text, image, audio, video, PDF, code, and chat) for the chatbot, GCP Speech-to-Text for speech-to-text, Stripe and Razorpay for payment processing, and WebSockets for real-time chat communication (between users and the chatbot only).
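As a rough illustration of how these pieces fit together, here is a minimal sketch of a WebSocket relay that forwards a user message to Gemini through Vertex AI; the file name, port, and exact wiring are assumptions for illustration and are not taken from the repository:

```typescript
// chat-relay.ts (illustrative only; the actual GPCS wiring may differ)
import { WebSocketServer } from 'ws';
import { VertexAI } from '@google-cloud/vertexai';

const vertex = new VertexAI({ project: process.env.GCP_PROJECT!, location: 'us-central1' });
const model = vertex.getGenerativeModel({ model: 'gemini-pro' });

const wss = new WebSocketServer({ port: 8080 });

wss.on('connection', (socket) => {
  socket.on('message', async (data) => {
    // Forward the user's message to Gemini and relay the reply back over the same socket.
    const result = await model.generateContent(data.toString());
    const reply = result.response.candidates?.[0]?.content?.parts?.[0]?.text ?? '';
    socket.send(reply);
  });
});
```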
- User authentication
- AI-powered chatbot
- Voice interaction with the chatbot
- Hybrid payment system (Stripe and Razorpay; see the routing sketch after this list)
- Real-time chat through WebSockets
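To make the "hybrid" payment idea concrete, below is a minimal sketch of how a request might be routed between the two providers; the currency-based rule and function names are assumptions for illustration, not the project's actual logic:

```typescript
// payments.ts (illustrative routing rule; the real provider-selection logic may differ)
import Stripe from 'stripe';
import Razorpay from 'razorpay';

const stripe = new Stripe(process.env.STRIPE_SECRET_KEY!);
const razorpay = new Razorpay({
  key_id: process.env.RAZORPAY_KEY_ID!,
  key_secret: process.env.RAZORPAY_KEY_SECRET!,
});

// Route INR payments to Razorpay and everything else to Stripe.
// Both providers expect the amount in the smallest currency unit (paise/cents).
export async function createCharge(amount: number, currency: string) {
  if (currency.toUpperCase() === 'INR') {
    return razorpay.orders.create({ amount, currency: 'INR' });
  }
  return stripe.paymentIntents.create({ amount, currency });
}
```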
- NodeJS (v20.6.x or above)
- AWS S3
- AWS RDS
- AWS Polly
- AWS Keyspaces
- AWS Secrets Manager (see the loading sketch after this list)
- GCP Vertex AI API
- GCP Speech API
- Stripe API
- Razorpay API
- Docker (optional but recommended)
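Since AWS Secrets Manager is on this list, here is a small sketch of how credentials could be pulled at startup; the helper name and secret layout are hypothetical and may differ from how the repository actually loads credentials:

```typescript
// secrets.ts (hypothetical helper; the repository may load credentials differently)
import { SecretsManagerClient, GetSecretValueCommand } from '@aws-sdk/client-secrets-manager';

const client = new SecretsManagerClient({ region: process.env.AWS_REGION });

// Fetch a named secret (e.g. third-party API keys) at startup
// instead of baking it into the container image.
export async function getSecret(secretId: string): Promise<string> {
  const out = await client.send(new GetSecretValueCommand({ SecretId: secretId }));
  return out.SecretString ?? '';
}
```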
This diagram illustrates the deployment architecture of the GPCS API. The deployment architecture is designed to be scalable, fault-tolerant, and, most importantly, secure.
To achieve this, I chose AWS Elastic Container Service (ECS) with the Fargate launch type. This allows me to run the API in a containerized environment without worrying about the underlying infrastructure.
The reason for choosing ECS over EC2 is that ECS is a fully managed container orchestration service that lets me run, stop, and manage Docker containers on a cluster. It also provides features like auto-scaling, load balancing, and monitoring.
All of these advantages come with relatively little effort spent on managing infrastructure and security, which helps me be more productive.
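For example, the load balancer in front of the ECS service needs a health-check route to decide whether a Fargate task should keep receiving traffic; the sketch below assumes an Express app and a `/health` path, neither of which is confirmed by the source:

```typescript
// health.ts (assumed route; the actual framework and path used by GPCS are not documented here)
import express from 'express';

const app = express();

// ALB target-group health checks poll this lightweight endpoint;
// tasks that fail it are drained and replaced by ECS.
app.get('/health', (_req, res) => {
  res.status(200).json({ status: 'ok' });
});

app.listen(3000);
```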
- Clone the repository
git clone https://github.com/vinitparekh17/gpcs
- Install dependencies
yarn install
- Create a `.env` file in the root directory as per the `.env.example` file
cp .env.example .env
- Start the server
yarn server-dev # For development
yarn build && yarn start # For production
docker compose -f ./docker/docker-compose.yml up # Or run with Docker Compose
Grafana
- Web-based analytics and interactive visualization platform
- Supports multiple data sources
- Provides customizable dashboards for real-time monitoring
Prometheus
- Open-source systems monitoring and alerting toolkit
- Collects and stores metrics as time-series data
- Supports powerful query language (PromQL)
Loki
- Lightweight log aggregation system
- Designed for cloud-native environments
- Optimized for storing and querying container logs
- Docker
- Docker Compose
- Minimum system resources:
  - 4 GB RAM
  - 2 CPU cores
- Install Docker
- Configure Prometheus targets (see the metrics sketch after this list)
- Set up Loki log collection
- Configure Grafana data sources
- Create monitoring dashboards
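One way to satisfy the "Configure Prometheus targets" step on the application side is to expose a metrics endpoint; the sketch below assumes `prom-client` and Express, since the repository's actual instrumentation is not shown here:

```typescript
// metrics.ts (assumes prom-client and Express; not confirmed by the source)
import express from 'express';
import client from 'prom-client';

const app = express();
const register = new client.Registry();

// Collect default Node.js process metrics (event-loop lag, heap usage, etc.).
client.collectDefaultMetrics({ register });

// Prometheus scrapes this endpoint; add the host/port as a target in prometheus.yml.
app.get('/metrics', async (_req, res) => {
  res.set('Content-Type', register.contentType);
  res.end(await register.metrics());
});

app.listen(9100);
```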
Important
Make sure to have MongoDB running on your local machine or provide the connection string in the `.env` file.
Also, make sure the AWS, GCP, Stripe, and Razorpay credentials required for the API to work properly are present in the `.env` file.
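A simple way to enforce this note is to fail fast when a credential is missing; the variable names below are assumptions based on the services listed above, not the project's actual `.env` keys:

```typescript
// env-check.ts (illustrative; variable names are assumptions, not the real .env keys)
const required = [
  'MONGODB_URI',
  'AWS_ACCESS_KEY_ID',
  'AWS_SECRET_ACCESS_KEY',
  'GCP_PROJECT',
  'STRIPE_SECRET_KEY',
  'RAZORPAY_KEY_ID',
  'RAZORPAY_KEY_SECRET',
];

// Abort startup if any of the credentials mentioned in the note above are missing.
const missing = required.filter((name) => !process.env[name]);
if (missing.length > 0) {
  throw new Error(`Missing environment variables: ${missing.join(', ')}`);
}
```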