Paying with a credit card usually comes with an online banking option that handles finance management for the user. Cash payments, however, are trickier.
Elpaso is a containerized, microservices-based project designed to offer a solution for cash payment management by scanning receipts in Romanian and generating financial analysis reports and product classifications.
- Scan a receipt in Romanian and generate an editable financial report thanks to data analysis and OCR processing
- Keep track of everyday spending, no matter the payment method
- Get a classification of products based on a Romanian database built specifically for Elpaso
- Export the results as CSV, delete or add other entries
- Fast authentication with OAuth 2.0
💻 Web application
- ReactJS - JavaScript client-side library
- Django - Python backend framework
- PostgreSQL - Relational database management system
- PgAdmin - tool used for interacting with the database
- Firebase - UI authentication library
- Google Cloud - OAuth 2.0 and Cloud Storage platform
- OCRly - OCR API
- Certbot - Tool used for HTTPS Let’s Encrypt certificates
🚀 Deployment
- Docker - platform for service containerization
- Docker Swarm - container orchestration tool
- Docker Machine - virtual machine management tool
- Nginx - reverse proxy
- GlusterFS - distributed file system
- Sematext - monitoring agent
- AWS/Digital Ocean - IaaS platforms for VMs
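For reference, the Swarm VMs can be provisioned with Docker Machine on either of the IaaS platforms above. Below is a minimal sketch assuming the DigitalOcean driver; the access-token variable and node names are illustrative and not taken from the repository.

```bash
# Minimal sketch: provision 1 manager and 2 workers with Docker Machine on DigitalOcean.
# DO_TOKEN and the node names are placeholders/assumptions, not project values.
export DO_TOKEN=<your-digitalocean-access-token>

for node in manager worker-1 worker-2; do
  docker-machine create \
    --driver digitalocean \
    --digitalocean-access-token "$DO_TOKEN" \
    "$node"
done

docker-machine ls   # verify the VMs and note their public IPs
```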
DEMO - check it out here
Steps:
1. Clone the repository on the `elpaso-compose` branch.
2. Complete the credentials files (`.env.*`, `firebase-config` and `Google_Cloud_credentials`) by generating your own credentials.
   - e.g. in `.env.local`, replace `REACT_APP_FIREBASE_API_KEY=xxxx`
3. Install docker-compose and execute `docker-compose up` to start the containers and the services running inside them.
4. Connect to http://localhost:3000 and start testing!
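The same steps condensed into commands, as a rough sketch; `<repo-url>` is a placeholder for the repository URL and the `--build` flag is an optional convenience:

```bash
# Minimal sketch of the local Compose workflow; <repo-url> is a placeholder.
git clone -b elpaso-compose <repo-url> elpaso
cd elpaso

# Fill in your own credentials first:
#   the .env.* files, firebase-config and Google_Cloud_credentials

docker-compose up --build   # build the images and start every service
# then open http://localhost:3000 in a browser
```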
At the moment the orchestrated version is not deployed, due to the costs of maintaining the VMs.
1. Clone the repository on the `elpaso-swarm` branch.
2. Configure the `docker-compose-stack.yml` file with the desired number of replicas, based on the resources you have available.
3. Configure several virtual machines which will act as the workers and manager(s) for the Swarm. This project was built with 1 manager and 2 worker nodes. Refer to this for more information on how to do so.
4. Install Docker on the virtual machines and configure the Linux firewall for Docker Swarm.
5. Complete the credentials files (`.env.*`, `firebase-config` and `Google_Cloud_credentials`) by generating your own credentials, e.g. in `.env.local`, replace `REACT_APP_FIREBASE_API_KEY=xxxx`.
6. Use a file transfer protocol such as SCP to move the secret keys completed at step 5 and the `reverse_proxy/` folder securely to each of the VMs.
7. Associate a domain with your manager's IP using, for example, the No-IP domain and host service provider. The code is currently configured with the elpaso.zapto.org domain, so choose this one if possible.
8. Generate Let's Encrypt HTTPS certificates using Certbot. Refer to this for more information.
9. Execute `docker stack deploy -c docker-compose-stack.yml elpaso` to start the stack and access it on port 3000 of the domain you chose at step 7.
! In this project, Sematext was used as a monitoring agent. The configuration for it can be skipped if you do not wish to test the monitoring feature.
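As a rough outline, the Swarm-side commands look approximately like the sketch below. The IPs, file paths, `ufw` usage, the standalone Certbot invocation and the join token are assumptions for illustration; the actual token is printed by `docker swarm init`.

```bash
# Minimal sketch of the Swarm deployment; <manager-ip>, <worker-token>, <vm-ip> are placeholders.

# On the manager VM: open the Swarm ports and initialise the cluster
sudo ufw allow 2377/tcp   # cluster management
sudo ufw allow 7946       # node-to-node communication (tcp/udp)
sudo ufw allow 4789/udp   # overlay network traffic
docker swarm init --advertise-addr <manager-ip>

# On each worker VM: join with the token printed by `docker swarm init`
docker swarm join --token <worker-token> <manager-ip>:2377

# From your machine: copy the credentials and the reverse proxy config to each VM
scp -r .env.* firebase-config.json Google_Cloud_credentials reverse_proxy/ user@<vm-ip>:~/elpaso/

# On the manager: issue the Let's Encrypt certificate (port 80 must be free) and deploy
sudo certbot certonly --standalone -d elpaso.zapto.org
docker stack deploy -c docker-compose-stack.yml elpaso
docker stack services elpaso   # check that all replicas are up
```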
Files:
- `.env.dev` (example values are sketched after this list):
  - DEBUG: should never be 1 in production
  - SQL_*: all the values necessary for accessing the PostgreSQL database. Check here for more information.
  - GOOGLE_OAUTH2_CLIENT_ID: Create a new Google Cloud project, go to the APIs and Services section and add a new Web Client ID and API key. Add the generated Client ID here.
    - When creating the new Web Client, allow https://elpaso.zapto.org as a JavaScript origin if you are deploying the stack, and http://localhost:8000 and http://localhost:3000 for running locally with Compose.
    - Similarly, allow https://elpaso.zapto.org and https://elpaso.zapto.org/login (or the localhost versions) as the redirect URIs.
  - RAPID_API_*: Create a RapidAPI account and subscribe to the OCRly API. All the information can be found in the Endpoints tab.
  - BUCKET_NAME: Using the same GCP project created above, go to Cloud Storage -> Buckets and create a new public bucket. Add its name here.
    ! The receipt images are stored for less than a second, just long enough to send a link to the external API that performs the OCR processing.
- `Google_Cloud_credentials`: Go to the IAM and Admin section in the Google Cloud Console and select the Cloud service account. Go to Keys and generate a new one. A JSON file will be downloaded automatically; rename it to `Google_Cloud_credentials`. For more information, read here.
- `.env.database.dev`: These are the values used for accessing the local PostgreSQL database via PgAdmin. The values for the USER, PASSWORD and DB must correspond with the ones set above. For example:
  - PGDATA=/var/lib/postgresql/data
  - POSTGRES_USER=postgres
  - POSTGRES_PASSWORD=postgres
  - POSTGRES_DB=elpaso
- `.env.local`: Create a Firebase Web project with localhost/elpaso.zapto.org as authorised domains and generate the SDK configuration. This provides all the required fields apart from REACT_APP_CLIENT_ID, which corresponds to the GOOGLE_OAUTH2_CLIENT_ID set in `.env.dev`.
- `firebase-config.json`: Similar to the Google Cloud credentials generation, go to the Firebase console, open Settings > Service Accounts and generate a new private key; it will be downloaded automatically. Rename it to `firebase-config.json`. For more information, read here.
- `.env.sematext`: Adding Sematext monitoring is not mandatory, especially because it is not a free service. However, if you do wish to add it, refer to this tutorial. Otherwise, remove the sematext container from the docker-compose-stack.
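To make the expected shape of the credential files more concrete, here is a hedged sketch of `.env.dev` and `.env.local`. Only DEBUG, GOOGLE_OAUTH2_CLIENT_ID, BUCKET_NAME, REACT_APP_FIREBASE_API_KEY and REACT_APP_CLIENT_ID are named in this README; the expansions of SQL_*, RAPID_API_* and the remaining Firebase fields are assumptions based on common Django/RapidAPI/Firebase setups, so match them to the variable names actually read by the code.

```env
# --- .env.dev (sketch; the SQL_* and RAPID_API_* names are assumptions) ---
DEBUG=0                                   # never 1 in production
SQL_ENGINE=django.db.backends.postgresql  # assumed, common in dockerised Django setups
SQL_DATABASE=elpaso
SQL_USER=postgres
SQL_PASSWORD=postgres
SQL_HOST=db
SQL_PORT=5432
GOOGLE_OAUTH2_CLIENT_ID=<your-web-client-id>.apps.googleusercontent.com
RAPID_API_KEY=<your-rapidapi-key>         # assumed name; value from the Endpoints tab
RAPID_API_HOST=<ocrly-endpoint-host>      # assumed name
BUCKET_NAME=<your-public-gcs-bucket>

# --- .env.local (sketch; only the two names below appear in this README) ---
REACT_APP_FIREBASE_API_KEY=<firebase-api-key>
REACT_APP_CLIENT_ID=<same-value-as-GOOGLE_OAUTH2_CLIENT_ID>
# ...plus the remaining REACT_APP_FIREBASE_* fields from the Firebase SDK snippet
```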
That's all! Enjoy safely monitoring your expenses! ✨